Critical carbon input to maintain current soil organic carbon stocks in global wheat systems
Wang, Guocheng; Luo, Zhongkui; Han, Pengfei; Chen, Huansheng; Xu, Jingjing
2016-01-01
Soil organic carbon (SOC) dynamics in croplands are a crucial component of the global carbon (C) cycle. Depending on local environmental conditions and management practices, a certain level of C input is generally required to reduce or reverse C loss in agricultural soils. No studies have quantified the critical C input for maintaining SOC at the global scale with high resolution. Such information would provide a baseline map for assessing soil C dynamics under potential changes in management practices and climate, and thus enable the development of management strategies to reduce the C footprint from farm to regional scales. We used the soil C model RothC to simulate the critical C input rates needed to maintain existing soil C levels at 0.1° × 0.1° resolution in global wheat systems. On average, the critical C input was estimated to be 2.0 Mg C ha−1 yr−1, with large spatial variability depending on local soil and climatic conditions. Higher C inputs are required in the wheat systems of the central United States and western Europe, mainly due to the higher current soil C stocks in these regions. The critical C input could be effectively estimated using a summary model driven by current SOC level, mean annual temperature, precipitation, and soil clay content. PMID:26759192
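The summary model mentioned above is driven by current SOC level, mean annual temperature, precipitation, and soil clay content. A minimal sketch of how such a summary model could be fitted by ordinary least squares on gridded data — the linear functional form, variable ranges, and coefficients below are assumptions for illustration, not the paper's actual specification:

```python
import numpy as np

def fit_summary_model(soc, mat, precip, clay, c_input):
    """Fit a linear summary model: c_input ~ SOC + MAT + precip + clay.

    Returns an intercept plus one coefficient per driver (ordinary least
    squares). The linear form is an illustrative assumption.
    """
    X = np.column_stack([np.ones_like(soc), soc, mat, precip, clay])
    coef, *_ = np.linalg.lstsq(X, c_input, rcond=None)
    return coef

def predict_critical_input(coef, soc, mat, precip, clay):
    """Predict critical C input (Mg C/ha/yr) for given grid-cell drivers."""
    return coef[0] + coef[1]*soc + coef[2]*mat + coef[3]*precip + coef[4]*clay
```

Once fitted against RothC simulations, such a model lets critical C input be mapped from readily available soil and climate layers without rerunning the process model.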
Quantified carbon input for maintaining existing soil organic carbon stocks in global wheat systems
NASA Astrophysics Data System (ADS)
Wang, G.
2017-12-01
Soil organic carbon (SOC) dynamics in croplands are a crucial component of the global carbon (C) cycle. Depending on local environmental conditions and management practices, a certain level of C input is generally required to reduce or reverse C loss in agricultural soils. No studies have quantified the critical C input for maintaining SOC at the global scale with high resolution. Such information would provide a baseline map for assessing soil C dynamics under potential changes in management practices and climate, and thus enable the development of management strategies to reduce the C footprint from farm to regional scales. We used the soil C model RothC to simulate the critical C input rates needed to maintain existing soil C levels at 0.1° × 0.1° resolution in global wheat systems. On average, the critical C input was estimated to be 2.0 Mg C ha−1 yr−1, with large spatial variability depending on local soil and climatic conditions. Higher C inputs are required in the wheat systems of the central United States and western Europe, mainly due to the higher current soil C stocks in these regions. The critical C input could be effectively estimated using a summary model driven by current SOC level, mean annual temperature, precipitation, and soil clay content.
Critical Needs of Students Who Are Deaf or Hard of Hearing: A Public Input Summary
ERIC Educational Resources Information Center
Szymanski, Christen; Lutz, Lori; Shahan, Cheryl; Gala, Nicholas
2013-01-01
As mandated by the Education of the Deaf Act (EDA), the Clerc Center is required "to establish and publish priorities for research, development, and demonstration through a process that allows for public input." The public input summarized in this paper informed the Clerc Center's selection of its national priorities for 2013-2018: 1)…
NASA Technical Reports Server (NTRS)
Cotariu, Steven S.
1991-01-01
Pattern recognition may supplement or replace certain navigational aids on spacecraft in docking or landing activities. The need to correctly identify terrain features remains critical in preparation for autonomous planetary landing. One technique that may solve this problem is optical correlation. Correlation has been successfully demonstrated under ideal conditions; however, noise significantly affects the ability of the correlator to accurately identify input signals. Optical correlation in the presence of noise must be successfully demonstrated before this technology can be incorporated into system design. An optical correlator is designed and constructed using a modified 2f configuration. Liquid crystal televisions (LCTV) are used as the spatial light modulators (SLM) for both the input and filter devices. The filter LCTV is characterized and an operating curve is developed. Determination of this operating curve is critical for the reduction of input noise. Correlation of live input with a programmable filter is demonstrated.
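The Fourier-plane operation such a correlator performs can be emulated numerically: a matched (conjugate) filter multiplies the scene spectrum, and the inverse transform yields the correlation plane, whose peak locates the reference pattern. This is a digital sketch of the principle only, not a model of the LCTV hardware described above:

```python
import numpy as np

def correlate(scene, reference):
    """Circular cross-correlation via the Fourier plane, mirroring a
    matched (conjugate) filter in an optical correlator."""
    S = np.fft.fft2(scene)
    R = np.fft.fft2(reference)
    return np.fft.ifft2(S * np.conj(R))

def locate(scene, reference):
    """Return the (row, col) shift at which the correlation peak occurs."""
    c = np.abs(correlate(scene, reference))
    return np.unravel_index(np.argmax(c), c.shape)
```

A noisy scene can be handled the same way; the peak-to-sidelobe ratio then quantifies how much input noise degrades detection.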
NASA Astrophysics Data System (ADS)
Cotariu, Steven S.
1991-12-01
Pattern recognition may supplement or replace certain navigational aids on spacecraft in docking or landing activities. The need to correctly identify terrain features remains critical in preparation for autonomous planetary landing. One technique that may solve this problem is optical correlation. Correlation has been successfully demonstrated under ideal conditions; however, noise significantly affects the ability of the correlator to accurately identify input signals. Optical correlation in the presence of noise must be successfully demonstrated before this technology can be incorporated into system design. An optical correlator is designed and constructed using a modified 2f configuration. Liquid crystal televisions (LCTV) are used as the spatial light modulators (SLM) for both the input and filter devices. The filter LCTV is characterized and an operating curve is developed. Determination of this operating curve is critical for the reduction of input noise. Correlation of live input with a programmable filter is demonstrated.
7 CFR 3430.904 - Project types and priorities.
Code of Federal Regulations, 2012 CFR
2012-01-01
... ADMINISTRATIVE PROVISIONS New Era Rural Technology Competitive Grants Program § 3430.904 Project types and... the critical needs identified through stakeholder input and deemed appropriate by NIFA. (a) In...
7 CFR 3430.904 - Project types and priorities.
Code of Federal Regulations, 2013 CFR
2013-01-01
... ADMINISTRATIVE PROVISIONS New Era Rural Technology Competitive Grants Program § 3430.904 Project types and... the critical needs identified through stakeholder input and deemed appropriate by NIFA. (a) In...
7 CFR 3430.904 - Project types and priorities.
Code of Federal Regulations, 2014 CFR
2014-01-01
... ADMINISTRATIVE PROVISIONS New Era Rural Technology Competitive Grants Program § 3430.904 Project types and... the critical needs identified through stakeholder input and deemed appropriate by NIFA. (a) In...
7 CFR 3430.904 - Project types and priorities.
Code of Federal Regulations, 2011 CFR
2011-01-01
... ADMINISTRATIVE PROVISIONS New Era Rural Technology Competitive Grants Program § 3430.904 Project types and... on the critical needs identified through stakeholder input and deemed appropriate by CSREES. (a) In...
Nanophotonics-enabled solar membrane distillation for off-grid water purification.
Dongare, Pratiksha D; Alabastri, Alessandro; Pedersen, Seth; Zodrow, Katherine R; Hogan, Nathaniel J; Neumann, Oara; Wu, Jinjian; Wang, Tianxiao; Deshmukh, Akshay; Elimelech, Menachem; Li, Qilin; Nordlander, Peter; Halas, Naomi J
2017-07-03
With more than a billion people lacking accessible drinking water, there is a critical need to convert nonpotable sources such as seawater to water suitable for human use. However, energy requirements of desalination plants account for half their operating costs, so alternative, lower-energy approaches are equally critical. Membrane distillation (MD) has shown potential due to its low operating temperature and pressure requirements, but the need to heat the input water makes it energy intensive. Here, we demonstrate nanophotonics-enabled solar membrane distillation (NESMD), in which highly localized photothermal heating induced by solar illumination alone drives the distillation process, entirely eliminating the requirement of heating the input water. Unlike MD, NESMD can be scaled to larger systems and shows increased efficiencies with decreased input flow velocities. These properties, together with its increased efficiency at higher ambient temperatures, point to NESMD as a promising solution for household- or community-scale desalination.
Nanophotonics-enabled solar membrane distillation for off-grid water purification
Dongare, Pratiksha D.; Alabastri, Alessandro; Pedersen, Seth; Zodrow, Katherine R.; Hogan, Nathaniel J.; Neumann, Oara; Wu, Jinjian; Wang, Tianxiao; Deshmukh, Akshay; Elimelech, Menachem; Li, Qilin; Nordlander, Peter; Halas, Naomi J.
2017-01-01
With more than a billion people lacking accessible drinking water, there is a critical need to convert nonpotable sources such as seawater to water suitable for human use. However, energy requirements of desalination plants account for half their operating costs, so alternative, lower-energy approaches are equally critical. Membrane distillation (MD) has shown potential due to its low operating temperature and pressure requirements, but the need to heat the input water makes it energy intensive. Here, we demonstrate nanophotonics-enabled solar membrane distillation (NESMD), in which highly localized photothermal heating induced by solar illumination alone drives the distillation process, entirely eliminating the requirement of heating the input water. Unlike MD, NESMD can be scaled to larger systems and shows increased efficiencies with decreased input flow velocities. These properties, together with its increased efficiency at higher ambient temperatures, point to NESMD as a promising solution for household- or community-scale desalination. PMID:28630307
Rojas Silva, Noelia; Padilla Fortunatti, Cristobal; Molina Muñoz, Yerko; Amthauer Rojas, Macarena
2017-12-01
The admission of a patient to an intensive care unit is an extraordinary event for their family. Although the Critical Care Family Needs Inventory is the most commonly used questionnaire for understanding the needs of relatives of critically ill patients, no Spanish-language version is available. The aim of this study was to culturally adapt and validate the Critical Care Family Needs Inventory in a sample of Chilean relatives of intensive care patients. The back-translated version of the inventory was culturally adapted following input from 12 intensive care and family experts. It was then evaluated by 10 relatives of recently transferred ICU patients and pre-tested in 10 relatives of patients who were in the intensive care unit. Psychometric properties were assessed through exploratory factor analysis and Cronbach's α in a sample of 251 relatives of critically ill patients. The Chilean-Spanish version of the Critical Care Family Needs Inventory had minimal semantic modifications and no items were deleted. A two-factor solution explained 31% of the total instrument variance. Reliability of the scale was good (α=0.93), as was that of both factors (α=0.87; α=0.93). The Chilean-Spanish version of the Critical Care Family Needs Inventory was found to be valid and reliable for understanding the needs of relatives of patients in acute care settings.
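The reliability statistic reported above, Cronbach's α, can be computed directly from an item-response matrix; a minimal sketch of the standard formula (the data shape is generic, not tied to this study's instrument):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)      # variance of summed score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

Perfectly parallel items give α = 1; items that share no variance push α toward 0, which is why a value of 0.93 indicates good internal consistency.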
FACTORS INFLUENCING TOTAL DIETARY EXPOSURES OF YOUNG CHILDREN
A deterministic model was developed to identify the critical input parameters needed to assess dietary intakes of young children. The model was used as a framework for understanding the important factors in data collection and data analysis. Factors incorporated into the model i...
Issues for interpreting external stakeholder feedback on restructuring NCIC's research programs.
Ashbury, F D; Iverson, D C; Shephard, P J
1995-03-01
The National Cancer Institute of Canada surveyed members of its stakeholder groups on a number of issues pertaining to restructuring research programs. While it was hoped that the survey would ensure input from its primary stakeholder groups and thereby facilitate decision-making on critical issues like distribution of funds and research awards, there is reason to believe this may not have occurred. Some of the stakeholder groups seemed to be over-represented in the respondent population and the effect of this on the results was therefore examined. Analysis revealed several important issues: 1) a clear definition of who constitutes a "stakeholder" needs to be developed when stakeholder input-gathering is being contemplated; 2) multi-faceted strategies need to be developed to gain input from stakeholders; 3) potential sources of bias can emerge from the various techniques used to gather feedback from stakeholders; and 4) a clear outline of how the feedback is to be used in the decision-making process needs to be determined.
Braun, Debra; Barnhardt, Kim
2014-01-01
Including end users in evidence-based design is vital to outcomes. The physical environment impacts caregiver efficiency, safety, satisfaction, and the quality of patient outcomes. End users are more than members of the organization: patients should have representation as well, as they bring value by offering insight from a different perspective. Timing is also key: to obtain desired outcomes, it is critical to include end users as early as possible, gaining the most insight into the design of the build. Consideration should also be given to best-practice standards, regulatory compliance, and progressive sciences and technologies. Another vital factor is educating end users on their role and the expectations for participation in a design team. When end users are educated and understand the significance of their input, the design team will be able to conceive a critical care unit that meets today's needs and can adapt to the needs of the future.
Theory of optimal information transmission in E. coli chemotaxis pathway
NASA Astrophysics Data System (ADS)
Micali, Gabriele; Endres, Robert G.
Bacteria live in complex microenvironments where they need to make critical decisions fast and reliably. These decisions are inherently affected by noise at all levels of the signaling pathway, and cells are often modeled as an input-output device that transmits extracellular stimuli (input) to internal proteins (channel), which determine the final behavior (output). Increasing the amount of transmitted information between input and output allows cells to better infer extracellular stimuli and respond accordingly. However, in contrast to electronic devices, the separation into input, channel, and output is not always clear in biological systems. Output might feed back into the input, and the channel, made by proteins, normally interacts with the input. Furthermore, a biological channel is affected by mutations and can change under evolutionary pressure. Here, we present a novel approach to maximize information transmission: given cell-external and internal noise, we analytically identify both input distributions and input-output relations that optimally transmit information. Using E. coli chemotaxis as an example, we conclude that its pathway is compatible with an optimal information transmission device despite the ultrasensitive rotary motors.
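Maximizing transmitted information over input distributions is, for a discrete channel, the channel-capacity problem, which the Blahut-Arimoto algorithm solves numerically. The sketch below is purely illustrative of that general idea; the abstract above describes an analytical treatment specific to the chemotaxis pathway:

```python
import numpy as np

def _kl_rows(P, q):
    """Kullback-Leibler divergence D(P(.|x) || q), in bits, per row of P."""
    with np.errstate(divide="ignore", invalid="ignore"):
        t = P * np.log2(P / q)
    return np.nansum(t, axis=1)   # 0 * log 0 terms drop out as NaN

def channel_capacity(P, n_iter=500):
    """Blahut-Arimoto: capacity (bits/use) of a discrete memoryless
    channel with transition matrix P[x, y] = P(y | x)."""
    nx = P.shape[0]
    p = np.full(nx, 1.0 / nx)     # start from a uniform input distribution
    for _ in range(n_iter):
        q = p @ P                 # induced output distribution
        d = _kl_rows(P, q)        # information carried by each input symbol
        c = 2.0 ** d
        p = p * c / (p @ c)       # re-weight input toward capacity-achieving
    return float(p @ _kl_rows(P, p @ P))
```

For a binary symmetric channel with error probability ε, this converges to the textbook value 1 − H(ε) bits per use.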
USDA-ARS?s Scientific Manuscript database
Livestock facilities have received numerous criticisms due to their emissions of odorous air and chemicals. Hence, there is a significant need for odor emission factors and identification of principle odorous chemicals. Odor emission factors are used as inputs to odor setback models, while chemica...
Shih, Peter; Kaul, Brian C; Jagannathan, Sarangapani; Drallmeier, James A
2009-10-01
A novel reinforcement-learning-based output adaptive neural network (NN) controller, also referred to as the adaptive-critic NN controller, is developed to deliver the desired tracking performance for a class of nonlinear discrete-time systems expressed in nonstrict feedback form in the presence of bounded and unknown disturbances. The adaptive-critic NN controller consists of an observer, a critic, and two action NNs. The observer estimates the states and output, and the two action NNs provide virtual and actual control inputs to the nonlinear discrete-time system. The critic approximates a certain strategic utility function, and the action NNs minimize the strategic utility function and control inputs. All NN weights adapt online toward minimization of a performance index, utilizing the gradient-descent-based rule, in contrast with iteration-based adaptive-critic schemes. Lyapunov functions are used to show the stability of the closed-loop tracking error, weights, and observer estimates. Separation and certainty equivalence principles, the persistency of excitation condition, and the linearity-in-the-unknown-parameters assumption are not needed. Experimental results on a spark ignition (SI) engine operating lean at an equivalence ratio of 0.75 show a significant (25%) reduction in cyclic dispersion in heat release with control, while the average fuel input changes by less than 1% compared with the uncontrolled case. Consequently, oxides of nitrogen (NO(x)) drop by 30%, and unburned hydrocarbons drop by 16% with control. Overall, NO(x) emissions are reduced by over 80% compared with stoichiometric levels.
Prins, Noeline W.; Sanchez, Justin C.; Prasad, Abhishek
2014-01-01
Brain-Machine Interfaces (BMIs) can be used to restore function in people living with paralysis. Current BMIs require extensive calibration, which increases set-up time, and external inputs for decoder training that may be difficult to produce in paralyzed individuals. Both factors have presented challenges in transitioning the technology from research environments to activities of daily living (ADL). For BMIs to be seamlessly used in ADL, these issues should be handled with minimal external input, reducing the need for a technician/caregiver to calibrate the system. Reinforcement Learning (RL) based BMIs are a good tool when there is no external training signal and can provide an adaptive modality for training BMI decoders. However, RL-based BMIs are sensitive to the feedback provided to adapt the BMI. In actor-critic BMIs, this feedback is provided by the critic, and the overall system performance is limited by the critic's accuracy. In this work, we developed an adaptive BMI that could handle inaccuracies in the critic feedback in an effort to produce more accurate RL-based BMIs. We developed a confidence measure, which indicated how appropriate the feedback is for updating the decoding parameters of the actor. The results show that with the new update formulation, the critic accuracy is no longer a limiting factor for the overall performance. We tested and validated the system on three different data sets: synthetic data generated by an Izhikevich neural spiking model, synthetic data with a Gaussian noise distribution, and data collected from a non-human primate engaged in a reaching task. All results indicated that the system with the critic confidence built in always outperformed the system without it. Results of this study suggest the potential application of the technique in developing an autonomous BMI that does not need an external signal for training or extensive calibration. PMID:24904257
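A critic-confidence scheme of the kind described can be illustrated with a toy value update whose step size is scaled by the confidence assigned to each piece of critic feedback. The update rule below is a hypothetical sketch to show the mechanism, not the paper's actual decoder-update formulation:

```python
def confidence_weighted_update(value, reward, lr, confidence):
    """One temporal-difference-style update whose effective step size is
    scaled by the confidence assigned to the critic's feedback."""
    return value + lr * confidence * (reward - value)

def train(rewards, confidences, lr=0.1, value=0.0):
    """Run the confidence-weighted update over a feedback sequence."""
    for r, c in zip(rewards, confidences):
        value = confidence_weighted_update(value, r, lr, c)
    return value
```

Low-confidence feedback moves the estimate less, so occasional wrong critic signals do less damage, at the cost of slower adaptation when confidence is uniformly low.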
Eric Rowell; E. Louise Loudermilk; Carl Seielstad; Joseph O' Brien
2016-01-01
Understanding fine-scale variability in understory fuels is increasingly important as physics-based fire behavior models drive needs for higher-resolution data. Describing fuelbeds in three dimensions is critical in determining the vertical and horizontal distributions of fuel elements and their mass, especially in frequently burned pine ecosystems where fine-scale...
The Learning-Paradigm Campus: From Single- to Double-Loop Learning
ERIC Educational Resources Information Center
Tagg, John
2010-01-01
Since the 1980s, advocates for change in higher education have called for double-loop learning. One of the main criticisms of the evaluation of colleges and universities was that they measured inputs rather than the outputs. Higher education now needs to apply the lessons of learning and change to campus leadership and organization.
Adaptation to sensory input tunes visual cortex to criticality
NASA Astrophysics Data System (ADS)
Shew, Woodrow L.; Clawson, Wesley P.; Pobst, Jeff; Karimipanah, Yahya; Wright, Nathaniel C.; Wessel, Ralf
2015-08-01
A long-standing hypothesis at the interface of physics and neuroscience is that neural networks self-organize to the critical point of a phase transition, thereby optimizing aspects of sensory information processing. This idea is partially supported by strong evidence for critical dynamics observed in the cerebral cortex, but the impact of sensory input on these dynamics is largely unknown. Thus, the foundations of this hypothesis--the self-organization process and how it manifests during strong sensory input--remain unstudied experimentally. Here we show in visual cortex and in a computational model that strong sensory input initially elicits cortical network dynamics that are not critical, but adaptive changes in the network rapidly tune the system to criticality. This conclusion is based on observations of multifaceted scaling laws predicted to occur at criticality. Our findings establish sensory adaptation as a self-organizing mechanism that maintains criticality in visual cortex during sensory information processing.
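Scaling laws at criticality are commonly checked by fitting a power-law exponent to avalanche statistics. A sketch of the standard maximum-likelihood estimator (Clauset-style, continuous case), verified on synthetic samples — illustrative only, not this study's analysis pipeline:

```python
import numpy as np

def powerlaw_mle(samples, s_min=1.0):
    """Maximum-likelihood exponent alpha for a continuous power law
    P(s) ~ s^(-alpha), s >= s_min."""
    s = np.asarray(samples, dtype=float)
    s = s[s >= s_min]
    return 1.0 + s.size / np.sum(np.log(s / s_min))

def sample_powerlaw(alpha, n, s_min=1.0, rng=None):
    """Draw n power-law samples via inverse-transform sampling."""
    rng = rng or np.random.default_rng()
    u = rng.random(n)
    return s_min * (1.0 - u) ** (-1.0 / (alpha - 1.0))
```

For neuronal avalanches, an exponent near 3/2 for avalanche sizes is one of the signatures expected at criticality; deviations under strong sensory input would indicate departure from the critical point.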
Physician input: a critical strategic-planning tool.
Rovinsky, Michael
2002-01-01
To establish effective working relationships with medical staff and community physicians, an IDS must adopt a strategic-planning approach that adequately incorporates physicians' needs and expectations. Research shows that most physicians considered the IDS's market position, the degree to which the IDS can offer physicians practice-enhancing capabilities, and physician involvement in IDS governance to be critical factors for the success of an IDS. By establishing a meaningful role for physicians in the organizational strategic-planning process, an IDS can significantly improve its market position and its relationships with physicians.
Simulation models in population breast cancer screening: A systematic review.
Koleva-Kolarova, Rositsa G; Zhan, Zhuozhao; Greuter, Marcel J W; Feenstra, Talitha L; De Bock, Geertruida H
2015-08-01
The aim of this review was to critically evaluate published simulation models for breast cancer screening of the general population and provide a direction for future modeling. A systematic literature search was performed to identify simulation models with more than one application. A framework for qualitative assessment was developed that incorporated model type; input parameters; modeling approach; transparency of input data sources/assumptions; sensitivity analyses and risk of bias; validation; and outcomes. Predicted mortality reduction (MR) and cost-effectiveness (CE) were compared to estimates from meta-analyses of randomized controlled trials (RCTs) and acceptability thresholds. Seven original simulation models were distinguished, all sharing common input parameters. The modeling approach was based on tumor progression (except in one model), with internal and cross validation of the resulting models, but without any external validation. Differences in lead times for invasive or non-invasive tumors, and the option for cancers not to progress, were not explicitly modeled. The models tended to overestimate the MR due to screening (11-24%) compared with the 10% MR (95% CI: -2% to 21%) from optimal RCTs. Only recently have potential harms due to regular breast cancer screening been reported. Most scenarios resulted in acceptable cost-effectiveness estimates given current thresholds. The selected models have been repeatedly applied in various settings to inform decision making, and the critical analysis revealed high risk of bias in their outcomes. Given the importance of the models, there is a need for externally validated models that use systematic evidence for input data to allow for more critical evaluation of breast cancer screening.
Felix II, Richard A.; Gourévitch, Boris; Gómez-Álvarez, Marcelo; Leijon, Sara C. M.; Saldaña, Enrique; Magnusson, Anna K.
2017-01-01
Auditory streaming enables perception and interpretation of complex acoustic environments that contain competing sound sources. At early stages of central processing, sounds are segregated into separate streams representing attributes that later merge into acoustic objects. Streaming of temporal cues is critical for perceiving vocal communication, such as human speech, but our understanding of circuits that underlie this process is lacking, particularly at subcortical levels. The superior paraolivary nucleus (SPON), a prominent group of inhibitory neurons in the mammalian brainstem, has been implicated in processing temporal information needed for the segmentation of ongoing complex sounds into discrete events. The SPON requires temporally precise and robust excitatory input(s) to convey information about the steep rise in sound amplitude that marks the onset of voiced sound elements. Unfortunately, the sources of excitation to the SPON and the impact of these inputs on the behavior of SPON neurons have yet to be resolved. Using anatomical tract tracing and immunohistochemistry, we identified octopus cells in the contralateral cochlear nucleus (CN) as the primary source of excitatory input to the SPON. Cluster analysis of miniature excitatory events also indicated that the majority of SPON neurons receive one type of excitatory input. Precise octopus cell-driven onset spiking coupled with transient offset spiking make SPON responses well-suited to signal transitions in sound energy contained in vocalizations. Targets of octopus cell projections, including the SPON, are strongly implicated in the processing of temporal sound features, which suggests a common pathway that conveys information critical for perception of complex natural sounds. PMID:28620283
Critical Need for Radiation Damage Tools for Space Missions
NASA Astrophysics Data System (ADS)
Tripathi, Ram
2005-04-01
NASA has a new vision for space exploration in the 21st century encompassing a broad range of human and robotic missions, including missions to the Moon, Mars, and beyond. As a result, there is a focus on long-duration space missions. NASA, as much as ever, is committed to the safety of the missions and the crew. Exposure to the hazards of severe space radiation in deep-space, long-duration missions is "the show stopper." Thus, protection from the hazards of severe space radiation is of paramount importance for the new vision. There is an overwhelming emphasis on reliability issues for the mission and the habitat. Accurate risk assessments critically depend on the accuracy of the input information about the interaction of ions with materials, electronics, and tissues. A huge amount of essential experimental information, for all the ions in space across the periodic table and for a wide range of energies spanning many orders of magnitude (up to a trillion-fold), is needed for radiation protection engineering for space missions; this information is simply not available (due to the high costs) and probably never will be. Therefore, there is a compelling need to develop reliable, accurate models of nuclear reactions and structures that form the basic input ingredients. State-of-the-art nuclear cross-section models have been developed at the NASA Langley Research Center; however, a considerable number of tools still need to be developed to alleviate the situation. The vital role and importance of nuclear physics for space missions will be discussed.
Pattern Generator for Bench Test of Digital Boards
NASA Technical Reports Server (NTRS)
Berkun, Andrew C.; Chu, Anhua J.
2012-01-01
All efforts to develop electronic equipment reach a stage where a bench test station is needed for each board. The SMAP digital system consists of three board types that interact with each other using interfaces with critical timing. Each board needs to be tested individually before being combined into the integrated digital electronics system, and each board needs critical timing signals from the others to be able to operate. A bench test system was developed to support testing of each board. The test system produces all the outputs of the control and timing unit, and was delivered much earlier than the timing unit. Timing signals are treated as data: a large file is generated containing the state of every timing signal at each instant. This file is streamed out to an I/O card, which is wired directly to the device-under-test (DUT) input pins. This provides a flexible test environment that can be adapted to any of the boards that must be tested in a standalone configuration. The problem of generating the critical timing signals is thereby transferred from a hardware problem to a software problem, where it is more easily dealt with.
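Treating timing signals as data amounts to precomputing each signal's state at every tick and streaming the resulting table to the I/O card. A hypothetical sketch of building such a table — the signal names and timing parameters below are invented, not the SMAP signals:

```python
def build_pattern(n_ticks, signals):
    """Build a per-tick state table for a set of periodic timing signals.

    `signals` maps a signal name to (period, duty_ticks): the signal is
    high for the first `duty_ticks` of every `period` ticks. Returns one
    integer bitmask per tick, where bit i is the state of the i-th signal.
    """
    names = list(signals)
    pattern = []
    for t in range(n_ticks):
        word = 0
        for bit, name in enumerate(names):
            period, duty = signals[name]
            if t % period < duty:
                word |= 1 << bit
        pattern.append(word)
    return pattern
```

The resulting word list would then be written to a binary file and streamed to the output card at the tick rate, so changing a board's timing needs only a regenerated file, not new hardware.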
Tiwari, Vikram; Kumar, Avinash B
2018-01-01
The current system of summative multi-rater evaluations and standardized tests to determine readiness to graduate from critical care fellowships has limitations. We sought to pilot the use of data envelopment analysis (DEA) to assess what aspects of the fellowship program contribute the most to an individual fellow's success. DEA is a nonparametric, operations research technique that uses linear programming to determine the technical efficiency of an entity based on its relative usage of resources in producing the outcome. Retrospective cohort study. Critical care fellows (n = 15) in an Accreditation Council for Graduate Medical Education (ACGME) accredited fellowship at a major academic medical center in the United States. After obtaining institutional review board approval for this retrospective study, we analyzed the data of 15 anesthesiology critical care fellows from academic years 2013-2015. The input-oriented DEA model develops a composite score for each fellow based on multiple inputs and outputs. The inputs included the didactic sessions attended, the ratio of clinical duty works hours to the procedures performed (work intensity index), and the outputs were the Multidisciplinary Critical Care Knowledge Assessment Program (MCCKAP) score and summative evaluations of fellows. A DEA efficiency score that ranged from 0 to 1 was generated for each of the fellows. Five fellows were rated as DEA efficient, and 10 fellows were characterized in the DEA inefficient group. The model was able to forecast the level of effort needed for each inefficient fellow, to achieve similar outputs as their best performing peers. The model also identified the work intensity index as the key element that characterized the best performers in our fellowship. DEA is a feasible method of objectively evaluating peer performance in a critical care fellowship beyond summative evaluations alone and can potentially be a powerful tool to guide individual performance during the fellowship.
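The input-oriented DEA score described above solves one small linear program per decision-making unit: minimize the input-contraction factor theta such that a weighted combination of peers produces at least the unit's outputs using at most theta times its inputs. A sketch using scipy; the inputs and outputs here are generic placeholders, not the study's exact measures (didactics attended, work intensity index, MCCKAP score, evaluations):

```python
import numpy as np
from scipy.optimize import linprog

def dea_input_oriented(X, Y):
    """CCR input-oriented DEA efficiency for each decision-making unit.

    X: (n_units, n_inputs), Y: (n_units, n_outputs). For unit o,
    minimize theta subject to
        sum_j lam_j * x_j <= theta * x_o   (each input)
        sum_j lam_j * y_j >= y_o           (each output)
        lam >= 0.
    """
    n, m = X.shape
    _, r = Y.shape
    scores = []
    for o in range(n):
        # decision variables: [theta, lam_1, ..., lam_n]
        c = np.zeros(n + 1)
        c[0] = 1.0
        A_in = np.hstack([-X[o:o + 1].T, X.T])         # lam@x_i - theta*x_io <= 0
        A_out = np.hstack([np.zeros((r, 1)), -Y.T])    # -lam@y_r <= -y_ro
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.concatenate([np.zeros(m), -Y[o]])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
        scores.append(res.x[0])
    return np.array(scores)
```

A score of 1 marks a unit on the efficient frontier; a score of 0.5 means the same outputs could in principle be produced with half the inputs, which is how the model "forecasts the level of effort needed" for inefficient fellows.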
Limits on negative information in language input.
Morgan, J L; Travis, L L
1989-10-01
Hirsh-Pasek, Treiman & Schneiderman (1984) and Demetras, Post & Snow (1986) have recently suggested that certain types of parental repetitions and clarification questions may provide children with subtle cues to their grammatical errors. We further investigated this possibility by examining parental responses to inflectional over-regularizations and wh-question auxiliary-verb omission errors in the sets of transcripts from Adam, Eve and Sarah (Brown 1973). These errors were chosen because they are exemplars of overgeneralization, the type of mistake for which negative information is, in theory, most critically needed. Expansions and Clarification Questions occurred more often following ill-formed utterances in Adam's and Eve's input, but not in Sarah's. However, these corrective responses formed only a small proportion of all adult responses following Adam's and Eve's grammatical errors. Moreover, corrective responses appear to drop out of children's input while they continue to make overgeneralization errors. Whereas negative feedback may occasionally be available, in the light of these findings the contention that language input generally incorporates negative information appears to be unfounded.
A Stochastic Simulator of a Blood Product Donation Environment with Demand Spikes and Supply Shocks
An, Ming-Wen; Reich, Nicholas G.; Crawford, Stephen O.; Brookmeyer, Ron; Louis, Thomas A.; Nelson, Kenrad E.
2011-01-01
The availability of an adequate blood supply is a critical public health need. An influenza epidemic or another crisis affecting population mobility could create a critical donor shortage, which could profoundly impact blood availability. We developed a simulation model for the blood supply environment in the United States to assess the likely impact on blood availability of factors such as an epidemic. We developed a simulator of a multi-state model with transitions among states. Weekly numbers of blood units donated and needed were generated by negative binomial stochastic processes. The simulator allows exploration of the blood system under certain conditions of supply and demand rates, and can be used for planning purposes to prepare for sudden changes in the public's health. The simulator incorporates three donor groups (first-time, sporadic, and regular), immigration and emigration, deferral period, and adjustment factors for recruitment. We illustrate possible uses of the simulator by specifying input values for an 8-week flu epidemic, resulting in a moderate supply shock and demand spike (for example, from postponed elective surgeries), and different recruitment strategies. The input values are based in part on data from a regional blood center of the American Red Cross during 1996-2005. Our results from these scenarios suggest that the key to alleviating deficit effects of a system shock may be appropriate timing and duration of recruitment efforts, in turn depending critically on anticipating shocks and rapidly implementing recruitment efforts. PMID:21814550
A stochastic simulator of a blood product donation environment with demand spikes and supply shocks.
An, Ming-Wen; Reich, Nicholas G; Crawford, Stephen O; Brookmeyer, Ron; Louis, Thomas A; Nelson, Kenrad E
2011-01-01
The availability of an adequate blood supply is a critical public health need. An influenza epidemic or another crisis affecting population mobility could create a critical donor shortage, which could profoundly impact blood availability. We developed a simulation model for the blood supply environment in the United States to assess the likely impact on blood availability of factors such as an epidemic. We developed a simulator of a multi-state model with transitions among states. Weekly numbers of blood units donated and needed were generated by negative binomial stochastic processes. The simulator allows exploration of the blood system under certain conditions of supply and demand rates, and can be used for planning purposes to prepare for sudden changes in the public's health. The simulator incorporates three donor groups (first-time, sporadic, and regular), immigration and emigration, deferral period, and adjustment factors for recruitment. We illustrate possible uses of the simulator by specifying input values for an 8-week flu epidemic, resulting in a moderate supply shock and demand spike (for example, from postponed elective surgeries), and different recruitment strategies. The input values are based in part on data from a regional blood center of the American Red Cross during 1996-2005. Our results from these scenarios suggest that the key to alleviating deficit effects of a system shock may be appropriate timing and duration of recruitment efforts, in turn depending critically on anticipating shocks and rapidly implementing recruitment efforts.
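The simulator's core mechanism, weekly negative binomial draws for donated and needed units with an epidemic window, can be sketched as follows. All parameter values, the mean/dispersion parameterization, and the simple carry-over inventory rule are illustrative assumptions, not the calibrated model:

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_inventory(weeks=52, supply_mean=1000.0, demand_mean=950.0,
                       dispersion=10.0, shock_start=20, shock_len=8,
                       supply_drop=0.7, demand_rise=1.1):
    """Weekly blood inventory under a supply shock and demand spike.
    Draws are NegBin with mean mu and dispersion r, i.e. p = r / (r + mu)."""
    stock, inventory = 0.0, np.zeros(weeks)
    for t in range(weeks):
        s_mu, d_mu = supply_mean, demand_mean
        if shock_start <= t < shock_start + shock_len:  # epidemic window
            s_mu *= supply_drop   # donors stay home
            d_mu *= demand_rise   # e.g. postponed elective surgeries return
        supply = rng.negative_binomial(dispersion, dispersion / (dispersion + s_mu))
        demand = rng.negative_binomial(dispersion, dispersion / (dispersion + d_mu))
        stock = max(stock + supply - demand, 0.0)  # cannot transfuse units you lack
        inventory[t] = stock
    return inventory

inv = simulate_inventory()
```

Recruitment strategies can then be explored by raising `supply_mean` for chosen weeks and comparing the depth and duration of the post-shock deficit.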
Participatory Design, User Involvement and Health IT Evaluation.
Kushniruk, Andre; Nøhr, Christian
2016-01-01
End user involvement and input into the design and evaluation of information systems has been recognized as being a critical success factor in the adoption of information systems. Nowhere is this need more critical than in the design of health information systems. Consistent with evidence from the general software engineering literature, the degree of user input into design of complex systems has been identified as one of the most important factors in the success or failure of complex information systems. The participatory approach goes beyond user-centered design and co-operative design approaches to include end users as more active participants in design ideas and decision making. Proponents of participatory approaches argue for greater end user participation in both design and evaluative processes. Evidence regarding the effectiveness of increased user involvement in design is explored in this contribution in the context of health IT. The contribution will discuss several approaches to including users in design and evaluation. Challenges in IT evaluation during participatory design will be described and explored along with several case studies.
Improving outcome of sensorimotor functions after traumatic spinal cord injury.
Dietz, Volker
2016-01-01
In the rehabilitation of a patient suffering a spinal cord injury (SCI), the exploitation of neuroplasticity is well established. It can be facilitated through the training of functional movements with technical assistance as needed and can improve outcome after an SCI. The success of such training in individuals with incomplete SCI critically depends on the presence of physiological proprioceptive input to the spinal cord leading to meaningful muscle activations during movement performances. Some actual preclinical approaches to restore function by compensating for the loss of descending input to spinal networks following complete/incomplete SCI are critically discussed in this report. Electrical and pharmacological stimulation of spinal neural networks is still in the experimental stage, and despite promising repair studies in animal models, translations to humans up to now have not been convincing. It is possible that a combination of techniques targeting the promotion of axonal regeneration is necessary to advance the restoration of function. In the future, refinement of animal models according to clinical conditions and requirements may contribute to greater translational success.
Horizontal and vertical integration in hospital laboratories and the laboratory information system.
Friedman, B A; Mitchell, W
1990-09-01
An understanding of horizontal and vertical integration and their quasi-integration variants is important for pathologists to formulate a competitive strategy for hospital clinical laboratories. These basic organizational concepts, in turn, are based on the need to establish control over critical laboratory inputs and outputs. The pathologist seeks greater control of mission-critical system inputs and outputs to increase the quality and efficiency of the laboratory operations. The LIS produces horizontal integration of the various hospital laboratories by integrating them vertically. Forward vertical quasi-integration of the laboratories is mediated primarily by the LIS through front-end value-added features such as reporting of results and creating a long-term on-line test result archive. These features increase the value of the information product of pathology for clinicians and increase the cost of switching to another system. The LIS can also serve as a means for customizing the information product of the laboratories to appeal to new market segments such as hospital administrators.
A Validation Metrics Framework for Safety-Critical Software-Intensive Systems
2009-03-01
so does its definition, tools, and techniques, including means for measuring the validation activity, its outputs, and impact on development... independent of the SDLP. When considering the above SDLPs from the safety engineering team's perspective, there are also large impacts on the way... impact. Interpretation of any actionable metric data will need to be undertaken in the context of the SDLP. 2. Safety Input The software safety
Simpson, Kathleen Rice; Lyndon, Audrey; Wilson, Jane; Ruhl, Catherine
2012-01-01
Objective To solicit input from registered nurse members of the Association of Women’s Health, Obstetric and Neonatal Nurses (AWHONN) on critical considerations for review and revision of existing nurse staffing guidelines. Design Thematic analysis of responses to a cross-sectional on-line survey question: “Please give the staffing task force your input on what they should consider in the development of recommendations for staffing of perinatal units.” Participants N = 884 AWHONN members. Main Outcome Measure Descriptions of staffing concerns that should be considered when evaluating and revising existing perinatal nurse staffing guidelines. Results Consistent themes identified included the need for revision of nurse staffing guidelines due to requirements for safe care, increases in patient acuity and complexity, invisibility of the fetus and newborn as separate and distinct patients, difficulties in providing comprehensive care during labor and for mother-baby couplets under current conditions, challenges in staffing small volume units, and the negative effect of inadequate staffing on nurse satisfaction and retention. Conclusion Participants overwhelmingly indicated current nurse staffing guidelines were inadequate to meet the needs of contemporary perinatal clinical practice and required revision based on significant changes that had occurred since 1983 when the original staffing guidelines were published. PMID:22690743
Neonatal and pediatric regionalized systems in pediatric emergency mass critical care
Barfield, Wanda D.; Krug, Steven E.; Kanter, Robert K.; Gausche-Hill, Marianne; Brantley, Mary D.; Chung, Sarita; Kissoon, Niranjan
2015-01-01
Introduction Improved health outcomes are associated with neonatal and pediatric critical care in well-organized, cohesive, regionalized systems that are prepared to support and rehabilitate critically ill victims of a mass casualty event. However, present systems lack adequate surge capacity for neonatal and pediatric mass critical care. In this document, we outline the present reality and suggest alternative approaches. Methods In May 2008, the Task Force for Mass Critical Care published guidance on provision of mass critical care to adults. Acknowledging that the critical care needs of children during disasters were unaddressed by this effort, a 17-member Steering Committee, assembled by the Oak Ridge Institute for Science and Education with guidance from members of the American Academy of Pediatrics, convened in April 2009 to determine priority topic areas for pediatric emergency mass critical care recommendations. Steering Committee members established subcommittees by topic area and performed literature reviews of MEDLINE and Ovid databases. The Steering Committee produced draft outlines through consensus-based study of the literature and convened October 6–7, 2009, in New York, NY, to review and revise each outline. Eight draft documents were subsequently developed from the revised outlines as well as through searches of MEDLINE updated through March 2010. The Pediatric Emergency Mass Critical Care Task Force, composed of 36 experts from diverse public health, medical, and disaster response fields, convened in Atlanta, GA, on March 29–30, 2010. Feedback on each manuscript was compiled and the Steering Committee revised each document to reflect expert input in addition to the most current medical literature. Task Force Recommendations States and regions (facilitated by federal partners) should review current emergency operations and devise appropriate plans to address the population-based needs of infants and children in large-scale disasters. 
Action at the state, regional, and federal levels should address legal, operational, and information systems to provide effective pediatric mass critical care through: 1) predisaster/mass casualty planning, management, and assessment with input from child health professionals; 2) close cooperation, agreements, public-private partnerships, and unique delivery systems; and 3) use of existing public health data to assess pediatric populations at risk and to model graded response plans based on increasing patient volume and acuity. PMID:22067921
Integrating Bioethics into Clinical and Translational Science Research: A Roadmap
Shapiro, Robyn S.; Layde, Peter M.
2008-01-01
Abstract Recent initiatives to improve human health emphasize the need to effectively and appropriately translate new knowledge gleaned from basic biomedical and behavioral research to clinical and community application. To maximize the beneficial impact of scientific advances in clinical practice and community health, and to guard against potential deleterious medical and societal consequences of such advances, incorporation of bioethics at each stage of clinical and translational science research is essential. At the earliest stage, bioethics input is critical to address issues such as whether to limit certain areas of scientific inquiry. Subsequently, bioethics input is important to assure not only that human subjects trials are conducted and reported responsibly, but also that results are incorporated into clinical and community practices in a way that promotes and protects bioethical principles. At the final stage of clinical and translational science research, bioethics helps to identify the need and approach for refining clinical practices when safety or other concerns arise. The framework we present depicts how bioethics interfaces with each stage of clinical and translational science research, and suggests an important research agenda for systematically and comprehensively assuring bioethics input into clinical and translational science initiatives. PMID:20443821
Mall, David; Larsen, Ashley E; Martin, Emily A
2018-01-05
Transforming modern agriculture towards both higher yields and greater sustainability is critical for preserving biodiversity in an increasingly populous and variable world. However, the intensity of agricultural practices varies strongly between crop systems. Given limited research capacity, it is crucial to focus efforts to increase sustainability in the crop systems that need it most. In this study, we investigate the match (or mismatch) between the intensity of pesticide use and the availability of knowledge on the ecosystem service of natural pest control across various crop systems. Using a systematic literature search on pest control and publicly available pesticide data, we find that pest control literature is not more abundant in crops where insecticide input per hectare is highest. Instead, pest control literature is most abundant, with the highest number of studies published, in crops with comparatively low insecticide input per hectare but with high world harvested area. These results suggest that a major increase of interest in agroecological research towards crops with high insecticide input, particularly cotton and horticultural crops such as citrus and high value-added vegetables, would help meet knowledge needs for a timely ecointensification of agriculture.
Larsen, Ashley E.
2018-01-01
Transforming modern agriculture towards both higher yields and greater sustainability is critical for preserving biodiversity in an increasingly populous and variable world. However, the intensity of agricultural practices varies strongly between crop systems. Given limited research capacity, it is crucial to focus efforts to increase sustainability in the crop systems that need it most. In this study, we investigate the match (or mismatch) between the intensity of pesticide use and the availability of knowledge on the ecosystem service of natural pest control across various crop systems. Using a systematic literature search on pest control and publicly available pesticide data, we find that pest control literature is not more abundant in crops where insecticide input per hectare is highest. Instead, pest control literature is most abundant, with the highest number of studies published, in crops with comparatively low insecticide input per hectare but with high world harvested area. These results suggest that a major increase of interest in agroecological research towards crops with high insecticide input, particularly cotton and horticultural crops such as citrus and high value-added vegetables, would help meet knowledge needs for a timely ecointensification of agriculture. PMID:29304005
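The match/mismatch analysis in this study can be sketched by joining per-crop study counts with insecticide intensity and harvested area. The crop figures below are invented placeholders, not the paper's literature or pesticide data:

```python
import pandas as pd

# Invented placeholder figures, not the study's data.
df = pd.DataFrame({
    "crop": ["cotton", "citrus", "maize", "wheat", "rice"],
    "insecticide_kg_per_ha": [4.5, 3.8, 0.6, 0.3, 1.2],
    "harvested_Mha": [32, 9, 190, 220, 160],
    "pest_control_studies": [45, 20, 310, 280, 240],
})

# A mismatch shows up as a negative rank correlation between insecticide
# intensity and research attention ...
rho = df["insecticide_kg_per_ha"].corr(df["pest_control_studies"],
                                       method="spearman")
# ... while study counts instead track world harvested area.
rho_area = df["harvested_Mha"].corr(df["pest_control_studies"],
                                    method="spearman")
```

With these toy numbers the high-input crops (cotton, citrus) attract the fewest studies, reproducing the pattern the abstract describes.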
Experimental Criticality Benchmarks for SNAP 10A/2 Reactor Cores
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krass, A.W.
2005-12-19
This report describes computational benchmark models for nuclear criticality derived from descriptions of the Systems for Nuclear Auxiliary Power (SNAP) Critical Assembly (SCA)-4B experimental criticality program conducted by Atomics International during the early 1960's. The selected experimental configurations consist of fueled SNAP 10A/2-type reactor cores subject to varied conditions of water immersion and reflection under experimental control to measure neutron multiplication. SNAP 10A/2-type reactor cores are compact volumes fueled and moderated with the hydride of highly enriched uranium-zirconium alloy. Specifications for the materials and geometry needed to describe a given experimental configuration for a model using MCNP5 are provided. The material and geometry specifications are adequate to permit user development of input for alternative nuclear safety codes, such as KENO. A total of 73 distinct experimental configurations are described.
Does the vestibular system contribute to head direction cell activity in the rat?
NASA Technical Reports Server (NTRS)
Brown, J. E.; Yates, B. J.; Taube, J. S.; Oman, C. M. (Principal Investigator)
2002-01-01
Head direction cells (HDC) located in several regions of the brain, including the anterior dorsal nucleus of the thalamus (ADN), postsubiculum (PoS), and lateral mammillary nuclei (LMN), provide the neural substrate for the determination of head direction. Although activity of HDC is influenced by various sensory signals and internally generated cues, lesion studies and some anatomical and physiological evidence suggest that vestibular inputs are critical for the maintenance of directional sensitivity of these cells. However, vestibular inputs must be transformed considerably in order to signal head direction, and the neuronal circuitry that accomplishes this signal processing has not been fully established. Furthermore, it is unclear why the removal of vestibular inputs abolishes the directional sensitivity of HDC, as visual and other sensory inputs and motor feedback signals strongly affect the firing of these neurons and would be expected to maintain their directional-related activity. Further physiological studies will be required to establish the role of the vestibular system in producing HDC responses, and anatomical studies are needed to determine the neural circuitry that mediates vestibular influences on determination of head direction.
Criticality meets learning: Criticality signatures in a self-organizing recurrent neural network
Del Papa, Bruno; Priesemann, Viola
2017-01-01
Many experiments have suggested that the brain operates close to a critical state, based on signatures of criticality such as power-law distributed neuronal avalanches. In neural network models, criticality is a dynamical state that maximizes information processing capacities, e.g. sensitivity to input, dynamical range and storage capacity, which makes it a favorable candidate state for brain function. Although models that self-organize towards a critical state have been proposed, the relation between criticality signatures and learning is still unclear. Here, we investigate signatures of criticality in a self-organizing recurrent neural network (SORN). Investigating criticality in the SORN is of particular interest because it has not been developed to show criticality. Instead, the SORN has been shown to exhibit spatio-temporal pattern learning through a combination of neural plasticity mechanisms and it reproduces a number of biological findings on neural variability and the statistics and fluctuations of synaptic efficacies. We show that, after a transient, the SORN spontaneously self-organizes into a dynamical state that shows criticality signatures comparable to those found in experiments. The plasticity mechanisms are necessary to attain that dynamical state, but not to maintain it. Furthermore, onset of external input transiently changes the slope of the avalanche distributions – matching recent experimental findings. Interestingly, the membrane noise level necessary for the occurrence of the criticality signatures reduces the model’s performance in simple learning tasks. Overall, our work shows that the biologically inspired plasticity and homeostasis mechanisms responsible for the SORN’s spatio-temporal learning abilities can give rise to criticality signatures in its activity when driven by random input, but these break down under the structured input of short repeating sequences. PMID:28552964
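The criticality signature discussed above, power-law distributed avalanche sizes, can be probed with a few lines of analysis code. The thresholding convention, the continuous MLE approximation for the exponent, and the Poisson stand-in for network activity are simplifying assumptions:

```python
import numpy as np

def avalanche_sizes(activity, theta=0):
    """Sizes of contiguous supra-threshold epochs in a binned activity trace."""
    sizes, current = [], 0
    for a in activity:
        if a > theta:
            current += a           # avalanche continues; accumulate spikes
        elif current:
            sizes.append(current)  # a silent bin ends the avalanche
            current = 0
    if current:
        sizes.append(current)
    return np.array(sizes)

def powerlaw_alpha(sizes, s_min=1):
    """Hill/Clauset-style MLE for the exponent of P(s) ~ s^-alpha."""
    s = sizes[sizes >= s_min].astype(float)
    return 1.0 + len(s) / np.sum(np.log(s / (s_min - 0.5)))

rng = np.random.default_rng(0)
activity = rng.poisson(0.8, size=10_000)  # toy stand-in for SORN spike counts
sizes = avalanche_sizes(activity)
alpha = powerlaw_alpha(sizes)
```

On uncorrelated Poisson input the fitted exponent carries no meaning; the classic criticality benchmark is a size exponent near 1.5 on genuinely avalanching data, which a goodness-of-fit test (not shown) would need to confirm.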
Multi-flux-transformer MRI detection with an atomic magnetometer.
Savukov, Igor; Karaulanov, Todor
2014-12-01
Recently, anatomical ultra-low field (ULF) MRI has been demonstrated with an atomic magnetometer (AM). A flux-transformer (FT) has been used for decoupling MRI fields and gradients to avoid their negative effects on AM performance. The field of view (FOV) was limited because of the need to compromise between the size of the FT input coil and MRI sensitivity per voxel. Multi-channel acquisition is a well-known solution to increase FOV without significantly reducing sensitivity. In this paper, we demonstrate twofold FOV increase with the use of three FT input coils. We also show that it is possible to use a single atomic magnetometer and single acquisition channel to acquire three independent MRI signals by applying a frequency-encoding gradient along the direction of the detection array span. The approach can be generalized to more channels and can be critical for imaging applications of non-cryogenic ULF MRI where FOV needs to be large, including head, hand, spine, and whole-body imaging. Copyright © 2014 Elsevier Inc. All rights reserved.
Multi-flux-transformer MRI detection with an atomic magnetometer
Savukov, Igor; Karaulanov, Todor
2014-01-01
Recently, anatomical ultra-low field (ULF) MRI has been demonstrated with an atomic magnetometer (AM). A flux-transformer (FT) has been used for decoupling MRI fields and gradients to avoid their negative effects on AM performance. The field of view (FOV) was limited because of the need to compromise between the size of the FT input coil and MRI sensitivity per voxel. Multi-channel acquisition is a well-known solution to increase FOV without significantly reducing sensitivity. In this paper, we demonstrate two-fold FOV increase with the use of three FT input coils. We also show that it is possible to use a single atomic magnetometer and single acquisition channel to acquire three independent MRI signals by applying a frequency-encoding gradient along the direction of the detection array span. The approach can be generalized to more channels and can be critical for imaging applications of non-cryogenic ULF MRI where FOV needs to be large, including head, hand, spine, and whole-body imaging. PMID:25462946
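The single-channel readout of three FT coils rests on the frequency-encoding gradient giving each coil position its own frequency offset, after which the three signals separate in the spectrum. The sketch below uses pure test tones and hypothetical offsets, not a real MRI signal model:

```python
import numpy as np

fs, n = 10_000.0, 10_000            # sampling rate (Hz) and samples: a 1 s window
t = np.arange(n) / fs
# The gradient along the detection-array span maps each FT input coil to a
# distinct frequency offset (values here are hypothetical).
offsets = [1000.0, 2000.0, 3000.0]  # Hz
amplitudes = [1.0, 0.5, 0.8]        # per-coil signal strengths
combined = sum(a * np.sin(2 * np.pi * f * t)
               for a, f in zip(amplitudes, offsets))  # one acquisition channel

# Separate the coil signals by reading their frequency bins
spectrum = np.abs(np.fft.rfft(combined)) / (n / 2)
freqs = np.fft.rfftfreq(n, 1 / fs)
recovered = [spectrum[np.argmin(np.abs(freqs - f))] for f in offsets]
```

Each coil's amplitude is recovered from its own bin, which is why one magnetometer and one acquisition channel suffice for several independent signals.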
In-service health monitoring of composite structures
NASA Technical Reports Server (NTRS)
Pinto, Gino A.; Ventres, C. S.; Ginty, Carol A.; Chamis, Christos C.
1990-01-01
The aerospace industry is witnessing a vast utilization of composites in critical structural applications and anticipates even more use of them in future aircraft. Therefore, a definite need exists for a composite health monitoring expert system to meet today's needs and tomorrow's demands. The primary goal for this conceptual health monitoring system is to function reliably in-service in the environments of various composite structures. The underlying philosophy of this system is to utilize proven vibration techniques to assess the structural integrity of a fibrous composite. Statistical methods are used to determine if the variances in the measured data are acceptable for making a reliable decision on the health status of the composite. The flexible system allows for algorithms describing any composite fatigue or damage behavior characteristic to be provided as an input to the system. Alert thresholds and variances can also be provided as an input to this system and may be updated to allow for future changes/refinements in the composite's structural integrity behavior.
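The statistical decision step, deciding whether measured vibration data are trustworthy and whether they indicate damage, can be sketched as below. The single-mode frequency test and the threshold values are illustrative assumptions, not the expert system's actual algorithms:

```python
import statistics

def assess_mode(baseline_hz, measurements, alert_drop=0.05, max_cv=0.02):
    """Classify one vibration mode from repeated frequency measurements.
    Too much scatter -> data unreliable for a decision; a mean shift past
    the alert threshold -> possible damage (stiffness loss lowers modal
    frequencies); otherwise healthy."""
    mean = statistics.fmean(measurements)
    cv = statistics.stdev(measurements) / mean  # coefficient of variation
    if cv > max_cv:
        return "unreliable"
    if (baseline_hz - mean) / baseline_hz > alert_drop:
        return "alert"
    return "ok"

status = assess_mode(baseline_hz=120.0, measurements=[113.1, 112.8, 113.4])
```

Here `alert_drop` and `max_cv` play the role of the abstract's updatable alert thresholds and acceptable variances.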
Wang, Lijuan; Zhao, He; Robinson, Brian E.
2017-01-01
With the increases of cropland area and fertilizer nitrogen (N) application rate, general N balance characteristics in regional agroecosystems have been widely documented. However, few studies have quantitatively analyzed the drivers of spatial changes in the N budget. We constructed a mass balance model of the N budget at the soil surface using a database of county-level agricultural statistics to analyze N input, output, and proportional contribution of various factors to the overall N input changes in croplands during 2000–2010 in the Yangtze River Basin, the largest basin and the main agricultural production region in China. Over the period investigated, N input increased by 9%. Of this increase, 87% was from fertilizer N input. In the upper and middle reaches of the basin, the increased synthetic fertilizer N application rate accounted for 84% and 76% of the N input increase, respectively, mainly due to increased N input in the cropland that previously had low synthetic fertilizer N application rate. In lower reaches of the basin, mainly due to urbanization, the decrease in cropland area and synthetic fertilizer N application rate nearly equally contributed to decreases in N input. Quantifying spatial N inputs can provide critical managerial information needed to optimize synthetic fertilizer N application rate and monitor the impacts of urbanization on agricultural production, helping to decrease agricultural environment risk and maintain sustainable agricultural production in different areas. PMID:28678841
Wang, Lijuan; Zheng, Hua; Zhao, He; Robinson, Brian E
2017-01-01
With the increases of cropland area and fertilizer nitrogen (N) application rate, general N balance characteristics in regional agroecosystems have been widely documented. However, few studies have quantitatively analyzed the drivers of spatial changes in the N budget. We constructed a mass balance model of the N budget at the soil surface using a database of county-level agricultural statistics to analyze N input, output, and proportional contribution of various factors to the overall N input changes in croplands during 2000-2010 in the Yangtze River Basin, the largest basin and the main agricultural production region in China. Over the period investigated, N input increased by 9%. Of this increase, 87% was from fertilizer N input. In the upper and middle reaches of the basin, the increased synthetic fertilizer N application rate accounted for 84% and 76% of the N input increase, respectively, mainly due to increased N input in the cropland that previously had low synthetic fertilizer N application rate. In lower reaches of the basin, mainly due to urbanization, the decrease in cropland area and synthetic fertilizer N application rate nearly equally contributed to decreases in N input. Quantifying spatial N inputs can provide critical managerial information needed to optimize synthetic fertilizer N application rate and monitor the impacts of urbanization on agricultural production, helping to decrease agricultural environment risk and maintain sustainable agricultural production in different areas.
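The soil-surface mass balance at the core of such an analysis is simple arithmetic: N surplus = inputs − outputs. The terms and per-hectare values below are illustrative, not the county-level statistics used in the study:

```python
def n_budget(fertilizer, manure, deposition, fixation,
             crop_removal, volatilization, leaching, denitrification):
    """Soil-surface N balance in kg N per ha per yr (illustrative terms)."""
    n_in = fertilizer + manure + deposition + fixation
    n_out = crop_removal + volatilization + leaching + denitrification
    return n_in - n_out, n_in, n_out

surplus, n_in, n_out = n_budget(
    fertilizer=220, manure=30, deposition=25, fixation=10,
    crop_removal=180, volatilization=35, leaching=25, denitrification=15)
fertilizer_share = 220 / n_in  # fertilizer's proportional contribution to inputs
```

Running the same balance per county and differencing two census years yields the proportional-contribution decomposition the abstract reports.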
2007-05-01
Commission maintains an expert staff of engineers and statisticians to analyze this data in an attempt to reveal troublesome trends in network reliability...
Stimulated by Novelty? The Role of Psychological Needs and Perceived Creativity
De Jonge, Kiki M. M.; Rietzschel, Eric F.; Van Yperen, Nico W.
2018-01-01
In the current research, we aimed to address the inconsistent finding in the brainstorming literature that cognitive stimulation sometimes results from novel input, yet other times from non-novel input. We expected and found, in three experiments, that the strength and valence of this relationship are moderated by people’s psychological needs for structure and autonomy. Specifically, the effect of novel input (vs. non-novel input), through perceived creativity, on cognitive stimulation was stronger for people who were either low in need for structure or high in need for autonomy. Also, when the input people received did not fit their needs, they experienced less psychological cognitive stimulation from this input (i.e., less task enjoyment and feeling more blocked) compared with when they did not receive any input. Hence, to create the ideal circumstances for people to achieve cognitive stimulation when brainstorming, input novelty should be aligned with their psychological needs. PMID:29405847
Gupta, Himanshu; Schiros, Chun G; Sharifov, Oleg F; Jain, Apurva; Denney, Thomas S
2016-08-31
Recently released American College of Cardiology/American Heart Association (ACC/AHA) guideline recommends the Pooled Cohort equations for evaluating atherosclerotic cardiovascular risk of individuals. The impact of the clinical input variable uncertainties on the estimates of ten-year cardiovascular risk based on ACC/AHA guidelines is not known. Using a publicly available National Health and Nutrition Examination Survey dataset (2005-2010), we computed maximum and minimum ten-year cardiovascular risks by assuming clinically relevant variations/uncertainties in input of age (0-1 year) and ±10% variation in total cholesterol, high-density lipoprotein cholesterol, and systolic blood pressure, and by assuming uniform distribution of the variance of each variable. We analyzed the changes in risk category compared to the actual inputs at 5% and 7.5% risk limits, as these limits define the thresholds for consideration of drug therapy in the new guidelines. The new Pooled Cohort equations for risk estimation were implemented in a custom software package. Based on our input variances, changes in risk category were possible in up to 24% of the population cohort at both 5% and 7.5% risk boundary limits. This trend was consistently noted across all subgroups except in African American males, where most of the cohort had ≥7.5% baseline risk regardless of the variation in the variables. The uncertainties in the input variables can alter the risk categorization. The impact of these variances on the ten-year risk needs to be incorporated into the patient/clinician discussion and clinical decision making. Incorporating good clinical practices for the measurement of critical clinical variables and robust standardization of laboratory parameters to more stringent reference standards is extremely important for successful implementation of the new guidelines. Furthermore, the ability to customize the risk calculator inputs to better represent unique clinical circumstances specific to individual needs would be highly desirable in future versions of the risk calculator.
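The sensitivity analysis amounts to evaluating the risk equation at the corners of the input-uncertainty box; because a logistic risk model is monotone in each input, the extremes occur at those corners. The logistic function and its coefficients below are a hypothetical stand-in, not the published Pooled Cohort equations:

```python
import math
from itertools import product

def risk(age, total_chol, hdl, sbp):
    """Hypothetical logistic ten-year risk model (stand-in coefficients;
    NOT the published Pooled Cohort equations)."""
    z = -9.0 + 0.06 * age + 0.009 * total_chol - 0.02 * hdl + 0.015 * sbp
    return 1.0 / (1.0 + math.exp(-z))

def risk_bounds(age, tc, hdl, sbp):
    """Min/max risk under +1 year of age and +/-10% lab/BP uncertainty."""
    corners = product((age, age + 1),
                      (0.9 * tc, 1.1 * tc),
                      (0.9 * hdl, 1.1 * hdl),
                      (0.9 * sbp, 1.1 * sbp))
    risks = [risk(a, t, h, s) for a, t, h, s in corners]
    return min(risks), max(risks)

lo, hi = risk_bounds(age=55, tc=200, hdl=50, sbp=130)
crosses_threshold = lo < 0.075 <= hi  # could the 7.5% drug-therapy cutoff flip?
```

When the interval [lo, hi] straddles a 5% or 7.5% boundary, the input uncertainty alone can change the patient's risk category, which is the effect the study quantifies.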
Coats, Heather; Paganelli, Tia; Starks, Helene; Lindhorst, Taryn; Starks Acosta, Anne; Mauksch, Larry; Doorenbos, Ardith
2017-03-01
There is a known shortage of trained palliative care professionals, and an even greater shortage of professionals who have been trained through interprofessional curricula. As part of an institutional Palliative Care Training Center grant, a core team of interprofessional palliative care academic faculty and staff completed a state-wide palliative care educational assessment to determine the needs for an interprofessional palliative care training program. The purpose of this article is to describe the process and results of our community needs assessment of interprofessional palliative care educational needs in Washington state. We approached the needs assessment through a cross-sectional descriptive design by using mixed-method inquiry. Each phase incorporated a variety of settings and subjects. The assessment incorporated multiple phases with diverse methodological approaches: a preparatory phase (identifying key informants); Phase I (key informant interviews); Phase II (survey); and Phase III (steering committee endorsement). The multiple phases of the needs assessment helped create a conceptual framework for the Palliative Care Training Center and developed an interprofessional palliative care curriculum. The input from key informants at multiple phases also allowed us to define priority needs and to refine an interprofessional palliative care curriculum. This curriculum will provide an interprofessional palliative care educational program that crosses disciplinary boundaries to integrate knowledge that is beneficial for all palliative care clinicians. The input from a range of palliative care clinicians and professionals at every phase of the needs assessment was critical for creating an interprofessional palliative care curriculum.
NASA Technical Reports Server (NTRS)
Nagle, Gail; Masotto, Thomas; Alger, Linda
1990-01-01
The need to meet the stringent performance and reliability requirements of advanced avionics systems has frequently led to implementations which are tailored to a specific application and are therefore difficult to modify or extend. Furthermore, many integrated flight-critical systems are input/output intensive. By using a design methodology which customizes the input/output mechanism for each new application, the cost of implementing new systems becomes prohibitively expensive. One solution to this dilemma is to design computer systems and input/output subsystems which are general purpose, but which can be easily configured to support the needs of a specific application. The Advanced Information Processing System (AIPS), currently under development, has these characteristics. The design and implementation of the prototype I/O communication system for AIPS is described. AIPS addresses reliability issues related to data communications by the use of reconfigurable I/O networks. When a fault or damage event occurs, communication is restored to the functioning parts of the network and the failed or damaged components are isolated. Performance issues are addressed by using a parallelized computer architecture which decouples input/output (I/O) redundancy management and I/O processing from the computational stream of an application. The autonomous nature of the system derives from the highly automated and independent manner in which I/O transactions are conducted for the application, as well as from the fact that hardware redundancy management is entirely transparent to the application.
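The fault-isolation step described above can be sketched as a graph search: after a fault, rebuild a spanning tree over only the healthy nodes and links, so failed components are cut off and service is restored to the reachable part of the network. This is a schematic of the reconfigurable-network idea, not the AIPS flight implementation; all names are illustrative.

```python
from collections import deque

def reconfigure(nodes, links, failed, root):
    """Rebuild an I/O spanning tree after a fault: breadth-first search
    from the root over links between healthy nodes only, isolating the
    failed components. Returns each reachable node's tree parent."""
    healthy = set(nodes) - set(failed)
    adj = {n: [] for n in healthy}
    for a, b in links:
        if a in healthy and b in healthy:
            adj[a].append(b)
            adj[b].append(a)
    parent, queue = {root: None}, deque([root])
    while queue:
        n = queue.popleft()
        for m in adj[n]:
            if m not in parent:
                parent[m] = n
                queue.append(m)
    return parent
```

On a five-node ring with node 3 failed, the search restores communication to the other four nodes while leaving node 3 isolated.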
Electrochemical Probing through a Redox Capacitor To Acquire Chemical Information on Biothiols
2016-01-01
The acquisition of chemical information is a critical need for medical diagnostics, food/environmental monitoring, and national security. Here, we report an electrochemical information processing approach that integrates (i) complex electrical inputs/outputs, (ii) mediators to transduce the electrical I/O into redox signals that can actively probe the chemical environment, and (iii) a redox capacitor that manipulates signals for information extraction. We demonstrate the capabilities of this chemical information processing strategy using biothiols because of the emerging importance of these molecules in medicine and because their distinct chemical properties allow evaluation of hypothesis-driven information probing. We show that input sequences can be tailored to probe for chemical information both qualitatively (step inputs probe for thiol-specific signatures) and quantitatively. Specifically, we observed picomolar limits of detection and linear responses to concentrations over 5 orders of magnitude (1 pM–0.1 μM). This approach allows the capabilities of signal processing to be extended for rapid, robust, and on-site analysis of chemical information. PMID:27385047
Electrochemical Probing through a Redox Capacitor To Acquire Chemical Information on Biothiols.
Liu, Zhengchun; Liu, Yi; Kim, Eunkyoung; Bentley, William E; Payne, Gregory F
2016-07-19
The acquisition of chemical information is a critical need for medical diagnostics, food/environmental monitoring, and national security. Here, we report an electrochemical information processing approach that integrates (i) complex electrical inputs/outputs, (ii) mediators to transduce the electrical I/O into redox signals that can actively probe the chemical environment, and (iii) a redox capacitor that manipulates signals for information extraction. We demonstrate the capabilities of this chemical information processing strategy using biothiols because of the emerging importance of these molecules in medicine and because their distinct chemical properties allow evaluation of hypothesis-driven information probing. We show that input sequences can be tailored to probe for chemical information both qualitatively (step inputs probe for thiol-specific signatures) and quantitatively. Specifically, we observed picomolar limits of detection and linear responses to concentrations over 5 orders of magnitude (1 pM-0.1 μM). This approach allows the capabilities of signal processing to be extended for rapid, robust, and on-site analysis of chemical information.
Polynomic nonlinear dynamical systems - A residual sensitivity method for model reduction
NASA Technical Reports Server (NTRS)
Yurkovich, S.; Bugajski, D.; Sain, M.
1985-01-01
The motivation for using polynomic combinations of system states and inputs to model nonlinear dynamical systems is founded upon the classical theories of analysis and function representation. A feature of such representations is the need to make available all possible monomials in these variables, up to the degree specified, so as to provide for the description of widely varying functions within a broad class. For a particular application, however, certain monomials may be quite superfluous. This paper examines the possibility of removing monomials from the model in accordance with the level of sensitivity displayed by the residuals to their absence. Critical in these studies is the effect of system input excitation, and the effect of discarding monomial terms, upon the model parameter set. Therefore, model reduction is approached iteratively, with inputs redesigned at each iteration to ensure sufficient excitation of remaining monomials for parameter approximation. Examples are reported to illustrate the performance of such model reduction approaches.
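The residual-sensitivity idea can be sketched for a one-state, one-input system: fit the full monomial library by least squares, then repeatedly discard the monomial whose removal increases the residual the least. This is an illustration under those simplifying assumptions; the paper treats the general case and also redesigns inputs between iterations, which is omitted here.

```python
import numpy as np

def monomial_library(x, u):
    """All monomials of a scalar state x and input u up to degree 2:
    [1, x, u, x^2, x*u, u^2]."""
    return np.column_stack([np.ones_like(x), x, u, x**2, x * u, u**2])

def sse(Phi, cols, y):
    """Sum of squared residuals of the least-squares fit using cols."""
    theta = np.linalg.lstsq(Phi[:, cols], y, rcond=None)[0]
    r = y - Phi[:, cols] @ theta
    return float(r @ r)

def prune_by_residual_sensitivity(x, u, y, keep=3):
    """Iteratively discard the monomial to which the residual is least
    sensitive, until only `keep` terms remain."""
    Phi = monomial_library(x, u)
    active = list(range(Phi.shape[1]))
    while len(active) > keep:
        costs = [sse(Phi, [k for k in active if k != j], y) for j in active]
        active.remove(active[int(np.argmin(costs))])
    theta = np.linalg.lstsq(Phi[:, active], y, rcond=None)[0]
    return active, theta
```

With data generated from y = 1 + 2x + 3xu, the procedure discards the three superfluous monomials and recovers the true coefficients.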
Measuring Input Thresholds on an Existing Board
NASA Technical Reports Server (NTRS)
Kuperman, Igor; Gutrich, Daniel G.; Berkun, Andrew C.
2011-01-01
A critical PECL (positive emitter-coupled logic)-to-Xilinx interface needed to be changed on an existing flight board. The new Xilinx input interface used a CMOS (complementary metal-oxide semiconductor) type of input, and the driver could meet its thresholds typically, but not in the worst case, according to the data sheet. The previous interface had been based on comparison with an external reference, but the CMOS input is based on comparison with an internal divider from the power supply. A way was needed to measure the exact input threshold of this device for 64 inputs on a flight board. The measurement technique allowed an accurate measurement of the voltage required to switch a Xilinx input from high to low for each of the 64 lines, while only probing two of them. Directly driving an external voltage was considered too risky, and tests done on any other unit could not be used to qualify the flight board. The two lines directly probed gave an absolute voltage threshold calibration, while data collected on the remaining 62 lines without probing gave relative measurements that could be used to identify any outliers. The PECL interface was forced to a long-period square wave by driving a saturated square wave into the ADC (analog-to-digital converter). The active pull-down circuit was turned off, causing each line to rise rapidly and fall slowly according to the input's weak pull-down circuitry. The fall time shows up as a change in the pulse width of the signal read by the Xilinx. This change in pulse width is a function of capacitance, pull-down current, and input threshold. Capacitance was known from the different trace lengths, plus a gate input capacitance, which is the same for all inputs. The pull-down current is the same for all inputs, including the two that are probed directly. The data were combined, and the Excel solver tool was used to find input thresholds for the 62 lines.
This was repeated over different supply voltages and temperatures to show that the interface had voltage margin under all worst case conditions. Gate input thresholds are normally measured at the manufacturer when the device is on a chip tester. A key function of this machine was duplicated on an existing flight board with no modifications to the nets to be tested, with the exception of changes in the FPGA program.
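The inversion step can be sketched under a constant-current discharge assumption: the time to fall from the rail to the threshold is t = C(Vdd − Vth)/I_pd, with the weak pull-down current I_pd shared by all lines. The two directly probed lines calibrate I_pd; every other threshold then follows from its fall time and trace capacitance. This is an illustrative model (the flight measurement inferred fall times from Xilinx-read pulse widths and solved the equivalent system in Excel); all numeric values here are hypothetical.

```python
import numpy as np

def solve_thresholds(t_fall, cap, probed_idx, probed_vth, vdd=2.5):
    """Recover per-line input thresholds from measured fall times,
    assuming a constant-current pull-down: t = C*(Vdd - Vth)/I_pd.
    The probed lines calibrate the shared pull-down current I_pd."""
    t_fall = np.asarray(t_fall, dtype=float)
    cap = np.asarray(cap, dtype=float)
    probed_vth = np.asarray(probed_vth, dtype=float)
    # calibrate the common pull-down current from the probed lines
    i_pd = np.mean(cap[probed_idx] * (vdd - probed_vth) / t_fall[probed_idx])
    return vdd - i_pd * t_fall / cap
```

Probing only two lines is enough because I_pd is the single unknown shared across all 64 inputs; the rest of the solution is per-line algebra.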
Impacts of prescribed fire on ecosystem C and N cycles at Fort Benning Installation, Georgia
NASA Astrophysics Data System (ADS)
Zhao, S.; Liu, S.; Tieszen, L.
2007-12-01
A critical challenge for land managers at military installations is to maintain the ecological sustainability of natural resources while meeting the needs of military training. Prescribed ground fire as a land management practice has been used to remove the ground layer plants at Fort Benning for two purposes: to facilitate access for military training, and to maintain and restore fire-adapted longleaf pine communities that are critical habitat for the federally endangered red-cockaded woodpecker (Picoides borealis). Nevertheless, the impacts of prescribed fire on ecosystem processes and health are not well understood or quantified at the plot to regional scales. Frequent fire may result in ecosystem nitrogen (N) deficiency due to repeated N loss through combustion, volatilization, and leaching, threatening ecosystem sustainability at Fort Benning. On the other hand, N loss may be offset by enhanced symbiotic N2 fixation, since fire favors herbaceous legumes by scarifying legume seeds and stimulating germination. Quantifying the impacts of prescribed fire on ecosystem carbon (C) and N cycles is further complicated by interactions and feedbacks among burning, nitrogen inputs, other land use practices (e.g., tree thinning or clear-cutting), and soil properties. In this study, we used the Erosion-Deposition-Carbon Model (EDCM), a process-based biogeochemical model, to simulate C and N dynamics at Fort Benning under different combinations of fire frequency, fire intensity, nitrogen deposition, legume nitrogen input, forest harvesting, and soil sand content. Model simulations indicated that prescribed fire led to nitrogen losses from ecosystems at Fort Benning, especially with high intensity and high frequency fires. Forest harvesting further intensified ecosystem nitrogen limitation, leading to reduced biophysical potential of C sequestration.
The adverse impacts of prescribed fire and forest harvesting on C and N cycles were much greater in more sandy soil than in less sandy soil. N inputs from atmospheric deposition and legume N fixation helped replenish N losses to some extent. However, under current atmospheric N deposition and legume N input rates, these inputs did not fully offset the N losses caused by fire and harvesting, suggesting that additional N input (e.g., fertilization) may be needed to maintain the sustainability of current ecosystem states and management practices at Fort Benning.
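The bookkeeping behind that conclusion is a simple annual budget: the N stock declines whenever losses from burning and harvest exceed inputs from deposition, fixation, and any fertilizer. The toy trajectory below illustrates the balance; all rates are hypothetical, not Fort Benning values or EDCM output.

```python
def n_stock_trajectory(n0, years, deposition, fixation, fire_loss,
                       harvest_loss, fertilizer=0.0):
    """Toy annual ecosystem nitrogen budget (rates in kg N/ha/yr):
    each year the stock changes by inputs minus losses. Returns the
    stock at the start of each year plus the final year."""
    stock, trajectory = n0, [n0]
    for _ in range(years):
        stock += deposition + fixation + fertilizer - fire_loss - harvest_loss
        trajectory.append(stock)
    return trajectory
```

With hypothetical inputs of 5 + 3 kg N/ha/yr against losses of 7 + 4, the stock drains 3 kg N/ha/yr; adding 3 kg N/ha/yr of fertilizer holds it level, mirroring the fertilization suggestion above.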
Larsson, D G Joakim; Andremont, Antoine; Bengtsson-Palme, Johan; Brandt, Kristian Koefoed; de Roda Husman, Ana Maria; Fagerstedt, Patriq; Fick, Jerker; Flach, Carl-Fredrik; Gaze, William H; Kuroda, Makoto; Kvint, Kristian; Laxminarayan, Ramanan; Manaia, Celia M; Nielsen, Kaare Magne; Plant, Laura; Ploy, Marie-Cécile; Segovia, Carlos; Simonet, Pascal; Smalla, Kornelia; Snape, Jason; Topp, Edward; van Hengel, Arjon J; Verner-Jeffreys, David W; Virta, Marko P J; Wellington, Elizabeth M; Wernersson, Ann-Sofie
2018-08-01
There is growing understanding that the environment plays an important role both in the transmission of antibiotic resistant pathogens and in their evolution. Accordingly, researchers and stakeholders world-wide seek to further explore the mechanisms and drivers involved, quantify risks and identify suitable interventions. There is a clear value in establishing research needs and coordinating efforts within and across nations in order to best tackle this global challenge. At an international workshop in late September 2017, scientists from 14 countries with expertise on the environmental dimensions of antibiotic resistance gathered to define critical knowledge gaps. Four key areas were identified where research is urgently needed: 1) the relative contributions of different sources of antibiotics and antibiotic resistant bacteria into the environment; 2) the role of the environment, and particularly anthropogenic inputs, in the evolution of resistance; 3) the overall human and animal health impacts caused by exposure to environmental resistant bacteria; and 4) the efficacy and feasibility of different technological, social, economic and behavioral interventions to mitigate environmental antibiotic resistance. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
The Influence of Salmon Recolonization on Riparian Communities in the Cedar River, Washington, USA
NASA Astrophysics Data System (ADS)
Moravek, J.; Clipp, H.; Kiffney, P.
2016-02-01
Salmon are a valuable resource throughout the Pacific Northwest, but increasing human activity is degrading coastal ecosystems and threatening local salmon populations. Salmon conservation efforts often focus on habitat restoration, including the re-colonization of salmon into historically obstructed areas such as the Cedar River in Washington, USA. However, to assess the long term implications of salmon re-colonization on a landscape scale, it is critical to consider not only the river ecosystem but also the surrounding riparian habitat. Although prior studies suggest that salmon alter riparian food web dynamics, the riparian community on the Cedar River has not yet been characterized. To investigate possible connections between salmon and the riparian habitat after 12 years of re-colonization, we surveyed riparian spider communities along a gradient of salmon inputs (g/m2). In 10-m transects along the banks of the river, we identified spiders and spider webs, collected prey from webs, and characterized nearby aquatic macroinvertebrate communities. We found that the density of aquatic macroinvertebrates, as well as the density of spider prey, both had significant positive relationships with salmon inputs, supporting the hypothesis that salmon provide energy and nutrients for both aquatic and riparian food webs. We also found that spider diversity significantly decreased with salmon inputs, potentially due to confounding factors such as stream gradient or vegetation structure. Although additional information is needed to fully understand this relationship, the significant connection between salmon inputs and spider diversity is compelling motivation for further studies regarding the link between aquatic and riparian systems on the Cedar River. Understanding the connections between salmon and the riparian community is critical to characterizing the long term, landscape-scale implications of sustainable salmon management in the Pacific Northwest.
The Influence of Salmon Recolonization on Riparian Communities in the Cedar River, Washington, USA
NASA Astrophysics Data System (ADS)
Moravek, J.; Clipp, H.; Kiffney, P.
2015-12-01
Salmon are a valuable cultural and economic resource throughout the Pacific Northwest, but increasing human activity is degrading coastal ecosystems and threatening local salmon populations. Salmon conservation efforts often focus on habitat restoration, including the re-colonization of salmon into historically obstructed areas such as the Cedar River in Washington, USA. However, to assess the implications of salmon re-colonization on a landscape scale, it is critical to consider not only the river ecosystem but also the surrounding riparian habitat. Although prior studies suggest that salmon alter riparian food web dynamics, the riparian community on the Cedar River has not yet been characterized. To investigate possible connections between salmon and the riparian habitat, we surveyed riparian spider communities along a gradient of salmon inputs (g/m2). In 10-m transects along the banks of the river, we identified spiders and spider webs, collected prey from webs, and characterized nearby aquatic macroinvertebrate communities. We found that the density of aquatic macroinvertebrates, as well as the density of spider prey, both had significant positive relationships with salmon inputs, supporting the hypothesis that salmon provide energy and nutrients for both aquatic and riparian food webs. We also found that spider diversity significantly decreased with salmon inputs, potentially due to confounding factors such as stream gradient or vegetation structure. Although additional information is needed to fully understand this relationship, the significant connection between salmon inputs and spider diversity is compelling motivation for further studies regarding the link between aquatic and riparian systems on the Cedar River. Understanding the connections between salmon and the riparian community is critical to characterizing the landscape-scale implications of sustainable salmon management in the Pacific Northwest.
Liu, Yan-Jun; Tang, Li; Tong, Shaocheng; Chen, C L Philip; Li, Dong-Juan
2015-01-01
Based on the neural network (NN) approximator, an online reinforcement learning algorithm is proposed for a class of affine multiple-input and multiple-output (MIMO) nonlinear discrete-time systems with unknown functions and disturbances. In the design procedure, two networks are provided: one is an action network to generate an optimal control signal, and the other is a critic network to approximate the cost function. An optimal control signal and adaptation laws can be generated based on the two NNs. In previous approaches, the weights of the critic and action networks are updated based on the gradient descent rule, and the estimations of optimal weight vectors are directly adjusted in the design. Consequently, compared with the existing results, the main contributions of this paper are: 1) only two parameters need to be adjusted, and thus the number of adaptation laws is smaller than in previous results; and 2) the updating parameters do not depend on the number of subsystems for MIMO systems, and the tuning rules are replaced by adjusting the norms on optimal weight vectors in both the action and critic networks. It is proven that the tracking errors, the adaptation laws, and the control inputs are uniformly bounded using the Lyapunov analysis method. Simulation examples are employed to illustrate the effectiveness of the proposed algorithm.
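The two-network structure can be illustrated on the simplest possible case: a scalar affine system x⁺ = x + u with quadratic stage cost, a one-parameter critic V(x) ≈ w·x², and a one-parameter actor u = −a·x, both adapted from the temporal-difference error. This is a toy sketch of the actor-critic idea only; the paper's MIMO NN approximators and norm-based tuning rules differ in detail.

```python
import numpy as np

def train_actor_critic(steps=3000, lr_actor=0.01, lr_critic=0.05, gamma=0.9):
    """Minimal actor-critic on x+ = x + u with stage cost x^2 + u^2.
    Critic V(x) ~= w*x^2 approximates the cost-to-go; actor u = -a*x
    generates the control. Both scalars adapt online."""
    w, a = 0.0, 0.0
    rng = np.random.default_rng(0)
    for _ in range(steps):
        x = rng.uniform(-1.0, 1.0)           # sample a training state
        u = -a * x
        x_next = x + u
        td = x**2 + u**2 + gamma * w * x_next**2 - w * x**2
        w += lr_critic * td * x**2           # critic: semi-gradient TD step
        # actor: descend d(u^2 + gamma*V(x_next))/du along du/da = -x
        a += lr_actor * (2 * u + 2 * gamma * w * x_next) * x
    return a, w
```

After training, the learned gain a stabilizes the system: the closed loop x⁺ = (1 − a)x contracts toward the origin.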
Resonant UPS topologies for the emerging hybrid fiber-coaxial networks
NASA Astrophysics Data System (ADS)
Pinheiro, Humberto
Uninterruptible power supply (UPS) systems have been extensively applied to feed critical loads in many areas. Typical examples of critical loads include life-support equipment, computers and telecommunication systems. Although all UPS systems have a common purpose, to provide continuous power to critical loads, the emerging hybrid fiber-coaxial networks have created the need for specific types of UPS topologies. For example, galvanic isolation for the load and the battery, small size, high input power factor, and trapezoidal output voltage waveforms are among the required features of UPS topologies for hybrid fiber-coaxial networks. None of the conventional UPS topologies meet all these requirements. Consequently, this thesis is directed towards the design and analysis of UPS topologies for this new application. Novel UPS topologies are proposed and control techniques are developed to allow operation at high switching frequencies without penalizing the converter efficiency. By the use of resonant converters in the proposed UPS topologies, a high input power factor is achieved without requiring a dedicated power factor correction stage. In addition, a self-sustained oscillation control method is proposed to ensure soft switching under all operating conditions. A detailed analytical treatment of the resonant converters in the proposed UPS topologies is presented and design procedures are illustrated. Simulation and experimental results are presented to validate the analyses and to demonstrate the feasibility of the proposed schemes.
A Simple Semaphore Signaling Technique for Ultra-High Frequency Spacecraft Communications
NASA Technical Reports Server (NTRS)
Butman, S.; Satorius, E.; Ilott, P.
2005-01-01
For planetary lander missions such as the upcoming Phoenix mission to Mars, the most challenging phase of the spacecraft-to-ground communications is during the critical phase termed entry, descent, and landing (EDL). At 8.4 GHz (X-band), the signals received by the largest Deep Space Network (DSN) antennas can be too weak for even 1 bit per second (bps) and therefore not able to communicate critical information to Earth. Fortunately, the lander's ultra-high frequency (UHF) link to an orbiting relay can meet the EDL requirements, but the data rate needs to be low enough to fit the capability of the UHF link during some or all of EDL. On Phoenix, the minimum data rate of the as-built UHF radio is 8 kbps and requires a signal level at the Odyssey orbiter of at least -120 dBm. For lower signaling levels, the effective data rate needs to be reduced, but without incurring the cost of rebuilding and requalifying the equipment. To address this scenario, a simple form of frequency-shift keying (FSK) has been devised by appropriately programming the data stream that is input to the UHF transceiver. This article describes this technique and provides performance estimates. Laboratory testing reveals that input signal levels at -140 dBm and lower can routinely be demodulated with the proposed signaling scheme, thereby providing a 20-dB and greater margin over the 8-kbps threshold.
A Simple Semaphore Signaling Technique for Ultra-High Frequency Spacecraft Communications
NASA Astrophysics Data System (ADS)
Butman, S.; Satorius, E.; Illott, P.
2005-11-01
For planetary lander missions such as the upcoming Phoenix mission to Mars, the most challenging phase of the spacecraft-to-ground communications is during the critical phase termed entry, descent, and landing (EDL). At 8.4 GHz (X-band), the signals received by the largest Deep Space Network (DSN) antennas can be too weak for even 1 bit per second (bps) and therefore not able to communicate critical information to Earth. Fortunately, the lander's ultra-high frequency (UHF) link to an orbiting relay can meet the EDL requirements, but the data rate needs to be low enough to fit the capability of the UHF link during some or all of EDL. On Phoenix, the minimum data rate of the as-built UHF radio is 8 kbps and requires a signal level at the Odyssey orbiter of at least minus 120 dBm. For lower signaling levels, the effective data rate needs to be reduced, but without incurring the cost of rebuilding and requalifying the equipment. To address this scenario, a simple form of frequency-shift keying (FSK) has been devised by appropriately programming the data stream that is input to the UHF transceiver. This article describes this technique and provides performance estimates. Laboratory testing reveals that input signal levels at minus 140 dBm and lower can routinely be demodulated with the proposed signaling scheme, thereby providing a 20-dB and greater margin over the 8-kbps threshold.
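The trick of realizing FSK "by appropriately programming the data stream" can be sketched as follows: feed the fixed-rate 8 kbps transmitter a precomputed bit pattern whose alternation rate sets the radiated square-wave frequency, with one tone per semaphore symbol. The tone frequencies and duration below are illustrative assumptions, not the Phoenix flight values.

```python
def semaphore_bits(symbol, link_rate_bps=8000, f_mark=500, f_space=250,
                   duration_s=0.1):
    """Build the bit pattern that, pushed through a fixed-rate serial
    transmitter, radiates a square wave at one of two tone frequencies:
    binary FSK realized purely in the input data stream."""
    tone = f_mark if symbol else f_space
    half_period = max(1, round(link_rate_bps / (2 * tone)))  # bits per half cycle
    n_bits = int(link_rate_bps * duration_s)
    bits, level = [], 0
    while len(bits) < n_bits:
        bits.extend([level] * half_period)
        level ^= 1
    return bits[:n_bits]
```

Because the energy of each long tone is integrated over many bit periods at the receiver, the effective data rate drops far below 8 kbps, which is what buys the link margin at weak signal levels.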
Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A
2017-12-01
Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input in Big Data analysis and the need for more statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.
Sensory noise predicts divisive reshaping of receptive fields
Deneve, Sophie; Gutkin, Boris
2017-01-01
In order to respond reliably to specific features of their environment, sensory neurons need to integrate multiple incoming noisy signals. Crucially, they also need to compete for the interpretation of those signals with other neurons representing similar features. The form that this competition should take depends critically on the noise corrupting these signals. In this study we show that for the type of noise commonly observed in sensory systems, whose variance scales with the mean signal, sensory neurons should selectively divide their input signals by their predictions, suppressing ambiguous cues while amplifying others. Any change in the stimulus context alters which inputs are suppressed, leading to a deep dynamic reshaping of neural receptive fields going far beyond simple surround suppression. Paradoxically, these highly variable receptive fields go alongside and are in fact required for an invariant representation of external sensory features. In addition to offering a normative account of context-dependent changes in sensory responses, perceptual inference in the presence of signal-dependent noise accounts for ubiquitous features of sensory neurons such as divisive normalization, gain control and contrast dependent temporal dynamics. PMID:28622330
Sensory noise predicts divisive reshaping of receptive fields.
Chalk, Matthew; Masset, Paul; Deneve, Sophie; Gutkin, Boris
2017-06-01
In order to respond reliably to specific features of their environment, sensory neurons need to integrate multiple incoming noisy signals. Crucially, they also need to compete for the interpretation of those signals with other neurons representing similar features. The form that this competition should take depends critically on the noise corrupting these signals. In this study we show that for the type of noise commonly observed in sensory systems, whose variance scales with the mean signal, sensory neurons should selectively divide their input signals by their predictions, suppressing ambiguous cues while amplifying others. Any change in the stimulus context alters which inputs are suppressed, leading to a deep dynamic reshaping of neural receptive fields going far beyond simple surround suppression. Paradoxically, these highly variable receptive fields go alongside and are in fact required for an invariant representation of external sensory features. In addition to offering a normative account of context-dependent changes in sensory responses, perceptual inference in the presence of signal-dependent noise accounts for ubiquitous features of sensory neurons such as divisive normalization, gain control and contrast dependent temporal dynamics.
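The core operation, dividing each input signal by the network's prediction of it, can be written in one line: cues that competing causes already explain are suppressed, while unexplained cues pass through or are amplified. This is a schematic of the divisive reshaping described above, not the paper's full inference model; the semi-saturation constant sigma is an assumed stabilizer.

```python
import numpy as np

def divisively_reshape(inputs, predictions, sigma=0.1):
    """Divide each input by its current prediction (plus a small
    constant): strongly predicted (ambiguous) cues are suppressed,
    poorly predicted cues are passed through or amplified."""
    return np.asarray(inputs, float) / (sigma + np.asarray(predictions, float))

# Changing the prediction context reshapes the effective receptive field:
# the same unit input is damped when predicted, boosted when not.
out = divisively_reshape([1.0, 1.0], predictions=[0.9, 0.0])  # -> [1.0, 10.0]
```

Because the predictions depend on the stimulus context, the same feedforward input yields very different effective weights, which is the context-dependent receptive-field reshaping the abstract describes.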
Clustering analysis of moving target signatures
NASA Astrophysics Data System (ADS)
Martone, Anthony; Ranney, Kenneth; Innocenti, Roberto
2010-04-01
Previously, we developed a moving target indication (MTI) processing approach to detect and track slow-moving targets inside buildings, which successfully detected moving targets (MTs) from data collected by a low-frequency, ultra-wideband radar. Our MTI algorithms include change detection, automatic target detection (ATD), clustering, and tracking. The MTI algorithms can be implemented in a real-time or near-real-time system; however, a person-in-the-loop is needed to select input parameters for the clustering algorithm. Specifically, the number of clusters to input into the clustering algorithm is unknown and requires manual selection. A critical need exists to automate all aspects of the MTI processing formulation. In this paper, we investigate two techniques that automatically determine the number of clusters: the adaptive knee-point (KP) algorithm and the recursive pixel finding (RPF) algorithm. The KP algorithm is based on a well-known heuristic approach for determining the number of clusters. The RPF algorithm is analogous to the pixel-labeling procedure used in image processing. Both algorithms are used to analyze the false alarm and detection rates of three operational scenarios of personnel walking inside wood and cinderblock buildings.
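The knee-point heuristic can be sketched directly on a distortion-versus-k curve: pick the point farthest (perpendicularly) from the chord joining the curve's endpoints. This is one common form of the heuristic, offered as an illustration; the adaptive KP algorithm in the paper may differ in detail.

```python
import numpy as np

def knee_point(distortion):
    """Choose the number of clusters at the 'knee' of a distortion
    curve (distortion[i] is the clustering cost for k = i + 1): the
    point with maximum perpendicular distance from the endpoint chord."""
    d = np.asarray(distortion, dtype=float)
    k = np.arange(1, d.size + 1, dtype=float)
    p1 = np.array([k[0], d[0]])
    chord = np.array([k[-1], d[-1]]) - p1
    chord /= np.linalg.norm(chord)
    rel = np.column_stack([k, d]) - p1
    # 2-D cross product magnitude gives distance from the chord
    dist = np.abs(rel[:, 0] * chord[1] - rel[:, 1] * chord[0])
    return int(k[np.argmax(dist)])
```

On a curve that drops steeply and then flattens, the maximum-distance point lands where adding clusters stops paying off, removing the need for a person-in-the-loop to choose k.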
The mammillary bodies and memory: more than a hippocampal relay
Vann, Seralynne D.; Nelson, Andrew J.D.
2015-01-01
Although the mammillary bodies were one of the first neural structures to be implicated in memory, it has long been assumed that their main function was to act primarily as a hippocampal relay, passing information on to the anterior thalamic nuclei and from there to the cingulate cortex. This view not only afforded the mammillary bodies no independent role in memory, it also neglected the potential significance of other, nonhippocampal, inputs to the mammillary bodies. Recent advances have transformed the picture, revealing that projections from the tegmental nuclei of Gudden, and not the hippocampal formation, are critical for sustaining mammillary body function. By uncovering a role for the mammillary bodies that is independent of its subicular inputs, this work signals the need to consider a wider network of structures that form the neural bases of episodic memory. PMID:26072239
Zoefel, Benedikt; ten Oever, Sanne; Sack, Alexander T.
2018-01-01
It is undisputed that presenting a rhythmic stimulus leads to a measurable brain response that follows the rhythmic structure of this stimulus. What is still debated, however, is the question whether this brain response exclusively reflects a regular repetition of evoked responses, or whether it also includes entrained oscillatory activity. Here we systematically present evidence in favor of an involvement of entrained neural oscillations in the processing of rhythmic input while critically pointing out which questions still need to be addressed before this evidence could be considered conclusive. In this context, we also explicitly discuss the potential functional role of such entrained oscillations, suggesting that these stimulus-aligned oscillations reflect, and serve as, predictive processes, an idea often only implicitly assumed in the literature. PMID:29563860
DOE Office of Scientific and Technical Information (OSTI.GOV)
Getman, Dan; Bush, Brian; Inman, Danny
Data used by the National Renewable Energy Laboratory (NREL) in energy analysis are often produced by industry and licensed or purchased for analysis. While this practice provides needed flexibility in selecting data for analysis, it presents challenges in understanding the differences among multiple, ostensibly similar, datasets. As options for source data become more varied, it is important to be able to articulate why certain datasets were chosen and to ensure those include the data that best meet the boundaries and/or limitations of a particular analysis. This report represents the first of three phases of research intended to develop methods to quantitatively assess and compare both input datasets and the results of analyses performed at NREL. This capability is critical to identifying tipping points in the costs or benefits of achieving high spatial and temporal resolution of input data.
Space station experiment definition: Advanced power system test bed
NASA Technical Reports Server (NTRS)
Pollard, H. E.; Neff, R. E.
1986-01-01
A conceptual design for an advanced photovoltaic power system test bed was provided and the requirements for advanced photovoltaic power system experiments were better defined. Results of this study will be used in the design efforts conducted in phase B and phase C/D of the space station program so that the test bed capabilities will be responsive to user needs. Critical PV and energy storage technologies were identified, and inputs were received from industry (government and commercial, U.S. and international) which identified experimental requirements. These inputs were used to develop a number of different conceptual designs. Pros and cons of each were discussed and a strawman candidate identified. A preliminary evolutionary plan, which included necessary precursor activities, was established, and cost estimates were presented which would allow for a successful implementation on the space station in the 1994 time frame.
Neuroplasticity and amblyopia: vision at the balance point.
Tailor, Vijay K; Schwarzkopf, D Samuel; Dahlmann-Noor, Annegret H
2017-02-01
New insights into triggers and brakes of plasticity in the visual system are being translated into new treatment approaches which may improve outcomes not only in children, but also in adults. Visual experience-driven plasticity is greatest in early childhood, triggered by maturation of inhibitory interneurons which facilitate strengthening of synchronous synaptic connections, and inactivation of others. Normal binocular development leads to progressive refinement of monocular visual acuity, stereoacuity and fusion of images from both eyes. At the end of the 'critical period', structural and functional brakes such as dampening of acetylcholine receptor signalling and formation of perineuronal nets limit further synaptic remodelling. Imbalanced visual input from the two eyes can lead to imbalanced neural processing and permanent visual deficits, the commonest of which is amblyopia. The efficacy of new behavioural, physical and pharmacological interventions aiming to balance visual input and visual processing has been described in humans, and some are currently under evaluation in randomised controlled trials. Outcomes may change amblyopia treatment for children and adults, but the safety of new approaches will need careful monitoring, as permanent adverse events may occur when plasticity is re-induced after the end of the critical period. Video abstract: http://links.lww.com/CONR/A42.
Challenges and Issues of Radiation Damage Tools for Space Missions
NASA Astrophysics Data System (ADS)
Tripathi, Ram; Wilson, John
2006-04-01
NASA has a new vision for space exploration in the 21st century encompassing a broad range of human and robotic missions, including missions to the Moon, Mars and beyond. Exposure to the hazards of severe space radiation in long-duration deep space missions is 'the show stopper.' Thus, protection from the hazards of severe space radiation is of paramount importance for the new vision. Accurate risk assessments critically depend on the accuracy of the input information about the interaction of ions with materials, electronics and tissues. A huge amount of essential experimental information, covering all the ions in space across the periodic table over a range of energies spanning many orders of magnitude (up to a trillion-fold), is needed for the radiation protection engineering of space missions, but it is simply not available (due to the high costs) and probably never will be. In addition, the accuracy of the input information and database is very critical and of paramount importance for space exposure assessments, particularly in view of the agency's vision for deep space exploration. The vital role and importance of nuclear physics for space missions, and the related challenges and issues, will be discussed, and a few examples will be presented for space missions.
A neuromorphic model of motor overflow in focal hand dystonia due to correlated sensory input
NASA Astrophysics Data System (ADS)
Sohn, Won Joon; Niu, Chuanxin M.; Sanger, Terence D.
2016-10-01
Objective. Motor overflow is a common and frustrating symptom of dystonia, manifested as unintentional muscle contraction that occurs during an intended voluntary movement. Although it is suspected that motor overflow is due to cortical disorganization in some types of dystonia (e.g. focal hand dystonia), it remains elusive which mechanisms could initiate and, more importantly, perpetuate motor overflow. We hypothesize that distinct motor elements have low risk of motor overflow if their sensory inputs remain statistically independent. But when provided with correlated sensory inputs, pre-existing crosstalk among sensory projections will grow under spike-timing-dependent-plasticity (STDP) and eventually produce irreversible motor overflow. Approach. We emulated a simplified neuromuscular system comprising two anatomically distinct digital muscles innervated by two layers of spiking neurons with STDP. The synaptic connections between layers included crosstalk connections. The input neurons received either independent or correlated sensory drive during 4 days of continuous excitation. The emulation is critically enabled and accelerated by our neuromorphic hardware created in previous work. Main results. When driven by correlated sensory inputs, the crosstalk synapses gained weight and produced prominent motor overflow; the growth of crosstalk synapses resulted in enlarged sensory representation reflecting cortical reorganization. The overflow failed to recede when the inputs resumed their original uncorrelated statistics. In the control group, no motor overflow was observed. Significance. Although our model is a highly simplified and limited representation of the human sensorimotor system, it allows us to explain how correlated sensory input to anatomically distinct muscles is by itself sufficient to cause persistent and irreversible motor overflow. Further studies are needed to locate the source of correlation in sensory input.
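The core mechanism this abstract describes, crosstalk synapses growing under correlated sensory input and failing to shrink once the input returns to normal, can be illustrated with a toy simulation. The coincidence-based (Hebbian) rule below is a simplified stand-in for the paper's spike-timing-dependent-plasticity hardware model; the spike rates, learning constants, and weight bounds are invented for illustration, not taken from the paper.

```python
import random

def simulate_crosstalk(correlation, steps=20000, seed=1,
                       a_plus=0.01, a_minus=0.002, rate=0.1):
    """Toy plasticity on one 'crosstalk' synapse (sensory input 1 -> motor
    neuron 2). Motor neuron 2 is assumed to fire whenever its own sensory
    input 2 fires; with probability `correlation` input 1 simply copies
    input 2, otherwise it fires independently at the same rate."""
    rng = random.Random(seed)
    w = 0.1                                    # initial crosstalk weight
    for _ in range(steps):
        post = rng.random() < rate             # neuron 2 fires with input 2
        if rng.random() < correlation:
            pre = post                         # correlated sensory drive
        else:
            pre = rng.random() < rate          # independent sensory drive
        if pre and post:
            w += a_plus                        # coincident firing: potentiate
        elif pre:
            w -= a_minus                       # unpaired pre spike: depress
        w = min(max(w, 0.0), 1.0)              # keep weight in [0, 1]
    return w

w_independent = simulate_crosstalk(correlation=0.0)
w_correlated = simulate_crosstalk(correlation=0.9)
# correlated input drives the crosstalk weight toward its ceiling;
# independent input lets it decay instead
```

Under independent drive, unpaired pre spikes outnumber coincidences and the weight shrinks; under correlated drive, coincidences dominate and the weight saturates, mirroring the overflow the paper reports.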
Using Natural Language to Enhance Mission Effectiveness
NASA Technical Reports Server (NTRS)
Trujillo, Anna C.; Meszaros, Erica
2016-01-01
The availability of highly capable, yet relatively cheap, unmanned aerial vehicles (UAVs) is opening up new areas of use for hobbyists and for professional-related activities. The driving function of this research is allowing a non-UAV pilot, an operator, to define and manage a mission. This paper describes the preliminary usability measures of an interface that allows an operator to define the mission using speech to make inputs. An experiment was conducted to begin to enumerate the efficacy and user acceptance of using voice commands to define a multi-UAV mission and to provide high-level vehicle control commands such as "takeoff." The primary independent variable was input type - voice or mouse. The primary dependent variables consisted of the correctness of the mission parameter inputs and the time needed to make all inputs. Other dependent variables included NASA-TLX workload ratings and subjective ratings on a final questionnaire. The experiment required each subject to fill in an online form containing the information that a package dispatcher would need to deliver packages. For each run, subjects typed in a simple numeric code for the package code. They then defined the initial starting position, the delivery location, and the return location using either pull-down menus or voice input. Voice input was accomplished using CMU Sphinx4-5prealpha for speech recognition. They then input the length of the package; these were the optional fields. The subject had the system "Calculate Trajectory" and then "Takeoff" once the trajectory was calculated. Later, the subject used "Land" to finish the run. After the blocked voice and mouse input runs, subjects completed a NASA-TLX. At the conclusion of all runs, subjects completed a questionnaire asking them about their experience in inputting the mission parameters, and starting and stopping the mission using mouse and voice input. In general, the usability of voice commands is acceptable. 
With a relatively well-defined and simple vocabulary, the operator can input the vast majority of the mission parameters using simple, intuitive voice commands. However, voice input may be more applicable to initial mission specification rather than for critical commands such as the need to land immediately due to time and feedback constraints. It would also be convenient to retrieve relevant mission information using voice input. Therefore, further on-going research is looking at using intent from operator utterances to provide the relevant mission information to the operator. The information displayed will be inferred from the operator's utterances just before key phrases are spoken. Linguistic analysis of the context of verbal communication provides insight into the intended meaning of commonly heard phrases such as "What's it doing now?" Analyzing the semantic sphere surrounding these common phrases enables us to predict the operator's intent and supply the operator's desired information to the interface. This paper also describes preliminary investigations into the generation of the semantic space of UAV operation and the success at providing information to the interface based on the operator's utterances.
Where do we store the memory representations that guide attention?
Woodman, Geoffrey F.; Carlisle, Nancy B.; Reinhart, Robert M. G.
2013-01-01
During the last decade one of the most contentious and heavily studied topics in the attention literature has been the role that working memory representations play in controlling perceptual selection. The hypothesis has been advanced that to have attention select a certain perceptual input from the environment, we only need to represent that item in working memory. Here we summarize the work indicating that the relationship between what representations are maintained in working memory and what perceptual inputs are selected is not so simple. First, it appears that attentional selection is also determined by high-level task goals that mediate the relationship between working memory storage and attentional selection. Second, much of the recent work from our laboratory has focused on the role of long-term memory in controlling attentional selection. We review recent evidence supporting the proposal that working memory representations are critical during the initial configuration of attentional control settings, but that after those settings are established long-term memory representations play an important role in controlling which perceptual inputs are selected by mechanisms of attention. PMID:23444390
People and computers--some recent highlights.
Shackel, B
2000-12-01
This paper aims to review selectively a fair proportion of the literature on human-computer interaction (HCI) over the three years since Shackel (J. Am. Soc. Inform. Sci. 48 (11) (1997) 970-986). After a brief note of history I discuss traditional input, output and workplace aspects, the web and 'E-topics', web-related aspects, virtual reality, safety-critical systems, and the need to move from HCI to human-system integration (HSI). Finally I suggest, and consider briefly, some future possibilities and issues including web consequences, embedded ubiquitous computing, and 'back to systems ergonomics?'.
Emissions-critical charge cooling using an organic rankine cycle
Ernst, Timothy C.; Nelson, Christopher R.
2014-07-15
The disclosure provides a system including a Rankine power cycle cooling subsystem providing emissions-critical charge cooling of an input charge flow. The system includes a boiler fluidly coupled to the input charge flow, an energy conversion device fluidly coupled to the boiler, a condenser fluidly coupled to the energy conversion device, a pump fluidly coupled to the condenser and the boiler, an adjuster that adjusts at least one parameter of the Rankine power cycle subsystem to change a temperature of the input charge exiting the boiler, and a sensor adapted to sense a temperature characteristic of the vaporized input charge. The system includes a controller that can determine a target temperature of the input charge sufficient to meet or exceed predetermined target emissions and cause the adjuster to adjust at least one parameter of the Rankine power cycle to achieve the predetermined target emissions.
Degradation and resilience of soils
Lal, R.
1997-01-01
Debate on global soil degradation, its extent and agronomic impact, can only be resolved through understanding of the processes and factors leading to establishment of the cause-effect relationships for major soils, ecoregions, and land uses. Systematic evaluation through long-term experimentation is needed for establishing quantitative criteria of (i) soil quality in relation to specific functions; (ii) soil degradation in relation to critical limits of key soil properties and processes; and (iii) soil resilience in relation to the ease of restoration through judicious management and discriminate use of essential input. Quantitative assessment of soil degradation can be obtained by evaluating its impact on productivity for different land uses and management systems. Interdisciplinary research is needed to quantify soil degradation effects on decrease in productivity, reduction in biomass, and decline in environmental quality through pollution and eutrophication of natural waters and emission of radiatively-active gases from terrestrial ecosystems to the atmosphere. Data from long-term field experiments in principal ecoregions are specifically needed to (i) establish relationships between soil quality versus soil degradation and soil quality versus soil resilience; (ii) identify indicators of soil quality and soil resilience; and (iii) establish critical limits of important properties for soil degradation and soil resilience. There is a need to develop and standardize techniques for measuring soil resilience.
The reform of home care services in Ontario: opportunity lost or lesson learned?
Randall, Glen
2007-06-01
With the release of the Romanow Commission report, Canadian governments are poised to consider the creation of a national home care program. If occupational and physical therapists are to have input in shaping such a program, they will need to learn from lost opportunities of the past. This paper provides an overview of recent reforms to home care in Ontario with an emphasis on rehabilitation services. Data were collected from documents and 28 key informant interviews with rehabilitation professionals. Home care in Ontario has evolved in a piecemeal manner without rehabilitation professionals playing a prominent role in program design. Rehabilitation services play a critical role in facilitating hospital discharges, minimizing readmissions, and improving the quality of people's lives. Canadians will benefit if occupational and physical therapists seize the unique opportunity before them to provide meaningful input into creating a national home care program.
Morphological elucidation of basal ganglia circuits contributing reward prediction
Fujiyama, Fumino; Takahashi, Susumu; Karube, Fuyuki
2015-01-01
Electrophysiological studies in monkeys have shown that dopaminergic neurons respond to the reward prediction error. In addition, striatal neurons alter their responsiveness to cortical or thalamic inputs in response to the dopamine signal, via the mechanism of dopamine-regulated synaptic plasticity. These findings have led to the hypothesis that the striatum exhibits synaptic plasticity under the influence of the reward prediction error and conducts reinforcement learning throughout the basal ganglia circuits. The reinforcement learning model is useful; however, the mechanism by which such a process emerges in the basal ganglia needs to be anatomically explained. The actor–critic model has been previously proposed and extended by positing role sharing within the striatum, focusing on the striosome/matrix compartments. However, this hypothesis has been difficult to confirm morphologically, partly because of the complex structure of the striosome/matrix compartments. Here, we review recent morphological studies that elucidate the input/output organization of the striatal compartments. PMID:25698913
Dendritic integration: 60 years of progress.
Stuart, Greg J; Spruston, Nelson
2015-12-01
Understanding how individual neurons integrate the thousands of synaptic inputs they receive is critical to understanding how the brain works. Modeling studies in silico and experimental work in vitro, dating back more than half a century, have revealed that neurons can perform a variety of different passive and active forms of synaptic integration on their inputs. But how are synaptic inputs integrated in the intact brain? With the development of new techniques, this question has recently received substantial attention, with new findings suggesting that many of the forms of synaptic integration observed in vitro also occur in vivo, including in awake animals. Here we review six decades of progress, which collectively highlights the complex ways that single neurons integrate their inputs, emphasizing the critical role of dendrites in information processing in the brain.
Multiple-Input Subject-Specific Modeling of Plasma Glucose Concentration for Feedforward Control.
Kotz, Kaylee; Cinar, Ali; Mei, Yong; Roggendorf, Amy; Littlejohn, Elizabeth; Quinn, Laurie; Rollins, Derrick K
2014-11-26
The ability to accurately develop subject-specific, input causation models, for blood glucose concentration (BGC) for large input sets can have a significant impact on tightening control for insulin dependent diabetes. More specifically, for Type 1 diabetics (T1Ds), it can lead to an effective artificial pancreas (i.e., an automatic control system that delivers exogenous insulin) under extreme changes in critical disturbances. These disturbances include food consumption, activity variations, and physiological stress changes. Thus, this paper presents a free-living, outpatient, multiple-input, modeling method for BGC with strong causation attributes that is stable and guards against overfitting to provide an effective modeling approach for feedforward control (FFC). This approach is a Wiener block-oriented methodology, which has unique attributes for meeting critical requirements for effective, long-term, FFC.
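A minimal sketch of the Wiener block-oriented structure the abstract names: each input passes through its own linear dynamic block, the intermediate signals are summed, and a static nonlinearity maps the sum to the output. Only the block structure comes from the Wiener framework; the input signals, time constants, gains, and output map below are hypothetical stand-ins, not the paper's identified model.

```python
import math

def wiener_predict(samples, taus, gains, static_map):
    """Toy Wiener model: per-input first-order linear dynamic blocks
    followed by a shared static nonlinearity on the summed states."""
    states = [0.0] * len(taus)
    out = []
    for sample in samples:                     # one tuple of inputs per step
        for i, u in enumerate(sample):
            a = math.exp(-1.0 / taus[i])       # discrete first-order lag
            states[i] = a * states[i] + (1.0 - a) * gains[i] * u
        out.append(static_map(sum(states)))
    return out

# hypothetical drivers: a carbohydrate impulse at t=0, insulin at t=30
carbs = [1.0] + [0.0] * 99
insulin = [0.0] * 30 + [1.0] + [0.0] * 69
static_map = lambda v: 100.0 + 60.0 * math.tanh(v)   # saturating mg/dL scale
bgc = wiener_predict(zip(carbs, insulin), taus=[8.0, 15.0],
                     gains=[1.0, -1.5], static_map=static_map)
# bgc rises after the carbohydrate impulse, dips below baseline after insulin
```

The appeal of this structure for feedforward control is that each disturbance (food, activity, stress) gets its own causal dynamic path into the prediction, while the static nonlinearity bounds the output, which guards against overfitting as the abstract emphasizes.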
NASA Technical Reports Server (NTRS)
Olejarski, Michael; Appleton, Amy; Deltorchio, Stephen
2009-01-01
The Group Capability Model (GCM) is a software tool that allows an organization, from first line management to senior executive, to monitor and track the health (capability) of various groups in performing their contractual obligations. GCM calculates a Group Capability Index (GCI) by comparing actual head counts, certifications, and/or skills within a group. The model can also be used to simulate the effects of employee usage, training, and attrition on the GCI. A universal tool and common method was required due to the high risk of losing skills necessary to complete the Space Shuttle Program and meet the needs of the Constellation Program. During this transition from one space vehicle to another, the uncertainty among the critical skilled workforce is high and attrition has the potential to be unmanageable. GCM allows managers to establish requirements for their group in the form of head counts, certification requirements, or skills requirements. GCM then calculates a Group Capability Index (GCI), where a score of 1 indicates that the group is at the appropriate level; anything less than 1 indicates a potential for improvement. This shows the health of a group, both currently and over time. GCM accepts as input head count, certification needs, critical needs, competency needs, and competency critical needs. In addition, team members are categorized by years of experience, percentage of contribution, ex-members and their skills, availability, function, and in-work requirements. Outputs are several reports, including actual vs. required head count, actual vs. required certificates, GCI change over time (by month), and more. The program stores historical data for summary and historical reporting, which is done via an Excel spreadsheet that is color-coded to show health statistics at a glance. GCM has provided the Shuttle Ground Processing team with a quantifiable, repeatable approach to assessing and managing the skills in their organization. 
They now have a common frame of reference across NASA/contractor lines to communicate and mitigate any critical skills concerns.
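The abstract only states that the GCI compares actual against required head counts, certifications, and skills, with a score of 1 meaning the group is at the appropriate level. One plausible reading of such an index, sketched here with invented numbers, is a capped per-category ratio that reports the weakest category; the exact formula used by GCM is not given in the abstract and is an assumption here.

```python
def group_capability_index(actual, required):
    """Illustrative GCI: for each tracked category take the ratio of
    actual to required, capped at 1.0, and report the weakest category.
    A score of 1 means every requirement is met; below 1 flags a gap."""
    ratios = {k: min(actual.get(k, 0) / required[k], 1.0) for k in required}
    return min(ratios.values()), ratios

gci, detail = group_capability_index(
    actual={"head_count": 18, "certifications": 40, "critical_skills": 7},
    required={"head_count": 20, "certifications": 35, "critical_skills": 7},
)
# head_count is the weakest category here: 18/20 = 0.9, so gci = 0.9
```

Capping each ratio at 1.0 keeps a surplus in one category (e.g., extra certifications) from masking a shortfall in another, which matches the abstract's framing of the index as a health indicator rather than an average.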
The impact of 14-nm photomask uncertainties on computational lithography solutions
NASA Astrophysics Data System (ADS)
Sturtevant, John; Tejnil, Edita; Lin, Tim; Schultze, Steffen; Buck, Peter; Kalk, Franklin; Nakagawa, Kent; Ning, Guoxiang; Ackmann, Paul; Gans, Fritz; Buergel, Christian
2013-04-01
Computational lithography solutions rely upon accurate process models to faithfully represent the imaging system output for a defined set of process and design inputs. These models, which must balance accuracy demands with simulation runtime boundary conditions, rely upon the accurate representation of multiple parameters associated with the scanner and the photomask. While certain system input variables, such as scanner numerical aperture, can be empirically tuned to wafer CD data over a small range around the presumed set point, it can be dangerous to do so since CD errors can alias across multiple input variables. Therefore, many input variables for simulation are based upon designed or recipe-requested values or independent measurements. It is known, however, that certain measurement methodologies, while precise, can have significant inaccuracies. Additionally, there are known errors associated with the representation of certain system parameters. With shrinking total CD control budgets, appropriate accounting for all sources of error becomes more important, and the cumulative consequence of input errors to the computational lithography model can become significant. In this work, we examine with a simulation sensitivity study the impact of errors in the representation of photomask properties including CD bias, corner rounding, refractive index, thickness, and sidewall angle. The factors that are most critical to be accurately represented in the model are cataloged. CD bias values are based on state-of-the-art mask manufacturing data, while changes in the other variables are speculative, highlighting the need for improved metrology and awareness.
Financial Strategies Moderate Weather Impacts on Food Security Outcomes
NASA Astrophysics Data System (ADS)
Brown, M. E.; Niles, M.
2016-12-01
Global food security relies on local agricultural capacity as well as the financial ability to import food from elsewhere. Climate change is likely to affect the ability to grow sufficient food to meet the needs of a growing population in low income countries where population expansion is the greatest. This paper presents an analysis of 2095 household surveys from 12 food insecure countries in West Africa, East Africa and Asia from the Climate Change, Agriculture, and Food Security (CCAFS) program conducted from 2010-2012. Using a multi-level hierarchical random effects model, we estimated the number of months a household was food insecure with information on the rainfall anomaly the year prior to the survey, agricultural input use, cash income, and community group membership. We found that when the rainfall was either one standard deviation above or below the mean, the number of months households experience food insecurity increased by 74%. When there is a significant weather anomaly, agricultural credit and cash income, but not agricultural inputs or social capital, are found to be critical factors reducing food insecurity. This highlights the ongoing and critical importance of risk reduction strategies such as crop insurance, government safety nets, and credit for maintaining food security in the face of climate change.
Effect of Heat Input on the Tensile Damage Evolution in Pulsed Laser Welded Ti6Al4V Titanium Sheets
NASA Astrophysics Data System (ADS)
Liu, Jing; Gao, Xiaolong; Zhang, Jianxun
2016-11-01
The present paper is focused on studying the effect of heat input on the tensile damage evolution of pulsed Nd:YAG laser welding of Ti6Al4V alloy under monotonic loading. To analyze why the tensile fracture site of the pulsed-laser-welded Ti6Al4V sheet joints changes with the heat input under monotonic loading, the microstructure of the sample at different nominal strain values was investigated by in situ observation. Experimental results show that the tensile ductility and fatigue life of welded joints with low heat input are higher than those of welded joints with high heat input. Under tensile loads, the critical engineering strain for crack initiation is much lower in the welded joints with high heat input than in the welded joints with low and medium heat input. Microstructural damage accumulates much faster in the fusion zone than in the base metal for the welded joints with high heat input, whereas it accumulates much faster in the base metal than in the fusion zone for the welded joints with low heat input. Consequently, the welded joints fractured in the fusion zone for the welds with high heat input, whereas they ruptured in the base metal for the welds with low heat input. These results demonstrate that the fine-grained microstructure produced by low heat input can improve the critical nominal strain for crack initiation and the resistance to microstructural damage.
NASA Astrophysics Data System (ADS)
Yu, Rong; Ding, Chunling; Wang, Jiangpeng; Zhang, Duo
2017-12-01
We explore the possibility of using an active doubly resonant microtoroid resonator to produce high-efficiency third-harmonic generation (THG) by exploiting optical third-order nonlinearity. In a microresonator, the active fundamental mode is coherently driven with a continuous-wave input laser at the telecommunication wavelength (1550 nm), and then, the visible THG signal (517 nm) is monitored via an individual bus waveguide. We thoroughly compare our results with those obtained from the conventional passive (i.e., lossy) microtoroid resonator by a systematic analysis and detailed numerical simulations based on the Heisenberg-Langevin equations of motion. It is shown that the achievable THG spectrum features an ultralow critical input power. The THG power transmission can be significantly enhanced by about three orders of magnitude at a low input power of 0.1 μW as compared with the obtained results in the passive microtoroid resonator THG system. Moreover, the THG efficiency can reach up to 100% with optical critical input power as low as a few microwatts. In turn, the analytical expressions of the critical intracavity intensity of the light in the microcavity, the critical input pump power, and the maximum THG efficiency are obtained. The enhanced THG power transmission and high conversion efficiency are attributed to a gain-induced loss compensation in the microtoroid resonator, reducing the effective loss felt by the resonator photons. With state-of-the-art technologies in the field of solid-state resonators, including but not limited to microtoroids, the proposed THG scheme is experimentally realizable.
National governance of archetypes in Norway.
Ljosland Bakke, Silje
2015-01-01
Norwegian National ICT has implemented a national governance scheme for archetypes. The scheme uses openEHR, and is possibly the first of its kind worldwide. It introduces several new processes and methods for crowd sourcing clinician input. It has spent much of its first year establishing practical processes and recruiting clinicians, and only a few archetypes have been reviewed and approved. Some non-reusable archetypes have emerged while the governance scheme established itself, which demonstrates the need for a centralised governance. As the mass of clinician involvement reached a critical point at the end of 2014, the rate of archetype review and approval increased.
Actor-critic-based optimal tracking for partially unknown nonlinear discrete-time systems.
Kiumarsi, Bahare; Lewis, Frank L
2015-01-01
This paper presents a partially model-free adaptive optimal control solution to the deterministic nonlinear discrete-time (DT) tracking control problem in the presence of input constraints. The tracking error dynamics and reference trajectory dynamics are first combined to form an augmented system. Then, a new discounted performance function based on the augmented system is presented for the optimal nonlinear tracking problem. In contrast to the standard solution, which finds the feedforward and feedback terms of the control input separately, the minimization of the proposed discounted performance function gives both feedback and feedforward parts of the control input simultaneously. This enables us to encode the input constraints into the optimization problem using a nonquadratic performance function. The DT tracking Bellman equation and tracking Hamilton-Jacobi-Bellman (HJB) are derived. An actor-critic-based reinforcement learning algorithm is used to learn the solution to the tracking HJB equation online without requiring knowledge of the system drift dynamics. That is, two neural networks (NNs), namely, actor NN and critic NN, are tuned online and simultaneously to generate the optimal bounded control policy. A simulation example is given to show the effectiveness of the proposed method.
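A standard way to "encode the input constraints into the optimization problem using a nonquadratic performance function", as the abstract puts it, is an integrated inverse-tanh penalty that diverges as the control approaches its bound, so the minimizing policy stays inside the constraint automatically. This sketch uses that common construction from the actor-critic literature with an assumed bound `lam` and simple numerical integration; it is not necessarily the paper's exact weighting.

```python
import math

def input_penalty(u, lam, n=2000):
    """Nonquadratic penalty W(u) = 2 * integral_0^u lam * atanh(v/lam) dv,
    evaluated with the trapezoidal rule. W(0) = 0, W is even in u, and W
    grows steeply as |u| approaches the bound lam, which is what makes the
    constrained-input optimal control problem well posed."""
    if u == 0.0:
        return 0.0
    h = u / n
    total = 0.0
    for i in range(n):
        v0, v1 = i * h, (i + 1) * h
        f0 = lam * math.atanh(v0 / lam)
        f1 = lam * math.atanh(v1 / lam)
        total += 0.5 * (f0 + f1) * h           # trapezoid on [v0, v1]
    return 2.0 * total

# the penalty rises sharply toward the assumed bound lam = 1.0
costs = [input_penalty(u, 1.0) for u in (0.2, 0.5, 0.9, 0.99)]
```

Replacing a quadratic control cost with this penalty inside the discounted performance function is what lets the tracking HJB solution respect the input constraints without a separate saturation step.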
Training future health providers to care for the underserved: a pilot interprofessional experience.
Hasnain, Memoona; Koronkowski, Michael J; Kondratowicz, Diane M; Goliak, Kristen L
2012-01-01
Interprofessional teamwork is essential for effective delivery of health care to all patients, particularly the vulnerable and underserved. This brief communication describes a pilot interprofessional learning experience designed to introduce medicine and pharmacy students to critical health issues affecting at-risk, vulnerable patients and helping students learn the value of functioning effectively in interprofessional teams. With reflective practice as an overarching principle, readings, writing assignments, a community-based immersion experience, discussion seminars, and presentations were organized to cultivate students' insights into key issues impacting the health and well-being of vulnerable patients. A written program evaluation form was used to gather students' feedback about this learning experience. Participating students evaluated this learning experience positively. Both quantitative and qualitative input indicated the usefulness of this learning experience in stimulating learners' thinking and helping them learn to work collaboratively with peers from another discipline to understand and address health issues for at-risk, vulnerable patients within their community. This pilot educational activity helped medicine and pharmacy students learn the value of functioning effectively in interprofessional teams. Given the importance of interprofessional teamwork and the increasing need to respond to the health needs of underserved populations, integrating interprofessional learning experiences in health professions training is highly relevant, feasible, and critically needed.
NASA Astrophysics Data System (ADS)
Mumbaraddi, Avinash; Yu, Huidan (Whitney); Sawchuk, Alan; Dalsing, Michael
2015-11-01
The objective of this clinical-need driven research is to investigate the effect of renal artery stenosis (RAS) on the blood flow and wall shear stress in renal arteries through 4-D patient-specific computational hemodynamics (PSCH) and to search for possible critical RASs that significantly alter the pressure gradient across the stenosis by manually varying the size of RAS from 50% to 95%. The identification of the critical RAS is important to understand the contribution of RAS to the overall renal resistance, so that appropriate clinical therapy can be determined in order to reduce the hypertension. Clinical CT angiographic data together with Doppler ultrasound images of an anonymous patient are used, serving as the required inputs to the PSCH. To validate the PSCH, we use both Ansys Fluent and SimVascular and compare velocity, pressure, and wall shear stress under identical conditions. Renal Imaging Technology Development Program (RITDP) Grant.
A Dielectric Rod Antenna for Picosecond Pulse Stimulation of Neurological Tissue
Petrella, Ross A.; Schoenbach, Karl H.; Xiao, Shu
2016-01-01
A dielectrically loaded wideband rod antenna has been studied as a pulse delivery system to subcutaneous tissues. Simulation results applying 100 ps electrical pulse show that it allows us to generate critical electric field for biological effects, such as brain stimulation, in the range of several centimeters. In order to reach the critical electric field for biological effects, which is approximately 20 kV/cm, at a depth of 2 cm, the input voltage needs to be 175 kV. The electric field spot size in the brain at this position is approximately 1 cm2. Experimental studies in free space with a conical antenna (part of the antenna system) with aluminum nitride as the dielectric have confirmed the accuracy of the simulation. These results set the foundation for high voltage in situ experiments on the complete antenna system and the delivery of pulses to biological tissue. PMID:27563160
Ginosar, Daniel M.; Fox, Robert V.
2005-05-03
A process for producing alkyl esters useful in biofuels and lubricants by transesterifying glyceride-containing or esterifying free fatty acid-containing substances in a single critical phase medium is disclosed. The critical phase medium provides increased reaction rates, decreases the loss of catalyst or catalyst activity and improves the overall yield of desired product. The process involves dissolving an input glyceride- or free fatty acid-containing substance, together with an alcohol or water, into a critical fluid medium; reacting the glyceride- or free fatty acid-containing substance with the alcohol or water over either a solid or liquid acidic or basic catalyst; and sequentially separating the products from each other and from the critical fluid medium, which can then be recycled back into the process. The process significantly reduces the cost of producing additives or alternatives to automotive fuels and lubricants utilizing inexpensive glyceride- or free fatty acid-containing substances, such as animal fats, vegetable oils, rendered fats, and restaurant grease.
Moreland, Leslie D; Gore, Fiona M; Andre, Nathalie; Cairncross, Sandy; Ensink, Jeroen H J
2016-08-01
There are significant gaps in information about the inputs required to effectively extend and sustain hygiene promotion activities to improve people's health outcomes through water, sanitation and hygiene (WASH) interventions. We sought to analyse current country and global trends in the use of key inputs required for effective and sustainable implementation of hygiene promotion, to help guide hygiene promotion policy and decision-making after 2015. Data collected in response to the GLAAS 2013/2014 survey from 93 of 94 countries were included, and responses were analysed for 12 questions assessing the inputs and enabling environment for hygiene promotion under four thematic areas. Data from 20 of 23 External Support Agencies (ESAs), collected through self-administered surveys, were also included and analysed. Firstly, the data showed a large variation in how hygiene promotion is defined and in what constitutes key activities in this area. Secondly, the challenges to implementing hygiene promotion are considerable: they include poor implementation of policies and plans, weak coordination mechanisms, human resource limitations and a lack of available hygiene promotion budget data. Despite the proven benefits of hand washing with soap, a critical hygiene-related factor in minimising infection, GLAAS 2013/2014 survey data showed that hygiene promotion remains a neglected component of WASH. Additional research to identify the context-specific strategies and inputs required to enhance the effectiveness of hygiene promotion at scale is needed. Improved data collection methods are also necessary to advance the availability and reliability of hygiene-specific information. © 2016 John Wiley & Sons Ltd.
NASA Technical Reports Server (NTRS)
Gaebler, John A.; Tolson, Robert H.
2010-01-01
In the study of entry, descent, and landing, Monte Carlo sampling methods are often employed to study the uncertainty in the designed trajectory. The large number of uncertain inputs and outputs, coupled with complicated non-linear models, can make interpretation of the results difficult. Three methods that provide statistical insights are applied to an entry, descent, and landing simulation. The advantages and disadvantages of each method are discussed in terms of the insights gained versus the computational cost. The first method investigated was failure domain bounding, which aims to reduce the computational cost of assessing the failure probability. Next, a variance-based sensitivity analysis was studied for its ability to identify which input variable's uncertainty has the greatest impact on the uncertainty of an output. Finally, probabilistic sensitivity analysis is used to calculate certain sensitivities at a reduced computational cost. These methods produce valuable information that identifies critical mission parameters and needs for new technology, but generally at a significant computational cost.
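A variance-based first-order (Sobol-style) index of the kind described above can be sketched with a double-loop Monte Carlo estimate. The toy model, sampler, and sample sizes below are illustrative assumptions, not the EDL simulation studied in the paper.

```python
# Hedged sketch of a variance-based (Sobol-style) first-order sensitivity
# index, estimated by double-loop Monte Carlo conditioning on one input.
import random

def first_order_index(model, sample_input, fix_index, n_inputs=3,
                      n_outer=200, n_inner=200):
    """Estimate S_i = Var(E[Y | X_i]) / Var(Y) for input X_i."""
    random.seed(0)                              # reproducible toy estimate
    cond_means, all_outputs = [], []
    for _ in range(n_outer):
        xi = sample_input(fix_index)            # draw the conditioned input once
        inner = []
        for _ in range(n_inner):
            x = [sample_input(j) for j in range(n_inputs)]
            x[fix_index] = xi                   # hold X_i fixed in the inner loop
            y = model(x)
            inner.append(y)
            all_outputs.append(y)
        cond_means.append(sum(inner) / n_inner)
    mean_y = sum(all_outputs) / len(all_outputs)
    var_y = sum((y - mean_y) ** 2 for y in all_outputs) / len(all_outputs)
    mean_cm = sum(cond_means) / n_outer
    var_cm = sum((m - mean_cm) ** 2 for m in cond_means) / n_outer
    return var_cm / var_y

# Toy model: the output depends strongly on X0 and only weakly on X2.
toy_model = lambda x: 10.0 * x[0] + 0.1 * x[2]
uniform_sampler = lambda j: random.uniform(0.0, 1.0)
```

With this toy model the index for X0 should dominate the index for X2, flagging X0 as the input whose uncertainty most drives the output variance.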
Analysing uncertainties of supply and demand in the future use of hydrogen as an energy vector
NASA Astrophysics Data System (ADS)
Lenel, U. R.; Davies, D. G. S.; Moore, M. A.
An analytical technique (Analysis with Uncertain Quantities), developed at Fulmer, is being used to examine the sensitivity of the outcome to uncertainties in input quantities, in order to highlight which input quantities critically affect the potential role of hydrogen. The work presented here includes an outline of the model and the analysis technique, along with basic considerations of the input quantities to the model (demand, supply and constraints). Some examples are given of probabilistic estimates of input quantities.
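The Fulmer technique itself is not public; a generic Monte Carlo propagation of uncertain supply and demand quantities sketches the general approach. The triangular distributions and the simple supply-minus-demand balance are illustrative assumptions, not the actual hydrogen model.

```python
# Generic sketch of propagating probabilistic estimates of uncertain input
# quantities through a simple supply/demand balance (illustrative only).
import random

random.seed(1)

def sample_balance(n=10000):
    """Return n samples of hydrogen surplus = supply - demand (arbitrary units)."""
    surplus = []
    for _ in range(n):
        supply = random.triangular(80, 140, 110)   # low, high, mode
        demand = random.triangular(70, 150, 100)
        surplus.append(supply - demand)
    return surplus

samples = sample_balance()
p_shortfall = sum(s < 0 for s in samples) / len(samples)
```

The resulting shortfall probability is one example of the kind of probabilistic output such an analysis can produce from uncertain input estimates.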
How Much Input Do You Need to Learn the Most Frequent 9,000 Words?
ERIC Educational Resources Information Center
Nation, Paul
2014-01-01
This study looks at how much input is needed to gain enough repetition of the first 9,000 words of English for learning to occur. It uses corpora of various sizes and composition to see how many tokens of input would be needed to gain at least twelve repetitions and to meet most of the words at eight of the nine 1,000-word-family levels. Corpus sizes…
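As a rough illustration of the corpus-size question, one can estimate, under an assumed Zipf distribution (not the actual corpora used in the study), how many running tokens are needed before a word at a given frequency rank has been met at least twelve times:

```python
# Hypothetical sketch: corpus size needed for >= 12 repetitions of a word
# at a given frequency rank, assuming a Zipf distribution over 9,000 words.
import math

def zipf_prob(rank, vocab=9000, s=1.0):
    """Probability of a word at a given rank under Zipf's law."""
    h = sum(1.0 / (r ** s) for r in range(1, vocab + 1))  # normalization
    return (1.0 / rank ** s) / h

def tokens_needed(rank, reps=12, vocab=9000):
    """Expected running tokens so the word at this rank occurs `reps` times."""
    return math.ceil(reps / zipf_prob(rank, vocab=vocab))
```

Under this assumption, low-frequency words at the 9,000th rank require corpora orders of magnitude larger than high-frequency words, which is the core of the repetition problem the study examines empirically.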
Final report : PATTON Alliance gazetteer evaluation project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bleakly, Denise Rae
2007-08-01
In 2005 the National Ground Intelligence Center (NGIC) proposed that the PATTON Alliance provide assistance in evaluating and obtaining the Integrated Gazetteer Database (IGDB), developed for the Naval Space Warfare Command Research group (SPAWAR) under Advance Research and Development Activity (ARDA) funds by MITRE Inc., fielded to the text-based search tool GeoLocator, currently in use by NGIC. We met with the developers of GeoLocator and identified their requirements for a better gazetteer. We then validated those requirements by reviewing the technical literature, meeting with other members of the intelligence community (IC), and talking with both the United States Geological Survey (USGS) and the National Geospatial Intelligence Agency (NGA), the authoritative sources for official geographic name information. We thus identified 12 high-level requirements from users and the broader intelligence community. The IGDB satisfies many of these requirements. We identified gaps and proposed ways of closing these gaps. Three important needs have not been addressed but are critical future needs for the broader intelligence community. These needs include standardization of gazetteer data, a web feature service for gazetteer information that is maintained by NGA and USGS but accessible to users, and a common forum that brings together IC stakeholders and federal agency representatives to provide input to these activities over the next several years. Establishing a robust gazetteer web feature service that is available to all IC users may go a long way toward resolving the gazetteer needs within the IC. Without a common forum to provide input and feedback, community adoption may take significantly longer than anticipated, with resulting risks to the war fighter.
Parra-Medina, Deborah; Esparza, Laura A.
2014-01-01
Hispanic girls are burdened with high levels of obesity and are less active than the general adolescent population, highlighting the need for creative strategies developed with community input to improve PA behaviors. Involving girls, parents, and the community in the intervention planning process may improve uptake and maintenance of PA. The purpose of this article is to describe how we engaged adolescent girls as partners in community-based intervention planning research. We begin with an overview of the research project and then describe how we used Participatory Photo Mapping (PPM) to engage girls in critical reflection and problems solving. PMID:25423243
Supplies and equipment for pediatric emergency mass critical care.
Bohn, Desmond; Kanter, Robert K; Burns, Jeffrey; Barfield, Wanda D; Kissoon, Niranjan
2011-11-01
Epidemics of acute respiratory disease, such as severe acute respiratory syndrome in 2003, and natural disasters, such as Hurricane Katrina in 2005, have prompted planning in hospitals that offer adult critical care to increase their capacity and equipment inventory for responding to a major demand surge. However, planning at a national, state, or local level to address the particular medical resource needs of children for mass critical care has yet to occur in any coordinated way. This paper presents the consensus opinion of the Task Force regarding supplies and equipment that would be required during a pediatric mass critical care crisis. In May 2008, the Task Force for Mass Critical Care published guidance on provision of mass critical care to adults. Acknowledging that the critical care needs of children during disasters were unaddressed by this effort, a 17-member Steering Committee, assembled by the Oak Ridge Institute for Science and Education with guidance from members of the American Academy of Pediatrics, convened in April 2009 to determine priority topic areas for pediatric emergency mass critical care recommendations. Steering Committee members established subcommittees by topic area and performed literature reviews of MEDLINE and Ovid databases. The Steering Committee produced draft outlines through consensus-based study of the literature and convened October 6-7, 2009, in New York, NY, to review and revise each outline. Eight draft documents were subsequently developed from the revised outlines as well as through searches of MEDLINE updated through March 2010. The Pediatric Emergency Mass Critical Care Task Force, composed of 36 experts from diverse public health, medical, and disaster response fields, convened in Atlanta, GA, on March 29-30, 2010. Feedback on each manuscript was compiled and the Steering Committee revised each document to reflect expert input in addition to the most current medical literature.
The Task Force endorsed the view that supplies and equipment must be available for a tripling of capacity above the usual peak pediatric intensive care unit capacity for at least 10 days. The recommended size-specific pediatric mass critical care equipment stockpile for two types of patients is presented in terms of equipment needs per ten mass critical care beds, which would serve 26 patients over a 10-day period. Specific recommendations are made regarding ventilator capacity, including the potential use of high-frequency oscillatory ventilation and extracorporeal membrane oxygenation. Other recommendations include inventories for disposable medical equipment, medications, and staffing levels.
Supplies and equipment for pediatric emergency mass critical care
Bohn, Desmond; Kanter, Robert K.; Burns, Jeffrey; Barfield, Wanda D.; Kissoon, Niranjan
2015-01-01
Introduction Epidemics of acute respiratory disease, such as severe acute respiratory syndrome in 2003, and natural disasters, such as Hurricane Katrina in 2005, have prompted planning in hospitals that offer adult critical care to increase their capacity and equipment inventory for responding to a major demand surge. However, planning at a national, state, or local level to address the particular medical resource needs of children for mass critical care has yet to occur in any coordinated way. This paper presents the consensus opinion of the Task Force regarding supplies and equipment that would be required during a pediatric mass critical care crisis. Methods In May 2008, the Task Force for Mass Critical Care published guidance on provision of mass critical care to adults. Acknowledging that the critical care needs of children during disasters were unaddressed by this effort, a 17-member Steering Committee, assembled by the Oak Ridge Institute for Science and Education with guidance from members of the American Academy of Pediatrics, convened in April 2009 to determine priority topic areas for pediatric emergency mass critical care recommendations. Steering Committee members established subcommittees by topic area and performed literature reviews of MEDLINE and Ovid databases. The Steering Committee produced draft outlines through consensus-based study of the literature and convened October 6-7, 2009, in New York, NY, to review and revise each outline. Eight draft documents were subsequently developed from the revised outlines as well as through searches of MEDLINE updated through March 2010. The Pediatric Emergency Mass Critical Care Task Force, composed of 36 experts from diverse public health, medical, and disaster response fields, convened in Atlanta, GA, on March 29-30, 2010. Feedback on each manuscript was compiled and the Steering Committee revised each document to reflect expert input in addition to the most current medical literature.
Task Force Recommendations The Task Force endorsed the view that supplies and equipment must be available for a tripling of capacity above the usual peak pediatric intensive care unit capacity for at least 10 days. The recommended size-specific pediatric mass critical care equipment stockpile for two types of patients is presented in terms of equipment needs per ten mass critical care beds, which would serve 26 patients over a 10-day period. Specific recommendations are made regarding ventilator capacity, including the potential use of high-frequency oscillatory ventilation and extracorporeal membrane oxygenation. Other recommendations include inventories for disposable medical equipment, medications, and staffing levels. PMID:22067920
Optimization of replacement and inspection decisions for multiple components on a power system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mauney, D.A.
1994-12-31
The use of optimization on the rescheduling of replacement dates provided a very proactive approach to deciding when components on individual units need to be addressed with a run/repair/replace decision. Including the effects of the time value of money, taxes and unit need inside the spreadsheet model allowed the decision maker to concentrate on the effects of engineering input and replacement date decisions on the final net present value (NPV). The personal computer (PC)-based model was applied to a group of 140 forced-outage-critical fossil plant tube components across a power system. The estimated resulting NPV of the optimization was in the tens of millions of dollars. This PC spreadsheet model allows the interaction of inputs from structural reliability risk assessment models, plant foreman interviews, and actual failure history on a component-by-component, unit-by-unit basis across a complete power production system. This model includes not only the forced outage performance of these components caused by tube failures but also the forecasted need of the individual units on the power system and the expected cost of their replacement power if forced off line. The use of cash flow analysis techniques in the spreadsheet model results in the calculation of an NPV for a whole combination of replacement dates. This allows rapid assessments of "what if" scenarios of major maintenance projects on a systemwide basis and not just on a unit-by-unit basis.
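The cash-flow core of such a spreadsheet model reduces to a net-present-value calculation over candidate replacement dates. The discount rate and dollar figures below are hypothetical, not the utility's actual data.

```python
# Minimal NPV sketch for a run/repair/replace decision; all figures are
# illustrative assumptions, not the power system's actual cash flows.
def npv(cash_flows, rate):
    """Net present value of (year, amount) cash flows at an annual rate."""
    return sum(amount / (1 + rate) ** year for year, amount in cash_flows)

# Replace a tube component now (year-0 cost) to avoid expected
# forced-outage replacement-power costs in later years.
replace_now = [(0, -2.0e6), (1, 0.6e6), (2, 0.6e6), (3, 0.6e6), (4, 0.6e6)]
```

Evaluating `npv(replace_now, rate)` for each candidate replacement date, and for each component and unit, is the kind of "what if" comparison the spreadsheet model automates systemwide.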
Hydrology in a peaty high marsh: hysteretic flow and biogeochemical implications
Terrestrial nutrient input to coastal waters is a critical water quality problem worldwide, and salt marshes may provide a valuable nutrient buffer (either by removal or by smoothing out pulse inputs) between terrestrial sources and sensitive estuarine habitats. One of the major...
Simple Sensitivity Analysis for Orion GNC
NASA Technical Reports Server (NTRS)
Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar
2013-01-01
The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g., touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, for everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool (Critical Factors Tool, or CFT) developed to find the input variables or pairs of variables which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, can inform where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors interact dependently or independently. The tool found that input variables such as moments, mass, thrust dispersions, and date of launch were significant factors for the success of various requirements. Examples are shown in this paper, as well as a summary and physics discussion of the EFT-1 driving factors that the tool found.
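One sensitivity measure of the kind described, estimating how the probability of meeting a requirement varies with a single dispersed input, can be sketched by binning Monte Carlo cases on that input. The CFT's actual estimator is not public, and the toy touchdown model below is an assumption for illustration.

```python
# Hedged sketch: success probability per bin of one dispersed input.
# The wind/touchdown toy model is an illustrative assumption.
import random

random.seed(2)

def success_prob_by_bin(xs, successes, n_bins=4):
    """Return the success fraction per equal-width bin of the input variable."""
    lo, hi = min(xs), max(xs)
    width = (hi - lo) / n_bins or 1.0
    counts = [[0, 0] for _ in range(n_bins)]      # [successes, total] per bin
    for x, ok in zip(xs, successes):
        b = min(int((x - lo) / width), n_bins - 1)
        counts[b][0] += ok
        counts[b][1] += 1
    return [s / t if t else 0.0 for s, t in counts]

# Toy Monte Carlo: touchdown miss grows with wind speed; requirement: miss < 3.
winds = [random.uniform(0, 10) for _ in range(5000)]
misses = [0.5 * w + random.uniform(0, 1) for w in winds]
met_requirement = [m < 3 for m in misses]
profile = success_prob_by_bin(winds, met_requirement)
```

A success-probability profile that falls off sharply across bins, as it does here for wind speed, flags that input as a driving factor for the requirement.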
Ensuring the validity of calculated subcritical limits
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, H.K.
1977-01-01
The care taken at the Savannah River Laboratory and Plant to ensure the validity of calculated subcritical limits is described. Close attention is given to ANSI N16.1-1975, "Validation of Calculational Methods for Nuclear Criticality Safety." The computer codes used for criticality safety computations, which are listed and briefly described, have been placed in the SRL JOSHUA system to facilitate calculation and to reduce input errors. A driver module, KOKO, simplifies and standardizes input and links the codes together in various ways. For any criticality safety evaluation, correlations of the calculational methods are made with experiment to establish bias. Occasionally, subcritical experiments are performed expressly to provide benchmarks. Calculated subcritical limits contain an adequate but not excessive margin to allow for uncertainty in the bias. The final step in any criticality safety evaluation is the writing of a report describing the calculations and justifying the margin.
A policy iteration approach to online optimal control of continuous-time constrained-input systems.
Modares, Hamidreza; Naghibi Sistani, Mohammad-Bagher; Lewis, Frank L
2013-09-01
This paper is an effort towards developing an online learning algorithm to find the optimal control solution for continuous-time (CT) systems subject to input constraints. The proposed method is based on the policy iteration (PI) technique which has recently evolved as a major technique for solving optimal control problems. Although a number of online PI algorithms have been developed for CT systems, none of them take into account the input constraints caused by actuator saturation. In practice, however, ignoring these constraints leads to performance degradation or even system instability. In this paper, to deal with the input constraints, a suitable nonquadratic functional is employed to encode the constraints into the optimization formulation. Then, the proposed PI algorithm is implemented on an actor-critic structure to solve the Hamilton-Jacobi-Bellman (HJB) equation associated with this nonquadratic cost functional in an online fashion. That is, two coupled neural network (NN) approximators, namely an actor and a critic are tuned online and simultaneously for approximating the associated HJB solution and computing the optimal control policy. The critic is used to evaluate the cost associated with the current policy, while the actor is used to find an improved policy based on information provided by the critic. Convergence to a close approximation of the HJB solution as well as stability of the proposed feedback control law are shown. Simulation results of the proposed method on a nonlinear CT system illustrate the effectiveness of the proposed approach. Copyright © 2013 ISA. All rights reserved.
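The abstract does not reproduce the nonquadratic functional itself; a form commonly used in the constrained-input optimal control literature, which encodes an actuator bound |u_i| ≤ λ through an inverse hyperbolic tangent, is shown below as an assumption, not necessarily the paper's exact choice:

```latex
% Nonquadratic input penalty for the constraint |u_i| <= \lambda
% (standard in the constrained-input literature; assumed here for illustration):
W(u) \;=\; 2\int_{0}^{u} \left(\lambda \tanh^{-1}(v/\lambda)\right)^{\top} R \,\mathrm{d}v ,
\qquad
V(x_0) \;=\; \int_{0}^{\infty} \left[\, Q(x(t)) + W(u(t)) \,\right] \mathrm{d}t .
```

Because the minimizing control then takes a tanh form, it is automatically bounded by λ, which is why such a functional lets the PI/actor-critic scheme respect actuator saturation.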
Critical Zone Services as a Measure for Evaluating the Trade-offs in Intensively Managed Landscapes
NASA Astrophysics Data System (ADS)
Richardson, M.; Kumar, P.
2015-12-01
The Critical Zone includes the range of biophysical processes occurring from the top of the vegetation canopy to the weathering zone below the groundwater table. Critical Zone services (Field et al. 2015) provide a measure to value the processes that support the goods and services derived from our landscapes. In intensively managed landscapes, the provisioning and regulating services are being altered through anthropogenic energy inputs so as to derive more agricultural productivity from the land. Land use change and other alterations to the environment result in positive and/or negative net Critical Zone services. Through studies in the Critical Zone Observatory for Intensively Managed Landscapes (IMLCZO), this research seeks to answer questions such as: Are perennial bioenergy crops or the annual crops they replace better for the land and surrounding environment? How do we evaluate the products and services from the land against the energy and resources we put in? Before the economic valuation of Critical Zone services, these questions seemed abstract. However, with developments such as Critical Zone services and life cycle assessments, they are more concrete. To evaluate the trade-offs between positive and negative impacts, life cycle assessments are used to create an inventory of all the energy inputs and outputs in a landscape management system. Total energy is computed by summing the mechanical energy used to construct tile drains, produce fertilizer, and carry out other processes involved in intensively managed landscapes, and the chemical energy gained by the production of biofuels from bioenergy crops. A multi-layer canopy model (MLCan) computes soil, water, and nutrient outputs for each crop type, which can be translated into Critical Zone services. These values are then viewed alongside the energy inputs into the system to show the relationship between agricultural practices and their corresponding ecosystem and environmental impacts.
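The energy bookkeeping described, summing management energy inputs against biofuel energy outputs, can be sketched as a net-energy balance. All MJ/ha figures below are invented for illustration and are not IMLCZO results.

```python
# Minimal life-cycle energy-balance sketch; every figure is a hypothetical
# placeholder, not measured IMLCZO data.
def net_energy(inputs_mj_ha, outputs_mj_ha):
    """Net energy (MJ/ha) = energy out (e.g., biofuel) minus energy in."""
    return sum(outputs_mj_ha.values()) - sum(inputs_mj_ha.values())

perennial = net_energy(
    inputs_mj_ha={"establishment": 4000, "fertilizer": 1000, "harvest": 2000},
    outputs_mj_ha={"biofuel": 180000},
)
annual = net_energy(
    inputs_mj_ha={"tillage": 3000, "fertilizer": 9000, "drying": 4000},
    outputs_mj_ha={"ethanol": 80000},
)
```

Comparing such net-energy figures alongside the Critical Zone service values from MLCan is one way to frame the perennial-versus-annual trade-off the abstract raises.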
Ethical issues in pediatric emergency mass critical care.
Antommaria, Armand H Matheny; Powell, Tia; Miller, Jennifer E; Christian, Michael D
2011-11-01
As a result of recent events, including natural disasters and pandemics, mass critical care planning has become a priority. In general, planning involves limiting the scope of disasters, increasing the supply of medical resources, and allocating scarce resources. Entities at varying levels have articulated ethical frameworks to inform policy development. In spite of this increased focus, children have received limited attention. Children require special attention because of their unique vulnerabilities and needs. In May 2008, the Task Force for Mass Critical Care published guidance on provision of mass critical care to adults. Acknowledging that the critical care needs of children during disasters were unaddressed by this effort, a 17-member Steering Committee, assembled by the Oak Ridge Institute for Science and Education with guidance from members of the American Academy of Pediatrics, convened in April 2009 to determine priority topic areas for pediatric emergency mass critical care recommendations. Steering Committee members established subgroups by topic area and performed literature reviews of MEDLINE and Ovid databases. Draft documents were subsequently developed and revised based on the feedback from the Task Force. The Pediatric Emergency Mass Critical Care Task Force, composed of 36 experts from diverse public health, medical, and disaster response fields, convened in Atlanta, GA, on March 29-30, 2010. This document reflects expert input from the Task Force in addition to the most current medical literature. The Ethics Subcommittee recommends that surge planning seek to provide resources for children in proportion to their percentage of the population or preferably, if data are available, the percentage of those affected by the disaster. Generally, scarce resources should be allocated on the basis of need, benefit, and the conservation of resources. Estimates of need, benefit, and resource utilization may be more subjective or more objective.
While the Subcommittee favors more objective methods, pediatrics lacks a simple, validated scoring system to predict benefit or resource utilization. The Subcommittee hesitantly recommends relying on expert opinion while pediatric triage tools are developed. If resources remain inadequate, they should then be allocated based on queuing or lottery. Choosing between these methods is based on ethical, psychological, and practical considerations upon which the Subcommittee could not reach consensus. The Subcommittee unanimously believes the proposal to favor individuals between 15 and 40 yrs of age is inappropriate. Other age-based criteria and criteria based on social role remain controversial. The Subcommittee recommends continued work to engage all stakeholders, especially the public, in deliberation about these issues.
Trusted Computing Technologies, Intel Trusted Execution Technology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guise, Max Joseph; Wendt, Jeremy Daniel
2011-01-01
We describe the current state-of-the-art in Trusted Computing Technologies, focusing mainly on Intel's Trusted Execution Technology (TXT). This document is based on existing documentation and tests of two existing TXT-based systems: Intel's Trusted Boot and Invisible Things Lab's Qubes OS. We describe what features are lacking in current implementations, describe what a mature system could provide, and present a list of developments to watch. Critical systems perform operation-critical computations on high-importance data. In such systems, the inputs, computation steps, and outputs may be highly sensitive. Sensitive components must be protected from both unauthorized release and unauthorized alteration: unauthorized users should not access the sensitive input and output data, nor be able to alter them; the computation contains intermediate data with the same requirements, and executes algorithms that the unauthorized should not be able to know or alter. Due to various system requirements, such critical systems are frequently built from commercial hardware, employ commercial software, and require network access. These hardware, software, and network system components increase the risk that sensitive input data, computation, and output data may be compromised.
Project Management Using Modern Guidance, Navigation and Control Theory
NASA Technical Reports Server (NTRS)
Hill, Terry
2010-01-01
The idea of control theory and its application to project management is not new; however, literature on the topic and real-world applications showing how all the principles of Guidance, Navigation and Control (GN&C) apply are not as readily available or comprehensive. This paper will address how the fundamental principles of modern GN&C theory have been applied to NASA's Constellation Space Suit project and the results in the ability to manage the project within cost, schedule and budget. As with physical systems, projects can be modeled and managed with the same guiding principles of GN&C, as if the project were a complex vehicle, system or software with time-varying processes, at times non-linear responses, multiple data inputs of varying accuracy and a range of operating points. With such systems the classic approach could be applied to small and well-defined projects; however, with larger, multi-year projects involving multiple organizational structures, external influences and a multitude of diverse resources, modern control theory is required to model and control the project. The fundamental principles of GN&C state that a system is composed of these basic core concepts: state, behavior, control system, navigation system, guidance and planning logic, and feedback systems. The state of a system is a definition of the aspects of the dynamics of the system that can change, such as position, velocity, acceleration, coordinate-based attitude, temperature, etc. The behavior of the system concerns what changes are possible rather than what can change, which is captured in the state of the system. The behavior of a system is captured in the system modeling and, if properly done, will aid in accurate system performance prediction in the future. The control system understands the state and behavior of the system and uses feedback systems to adjust the control inputs into the system.
The navigation system takes the multiple data inputs and, based upon a priori knowledge of each input, develops a statistical weighting of the inputs to determine where the system currently is. The guidance and planning logic, understanding where the system is (as provided by the navigation system), in turn determines where it needs to be and how to get there. Lastly, the feedback system is the right arm of the control system, allowing it to effect change in the overall system; it is therefore critical not only to correctly identify the system feedback inputs but also the system response to those inputs. With any systems project it is critical that the objective of the system be clearly defined, not only for planning but also to measure performance and to aid in the guidance of the system or project.
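The navigation analogy, weighting multiple data inputs by a priori knowledge of their accuracy, can be sketched as inverse-variance fusion. The percent-complete figures below are hypothetical, and the project's actual weighting scheme is an assumption here.

```python
# Sketch of the "statistical weighting" in the navigation analogy:
# inverse-variance fusion of several noisy status inputs.
def fuse(measurements):
    """Combine (estimate, variance) pairs; lower variance means more weight."""
    weight_sum = sum(1.0 / var for _, var in measurements)
    return sum(est / var for est, var in measurements) / weight_sum

# Three status inputs on percent-complete, with differing reliability
# (hypothetical numbers): a rough schedule metric, a detailed earned-value
# figure, and an informal team estimate.
status = fuse([(60.0, 25.0), (55.0, 4.0), (70.0, 100.0)])
```

The fused value lands closest to the most reliable input (the one with the smallest variance), which is exactly how a navigation filter trusts its most accurate sensor.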
Kriener, Birgit; Helias, Moritz; Rotter, Stefan; Diesmann, Markus; Einevoll, Gaute T
2013-01-01
Pattern formation, i.e., the generation of an inhomogeneous spatial activity distribution in a dynamical system with translation invariant structure, is a well-studied phenomenon in neuronal network dynamics, specifically in neural field models. These are population models to describe the spatio-temporal dynamics of large groups of neurons in terms of macroscopic variables such as population firing rates. Though neural field models are often deduced from and equipped with biophysically meaningful properties, a direct mapping to simulations of individual spiking neuron populations is rarely considered. Neurons have a distinct identity defined by their action on their postsynaptic targets. In its simplest form they act either excitatorily or inhibitorily. When the distribution of neuron identities is assumed to be periodic, pattern formation can be observed, given the coupling strength is supracritical, i.e., larger than a critical weight. We find that this critical weight is strongly dependent on the characteristics of the neuronal input, i.e., depends on whether neurons are mean- or fluctuation driven, and different limits in linearizing the full non-linear system apply in order to assess stability. In particular, if neurons are mean-driven, the linearization has a very simple form and becomes independent of both the fixed point firing rate and the variance of the input current, while in the very strongly fluctuation-driven regime the fixed point rate, as well as the input mean and variance are important parameters in the determination of the critical weight. We demonstrate that interestingly even in "intermediate" regimes, when the system is technically fluctuation-driven, the simple linearization neglecting the variance of the input can yield the better prediction of the critical coupling strength. 
We moreover analyze the effects of structural randomness by rewiring individual synapses or redistributing weights, as well as coarse-graining on the formation of inhomogeneous activity patterns.
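The linearization argument above follows the standard neural-field stability analysis; a generic form of that condition is sketched below as an assumption, since the paper's exact expressions are not given in the abstract:

```latex
% Generic neural-field linear stability (illustrative assumption).
% A perturbation \delta r \propto e^{\lambda t + i k x} about the homogeneous
% fixed point with mean input h_0 grows when
\tau \lambda(k) \;=\; w\, \hat{M}(k)\, f'(h_0) - 1 \;>\; 0 ,
% so the critical coupling weight at the most unstable wavenumber k^{*} is
w_{c} \;=\; \frac{1}{\hat{M}(k^{*})\, f'(h_0)} ,
% where \hat{M}(k) is the Fourier transform of the (periodic) connectivity
% profile and f'(h_0) is the gain of the rate function at the fixed point.
```

In the mean-driven limit the gain f'(h_0) depends only weakly on the input variance, which is consistent with the abstract's observation that the simple linearization there becomes independent of the fixed-point rate and input variance.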
Kriener, Birgit; Helias, Moritz; Rotter, Stefan; Diesmann, Markus; Einevoll, Gaute T.
2014-01-01
PMID:24501591
Effect of Common Cryoprotectants on Critical Warming Rates and Ice Formation in Aqueous Solutions
Hopkins, Jesse B.; Badeau, Ryan; Warkentin, Matthew; Thorne, Robert E.
2012-01-01
Ice formation on warming is of comparable or greater importance to ice formation on cooling in determining survival of cryopreserved samples. Critical warming rates required for ice-free warming of vitrified aqueous solutions of glycerol, dimethyl sulfoxide, ethylene glycol, polyethylene glycol 200 and sucrose have been measured for warming rates of order 10 to 10⁴ K/s. Critical warming rates are typically one to three orders of magnitude larger than critical cooling rates. Warming rates vary strongly with cooling rates, perhaps due to the presence of small ice fractions in nominally vitrified samples. Critical warming and cooling rate data spanning orders of magnitude in rates provide rigorous tests of ice nucleation and growth models and their assumed input parameters. Current models with current best estimates for input parameters provide a reasonable account of critical warming rates for glycerol solutions at high concentrations/low rates, but overestimate both critical warming and cooling rates by orders of magnitude at lower concentrations and larger rates. In vitrification protocols, minimizing concentrations of potentially damaging cryoprotectants while minimizing ice formation will require ultrafast warming rates, as well as fast cooling rates to minimize the required warming rates. PMID:22728046
Physics opportunities with meson beams
Briscoe, William J.; Doring, Michael; Haberzettl, Helmut; ...
2015-10-20
Over the past two decades, meson photo- and electro-production data of unprecedented quality and quantity have been measured at electromagnetic facilities worldwide. By contrast, the meson-beam data for the same hadronic final states are mostly outdated and largely of poor quality, or even nonexistent, and thus provide inadequate input to help interpret, analyze, and exploit the full potential of the new electromagnetic data. To reap the full benefit of the high-precision electromagnetic data, new high-statistics data from measurements with meson beams, with good angle and energy coverage for a wide range of reactions, are critically needed to advance our knowledge in baryon and meson spectroscopy and other related areas of hadron physics. To address this situation, a state-of-the-art meson-beam facility needs to be constructed. The present paper summarizes unresolved issues in hadron physics and outlines the vast opportunities and advances that only become possible with such a facility.
Physics opportunities with meson beams
NASA Astrophysics Data System (ADS)
Briscoe, William J.; Döring, Michael; Haberzettl, Helmut; Manley, D. Mark; Naruki, Megumi; Strakovsky, Igor I.; Swanson, Eric S.
2015-10-01
Over the past two decades, meson photo- and electroproduction data of unprecedented quality and quantity have been measured at electromagnetic facilities worldwide. By contrast, the meson-beam data for the same hadronic final states are mostly outdated and largely of poor quality, or even non-existent, and thus provide inadequate input to help interpret, analyze, and exploit the full potential of the new electromagnetic data. To reap the full benefit of the high-precision electromagnetic data, new high-statistics data from measurements with meson beams, with good angle and energy coverage for a wide range of reactions, are critically needed to advance our knowledge in baryon and meson spectroscopy and other related areas of hadron physics. To address this situation, a state-of-the-art meson-beam facility needs to be constructed. The present paper summarizes unresolved issues in hadron physics and outlines the vast opportunities and advances that only become possible with such a facility.
Critical research issues in development of biomathematical models of fatigue and performance.
Dinges, David F
2004-03-01
This article reviews the scientific research needed to ensure the continued development, validation, and operational transition of biomathematical models of fatigue and performance. These models originated from the need to ascertain the formal underlying relationships among sleep and circadian dynamics in the control of alertness and neurobehavioral performance capability. Priority should be given to research that further establishes their basic validity, including the accuracy of the core mathematical formulae and parameters that instantiate the interactions of sleep/wake and circadian processes. Since individuals can differ markedly and reliably in their responses to sleep loss and to countermeasures for it, models must incorporate estimates of these inter-individual differences, and research should identify predictors of them. To ensure models accurately predict recovery of function with sleep of varying durations, dose-response curves for recovery of performance as a function of prior sleep homeostatic load and the number of days of recovery are needed. It is also necessary to establish whether the accuracy of models is affected by using work/rest schedules as surrogates for sleep/wake inputs to models. Given the importance of light as both a circadian entraining agent and an alerting agent, research should determine the extent to which light input could incrementally improve model predictions of performance, especially in persons exposed to night work, jet lag, and prolonged work. Models seek to estimate behavioral capability and/or the relative risk of adverse events in a fatigued state. Research is needed on how best to scale and interpret metrics of behavioral capability, and incorporate factors that amplify or diminish the relationship between model predictions of performance and risk outcomes.
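The sleep/wake and circadian interactions such models instantiate can be illustrated with a toy two-process sketch. The time constants, the additive combination rule, and the `simulate` helper below are illustrative textbook-style assumptions, not any validated operational fatigue model:

```python
import numpy as np

def simulate(schedule, dt=0.1):
    """Toy two-process alertness model.
    schedule: list of (duration_hours, awake_bool) segments.
    Returns (times, alertness); higher alertness = better predicted performance."""
    S, t = 0.3, 0.0                    # S: homeostatic sleep pressure in [0, 1]
    times, alertness = [], []
    for duration, awake in schedule:
        for _ in range(int(duration / dt)):
            if awake:
                S += dt * (1.0 - S) / 18.2   # pressure builds while awake
            else:
                S += dt * (0.0 - S) / 4.2    # and dissipates faster during sleep
            # Circadian process C, peaking in the early evening (~18:00)
            C = 0.5 + 0.5 * np.cos(2 * np.pi * (t - 18.0) / 24.0)
            times.append(t)
            alertness.append(C - S)          # simple additive combination
            t += dt
    return np.array(times), np.array(alertness)

# 16 h awake, 8 h sleep, 16 h awake
t, al = simulate([(16, True), (8, False), (16, True)])
```

Under these assumed parameters, predicted alertness just after the sleep episode exceeds alertness at the end of the preceding wake period, reflecting homeostatic recovery.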
A Practical Risk Assessment Methodology for Safety-Critical Train Control Systems
DOT National Transportation Integrated Search
2009-07-01
This project proposes a Practical Risk Assessment Methodology (PRAM) for analyzing railroad accident data and assessing the risk and benefit of safety-critical train control systems. This report documents in simple steps the algorithms and data input...
Superconducting critical temperature under pressure
NASA Astrophysics Data System (ADS)
González-Pedreros, G. I.; Baquero, R.
2018-05-01
The present record on the critical temperature of a superconductor is held by sulfur hydride (approx. 200 K) under very high pressure (approx. 56 GPa). As a consequence, the dependence of the superconducting critical temperature on pressure has become a subject of great interest, and a large number of papers on different aspects of this subject have since been published in the scientific literature. In this paper, we calculate the superconducting critical temperature as a function of pressure, Tc(P), by a simple method. Our method is based on the functional derivative of the critical temperature with respect to the Eliashberg function, δTc(P)/δα²F(ω). We obtain the needed Coulomb electron-electron repulsion parameter, μ*(P), at each pressure in a consistent way by fitting it to the corresponding Tc using the linearized Migdal-Eliashberg equation. This method requires as input the knowledge of Tc at the starting pressure only. It applies to superconductors for which the Migdal-Eliashberg equations hold. We study Al and β-Sn, two weak-coupling low-Tc superconductors, and Nb, the strong-coupling element with the highest critical temperature. For Al, our results for Tc(P) show an excellent agreement with the calculations of Profeta et al., which are known to agree well with experiment. For β-Sn and Nb, we found good agreement with the experimental measurements reported in several works. This method has also been applied successfully to PdH elsewhere. Our method is simple, computationally light and gives very accurate results.
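The abstract's own method rests on δTc/δα²F(ω) and the linearized Migdal-Eliashberg equation, which are beyond a short snippet. As a hedged stand-in, the sketch below estimates Tc from a hypothetical Eliashberg function using the simpler closed-form McMillan/Allen-Dynes formula; the Gaussian spectrum, grid, and μ* values are invented for illustration:

```python
import numpy as np

# Hypothetical Eliashberg spectral function alpha^2 F(omega): a single Gaussian
# peak (illustrative only; real spectra come from ab initio phonon calculations).
omega = np.linspace(0.1, 40.0, 2000)          # phonon energy grid (meV)
d_omega = omega[1] - omega[0]
a2F = 0.5 * np.exp(-(((omega - 15.0) / 4.0) ** 2))

# Electron-phonon coupling lambda and logarithmic phonon moment omega_log
lam = 2.0 * np.sum(a2F / omega) * d_omega
omega_log = np.exp((2.0 / lam) * np.sum(np.log(omega) * a2F / omega) * d_omega)

def tc_allen_dynes(lam, omega_log_mev, mu_star):
    """McMillan/Allen-Dynes estimate of Tc (in K) from lambda, omega_log, mu*."""
    k_b_mev = 0.08617                         # Boltzmann constant, meV per K
    omega_log_k = omega_log_mev / k_b_mev
    return (omega_log_k / 1.2) * np.exp(
        -1.04 * (1.0 + lam) / (lam - mu_star * (1.0 + 0.62 * lam)))

print(f"lambda = {lam:.2f}, Tc(mu* = 0.10) = {tc_allen_dynes(lam, omega_log, 0.10):.1f} K")
```

Raising μ* (stronger Coulomb repulsion) lowers the estimated Tc, which is the qualitative role μ*(P) plays in the paper's fitting procedure.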
ERIC Educational Resources Information Center
Nguyen, Minh Thi Thuy; Pham, Hanh Thi; Pham, Tam Minh
2017-01-01
This study investigates the combined effects of input enhancement and recasts on a group of Vietnamese EFL learners' performance of constructive criticism during peer review activities. Particularly, the study attempts to find out whether the instruction works for different aspects of pragmatic learning, including the learners' sociopragmatic and…
The Comprehension and Production of Wh-Questions in Deaf and Hard-of-Hearing Children
ERIC Educational Resources Information Center
Friedmann, Naama; Szterman, Ronit
2011-01-01
Hearing loss during the critical period for language acquisition restricts spoken language input. This input limitation, in turn, may hamper syntactic development. This study examined the comprehension, production, and repetition of Wh-questions in deaf or hard-of-hearing (DHH) children. The participants were 11 orally trained Hebrew-speaking…
Linking dynamics of the inhibitory network to the input structure
Komarov, Maxim
2017-01-01
Networks of inhibitory interneurons are found in many distinct classes of biological systems. Inhibitory interneurons govern the dynamics of principal cells and are likely to be critically involved in the coding of information. In this theoretical study, we describe the dynamics of a generic inhibitory network in terms of low-dimensional, simplified rate models. We study the relationship between the structure of external input applied to the network and the patterns of activity arising in response to that stimulation. We found that even a minimal inhibitory network can generate a great diversity of spatio-temporal patterning including complex bursting regimes with non-trivial ratios of burst firing. Despite the complexity of these dynamics, the network’s response patterns can be predicted from the rankings of the magnitudes of external inputs to the inhibitory neurons. This type of invariant dynamics is robust to noise and stable in densely connected networks with strong inhibitory coupling. Our study predicts that the response dynamics generated by an inhibitory network may provide critical insights about the temporal structure of the sensory input it receives. PMID:27650865
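The reported invariance, in which response patterns are predictable from the ranking of external input magnitudes, can be illustrated with a minimal threshold-linear inhibitory rate network. The coupling strength `w`, transfer function, and integration parameters below are illustrative assumptions, not the study's actual model:

```python
import numpy as np

def simulate_inhibitory_network(inputs, w=0.8, dt=0.01, steps=20000):
    """Euler-integrate a small all-to-all inhibitory rate network:
    dr_i/dt = -r_i + phi(I_i - w * sum_{j != i} r_j),
    with phi a threshold-linear (rectifying) transfer function."""
    I = np.asarray(inputs, dtype=float)
    r = np.zeros(I.size)
    for _ in range(steps):
        recurrent = w * (r.sum() - r)           # inhibition from all other units
        drive = np.maximum(I - recurrent, 0.0)  # threshold-linear rectification
        r += dt * (-r + drive)
    return r

# The unit receiving the largest external input dominates the steady state
rates = simulate_inhibitory_network([1.0, 2.0, 3.0])
```

With these parameters the network settles into a winner-take-all pattern: the most strongly driven unit stays active while mutual inhibition silences the others, so the response is readable from the input ranking alone.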
Improved Open-Microphone Speech Recognition
NASA Astrophysics Data System (ADS)
Abrash, Victor
2002-12-01
Many current and future NASA missions make extreme demands on mission personnel both in terms of workload and in performing under difficult environmental conditions. In situations where hands are impeded or needed for other tasks, eyes are busy attending to the environment, or tasks are sufficiently complex that ease of use of the interface becomes critical, spoken natural language dialog systems offer unique input and output modalities that can improve efficiency and safety. They also offer new capabilities that would not otherwise be available. For example, many NASA applications require astronauts to use computers in micro-gravity or while wearing space suits. Under these circumstances, command and control systems that allow users to issue commands or enter data in hands- and eyes-busy situations become critical. Speech recognition technology designed for current commercial applications limits the performance of the open-ended state-of-the-art dialog systems being developed at NASA. For example, today's recognition systems typically listen to user input only during short segments of the dialog, and user input outside of these short time windows is lost. Mistakes detecting the start and end times of user utterances can lead to mistakes in the recognition output, and the dialog system as a whole has no way to recover from this, or any other, recognition error. Systems also often require the user to signal when that user is going to speak, which is impractical in a hands-free environment, or only allow a system-initiated dialog requiring the user to speak immediately following a system prompt. In this project, SRI has developed software to enable speech recognition in a hands-free, open-microphone environment, eliminating the need for a push-to-talk button or other signaling mechanism. The software continuously captures a user's speech and makes it available to one or more recognizers.
By constantly monitoring and storing the audio stream, it provides the spoken dialog manager extra flexibility to recognize the signal with no audio gaps between recognition requests, as well as to rerecognize portions of the signal, or to rerecognize speech with different grammars, acoustic models, recognizers, start times, and so on. SRI expects that this new open-mic functionality will enable NASA to develop better error-correction mechanisms for spoken dialog systems, and may also enable new interaction strategies.
Improved Open-Microphone Speech Recognition
NASA Technical Reports Server (NTRS)
Abrash, Victor
2002-01-01
Many current and future NASA missions make extreme demands on mission personnel both in terms of workload and in performing under difficult environmental conditions. In situations where hands are impeded or needed for other tasks, eyes are busy attending to the environment, or tasks are sufficiently complex that ease of use of the interface becomes critical, spoken natural language dialog systems offer unique input and output modalities that can improve efficiency and safety. They also offer new capabilities that would not otherwise be available. For example, many NASA applications require astronauts to use computers in micro-gravity or while wearing space suits. Under these circumstances, command and control systems that allow users to issue commands or enter data in hands- and eyes-busy situations become critical. Speech recognition technology designed for current commercial applications limits the performance of the open-ended state-of-the-art dialog systems being developed at NASA. For example, today's recognition systems typically listen to user input only during short segments of the dialog, and user input outside of these short time windows is lost. Mistakes detecting the start and end times of user utterances can lead to mistakes in the recognition output, and the dialog system as a whole has no way to recover from this, or any other, recognition error. Systems also often require the user to signal when that user is going to speak, which is impractical in a hands-free environment, or only allow a system-initiated dialog requiring the user to speak immediately following a system prompt. In this project, SRI has developed software to enable speech recognition in a hands-free, open-microphone environment, eliminating the need for a push-to-talk button or other signaling mechanism. The software continuously captures a user's speech and makes it available to one or more recognizers.
By constantly monitoring and storing the audio stream, it provides the spoken dialog manager extra flexibility to recognize the signal with no audio gaps between recognition requests, as well as to rerecognize portions of the signal, or to rerecognize speech with different grammars, acoustic models, recognizers, start times, and so on. SRI expects that this new open-mic functionality will enable NASA to develop better error-correction mechanisms for spoken dialog systems, and may also enable new interaction strategies.
Infant perceptual development for faces and spoken words: An integrated approach
Watson, Tamara L; Robbins, Rachel A; Best, Catherine T
2014-01-01
There are obvious differences between recognizing faces and recognizing spoken words or phonemes that might suggest development of each capability requires different skills. Recognizing faces and perceiving spoken language, however, are in key senses extremely similar endeavors. Both perceptual processes are based on richly variable, yet highly structured input from which the perceiver needs to extract categorically meaningful information. This similarity could be reflected in the perceptual narrowing that occurs within the first year of life in both domains. We take the position that the perceptual and neurocognitive processes by which face and speech recognition develop are based on a set of common principles. One common principle is the importance of systematic variability in the input as a source of information rather than noise. Experience of this variability leads to perceptual tuning to the critical properties that define individual faces or spoken words versus their membership in larger groupings of people and their language communities. We argue that parallels can be drawn directly between the principles responsible for the development of face and spoken language perception. PMID:25132626
Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling
NASA Technical Reports Server (NTRS)
Hojnicki, Jeffrey S.; Rusick, Jeffrey J.
2005-01-01
Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
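The kind of probabilistic analysis described, propagating input uncertainties to a distribution of power capability rather than a single deterministic value, can be sketched with a simple Monte Carlo example. The variable names, distributions, and nominal values below are invented for illustration and are not the SPACE model's actual inputs:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo samples

# Hypothetical input uncertainties for a simplified solar power channel
solar_flux = rng.normal(1367.0, 5.0, N)     # W/m^2, solar constant variation
array_area = 30.0                           # m^2, treated as exactly known
efficiency = rng.normal(0.14, 0.005, N)     # cell conversion efficiency
degradation = rng.uniform(0.90, 1.00, N)    # lifetime degradation factor

# Power capability distribution induced by the input uncertainties
power = solar_flux * array_area * efficiency * degradation  # W

p5, p50, p95 = np.percentile(power, [5, 50, 95])
print(f"median {p50:.0f} W, 90% interval [{p5:.0f}, {p95:.0f}] W")
```

A deterministic analysis would report only the single nominal product; the Monte Carlo version additionally quantifies how far below nominal the capability may plausibly fall, which is the kind of margin information the probabilistic ISS analyses provide.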
Uncertainty in Ecohydrological Modeling in an Arid Region Determined with Bayesian Methods
Yang, Junjun; He, Zhibin; Du, Jun; Chen, Longfei; Zhu, Xi
2016-01-01
In arid regions, water resources are a key forcing factor in ecosystem circulation, and soil moisture is the critical link that constrains plant and animal life on the soil surface and underground. Simulation of soil moisture in arid ecosystems is inherently difficult due to high variability. We assessed the applicability of the process-oriented CoupModel for forecasting of soil water relations in arid regions. We used vertical soil moisture profiling for model calibration. We determined that model-structural uncertainty constituted the largest error; the model did not capture the extremes of low soil moisture in the desert-oasis ecotone (DOE), particularly below 40 cm soil depth. Our results showed that estimation of the total uncertainty in soil moisture prediction improved when errors in the input and output data, the parameter values, and the model structure were characterized explicitly. Bayesian analysis was applied with prior information to reduce uncertainty. The need to provide independent descriptions of uncertainty analysis (UA) in the input and output data was demonstrated. Application of soil moisture simulation in arid regions will be useful for dune-stabilization and revegetation efforts in the DOE. PMID:26963523
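A Bayesian treatment of parameter uncertainty of the sort described can be sketched with random-walk Metropolis sampling on a toy soil-moisture model. The exponential drydown, the uniform prior, and the noise level are illustrative assumptions, not CoupModel's process equations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "observations": soil moisture drying down exponentially at rate k_true
t = np.arange(0, 30.0)                        # days
k_true, theta0, sigma = 0.15, 0.35, 0.01      # true rate, initial moisture, obs noise
obs = theta0 * np.exp(-k_true * t) + rng.normal(0, sigma, t.size)

def log_posterior(k):
    """Gaussian likelihood with a uniform prior on (0, 1) for the drydown rate."""
    if not 0.0 < k < 1.0:
        return -np.inf
    resid = obs - theta0 * np.exp(-k * t)
    return -0.5 * np.sum(resid**2) / sigma**2

# Random-walk Metropolis sampling of k
samples, k = [], 0.5
lp = log_posterior(k)
for _ in range(20000):
    k_prop = k + rng.normal(0, 0.02)
    lp_prop = log_posterior(k_prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # accept with prob min(1, ratio)
        k, lp = k_prop, lp_prop
    samples.append(k)

posterior = np.array(samples[5000:])          # discard burn-in
print(posterior.mean(), posterior.std())
```

The posterior spread around the recovered rate is the quantity of interest here: it expresses how much the calibration data actually constrain the parameter, which is the role Bayesian uncertainty analysis plays in the study.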
Evidence review and clinical guidance for the use of ziprasidone in Canada
2013-01-01
While indicated for schizophrenia and acute mania, ziprasidone’s evidence base and use in clinical practice extends beyond these regulatory approvals. We, an invited panel of experts led by a working group of 3, critically examined the evidence and our collective experience regarding the effectiveness, tolerability and safety of ziprasidone across its clinical uses. There was no opportunity for manufacturer input into the content of the review. As anticipated, ziprasidone was found to be effective for its indicated uses, although its utility in mania and mixed states lacked comparative data. Beyond these uses, the available data were either unimpressive or were lacking. An attractive characteristic is its neutral effect on weight, thereby providing patients with a non-obesogenic long-term treatment option. Key challenges in practice include the need for dosing on a full stomach and managing its early-onset adverse effect of restlessness. Addressing these issues is critical to its long-term success. PMID:23347694
Ng, Kara; Szabo, Zoltan; Reilly, Pamela A.; Barringer, Julia; Smalling, Kelly L.
2016-01-01
Mercury (Hg) is considered a contaminant of global concern for coastal environments due to its toxicity, widespread occurrence in sediment, and bioaccumulation in tissue. Coastal New Jersey, USA, is characterized by shallow bays and wetlands that provide critical habitat for wildlife but share space with expanding urban landscapes. This study was designed as an assessment of the magnitude and distribution of Hg in coastal New Jersey sediments and critical species using publicly available data to highlight potential data gaps. Mercury concentrations in estuary sediments can exceed 2 μg/g and correlate with concentrations of other metals. Based on existing data, the concentrations of Hg in mussels in southern New Jersey are comparable to those observed in other urbanized Atlantic Coast estuaries. Lack of methylmercury data for sediments, other media, and tissues are data gaps needing to be filled for a clearer understanding of the impacts of Hg inputs to the ecosystem.
Rural relevant quality measures for critical access hospitals.
Casey, Michelle M; Moscovice, Ira; Klingner, Jill; Prasad, Shailendra
2013-01-01
To identify current and future relevant quality measures for Critical Access Hospitals (CAHs). Three criteria (patient volume, internal usefulness for quality improvement, and external usefulness for public reporting and payment reform) were used to analyze quality measures for their relevance for CAHs. A 6-member panel with expertise in rural hospital quality measurement and improvement provided input regarding the final measure selection. The relevant quality measures for CAHs include measures that are ready for reporting now and measures that need specifications to be finalized and/or a data reporting mechanism to be established. They include inpatient measures for specific medical conditions, global measures that address appropriate care across multiple medical conditions, and Emergency Department measures. All CAHs should publicly report on relevant quality measures. Acceptance of a single consolidated set of quality measures with common specifications for CAHs by all entities involved in regulation, accreditation, and payment; a phased process to implement the relevant measures; and the provision of technical assistance would help CAHs meet the challenge of reporting. © 2012 National Rural Health Association.
Tsai, Tsai-Hsuan; Nash, Robert J; Tseng, Kevin C
2009-05-01
This article presents how the researcher goes about answering the research question, 'how does assistive technology impact computer use among individuals with cervical spinal cord injury?', through an in-depth investigation into real-life situations among computer operators with cervical spinal cord injuries (CSI). An in-depth survey was carried out to provide insight into the functional abilities and limitations, habitual practices and preferences, choices and utilisation of input devices, personal and/or technical assistance, environmental set-up and arrangements, and special requirements among 20 experienced computer users with cervical spinal cord injuries. Following the survey findings, a five-layer CSI users' needs hierarchy of input device selection and use was proposed. These needs were ranked in order, beginning with the most basic criterion at the bottom of the pyramid; lower-level criteria must be met before one moves on to the higher level. The users' needs hierarchy for CSI computer users had not been applied in previous research work and establishes a rationale for the development of alternative input devices. If an input device meets the criteria set out in the needs hierarchy, then a good match of person and technology will be achieved.
The impact of 14nm photomask variability and uncertainty on computational lithography solutions
NASA Astrophysics Data System (ADS)
Sturtevant, John; Tejnil, Edita; Buck, Peter D.; Schulze, Steffen; Kalk, Franklin; Nakagawa, Kent; Ning, Guoxiang; Ackmann, Paul; Gans, Fritz; Buergel, Christian
2013-09-01
Computational lithography solutions rely upon accurate process models to faithfully represent the imaging system output for a defined set of process and design inputs. These models rely upon the accurate representation of multiple parameters associated with the scanner and the photomask. Many input variables for simulation are based upon designed or recipe-requested values or independent measurements. It is known, however, that certain measurement methodologies, while precise, can have significant inaccuracies. Additionally, there are known errors associated with the representation of certain system parameters. With shrinking total CD control budgets, appropriate accounting for all sources of error becomes more important, and the cumulative consequence of input errors to the computational lithography model can become significant. In this work, we examine via simulation the impact of errors in the representation of photomask properties including CD bias, corner rounding, refractive index, thickness, and sidewall angle. The factors that are most critical to be accurately represented in the model are cataloged. CD bias values are based on state-of-the-art mask manufacturing data and changes in other variables are speculated, highlighting the need for improved metrology and communication between mask and OPC model experts. The simulations are done by ignoring the wafer photoresist model, and show the sensitivity of predictions to various model inputs associated with the mask. It is shown that the wafer simulations are very dependent upon the 1D/2D representation of the mask and, for 3D, that the mask sidewall angle is a very sensitive factor influencing simulated wafer CD results.
Critical Fluctuations in Cortical Models Near Instability
Aburn, Matthew J.; Holmes, C. A.; Roberts, James A.; Boonstra, Tjeerd W.; Breakspear, Michael
2012-01-01
Computational studies often proceed from the premise that cortical dynamics operate in a linearly stable domain, where fluctuations dissipate quickly and show only short memory. Studies of human electroencephalography (EEG), however, have shown significant autocorrelation at time lags on the scale of minutes, indicating the need to consider regimes where non-linearities influence the dynamics. Statistical properties such as increased autocorrelation length, increased variance, power law scaling, and bistable switching have been suggested as generic indicators of the approach to bifurcation in non-linear dynamical systems. We study temporal fluctuations in a widely-employed computational model (the Jansen–Rit model) of cortical activity, examining the statistical signatures that accompany bifurcations. Approaching supercritical Hopf bifurcations through tuning of the background excitatory input, we find a dramatic increase in the autocorrelation length that depends sensitively on the direction in phase space of the input fluctuations and hence on which neuronal subpopulation is stochastically perturbed. Similar dependence on the input direction is found in the distribution of fluctuation size and duration, which show power law scaling that extends over four orders of magnitude at the Hopf bifurcation. We conjecture that the alignment in phase space between the input noise vector and the center manifold of the Hopf bifurcation is directly linked to these changes. These results are consistent with the possibility of statistical indicators of linear instability being detectable in real EEG time series. However, even in a simple cortical model, we find that these indicators may not necessarily be visible even when bifurcations are present because their expression can depend sensitively on the neuronal pathway of incoming fluctuations. PMID:22952464
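The signature of critical slowing described here, autocorrelation length growing as a bifurcation is approached, can be reproduced in a linear surrogate. The sketch below uses a damped stochastic process rather than the Jansen-Rit model itself; the damping values, noise level, and lag are chosen purely for illustration:

```python
import numpy as np

def lag_autocorr(damping, lag=100, sigma=1.0, dt=0.01, steps=200_000, seed=1):
    """Euler-Maruyama simulation of dx = -damping*x dt + sigma dW, returning the
    autocorrelation at the given lag. Reducing the damping mimics the approach
    to instability (a linear stand-in for dynamics near a bifurcation)."""
    rng = np.random.default_rng(seed)
    x = np.empty(steps)
    x[0] = 0.0
    noise = rng.normal(0.0, sigma * np.sqrt(dt), steps - 1)
    for i in range(steps - 1):
        x[i + 1] = x[i] - damping * x[i] * dt + noise[i]
    return np.corrcoef(x[:-lag], x[lag:])[0, 1]

far = lag_autocorr(5.0)    # strongly damped: fluctuations decorrelate quickly
near = lag_autocorr(0.2)   # weakly damped: long memory ("critical slowing")
print(far, near)
```

For this process the lag-L autocorrelation is exp(-damping * L * dt), so shrinking the damping toward zero stretches the correlation time, which is the statistical indicator the study tracks near the Hopf bifurcation.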
A conceptual framework: redefining forest soils' critical acid loads under a changing climate
Steven G. McNulty; Johnny L. Boggs
2010-01-01
Federal agencies of several nations have or are currently developing guidelines for critical forest soil acid loads. These guidelines are used to establish regulations designed to maintain atmospheric acid inputs below levels shown to damage forests and streams. Traditionally, when the critical soil acid load exceeds the amount of acid that the ecosystem can absorb, it...
An Automatic Critical Care Urine Meter
Otero, Abraham; Fernández, Roemi; Apalkov, Andrey; Armada, Manuel
2012-01-01
Nowadays patients admitted to critical care units have most of their physiological parameters measured automatically by sophisticated commercial monitoring devices. More often than not, these devices supervise whether the values of the parameters they measure lie within a pre-established range, and issue warnings of deviations from this range by triggering alarms. The automation of measuring and supervising tasks not only relieves the healthcare staff of a considerable workload but also avoids human errors in these repetitive and monotonous tasks. Arguably, the most relevant physiological parameter that is still measured and supervised manually by critical care unit staff is urine output (UO). In this paper we present a patent-pending device that provides continuous and accurate measurements of a patient's UO. The device uses capacitive sensors to take continuous measurements of the height of the column of liquid accumulated in two chambers that make up a plastic container. The first chamber, into which the urine flows, has a small volume. Once it has been filled, it overflows into a second, bigger chamber. The first chamber provides accurate UO measurements for patients whose UO has to be closely supervised, while the second one avoids the need for frequent interventions by the nursing staff to empty the container. PMID:23201988
Bleakley, Alan
2015-12-01
Inclusion of the humanities in undergraduate medicine curricula remains controversial. Skeptics have placed the burden of proof of effectiveness upon the shoulders of advocates, but this may lead to pursuing measurement of the immeasurable, deflecting attention away from the more pressing task of defining what we mean by the humanities in medicine. While humanities input can offer a fundamental critical counterweight to a potentially reductive biomedical science education, a new wave of thinking suggests that the kinds of arts and humanities currently used in medical education are neither radical nor critical enough to have a deep effect on students' learning and may need to be reformulated. The humanities can certainly educate for tolerance of ambiguity as a basis for learning democratic habits for contemporary team-based clinical work. William Empson's 'seven types of ambiguity' model for analyzing poetry is transposed to medical education to: (a) formulate seven values proffered by the humanities for improving medical education; (b) offer seven ways of measuring the impact of medical humanities provision, thereby reducing ambiguity; and (c), as a counterweight to (b), celebrate seven types of ambiguity in contemporary medical humanities that critically reconsider issues of proof of impact.
Brain-Machine Interface control of a robot arm using actor-critic reinforcement learning.
Pohlmeyer, Eric A; Mahmoudi, Babak; Geng, Shijia; Prins, Noeline; Sanchez, Justin C
2012-01-01
Here we demonstrate how a marmoset monkey can use a reinforcement learning (RL) Brain-Machine Interface (BMI) to effectively control the movements of a robot arm for a reaching task. In this work, an actor-critic RL algorithm used neural ensemble activity in the monkey's motor cortex to control the robot movements during a two-target decision task. This novel approach to decoding offers unique advantages for BMI control applications. Compared to supervised learning decoding methods, the actor-critic RL algorithm does not require an explicit set of training data to create a static control model, but rather it incrementally adapts the model parameters according to its current performance, in this case requiring only a very basic feedback signal. We show how this algorithm achieved high performance (94%) when mapping the monkey's neural states to robot actions, and only needed to experience a few trials before obtaining accurate real-time control of the robot arm. Since RL methods responsively adapt and adjust their parameters, they can provide a method to create BMIs that are robust against perturbations caused by changes in either the neural input space or the output actions they generate under different task requirements or goals.
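The actor-critic scheme the abstract describes can be sketched in miniature. The toy environment, population sizes, and learning rates below are illustrative assumptions, not the BMI's actual decoder, but the TD-error-driven critic and actor updates follow the standard pattern the work builds on.

```python
import numpy as np

rng = np.random.default_rng(0)

n_states, n_actions = 4, 2               # hypothetical discretization, not from the paper
V = np.zeros(n_states)                   # critic: state-value estimates
theta = np.zeros((n_states, n_actions))  # actor: action preferences
alpha_c, alpha_a, gamma = 0.1, 0.05, 0.9

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def step(s, a):
    # toy "environment": reward 1 when the action matches the state's parity
    r = 1.0 if a == s % 2 else 0.0
    return rng.integers(n_states), r

s = 0
for _ in range(5000):
    p = softmax(theta[s])
    a = rng.choice(n_actions, p=p)
    s2, r = step(s, a)
    td = r + gamma * V[s2] - V[s]             # TD error drives both updates
    V[s] += alpha_c * td                      # critic update
    theta[s, a] += alpha_a * td * (1 - p[a])  # actor update (log-policy gradient
    s = s2                                    # for the chosen action's preference)
```

The appeal noted in the abstract is visible here: no labeled training set is required, only the scalar reward signal, and the preferences adapt continuously.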
Cummings, Tonnie; Blett, Tamara; Porter, Ellen; Geiser, Linda; Graw, Rick; McMurray, Jill; Perakis, Steven S.; Rochefort, Regina
2014-01-01
The National Park Service and U.S. Forest Service manage areas in the states of Idaho, Oregon, and Washington (collectively referred to in this report as the Pacific Northwest) that contain significant natural resources and provide many recreational opportunities. The agencies are mandated to protect the air quality and air pollution-sensitive resources on these federal lands. Human activity has greatly increased the amount of nitrogen emitted to the atmosphere, resulting in elevated amounts of nitrogen being deposited in park and forest ecosystems. There is limited information in the Pacific Northwest about the levels of nitrogen that negatively affect natural systems, i.e., the critical loads. The National Park Service and U.S. Forest Service, with scientific input from the U.S. Geological Survey, have developed an approach for accumulating additional nitrogen critical loads information in the Pacific Northwest and using the data in planning and regulatory arenas. As a first step in that process, this report summarizes the current state of knowledge about nitrogen deposition, effects, and critical loads in the region. It also describes ongoing research efforts and identifies and prioritizes additional data needs.
Guieysse, Benoit; Norvill, Zane N
2014-02-28
When direct wastewater biological treatment is unfeasible, a cost- and resource-efficient alternative to direct chemical treatment consists of combining biological treatment with a chemical pre-treatment aiming to convert the hazardous pollutants into more biodegradable compounds. Whereas the principles and advantages of sequential treatment have been demonstrated for a broad range of pollutants and process configurations, recent progress (2011-present) in the field provides the basis for refining assessment of feasibility, costs, and environmental impacts. This paper thus reviews recent real wastewater demonstrations at pilot and full scale as well as new process configurations. It also discusses new insights on the potential impacts of microbial community dynamics on process feasibility, design and operation. Finally, it sheds light on a critical issue that has not yet been properly addressed in the field: integration requires complex and tailored optimization and, of paramount importance to full-scale application, is sensitive to uncertainty and variability in the inputs used for process design and operation. Future research is therefore critically needed to improve process control and better assess the real potential of sequential chemical-biological processes for industrial wastewater treatment. Copyright © 2013 Elsevier B.V. All rights reserved.
Critical acid load limits in a changing climate: implications and solutions
Steven G. McNulty
2010-01-01
The federal agencies of the United States are currently developing guidelines for critical nitrogen load limits for U.S. forest ecosystems. These guidelines will be used to develop regulations designed to maintain pollutant inputs below the level shown to damage specified ecosystems.
Encoding context and false recognition memories.
Bruce, Darryl; Phillips-Grant, Kimberly; Conrad, Nicole; Bona, Susan
2004-09-01
False recognition of an extralist word that is thematically related to all words of a study list may reflect internal activation of the theme word during encoding followed by impaired source monitoring at retrieval, that is, difficulty in determining whether the word had actually been experienced or merely thought of. To assist source monitoring, distinctive visual or verbal contexts were added to study words at input. Both types of context produced similar effects: False alarms to theme-word (critical) lures were reduced; remember judgements of critical lures called old were lower; and if contextual information had been added to lists, subjects indicated as much for list items and associated critical foils identified as old. The visual and verbal contexts used in the present studies were held to disrupt semantic categorisation of list words at input and to facilitate source monitoring at output.
Quantization-Based Adaptive Actor-Critic Tracking Control With Tracking Error Constraints.
Fan, Quan-Yong; Yang, Guang-Hong; Ye, Dan
2018-04-01
In this paper, the problem of adaptive actor-critic (AC) tracking control is investigated for a class of continuous-time nonlinear systems with unknown nonlinearities and quantized inputs. Different from the existing results based on reinforcement learning, the tracking error constraints are considered and new critic functions are constructed to improve the performance further. To ensure that the tracking errors keep within the predefined time-varying boundaries, a tracking error transformation technique is used to constitute an augmented error system. Specific critic functions, rather than the long-term cost function, are introduced to supervise the tracking performance and tune the weights of the AC neural networks (NNs). A novel adaptive controller with a special structure is designed to reduce the effect of the NN reconstruction errors, input quantization, and disturbances. Based on the Lyapunov stability theory, the boundedness of the closed-loop signals and the desired tracking performance can be guaranteed. Finally, simulations on two connected inverted pendulums are given to illustrate the effectiveness of the proposed method.
NASA Astrophysics Data System (ADS)
Clawson, Wesley Patrick
Previous studies, both theoretical and experimental, of network-level dynamics in the cerebral cortex show evidence for a statistical phenomenon called criticality, a phenomenon originally studied in the context of phase transitions in physical systems and associated with favorable information processing in the context of the brain. The focus of this thesis is to expand upon past results with new experimentation and modeling to show a relationship between criticality and the ability to detect and discriminate sensory input. A line of theoretical work predicts maximal sensory discrimination as a functional benefit of criticality, which can then be characterized using the mutual information between sensory input (visual stimulus) and neural response. The primary finding of our experiments in the visual cortex of turtles and of neuronal network modeling confirms this theoretical prediction. We show that sensory discrimination is maximized when visual cortex operates near criticality. In addition to presenting this primary finding in detail, this thesis will also address our preliminary results on change-point detection in experimentally measured cortical dynamics.
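The discrimination measure mentioned above, mutual information between stimulus and response, can be sketched with a simple plug-in estimator. The binary stimulus and toy responses below are assumptions for illustration, not the turtle recordings.

```python
import numpy as np

def mutual_information(stim, resp):
    """Plug-in mutual information estimate (bits) from paired discrete samples."""
    stim, resp = np.asarray(stim), np.asarray(resp)
    mi = 0.0
    for s in np.unique(stim):
        for r in np.unique(resp):
            p_sr = np.mean((stim == s) & (resp == r))  # joint probability
            if p_sr > 0:
                p_s = np.mean(stim == s)               # marginals
                p_r = np.mean(resp == r)
                mi += p_sr * np.log2(p_sr / (p_s * p_r))
    return mi

# a fair binary stimulus carries 1 bit; a perfectly faithful "response"
# recovers all of it, while a constant response recovers none
stim = np.array([0, 1] * 500)
mi_same = mutual_information(stim, stim)                 # 1.0 bit
mi_flat = mutual_information(stim, np.zeros_like(stim))  # 0.0 bits
```

In the thesis's framing, one would compute this quantity between stimulus labels and discretized cortical responses while tuning the network toward or away from criticality, and look for a peak near the critical point.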
Bishop, Steve C; Lunney, Joan K; Pinard-van der Laan, Marie-Hélène; Gay, Cyril G
2011-06-03
The second International Symposium on Animal Genomics for Animal Health held in Paris, France 31 May-2 June, 2010, assembled more than 140 participants representing research organizations from 40 countries. The symposium included a roundtable discussion on critical needs, challenges and opportunities, and a forward look at the potential applications of animal genomics in animal health research. The aim of the roundtable discussion was to foster a dialogue between scientists working at the cutting edge of animal genomics research and animal health scientists. Importantly, stakeholders were included to provide input on priorities and the potential value of animal genomics to the animal health community. In an effort to facilitate the roundtable discussion, the organizers identified four priority areas to advance the use of genome-enabled technologies in animal health research. Contributions were obtained through open discussions and a questionnaire distributed at the start of the symposium. This report provides the outcome of the roundtable discussion for each of the four priority areas. For each priority, problems are identified, including potential solutions and recommendations. This report captures key points made by symposium participants during the roundtable discussion and serves as a roadmap to steer future research priorities in animal genomics research.
Using Models to Inform Policy: Insights from Modeling the Complexities of Global Polio Eradication
NASA Astrophysics Data System (ADS)
Thompson, Kimberly M.
Drawing on over 20 years of experience modeling risks in complex systems, this talk will challenge SBP participants to develop models that provide timely and useful answers to critical policy questions when decision makers need them. The talk will include reflections on the opportunities and challenges associated with developing integrated models for complex problems and communicating their results effectively. Dr. Thompson will focus the talk largely on collaborative modeling related to global polio eradication and the application of system dynamics tools. After successful global eradication of wild polioviruses, live polioviruses will still present risks that could potentially lead to paralytic polio cases. This talk will present the insights of efforts to use integrated dynamic, probabilistic risk, decision, and economic models to address critical policy questions related to managing global polio risks. Using a dynamic disease transmission model combined with probabilistic model inputs that characterize uncertainty for a stratified world to account for variability, we find that global health leaders will face some difficult choices, but that they can take actions that will manage the risks effectively. The talk will emphasize the need for true collaboration between modelers and subject matter experts, and the importance of working with decision makers as partners to ensure the development of useful models that actually get used.
Liu, Derong; Yang, Xiong; Wang, Ding; Wei, Qinglai
2015-07-01
The design of a stabilizing controller for uncertain nonlinear systems with control constraints is a challenging problem. The input constraints, coupled with the inability to accurately identify the uncertainties, motivate the design of stabilizing controllers based on reinforcement-learning (RL) methods. In this paper, a novel RL-based robust adaptive control algorithm is developed for a class of continuous-time uncertain nonlinear systems subject to input constraints. The robust control problem is converted to a constrained optimal control problem by appropriately selecting value functions for the nominal system. Distinct from the typical actor-critic dual networks employed in RL, only one critic neural network (NN) is constructed to derive the approximate optimal control. Meanwhile, unlike the initial stabilizing control often indispensable in RL, there is no special requirement imposed on the initial control. By utilizing Lyapunov's direct method, the closed-loop optimal control system and the estimated weights of the critic NN are proved to be uniformly ultimately bounded. In addition, the derived approximate optimal control is verified to guarantee that the uncertain nonlinear system is stable in the sense of uniform ultimate boundedness. Two simulation examples are provided to illustrate the effectiveness and applicability of the present approach.
Fan, Quan-Yong; Yang, Guang-Hong
2016-01-01
This paper is concerned with the problem of integral sliding-mode control for a class of nonlinear systems with input disturbances and unknown nonlinear terms through the adaptive actor-critic (AC) control method. The main objective is to design a sliding-mode control methodology based on the adaptive dynamic programming (ADP) method, so that the closed-loop system with time-varying disturbances is stable and the nearly optimal performance of the sliding-mode dynamics can be guaranteed. In the first step, a neural network (NN)-based observer and a disturbance observer are designed to approximate the unknown nonlinear terms and estimate the input disturbances, respectively. Based on the NN approximations and disturbance estimations, the discontinuous part of the sliding-mode control is constructed to eliminate the effect of the disturbances and attain the expected equivalent sliding-mode dynamics. Then, the ADP method with AC structure is presented to learn the optimal control for the sliding-mode dynamics online. Reconstructed tuning laws are developed to guarantee the stability of the sliding-mode dynamics and the convergence of the weights of critic and actor NNs. Finally, the simulation results are presented to illustrate the effectiveness of the proposed method.
Simple Sensitivity Analysis for Orion Guidance Navigation and Control
NASA Technical Reports Server (NTRS)
Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar
2013-01-01
The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g., touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, for everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool ("Critical Factors Tool" or CFT) developed to find the input variables or pairs of variables which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, can inform where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors are interacting dependently or independently. The tool found that input variables such as moments, mass, thrust dispersions, and date of launch were significant factors for the success of various requirements. Examples are shown in this paper as well as a summary and physics discussion of EFT-1 driving factors that the tool found.
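One of the sensitivity measures the abstract names, estimating success probability as a function of a dispersed input, might be sketched as follows. The variable names, toy miss-distance model, and requirement threshold are illustrative assumptions, not the Critical Factors Tool's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def binned_success_probability(x, success, n_bins=5):
    """Estimate P(success) within quantile bins of dispersed input x.
    A flat profile suggests x barely influences the requirement;
    a strongly varying one flags x as a critical factor."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    return np.array([success[idx == b].mean() for b in range(n_bins)])

# toy Monte Carlo run: touchdown miss distance grows with wind dispersion
wind = rng.normal(0, 1, 20000)                 # dispersed input variable
miss = np.abs(wind) + rng.normal(0, 0.1, 20000)
success = miss < 1.0                           # requirement: miss distance < 1.0
profile = binned_success_probability(wind, success)
```

In the toy run, success probability is high in the central wind bins and drops in the extreme bins, which is the kind of signature that would mark wind dispersion as a driving factor.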
Real-time PCR probe optimization using design of experiments approach.
Wadle, S; Lehnert, M; Rubenwolf, S; Zengerle, R; von Stetten, F
2016-03-01
Primer and probe sequence designs are among the most critical input factors in real-time polymerase chain reaction (PCR) assay optimization. In this study, we present the use of statistical design of experiments (DOE) approach as a general guideline for probe optimization and more specifically focus on design optimization of label-free hydrolysis probes that are designated as mediator probes (MPs), which are used in reverse transcription MP PCR (RT-MP PCR). The effect of three input factors on assay performance was investigated: distance between primer and mediator probe cleavage site; dimer stability of MP and target sequence (influenza B virus); and dimer stability of the mediator and universal reporter (UR). The results indicated that the latter dimer stability had the greatest influence on assay performance, with RT-MP PCR efficiency increased by up to 10% with changes to this input factor. With an optimal design configuration, a detection limit of 3-14 target copies/10 μl reaction could be achieved. This improved detection limit was confirmed for another UR design and for a second target sequence, human metapneumovirus, with 7-11 copies/10 μl reaction detected in an optimum case. The DOE approach for improving oligonucleotide designs for real-time PCR not only produces excellent results but may also reduce the number of experiments that need to be performed, thus reducing costs and experimental times.
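A minimal two-level full-factorial DOE over three assay factors, in the spirit of the study, can be sketched as below. The factor levels and the stand-in response surface are invented for illustration and are not the published assay settings.

```python
from itertools import product

# two levels per factor (illustrative placeholder values)
factors = {
    "primer_probe_distance": (5, 20),   # bases
    "mp_target_dG": (-12.0, -8.0),      # kcal/mol, MP/target dimer stability
    "mediator_ur_dG": (-14.0, -6.0),    # kcal/mol, mediator/UR dimer stability
}

def toy_efficiency(d, dg1, dg2):
    # stand-in response surface: strongest dependence on mediator/UR stability,
    # mimicking the abstract's finding
    return 80 - 0.1 * d + 0.5 * abs(dg1) + 2.0 * abs(dg2)

# run all 2^3 = 8 factor combinations
runs = [(combo, toy_efficiency(*combo)) for combo in product(*factors.values())]

# main effect of each factor: mean response at high level minus at low level
effects = {}
for i, name in enumerate(factors):
    lo, hi = factors[name]
    hi_mean = sum(r for c, r in runs if c[i] == hi) / 4
    lo_mean = sum(r for c, r in runs if c[i] == lo) / 4
    effects[name] = hi_mean - lo_mean
```

Ranking factors by the magnitude of their main effects is the basic screening step; a real DOE analysis would add replication, interaction terms, and significance testing.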
Paterson, Eric; Midwood, Andrew J; Millard, Peter
2009-01-01
For soils in carbon balance, losses of soil carbon from biological activity are balanced by organic inputs from vegetation. Perturbations, such as climate or land use change, have the potential to disrupt this balance and alter soil-atmosphere carbon exchanges. As the quantification of soil organic matter stocks is an insensitive means of detecting changes, certainly over short timescales, there is a need to apply methods that facilitate a quantitative understanding of the biological processes underlying soil carbon balance. We outline the processes by which plant carbon enters the soil and critically evaluate isotopic methods to quantify them. Then, we consider the balancing CO2 flux from soil and detail the importance of partitioning the sources of this flux into those from recent plant assimilate and those from native soil organic matter. Finally, we consider the interactions between the inputs of carbon to soil and the losses from soil mediated by biological activity. We emphasize the key functional role of the microbiota in the concurrent processing of carbon from recent plant inputs and native soil organic matter. We conclude that quantitative isotope labelling and partitioning methods, coupled to those for the quantification of microbial community substrate use, offer the potential to resolve the functioning of the microbial control point of soil carbon balance in unprecedented detail.
Predicting neuroblastoma using developmental signals and a logic-based model.
Kasemeier-Kulesa, Jennifer C; Schnell, Santiago; Woolley, Thomas; Spengler, Jennifer A; Morrison, Jason A; McKinney, Mary C; Pushel, Irina; Wolfe, Lauren A; Kulesa, Paul M
2018-07-01
Genomic information from human patient samples of pediatric neuroblastoma cancers and known outcomes have led to specific gene lists put forward as high risk for disease progression. However, the reliance on gene expression correlations rather than mechanistic insight has shown limited potential and suggests a critical need for molecular network models that better predict neuroblastoma progression. In this study, we construct and simulate a molecular network of developmental genes and downstream signals in a 6-gene input logic model that predicts a favorable/unfavorable outcome based on the outcomes of four cell states: differentiation, proliferation, apoptosis, and angiogenesis. We simulate the mis-expression of the tyrosine receptor kinases, trkA and trkB, two prognostic indicators of neuroblastoma, and find differences in the number and probability distribution of steady state outcomes. We validate the mechanistic model assumptions using RNAseq of the SHSY5Y human neuroblastoma cell line to define the input states and confirm the predicted outcome with antibody staining. Lastly, we apply input gene signatures from 77 published human patient samples and show that our model makes more accurate disease outcome predictions for early stage disease than any current neuroblastoma gene list. These findings highlight the predictive strength of a logic-based model based on developmental genes and offer a better understanding of the molecular network interactions during neuroblastoma disease progression. Copyright © 2018. Published by Elsevier B.V.
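A logic-based model of the kind described, Boolean rules mapping gene inputs to cell states and then to an outcome, can be sketched in a few lines. The three genes and the rules below are hypothetical stand-ins for illustration, not the paper's actual 6-gene network.

```python
from itertools import product

def cell_state(trkA, trkB, mycn):
    # illustrative Boolean rules from gene inputs to cell-state outputs
    differentiation = trkA and not mycn
    proliferation = mycn or trkB
    apoptosis = trkA and not trkB
    return differentiation, proliferation, apoptosis

def outcome(state):
    diff, prolif, apop = state
    # favorable if the network drives differentiation or apoptosis
    # without also driving proliferation
    return "favorable" if (diff or apop) and not prolif else "unfavorable"

# enumerate every input state and tally the predicted outcomes,
# analogous to scanning patient input gene signatures through the model
tally = {"favorable": 0, "unfavorable": 0}
for inputs in product([False, True], repeat=3):
    tally[outcome(cell_state(*inputs))] += 1
```

Mis-expression experiments like the paper's trkA/trkB simulations amount to clamping one input and re-tallying, then comparing the outcome distributions.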
Experiences in Engaging the Public on Biotechnology Advances and Regulation
Quinlan, M. Megan; Smith, Joe; Layton, Raymond; Keese, Paul; Agbagala, Ma. Lorelie U.; Palacpac, Merle B.; Ball, Louise
2016-01-01
Public input is often sought as part of the biosafety decision-making process. Information and communication about the advances in biotechnology are part of the first step to engagement. This step often relies on the developers and introducers of the particular innovation, for example, an industry-funded website has hosted various authorities to respond to questions from the public. Alternative approaches to providing information have evolved, as demonstrated in sub-Saharan Africa where non-governmental organizations and associations play this role in some countries and subregions. Often times, those in the public who choose to participate in engagement opportunities have opinions about the overall biosafety decision process. Case-by-case decisions are made within defined regulatory frameworks, however, and in general, regulatory consultation does not provide the opportunity for input to the overall decision-making process. The various objectives on both sides of engagement can make the experience challenging; there are no clear metrics for success. The situation is challenging because public input occurs within the context of the local legislative framework, regulatory requirements, and the peculiarities of the fairly recent biosafety frameworks, as well as of public opinion and individual values. Public engagement may be conducted voluntarily, or may be driven by legislation. What can be taken into account by the decision makers, and therefore what will be gathered and the timing of consultation, also may be legally defined. 
Several practical experiences suggest practices for effective engagement within the confines of regulatory mandates: (1) utilizing a range of resources to facilitate public education and opportunities for understanding complex technologies; (2) defining in advance the goal of seeking input; (3) identifying and communicating with the critical public groups from which input is needed; (4) using a clearly defined approach to gathering and assessing what will be used in making the biosafety decision; and (5) communicating using clear and simple language. These practices create a foundation for systematic methods to gather, acknowledge, respond to, and even incorporate public input. Applying such best practices will increase transparency and optimize the value of input from the public. PMID:26870726
USDA-ARS?s Scientific Manuscript database
Accurately predicting the fate and transport of graphene oxide (GO) in porous media is critical to assess its environmental impact. In this work, sand column experiments were conducted to determine the effect of input concentration and grain size on transport, retention, and size perturbation of GO ...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-25
... stream channel, minimal sedimentation, organic input into caves during rain events, and a sufficient prey..., pp. 111-112; Niemiller et al. 2006, p. 43). Prey availability is related to the organic input that is transported with sediment and other organic materials via sinkholes into stream habitats (Burr et al. 2001, p...
ERIC Educational Resources Information Center
Pippen, Rebecca Gintz
2016-01-01
For decades, accountability for student results has been at the forefront of school reform. While many school-based factors have influence, teacher quality has consistently been identified as the most important school-based factor related to student achievement (Rivkin, Hanushek, & Kain, 2000; Stronge, 2007). Research also suggests that a…
ERIC Educational Resources Information Center
Love, Tracy; Walenski, Matthew; Swinney, David
2009-01-01
The central question underlying this study revolves around how children process co-reference relationships--such as those evidenced by pronouns ("him") and reflexives ("himself")--and how a slowed rate of speech input may critically affect this process. Previous studies of child language processing have demonstrated that typical language…
ERIC Educational Resources Information Center
Wong, Lung-Hsiang; Chai, Ching-Sing; Gao, Ping
2011-01-01
This paper reports an exploratory study on Singapore secondary and primary school students' perceptions and behaviors on using a variety of Chinese input methods for Chinese composition writing. Significant behavioral patterns were uncovered and mapped into a cognitive process, which are potentially critical to the training of students in…
NASA Technical Reports Server (NTRS)
Peck, Charles C.; Dhawan, Atam P.; Meyer, Claudia M.
1991-01-01
A genetic algorithm is used to select the inputs to a neural network function approximator. In the application considered, modeling critical parameters of the space shuttle main engine (SSME), the functional relationship between measured parameters is unknown and complex. Furthermore, the number of possible input parameters is quite large. Many approaches have been used for input selection, but they are either subjective or do not consider the complex multivariate relationships between parameters. Due to the optimization and space-searching capabilities of genetic algorithms, they were employed to systematize the input selection process. The results suggest that the genetic algorithm can generate parameter lists of high quality without the explicit use of problem domain knowledge. Suggestions for improving the performance of the input selection process are also provided.
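Genetic-algorithm input selection of this sort can be sketched with bitmask chromosomes and a fit-quality fitness function. The toy data, target function, least-squares fitness (standing in for a trained approximator's accuracy), and GA settings below are assumptions, not the SSME study's configuration.

```python
import numpy as np

rng = np.random.default_rng(2)
n_candidates, n_samples = 12, 400
X = rng.normal(size=(n_samples, n_candidates))
# hidden truth: only inputs 0, 3, and 7 matter
y = 2 * X[:, 0] - 3 * X[:, 3] + 0.5 * X[:, 7] + rng.normal(0, 0.1, n_samples)

def fitness(mask):
    """Fit quality using only the selected inputs, minus a small
    per-input penalty to favour compact parameter lists."""
    if not mask.any():
        return -1.0
    Xs = X[:, mask]
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    r2 = 1 - (y - Xs @ coef).var() / y.var()
    return r2 - 0.01 * mask.sum()

pop = rng.random((30, n_candidates)) < 0.5       # random initial bitmasks
for _ in range(40):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-10:]]      # truncation selection
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(10, size=2)]
        cut = rng.integers(1, n_candidates)      # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(n_candidates) < 0.05   # bit-flip mutation
        children.append(child ^ flip)
    pop = np.array(children)

best = pop[np.argmax([fitness(m) for m in pop])]
```

The penalty term is one way to encode the preference for short input lists; in the actual application, fitness would come from training and scoring the neural network on each candidate subset.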
Zhang, Z. Fred; White, Signe K.; Bonneville, Alain; ...
2014-12-31
Numerical simulations have been used for estimating CO2 injectivity, CO2 plume extent, pressure distribution, and Area of Review (AoR), and for the design of CO2 injection operations and the monitoring network for the FutureGen project. The simulation results are affected by uncertainties associated with numerous input parameters, the conceptual model, initial and boundary conditions, and factors related to injection operations. Furthermore, the uncertainties in the simulation results also vary in space and time. The key need is to identify those uncertainties that critically impact the simulation results and quantify their impacts. We introduce an approach to determine the local sensitivity coefficient (LSC), defined as the response of the output in percent, to rank the importance of model inputs on outputs. The uncertainty of an input with higher sensitivity has larger impacts on the output. The LSC is scalable by the error of an input parameter. The composite sensitivity of an output to a subset of inputs can be calculated by summing the individual LSC values. We propose a local sensitivity coefficient method and apply it to the FutureGen 2.0 Site in Morgan County, Illinois, USA, to investigate the sensitivity of input parameters and initial conditions. The conceptual model for the site consists of 31 layers, each of which has a unique set of input parameters. The sensitivity of 11 parameters for each layer and 7 inputs as initial conditions is then investigated. For CO2 injectivity and plume size, about half of the uncertainty is due to only 4 or 5 of the 348 inputs and 3/4 of the uncertainty is due to about 15 of the inputs. The initial conditions and the properties of the injection layer and its neighbour layers contribute to most of the sensitivity. Overall, the simulation outputs are very sensitive to only a small fraction of the inputs.
However, the parameters that are important for controlling CO2 injectivity are not the same as those controlling the plume size. The three most sensitive inputs for injectivity were the horizontal permeability of Mt Simon 11 (the injection layer), the initial fracture-pressure gradient, and the residual aqueous saturation of Mt Simon 11, while those for the plume area were the initial salt concentration, the initial pressure, and the initial fracture-pressure gradient. The advantages of requiring only a single set of simulation results, scalability to the proper parameter errors, and easy calculation of the composite sensitivities make this approach very cost-effective for estimating AoR uncertainty and guiding cost-effective site characterization, injection well design, and monitoring network design for CO2 storage projects.
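A minimal sketch of the one-at-a-time local sensitivity coefficient idea follows, assuming a 1% perturbation and a made-up toy model in place of the reservoir simulator (the parameter names and formula are hypothetical):

```python
def lsc(model, inputs, key, delta=0.01):
    # Local sensitivity coefficient: percent change in the output for a
    # `delta` (here 1%) fractional perturbation of a single input.
    base = model(inputs)
    bumped = dict(inputs)
    bumped[key] *= (1.0 + delta)
    return abs(model(bumped) - base) / abs(base) * 100.0

# Hypothetical toy stand-in for the reservoir simulator: "injectivity"
# rises with permeability, falls with residual saturation and viscosity.
def toy_model(p):
    return p["perm"] * (1.0 - p["s_res"]) / p["visc"]

params = {"perm": 100.0, "s_res": 0.3, "visc": 0.5}
coeffs = {k: lsc(toy_model, params, k) for k in params}
composite = sum(coeffs.values())   # composite sensitivity of a subset
```

Because the toy output is linear in permeability, its LSC is 1.0 (a 1% input change gives a 1% output change), and summing LSC values over any subset of inputs gives that subset's composite sensitivity, mirroring the single-simulation-set workflow described above.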
Asymmetric temporal integration of layer 4 and layer 2/3 inputs in visual cortex.
Hang, Giao B; Dan, Yang
2011-01-01
Neocortical neurons in vivo receive concurrent synaptic inputs from multiple sources, including feedforward, horizontal, and feedback pathways. Layer 2/3 of the visual cortex receives feedforward input from layer 4 and horizontal input from layer 2/3. Firing of the pyramidal neurons, which carries the output to higher cortical areas, depends critically on the interaction of these pathways. Here we examined synaptic integration of inputs from layer 4 and layer 2/3 in rat visual cortical slices. We found that the integration is sublinear and temporally asymmetric, with larger responses if layer 2/3 input preceded layer 4 input. The sublinearity depended on inhibition, and the asymmetry was largely attributable to the difference between the two inhibitory inputs. Interestingly, the asymmetric integration was specific to pyramidal neurons, and it strongly affected their spiking output. Thus via cortical inhibition, the temporal order of activation of layer 2/3 and layer 4 pathways can exert powerful control of cortical output during visual processing.
Ignition criterion for heterogeneous energetic materials based on hotspot size-temperature threshold
NASA Astrophysics Data System (ADS)
Barua, A.; Kim, S.; Horie, Y.; Zhou, M.
2013-02-01
A criterion for the ignition of granular explosives (GXs) and polymer-bonded explosives (PBXs) under shock and non-shock loading is developed. The formulation combines a quantification of the sizes and locations of hotspots in loading events, obtained using a recently developed cohesive finite element method (CFEM), with the characterization by Tarver et al. [C. M. Tarver et al., "Critical conditions for impact- and shock-induced hot spots in solid explosives," J. Phys. Chem. 100, 5794-5799 (1996)] of the critical size-temperature threshold of hotspots required for chemical ignition of solid explosives. The criterion, along with the CFEM capability to quantify the thermal-mechanical behavior of GXs and PBXs, allows the critical impact velocity for ignition, time to ignition, and critical input energy at ignition to be determined as functions of material composition, microstructure, and loading conditions. The applicability of the relation between the critical input energy (E) and impact velocity of James [H. R. James, "An extension to the critical energy criterion used to predict shock initiation thresholds," Propellants, Explos., Pyrotech. 21, 8-13 (1996)] for shock loading is examined, leading to a modified interpretation, which is sensitive to microstructure and loading condition. As an application, numerical studies are undertaken to evaluate the ignition threshold of the granular high melting point eXplosive octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine (HMX) and HMX/Estane PBX under loading with impact velocities up to 350 m s-1 and strain rates up to 10^5 s-1. Results show that, for the GX, the time to criticality (tc) is strongly influenced by initial porosity, but is insensitive to grain size. Analyses also lead to a quantification of the differences between the responses of the GXs and PBXs in terms of critical impact velocity for ignition, time to ignition, and critical input energy at ignition.
Since the framework permits explicit tracking of the influences of microstructure, loading, and mechanical constraints, the calculations also show the effects of stress wave reflection and confinement condition on the ignition behaviors of GXs and PBXs.
NASA Technical Reports Server (NTRS)
Shields, Michael F.
1993-01-01
The need to manage large amounts of data on robotically controlled devices has been critical to the mission of this Agency for many years. In many respects this Agency has helped pioneer, with their industry counterparts, the development of a number of products long before these systems became commercially available. Numerous attempts have been made to field both robotically controlled tape and optical disk technology and systems to satisfy our tertiary storage needs. Custom-developed products were architected, designed, and developed without vendor partners over the past two decades to field workable systems to handle our ever-increasing storage requirements. Many of the attendees of this symposium are familiar with some of the older products, such as: the Braegen Automated Tape Libraries (ATL's), the IBM 3850, the Ampex TeraStore, just to name a few. In addition, we embarked on an in-house development of a shared disk input/output support processor to manage our ever-increasing tape storage needs. For all intents and purposes, this system was a file server by current definitions which used CDC Cyber computers as the control processors. It served us well and was just recently removed from production usage.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fox, P.B.; Yatabe, M.
1987-01-01
In this report the Nuclear Criticality Safety Analytical Methods Resource Center describes a new interactive version of CESAR, a critical experiments storage and retrieval program available on the Nuclear Criticality Information System (NCIS) database at Lawrence Livermore National Laboratory. The original version of CESAR did not include interactive search capabilities. The CESAR database was developed to provide a convenient, readily accessible means of storing and retrieving code input data for the SCALE Criticality Safety Analytical Sequences and the codes comprising those sequences. The database includes data for both cross section preparation and criticality safety calculations. 3 refs., 1 tab.
XML-Based Generator of C++ Code for Integration With GUIs
NASA Technical Reports Server (NTRS)
Hua, Hook; Oyafuso, Fabiano; Klimeck, Gerhard
2003-01-01
An open source computer program has been developed to satisfy a need for simplified organization of structured input data for scientific simulation programs. Typically, such input data are parsed in from a flat American Standard Code for Information Interchange (ASCII) text file into computational data structures. Also typically, when a graphical user interface (GUI) is used, there is a need to completely duplicate the input information while providing it to a user in a more structured form. Heretofore, the duplication of the input information has entailed duplication of software efforts and increases in susceptibility to software errors because of the concomitant need to maintain two independent input-handling mechanisms. The present program implements a method in which the input data for a simulation program are completely specified in an Extensible Markup Language (XML)-based text file. The key benefit of XML is that input data are stored in a structured manner; more importantly, XML allows not just storing data but also describing what each data item is. The XML file thus contains information useful for rendering the data by other applications. The program then generates data structures in the C++ language that are to be used in the simulation program. In this method, all input data are specified in one place only, and it is easy to integrate the data structures into both the simulation program and the GUI. XML-to-C is useful in two ways: 1. As an executable, it generates the corresponding C++ classes, and 2. As a library, it automatically fills the objects with the input data values.
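The idea of specifying inputs once in XML and generating the corresponding C++ can be sketched as follows; the element names, attributes, and emitted struct are invented for illustration and are not the tool's actual schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical input specification; element and attribute names are
# illustrative, not the actual schema used by the tool described above.
SPEC = """
<input name="Simulation">
  <param name="temperature" type="double" default="300.0"/>
  <param name="steps" type="int" default="100"/>
</input>
"""

def generate_cpp(spec_xml):
    # Emit a C++ struct whose members mirror the XML parameter list, so
    # the same file can drive both the GUI and the simulation code.
    root = ET.fromstring(spec_xml)
    lines = ["struct %s {" % root.get("name")]
    for p in root.findall("param"):
        lines.append("  %s %s = %s;" % (p.get("type"), p.get("name"),
                                        p.get("default")))
    lines.append("};")
    return "\n".join(lines)

cpp = generate_cpp(SPEC)
print(cpp)
```

Because the XML carries both the values and their descriptions, a GUI can render the same file directly while the generated classes keep the simulation code in sync with it.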
Cultivating Awareness in Honors: First-Person Noting and Contemplative Practices
ERIC Educational Resources Information Center
Cooke, Kathy J.
2015-01-01
While traditional practices of critical reading, writing, dialogue, and discussion are no doubt essential inputs and outputs of higher education and a means of achieving critical thinking in college students, recent science and pedagogical innovation can help develop additional, unique methodologies that can have more immediate significance for…
Antibodies and antibody-derived analytical biosensors
Sharma, Shikha; Byrne, Hannah
2016-01-01
The rapid diagnosis of many diseases and timely initiation of appropriate treatment are critical determinants that promote optimal clinical outcomes and general public health. Biosensors are now being applied for rapid diagnostics due to their capacity for point-of-care use with minimum need for operator input. Antibody-based biosensors or immunosensors have revolutionized diagnostics for the detection of a plethora of analytes such as disease markers, food and environmental contaminants, biological warfare agents and illicit drugs. Antibodies are ideal biorecognition elements that provide sensors with high specificity and sensitivity. This review describes monoclonal and recombinant antibodies and different immobilization approaches crucial for antibody utilization in biosensors. Examples of applications of a variety of antibody-based sensor formats are also described. PMID:27365031
Eigenvalue Contributon Estimator for Sensitivity Calculations with TSUNAMI-3D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Williams, Mark L
2007-01-01
Since the release of the Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) codes in SCALE [1], the use of sensitivity and uncertainty analysis techniques for criticality safety applications has greatly increased within the user community. In general, sensitivity and uncertainty analysis is transitioning from a technique used only by specialists to a practical tool in routine use. With the desire to use the tool more routinely comes the need to improve the solution methodology to reduce the input and computational burden on the user. This paper reviews the current solution methodology of the Monte Carlo eigenvalue sensitivity analysis sequence TSUNAMI-3D, describes an alternative approach, and presents results from both methodologies.
Hippocampal-prefrontal input supports spatial encoding in working memory.
Spellman, Timothy; Rigotti, Mattia; Ahmari, Susanne E; Fusi, Stefano; Gogos, Joseph A; Gordon, Joshua A
2015-06-18
Spatial working memory, the caching of behaviourally relevant spatial cues on a timescale of seconds, is a fundamental constituent of cognition. Although the prefrontal cortex and hippocampus are known to contribute jointly to successful spatial working memory, the anatomical pathway and temporal window for the interaction of these structures critical to spatial working memory have not yet been established. Here we find that direct hippocampal-prefrontal afferents are critical for encoding, but not for maintenance or retrieval, of spatial cues in mice. These cues are represented by the activity of individual prefrontal units in a manner that is dependent on hippocampal input only during the cue-encoding phase of a spatial working memory task. Successful encoding of these cues appears to be mediated by gamma-frequency synchrony between the two structures. These findings indicate a critical role for the direct hippocampal-prefrontal afferent pathway in the continuous updating of task-related spatial information during spatial working memory.
Generating Performance Models for Irregular Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friese, Ryan D.; Tallent, Nathan R.; Vishnu, Abhinav
2017-05-30
Many applications have irregular behavior --- non-uniform input data, input-dependent solvers, irregular memory accesses, unbiased branches --- that cannot be captured using today's automated performance modeling techniques. We describe new hierarchical critical path analyses for the Palm model generation tool. To create a model's structure, we capture tasks along representative MPI critical paths. We create a histogram of critical tasks with parameterized task arguments and instance counts. To model each task, we identify hot instruction-level sub-paths and model each sub-path based on data flow, instruction scheduling, and data locality. We describe application models that generate accurate predictions for strong scaling when varying CPU speed, cache speed, memory speed, and architecture. We present results for the Sweep3D neutron transport benchmark; PageRank on multiple graphs; Support Vector Machine with pruning; and PFLOTRAN's reactive flow/transport solver with domain-induced load imbalance.
NASA Astrophysics Data System (ADS)
Yang, Xiong; Liu, Derong; Wang, Ding
2014-03-01
In this paper, an adaptive reinforcement learning-based solution is developed for the infinite-horizon optimal control problem of constrained-input continuous-time nonlinear systems in the presence of nonlinearities with unknown structures. Two different types of neural networks (NNs) are employed to approximate the Hamilton-Jacobi-Bellman equation. That is, a recurrent NN is constructed to identify the unknown dynamical system, and two feedforward NNs are used as the actor and the critic to approximate the optimal control and the optimal cost, respectively. Based on this framework, the action NN and the critic NN are tuned simultaneously, without the requirement for the knowledge of system drift dynamics. Moreover, by using Lyapunov's direct method, the weights of the action NN and the critic NN are guaranteed to be uniformly ultimately bounded, while keeping the closed-loop system stable. To demonstrate the effectiveness of the present approach, simulation results are illustrated.
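As a heavily simplified illustration of the critic idea (not the paper's algorithm), a linear "critic" with a single quadratic feature can learn the discounted cost of a fixed stabilizing policy by temporal-difference updates; the system, cost, and learning rates below are all invented:

```python
import random

random.seed(0)

# Toy critic: V(x) ≈ w * x^2 learns the discounted cost of the fixed
# stabilizing policy u = -0.5*x for the dynamics x' = x + u (so x' = 0.5*x),
# with stage cost x^2 + u^2, via semi-gradient TD(0).
gamma, alpha = 0.9, 0.05
w = 0.0

for _ in range(2000):
    x = random.uniform(-2.0, 2.0)
    for _ in range(20):
        u = -0.5 * x
        cost = x * x + u * u
        x_next = x + u
        td_err = cost + gamma * w * x_next ** 2 - w * x * x
        w += alpha * td_err * x * x   # gradient of V w.r.t. w is x^2
        x = x_next

# Analytic fixed point: w* = 1.25 / (1 - 0.9 * 0.25) ≈ 1.613
```

For this exactly representable value function the TD fixed point coincides with the true discounted cost, so the learned weight converges to the analytic value; the actor-critic schemes in the abstract extend this idea to simultaneous policy improvement with NN approximators.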
Distributed Optimal Consensus Control for Multiagent Systems With Input Delay.
Zhang, Huaipin; Yue, Dong; Zhao, Wei; Hu, Songlin; Dou, Chunxia
2018-06-01
This paper addresses the problem of distributed optimal consensus control for a continuous-time heterogeneous linear multiagent system subject to time-varying input delays. First, by discretization and model transformation, the continuous-time input-delayed system is converted into a discrete-time delay-free system. Two delicate performance index functions are defined for these two systems. It is shown that the performance index functions are equivalent and the optimal consensus control problem of the input-delayed system can be cast into that of the delay-free system. Second, by virtue of the Hamilton-Jacobi-Bellman (HJB) equations, an optimal control policy for each agent is designed based on the delay-free system and a novel value iteration algorithm is proposed to learn the solutions to the HJB equations online. The proposed adaptive dynamic programming algorithm is implemented on the basis of a critic-action neural network (NN) structure. Third, it is proved that local consensus errors of the two systems and weight estimation errors of the critic-action NNs are uniformly ultimately bounded while the approximated control policies converge to their target values. Finally, two simulation examples are presented to illustrate the effectiveness of the developed method.
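The value-iteration idea can be illustrated on a toy delay-free problem: a scalar system with a bounded (constrained) input set and quadratic stage cost, solved on a tabular grid rather than with critic-action NNs. All numbers are illustrative:

```python
# Tabular value iteration for x' = x + u on a state grid [-4, 4] with a
# constrained input set and stage cost x^2 + u^2 (all values illustrative).
states = [i * 0.5 for i in range(-8, 9)]
inputs = [-1.0, -0.5, 0.0, 0.5, 1.0]       # bounded (constrained) inputs
gamma = 0.9

def nearest(x):
    # Snap successor states back onto the grid.
    return min(states, key=lambda s: abs(s - x))

V = {s: 0.0 for s in states}
for _ in range(200):                       # synchronous Bellman sweeps
    V = {s: min(s * s + u * u + gamma * V[nearest(s + u)] for u in inputs)
         for s in states}

def policy(s):
    # Greedy input with respect to the converged value function.
    return min(inputs, key=lambda u: s * s + u * u + gamma * V[nearest(s + u)])
```

The converged value function is zero at the origin and the greedy policy applies the largest admissible input pushing the state toward it, the discrete analogue of the HJB solution the paper's online algorithm learns.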
A High Input Impedance Low Noise Integrated Front-End Amplifier for Neural Monitoring.
Zhou, Zhijun; Warr, Paul A
2016-12-01
Within neural monitoring systems, the front-end amplifier forms the critical element for signal detection and pre-processing, which determines not only the fidelity of the biosignal, but also impacts power consumption and detector size. In this paper, a novel combined feedback loop-controlled approach is proposed to compensate for input leakage currents generated by low noise amplifiers when in integrated circuit form alongside signal leakage into the input bias network. This loop topology ensures the Front-End Amplifier (FEA) maintains a high input impedance across all manufacturing and operational variations. Measured results from a prototype manufactured on the AMS 0.35 [Formula: see text] CMOS technology are provided. This FEA consumes 3.1 [Formula: see text] in 0.042 [Formula: see text], achieves input impedance of 42 [Formula: see text], and 18.2 [Formula: see text] input-referred noise.
A cortical motor nucleus drives the basal ganglia-recipient thalamus in singing birds
Goldberg, Jesse H.
2012-01-01
The pallido-recipient thalamus transmits information from the basal ganglia (BG) to the cortex and plays a critical role in motor initiation and learning. Thalamic activity is strongly inhibited by pallidal inputs from the BG, but the role of non-pallidal inputs, such as excitatory inputs from cortex, is unclear. We have recorded simultaneously from presynaptic pallidal axon terminals and postsynaptic thalamocortical neurons in a BG-recipient thalamic nucleus necessary for vocal variability and learning in zebra finches. We found that song-locked rate modulations in the thalamus could not be explained by pallidal inputs alone, and persisted following pallidal lesion. Instead, thalamic activity was likely driven by inputs from a motor ‘cortical’ nucleus also necessary for singing. These findings suggest a role for cortical inputs to the pallido-recipient thalamus in driving premotor signals important for exploratory behavior and learning. PMID:22327474
NASA Astrophysics Data System (ADS)
Passeport, Elodie; Vidon, Philippe; Forshay, Kenneth J.; Harris, Lora; Kaushal, Sujay S.; Kellogg, Dorothy Q.; Lazar, Julia; Mayer, Paul; Stander, Emilie K.
2013-02-01
Excess nitrogen (N) in freshwater systems, estuaries, and coastal areas has well-documented deleterious effects on ecosystems. Ecological engineering practices (EEPs) may be effective at decreasing nonpoint source N leaching to surface and groundwater. However, few studies have synthesized current knowledge about the functioning principles, performance, and cost of common EEPs used to mitigate N pollution at the watershed scale. Our review describes seven EEPs known to decrease N to help watershed managers select the most effective techniques from among the following approaches: advanced-treatment septic systems, low-impact development (LID) structures, permeable reactive barriers, treatment wetlands, riparian buffers, artificial lakes and reservoirs, and stream restoration. Our results show a broad range of N-removal effectiveness but suggest that all techniques could be optimized for N removal by promoting and sustaining conditions conducive to biological transformations (e.g., denitrification). Generally, N-removal efficiency is particularly affected by hydraulic residence time, organic carbon availability, and establishment of anaerobic conditions. There remains a critical need for systematic empirical studies documenting N-removal efficiency among EEPs and potential environmental and economic tradeoffs associated with the widespread use of these techniques. Under current trajectories of N inputs, land use, and climate change, ecological engineering alone may be insufficient to manage N in many watersheds, suggesting that N-pollution source prevention remains a critical need. Improved understanding of N-removal effectiveness and modeling efforts will be critical in building decision support tools to help guide the selection and application of best EEPs for N management.
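The dependence of N-removal efficiency on hydraulic residence time noted above is often captured with a simple first-order decay model; the rate constant used here is hypothetical, not a value from the review:

```python
import math

def n_removal_efficiency(k_per_day, hrt_days):
    # First-order removal: fraction of nitrogen removed after a hydraulic
    # residence time (HRT), assuming rate constant k in 1/day.
    return 1.0 - math.exp(-k_per_day * hrt_days)

e1 = n_removal_efficiency(0.5, 1.0)   # one day of residence
e2 = n_removal_efficiency(0.5, 2.0)   # doubling HRT: diminishing returns
```

The exponential form reflects why longer residence times and conditions favoring denitrification raise removal, but with diminishing returns: doubling the HRT less than doubles the fraction of N removed.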
Closing the gate in the limbic striatum: prefrontal suppression of hippocampal and thalamic inputs
Calhoon, Gwendolyn G.; O’Donnell, Patricio
2013-01-01
Many brain circuits control behavior by integrating information arising from separate inputs onto a common target neuron. Neurons in the ventral striatum (VS) receive converging excitatory afferents from the prefrontal cortex (PFC), hippocampus (HP), and thalamus, among other structures, and the integration of these inputs is critical for shaping goal-directed behaviors. Although HP inputs have been described as gating PFC throughput in the VS, recent data reveal that the VS desynchronizes from the HP during epochs of burst-like PFC activity related to decision-making. It is therefore possible that PFC inputs locally attenuate responses to other glutamatergic inputs to the VS. Here, we found that delivering trains of stimuli to the PFC suppresses HP- and thalamus-evoked synaptic responses in the VS, in part through activation of inhibitory processes. This interaction may enable the PFC to exert influence on basal ganglia loops during decision-making instances with minimal disturbance from ongoing contextual inputs. PMID:23583113
Magnetic tunnel junction based spintronic logic devices
NASA Astrophysics Data System (ADS)
Lyle, Andrew Paul
The International Technology Roadmap for Semiconductors (ITRS) predicts that complementary metal oxide semiconductor (CMOS) based technologies will hit their last generation on or near the 16 nm node, which we expect to reach by the year 2025. Thus future advances in computational power will not be realized from ever-shrinking device sizes, but rather by 'outside the box' designs and new physics, including molecular or DNA based computation, organics, magnonics, or spintronics. This dissertation investigates magnetic logic devices for post-CMOS computation. Three different architectures were studied, each relying on a different magnetic mechanism to compute logic functions. Each design has its benefits and challenges that must be overcome. This dissertation focuses on pushing each design from the drawing board to a realistic logic technology. The first logic architecture is based on electrically connected magnetic tunnel junctions (MTJs) that allow direct communication between elements without intermediate sensing amplifiers. Two and three input logic gates, which consist of two and three MTJs connected in parallel, respectively, were fabricated and are compared. The direct communication is realized by electrically connecting the output in series with the input and applying voltage across the series connections. The logic gates rely on the fact that a change in resistance at the input modulates the voltage that is needed to supply the critical current for spin transfer torque switching the output. The change in resistance at the input resulted in a voltage margin of 50--200 mV and 250--300 mV for the closest input states for the three and two input designs, respectively. The two input logic gate realizes the AND, NAND, NOR, and OR logic functions. The three input logic gate realizes the Majority, AND, NAND, NOR, and OR logic operations.
The second logic architecture utilizes magnetostatically coupled nanomagnets to compute logic functions, which is the basis of Magnetic Quantum Cellular Automata (MQCA). MQCA has the potential to be thousands of times more energy efficient than CMOS technology. While interesting, these systems are academic unless they can be interfaced into current technologies. This dissertation pushed past a major hurdle by experimentally demonstrating a spintronic input/output (I/O) interface for the magnetostatically coupled nanomagnets by incorporating MTJs. This spintronic interface allows individual nanomagnets to be programmed using spin transfer torque and read using a magnetoresistive structure. Additionally, the spintronic interface allows statistical data on the reliability of the magnetic coupling utilized for data propagation to be easily measured. The integration of spintronics and MQCA for an electrical interface to achieve a magnetic logic device with low power creates a competitive post-CMOS logic device. The final logic architecture that was studied used MTJs to compute logic functions and magnetic domain walls to communicate between gates. Simulations were used to optimize the design of this architecture. Spin transfer torque was used to compute logic function at each MTJ gate and was used to drive the domain walls. The design demonstrated that multiple nanochannels could be connected to each MTJ to realize fan-out from the logic gates. As a result, this logic scheme eliminates the need for intermediate reads and conversions to pass information from one logic gate to another.
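The parallel-MTJ gate of the first architecture can be caricatured as threshold logic: logic-1 inputs present low resistance, so the total conductance encodes how many inputs are high, and the switching threshold selects the function. The resistances and threshold below are made up; real devices switch via spin-transfer-torque critical currents rather than an explicit comparison:

```python
# Caricature of the parallel-MTJ gate as threshold logic (values invented).
R_LOW, R_HIGH = 1e3, 3e3   # ohms: logic 1 = low resistance, logic 0 = high

def gate(bits, g_threshold):
    # Output "switches" when total input conductance crosses a threshold.
    g = sum(1.0 / (R_LOW if b else R_HIGH) for b in bits)
    return 1 if g >= g_threshold else 0

def majority3(a, b, c):
    # Threshold halfway between the 1-of-3 and 2-of-3 conductance levels.
    g_one = 1 / R_LOW + 2 / R_HIGH
    g_two = 2 / R_LOW + 1 / R_HIGH
    return gate((a, b, c), (g_one + g_two) / 2)
```

Sliding the same threshold toward the 0-of-3 level yields OR, and toward the 3-of-3 level yields AND, which is consistent with one parallel-MTJ gate realizing several logic functions as the abstract describes.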
GenLocDip: A Generalized Program to Calculate and Visualize Local Electric Dipole Moments.
Groß, Lynn; Herrmann, Carmen
2016-09-30
Local dipole moments (i.e., dipole moments of atomic or molecular subsystems) are essential for understanding various phenomena in nanoscience, such as solvent effects on the conductance of single molecules in break junctions or the interaction between the tip and the adsorbate in atomic force microscopy. We introduce GenLocDip, a program for calculating and visualizing local dipole moments of molecular subsystems. GenLocDip currently uses the Atoms-In-Molecules (AIM) partitioning scheme and is interfaced to various AIM programs. This enables postprocessing of a variety of electronic structure output formats including cube and wavefunction files, and, in general, output from any other code capable of writing the electron density on a three-dimensional grid. It uses a modified version of Bader's and Laidig's approach for achieving origin-independence of local dipoles by referring to internal reference points which can (but do not need to be) bond critical points (BCPs). Furthermore, the code allows the export of critical points and local dipole moments into a POV-Ray-readable input format. It is particularly designed for fragments of large systems, for which no BCPs have been calculated for computational efficiency reasons, because large interfragment distances prevent their identification, or because a local partitioning scheme different from AIM was used. The program requires only minimal user input and is written in the Fortran90 programming language. To demonstrate the capabilities of the program, examples are given for covalently and non-covalently bound systems, in particular molecular adsorbates. © 2016 Wiley Periodicals, Inc.
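The referencing idea can be shown with point charges: a fragment's dipole moment is computed relative to an internal reference point, and for a fragment with net charge the value depends on that reference, which is why a well-defined internal point (such as a BCP) matters. The charges and coordinates are invented:

```python
# Local dipole of a point-charge fragment relative to an internal reference
# point (charges in e, positions in arbitrary units; values invented).
def local_dipole(charges, positions, ref):
    return tuple(sum(q * (p[i] - ref[i]) for q, p in zip(charges, positions))
                 for i in range(3))

charges = [0.4, -0.6]                          # net fragment charge: -0.2 e
positions = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]

d_a = local_dipole(charges, positions, (0.0, 0.0, 0.0))
d_b = local_dipole(charges, positions, (0.5, 0.0, 0.0))
# d_a != d_b: for a charged fragment the dipole depends on the reference,
# so a physically meaningful internal point must be chosen consistently.
```

For a neutral fragment the two results would coincide; the nonzero net charge is exactly what makes the internal reference point essential for comparable local dipoles.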
NASA Astrophysics Data System (ADS)
Trusiak, M.; Patorski, K.; Tkaczyk, T.
2014-12-01
We propose a fast, simple and experimentally robust method for reconstructing background-rejected optically-sectioned microscopic images using a two-shot structured illumination approach. The data demodulation technique requires two grid-illumination images mutually phase-shifted by π (half a grid period), but the precise phase-displacement value is not critical. Upon subtraction of the two frames, an input pattern with increased grid modulation is computed. The proposed demodulation procedure comprises: (1) two-dimensional data processing based on the enhanced, fast empirical mode decomposition (EFEMD) method for object spatial frequency selection (noise reduction and bias term removal), and (2) calculating a high-contrast optically-sectioned image using the two-dimensional spiral Hilbert transform (HS). The effectiveness of the proposed algorithm is compared with the results obtained for the same input data using conventional structured-illumination (SIM) and HiLo microscopy methods. The input data were collected for studying highly scattering tissue samples in reflectance mode. In comparison with the conventional three-frame SIM technique we need one frame fewer, and no stringent requirement on the exact phase-shift between recorded frames is imposed. The HiLo algorithm outcome is strongly dependent on the set of parameters chosen manually by the operator (cut-off frequencies for low-pass and high-pass filtering and the η parameter value for optically-sectioned image reconstruction), whereas the proposed method is parameter-free. Moreover, the very short processing time required to demodulate the input pattern efficiently makes the proposed method well suited to real-time in-vivo studies. The current implementation completes full processing in 0.25 s on a medium-class PC (Intel i7, 2.1 GHz, 8 GB RAM). A simple modification that extracts only the first two BIMFs with a fixed filter window size reduces the computing time to 0.11 s (8 frames/s).
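The core two-shot step, subtracting two frames whose grid illumination is phase-shifted by π so that the unsectioned background cancels and the grid modulation doubles, can be sketched on a toy 1-D signal (all values illustrative):

```python
import math

# Two grid-illumination frames, mutually phase-shifted by pi (toy 1-D case:
# constant bias 10, grid modulation amplitude 2, 8 grid periods).
def frame(phase, n=64):
    return [10.0 + 2.0 * math.cos(2.0 * math.pi * 8.0 * i / n + phase)
            for i in range(n)]

i1 = frame(0.0)
i2 = frame(math.pi)
# Subtraction cancels the unsectioned bias and doubles the modulation;
# the result is then demodulated (EFEMD + spiral Hilbert in the paper).
diff = [a - b for a, b in zip(i1, i2)]
```

The difference signal has zero mean (bias rejected) and twice the original modulation amplitude, which is the enhanced input pattern that the EFEMD/Hilbert demodulation stage then turns into the sectioned image.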
KENO-VI Primer: A Primer for Criticality Calculations with SCALE/KENO-VI Using GeeWiz
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, Stephen M
2008-09-01
The SCALE (Standardized Computer Analyses for Licensing Evaluation) computer software system developed at Oak Ridge National Laboratory is widely used and accepted around the world for criticality safety analyses. The well-known KENO-VI three-dimensional Monte Carlo criticality computer code is one of the primary criticality safety analysis tools in SCALE. The KENO-VI primer is designed to help a new user understand and use the SCALE/KENO-VI Monte Carlo code for nuclear criticality safety analyses. It assumes that the user has a college education in a technical field. There is no assumption of familiarity with Monte Carlo codes in general or with SCALE/KENO-VI in particular. The primer is designed to teach by example, with each example illustrating two or three features of SCALE/KENO-VI that are useful in criticality analyses. The primer is based on SCALE 6, which includes the Graphically Enhanced Editing Wizard (GeeWiz) Windows user interface. Each example uses GeeWiz to provide the framework for preparing input data and viewing output results. Starting with a Quickstart section, the primer gives an overview of the basic requirements for SCALE/KENO-VI input and allows the user to quickly run a simple criticality problem with SCALE/KENO-VI. The sections that follow Quickstart include a list of basic objectives at the beginning that identifies the goal of the section and the individual SCALE/KENO-VI features that are covered in detail in the sample problems in that section. Upon completion of the primer, a new user should be comfortable using GeeWiz to set up criticality problems in SCALE/KENO-VI. The primer provides a starting point for the criticality safety analyst who uses SCALE/KENO-VI. Complete descriptions are provided in the SCALE/KENO-VI manual. Although the primer is self-contained, it is intended as a companion volume to the SCALE/KENO-VI documentation. (The SCALE manual is provided on the SCALE installation DVD.)
The primer provides specific examples of using SCALE/KENO-VI for criticality analyses; the SCALE/KENO-VI manual provides information on the use of SCALE/KENO-VI and all its modules. The primer also contains an appendix with sample input files.
Network analysis of Chinese provincial economies
NASA Astrophysics Data System (ADS)
Sun, Xiaoqi; An, Haizhong; Liu, Xiaojia
2018-02-01
The global economic system is a huge network formed by national subnetworks, which in turn contain provincial networks. As the world's second largest economy, China has a "too big to fail" impact on the interconnected global economy. Detecting the critical sectors and vital linkages inside the Chinese economic network is meaningful for understanding the origin of this impact. Different from traditional network research at the national level, this paper focuses on provincial networks and the inter-provincial network. Using the Chinese inter-regional input-output table to construct 30 provincial input-output networks and one inter-provincial input-output network, we identify central sectors and vital linkages, as well as analyze economic structure similarity. Results show that (1) the Communication Devices sector in Guangdong and that in Jiangsu, and the Transportation and Storage sector in Shanghai, play critical roles in the Chinese economy. (2) Advanced manufacturing and services industries occupy the central positions in eastern provincial economies, while the Construction sector, heavy industry, and the Wholesale and Retail Trades sector are influential in middle and western provinces. (3) The critical monetary flow paths in the Chinese economy are Communication Devices sector to Communication Devices sector in Guangdong, Metals Mining sector to Iron and Steel Smelting sector in Henan, Communication Devices sector to Communication Devices sector in Jiangsu, as well as Petroleum Mining sector in Heilongjiang to Petroleum Processing sector in Liaoning. (4) Collective influence results suggest that the Finance sector, Transportation and Storage sector, Production of Electricity and Heat sector, and Rubber and Plastics sector in Hainan are strategic influencers, despite being weakly connected. These sectors and input-output relations are worthy of close attention for monitoring the Chinese economy.
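A toy version of ranking sectors in a monetary-flow network by a simple centrality (weighted degree, or "strength") follows; the node names echo the abstract's findings but the flow values are invented:

```python
# Toy monetary-flow network: edges are (source, destination) sector pairs
# with invented flow weights; rank nodes by weighted degree ("strength").
flows = {
    ("GD-Comm", "GD-Comm"): 90.0,   # intra-sector flow in Guangdong
    ("JS-Comm", "GD-Comm"): 40.0,
    ("HN-Mine", "HN-Steel"): 55.0,
    ("SH-Trans", "JS-Comm"): 30.0,
    ("HL-Oil", "LN-Petro"): 25.0,
}

def strength(node):
    # In-strength plus out-strength; a self-loop counts toward both.
    out_w = sum(w for (src, _), w in flows.items() if src == node)
    in_w = sum(w for (_, dst), w in flows.items() if dst == node)
    return out_w + in_w

nodes = {n for edge in flows for n in edge}
ranking = sorted(nodes, key=strength, reverse=True)
```

Strength is only the simplest centrality; measures such as betweenness or the collective-influence score mentioned in the abstract can flag weakly connected but strategically placed nodes that strength alone misses.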
Universal coverage challenges require health system approaches; the case of India.
Duran, Antonio; Kutzin, Joseph; Menabde, Nata
2014-02-01
This paper uses the case of India to demonstrate that Universal Health Coverage (UHC) is about more than health financing; issues of personal and population service production, stewardship of the health system, and generation of the necessary resources and inputs need to accompany the health financing proposals. To help policy makers address UHC in India and sort out implementation issues, the framework developed by the World Health Organization (WHO) in the World Health Report 2000 and its subsequent extensions is advocated. The framework includes final goals, generic intermediate objectives, and four inter-dependent functions that interact as a system; it can be useful for diagnosing current shortcomings and filling the gaps between functions and goals. Different positions are being defended in India regarding the preconditions for UHC to succeed. This paper argues that more (public) money will be important, but not enough; it needs to be supplemented with broad interventions at various health system levels. The paper analyzes some of the most important issues in relation to the functions of service production, generation of inputs, and the necessary stewardship. It also pays attention to reform implementation, as distinct from design, and suggests critical aspects emanating from a review of recent health system reforms. Precisely because of the lack of a comparative reference for India, emphasis is placed on the need to accompany implementation with analysis, so that the "solutions" ("what to do?", "how to do it?") are found through policy analysis and research embedded in flexible implementation. Strengthening "evidence-to-policy" links and the intelligence dimension of stewardship/leadership, as well as accountability during implementation, are considered paramount. Countries facing challenges similar to those faced by India can also benefit from these approaches. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Miller, M. E.; Elliot, W.; Billmire, M.; Robichaud, P. R.; Banach, D. M.
2017-12-01
We have built a Rapid Response Erosion Database (RRED, http://rred.mtri.org/rred/) for the continental United States to allow land managers to access properly formatted spatial model inputs for the Water Erosion Prediction Project (WEPP). Spatially-explicit process-based models like WEPP require spatial inputs that include digital elevation models (DEMs), soil, climate and land cover. The online database delivers either a 10m or 30m USGS DEM, land cover derived from the Landfire project, and soil data derived from SSURGO and STATSGO datasets. The spatial layers are projected into UTM coordinates and pre-registered for modeling. WEPP soil parameter files are also created along with linkage files to match both spatial land cover and soils data with the appropriate WEPP parameter files. Our goal is to make process-based models more accessible by preparing spatial inputs ahead of time allowing modelers to focus on addressing scenarios of concern. The database provides comprehensive support for post-fire hydrological modeling by allowing users to upload spatial soil burn severity maps, and within moments returns spatial model inputs. Rapid response is critical following natural disasters. After moderate and high severity wildfires, flooding, erosion, and debris flows are a major threat to life, property and municipal water supplies. Mitigation measures must be rapidly implemented if they are to be effective, but they are expensive and cannot be applied everywhere. Fire, runoff, and erosion risks also are highly heterogeneous in space, creating an urgent need for rapid, spatially-explicit assessment. The database has been used to help assess and plan remediation on over a dozen wildfires in the Western US. Future plans include expanding spatial coverage, improving model input data and supporting additional models. Our goal is to facilitate the use of the best possible datasets and models to support the conservation of soil and water.
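The pre-registration step the database performs, putting every layer onto one common grid so they can be modelled together, can be sketched with a nearest-neighbor resampler. The grids and resolutions below are toy values, not RRED outputs, and the function name is hypothetical.

```python
import numpy as np

def resample_nearest(grid, src_res, dst_res):
    """Resample a raster from src_res to dst_res (same origin, metres) by
    nearest-neighbor lookup at output cell centres: the simplest way to
    co-register, e.g., a 30 m land-cover grid onto a 10 m DEM grid."""
    ny, nx = grid.shape
    scale = dst_res / src_res                       # output-to-input index ratio
    n_out_y = int(round(ny / scale))
    n_out_x = int(round(nx / scale))
    rows = np.clip(((np.arange(n_out_y) + 0.5) * scale).astype(int), 0, ny - 1)
    cols = np.clip(((np.arange(n_out_x) + 0.5) * scale).astype(int), 0, nx - 1)
    return grid[np.ix_(rows, cols)]

landcover_30m = np.array([[1, 2], [3, 4]])          # toy 2x2 grid at 30 m
landcover_10m = resample_nearest(landcover_30m, 30.0, 10.0)
print(landcover_10m.shape)  # each 30 m cell becomes a 3x3 block of 10 m cells
```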
NASA Astrophysics Data System (ADS)
Garen, D. C.; Kahl, A.; Marks, D. G.; Winstral, A. H.
2012-12-01
In mountainous catchments, it is well known that meteorological inputs such as precipitation, air temperature, and humidity vary greatly with elevation, spatial location, and time. Understanding and monitoring catchment inputs is necessary for characterizing and predicting hydrologic response to these inputs. This is always true, but it is most critical during large storms, when the input to the stream system from rain and snowmelt creates the potential for flooding. Beyond such crisis events, however, proper estimation of catchment inputs and their spatial distribution is also needed in more prosaic but no less important water and related resource management activities. The first objective of this study is to apply a geostatistical spatial interpolation technique (elevationally detrended kriging) to precipitation and dew point temperature on an hourly basis and explore its characteristics, accuracy, and other issues. The second objective is to use these spatial fields to determine precipitation phase (rain or snow) during a large, dynamic winter storm. The catchment studied is the data-rich Reynolds Creek Experimental Watershed near Boise, Idaho. As part of this analysis, precipitation-elevation lapse rates are examined for spatial and temporal consistency. A clear dependence of lapse rate on precipitation amount exists. Certain stations, however, are outliers from these relationships, showing that significant local effects can be present and raising the question of whether such stations should be used for spatial interpolation. Experiments with selecting subsets of stations demonstrate the importance of elevation range and spatial placement on the interpolated fields. Hourly spatial fields of precipitation and dew point temperature are used to distinguish precipitation phase during a large rain-on-snow storm in December 2005.
This application demonstrates the feasibility of producing hourly spatial fields and the importance of doing so to support an accurate determination of precipitation phase for assessing catchment hydrologic response to the storm.
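The two steps of the analysis, elevational detrending of a station variable and a dew-point phase decision, can be sketched as follows. The station values are invented, simple inverse-distance weighting on elevation stands in for kriging, and the 0 °C dew-point threshold is a common rule of thumb rather than the study's calibrated criterion.

```python
import numpy as np

# Hypothetical station data: elevation (m), hourly precip (mm), dew point (C).
elev = np.array([1200.0, 1500.0, 1800.0, 2100.0])
precip = np.array([1.0, 1.6, 2.1, 2.8])
dewpt = np.array([2.0, 0.5, -1.0, -2.5])

def detrended_interp(x_sta, y_sta, x_tgt):
    """Elevationally detrended interpolation: fit a linear trend on
    elevation, interpolate the residuals (inverse-distance here as a
    stand-in for kriging), then add the trend back at the target."""
    slope, intercept = np.polyfit(x_sta, y_sta, 1)
    resid = y_sta - (slope * x_sta + intercept)
    w = 1.0 / (np.abs(x_sta - x_tgt) + 1e-6)        # 1-D IDW on elevation
    return slope * x_tgt + intercept + np.sum(w * resid) / np.sum(w)

target_elev = 1650.0
p = detrended_interp(elev, precip, target_elev)     # precip estimate (mm)
td = detrended_interp(elev, dewpt, target_elev)     # dew point estimate (C)
phase = "snow" if td <= 0.0 else "rain"             # simple phase threshold
print(round(p, 2), phase)
```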
Biodegradable Plastic Mulch Films: Impacts on Soil Microbial Communities and Ecosystem Functions.
Bandopadhyay, Sreejata; Martin-Closas, Lluis; Pelacho, Ana M; DeBruyn, Jennifer M
2018-01-01
Agricultural plastic mulch films are widely used in specialty crop production systems because of their agronomic benefits. Biodegradable plastic mulches (BDMs) offer an environmentally sustainable alternative to conventional polyethylene (PE) mulch. Unlike PE films, which need to be removed after use, BDMs are tilled into soil where they are expected to biodegrade. However, there remains considerable uncertainty about long-term impacts of BDM incorporation on soil ecosystems. BDMs potentially influence soil microbial communities in two ways: first, as a surface barrier prior to soil incorporation, indirectly affecting soil microclimate and atmosphere (similar to PE films) and second, after soil incorporation, as a direct input of physical fragments, which add carbon, microorganisms, additives, and adherent chemicals. This review summarizes the current literature on impacts of plastic mulches on soil biological and biogeochemical processes, with a special emphasis on BDMs. The combined findings indicated that when used as a surface barrier, plastic mulches altered soil microbial community composition and functioning via microclimate modification, though the nature of these alterations varied between studies. In addition, BDM incorporation into soil can result in enhanced microbial activity and enrichment of fungal taxa. This suggests that despite the fact that total carbon input from BDMs is minuscule, a stimulatory effect on microbial activity may ultimately affect soil organic matter dynamics. To address the current knowledge gaps, long term studies and a better understanding of impacts of BDMs on nutrient biogeochemistry are needed. These are critical to evaluating BDMs as they relate to soil health and agroecosystem sustainability.
Keys, Yolanda; Silverman, Susan R; Evans, Jennie
2017-10-01
The purpose of this study was to collect the perceptions of design professionals and clinicians regarding design process success strategies and elements of interprofessional engagement and communication during healthcare design and construction projects. Additional objectives were to gather best practices to maximize clinician engagement and provide tools and techniques to improve interdisciplinary collaboration for future projects. Strategies are needed to enhance the design and construction process and create interactions that benefit not only the project but the individuals working to see its completion. Meaningful interprofessional collaboration is essential to any healthcare design project and making sure the various players communicate is a critical element. This was a qualitative study conducted via an online survey. Respondents included architects, construction managers, interior designers, and healthcare personnel who had recently been involved in a building renovation or new construction project for a healthcare facility. Responses to open-ended questions were analyzed for themes, and descriptive statistics were used to provide insight into participant demographics. Information on the impressions, perceptions, and opportunities related to clinician involvement in design projects was collected from nurses, architects, interior designers, and construction managers. Qualitative analysis revealed themes of clinician input, organizational dynamics, and a variety of communication strategies to be the most frequently mentioned elements of successful interprofessional collaboration. This study validates the need to include clinician input in the design process, to consider the importance of organizational dynamics on design team functioning, and to incorporate effective communication strategies during design and construction projects.
Reasons For Physicians Not Adopting Clinical Decision Support Systems: Critical Analysis.
Khairat, Saif; Marc, David; Crosby, William; Al Sanousi, Ali
2018-04-18
Clinical decision support systems (CDSSs) are an integral component of today's health information technologies. They assist with interpretation, diagnosis, and treatment. A CDSS can be embedded throughout the patient safety continuum, providing reminders, recommendations, and alerts to health care providers. Although CDSSs have been shown to reduce medical errors and improve patient outcomes, they have fallen short of their full potential. User acceptance has been identified as one of the potential reasons for this shortfall. The purpose of this paper was to conduct a critical review and task analysis of CDSS research and to develop a new framework for CDSS design in order to achieve user acceptance. A critical review of CDSS papers was conducted with a focus on user acceptance. To gain a greater understanding of the problems associated with CDSS acceptance, we conducted a task analysis to identify and describe the goals, user input, system output, knowledge requirements, and constraints from two different perspectives: the machine (ie, the CDSS engine) and the user (ie, the physician). Favorability of CDSSs was based on user acceptance of clinical guidelines, reminders, alerts, and diagnostic suggestions. We propose two models: (1) the user acceptance and system adaptation design model, which includes optimizing CDSS design based on user needs/expectations, and (2) the input-process-output-engage model, which reveals to users the processes that govern CDSS outputs. This research demonstrates that the incorporation of the proposed models will improve user acceptance and thus support the beneficial effects of CDSS adoption. Ultimately, if a user does not accept technology, this not only poses a threat to the use of the technology but can also pose a threat to the health and well-being of patients. ©Saif Khairat, David Marc, William Crosby, Ali Al Sanousi. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 18.04.2018.
DOT National Transportation Integrated Search
2012-04-01
The purpose of this report is to document the stakeholder input received at the February 8, 2012, stakeholder workshop at the Hall of States in Washington, D.C. on goals, performance measures, transformative performance targets, and high-level user n...
Do Colleges Cultivate Critical Thinking, Problem Solving, Writing and Interpersonal Skills?
ERIC Educational Resources Information Center
Saavedra, Anna Rosefsky; Saavedra, Juan Esteban
2011-01-01
We investigate how much value college enrollment adds to students' critical thinking, problem-solving and communication skills, and the role college inputs play in developing these competencies, using data from a 2009 collegiate assessment pilot study in Colombia. Relative to observationally similar first year students, students in their final…
Preface [to special section on recent Loch Vale Watershed research
Baron, Jill S.; Williams, Mark W.
2000-01-01
Catchment-scale intensive and extensive research conducted over the last decade shows that our understanding of the biogeochemical and hydrologic processes in subalpine and alpine basins is not yet sufficiently mature to model and predict how biogeochemical transformations and surface water quality will change in response to climatic or human-driven changes in energy, water, and chemicals. A better understanding of these processes is needed for input to decision-making regulatory agencies and federal land managers. In recognition of this problem the National Research Council [1998] has identified as a critical research need an improved understanding of how global change will affect biogeochemical interactions with the hydrologic cycle and biogeochemical controls over the transport of water, nutrients, and materials from land to freshwater ecosystems. Improved knowledge of alpine and subalpine ecosystems is particularly important since high-elevation catchments are very sensitive to small changes in the flux of energy, chemicals, and water. Furthermore, alpine ecosystems may act as early warning indicators for ecosystem changes at lower elevations.
Assessing and managing breast cancer risk: clinicians' current practice and future needs.
Collins, Ian M; Steel, Emma; Mann, G Bruce; Emery, Jon D; Bickerstaffe, Adrian; Trainer, Alison; Butow, Phyllis; Pirotta, Marie; Antoniou, Antonis C; Cuzick, Jack; Hopper, John; Phillips, Kelly-Anne; Keogh, Louise A
2014-10-01
Decision support tools for the assessment and management of breast cancer risk may improve uptake of prevention strategies. End-user input in the design of such tools is critical to increase clinical use. Before developing such a computerized tool, we examined clinicians' practice and future needs. Twelve breast surgeons, 12 primary care physicians and 5 practice nurses participated in 4 focus groups. These were recorded, coded, and analyzed to identify key themes. Participants identified difficulties assessing risk, including a lack of available tools to standardize practice. Most expressed confidence identifying women at potentially high risk, but not moderate risk. Participants felt a tool could especially reassure young women at average risk. Desirable features included: evidence-based, accessible (e.g. web-based), and displaying absolute (not relative) risks in multiple formats. The potential to create anxiety was a concern. Development of future tools should address these issues to optimize translation of knowledge into clinical practice. Copyright © 2014 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Piepel, Gregory F.; Amidan, Brett G.; Hu, Rebecca
2011-11-28
This report summarizes previous laboratory studies to characterize the performance of methods for collecting, storing/transporting, processing, and analyzing samples from surfaces contaminated by Bacillus anthracis or related surrogates. The focus is on plate culture and count estimates of surface contamination for swab, wipe, and vacuum samples of porous and nonporous surfaces. Summaries of the previous studies and their results were assessed to identify gaps in information needed as inputs to calculate key parameters critical to risk management in biothreat incidents. One key parameter is the number of samples needed to make characterization or clearance decisions with specified statistical confidence. Other key parameters include the ability to calculate, following contamination incidents, (1) estimates of Bacillus anthracis contamination, as well as the bias and uncertainties in the estimates, and (2) confidence in characterization and clearance decisions for contaminated or decontaminated buildings. Gaps in knowledge and understanding identified during the summary of the studies are discussed and recommendations are given for future studies.
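One of the key parameters named above, the number of samples needed for a clearance decision at a specified statistical confidence, has a standard back-of-the-envelope form under idealized assumptions (random sampling, perfect recovery). This sketch illustrates that question; it is not the report's own calculation.

```python
import math

def samples_for_detection(prevalence, confidence):
    """Minimum number of random surface samples so that at least one
    detects contamination with the given confidence, assuming a fraction
    `prevalence` of locations is contaminated and perfect recovery.
    P(miss all) = (1 - prevalence)**n, so solve for the smallest n with
    1 - (1 - prevalence)**n >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - prevalence))

# E.g., detecting 5% surface prevalence with 95% confidence:
n = samples_for_detection(0.05, 0.95)
print(n)  # 59
```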
Strengthening Connections between Dendrohydrology and Water Management in the Mediterranean Basin
NASA Astrophysics Data System (ADS)
Touchan, R.; Freitas, R. J.
2017-12-01
Dendrochronology can provide the knowledge upon which to base sound decisions for water resources. In general, water managers are limited to using short continuous instrumental records for forecasting streamflows and reservoir levels. Longer hydrological records are required. Proxy data such as annual tree-ring growth provide us with knowledge of the past frequency and severity of climatic anomalies, such as drought and wet periods, and can be used to improve probability calculations of future events. By improving probability input to these plans, water managers can use this information for water allocations, water conservation measures, and water efficiency methods. Accurate planning is critical in water deficit regions with histories of conflict over land and limited water. Here, we link the science of dendrohydrology with water management, and identify appropriate forums for scientists, policy decision makers, and water managers to collaborate in translating science into effective actions anticipating extreme events, such drought or floods. We will present examples of several dendrohydrological reconstructions from the eastern Mediterranean and North Africa as input for water management plans. Different disciplines are needed to work together, and we identify possible mechanisms to collaborate in order to reach this crucial necessity to use scarce water wisely.
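A minimal sketch of the reconstruction idea: calibrate a linear model of streamflow on a tree-ring index over the instrumental overlap period, then apply it backwards to the pre-instrumental years. The numbers are invented (and made perfectly proportional so the fit is exact), purely to show the mechanics.

```python
import numpy as np

# Hypothetical standardized ring-width index over 10 years, and an
# instrumental streamflow record covering only the last 6 of them.
rw = np.array([0.8, 1.1, 0.9, 1.3, 0.7, 1.0, 1.2, 0.6, 1.4, 1.0])
flow_obs = np.array([70.0, 100.0, 120.0, 60.0, 140.0, 100.0])  # last 6 yrs

# Calibrate flow = a*rw + b on the overlap period by least squares...
A = np.vstack([rw[-6:], np.ones(6)]).T
(a, b), *_ = np.linalg.lstsq(A, flow_obs, rcond=None)

# ...then extend the record back over the pre-instrumental years.
flow_recon = a * rw[:4] + b
print(flow_recon.round(1))
```

Real reconstructions would also verify the model on withheld data and report confidence intervals; that machinery is omitted here.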
Short-term plasticity and long-term potentiation mimicked in single inorganic synapses
NASA Astrophysics Data System (ADS)
Ohno, Takeo; Hasegawa, Tsuyoshi; Tsuruoka, Tohru; Terabe, Kazuya; Gimzewski, James K.; Aono, Masakazu
2011-08-01
Memory is believed to occur in the human brain as a result of two types of synaptic plasticity: short-term plasticity (STP) and long-term potentiation (LTP). In neuromorphic engineering, emulation of known neural behaviour has proven difficult to implement in software because of the highly complex, interconnected nature of thought processes. Here we report the discovery of a Ag2S inorganic synapse, which emulates the synaptic functions of both STP and LTP through the use of input pulse repetition time. The structure, known as an atomic switch, operating at critical voltages, stores information as STP with a spontaneous decay of conductance level in response to intermittent input stimuli, whereas frequent stimulation results in a transition to LTP. The Ag2S inorganic synapse has interesting characteristics with analogies to an individual biological synapse, and achieves dynamic memorization in a single device without the need for external preprogramming. A psychological model related to the process of memorizing and forgetting is also demonstrated using the inorganic synapses. Our Ag2S element represents a breakthrough in mimicking synaptic behaviour essential for the further creation of artificial neural systems that emulate characteristics of human memory.
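The STP-to-LTP transition driven by pulse repetition time can be caricatured with a two-term update: each input pulse potentiates conductance toward a ceiling, and conductance decays spontaneously between pulses. Parameter values are illustrative, not fitted to the Ag2S device.

```python
import math

def synapse_conductance(n_pulses, interval, delta=0.2, tau=2.0, g_max=1.0):
    """Toy model of the atomic-switch synapse: each pulse increments
    conductance toward g_max, and conductance decays spontaneously
    with time constant tau during the interval before the next pulse.
    All parameters are invented for illustration."""
    g = 0.0
    for _ in range(n_pulses):
        g += delta * (g_max - g)           # potentiation by the pulse
        g *= math.exp(-interval / tau)     # spontaneous decay until next pulse
    return g

g_frequent = synapse_conductance(20, interval=0.5)   # short repetition time
g_sparse = synapse_conductance(20, interval=5.0)     # long repetition time
print(g_frequent > g_sparse)  # frequent stimulation retains conductance (LTP-like)
```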
Continuous attractor network models of grid cell firing based on excitatory–inhibitory interactions
Shipston‐Sharman, Oliver; Solanka, Lukas
2016-01-01
Neurons in the medial entorhinal cortex encode location through spatial firing fields that have a grid-like organisation. The challenge of identifying mechanisms for grid firing has been addressed through experimental and theoretical investigations of medial entorhinal circuits. Here, we discuss evidence for continuous attractor network models that account for grid firing by synaptic interactions between excitatory and inhibitory cells. These models assume that grid-like firing patterns are the result of computation of location from velocity inputs, with additional spatial input required to oppose drift in the attractor state. We focus on properties of continuous attractor networks that are revealed by explicitly considering excitatory and inhibitory neurons, their connectivity and their membrane potential dynamics. Models at this level of detail can account for theta-nested gamma oscillations as well as grid firing, predict spatial firing of interneurons as well as excitatory cells, show how gamma oscillations can be modulated independently from spatial computations, reveal critical roles for neuronal noise, and demonstrate that only a subset of excitatory cells in a network need have grid-like firing fields. Evaluating experimental data against predictions from detailed network models will be important for establishing the mechanisms mediating grid firing. PMID:27870120
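The central assumption, that a bump of activity can compute location by integrating velocity inputs, can be sketched with a toy 1-D ring attractor using the shifted-connectivity trick common to such models. All parameters are invented; no attempt is made to reproduce separate excitatory/inhibitory cell types or gamma oscillations.

```python
import numpy as np

# Toy 1-D continuous attractor on a ring of N cells: a bump of activity
# is maintained by local excitation plus global inhibition, and a
# velocity input shifts the recurrent kernel so that the bump's position
# integrates velocity over time. Parameters are illustrative only.
N = 64
theta = 2 * np.pi * np.arange(N) / N

def step(act, velocity, gain=8.0):
    offset = gain * velocity * 2 * np.pi / N   # kernel shift per step
    d = np.angle(np.exp(1j * (theta[:, None] - theta[None, :] - offset)))
    W = np.exp(-d ** 2 / 0.5)                  # shifted local-excitation kernel
    drive = W @ act
    drive -= drive.mean()                      # global inhibition
    act = np.maximum(drive, 0.0)               # rectification
    return act / (np.linalg.norm(act) + 1e-9)

act = np.exp(-((theta - np.pi) ** 2))          # initial bump at pi
start = theta[np.argmax(act)]
for _ in range(10):
    act = step(act, velocity=0.25)
end = theta[np.argmax(act)]
moved = np.angle(np.exp(1j * (end - start)))   # wrapped bump displacement
print(moved > 0.0)
```

In the full models discussed in the review, additional spatial input would correct the drift that accumulates in such integration.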
Illusions of team working in health care.
West, Michael A; Lyubovnikova, Joanne
2013-01-01
The ubiquity and value of teams in healthcare are well acknowledged. However, in practice, healthcare teams vary dramatically in their structures and effectiveness in ways that can damage team processes and patient outcomes. The aim of this paper is to highlight these characteristics and to extrapolate several important aspects of teamwork that have a powerful impact on team effectiveness across healthcare contexts. The paper draws upon the literature from health services management and organisational behaviour to provide an overview of the current science of healthcare teams. Underpinned by the input-process-output framework of team effectiveness, team composition, team task, and organisational support are viewed as critical inputs that influence key team processes including team objectives, leadership and reflexivity, which in turn impact staff and patient outcomes. Team training interventions and care pathways can facilitate more effective interdisciplinary teamwork. The paper argues that the prevalence of the term "team" in healthcare makes the synthesis and advancement of the scientific understanding of healthcare teams a challenge. Future research therefore needs to better define the fundamental characteristics of teams in studies in order to ensure that findings based on real teams, rather than pseudo-like groups, are accumulated.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-07
...The Food and Drug Administration (FDA) is announcing a 2-day public hearing to obtain input from interested persons on FDA's scope and direction in modernizing the regulations, policies, and practices that apply to the conduct of clinical trials of FDA-regulated products. Clinical trials are a critical source of evidence to inform medical policy and practice, and effective regulatory oversight is needed to ensure that human subjects are protected and resulting clinical trial data are credible and accurate. FDA is aware of concerns within the clinical trial community that certain regulations and policies applicable to the conduct of clinical trials may result in inefficiencies or increased cost and may not facilitate the use of innovative methods and technological advances to improve clinical trial quality. The Agency is involved in an effort to modernize the regulatory framework that governs clinical trials and approaches to good clinical practice (GCP). The purpose of this hearing is to solicit public input from a broad group of stakeholders on the scope and direction of this effort, including encouraging the use of innovative models that may enhance the effectiveness and efficiency of the clinical trial enterprise.
Rossitsa River Basin: Flood Hazard and Risk Identification
NASA Astrophysics Data System (ADS)
Mavrova-Guirguinova, Maria; Pencheva, Denislava
2017-04-01
Flood risk management planning and the adoption of risk-reduction measures such as early warning require surveys involving risk identification. This project presents risk identification combining two lines of analysis: (1) creation of a mathematical model of rainfall-runoff processes in a watershed based on a limited number of observed input and output variables; (2) procedures for determining critical thresholds, i.e. discharges/water levels corresponding to certain consequences. The pilot region is the Rossitsa river basin, Sevlievo, Bulgaria. The first line of analysis follows these steps: (a) creation and calibration of unit hydrograph models (UHM) based on a limited number of observed discharge and precipitation data; the survey in the selected region comprises 22 observations of excess rainfall and discharge; (b) statistical determination of the relations between UHM coefficients and the input parameters, except for the run-off coefficient, which is modelled as a function of three parameters (amount of precipitation two days before, soil condition, rainfall intensity) using a feedforward neural network; (c) additional simulations with the UHM to generate synthetic rainfall-runoff events that extend the range of observed data; (d) training, validation and testing of a generalized regional ANN model for discharge forecasting with four input parameters, where the training data set consists of synthetic data and the validation and testing data sets consist of observations. In the second line of analysis, concerning the determination of critical hazard levels, a function between consequences and discharges is derived. Unsteady simulations with the hydraulic model, using three typical hydrographs, determine the available reaction time between the lower and upper critical thresholds.
The critical thresholds are then corrected to provide the necessary reaction time between thresholds, and a probability analysis of the finally determined thresholds is made. The result of the described method is a catalogue for off-line flood hazard and risk identification, which can be used as an interactive computer system based on simulations of the ANN "Catalogue". Flood risk identification for a future rainfall event is made in a multi-dimensional space for each kind of soil condition (dry, average wet and wet) and the observed amount of precipitation two days before. Rainfall-runoff scenarios for intensive rainfall and for sustained rainfall (more than 6 hours) are taken into account. Critical thresholds and hazard zones requiring specific operative activities (rescue and recovery), corresponding to each of the regulated flood protection levels (unit, municipality, regional or national), are presented. The catalogue allows the extraction of flood hazard scenarios; it is therefore useful at the prevention stage of flood protection planning (planning emergency operations, measures and the resources for their implementation) and for creating scenarios for training under the emergency plans. For early warning, it gives an approximate forecast of flood hazard and supplies a necessary reaction time of about 24 hours, enabling early warning to the responsible authorities, all parts of the Unified Rescue System and members of the relevant headquarters for disaster protection (at municipality, regional or national level).
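The unit hydrograph model at the heart of the first line of analysis reduces to a discrete convolution of excess rainfall with the unit hydrograph ordinates. The ordinates and rainfall below are invented for illustration, not the Rossitsa calibration.

```python
import numpy as np

# Unit hydrograph principle: storm discharge is the convolution of
# hourly excess rainfall with the catchment's unit hydrograph (the
# response to one unit of excess rain). Toy values, not calibrated.
uh = np.array([0.1, 0.35, 0.3, 0.15, 0.1])    # m3/s per mm, sums to 1.0
excess_rain = np.array([2.0, 5.0, 1.0])        # mm per hour

q = np.convolve(excess_rain, uh)               # hourly discharge response
print(q)

# Mass balance check: total runoff equals total excess rain times UH volume.
assert np.isclose(q.sum(), excess_rain.sum() * uh.sum())
```

A critical-threshold test in the spirit of the paper would then compare `q.max()` against a discharge threshold tied to known consequences.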
Restoring tactile and proprioceptive sensation through a brain interface
Tabot, Gregg A.; Kim, Sung Shin; Winberry, Jeremy E.; Bensmaia, Sliman J.
2014-01-01
Somatosensation plays a critical role in the dexterous manipulation of objects, in emotional communication, and in the embodiment of our limbs. For upper-limb neuroprostheses to be adopted by prospective users, prosthetic limbs will thus need to provide sensory information about the position of the limb in space and about objects grasped in the hand. One approach to restoring touch and proprioception consists of electrically stimulating neurons in somatosensory cortex in the hopes of eliciting meaningful sensations to support the dexterous use of the hands, promote their embodiment, and perhaps even restore the affective dimension of touch. In this review, we discuss the importance of touch and proprioception in everyday life, then describe approaches to providing artificial somatosensory feedback through intracortical microstimulation (ICMS). We explore the importance of biomimicry – the elicitation of naturalistic patterns of neuronal activation – and that of adaptation – the brain’s ability to adapt to novel sensory input, and argue that both biomimicry and adaptation will play a critical role in the artificial restoration of somatosensation. We also propose that the documented re-organization that occurs after injury does not pose a significant obstacle to brain interfaces. While still at an early stage of development, sensory restoration is a critical step in transitioning upper-limb neuroprostheses from the laboratory to the clinic. PMID:25201560
Assessing the effect of elevated carbon dioxide on soil carbon: a comparison of four meta-analyses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hungate, B. A.; van Groenigen, K.; Six, J.
2009-08-01
Soil is the largest reservoir of organic carbon (C) in the terrestrial biosphere and soil C has a relatively long mean residence time. Rising atmospheric carbon dioxide (CO{sub 2}) concentrations generally increase plant growth and C input to soil, suggesting that soil might help mitigate atmospheric CO{sub 2} rise and global warming. But to what extent mitigation will occur is unclear. The large size of the soil C pool not only makes it a potential buffer against rising atmospheric CO{sub 2}, but also makes it difficult to measure changes amid the existing background. Meta-analysis is one tool that can overcomemore » the limited power of single studies. Four recent meta-analyses addressed this issue but reached somewhat different conclusions about the effect of elevated CO{sub 2} on soil C accumulation, especially regarding the role of nitrogen (N) inputs. Here, we assess the extent of differences between these conclusions and propose a new analysis of the data. The four meta-analyses included different studies, derived different effect size estimates from common studies, used different weighting functions and metrics of effect size, and used different approaches to address nonindependence of effect sizes. Although all factors influenced the mean effect size estimates and subsequent inferences, the approach to independence had the largest influence. We recommend that meta-analysts critically assess and report choices about effect size metrics and weighting functions, and criteria for study selection and independence. Such decisions need to be justified carefully because they affect the basis for inference. Our new analysis, with a combined data set, confirms that the effect of elevated CO{sub 2} on net soil C accumulation increases with the addition of N fertilizers. 
Although the effect at low N inputs was not significant, statistical power to detect biogeochemically important effect sizes at low N is limited, even with meta-analysis, suggesting the continued need for long-term experiments.
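The role of weighting functions discussed in this abstract can be illustrated with a minimal fixed-effect vs random-effects sketch; the effect sizes and sampling variances below are invented for illustration and are not drawn from the four meta-analyses:

```python
# Minimal fixed-effect vs random-effects (DerSimonian-Laird) meta-analysis.
# Effect sizes and sampling variances are illustrative only.
effects = [0.40, -0.10, 0.25, 0.05]      # hypothetical ln response ratios
variances = [0.01, 0.01, 0.02, 0.02]     # hypothetical sampling variances

# Fixed-effect estimate: inverse-variance weighting
w = [1.0 / v for v in variances]
fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)

# DerSimonian-Laird estimate of between-study variance tau^2
q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(effects) - 1)) / c)

# Random-effects estimate: weights shrink toward equality as tau^2 grows
w_re = [1.0 / (v + tau2) for v in variances]
random_mean = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)

print(f"fixed={fixed:.3f}  tau2={tau2:.3f}  random={random_mean:.3f}")
```

Different weighting choices (inverse-variance, replication-based, unweighted) change both the pooled estimate and its uncertainty, which is why the abstract urges meta-analysts to report such choices explicitly.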
NASA Technical Reports Server (NTRS)
Goldberg, Robert K.; Carney, Kelly S.; DuBois, Paul; Hoffarth, Canio; Rajan, Subramaniam; Blackenhorn, Gunther
2015-01-01
The need for accurate material models to simulate the deformation, damage and failure of polymer matrix composites under impact conditions is becoming critical as these materials are gaining increased usage in the aerospace and automotive industries. While there are several composite material models currently available within commercial transient dynamic finite element codes, several features have been identified as being lacking in the currently available material models that could substantially enhance the predictive capability of the impact simulations. A specific desired feature pertains to the incorporation of both plasticity and damage within the material model. Another desired feature relates to using experimentally based tabulated stress-strain input to define the evolution of plasticity and damage as opposed to specifying discrete input properties (such as modulus and strength) and employing analytical functions to track the response of the material. To begin to address these needs, a combined plasticity and damage model suitable for use with both solid and shell elements is being developed for implementation within the commercial code LS-DYNA. The plasticity model is based on extending the Tsai-Wu composite failure model into a strain-hardening based orthotropic plasticity model with a non-associative flow rule. The evolution of the yield surface is determined based on tabulated stress-strain curves in the various normal and shear directions and is tracked using the effective plastic strain. The effective plastic strain is computed by using the non-associative flow rule in combination with appropriate numerical methods. To compute the evolution of damage, a strain equivalent semi-coupled formulation is used, in which a load in one direction results in a stiffness reduction in multiple coordinate directions. 
A specific laminated composite is examined to demonstrate the process of characterizing and analyzing the response of a composite using the developed model.
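For context, the classical Tsai-Wu criterion that the abstract describes extending can be sketched in its plane-stress form; the lamina strengths below are illustrative placeholders, not values from the paper:

```python
# Plane-stress Tsai-Wu failure criterion for an orthotropic lamina.
# Strength values used in the example are illustrative, not from the paper.
def tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    """Failure index: >= 1.0 indicates failure under the Tsai-Wu criterion."""
    f1 = 1.0 / Xt - 1.0 / Xc
    f2 = 1.0 / Yt - 1.0 / Yc
    f11 = 1.0 / (Xt * Xc)
    f22 = 1.0 / (Yt * Yc)
    f66 = 1.0 / S**2
    f12 = -0.5 * (f11 * f22) ** 0.5  # a common default for the interaction term
    return (f1 * s1 + f2 * s2 + f11 * s1**2 + f22 * s2**2
            + f66 * t12**2 + 2.0 * f12 * s1 * s2)

# Illustrative unidirectional lamina strengths (MPa)
Xt, Xc, Yt, Yc, S = 1500.0, 1200.0, 50.0, 200.0, 70.0
idx = tsai_wu_index(800.0, 10.0, 20.0, Xt, Xc, Yt, Yc, S)
print(f"failure index = {idx:.3f}")
```

The plasticity model described above replaces this single failure check with a strain-hardening yield surface of the same functional form, whose coefficients evolve with tabulated stress-strain data.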
NASA Technical Reports Server (NTRS)
Briggs, Maxwell; Schifer, Nicholas
2011-01-01
Test hardware used to validate net heat prediction models. Problem: Net Heat Input cannot be measured directly during operation. Net heat input is a key parameter needed in prediction of efficiency for convertor performance. Efficiency = Electrical Power Output (Measured) divided by Net Heat Input (Calculated). Efficiency is used to compare convertor designs and trade technology advantages for mission planning.
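The efficiency relation quoted above is straightforward to encode; a minimal sketch with hypothetical wattages (the function name and example values are mine, not from the test program):

```python
def convertor_efficiency(electrical_power_out_w: float, net_heat_in_w: float) -> float:
    """Efficiency = measured electrical power output / calculated net heat input."""
    if net_heat_in_w <= 0:
        raise ValueError("net heat input must be positive")
    return electrical_power_out_w / net_heat_in_w

# e.g. 88 W measured electrical output from 250 W calculated net heat input
print(convertor_efficiency(88.0, 250.0))  # 0.352
```

Because net heat input is calculated rather than measured, any bias in the heat-prediction model propagates directly into the efficiency figure used to compare convertor designs.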
Measuring Nanomaterial Release from Carbon Nanotube Composites: Review of the State of the Science
NASA Astrophysics Data System (ADS)
Harper, Stacey; Wohlleben, Wendel; Doa, Maria; Nowack, Bernd; Clancy, Shaun; Canady, Richard; Maynard, Andrew
2015-05-01
Hazard studies of “as-produced” nanomaterials are increasingly available, yet a critical gap exists in exposure science that may impede safe development of nanomaterials. The gap is that we do not understand what is actually released, because nanomaterials can change when released in ways that are not understood. We also generally do not have methods capable of quantitatively measuring what is released to support dose assessment. This review presents a case study of multi-walled carbon nanotubes (MWCNTs) to illustrate the measurement challenge involved in bridging this gap. As the use and value of MWCNTs increase, methods to measure what is released in ways relevant to risk evaluation are critically needed if products containing these materials are to be economically, environmentally, and socially sustainable. This review draws on the input of over 50 experts engaged in a program of workshops and technical report writing to address the release of MWCNTs from nanocomposite materials across their life cycle. The expert analyses reveal that new and sophisticated methods are required to measure and assess MWCNT exposures for realistic exposure scenarios. Furthermore, method requirements vary with the materials and conditions of release across life cycle stages of products. While the review shows that the likelihood of significant release of MWCNTs appears to be low for many stages of the composite life cycle, measurement methods are needed so that exposures from MWCNT composites are understood and managed. In addition, there is an immediate need to refocus attention from the study of “as-produced” nanomaterials to coordinated research on actual release scenarios.
Graphical Visualization of Human Exploration Capabilities
NASA Technical Reports Server (NTRS)
Rodgers, Erica M.; Williams-Byrd, Julie; Arney, Dale C.; Simon, Matthew A.; Williams, Phillip A.; Barsoum, Christopher; Cowan, Tyler; Larman, Kevin T.; Hay, Jason; Burg, Alex
2016-01-01
NASA's pioneering space strategy will require advanced capabilities to expand the boundaries of human exploration on the Journey to Mars (J2M). The Evolvable Mars Campaign (EMC) architecture serves as a framework to identify critical capabilities that need to be developed and tested in order to enable a range of human exploration destinations and missions. Agency-wide System Maturation Teams (SMT) are responsible for the maturation of these critical exploration capabilities and help formulate, guide and resolve performance gaps associated with the EMC-identified capabilities. Systems Capability Organization Reporting Engine boards (SCOREboards) were developed to integrate the SMT data sets into cohesive human exploration capability stories that can be used to promote dialog and communicate NASA's exploration investments. Each SCOREboard provides a graphical visualization of SMT capability development needs that enable exploration missions, and presents a comprehensive overview of data that outlines a roadmap of system maturation needs critical for the J2M. SCOREboards are generated by a computer program that extracts data from a main repository, sorts the data based on a tiered data reduction structure, and then plots the data according to specified user inputs. The ability to sort and plot varying data categories provides the flexibility to present specific SCOREboard capability roadmaps based on customer requests. This paper presents the development of the SCOREboard computer program and shows multiple complementary, yet different datasets through a unified format designed to facilitate comparison between datasets. Example SCOREboard capability roadmaps are presented followed by a discussion of how the roadmaps are used to: 1) communicate capability developments and readiness of systems for future missions, and 2) influence the definition of NASA's human exploration investment portfolio through capability-driven processes. 
The paper concludes with a description of planned future work to modify the computer program to include additional data and of alternate capability roadmap formats currently under consideration.
Critical Success Factors for the North Carolina Community College System. A Background Paper.
ERIC Educational Resources Information Center
Smith, Kathryn Baker
In response to legislative mandate, the North Carolina State Board of Community Colleges developed a list of Critical Success Factors (CSF) to help define statewide measures of accountability for all community colleges. Developed by staff members of the State Board with input from the state's community college presidents, the CSFs emphasize…
ERIC Educational Resources Information Center
Hamilton, Robert
2014-01-01
In this study, the prototype of a new type of bilingual picture book was field-tested with two sets of mother-son subject pairs. This picture book was designed as a possible tool for providing children with comprehensible input during their critical period for second language acquisition. Context is provided by visual cues and both Japanese and…
Biological soil crusts as an organizing principle in drylands: Chapter 1
Belnap, Jayne; Weber, Bettina; Büdel, Burkhard
2016-01-01
Biological soil crusts (biocrusts) have been present on Earth’s terrestrial surfaces for billions of years. They are a critical part of ecosystem processes in dryland regions, as they cover most of the soil surface and thus mediate almost all inputs and outputs from soils in these areas. There are many intriguing, but understudied, roles these communities may play in drylands. These include their function in nutrient capture and transformation, influence on the movement and distribution of nutrients and water within dryland soils, ability to structure vascular plant communities, role in creating biodiversity hotspots, and the possibility that they can be used as indicators of soil health. There are still many fascinating aspects of these communities that need study, and we hope that this chapter will facilitate such efforts.
Foreign direct investment, development, and overshoot.
McKinney, Laura A
2014-09-01
Overshoot of the earth's carrying capacity is an acute concern for sustainability initiatives that seek to equalize access to the natural resources that are requisite to meet the basic needs of humanity. Demands on nature that exceed ecological capacities compromise critical ecosystem functions that provision the inputs necessary for life. This paper draws on concepts and analytical frameworks from the natural, physical, and social sciences to assess the drivers of sustainability at the global and national level. Integrative theoretical predictions are tested in a structural equation model that advances empirical research on overshoot and outflows of foreign investments that is relatively lacking in the literature. Findings highlight the differential impacts of key aspects of economic globalization on both development and overshoot across nations. Copyright © 2014 Elsevier Inc. All rights reserved.
Solar electric propulsion thrust subsystem development
NASA Technical Reports Server (NTRS)
Masek, T. D.
1973-01-01
The Solar Electric Propulsion System developed under this program was designed to demonstrate all the thrust subsystem functions needed on an unmanned planetary vehicle. The demonstration included operation of the basic elements, power matching input and output voltage regulation, three-axis thrust vector control, subsystem automatic control including failure detection and correction capability (using a PDP-11 computer), operation of critical elements in thermal-vacuum-, zero-gravity-type propellant storage, and data outputs from all subsystem elements. The subsystem elements, functions, unique features, and test setup are described. General features and capabilities of the test-support data system are also presented. The test program culminated in a 1500-h computer-controlled, system-functional demonstration. This included simultaneous operation of two thruster/power conditioner sets. The results of this testing phase satisfied all the program goals.
Research Priorities from Animal Behaviour for Maximising Conservation Progress.
Greggor, Alison L; Berger-Tal, Oded; Blumstein, Daniel T; Angeloni, Lisa; Bessa-Gomes, Carmen; Blackwell, Bradley F; St Clair, Colleen Cassady; Crooks, Kevin; de Silva, Shermin; Fernández-Juricic, Esteban; Goldenberg, Shifra Z; Mesnick, Sarah L; Owen, Megan; Price, Catherine J; Saltz, David; Schell, Christopher J; Suarez, Andrew V; Swaisgood, Ronald R; Winchell, Clark S; Sutherland, William J
2016-12-01
Poor communication between academic researchers and wildlife managers limits conservation progress and innovation. As a result, input from overlapping fields, such as animal behaviour, is underused in conservation management despite its demonstrated utility as a conservation tool and countless papers advocating its use. Communication and collaboration across these two disciplines are unlikely to improve without clearly identified management needs and demonstrable impacts of behavioural-based conservation management. To facilitate this process, a team of wildlife managers and animal behaviour researchers conducted a research prioritisation exercise, identifying 50 key questions that have great potential to resolve critical conservation and management problems. The resulting agenda highlights the diversity and extent of advances that both fields could achieve through collaboration. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
DOT National Transportation Integrated Search
2009-01-01
In the Mechanistic-Empirical Pavement Design Guide (M-EPDG), prediction of flexible pavement response and performance requires dynamic modulus of hot-mix asphalt (HMA) as an input at all three levels of hierarchical inputs. This study was intended to ...
NASA Astrophysics Data System (ADS)
Lutfalla, Suzanne; Skalsky, Rastislav; Martin, Manuel; Balkovic, Juraj; Havlik, Petr; Soussana, Jean-François
2017-04-01
The 4 per 1000 Initiative underlines the role of soil organic matter in addressing the three-fold challenge of food security, adaptation of the land sector to climate change, and mitigation of human-induced GHG emissions. It sets an ambitious global target of a 0.4% (4/1000) annual increase in topsoil organic carbon (SOC) stock. The present collaborative project between the 4 per 1000 research program, INRA and IIASA aims at providing a first global assessment of the translation of this soil organic carbon sequestration target into the equivalent organic matter inputs target. Indeed, soil organic carbon builds up in the soil through different processes leading to an increased input of carbon to the system (by increasing returns to the soil, for instance) or a decreased output of carbon from the system (mainly by biodegradation and mineralization processes). Here we answer the question of how much extra organic matter must be added to agricultural soils every year (in otherwise unchanged climatic conditions) in order to guarantee a 0.4% yearly increase of total soil organic carbon stocks (a 40 cm soil depth is considered). We use the RothC model of soil organic matter turnover on a spatial grid over 10 years to model two situations for croplands: a first situation where soil organic carbon remains constant (system at equilibrium), and a second situation where soil organic matter increases by 0.4% every year. The model accounts for the effects of soil type, temperature, moisture content and plant cover on the turnover process; it is run on a monthly time step, and it can simulate the organic input needed to sustain a given SOC stock (or evolution of SOC stock). These two SOC conditions lead to two average yearly plant inputs over 10 years.
The difference between the two simulated inputs represents the additional yearly input needed to reach the 4 per 1000 objective (input_eq for inputs needed for SOC to remain constant; input_4/1000 for inputs needed for SOC to reach the 4 per 1000 target). A spatial representation of this difference shows the distribution of the required returns to the soil. This first tool will provide the basis for the next steps: choosing and implementing practices to obtain the required additional input. Results will be presented from simulations at the regional scale (country: Slovakia) and at the global scale (0.5° grid resolution). Soil input data come from the HWSD; climatic input data come from the AgMERRA climate dataset averaged over a 30-year period (1980-2010). They show that, at the global scale, given some data corrections which will be presented and discussed, the 4 per 1000 increase in topsoil organic carbon can be reached with a median additional input of +0.89 tC/ha/yr for cropland soils.
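The two-run differencing described above (input_eq vs input_4/1000) can be sketched with a drastically simplified one-pool stand-in for RothC; the pool structure, rate constant, and stock value below are illustrative assumptions, not the study's configuration:

```python
# Sketch of the two-run differencing logic: the C input sustaining a constant
# SOC stock vs the input producing a +0.4 % annual increase. A one-pool model
# dSOC/dt = input - k*SOC stands in for RothC's multi-pool structure.

def required_input(soc_start_t_ha: float, soc_end_t_ha: float,
                   decomposition_rate: float) -> float:
    """Annual C input (tC/ha/yr) balancing decomposition of the mean stock
    plus the net annual stock change (illustrative one-pool simplification)."""
    mean_stock = 0.5 * (soc_start_t_ha + soc_end_t_ha)
    return decomposition_rate * mean_stock + (soc_end_t_ha - soc_start_t_ha)

soc0 = 50.0   # illustrative topsoil SOC stock, tC/ha
k = 0.05      # illustrative bulk decomposition rate, 1/yr

input_eq = required_input(soc0, soc0, k)                 # SOC held constant
input_4per1000 = required_input(soc0, soc0 * 1.004, k)   # +0.4 % in one year

additional = input_4per1000 - input_eq
print(f"additional input: {additional:.3f} tC/ha/yr")
```

The study's actual estimate (+0.89 tC/ha/yr median for croplands) comes from running RothC itself over spatial grids and a decade of monthly time steps; this sketch only reproduces the differencing logic, not the magnitude.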
Critical success factors for competitiveness of construction companies: A critical review
NASA Astrophysics Data System (ADS)
Hanafi, Abdul Ghafur; Nawi, Mohd Nasrun Mohd
2016-08-01
Sustained progress is a fundamental issue for construction companies seeking to survive in a highly competitive industry. Industry players face stiff competition from a large number of existing and new entrants with varied backgrounds and track records. Furthermore, a large number of components determine the competitiveness of contractors, whose organizational structures and governance have become more complicated. Different construction companies have their own unique criteria, which may differ from one to another. The enormous number of issues needs to be reduced to manageable numbers so that measures can be identified and scrutinized to enhance competitiveness. This paper discusses the results of a critical investigation of past studies in Asian countries, namely China, India, Thailand, Singapore and Malaysia. Several fundamental factors have been identified as CSFs for construction companies in each country. The paper also presents a critical survey of the literature on this subject, using critical success factors (CSFs) as a yardstick to gauge the relationships among CSFs in various construction companies in the Asian region. Comprehensive measurement of an organization's performance, and the resulting input to its management, is crucial for business improvement. Measurement also enables organizations to be compared with one another on the basis of standardized data, allowing best practices to be identified and applied more widely. Different countries have their own sets of critical success factors, which may differ in priority while sharing common elements of success for construction companies. The study, which is exploratory in nature, adopted content analysis and an inductive technique to accomplish its objectives.
The Role Of Basal Forebrain Cholinergic Neurons In Fear and Extinction Memory
Knox, Dayan
2016-01-01
Cholinergic input to the neocortex, dorsal hippocampus (dHipp), and basolateral amygdala (BLA) is critical for neural function and synaptic plasticity in these brain regions. Synaptic plasticity in the neocortex, dHipp, ventral Hipp (vHipp), and BLA has also been implicated in fear and extinction memory. This finding raises the possibility that basal forebrain (BF) cholinergic neurons, the predominant source of acetylcholine in these brain regions, have an important role in mediating fear and extinction memory. While empirical studies support this hypothesis, there are interesting inconsistencies among these studies that raise questions about how best to define the role of BF cholinergic neurons in fear and extinction memory. Nucleus basalis magnocellularis (NBM) cholinergic neurons that project to the BLA are critical for fear memory and contextual fear extinction memory. NBM cholinergic neurons that project to the neocortex are critical for cued and contextual fear conditioned suppression, but are not critical for fear memory in other behavioral paradigms, and in the inhibitory avoidance paradigm may even inhibit contextual fear memory formation. Medial septum and diagonal band of Broca cholinergic neurons are critical for contextual fear memory and acquisition of cued fear extinction. Thus, even though the results of previous studies suggest BF cholinergic neurons modulate fear and extinction memory, inconsistent findings among these studies necessitate more research to better define the neural circuits and molecular processes through which BF cholinergic neurons modulate fear and extinction memory. Furthermore, studies determining if BF cholinergic neurons can be manipulated in such a manner as to treat excessive fear in anxiety disorders are needed. PMID:27264248
Burton, Shawn D.
2015-01-01
Granule cell-mediated inhibition is critical to patterning principal neuron activity in the olfactory bulb, and perturbation of synaptic input to granule cells significantly alters olfactory-guided behavior. Despite the critical role of granule cells in olfaction, little is known about how sensory input recruits granule cells. Here, we combined whole-cell patch-clamp electrophysiology in acute mouse olfactory bulb slices with biophysical multicompartmental modeling to investigate the synaptic basis of granule cell recruitment. Physiological activation of sensory afferents within single glomeruli evoked diverse modes of granule cell activity, including subthreshold depolarization, spikelets, and suprathreshold responses with widely distributed spike latencies. The generation of these diverse activity modes depended, in part, on the asynchronous time course of synaptic excitation onto granule cells, which lasted several hundred milliseconds. In addition to asynchronous excitation, each granule cell also received synchronous feedforward inhibition. This inhibition targeted both proximal somatodendritic and distal apical dendritic domains of granule cells, was reliably recruited across sniff rhythms, and scaled in strength with excitation as more glomeruli were activated. Feedforward inhibition onto granule cells originated from deep short-axon cells, which responded to glomerular activation with highly reliable, short-latency firing consistent with tufted cell-mediated excitation. Simulations showed that feedforward inhibition interacts with asynchronous excitation to broaden granule cell spike latency distributions and significantly attenuates granule cell depolarization within local subcellular compartments. Collectively, our results thus identify feedforward inhibition onto granule cells as a core feature of olfactory bulb circuitry and establish asynchronous excitation and feedforward inhibition as critical regulators of granule cell activity. 
SIGNIFICANCE STATEMENT Inhibitory granule cells are involved critically in shaping odor-evoked principal neuron activity in the mammalian olfactory bulb, yet little is known about how sensory input activates granule cells. Here, we show that sensory input to the olfactory bulb evokes a barrage of asynchronous synaptic excitation and highly reliable, short-latency synaptic inhibition onto granule cells via a disynaptic feedforward inhibitory circuit involving deep short-axon cells. Feedforward inhibition attenuates local depolarization within granule cell dendritic branches, interacts with asynchronous excitation to suppress granule cell spike-timing precision, and scales in strength with excitation across different levels of sensory input to normalize granule cell firing rates. PMID:26490853
Critical Seismic Vector Random Excitations for Multiply Supported Structures
NASA Astrophysics Data System (ADS)
Sarkar, A.; Manohar, C. S.
1998-05-01
A method has been developed for determining critical power spectral density matrix models for earthquake excitations which maximize the steady-state response variance of linear, multiply supported, extended structures and which also satisfy constraints on input variance, zero crossing rates, frequency content and transmission time lag. The optimization problem is shown to be non-linear in nature, and solutions are obtained by using an iterative technique based on the linear programming method. A constraint on entropy rate, as a measure of the uncertainty which can be expected in realistic earthquake ground motions, is proposed, which makes the critical excitations more realistic. Two special cases are also considered. Firstly, when knowledge of the autospectral densities is available, the critical response is shown to be produced by fully coherent excitations which are neither in-phase nor out-of-phase. The critical phase between the excitation components depends on structural parameters, but is independent of the autospectral densities of the excitations. Secondly, when knowledge of the autospectral densities and phase spectrum of the excitations is available, the critical response is shown to be produced by a system-dependent coherence function representing neither fully coherent nor fully incoherent ground motions. The applications of these special cases are discussed in the context of land-based extended structures and secondary systems such as nuclear piping assemblies. Illustrative examples of critical inputs and responses for a single-degree-of-freedom system and a long-span suspended cable, which demonstrate the various features of the approach developed, are presented.
Cohen, Jeremy D; Bolstad, Mark; Lee, Albert K
2017-01-01
The hippocampus is critical for producing stable representations of familiar spaces. How these representations arise is poorly understood, largely because changes to hippocampal inputs have not been measured during spatial learning. Here, using intracellular recording, we monitored inputs and plasticity-inducing complex spikes (CSs) in CA1 neurons while mice explored novel and familiar virtual environments. Inputs driving place field spiking increased in amplitude – often suddenly – during novel environment exploration. However, these increases were not sustained in familiar environments. Rather, the spatial tuning of inputs became increasingly similar across repeated traversals of the environment with experience – both within fields and throughout the whole environment. In novel environments, CSs were not necessary for place field formation. Our findings support a model in which initial inhomogeneities in inputs are amplified to produce robust place field activity, then plasticity refines this representation into one with less strongly modulated, but more stable, inputs for long-term storage. DOI: http://dx.doi.org/10.7554/eLife.23040.001 PMID:28742496
Multiple-Input Multiple-Output (MIMO) Linear Systems Extreme Inputs/Outputs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smallwood, David O.
2007-01-01
A linear structure is excited at multiple points with a stationary normal random process. The response of the structure is measured at multiple outputs. If the autospectral densities of the inputs are specified, the phase relationships between the inputs are derived that will minimize or maximize the trace of the autospectral density matrix of the outputs. If the autospectral densities of the outputs are specified, the phase relationships between the outputs that will minimize or maximize the trace of the input autospectral density matrix are derived. It is shown that other phase relationships and ordinary coherence less than one will result in a trace intermediate between these extremes. Least favorable response and some classes of critical response are special cases of the development. It is shown that the derivation for stationary random waveforms can also be applied to nonstationary random, transient, and deterministic waveforms.
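The quantity being extremized here, the trace of the output autospectral density matrix, can be sketched at a single frequency line; the FRF matrix and input autospectra below are invented for illustration:

```python
# Trace of the output autospectral density matrix G_yy = H @ G_xx @ H^H at one
# frequency line, swept over the relative phase of two fully coherent inputs
# with fixed autospectra. H and the input autospectra are illustrative.
import numpy as np

H = np.array([[1.0 + 0.5j, 0.3 - 0.2j],
              [0.2 + 0.1j, 0.8 + 0.4j]])   # hypothetical 2x2 FRF matrix

s11, s22 = 1.0, 2.0                         # fixed input autospectra

def trace_gyy(phase: float) -> float:
    # Fully coherent inputs with relative phase `phase`
    g12 = np.sqrt(s11 * s22) * np.exp(1j * phase)
    gxx = np.array([[s11, g12], [np.conj(g12), s22]])
    return np.trace(H @ gxx @ H.conj().T).real

phases = np.linspace(0.0, 2.0 * np.pi, 361)
traces = [trace_gyy(p) for p in phases]
best = phases[int(np.argmax(traces))]
print(f"max trace {max(traces):.3f} at relative phase {best:.2f} rad")
```

For this (invented) FRF matrix the extremizing phase is neither 0 nor pi, echoing the paper's point that extreme responses generally require specific, structure-dependent phase relationships rather than simple in-phase or out-of-phase inputs.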
Deal, Shanley B; Stefanidis, Dimitrios; Brunt, L Michael; Alseidi, Adnan
2017-05-01
We sought to determine the feasibility of developing a multimedia educational tutorial to teach learners to assess the critical view of safety using input from expert surgeons, non-surgeons and crowd-sourcing. We intended to develop a tutorial that would teach learners how to identify the basic anatomy and physiology of the gallbladder, identify the components of the critical view of safety criteria, and understand its significance for performing a safe gallbladder removal. Using rounds of assessment with experts, laypersons and crowd-workers we developed an educational video with improving comprehension after each round of revision. We demonstrate that the development of a multimedia educational tool to educate learners of various backgrounds is feasible using an iterative review process that incorporates the input of experts and crowd sourcing. When planning the development of an educational tutorial, a step-wise approach as described herein should be considered. Copyright © 2017 Elsevier Inc. All rights reserved.
Miller, Derek M; DeMayo, William M; Bourdages, George H; Wittman, Samuel R; Yates, Bill J; McCall, Andrew A
2017-04-01
The integration of inputs from vestibular and proprioceptive sensors within the central nervous system is critical to postural regulation. We recently demonstrated in both decerebrate and conscious cats that labyrinthine and hindlimb inputs converge onto vestibular nucleus neurons. The pontomedullary reticular formation (pmRF) also plays a key role in postural control, and additionally participates in regulating locomotion. Thus, we hypothesized that like vestibular nucleus neurons, pmRF neurons integrate inputs from the limb and labyrinth. To test this hypothesis, we recorded the responses of pmRF neurons to passive ramp-and-hold movements of the hindlimb and to whole-body tilts, in both decerebrate and conscious felines. We found that pmRF neuronal activity was modulated by hindlimb movement in the rostral-caudal plane. Most neurons in both decerebrate (83% of units) and conscious (61% of units) animals encoded both flexion and extension movements of the hindlimb. In addition, hindlimb somatosensory inputs converged with vestibular inputs onto pmRF neurons in both preparations. Pontomedullary reticular formation neurons receiving convergent vestibular and limb inputs likely participate in balance control by governing reticulospinal outflow.
Enhancing Seasonal Water Outlooks: Needs and Opportunities in the Critical Runoff Season
NASA Astrophysics Data System (ADS)
Ray, A. J.; Barsugli, J. J.; Yocum, H.; Stokes, M.; Miskus, D.
2017-12-01
The runoff season is a critical period for the management of water supply in the western U.S., where in many places over 70% of the annual runoff occurs in the snowmelt period. Managing not only the volume, but the intra-seasonal timing of the runoff is important for optimizing storage, as well as achieving other goals such as mitigating flood risk and providing peak flows for riparian habitat management, for example for endangered species. Western river forecast centers produce volume forecasts for western reservoirs that are key input into many water supply decisions, as well as short-term river forecasts out to 10 days. The early volume forecasts each year typically begin in December and are updated throughout the winter and into the runoff season (April-July for many areas, though this varies). This presentation will discuss opportunities for enhancing this existing suite of RFC water outlooks, including the need for, and potential use of, "intraseasonal" products beyond those provided by the Ensemble Streamflow Prediction system and the volume forecasts. While precipitation outlooks have little skill for many areas and seasons, and may not contribute significantly to the outlook, late winter and spring temperature forecasts have meaningful skill in certain areas and at sub-seasonal to seasonal time scales. The current skill of CPC temperature outlooks is an opportunity to translate those products into information about the snowpack and potential runoff timing, even where the skill in precipitation is low. Temperature is important for whether precipitation falls as snow or rain, which is critical for streamflow forecasts, especially in the melt season in snowpack-dependent watersheds. There is a need for better outlooks of the evolution of the snowpack, the conditions influencing the April-July runoff, and the timing of the spring peak and shape of the spring hydrograph.
The presentation will also discuss our work with stakeholders of the River Forecast Centers and the NIDIS Drought Early Warning Systems to clarify stakeholder needs and create a refined decision calendar for upper Colorado River reservoirs that details decisions made during the runoff period.
Morita, Kenji; Tsumoto, Kunichika; Aihara, Kazuyuki
2005-06-01
Recent in vitro experiments revealed that the GABAA reversal potential is about 10 mV higher than the resting potential in mature mammalian neocortical pyramidal cells; thus GABAergic inputs could have facilitatory, rather than inhibitory, effects on action potential generation under certain conditions. However, how the relationship between excitatory input conductances and the output firing rate is modulated by such depolarizing GABAergic inputs under in vivo circumstances has not yet been understood. We examine herewith the input-output relationship in a simple conductance-based model of cortical neurons with the depolarized GABAA reversal potential, and show that a tonic depolarizing GABAergic conductance up to a certain amount does not change the relationship between a tonic glutamatergic driving conductance and the output firing rate, whereas a higher GABAergic conductance prevents spike generation. When the tonic glutamatergic and GABAergic conductances are replaced by in vivo-like highly fluctuating inputs, on the other hand, the effect of depolarizing GABAergic inputs on the input-output relationship critically depends on the degree of coincidence between glutamatergic input events and GABAergic ones. Although a wide range of depolarizing GABAergic inputs hardly changes the firing rate of a neuron driven by noncoincident glutamatergic inputs, a certain range of these inputs considerably decreases the firing rate if a large number of driving glutamatergic inputs are coincident with them. These results raise the possibility that the depolarized GABAA reversal potential is not a paradoxical mystery, but is instead a sophisticated device for discriminative firing rate modulation.
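The shunting effect described above can be illustrated with a minimal leaky conductance-based neuron. This is a sketch only: the parameter values (leak conductance, thresholds, the 10 mV depolarized GABAA reversal) are illustrative assumptions, not the authors' model.

```python
def firing_rate(g_exc, g_gaba, E_gaba=-60.0, T=2.0):
    """Firing rate (Hz) of a leaky conductance-based neuron with tonic
    excitatory and GABAergic conductances; E_gaba sits 10 mV above rest."""
    # Hypothetical parameters, not taken from the paper:
    C, g_leak, E_leak = 1.0, 0.05, -70.0      # capacitance, leak conductance, rest (mV)
    E_exc, V_th, V_reset = 0.0, -50.0, -65.0  # excitatory reversal, threshold, reset (mV)
    dt = 0.1                                  # integration step, ms
    V, spikes = E_leak, 0
    for _ in range(int(T * 1000.0 / dt)):
        dV = (-g_leak * (V - E_leak) - g_exc * (V - E_exc)
              - g_gaba * (V - E_gaba)) / C
        V += dt * dV
        if V >= V_th:                         # threshold crossing: count spike, reset
            spikes += 1
            V = V_reset
    return spikes / T
```

With these toy values, a moderate depolarizing GABAergic conductance leaves the driven firing rate positive, while a large one shunts the membrane below threshold and silences the cell, matching the tonic-input result summarized above.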
Inputs and spatial distribution patterns of Cr in Jiaozhou Bay
NASA Astrophysics Data System (ADS)
Yang, Dongfang; Miao, Zhenqing; Huang, Xinmin; Wei, Linzhen; Feng, Ming
2018-03-01
Cr pollution in marine bays is a critical environmental issue, and understanding its input and spatial distribution patterns is essential to pollution control. According to the source strengths of the major pollution sources, the input patterns of pollutants to a marine bay can be classified as slight, moderate, or heavy, with the corresponding spatial distributions described by three block models. This paper analyzed the input patterns and distributions of Cr in Jiaozhou Bay, eastern China, based on surveys of Cr in surface waters during 1979-1983. Results showed that the inputs of Cr to Jiaozhou Bay fell into two classes, moderate (32.32-112.30 μg L-1) and slight (4.17-19.76 μg L-1), whose horizontal distributions could be described by Block Model 2 and Block Model 3, respectively. Under the moderate input pattern via overland runoff, Cr contents decreased from the estuaries to the bay mouth, and the distribution pattern was parallel. Under the moderate input pattern via marine currents, Cr contents decreased from the bay mouth into the bay, and the distribution pattern was parallel to circular. The block models reveal the transfer processes of pollutants and help explain their distributions in marine bays.
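The classification reported above maps directly to a lookup. A minimal sketch using only the concentration ranges and pattern-to-model pairing stated in the abstract (concentrations outside both ranges are left unclassified):

```python
def cr_input_pattern(strength_ug_per_L):
    """Classify a Cr input strength (ug/L) using the ranges reported for
    Jiaozhou Bay, and return the block model describing its distribution."""
    if 32.32 <= strength_ug_per_L <= 112.30:
        return ("moderate", "Block Model 2")
    if 4.17 <= strength_ug_per_L <= 19.76:
        return ("slight", "Block Model 3")
    return ("unclassified", None)
```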
NASA Astrophysics Data System (ADS)
Bielski, Conrad; Lemoine, Guido; Syryczynski, Jacek
2009-09-01
High Performance Computing (HPC) hardware solutions such as grid computing and General Processing on a Graphics Processing Unit (GPGPU) are now accessible to users with general computing needs. Grid computing infrastructures in the form of computing clusters or blades are becoming common place and GPGPU solutions that leverage the processing power of the video card are quickly being integrated into personal workstations. Our interest in these HPC technologies stems from the need to produce near real-time maps from a combination of pre- and post-event satellite imagery in support of post-disaster management. Faster processing provides a twofold gain in this situation: 1. critical information can be provided faster and 2. more elaborate automated processing can be performed prior to providing the critical information. In our particular case, we test the use of the PANTEX index which is based on analysis of image textural measures extracted using anisotropic, rotation-invariant GLCM statistics. The use of this index, applied in a moving window, has been shown to successfully identify built-up areas in remotely sensed imagery. Built-up index image masks are important input to the structuring of damage assessment interpretation because they help optimise the workload. The performance of computing the PANTEX workflow is compared on two different HPC hardware architectures: (1) a blade server with 4 blades, each having dual quad-core CPUs and (2) a CUDA enabled GPU workstation. The reference platform is a dual CPU-quad core workstation and the PANTEX workflow total computing time is measured. Furthermore, as part of a qualitative evaluation, the differences in setting up and configuring various hardware solutions and the related software coding effort is presented.
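The PANTEX workflow above rests on GLCM texture statistics computed in a moving window. A minimal sketch of one such statistic, GLCM contrast for a single window at a horizontal pixel offset of 1 (the quantization depth and offset are illustrative; the actual index combines several anisotropic, rotation-invariant measures):

```python
import numpy as np

def glcm_contrast(window, levels=8):
    """Contrast of a gray-level co-occurrence matrix for one window
    (pixel values in [0, 1), horizontal offset of 1)."""
    q = np.clip((window * levels).astype(int), 0, levels - 1)  # quantize gray levels
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):      # horizontal neighbor pairs
        glcm[a, b] += 1
    glcm /= glcm.sum()                                         # normalize to probabilities
    i, j = np.indices(glcm.shape)
    return float(((i - j) ** 2 * glcm).sum())                  # contrast statistic
```

Applied in a sliding window over an image, high-contrast responses flag the fine-grained texture typical of built-up areas; this per-window independence is also what makes the workflow a good fit for GPU parallelization.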
Sienko, K H; Whitney, S L; Carender, W J; Wall, C
2017-01-01
This narrative review highlights findings from the sensory augmentation field for people with vestibular deficits and addresses the outstanding questions that are critical to the translation of this technology into clinical and/or personal use. Prior research has demonstrated that the real-time use of visual, vibrotactile, auditory, and multimodal sensory augmentation technologies can improve balance during static and dynamic stance tasks within a laboratory setting. However, its application in improving gait requires additional investigation, as does its efficacy as a rehabilitation device for people with vestibular deficits. In some locomotor studies involving sensory augmentation, gait velocity decreased and secondary task performance worsened, and subjects negatively altered their segmental control strategies when cues were provided following short training sessions. A further question is whether the retention and/or carry-over effects of training with a sensory augmentation technology exceed the retention and/or carry-over effects of training alone, thereby supporting its use as a rehabilitation device. Preliminary results suggest that there are short-term improvements in balance performance following a small number of training sessions with a sensory augmentation device. Long-term clinical and home-based controlled training studies are needed. It is hypothesized that sensory augmentation provides people with vestibular deficits with additional sensory input to promote central compensation during a specific exercise/activity; however, research is needed to substantiate this theory. Major obstacles standing in the way of its use for these critical applications include determining exercise/activity specific feedback parameters and dosage strategies. 
This paper summarizes the reported findings that support sensory augmentation as a balance aid and rehabilitation device, but does not critically examine efficacy or the quality of the research methods used in the reviewed studies.
Critical Transition in Critical Zone of Intensively Managed Landscapes
NASA Astrophysics Data System (ADS)
Kumar, P.
2017-12-01
Intensification of industrial agriculture has resulted in severe unintended global impacts, including degradation of arable land and eutrophication of receiving water bodies. Modern agricultural practices rely on significant direct and indirect human energy inputs, which have created imbalances between increased rates of biogeochemical processes related to production and background rates of natural processes. These imbalances have cascaded through the deep inter-dependencies between carbon, soil, water, nutrient and ecological processes, resulting in a critical transition of the Critical Zone and creating emergent dynamics and evolutionary trajectories. Understanding this novel organization and function of the Critical Zone is vital for developing sustainable agricultural practices.
Listen, Listen, Listen and Listen: Building a Comprehension Corpus and Making It Comprehensible
ERIC Educational Resources Information Center
Mordaunt, Owen G.; Olson, Daniel W.
2010-01-01
Listening comprehension input is necessary for language learning and acculturation. One approach to developing listening comprehension skills is through exposure to massive amounts of naturally occurring spoken language input. But exposure to this input is not enough; learners also need to make the comprehension corpus meaningful to their learning…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woods, Jason; Winkler, Jon
Moisture buffering of building materials has a significant impact on the building's indoor humidity, and building energy simulations need to model this buffering to accurately predict the humidity. Researchers requiring a simple moisture-buffering approach typically rely on the effective-capacitance model, which has been shown to be a poor predictor of actual indoor humidity. This paper describes an alternative two-layer effective moisture penetration depth (EMPD) model and its inputs. While this model has been used previously, there is a need to understand the sensitivity of this model to uncertain inputs. In this paper, we use the moisture-adsorbent materials exposed to the interior air: drywall, wood, and carpet. We use a global sensitivity analysis to determine which inputs are most influential and how the model's prediction capability degrades due to uncertainty in these inputs. We then compare the model's humidity prediction with measured data from five houses, which shows that this model, and a set of simple inputs, can give reasonable prediction of the indoor humidity.
2018-01-31
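The global sensitivity analysis mentioned above can be sketched with a crude sampling-and-correlation screen. This is not the paper's method in detail, only the general idea: sample each uncertain input uniformly within bounds and rank inputs by how strongly they correlate with the model output (the toy model and bounds in the test are assumptions).

```python
import numpy as np

def rank_inputs(model, bounds, n=2000, seed=0):
    """Crude global sensitivity screen: sample each input uniformly within its
    bounds and rank inputs by |correlation| with the model output."""
    rng = np.random.default_rng(seed)
    names = list(bounds)
    X = np.column_stack([rng.uniform(*bounds[k], n) for k in names])
    y = np.array([model(dict(zip(names, row))) for row in X])
    corr = [abs(np.corrcoef(X[:, i], y)[0, 1]) for i in range(len(names))]
    return sorted(zip(names, corr), key=lambda t: -t[1])     # most influential first
```

A variance-based method (e.g. Sobol indices) would be the more rigorous choice; the correlation screen suffices to show which inputs dominate a near-linear response.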
Modeling Nearshore Waves for Hurricane Katrina
2007-08-01
sensitivity of the STWAVE results to critical input, three sets of sensitivity runs were made: wind input, degradation of the Chandeleurs Islands, and...is approximately 0.2 to 0.3 m. There are larger differences outside the Chandeleurs (increase of 0.6 – 0.9 m for the plus 5 percent winds and 0.5...possible exception to this is wave attenuation across the barrier islands, which protect the areas in their shadow. The Chandeleur Islands
Validation of smoke plume rise models using ground based lidar
Cyle E. Wold; Shawn Urbanski; Vladimir Kovalev; Alexander Petkov; Wei Min Hao
2010-01-01
Biomass fires can significantly degrade regional air quality. Plume rise height is one of the critical factors determining the impact of fire emissions on air quality. Plume rise models are used to prescribe the vertical distribution of fire emissions which are critical input for smoke dispersion and air quality models. The poor state of model evaluation is due in...
Modares, Hamidreza; Lewis, Frank L; Naghibi-Sistani, Mohammad-Bagher
2013-10-01
This paper presents an online policy iteration (PI) algorithm to learn the continuous-time optimal control solution for unknown constrained-input systems. The proposed PI algorithm is implemented on an actor-critic structure where two neural networks (NNs) are tuned online and simultaneously to generate the optimal bounded control policy. The requirement of complete knowledge of the system dynamics is obviated by employing a novel NN identifier in conjunction with the actor and critic NNs. It is shown how the identifier weights estimation error affects the convergence of the critic NN. A novel learning rule is developed to guarantee that the identifier weights converge to small neighborhoods of their ideal values exponentially fast. To provide an easy-to-check persistence of excitation condition, the experience replay technique is used. That is, recorded past experiences are used simultaneously with current data for the adaptation of the identifier weights. Stability of the whole system consisting of the actor, critic, system state, and system identifier is guaranteed while all three networks undergo adaptation. Convergence to a near-optimal control law is also shown. The effectiveness of the proposed method is illustrated with a simulation example.
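The experience-replay idea above, recorded past samples reused alongside current data for identifier adaptation, can be sketched for a linear-in-parameters identifier. The paper's identifier is a neural network with a specific learning rule; this linear version and its learning rate are illustrative assumptions only.

```python
import numpy as np

def replay_update(W, buffer, current, lr=0.05):
    """One adaptation sweep for linear-in-parameters identifier weights W:
    replayed past samples are used together with the current one."""
    for phi, target in buffer + [current]:    # phi: regressor, target: measured output
        err = W @ phi - target
        W = W - lr * err * phi                # gradient step on the squared error
    return W
```

Reusing the stored regressors is what relaxes the persistence-of-excitation requirement: even if the current signal is momentarily unexciting, the replayed data keep the regression well conditioned.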
NASA Astrophysics Data System (ADS)
Garcia-Fernandez, Mariano; Assatourians, Karen; Jimenez, Maria-Jose
2018-01-01
Extreme natural hazard events have the potential to cause significant disruption to critical infrastructure (CI) networks. Among them, earthquakes represent a major threat as sudden-onset events with limited, if any, capability of forecast, and high damage potential. In recent years, the increased exposure of interdependent systems has heightened concern, motivating the need for a framework for the management of these increased hazards. The seismic performance level and resilience of existing non-nuclear CIs can be analyzed by identifying the ground motion input values leading to failure of selected key elements. Main interest focuses on the ground motions exceeding the original design values, which should correspond to low probability occurrence. A seismic hazard methodology has been specifically developed to consider low-probability ground motions affecting elongated CI networks. The approach is based on Monte Carlo simulation, which allows for building long-duration synthetic earthquake catalogs to derive low-probability amplitudes. This approach does not affect the mean hazard values and allows obtaining a representation of maximum amplitudes that follow a general extreme-value distribution. This facilitates the analysis of the occurrence of extremes, i.e., very low probability of exceedance from unlikely combinations, for the development of, e.g., stress tests, among other applications. Following this methodology, extreme ground-motion scenarios have been developed for selected combinations of modeling inputs including seismic activity models (source model and magnitude-recurrence relationship), ground motion prediction equations (GMPE), hazard levels, and fractiles of extreme ground motion. The different results provide an overview of the effects of different hazard modeling inputs on the generated extreme motion hazard scenarios. 
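The Monte Carlo approach above, long-duration synthetic catalogs yielding low-probability annual-maximum amplitudes, can be sketched end to end. All numeric coefficients here (event rate, the toy GMPE, the distance range) are invented placeholders, not the project's modeling inputs.

```python
import numpy as np

def extreme_pga(years=5000, rate=5.0, b=1.0, m_min=4.0, m_max=8.0, q=0.999, seed=1):
    """Synthetic catalog: Poisson event counts, truncated Gutenberg-Richter
    magnitudes, a toy GMPE, and the q-quantile of annual-maximum PGA."""
    rng = np.random.default_rng(seed)
    beta = b * np.log(10)
    annual_max = np.empty(years)
    for y in range(years):
        n = rng.poisson(rate)
        if n == 0:
            annual_max[y] = 0.0
            continue
        u = rng.uniform(size=n)
        # inverse CDF of the truncated Gutenberg-Richter magnitude distribution
        m = m_min - np.log(1 - u * (1 - np.exp(-beta * (m_max - m_min)))) / beta
        r = rng.uniform(10.0, 100.0, size=n)          # epicentral distance, km
        ln_pga = -3.5 + 1.0 * m - 1.2 * np.log(r) + rng.normal(0.0, 0.5, size=n)
        annual_max[y] = np.exp(ln_pga).max()
    return float(np.quantile(annual_max, q))
```

The empirical distribution of `annual_max` is what approaches a general extreme-value form as the catalog grows; mean-hazard estimates from the same catalog are unaffected by pulling its far tail.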
This approach to seismic hazard is at the core of the risk analysis procedure developed and applied to European CI transport networks within the framework of the European-funded INFRARISK project. Such an operational seismic hazard framework can be used to provide insight in a timely manner to make informed risk management or regulating further decisions on the required level of detail or on the adoption of measures, the cost of which can be balanced against the benefits of the measures in question.
Reconfigurable Fault Tolerance for FPGAs
NASA Technical Reports Server (NTRS)
Shuler, Robert, Jr.
2010-01-01
The invention allows a field-programmable gate array (FPGA) or similar device to be efficiently reconfigured in whole or in part to provide higher capacity, non-redundant operation. The redundant device consists of functional units such as adders or multipliers, configuration memory for the functional units, a programmable routing method, configuration memory for the routing method, and various other features such as block RAM, I/O (random access memory, input/output) capability, dedicated carry logic, etc. The redundant device has three identical sets of functional units and routing resources and majority voters that correct errors. The configuration memory may or may not be redundant, depending on need. For example, SRAM-based FPGAs will need some type of radiation-tolerant configuration memory, or they will need triple-redundant configuration memory. Flash or anti-fuse devices will generally not need redundant configuration memory. Some means of loading and verifying the configuration memory is also required. These are all components of the pre-existing redundant FPGA. This innovation modifies the voter to accept a MODE input, which specifies whether ordinary voting is to occur, or if redundancy is to be split. Generally, additional routing resources will also be required to pass data between sections of the device created by splitting the redundancy. In redundancy mode, the voters produce an output corresponding to the two inputs that agree, in the usual fashion. In the split mode, the voters select just one input and convey this to the output, ignoring the other inputs. In a dual-redundant system (as opposed to triple-redundant), instead of a voter, there is some means to latch or gate a state update only when both inputs agree. In this case, the invention would require modification of the latch or gate so that it would operate normally in redundant mode, and would separately latch or gate the inputs in non-redundant mode.
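The voter behavior described above can be modeled in a few lines. A sketch of the logic only (the real device implements this per routing resource in hardware; the choice of which input the split mode forwards is an assumption here):

```python
def tmr_voter(a, b, c, mode="vote"):
    """Modified TMR voter with a MODE input: 'vote' performs bitwise 2-of-3
    majority correction; 'split' forwards one input, ignoring the others."""
    if mode == "split":
        return a                              # non-redundant: pass-through
    return (a & b) | (a & c) | (b & c)        # bitwise majority per bit
```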
Hospital influenza pandemic stockpiling needs: A computer simulation.
Abramovich, Mark N; Hershey, John C; Callies, Byron; Adalja, Amesh A; Tosh, Pritish K; Toner, Eric S
2017-03-01
A severe influenza pandemic could overwhelm hospitals, but planning guidance that accounts for the dynamic interrelationships between planning elements is lacking. We developed a methodology to calculate pandemic supply needs based on operational considerations in hospitals and then tested the methodology at Mayo Clinic in Rochester, MN. We upgraded a previously designed computer modeling tool and input carefully researched resource data from the hospital to run 10,000 Monte Carlo simulations using various combinations of variables to determine resource needs across a spectrum of scenarios. Of 10,000 iterations, 1,315 fell within the parameters defined by our simulation design and logical constraints. From these valid iterations, we projected supply requirements by percentile for key supplies, pharmaceuticals, and personal protective equipment needed in a severe pandemic. We projected supply needs for a range of scenarios that use up to 100% of Mayo Clinic-Rochester's surge capacity of beds and ventilators. The results indicate that there are diminishing patient care benefits for stockpiling on the high side of the range, but that having some stockpile of critical resources, even if it is relatively modest, is most important. We were able to display the probabilities of needing various supply levels across a spectrum of scenarios. The tool could be used to model many other hospital preparedness issues, but validation in other settings is needed. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
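The percentile-based projection above can be sketched with a toy Monte Carlo: random surge size times random per-patient usage yields a distribution of required units, summarized by percentiles. The distributions and their parameters here are hypothetical stand-ins, not the study's carefully researched hospital data.

```python
import numpy as np

def stockpile_percentiles(n_iter=10000, seed=0):
    """Toy Monte Carlo of pandemic supply needs, reported as percentiles
    of total units required across simulated scenarios."""
    rng = np.random.default_rng(seed)
    patients = rng.lognormal(mean=6.0, sigma=0.5, size=n_iter)  # hypothetical surge census
    use_per_patient = rng.uniform(2.0, 6.0, size=n_iter)        # hypothetical units/patient
    need = patients * use_per_patient
    return {p: float(np.percentile(need, p)) for p in (50, 90, 99)}
```

The gap between the 90th and 99th percentiles is what drives the study's "diminishing benefits" observation: covering the far tail costs much more stock for progressively rarer scenarios.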
NASA Astrophysics Data System (ADS)
Havens, Scott; Marks, Danny; Kormos, Patrick; Hedrick, Andrew
2017-12-01
In the Western US and many mountainous regions of the world, critical water resources and climate conditions are difficult to monitor because the observation network is generally very sparse. The critical resource from the mountain snowpack is water flowing into streams and reservoirs that will provide for irrigation, flood control, power generation, and ecosystem services. Water supply forecasting in a rapidly changing climate has become increasingly difficult because of non-stationary conditions. In response, operational water supply managers have begun to move from statistical techniques towards the use of physically based models. As we begin to transition physically based models from research to operational use, we must address the most difficult and time-consuming aspect of model initiation: the need for robust methods to develop and distribute the input forcing data. In this paper, we present a new open source framework, the Spatial Modeling for Resources Framework (SMRF), which automates and simplifies the common forcing data distribution methods. It is computationally efficient and can be implemented for both research and operational applications. We present an example of how SMRF is able to generate all of the forcing data required to run a physically based snow model at 50-100 m resolution over regions of 1000-7000 km2. The approach has been successfully applied in real time and historical applications for both the Boise River Basin in Idaho, USA and the Tuolumne River Basin in California, USA. These applications use meteorological station measurements and numerical weather prediction model outputs as input. SMRF has significantly streamlined the modeling workflow, decreased model setup time from weeks to days, and made near real-time application of a physically based snow model possible.
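A common building block of the forcing-data distribution described above is interpolating sparse station measurements onto the model grid. A minimal inverse-distance-weighting sketch (one of several distribution methods such frameworks offer; the power parameter is a conventional default, not SMRF's specific configuration):

```python
import numpy as np

def idw(stations, values, grid_xy, power=2.0):
    """Inverse-distance-weighted interpolation of station measurements
    (values at stations, shape (S, 2)) onto grid points (shape (G, 2))."""
    stations, values, grid_xy = map(np.asarray, (stations, values, grid_xy))
    d = np.linalg.norm(grid_xy[:, None, :] - stations[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)                    # avoid division by zero at stations
    w = d ** -power
    return (w * values).sum(axis=1) / w.sum(axis=1)
```

Operational distribution adds physical adjustments on top of this (e.g. elevation lapse rates for temperature), which is where a framework saves the weeks of setup time mentioned above.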
NASA Astrophysics Data System (ADS)
Buisset, Christophe; Prasit, Apirat; Lépine, Thierry; Poshyajinda, Saran
2015-09-01
The first astronomical images obtained at the 2.4 m Thai National Telescope (TNT) during observations in bright moon conditions were contaminated by high levels of light scattered by the telescope structure. We identified that the origins of this scattered light were the M3 folding mirror baffle and the tube placed inside the fork between the M3 and the M4 mirrors. We thus decided to design and install a new baffle. In a first step, we calculated the optical and mechanical inputs needed to define the baffle optical design. These inputs were: the maximum length of the baffle, the maximum dimensions of the vanes and the incident beam diameter between M3 and M4 mirrors. In a second step, we defined the number, the position and the diameter of the vanes to remove the critical objects from the detector's FOV by using a targeted method. Then, we verified that the critical objects were moved away from the detector's view. In a third step, we designed and manufactured the baffle. The mechanical design is made of 21 sections (1 section for each vane) and comprises an innovative mechanism for the adjustment of the baffle position. The baffle installation and adjustment is performed in less than 20 minutes by 2 operators. In a fourth step, we installed and characterized the baffle by using a pinhole camera. We quantified the performance improvement and we identified the baffle areas at the origin of the residual stray light signal. Finally, we performed targeted on-sky observations to test the baffle in real conditions.
Avoiding Praetorian Societies: Focusing U.S. Strategy on Political Development
2014-03-01
the centrality of political development, understand the critical role of input institutions in political stability, and make efforts to foster these institutions in stability and reconstruction operations.
Biodegradable Plastic Mulch Films: Impacts on Soil Microbial Communities and Ecosystem Functions
Bandopadhyay, Sreejata; Martin-Closas, Lluis; Pelacho, Ana M.; DeBruyn, Jennifer M.
2018-01-01
Agricultural plastic mulch films are widely used in specialty crop production systems because of their agronomic benefits. Biodegradable plastic mulches (BDMs) offer an environmentally sustainable alternative to conventional polyethylene (PE) mulch. Unlike PE films, which need to be removed after use, BDMs are tilled into soil where they are expected to biodegrade. However, there remains considerable uncertainty about long-term impacts of BDM incorporation on soil ecosystems. BDMs potentially influence soil microbial communities in two ways: first, as a surface barrier prior to soil incorporation, indirectly affecting soil microclimate and atmosphere (similar to PE films) and second, after soil incorporation, as a direct input of physical fragments, which add carbon, microorganisms, additives, and adherent chemicals. This review summarizes the current literature on impacts of plastic mulches on soil biological and biogeochemical processes, with a special emphasis on BDMs. The combined findings indicated that when used as a surface barrier, plastic mulches altered soil microbial community composition and functioning via microclimate modification, though the nature of these alterations varied between studies. In addition, BDM incorporation into soil can result in enhanced microbial activity and enrichment of fungal taxa. This suggests that despite the fact that total carbon input from BDMs is minuscule, a stimulatory effect on microbial activity may ultimately affect soil organic matter dynamics. To address the current knowledge gaps, long term studies and a better understanding of impacts of BDMs on nutrient biogeochemistry are needed. These are critical to evaluating BDMs as they relate to soil health and agroecosystem sustainability. PMID:29755440
Rama, Jennifer A.; Campbell, Judith R.; Balmer, Dorene F.; Turner, Teri L.; Hsu, Deborah C.
2015-01-01
Background: The experience of transitioning to an academic faculty position can be improved with standardized educational interventions. Although a number of such interventions have been described, few utilize an evaluation framework, describe a robust evaluation process, and address why their interventions were successful. In this article, the authors apply a logic model to describe their efforts to develop, implement, evaluate, and revise a comprehensive academic career development curriculum among pediatric subspecialty fellows. They describe inputs, activities, outputs, and outcomes using quantitative data from fellow evaluations and qualitative data from faculty interviews. Methods: Methods are described under the input and activities sections. The curriculum started with collaboration among educational leadership and conducting a needs assessment. Using the needs assessment results and targeted learning objectives, we piloted the curriculum and then implemented the full curriculum 1 year later. Results: Results are described under the outputs and outcomes sections. We present immediate, short-term, and 6-month evaluation data. Cumulative data over 3 years reveal that fellows consistently acquired knowledge relevant to transitioning and that they applied acquired knowledge to prepare for finding jobs and career advancement. The curriculum also benefits faculty instructors who gain a sense of reward by filling a critical knowledge gap and fostering fellows’ professional growth. Conclusion: The authors relate the success and effectiveness of the curriculum to principles of adult learning, and share lessons learned, including the importance of buy-in from junior and senior fellows and faculty, collaboration, and designating the time to teach and learn. PMID:25861876
An open, object-based modeling approach for simulating subsurface heterogeneity
NASA Astrophysics Data System (ADS)
Bennett, J.; Ross, M.; Haslauer, C. P.; Cirpka, O. A.
2017-12-01
Characterization of subsurface heterogeneity with respect to hydraulic and geochemical properties is critical in hydrogeology as their spatial distribution controls groundwater flow and solute transport. Many approaches of characterizing subsurface heterogeneity do not account for well-established geological concepts about the deposition of the aquifer materials; those that do (i.e. process-based methods) often require forcing parameters that are difficult to derive from site observations. We have developed a new method for simulating subsurface heterogeneity that honors concepts of sequence stratigraphy, resolves fine-scale heterogeneity and anisotropy of distributed parameters, and resembles observed sedimentary deposits. The method implements a multi-scale hierarchical facies modeling framework based on architectural element analysis, with larger features composed of smaller sub-units. The Hydrogeological Virtual Reality simulator (HYVR) simulates distributed parameter models using an object-based approach. Input parameters are derived from observations of stratigraphic morphology in sequence type-sections. Simulation outputs can be used for generic simulations of groundwater flow and solute transport, and for the generation of three-dimensional training images needed in applications of multiple-point geostatistics. The HYVR algorithm is flexible and easy to customize. The algorithm was written in the open-source programming language Python, and is intended to form a code base for hydrogeological researchers, as well as a platform that can be further developed to suit investigators' individual needs. This presentation will encompass the conceptual background and computational methods of the HYVR algorithm, the derivation of input parameters from site characterization, and the results of groundwater flow and solute transport simulations in different depositional settings.
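The object-based idea above, stamping geologically motivated geometric bodies into a background grid, can be sketched in two dimensions. The elliptical "trough" shape, its aspect ratio, and the overprinting rule are illustrative assumptions; HYVR's hierarchical architectural elements are far richer.

```python
import numpy as np

def place_troughs(shape, n_objects, radius, seed=0):
    """Object-based sketch: stamp elliptical 'trough' facies (code 1) into a
    2-D background grid (code 0), later objects overprinting earlier ones."""
    rng = np.random.default_rng(seed)
    grid = np.zeros(shape, dtype=int)
    yy, xx = np.indices(shape)
    for _ in range(n_objects):
        cy, cx = rng.integers(0, shape[0]), rng.integers(0, shape[1])
        # elongated ellipse: semi-axes radius (y) and 2*radius (x)
        mask = ((yy - cy) / radius) ** 2 + ((xx - cx) / (2 * radius)) ** 2 <= 1.0
        grid[mask] = 1
    return grid
```

Assigning hydraulic properties per facies code in such a grid yields parameter fields usable directly as flow-model input or as training images for multiple-point geostatistics.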
Computing Functions by Approximating the Input
ERIC Educational Resources Information Center
Goldberg, Mayer
2012-01-01
In computing real-valued functions, it is ordinarily assumed that the input to the function is known, and it is the output that we need to approximate. In this work, we take the opposite approach: we show how to compute the values of some transcendental functions by approximating the input to these functions, and obtaining exact answers for their…
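The inverted approach above can be made concrete: to compute ln(y), bisect on the *input* of exp until exp(x) brackets y. This sketch uses bisection as one simple realization of the idea; the article's own constructions may differ.

```python
from math import exp

def log_by_input_approx(y, tol=1e-12):
    """Compute ln(y) by approximating the input of exp: bisect for the x
    with exp(x) = y, valid for y within exp's floating-point range."""
    lo, hi = -700.0, 700.0            # exp() stays finite on this interval
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if exp(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

Only the forward function exp is ever evaluated; the transcendental value we want is recovered as the input that makes the forward function hit the target.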
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-01
... frequency of use and frequency of condition) as well as expert input, a better approach for mass outreach... process, including transparency, stakeholder input, and leadership; and Expert involvement to inform and... BPCA Web site, http://bpca.nichd.nih.gov. As a final step in the process, the NICHD, with input from...
[Mapping Critical Loads of Heavy Metals for Soil Based on Different Environmental Effects].
Shi, Ya-xing; Wu, Shao-hua; Zhou, Sheng-lu; Wang, Chun-hui; Chen, Hao
2015-12-01
China's rapid industrialization and urbanization have caused a growing problem of heavy metal pollution in soil, threatening the environment and human health. Prevention and management of heavy metal pollution have therefore become particularly important. Critical loads of heavy metals are an important management tool that can be utilized to prevent the occurrence of heavy metal pollution. Our study was based on three cases: status balance, water environmental effects, and health risks. We used the steady-state mass balance equation to calculate the critical loads of Cd, Cu, Pb, and Zn at different effect levels and analyzed the values and spatial variation of the critical loads. In addition, we used the annual input fluxes of heavy metals of the agro-ecosystem in the Yangtze River delta and China to estimate the proportion of area with exceedance of critical loads. The results demonstrated that the critical load value of Cd was the minimum, while the values for Cu and Zn were larger. There were spatial differences among the critical loads of the four elements in the study area: areas with lower critical loads mainly occurred in woodland, high-value areas were distributed in the east and southwest of the study area, and medium and medium-high values mainly occurred in farmland. Comparing the input fluxes of heavy metals, we found that Pb and Zn inputs exceeded the critical loads in more than 90% of the study area under the different environmental effects. Exceedance of the Cd critical load mainly occurred under the status balance and the water environmental effect, while Cu exceedance occurred under the status balance and water environmental effect with a higher proportion of exceeded areas. Critical loads of heavy metals at different effect levels in this study could serve as a reference for effective control of the emissions of heavy metals and prevention of heavy metal pollution.
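The steady-state mass balance underlying the critical loads above can be sketched in its simplest form: the tolerable input rate equals the removal rate when the soil solution sits at its critical concentration. This two-term version (leaching plus net uptake) is illustrative; the study's equation for each environmental effect includes additional terms.

```python
def critical_load(crit_conc_mg_per_L, drainage_L_per_m2_yr, uptake_mg_per_m2_yr):
    """Simplified steady-state mass balance: critical load (mg m^-2 yr^-1) is
    leaching at the critical concentration plus net biomass uptake/removal."""
    return crit_conc_mg_per_L * drainage_L_per_m2_yr + uptake_mg_per_m2_yr
```

Comparing an observed annual input flux against this value gives the exceedance test used to map the proportions reported above.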
A statistical survey of heat input parameters into the cusp thermosphere
NASA Astrophysics Data System (ADS)
Moen, J. I.; Skjaeveland, A.; Carlson, H. C.
2017-12-01
Based on three winters of observational data, we present the ionosphere parameters deemed most critical to realistic space weather ionosphere and thermosphere representation and prediction in regions impacted by variability in the cusp. The CHAMP spacecraft revealed large variability in cusp thermosphere densities, measuring frequent satellite drag enhancements of up to a factor of two. The community recognizes a clear need for more realistic representation of plasma flows and electron densities near the cusp: existing average-value models produce order-of-magnitude errors in these parameters, resulting in large underestimations of predicted drag. We fill this knowledge gap with a statistics-based specification of these key parameters over their range of observed values. The EISCAT Svalbard Radar (ESR) tracks plasma flow Vi, electron density Ne, and electron and ion temperatures Te and Ti, with consecutive 2-3 minute windshield-wiper scans of 1000 × 500 km areas. This allows mapping the maximum Ti of a large area within or near the cusp with high temporal resolution. In magnetic field-aligned mode the radar can measure high-resolution profiles of these plasma parameters. By deriving statistics for Ne and Ti, we enable derivation of thermosphere heating deposition under background and frictional-drag-dominated magnetic reconnection conditions. We separate our Ne and Ti profiles into quiescent and enhanced states, which are not closely correlated due to the spatial structure of the reconnection foot point. Use of our data-based parameter inputs can make order-of-magnitude corrections to the input data driving thermosphere models, enabling removal of the previous twofold drag errors.
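The quiescent/enhanced state separation described above can be illustrated with a simple threshold split of ion-temperature samples. The threshold value and the sample data are assumptions for illustration only; the survey derives its state boundaries from the observed distributions.

```python
import statistics

def split_ti_states(ti_samples_k, threshold_k=2000.0):
    """Partition ion-temperature samples (kelvin) into quiescent and enhanced
    states and summarize each, mirroring a statistics-based specification of
    a parameter over its observed range. Threshold is a hypothetical value."""
    quiescent = [t for t in ti_samples_k if t < threshold_k]
    enhanced = [t for t in ti_samples_k if t >= threshold_k]
    return {
        "quiescent_mean": statistics.mean(quiescent) if quiescent else float("nan"),
        "enhanced_mean": statistics.mean(enhanced) if enhanced else float("nan"),
        "enhanced_fraction": len(enhanced) / len(ti_samples_k),
    }

summary = split_ti_states([900.0, 1100.0, 1000.0, 2600.0, 3400.0])
```

Per-state means and occurrence fractions of this kind are the sort of inputs that could replace single average values in thermosphere model driving.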
Multi-party quantum summation without a trusted third party based on single particles
NASA Astrophysics Data System (ADS)
Zhang, Cai; Situ, Haozhen; Huang, Qiong; Yang, Pingle
We propose multi-party quantum summation protocols based on single particles, in which participants are allowed to compute the summation of their inputs without the help of a trusted third party while preserving the privacy of their inputs. Only one participant, who generates the source particles, needs to perform unitary operations, and only single particles are needed at the beginning of the protocols.
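The privacy structure of such protocols has a simple classical analogue: the initiating party masks a running total with a secret random value, each party adds its private input modulo N as the token passes through, and the initiator removes the mask at the end. This sketch only mirrors that masking structure; the paper's protocols encode the running total in single quantum particles, which this classical toy does not capture.

```python
import random

def multiparty_sum(inputs, modulus=10**9):
    """Trusted-third-party-free summation via additive masking (classical analogue).
    Each party's input stays hidden from its neighbors because every intermediate
    value it sees is offset by the initiator's secret random mask."""
    r = random.randrange(modulus)        # initiator's secret mask
    running = r
    for x in inputs:                     # token visits each party in turn
        running = (running + x) % modulus
    return (running - r) % modulus       # initiator unmasks the final sum

assert multiparty_sum([3, 5, 9]) == 17
```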
NASA Technical Reports Server (NTRS)
Shontz, W. D.; Records, R. M.; Antonelli, D. R.
1992-01-01
The focus of this project is on alerting pilots to impending events in such a way as to provide the additional time required for the crew to make critical decisions concerning non-normal operations. The project addresses pilots' need for support in diagnosis and trend monitoring of faults as they affect decisions that must be made within the context of the current flight. Monitoring and diagnostic modules developed under the NASA Faultfinder program were restructured and enhanced using input data from an engine model and real engine fault data. Fault scenarios were prepared to support knowledge base development activities on the MONITAUR and DRAPhyS modules of Faultfinder. An analysis of the information requirements for fault management was included in each scenario. A conceptual framework was developed for systematic evaluation of the impact of context variables on pilot action alternatives as a function of event/fault combinations.
Personalized medicine: the absence of 'model-changing' financial incentives.
Keeling, Peter
2007-02-01
This perspective leans toward the view that personalized medicine can contribute to a more efficient collective model; however, the hard economics need and deserve significantly more critical analysis and new data input than they currently receive in order to determine their role, if any, in driving change. Put simply, as with the birth of all new and promising developments in healthcare, myth, hope, and trend-spotting are driving this market forward, rather than any hard evidence of a sustainable commercial business model for all stakeholders. While there are clear economic benefits to aspects of delivery along the way to personalized care, there may in fact be no compelling economic drivers of radical change for payers and the pharmaceutical industry. The best they can hope to achieve is that the balance sheet is, just that, in balance.
Requirements for long-life mechanical cryocoolers for space applications
NASA Technical Reports Server (NTRS)
Ross, R. G., Jr.
1990-01-01
The growing demand for long wavelength infrared and submillimeter imaging instruments for space observational applications, together with the emergence of the multiyear life Oxford University Stirling cycle cooler, has led to a rapidly expanding near term commitment to mechanical cryocoolers throughout the subkelvin to 150 K temperature range for long-life space missions. To satisfy this growing commitment, emerging cryocoolers must successfully address not only the input power, cooling power, and mass constraints of the spacecraft and instruments, but also the broad array of complex interface requirements that critically affect successful integration to the sensitive instrument detectors. Generic requirements are presented for each of the cryocooler requirement areas, which are then contrasted with the projected capabilities of emerging space cryocoolers. The degree of match is used to highlight both the strengths of existing technologies and the areas in need of increased development.
Computer modeling of photodegradation
NASA Technical Reports Server (NTRS)
Guillet, J.
1986-01-01
A computer program to simulate the photodegradation of materials exposed to terrestrial weathering environments is being developed. Input parameters include the solar spectrum, the daily levels and variations of temperature and relative humidity, and materials such as EVA. The program, its operating principles, and its workings are briefly described first. The presentation then focuses on recent work simulating aging in a normal terrestrial day-night cycle. This is significant, as almost all accelerated aging schemes maintain constant illumination without a dark cycle, and this may be a critical factor not captured by accelerated aging schemes. For outdoor aging, the computer model indicates that the nightly dark cycle has a dramatic influence on the chemistry of photothermal degradation, and hints that a dark cycle may be needed in an accelerated aging scheme.
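The day-night effect the model explores can be mimicked with a toy first-order degradation integrator whose rate follows a half-sine solar cycle by day and drops to a small dark-reaction rate at night. The rate constants and cycle shape here are assumptions for illustration, not the program's actual chemistry.

```python
import math

def simulate_degradation(days, steps_per_day=24, k_photo=0.01, k_dark=0.0005):
    """Toy photodegradation model with a day-night cycle.
    k_photo: assumed first-order loss rate per hour under peak sunlight.
    k_dark:  assumed slow thermal/dark reaction rate per hour.
    Returns the remaining fraction of the tracked material property."""
    remaining = 1.0
    for step in range(days * steps_per_day):
        hour = step % steps_per_day
        # Sunlight as a half-sine between 06:00 and 18:00, darkness otherwise.
        if 6 <= hour < 18:
            light = math.sin(math.pi * (hour - 6) / 12.0)
        else:
            light = 0.0
        remaining *= math.exp(-(k_photo * light + k_dark))
    return remaining
```

Removing the dark hours (constant illumination, as in most accelerated schemes) changes the accumulated dose per simulated day, which is the kind of discrepancy the abstract warns about.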
Lebel, Louis; Tri, Nguyen Hoang; Saengnoree, Amnuay; Pasong, Suparb; Buatama, Urasa; Thoa, Le Kim
2002-06-01
Shrimp aquaculture in Vietnam is in the process of being transformed into a major industry through the intensification of the production system. The experiences of other countries in the region, especially Thailand, where high-input production systems dominate, suggest that now is a critical time for intervention to redirect the industry onto pathways that are more sustainable ecologically, socially, and economically. In Thailand, years of experience with intensified systems and a complex industrial organization have not led to sustainable solutions. The challenge here is for society to regain control and then to redirect the transformation along more efficient and benign pathways. Our analyses suggest that current pathways in both countries are unlikely to lead to a sustainable industry. A complete transformation of the way shrimp are grown, fed, processed, distributed, and regulated is needed.
A Framework for Business Process Change Requirements Analysis
NASA Astrophysics Data System (ADS)
Grover, Varun; Otim, Samuel
The ability to quickly and continually adapt business processes to accommodate evolving requirements and opportunities is critical for success in competitive environments. Without appropriate linkage between redesign decisions and strategic inputs, identifying processes that need to be modified will be difficult. In this paper, we draw attention to the analysis of business process change requirements in support of process change initiatives. Business process redesign is a multifaceted phenomenon involving processes, organizational structure, management systems, human resource architecture, and many other aspects of organizational life. To be successful, the business process initiative should focus not only on identifying the processes to be redesigned, but also pay attention to various enablers of change. Above all, a framework is just a blueprint; management must lead change. We hope our modest contribution will draw attention to the broader framing of requirements for business process change.
Leonard, Laurence; McCutcheon, Karen; Rogers, Katherine M A
2016-01-01
In recent years UK university-based nurse educators have seen a reduction in their responsibilities for nursing students' practice-based assessments. Many university-based nurse educators feel that this lack of input into students' clinical assessments leaves them open to criticism as they are perceived to be less "in-touch" with clinical practice and that their knowledge to teach nursing students is diminished as a result. This paper examines and debates some interpretations of the term "recent clinical practice" and challenges the misconception among many in the profession, as well as government and professional bodies, that university-based nurse educators require recent clinical practice to effectively teach students and enhance the student learning experience in the academic university setting. Copyright © 2015 Elsevier Ltd. All rights reserved.
Analysis of Critical Infrastructure Dependencies and Interdependencies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petit, Frederic; Verner, Duane; Brannegan, David
2015-06-01
The report begins by defining dependencies and interdependencies and exploring basic concepts of dependencies in order to facilitate a common understanding and consistent analytical approaches. Key concepts covered include characteristics of dependencies (upstream dependencies, internal dependencies, and downstream dependencies); classes of dependencies (physical, cyber, geographic, and logical); and dimensions of dependencies (operating environment, coupling and response behavior, type of failure, infrastructure characteristics, and state of operations). From there, the report proposes a multi-phase roadmap to support dependency and interdependency assessment activities nationwide, identifying a range of data inputs, analysis activities, and potential products for each phase, as well as the key steps needed to progress from one phase to the next. The report concludes by outlining a comprehensive, iterative, and scalable framework for analyzing dependencies and interdependencies that stakeholders can integrate into existing risk and resilience assessment efforts.
Low-noise cryogenic transmission line
NASA Technical Reports Server (NTRS)
Norris, D.
1987-01-01
New low-noise cryogenic input transmission lines have been developed for the Deep Space Network (DSN) at 1.668 GHz for cryogenically cooled Field Effect Transistors (FET) and High Electron Mobility Transistor (HEMT) amplifiers. These amplifiers exhibit very low noise temperatures of 5 K to 15 K, making the requirements for a low-noise input transmission line critical. Noise contribution to the total amplifier system from the low-noise line is less than 0.5 K for both the 1.668-GHz and 2.25-GHz FET systems. The 1.668-GHz input line was installed in six FET systems which were implemented in the DSN for the Venus Balloon Experiment. The 2.25-GHz input line has been implemented in three FET systems for the DSN 34-m HEF antennas, and the design is currently being considered for use at higher frequencies.
Fisher, Simon D.; Reynolds, John N. J.
2014-01-01
Anatomical investigations have revealed connections between the intralaminar thalamic nuclei and areas such as the superior colliculus (SC) that receive short latency input from visual and auditory primary sensory areas. The intralaminar nuclei in turn project to the major input nucleus of the basal ganglia, the striatum, providing this nucleus with a source of subcortical excitatory input. Together with a converging input from the cerebral cortex, and a neuromodulatory dopaminergic input from the midbrain, the components previously found necessary for reinforcement learning in the basal ganglia are present. With this intralaminar sensory input, the basal ganglia are thought to play a primary role in determining what aspect of an organism’s own behavior has caused salient environmental changes. Additionally, subcortical loops through thalamic and basal ganglia nuclei are proposed to play a critical role in action selection. In this mini review we will consider the anatomical and physiological evidence underlying the existence of these circuits. We will propose how the circuits interact to modulate basal ganglia output and solve common behavioral learning problems of agency determination and action selection. PMID:24765070
Weber, Scott; Crago, Elizabeth A; Sherwood, Paula R; Smith, Tara
2009-11-01
The aim of this study was to explore the experiences of nurses and physicians who use a clinical decision support system (CDSS) in the critical care area, focusing on clinicians' motives and values related to decisions to either use or not use this optional technology. Information technology (IT) has been demonstrated to positively impact quality of patient care. Decision-support technology serves as an adjunct to, not as a replacement for, actual clinical decision making. Nurse administrators play an imperative role in the planning and implementation of IT projects and can benefit from understanding clinicians' affective considerations and approaches to the technology. This qualitative study used grounded theory methods. A total of 33 clinicians participated in in-depth structured interviews probing their professional concerns with how the technology is used. Data were analyzed using the constant comparative method. Medical staff were frustrated by perceived lack of planning input before system implementation. Both nurse and physician cohort groups were dissatisfied with preimplementation education. Barriers to system use were identified in significant detail by the participants. Both nurses and physicians should be involved in preimplementation planning and ongoing evaluation of CDSSs. There is a need for a systematic review or Cochrane meta-analysis describing the affective aspects of successful implementations of decisional technology in critical care, specifically from the perspective of nursing administrators.
A literature review of comfort in the paediatric critical care patient.
Bosch-Alcaraz, Alejandro; Falcó-Pegueroles, Anna; Jordan, Iolanda
2018-03-08
To investigate the meaning of comfort and to contextualise it within the framework of paediatric critical care. The concept of comfort is closely linked to care in all health contexts. However, in specific settings such as the paediatric critical care unit, it takes on particular importance. A literature review was conducted. A literature search was performed for articles in English and Spanish in international health science databases, from 1992-March 2017, applying the quality standards established by the PRISMA methodology and the Joanna Briggs Institute. A total of 1,203 publications were identified in the databases. Finally, 59 articles that met the inclusion criteria were included in this literature review. Almost all were descriptive studies written in English and published in Europe. The concept of comfort was defined as the immediate condition of being strengthened through having the three types of needs (relief, ease and transcendence) addressed in the four contexts of experience (physical, psychospiritual, social and environmental). Only two valid and reliable tools for assessing comfort were found: the Comfort Scale and the Comfort Behavior Scale. Comfort is subjective and difficult to assess. It has four facets: physical, emotional, social and environmental. High levels of noise and light are the inputs that cause the most discomfort. Comfort is a holistic, universal concept and an important component of quality nursing care. © 2018 John Wiley & Sons Ltd.
Normative ethics does not need a foundation: it needs more science.
Quintelier, Katinka; Van Speybroeck, Linda; Braeckman, Johan
2011-03-01
The impact of science on ethics has long been the subject of intense debate. Although there is a growing consensus that science can describe morality and explain its evolutionary origins, there is less consensus about the ability of science to provide input to the normative domain of ethics. Whereas defenders of a scientific normative ethics appeal to naturalism, its critics either see the naturalistic fallacy committed or argue that the relevance of science to normative ethics remains undemonstrated. In this paper, we argue that current scientific normative ethicists commit no fallacy, that criticisms of scientific ethics contradict each other, and that scientific insights are relevant to normative inquiries by informing ethics about the options open to the ethical debate. Moreover, when normative ethics is conceived as nonfoundational, science can be used to evaluate every possible norm. This stands in contrast to foundational ethics, in which some norms remain beyond scientific inquiry. Finally, we state that a difference in the conception of normative ethics underlies the disagreement between proponents and opponents of a scientific ethics. Our argument is based on and preceded by a reconsideration of the notions of the naturalistic fallacy and foundational ethics. It differs from previous work in scientific ethics: whereas earlier work stressed the philosophical project of naturalizing the normative, here we focus on the concrete consequences of biological findings for normative decisions, that is, on the day-to-day normative relevance of these scientific insights.
The role of basal forebrain cholinergic neurons in fear and extinction memory.
Knox, Dayan
2016-09-01
Cholinergic input to the neocortex, dorsal hippocampus (dHipp), and basolateral amygdala (BLA) is critical for neural function and synaptic plasticity in these brain regions. Synaptic plasticity in the neocortex, dHipp, ventral Hipp (vHipp), and BLA has also been implicated in fear and extinction memory. This finding raises the possibility that basal forebrain (BF) cholinergic neurons, the predominant source of acetylcholine in these brain regions, have an important role in mediating fear and extinction memory. While empirical studies support this hypothesis, there are interesting inconsistencies among these studies that raise questions about how best to define the role of BF cholinergic neurons in fear and extinction memory. Nucleus basalis magnocellularis (NBM) cholinergic neurons that project to the BLA are critical for fear memory and contextual fear extinction memory. NBM cholinergic neurons that project to the neocortex are critical for cued and contextual fear conditioned suppression, but are not critical for fear memory in other behavioral paradigms and in the inhibitory avoidance paradigm may even inhibit contextual fear memory formation. Medial septum and diagonal band of Broca cholinergic neurons are critical for contextual fear memory and acquisition of cued fear extinction. Thus, even though the results of previous studies suggest BF cholinergic neurons modulate fear and extinction memory, inconsistent findings among these studies necessitates more research to better define the neural circuits and molecular processes through which BF cholinergic neurons modulate fear and extinction memory. Furthermore, studies determining if BF cholinergic neurons can be manipulated in such a manner so as to treat excessive fear in anxiety disorders are needed. Copyright © 2016 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henry, Christopher S.; Bernstein, Hans C.; Weisenhorn, Pamela
Metabolic network modeling of microbial communities provides an in-depth understanding of community-wide metabolic and regulatory processes. Compared to single organism analyses, community metabolic network modeling is more complex because it needs to account for interspecies interactions. To date, most approaches focus on reconstruction of high-quality individual networks so that, when combined, they can predict community behaviors as a result of interspecies interactions. However, this conventional method becomes ineffective for communities whose members are not well characterized and cannot be experimentally interrogated in isolation. Here, we tested a new approach that uses community-level data as a critical input for the network reconstruction process. This method focuses on directly predicting interspecies metabolic interactions in a community, when axenic information is insufficient. We validated our method through the case study of a bacterial photoautotroph-heterotroph consortium that was used to provide data needed for a community-level metabolic network reconstruction. Resulting simulations provided experimentally validated predictions of how a photoautotrophic cyanobacterium supports the growth of an obligate heterotrophic species by providing organic carbon and nitrogen sources.
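The kind of cross-feeding prediction described above can be caricatured with a two-species carbon-allocation toy: a phototroph fixes carbon and can either grow on it directly or excrete it for a heterotroph. Real community models optimize over genome-scale networks with linear programming; this sketch only mirrors the community-optimal allocation step, and every quantity in it is a hypothetical stand-in.

```python
def predict_interaction(c_fix, heterotroph_yield):
    """Toy community 'model': c_fix units of fixed carbon are allocated to
    whichever use yields more community growth - phototroph growth (yield 1
    per unit) or cross-feeding (heterotroph_yield per excreted unit).
    Returns (phototroph_growth, heterotroph_growth)."""
    excreted = c_fix if heterotroph_yield > 1.0 else 0.0
    phototroph_growth = c_fix - excreted
    heterotroph_growth = heterotroph_yield * excreted
    return phototroph_growth, heterotroph_growth
```

With an efficient heterotroph the optimal allocation is full cross-feeding, i.e. the phototroph "supports the growth" of its partner; with an inefficient one, no interaction is predicted.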
NASA Astrophysics Data System (ADS)
Crosthwaite Eyre, Charles
2010-12-01
Payments for Ecosystem Services (PES) represent an exciting and expanding opportunity for sustainably managed forests. PES are derived from a range of ecosystem benefits from forests including climate change mitigation through afforestation and avoided deforestation, green power generation, wetland and watershed rehabilitation, water quality improvement, marine flood defence and the reduction in desertification and soil erosion. Forests are also the ancestral home to many vulnerable communities which need protection. Sustainable forest management plays a key role in many of these services, which generates a potentially critical source of finance. However, for forests to realise revenues from these PES, they must meet demanding standards of project validation and service verification. They also need geospatial data to manage and monitor operational risk. In many cases the data is difficult to collect on the ground - in some cases impossible. This will create a new demand for data that must be impartial, timely, area-wide, accurate and cost effective. This presentation will highlight the unique capacity of Earth observation (EO) to provide these geospatial inputs required in the generation of PES from forestry and demonstrate products with practical examples.
Human-Robot Interaction in High Vulnerability Domains
NASA Technical Reports Server (NTRS)
Gore, Brian F.
2016-01-01
Future NASA missions will require successful integration of the human with highly complex systems. Highly complex systems are likely to involve humans, automation, and some level of robotic assistance. The complex environments will require successful integration of the human with automation, with robots, and with human-automation-robot teams to accomplish mission critical goals. Many challenges exist for the human performing in these types of operational environments with these kinds of systems. Systems must be designed to optimally integrate various levels of inputs and outputs based on the roles and responsibilities of the human, the automation, and the robots, ranging from direct manual control, through shared human-robot control, to no active human control (i.e., human supervisory control). It is assumed that the human will remain involved at some level. Technologies that vary based on contextual demands and on operator characteristics (workload, situation awareness) will be needed when the human integrates into these systems. Predictive models that estimate the impact of these technologies on system performance and on the human operator are also needed to meet the challenges associated with such future complex human-automation-robot systems in extreme environments.
Microbial Metagenomics Reveals Climate-Relevant Subsurface Biogeochemical Processes.
Long, Philip E; Williams, Kenneth H; Hubbard, Susan S; Banfield, Jillian F
2016-08-01
Microorganisms play key roles in terrestrial system processes, including the turnover of natural organic carbon, such as leaf litter and woody debris that accumulate in soils and subsurface sediments. What has emerged from a series of recent DNA sequencing-based studies is recognition of the enormous variety of little known and previously unknown microorganisms that mediate recycling of these vast stores of buried carbon in subsoil compartments of the terrestrial system. More importantly, the genome resolution achieved in these studies has enabled association of specific members of these microbial communities with carbon compound transformations and other linked biogeochemical processes-such as the nitrogen cycle-that can impact the quality of groundwater, surface water, and atmospheric trace gas concentrations. The emerging view also emphasizes the importance of organism interactions through exchange of metabolic byproducts (e.g., within the carbon, nitrogen, and sulfur cycles) and via symbioses since many novel organisms exhibit restricted metabolic capabilities and an associated extremely small cell size. New, genome-resolved information reshapes our view of subsurface microbial communities and provides critical new inputs for advanced reactive transport models. These inputs are needed for accurate prediction of feedbacks in watershed biogeochemical functioning and their influence on the climate via the fluxes of greenhouse gases, CO2, CH4, and N2O. Copyright © 2016 Elsevier Ltd. All rights reserved.
Annemans, Lieven
2008-01-01
The optimal adjuvant hormonal strategy in post-menopausal women with early breast cancer is a subject of ongoing debate. Aromatase inhibitors (AIs) have been successfully evaluated in clinical trials that have compared them with a standard treatment of 5 years of tamoxifen. However, several options are available in terms of treatment schedule and selected drug. Systematic reviews of clinical trials and health economic evaluations attempt to contribute to the debate. The objective of this paper is to provide a critical review of existing health economic evaluations with a focus on those parameters and assumptions with the largest impact on final outcomes. A wide range of different inputs and assumptions exists, which makes a comparison of results difficult, if not impossible. In particular, the modelling of recurrence rates over longer time horizons than those observed in clinical trials, a cornerstone of health economic modelling, is subject to quite different approaches. The practice of indirect comparison of different AIs without sufficiently acknowledging population differences is also bothersome. A list of key features (related to time horizon, clinical data input, patient subtypes, budget impact and model calibration) that an ideal model should have in order to better assist decision makers in this field is proposed.
Elmore, Kim; Flanagan, Barry; Jones, Nicholas F; Heitgerd, Janet L
2010-04-01
In 2008, CDC convened an expert panel to gather input on the use of geospatial science in surveillance, research and program activities focused on CDC's Healthy Communities Goal. The panel suggested six priorities: spatially enable and strengthen public health surveillance infrastructure; develop metrics for geospatial categorization of community health and health inequity; evaluate the feasibility and validity of standard metrics of community health and health inequities; support and develop GIScience and geospatial analysis; provide geospatial capacity building, training and education; and, engage non-traditional partners. Following the meeting, the strategies and action items suggested by the expert panel were reviewed by a CDC subcommittee to determine priorities relative to ongoing CDC geospatial activities, recognizing that many activities may need to occur either in parallel, or occur multiple times across phases. Phase A of the action items centers on developing leadership support. Phase B focuses on developing internal and external capacity in both physical (e.g., software and hardware) and intellectual infrastructure. Phase C of the action items plan concerns the development and integration of geospatial methods. In summary, the panel members provided critical input to the development of CDC's strategic thinking on integrating geospatial methods and research issues across program efforts in support of its Healthy Communities Goal.
NASA Astrophysics Data System (ADS)
Aufdenkampe, A. K.; Mayorga, E.; Horsburgh, J. S.; Lehnert, K. A.; Zaslavsky, I.; Valentine, D. W., Jr.; Richard, S. M.; Cheetham, R.; Meyer, F.; Henry, C.; Berg-Cross, G.; Packman, A. I.; Aronson, E. L.
2014-12-01
Here we present the prototypes of a new scientific software system designed around the new Observations Data Model version 2.0 (ODM2, https://github.com/UCHIC/ODM2) to substantially enhance integration of biological and Geological (BiG) data for Critical Zone (CZ) science. The CZ science community takes as its charge the effort to integrate theory, models and data from the multitude of disciplines collectively studying processes on the Earth's surface. The central scientific challenge of the CZ science community is to develop a "grand unifying theory" of the critical zone through a theory-model-data fusion approach, for which the key missing need is a cyberinfrastructure for seamless 4D visual exploration of the integrated knowledge (data, model outputs and interpolations) from all the bio and geoscience disciplines relevant to critical zone structure and function, similar to today's ability to easily explore historical satellite imagery and photographs of the earth's surface using Google Earth. This project takes the first "BiG" steps toward answering that need. The overall goal of this project is to co-develop with the CZ science and broader community, including natural resource managers and stakeholders, a web-based integration and visualization environment for joint analysis of cross-scale bio and geoscience processes in the critical zone (BiG CZ), spanning experimental and observational designs. 
We will: (1) Engage the CZ and broader community to co-develop and deploy the BiG CZ software stack; (2) Develop the BiG CZ Portal web application for intuitive, high-performance map-based discovery, visualization, access and publication of data by scientists, resource managers, educators and the general public; (3) Develop the BiG CZ Toolbox to enable cyber-savvy CZ scientists to access BiG CZ Application Programming Interfaces (APIs); and (4) Develop the BiG CZ Central software stack to bridge data systems developed for multiple critical zone domains into a single metadata catalog. The entire BiG CZ Software system is being developed on public repositories as a modular suite of open source software projects. It will be built around a new Observations Data Model Version 2.0 (ODM2) that has been developed by members of the BiG CZ project team, with community input, under separate funding.
Report: Science to Support Rulemaking
Report #2003-P-00003, November 15, 2002. The rules included in the pilot study were not a representative statistical sample of EPA rules, and we did not identify all of the critical science inputs for every rule.
NASA Astrophysics Data System (ADS)
Song, Rui-Zhuo; Xiao, Wen-Dong; Wei, Qing-Lai
2014-05-01
We develop an online adaptive dynamic programming (ADP) based optimal control scheme for continuous-time chaotic systems. The idea is to use the ADP algorithm to obtain the optimal control input that makes the performance index function reach an optimum. The expression of the performance index function for the chaotic system is first presented. The online ADP algorithm is presented to achieve optimal control. In the ADP structure, neural networks are used to construct a critic network and an action network, which can obtain an approximate performance index function and the control input, respectively. It is proven that the critic parameter error dynamics and the closed-loop chaotic systems are uniformly ultimately bounded exponentially. Our simulation results illustrate the performance of the established optimal control method.
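For the scalar linear-quadratic special case, the critic-actor alternation in the abstract reduces to a closed-form policy iteration, which is a useful sanity check on the structure. The scalar system dx/dt = a*x + u, the quadratic critic V(x) = w*x^2, and the linear actor u = -k*x below are illustrative stand-ins for the paper's neural networks and chaotic dynamics, not its algorithm.

```python
def adp_policy_iteration(a=-1.0, q=1.0, r=1.0, iters=20):
    """Policy iteration for dx/dt = a*x + u with cost integral of q*x^2 + r*u^2.
    Critic step: solve the Hamiltonian equation q + r*k^2 + 2*w*(a - k) = 0 for
    the value weight w (policy evaluation under u = -k*x).
    Actor step: greedy improvement k = w / r from the updated value function."""
    k = 0.0                                   # initial gain (stabilizing for a < 0)
    for _ in range(iters):
        w = (q + r * k * k) / (2.0 * (k - a))  # critic: evaluate current policy
        k = w / r                              # actor: improve the policy
    return w, k

w_opt, k_opt = adp_policy_iteration()
```

The fixed point satisfies w^2 - 2*a*w - q = 0, i.e. w = a + sqrt(a^2 + q); for a = -1, q = r = 1 this gives w = sqrt(2) - 1, matching the known continuous-time LQR solution.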
Nuclear Criticality Safety Data Book
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hollenbach, D. F.
The objective of this document is to support the revision of criticality safety process studies (CSPSs) for the Uranium Processing Facility (UPF) at the Y-12 National Security Complex (Y-12). This design analysis and calculation (DAC) document contains development and justification for generic inputs typically used in Nuclear Criticality Safety (NCS) DACs to model both normal and abnormal conditions of processes at UPF to support CSPSs. This will provide consistency between NCS DACs and efficiency in preparation and review of DACs, as frequently used data are provided in one reference source.
Critical care unit design: a nursing perspective.
Williams, M
2001-11-01
The task of designing a new critical care unit is best accomplished with the input of people representing multiple disciplines including architects, engineers, physicians, nurses, and equipment manufacturers. It is imperative that the critical care nursing staff and management take an active role in planning the layout of the unit and patient rooms, as the nurses will be the bedside providers 24 hours a day. The new unit should be designed to offer efficient patient care as well as a healing, comfortable environment for both the patients and their families.
Application of a neural network as a potential aid in predicting NTF pump failure
NASA Technical Reports Server (NTRS)
Rogers, James L.; Hill, Jeffrey S.; Lamarsh, William J., II; Bradley, David E.
1993-01-01
The National Transonic Facility has three centrifugal multi-stage pumps to supply liquid nitrogen to the wind tunnel. Pump reliability is critical to facility operation and test capability. A highly desirable goal is to be able to detect a pump rotating component problem as early as possible during normal operation and avoid serious damage to other pump components. If a problem is detected before serious damage occurs, the repair cost and downtime could be reduced significantly. A neural network-based tool was developed for monitoring pump performance and aiding in predicting pump failure. Once trained, neural networks can rapidly process many combinations of input values other than those used for training to approximate previously unknown output values. This neural network was applied to establish relationships among the critical frequencies and aid in predicting failures. Training pairs were developed from frequency scans from typical tunnel operations. After training, various combinations of critical pump frequencies were propagated through the neural network. The approximated output was used to create a contour plot depicting the relationships of the input frequencies to the output pump frequency.
Reliability of Beam Loss Monitors System for the Large Hadron Collider
NASA Astrophysics Data System (ADS)
Guaglio, G.; Dehning, B.; Santoni, C.
2004-11-01
The employment of superconducting magnets in high energy colliders opens challenging failure scenarios and brings new criticalities for the whole system protection. For the LHC beam loss protection system, the failure rate and the availability requirements have been evaluated using the Safety Integrity Level (SIL) approach. A downtime cost evaluation is used as input for the SIL approach. The most critical systems contributing to the final SIL value are the dump system, the interlock system, the beam loss monitors system and the energy monitor system. The Beam Loss Monitors System (BLMS) is critical for short and intense particle losses, while for medium and longer loss times it is assisted by other systems, such as the quench protection system and the cryogenic system. For BLMS, hardware and software have been evaluated in detail. The reliability input figures have been collected from historical SPS data, from temperature and radiation damage experiments, and from standard databases. All the data have been processed by reliability software (Isograph). The analysis ranges from the component data to the system configuration.
Retaining the equilibrium point hypothesis as an abstract description of the neuromuscular system.
Tresilian, J R
1999-01-01
The lambda version of the equilibrium point (EP) hypothesis for motor control is examined in light of recent criticisms of its various instantiations. Four important assumptions that have formed the basis for recent criticism are analyzed: First, the assumption that intact muscles possess invariant force-length characteristics (ICs). Second, that these ICs are of the same form in agonist-antagonist pairs. Third, that muscle control is monoparametric and that the control parameter, lambda, can be given a neurophysiological interpretation. Fourth, that reflex loop time delays and the known, asymmetric, nonlinear mechanical properties of muscles can be ignored. Mechanical and neurophysiological investigations of the neuromuscular system suggest that none of these assumptions is likely to be correct. This has been taken to mean that the EP hypothesis is oversimplified and that a new approach is needed. It is argued that such an approach can be provided without rejecting the EP hypothesis, by instead regarding it as an input-output description of muscle and associated segmental circuits. The operation of the segmental circuitry can be interpreted as having the function, at least in part, of compensating for a variety of nonlinearities and asymmetries such that the overall system implements the lambda-EP model equations.
Extraction of conformal data in critical quantum spin chains using the Koo-Saleur formula
NASA Astrophysics Data System (ADS)
Milsted, Ashley; Vidal, Guifre
2017-12-01
We study the emergence of two-dimensional conformal symmetry in critical quantum spin chains on the finite circle. Our goal is to characterize the conformal field theory (CFT) describing the universality class of the corresponding quantum phase transition. As a means to this end, we propose and demonstrate automated procedures which, using only the lattice Hamiltonian H = ∑_j h_j as an input, systematically identify the low-energy eigenstates corresponding to Virasoro primary and quasiprimary operators, and assign the remaining low-energy eigenstates to conformal towers. The energies and momenta of the primary operator states are needed to determine the primary operator scaling dimensions and conformal spins, an essential part of the conformal data that specifies the CFT. Our techniques use the action, on the low-energy eigenstates of H, of the Fourier modes H_n of the Hamiltonian density h_j. The H_n were introduced as lattice representations of the Virasoro generators by Koo and Saleur [Nucl. Phys. B 426, 459 (1994), 10.1016/0550-3213(94)90018-3]. In this paper, we demonstrate that these operators can be used to extract conformal data in a nonintegrable quantum spin chain.
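For reference, the Koo-Saleur lattice operators invoked above take the schematic form (normalization and central-charge conventions vary between papers, so treat this as indicative rather than the paper's exact expressions):

```latex
H_n \;=\; \frac{N}{2\pi}\sum_{j=1}^{N} h_j \, e^{\,i n j\, 2\pi/N},
\qquad
H_n \;\approx\; L_n + \bar{L}_{-n} \;-\; \frac{c}{12}\,\delta_{n,0},
```

so that matrix elements of H_n between low-energy eigenstates approximate Virasoro matrix elements, which is what allows primary states and conformal towers to be identified numerically.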
Adapting an Evidence-Based Intervention Targeting HIV-Infected Prisoners in Malaysia
Copenhaver, Michael M.; Tunku, Noor; Ezeabogu, Ifeoma; Potrepka, Jessica; Zahari, Muhammad Muhsin A.; Kamarulzaman, Adeeba; Altice, Frederick L.
2011-01-01
HIV-infected prisoners in Malaysia represent a critical target population for secondary HIV risk reduction interventions and care. We report on the process and outcome of our formative research aimed at systematically selecting and adapting an evidence-based intervention (EBI) designed to reduce secondary HIV risk and improve adherence to antiretroviral therapy among soon-to-be-released HIV-infected prisoners. Our formative work involved a critical examination of established EBIs and associated published reports, complemented by data elicited through structured interviews and focus groups with key stakeholders, members of the target population, and their family members. Based on all of this information, we adapted the Holistic Health Recovery Program targeting people living with HIV (HHRP+), an EBI, to consist of eight 2-hour sessions that cover a range of specified topics so that participants may individually apply intervention content as needed to accommodate their particular substance abuse, HIV risk, and antiretroviral adherence issues. This study provides a complete example of the process of selecting and adapting an EBI—taking into account both empirical evidence and input from target organization stakeholders and target population members and their families—for use in real-world prison settings where high-risk populations are concentrated. PMID:21860786
Gathering, organizing and accessing data for use in bird conservation across the Americas
Martin, Elizabeth; Peterjohn, Bruce G.; Kelling, Steve; Rich, Terrell D.
2008-01-01
The U.S. North American Bird Conservation Initiative (NABCI) Monitoring Subcommittee (2007) identified the need for a comprehensive plan for integrating and managing bird population monitoring data, and to adapt this as an integral component for improving monitoring activities across North America. While the Subcommittee provided a basic framework to begin development of this data management strategy, input from stakeholders is needed to identify data management needs and the technical capacity necessary to solve those challenges. We organized a session at the Fourth International Partners in Flight Conference to solicit input from session participants from across the Americas and identify their data management needs. Session speakers and participants provided examples of the challenges encountered with data management and how the Internet is increasingly used to provide access to the data needed for bird conservation decisions. Input provided during the session indicated that data management needs extended beyond technology to include scientific, conservation, social, institutional, and cultural issues. Because data management is intricately related to all aspects of bird conservation, a coordination process that elevates the importance of data management within the bird conservation community is needed, in addition to improving data management associated with bird population monitoring programs. Development of a comprehensive data management strategy for bird population monitoring data would help address the needs and challenges identified during this session.
T.C. McDonnell; B.J. Cosby; T.J. Sullivan; S.G. McNulty; E.C. Cohen
2010-01-01
The critical load (CL) of acidic atmospheric deposition represents the load of acidity deposited from the atmosphere to the earth's surface at which harmful acidification effects on sensitive biological receptors are thought to occur. In this study, the CL for forest soils was estimated for 27 watersheds throughout the United States using a steady-state mass balance...
2007-05-01
services by implementing a disaster recovery plan to restore an organization's critical business functions (DRII 2004). ISO 27001: An information ... the International Organization for Standardization (ISO), the IT SSP bases the terms and definitions on those in the NIPP because the SSP is an annex ... International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) 27000 Series, Information technology—Security
Hill, Sophie J; Sofra, Tanya A
2017-03-07
Objective Health literacy is on the policy agenda. Accessible, high-quality health information is a major component of health literacy. Health information materials include print, electronic or other media-based information enabling people to understand health and make health-related decisions. The aim of the present study was to present the findings and recommended actions as they relate to health information of the Victorian Consultation on Health Literacy. Methods Notes and submissions from the 2014 Victorian Consultation workshops were analysed thematically and a report prepared with input from an advisory committee. Results Health information needs to improve, and recommendations are grouped into two overarching themes. First, the quality of information needs to be increased; this can be done by developing a principle-based framework to inform updated guidance for information production, formulating standards to raise quality and improving the systems for delivering information to people. Second, there needs to be a focus on users of health information. Recommended actions included information that promotes active participation in health encounters, resources that encourage critical use of health information and increased availability of information tailored to population diversity. Conclusion A framework to improve health information would underpin the efforts to meet literacy needs in a more consistent way, improving standards and ultimately increasing the participation by consumers and carers in health decision making and self-management. What is known about the topic? Health information is a critical component of the concept of health literacy. Poorer health literacy is associated with poorer health outcomes across a range of measures. Improving access to and the use of quality sources of health information is an important strategy for meeting the health literacy needs of the population.
In recent years, health services and governments have taken a critical interest in improving health literacy. What does this paper add? This article presents the findings of the Victorian Consultation on Health Literacy as they relate to needs, priorities and potential actions for improving health information. In the context of the National Statement for Health Literacy, health information should be a priority, given its centrality to the public's management of its own health and effective, standards-based, patient-centred clinical care. A framework to improve health information would underpin the efforts of government, services and consumer organisations to meet literacy needs in a more consistent way, improving standards and ultimately increasing the participation by consumers and carers in health decision making and self-management. What are the implications for practitioners? The development and provision of health information materials needs to be systematised and supported by infrastructure, requiring leadership, cultural change, standards and skills development.
Effect of Circuit Inductance on Ceramics Joining by Titanium Foil Explosion
NASA Astrophysics Data System (ADS)
Takada, Yoshihiro; Takaki, Koichi; Itagaki, Minoru; Mukaigawa, Seiji; Fujiwara, Tamiya; Ohshima, Shuzo; Takahashi, Ikuo; Kuwashima, Takayuki
This article describes the influence of circuit inductance on the joining of alumina (Al2O3) tiles using explosive titanium foil. A pulsed current of several kA was supplied from an 8.28 µF storage capacitor to a 50-µm-thick titanium foil sandwiched between the Al2O3 tiles at a pressure of 8.3 MPa. The temperature of the foil rose rapidly owing to ohmic heating by the large current, and the foil was then liquefied and vaporized. The Al2O3 tiles were successfully bonded when the input energy to the titanium foil was higher than the energy required for foil vaporization. The bonding strength increased with increasing energy input to the foil. However, the foil explosion cracked the tiles when the input energy exceeded a critical value. When the circuit inductance was increased from 1.13 µH to 64.8 µH, the critical cracking energy increased from 160 J to 507 J. The maximum bonding strength of 330 kg was obtained with a circuit inductance of 21.8 µH. An investigation of the interfacial structure of the joints using electron probe micro-analysis revealed that distinct reaction areas existed in the interlayer.
Socioeconomic Forecasting Model for the Tri-County Regional Planning Commission
DOT National Transportation Integrated Search
1997-01-01
Socioeconomic data is a critical input to transportation planning and travel demand forecasting. Accurate estimates of existing population, incomes, employment and other socioeconomic characteristics are necessary for meaningful calibration of a trav...
CARE3MENU- A CARE III USER FRIENDLY INTERFACE
NASA Technical Reports Server (NTRS)
Pierce, J. L.
1994-01-01
CARE3MENU generates an input file for the CARE III program. CARE III is used for reliability prediction of complex, redundant, fault-tolerant systems including digital computers, aircraft, nuclear and chemical control systems. The CARE III input file often becomes complicated and is not easily formatted with a text editor. CARE3MENU provides an easy, interactive method of creating an input file by automatically formatting a set of user-supplied inputs for the CARE III system. CARE3MENU provides detailed on-line help for most of its screen formats. The reliability model input process is divided into sections using menu-driven screen displays. Each stage, or set of identical modules comprising the model, must be identified and described in terms of number of modules, minimum number of modules for stage operation, and critical fault threshold. The fault handling and fault occurrence models are detailed in several screens by parameters such as transition rates, propagation and detection densities, Weibull or exponential characteristics, and model accuracy. The system fault tree and critical pairs fault tree screens are used to define the governing logic and to identify modules affected by component failures. Additional CARE3MENU screens prompt the user for output options and run-time control values such as mission time and truncation values. There are fourteen major screens, many with default values and HELP options. The documentation includes: 1) a user's guide with several examples of CARE III models, the dialog required to input them to CARE3MENU, and the output files created; and 2) a maintenance manual for assistance in changing the HELP files and modifying any of the menu formats or contents. CARE3MENU is written in FORTRAN 77 for interactive execution and has been implemented on a DEC VAX series computer operating under VMS. This program was developed in 1985.
NASA Astrophysics Data System (ADS)
Kerner, H. R.; Bell, J. F., III; Ben Amor, H.
2017-12-01
The Mastcam color imaging system on the Mars Science Laboratory Curiosity rover acquires images within Gale crater for a variety of geologic and atmospheric studies. Images are often JPEG compressed before being downlinked to Earth. While critical for transmitting images on a low-bandwidth connection, this compression can result in image artifacts most noticeable as anomalous brightness or color changes within or near JPEG compression block boundaries. In images with significant high-frequency detail (e.g., in regions showing fine layering or lamination in sedimentary rocks), the image might need to be re-transmitted losslessly to enable accurate scientific interpretation of the data. The process of identifying which images have been adversely affected by compression artifacts is performed manually by the Mastcam science team, costing significant expert human time. To streamline the tedious process of identifying which images might need to be re-transmitted, we present an input-efficient neural network solution for predicting the perceived quality of a compressed Mastcam image. Most neural network solutions require large amounts of hand-labeled training data for the model to learn the target mapping between input (e.g. distorted images) and output (e.g. quality assessment). We propose an automatic labeling method using joint entropy between a compressed and uncompressed image to avoid the need for domain experts to label thousands of training examples by hand. We use automatically labeled data to train a convolutional neural network to estimate the probability that a Mastcam user would find the quality of a given compressed image acceptable for science analysis. We tested our model on a variety of Mastcam images and found that the proposed method correlates well with image quality perception by science team members. When assisted by our proposed method, we estimate that a Mastcam investigator could reduce the time spent reviewing images by a minimum of 70%.
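The automatic-labeling signal described above can be sketched as a joint-entropy computation over pixel-value pairs from the compressed and uncompressed images. The histogram-based estimator, the toy images, and any threshold later applied to the score are illustrative assumptions, not the authors' exact pipeline.

```python
from collections import Counter
from math import log2

def joint_entropy(img_a, img_b):
    """Joint entropy (bits) of two equal-size images given as flat
    lists of integer pixel values. A low joint entropy relative to the
    marginal entropies indicates the two images share most of their
    information, i.e. compression changed little."""
    pairs = Counter(zip(img_a, img_b))
    n = len(img_a)
    return -sum((c / n) * log2(c / n) for c in pairs.values())

# Toy "uncompressed" image and a "compressed" version with blocky
# artifacts (values flattened within 4-pixel blocks).
original   = [10, 12, 14, 16, 200, 202, 204, 206]
compressed = [12, 12, 12, 12, 203, 203, 203, 203]
score = joint_entropy(original, compressed)
```

In the paper's setting, a score like this (computed automatically, without human labels) stands in for the training target that would otherwise require thousands of hand-labeled examples.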
Development of EnergyPlus Utility to Batch Simulate Building Energy Performance on a National Scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valencia, Jayson F.; Dirks, James A.
2008-08-29
EnergyPlus is a simulation program that requires a large number of details to fully define and model a building. Hundreds or even thousands of lines in a text file are needed to run an EnergyPlus simulation, depending on the size of the building. Manually creating these files is a time-consuming process that would not be practical when trying to create input files for the thousands of buildings needed to simulate national building energy performance. To streamline the process of creating the input files for EnergyPlus, two methods were created to work in conjunction with the National Renewable Energy Laboratory (NREL) Preprocessor; this reduced the hundreds of inputs needed to define a building in EnergyPlus to a small set of high-level parameters. The first method uses Java routines to perform all of the preprocessing on a Windows machine, while the second method carries out all of the preprocessing on a Linux cluster using an in-house utility called Generalized Parametrics (GPARM). A comma-delimited (CSV) input file is created to define the high-level parameters for any number of buildings. Each method then takes this CSV file and uses the data entered for each parameter to populate an extensible markup language (XML) file used by the NREL Preprocessor to automatically prepare EnergyPlus input data files (idf) using automatic building routines and macro templates. Using the Linux utility “make”, the idf files can then be automatically run through the Linux cluster, and the desired data from each building can be aggregated into one table for analysis. Creating a large number of EnergyPlus input files makes it possible to batch simulate building energy performance and scale the results to national energy consumption estimates.
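The high-level-parameter expansion described above can be sketched as a CSV-driven template fill. The parameter names, the Building-object template, and the in-memory file map below are hypothetical stand-ins, not the actual NREL Preprocessor schema.

```python
import csv
import io

# Hypothetical idf fragment: a few high-level parameters expanded into
# an EnergyPlus-style object (field names are illustrative only).
TEMPLATE = """Building,
  {name},            !- Name
  {north_axis},      !- North Axis {{deg}}
  {terrain};         !- Terrain
"""

# One CSV row per building, as in the batch workflow described above.
csv_text = """name,north_axis,terrain
OfficeSmall,0,City
RetailBig,90,Suburbs
"""

idf_files = {}
for row in csv.DictReader(io.StringIO(csv_text)):
    # Each row of high-level parameters becomes one idf input file.
    idf_files[row["name"] + ".idf"] = TEMPLATE.format(**row)
```

In the actual pipeline the expanded files would be written to disk and dispatched to the cluster (e.g. via make); here they are kept in a dictionary to keep the sketch self-contained.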
Sensory experience modifies feature map relationships in visual cortex
Cloherty, Shaun L; Hughes, Nicholas J; Hietanen, Markus A; Bhagavatula, Partha S
2016-01-01
The extent to which brain structure is influenced by sensory input during development is a critical but controversial question. A paradigmatic system for studying this is the mammalian visual cortex. Maps of orientation preference (OP) and ocular dominance (OD) in the primary visual cortex of ferrets, cats and monkeys can be individually changed by altered visual input. However, the spatial relationship between OP and OD maps has appeared immutable. Using a computational model we predicted that biasing the visual input to orthogonal orientation in the two eyes should cause a shift of OP pinwheels towards the border of OD columns. We then confirmed this prediction by rearing cats wearing orthogonally oriented cylindrical lenses over each eye. Thus, the spatial relationship between OP and OD maps can be modified by visual experience, revealing a previously unknown degree of brain plasticity in response to sensory input. DOI: http://dx.doi.org/10.7554/eLife.13911.001 PMID:27310531
Mechanisms of Nicotine Addiction
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGehee, Daniel
Nicotine reinforces the use of tobacco products primarily through its interaction with specific receptor proteins within the brain's reward centers. A critical step in the process of addiction for many drugs, including nicotine, is the release of the neurotransmitter dopamine. A single nicotine exposure will enhance dopamine levels for hours; however, nicotinic receptors undergo both activation and then desensitization within minutes, which presents an important problem: how does the time course of receptor activity lead to the prolonged release of dopamine? We have found that persistent modulation of both inhibitory and excitatory synaptic connections by nicotine underlies the sustained increase in dopamine release. Because these inputs express different types of nicotinic receptors, there is a coordinated shift in the balance of synaptic inputs toward excitation of the dopamine neurons. Excitatory inputs are turned on while inhibitory inputs are depressed, thereby boosting the brain's reward system.
Overview of Heat Addition and Efficiency Predictions for an Advanced Stirling Convertor
NASA Technical Reports Server (NTRS)
Wilson, Scott D.; Reid, Terry; Schifer, Nicholas; Briggs, Maxwell
2011-01-01
Past methods of predicting net heat input needed to be validated. The validation effort pursued several paths, including improving model inputs, using test hardware to provide validation data, and validating high-fidelity models. The validation test hardware provided a direct measurement of net heat input for comparison with predicted values. The predicted net heat input was 1.7 percent less than the measured value, and initial calculations of measurement uncertainty were 2.1 percent (under review). Lessons learned during the validation effort were incorporated into the convertor modeling approach, which improved predictions of convertor efficiency.
Comparative study of bolometric and non-bolometric switching elements for microwave phase shifters
NASA Technical Reports Server (NTRS)
Tabib-Azar, Massood; Bhasin, Kul B.; Romanofsky, Robert R.
1991-01-01
The performance of semiconductor and high critical temperature superconductor switches is compared as they are used in delay-line-type microwave and millimeter-wave phase shifters. Such factors as their ratios of the off-to-on resistances, parasitic reactances, power consumption, speed, input-to-output isolation, ease of fabrication, and physical dimensions are compared. Owing to their almost infinite off-to-on resistance ratio and excellent input-to-output isolation, bolometric superconducting switches appear to be quite suitable for use in microwave phase shifters; their only drawbacks are their speed and size. The SUPERFET, a novel device whose operation is based on the electric field effect in high critical temperature ceramic superconductors is also discussed. Preliminary results indicate that the SUPERFET is fast and that it can be scaled; therefore, it can be fabricated with dimensions comparable to semiconductor field-effect transistors.
Critical heat flux phenomena depending on pre-pressurization in transient heat input
NASA Astrophysics Data System (ADS)
Park, Jongdoc; Fukuda, Katsuya; Liu, Qiusheng
2017-07-01
The critical heat flux (CHF) levels caused by exponential heat inputs of varying periods to a 1.0-mm diameter horizontal cylinder immersed in various liquids were measured to develop an extended database on the effects of pressure and subcooling, supported by a photographic study. Two main mechanisms of CHF were found. One is due to the time lag of the hydrodynamic instability (HI), which starts at the steady-state CHF upon fully developed nucleate boiling; the other is due to the explosive process of heterogeneous spontaneous nucleation (HSN), which occurs at a certain HSN superheat in originally flooded cavities on the cylinder surface. The incipience of the boiling process differed completely depending on pre-pressurization. In addition, the dependence of the transient CHF on pre-pressurization varied with the wettability of the boiling liquid. The objective of this work is to clarify the transient CHF phenomena due to HI or HSN by photographic study.
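The exponentially escalating heat input referred to above is conventionally written, in transient-boiling studies of this kind, as (generic symbols, not necessarily the paper's exact notation):

```latex
q(t) \;=\; q_0 \, e^{t/\tau},
```

where \tau is the period of the exponential escalation. Small \tau corresponds to rapid transients, the regime in which the explosive HSN mechanism described in the abstract becomes relevant, while long periods approach the quasi-steady behavior governed by hydrodynamic instability.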
Literature Review on Needs of Upper Limb Prosthesis Users
Cordella, Francesca; Ciancio, Anna Lisa; Sacchetti, Rinaldo; Davalli, Angelo; Cutti, Andrea Giovanni; Guglielmelli, Eugenio; Zollo, Loredana
2016-01-01
The loss of one hand can significantly affect the level of autonomy and the capability of performing daily living, working and social activities. Current prosthetic solutions do little to overcome these problems owing to limitations in the interfaces adopted for controlling the prosthesis and to the lack of force or tactile feedback, thus limiting hand grasp capabilities. This paper presents a literature review of needs analyses of upper limb prosthesis users, and points out the main critical aspects of current prosthetic solutions in terms of user satisfaction and the activities of daily living users would like to perform with the prosthetic device. The ultimate goal is to provide design inputs in the prosthetic field and, at the same time, to increase user satisfaction rates and reduce device abandonment. A list of requirements for upper limb prostheses is proposed, grounded on the performed analysis of user needs. It aims to (i) provide guidelines for improving the level of acceptability and usefulness of the prosthesis, by accounting for hand functional and technical aspects; (ii) propose a control architecture for PNS-based prosthetic systems able to satisfy the analyzed user wishes; and (iii) provide hints for improving the quality of the methods (e.g., questionnaires) adopted for understanding user satisfaction with prostheses. PMID:27242413
Cloud Intrusion Detection and Repair (CIDAR)
2016-02-01
form for VLC, Swftools-png2swf, Swftools-jpeg2swf, Dillo and GIMP. The superscript indicates the bit width of each expression atom. "sext(v, w ... challenges in input rectification is the need to deal with nested fields. In general, input formats are in tree structures containing arbitrarily ... length indicator constraints is challenging, because of the presence of nested fields in hierarchical input format. For example, an integer field may
Lacy, Joyce W.; Yassa, Michael A.; Stark, Shauna M.; Muftuler, L. Tugan; Stark, Craig E.L.
2011-01-01
Producing and maintaining distinct (orthogonal) neural representations for similar events is critical to avoiding interference in long-term memory. Recently, our laboratory provided the first evidence for separation-like signals in the human CA3/dentate. Here, we extended this by parametrically varying the change in input (similarity) while monitoring CA1 and CA3/dentate for separation and completion-like signals using high-resolution fMRI. In the CA1, activity varied in a graded fashion in response to increases in the change in input. In contrast, the CA3/dentate showed a stepwise transfer function that was highly sensitive to small changes in input. PMID:21164173
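The two hippocampal transfer functions contrasted above can be caricatured in a few lines: CA1 responding in a graded, roughly proportional fashion to the change in input, and CA3/dentate responding in a near-stepwise fashion that is highly sensitive to small input changes. The functional forms and the threshold value are illustrative assumptions, not fits to the fMRI data.

```python
# Caricature of the graded (CA1) vs stepwise (CA3/dentate) transfer
# functions reported in the abstract. Input change is normalized to
# [0, 1]; the 0.1 threshold is an assumed illustration.
def ca1_response(input_change):
    return input_change                       # graded: proportional

def ca3_dg_response(input_change, threshold=0.1):
    return 1.0 if input_change > threshold else 0.0   # stepwise

changes  = [0.0, 0.05, 0.2, 0.5, 1.0]
graded   = [ca1_response(c) for c in changes]
stepwise = [ca3_dg_response(c) for c in changes]
```

The stepwise profile saturates for any change beyond the small threshold, capturing the pattern-separation-like sensitivity attributed to CA3/dentate, while the graded profile tracks input similarity continuously as in CA1.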
What Critical Ethical Values Guide Strategic Planning Processes in Health Care Organizations?
Kucmanic, Matthew; Sheon, Amy R
2017-11-01
This case explores a fictitious hospital's use of co-creation to make a decision about redesign of inpatient units as a first step in incorporating stakeholder input into creation of governing policies. We apply a "procedural fairness" framework to reveal that conditions required for an ethical decision about space redesign were not met by using clinician and patient focus groups to obtain stakeholder input. In this article, we identify epistemic injustices resulting from this process that could undermine confidence in leadership decisions. Suggestions are offered for incorporating stakeholder input going forward that address prior shortcomings. The result should be conditions that are perceived as procedurally fair and decisions that engender confidence in institutional leadership. © 2017 American Medical Association. All Rights Reserved.
Morse Monte Carlo Radiation Transport Code System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Emmett, M.B.
1975-02-01
The report contains descriptions of the MORSE and PICTURE codes, input descriptions, sample problems, derivations of the physical equations, and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used, with an albedo option available at any material surface. The PICTURE code provides aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine whether the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)
Mock, Charles; Arreola-Risa, Carlos; Quansah, Robert
2003-01-01
In all countries, the priority for reducing road traffic injuries should be prevention. Nonetheless, there are low-cost ways to strengthen the care of injured persons that will help to lower the toll from road traffic. The purpose of this review was to elucidate ways to accomplish this goal in the context of less developed countries. Studies selected for this review were obtained by Medline review, selecting on key words such as trauma, injury, trauma care, essential health services, and developing country. Articles pertaining to any country and all available years were considered. In addition, the authors utilized articles from the gray literature and journals from Mexico and Ghana that are not Medline referenced. Studies surveyed point to road safety and other forms of injury prevention, as well as prehospital care, as likely priorities for developing countries. Nonetheless, hospital-based improvements can contribute to decreases in mortality and, especially, decreases in disability. For both prehospital and hospital-based care, studies revealed several critical weak points to address in: (1) human resources (staffing and training); (2) physical resources (equipment, supplies, and infrastructure); and (3) administration and organization. The 'essential services' approach, which has contributed to progress in a variety of fields of international health, needs to be developed for the care of the injured. This would define the trauma treatment services that could realistically be made available to virtually every injured person. It would then address the inputs of human resources, physical resources, and administration necessary to assure these services optimally in the different geographic and socioeconomic environments worldwide. Finally, it would identify and target deficiencies in these inputs that need to be strengthened.
NASA Astrophysics Data System (ADS)
Campo, Lorenzo; Caparrini, Francesca
2013-04-01
The need for accurate distributed hydrological modelling has constantly increased in recent years for several purposes: agricultural applications, water resources management, hydrological balance at watershed scale, and flood forecasting. The main input for hydrological numerical models is rainfall data, which presents, at the same time, a large availability of measures (in gauged regions, with respect to other micro-meteorological variables) and the most complex spatial patterns. While even in densely gauged watersheds the spatial interpolation of rainfall is a non-trivial problem, due to the spatial intermittence of the variable (especially at finer temporal scales), ungauged regions need an alternative source of rainfall data in order to perform hydrological modelling. Such a source can be constituted by satellite-estimated rainfall fields, from both geostationary and polar-orbit platforms. In this work the rainfall product obtained from the Aqua-AIRS sensor was used in order to assess the feasibility of using satellite-based rainfall as input for distributed hydrological modelling. The MOBIDIC (MOdello di BIlancio Distribuito e Continuo) model, developed at the Department of Civil and Environmental Engineering of the University of Florence and operationally used by Tuscany Region and Umbria Region for flood prediction and management, was used for the experiments. In particular, three experiments were carried out: a) hydrological simulation using rain-gauge data, b) simulation using satellite-only rainfall estimates, and c) simulation with the combined use of the two data sources in order to obtain an optimal estimate of the actual rainfall fields. The domain of the study was central Italy. Several critical events that occurred in the area were analyzed. A discussion of the results is provided.
Walker, Damian
2003-03-01
Many donors and countries are striving to respond to the HIV/AIDS epidemic by implementing prevention programmes. However, the resources available for providing these activities relative to needs are limited. Hence, decision-makers must choose among various types of interventions. Cost information, both measures of cost and cost-effectiveness, serves as a critical input into the processes of setting priorities and allocating resources efficiently. This paper reviews the cost and cost-effectiveness evidence base of HIV/AIDS prevention programmes in low- and middle-income countries (LMICs). None of the studies found has complete cost data for a full range of HIV/AIDS prevention programmes in any one country. However, the range of studies highlights the relative emphasis of different types of HIV/AIDS prevention strategies by region, reflecting the various modes of transmission and hence, to a certain extent, the stage of the epidemic. The costing methods applied and results obtained in this review give rise to questions of reliability, validity and transparency. First, not all of the studies report the methods used to calculate the costs, and/or do not provide all the necessary data inputs such that recalculation of the results is possible. Secondly, methods that are documented vary widely, rendering different studies, even within the same country and programme setting, largely incomparable. Finally, even with consistent and replicable measurement, the results as presented are generally not comparable because of the lack of a common outcome measure. Therefore, the extent to which the available cost and cost-effectiveness evidence base on HIV/AIDS prevention strategies can provide guidance to decision-makers is limited, and there is an urgent need for the generation of this knowledge for planning and decision-making.
The Future of UV-Visible Astronomy from Space - the NASA COPAG SIG
NASA Astrophysics Data System (ADS)
Scowen, Paul
2015-08-01
The ultraviolet (92-320nm) and visible (320-1000nm) (UVV) regions of the spectrum contain a vital suite of diagnostic lines that can be used to study diverse astronomical objects and phenomena that shape and energize the interstellar medium. It is a critical spectral range for tracing the physics of interstellar and intergalactic gas, the ionization of nebulae, the properties of shocks, the atmospheres and winds of hot stars, energy transfer between galaxies and their surrounding environments, and the engines of active galactic nuclei. This spectral range contains diagnostics that measure gas density, electron temperature, and energy balance between various modes of cooling. It is an unfortunate truth that many, if not most, of these diagnostics can only be observed outside the Earth's atmosphere, requiring facilities in space. Space-based observations also provide access to diffraction-limited optical performance to achieve high spatial resolution. Such spatial resolutions cannot currently be achieved from the ground over wide fields, a capability that many science programs need for sampling and survey work. In order to provide continuing access in the future, new space-based missions will be needed to provide the core imaging and spectroscopic information in this important part of the electromagnetic spectrum. The technology that enables such access has been a high priority in technology development plans developed by both the Cosmic Origins Program Office and the Astrophysics Division at NASA, but a holistic approach to considering what is needed for a long-term technology roadmap has not yet been discussed widely within the community. This UVV Science Interest Group [SIG #2] has been established to collect community input and define long-term Cosmic Origins science objectives of the UVV astronomy community that can be addressed by space-based observations.
The SIG facilitates communication to merge the needs and desires of the science community with the achievements and plans of the technology community. The SIG is open to any interested members of the community and we welcome any and all input. SIG website: http://sig2.asu.edu.
A Case Analysis to Increase Awareness of Current USMC Knowledge Management (KM) Practices
2013-09-01
because it is "the preeminent economic resource, more important than both raw material and money" (Stewart, 1997, p. 6). Grant (1996) emphasizes...business" (p. 6) in the modern economy. Other resources cited by Johnson (2010) include Grant (1996), to demonstrate that knowledge is the "critical...concepts, interpretations, ideas, observations, and judgments" (p. 3). Again, Grant (1996) adds that knowledge is the "critical input in production and
Curing Color Blindness—Mice and Nonhuman Primates
Neitz, Maureen; Neitz, Jay
2014-01-01
It has been possible to use viral-mediated gene therapy to transform dichromatic (red-green color-blind) primates to trichromatic. Even though the third cone type was added after the end of developmental critical periods, treated animals acquired red-green color vision. What happened in the treated animals may represent a recapitulation of the evolution of trichromacy, which seems to have evolved with the acquisition of a third cone type without the need for subsequent modification to the circuitry. Some transgenic mice in which a third cone type was added also acquired trichromacy. However, compared with treated primates, red-green color vision in mice is poor, indicating large differences between mice and monkeys in their ability to take advantage of the new input. These results have implications for understanding the limits and opportunities for using gene therapy to treat vision disorders caused by defects in cone function. PMID:25147187
Evaluated kinetic and photochemical data for atmospheric chemistry
NASA Technical Reports Server (NTRS)
Baulch, D. L.; Cox, R. A.; Hampson, R. F., Jr.; Kerr, J. A.; Troe, J.; Watson, R. T.
1980-01-01
This paper contains a critical evaluation of the kinetics and photochemistry of gas phase chemical reactions of neutral species involved in middle atmosphere chemistry (10-55 km altitude). Data sheets have been prepared for 148 thermal and photochemical reactions, containing summaries of the available experimental data with notes giving details of the experimental procedures. For each reaction a preferred value of the rate coefficient at 298 K is given together with a temperature dependency where possible. The selection of the preferred value is discussed, and estimates of the accuracies of the rate coefficients and temperature coefficients have been made for each reaction. The data sheets are intended to provide the basic physical chemical data needed as input for calculations which model atmospheric chemistry. A table summarizing the preferred rate data is provided, together with an appendix listing the available data on enthalpies of formation of the reactant and product species.
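The temperature dependencies recommended in such evaluations are typically expressed in Arrhenius form, k(T) = A exp(−(E/R)/T). As a minimal sketch of how a preferred rate coefficient is extrapolated from 298 K to stratospheric temperatures (the pre-exponential factor and E/R below are hypothetical illustration values, not values from the data sheets):

```python
import math

def arrhenius(A, E_over_R, T):
    """Rate coefficient k(T) = A * exp(-(E/R) / T).

    A        : pre-exponential factor (cm^3 molecule^-1 s^-1)
    E_over_R : activation energy divided by the gas constant (K)
    T        : temperature (K)
    """
    return A * math.exp(-E_over_R / T)

# Hypothetical parameters for illustration only (not from the evaluation).
A, E_over_R = 1.0e-11, 500.0

k_298 = arrhenius(A, E_over_R, 298.0)   # preferred reference temperature
k_220 = arrhenius(A, E_over_R, 220.0)   # typical mid-stratosphere temperature

# With a positive E/R, the reaction slows at colder stratospheric temperatures.
print(k_298, k_220)
```
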
Using the Climbing Drum Peel (CDP) Test to Obtain a G(sub IC) value for Core/Facesheet Bonds
NASA Technical Reports Server (NTRS)
Nettles, A. T.; Gregory, Elizabeth D.; Jackson, Justin R.
2006-01-01
A method of measuring the Mode I fracture toughness of core/facesheet bonds in sandwich structures is desired, particularly with the widespread use of models that need this data as input. This study examined whether a critical strain energy release rate, G(sub IC), can be obtained from the climbing drum peel (CDP) test. The CDP test is relatively simple to perform and does not rely on measuring small crack lengths such as required by the double cantilever beam (DCB) test. Simple energy methods were used to calculate G(sub IC) from CDP test data on composite facesheets bonded to a honeycomb core. Facesheet thicknesses from 2 to 5 plies were tested to examine the upper and lower bounds on facesheet thickness requirements. Results from the study suggest that the CDP test, with certain provisions, can be used to find the G(sub IC) value of a core/facesheet bond.
Monnier, Stéphanie; Cox, David G; Albion, Tim; Canzian, Federico
2005-01-01
Background Single Nucleotide Polymorphism (SNP) genotyping is a major activity in biomedical research. The Taqman technology is one of the most commonly used approaches. It produces large amounts of data that are difficult to process by hand. Laboratories not equipped with a Laboratory Information Management System (LIMS) need tools to organize the data flow. Results We propose a package of Visual Basic programs focused on sample management and on the parsing of input and output TaqMan files. The code is written in Visual Basic, embedded in the Microsoft Office package, and it allows anyone to have access to those tools, without any programming skills and with basic computer requirements. Conclusion We have created useful tools focused on management of TaqMan genotyping data, a critical issue in genotyping laboratories without a more sophisticated and expensive system, such as a LIMS. PMID:16221298
Kalpathy-Cramer, Jayashree; Awan, Musaddiq; Bedrick, Steven; Rasch, Coen R N; Rosenthal, David I; Fuller, Clifton D
2014-02-01
Modern radiotherapy requires accurate region of interest (ROI) inputs for plan optimization and delivery. Target delineation, however, remains operator-dependent and potentially serves as a major source of treatment delivery error. In order to optimize this critical, yet observer-driven process, a flexible web-based platform for individual and cooperative target delineation analysis and instruction was developed in order to meet the following unmet needs: (1) an open-source/open-access platform for automated/semiautomated quantitative interobserver and intraobserver ROI analysis and comparison, (2) a real-time interface for radiation oncology trainee online self-education in ROI definition, and (3) a source for pilot data to develop and validate quality metrics for institutional and cooperative group quality assurance efforts. The resultant software, Target Contour Testing/Instructional Computer Software (TaCTICS), developed using Ruby on Rails, has since been implemented and proven flexible, feasible, and useful in several distinct analytical and research applications.
Effects of Long-Duration Microgravity on Fine Motor Skills: ISS One-Year Mission
NASA Technical Reports Server (NTRS)
Holden, Kritina; Greene, Maya; Cross, Ernest
2017-01-01
Fine motor skills will be critical in future long-duration missions, particularly those skills needed to interact with advanced technologies in next-generation vehicles, spacesuits, and habitats. Studies to date on the effects of microgravity and gravitational transitions on fine motor performance have not yielded conclusive results. Datasets are incomplete, with timeline gaps in the microgravity data sessions, and studies have not focused on the fine motor actions that are likely to be required for interacting with software displays and controls (pointing, clicking, dragging, multi-touch/pinching); the majority of studies have used a joystick or arm-reaching task. Touchscreen tablets are already in use on ISS, and at least one commercial partner is already planning a cockpit with touchscreens as the primary means of input. We must ensure that crewmembers are ready to perform with computer-based devices after a long-duration voyage and transition to surface operations.
Safety monitoring and reactor transient interpreter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hench, J. E.; Fukushima, T. Y.
1983-12-20
An apparatus which monitors a subset of control panel inputs in a nuclear reactor power plant, the subset being those indicators of plant status which are of a critical nature during an unusual event. A display (10) is provided for displaying primary information (14) as to whether the core is covered and likely to remain covered, including information as to the status of subsystems needed to cool the core and maintain core integrity. Secondary display information (18,20) is provided which can be viewed selectively for more detailed information when an abnormal condition occurs. The primary display information has messages (24) for prompting an operator as to which one of a number of pushbuttons (16) to press to bring up the appropriate secondary display (18,20). The apparatus utilizes a thermal-hydraulic analysis to more accurately determine key parameters (such as water level) from other measured parameters, such as power, pressure, and flow rate.
Standards application and development plan for solar thermal technologies
NASA Astrophysics Data System (ADS)
Cobb, H. R. W.
1981-07-01
Functional and standards matrices, developed from input from solar thermal (ST) users and from industry, are presented; they will be continually reviewed and updated as commercial aspects develop. The matrices highlight codes, standards, test methods, functions, and definitions that need to be developed. They will be submitted through ANSI for development by national consensus bodies. A contingency action is proposed for standards development if specific input is lacking at the committee level, or if early development of a standard would hasten commercialization or gain needed jurisdictional acceptance.
NASA Astrophysics Data System (ADS)
Troncossi, M.; Di Sante, R.; Rivola, A.
2016-10-01
In the field of vibration qualification testing, random excitations are typically imposed on the tested system in terms of a power spectral density (PSD) profile. This is one of the most popular ways to control the shaker or slip table for durability tests. However, these excitations (and the corresponding system responses) exhibit a Gaussian probability distribution, whereas not all real-life excitations are Gaussian, causing the response to be also non-Gaussian. In order to introduce non-Gaussian peaks, a further parameter, i.e., kurtosis, has to be controlled in addition to the PSD. However, depending on the specimen behaviour and input signal characteristics, the use of non-Gaussian excitations with high kurtosis and a given PSD does not automatically imply a non-Gaussian stress response. For an experimental investigation of these coupled features, suitable measurement methods need to be developed in order to estimate the stress amplitude response at critical failure locations and consequently evaluate the input signals most representative of real-life, non-Gaussian excitations. In this paper, a simple test rig with a notched cantilevered specimen was developed to measure the response and examine the kurtosis values in the case of stationary Gaussian, stationary non-Gaussian, and burst non-Gaussian excitation signals. The laser Doppler vibrometry technique was used in this type of test for the first time, in order to estimate the specimen stress amplitude response as proportional to the differential displacement measured at the notch section ends. A method based on the use of measurements from accelerometers to correct for the occasional signal dropouts occurring during the experiment is described. The results demonstrate the ability of the test procedure to evaluate the output signal features and therefore to select the most appropriate input signal for the fatigue test.
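The kurtosis parameter that distinguishes Gaussian from burst non-Gaussian excitations can be sketched numerically; the burst model below (a Gaussian background with occasional amplified samples) is an illustrative assumption, not the signal generation method used in the paper:

```python
import numpy as np

def kurtosis(x):
    """Fourth standardized moment; equals 3.0 for a Gaussian signal."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std()
    return np.mean(((x - mu) / sigma) ** 4)

rng = np.random.default_rng(0)
n = 200_000

# Stationary Gaussian excitation: kurtosis close to 3.
gaussian = rng.normal(0.0, 1.0, n)

# Burst non-Gaussian excitation: same background, but 1% of samples are
# amplified fivefold, driving the kurtosis well above 3.
burst = gaussian.copy()
idx = rng.choice(n, size=n // 100, replace=False)
burst[idx] *= 5.0

print(kurtosis(gaussian), kurtosis(burst))
```

A Gaussian signal passed through a linear, lightly damped specimen stays Gaussian, which is why controlling input kurtosis alone does not guarantee a non-Gaussian stress response.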
Uncovering Spatial Variation in Acoustic Environments Using Sound Mapping.
Job, Jacob R; Myers, Kyle; Naghshineh, Koorosh; Gill, Sharon A
2016-01-01
Animals select and use habitats based on environmental features relevant to their ecology and behavior. For animals that use acoustic communication, the sound environment itself may be a critical feature, yet acoustic characteristics are not commonly measured when describing habitats and, as a result, how habitats vary acoustically over space and time is poorly known. Such considerations are timely, given worldwide increases in anthropogenic noise combined with rapidly accumulating evidence that noise hampers the ability of animals to detect and interpret natural sounds. Here, we used microphone arrays to record the sound environment in three terrestrial habitats (forest, prairie, and urban) under ambient conditions and during experimental noise introductions. We mapped sound pressure levels (SPLs) over spatial scales relevant to diverse taxa to explore spatial variation in acoustic habitats and to evaluate the number of microphones needed within arrays to capture this variation under both ambient and noisy conditions. Even at small spatial scales and over relatively short time spans, SPLs varied considerably, especially in forest and urban habitats, suggesting that quantifying and mapping acoustic features could improve habitat descriptions. Subset maps based on input from 4, 8, 12 and 16 microphones differed slightly (< 2 dBA/pixel) from those based on full arrays of 24 microphones under ambient conditions across habitats. Map differences were more pronounced with noise introductions, particularly in forests; maps made from only 4 microphones differed more (> 4 dBA/pixel) from full maps than the remaining subset maps, but maps with input from eight microphones resulted in smaller differences. Thus, acoustic environments varied over small spatial scales, and variation could be mapped with input from 4-8 microphones.
Mapping sound in different environments will improve understanding of acoustic environments and allow us to explore the influence of spatial variation in sound on animal ecology and behavior.
NASA Astrophysics Data System (ADS)
Goode, J. R.; Candelaria, T.; Kramer, N. R.; Hill, A. F.
2016-12-01
As global energy demands increase, generating hydroelectric power by constructing dams and reservoirs on large river systems is increasingly seen as a renewable alternative to fossil fuels, especially in emerging economies. Many large-scale hydropower projects are located in steep mountainous terrain, where environmental factors have the potential to conspire against the sustainability and success of such projects. Because reservoir storage capacity decreases as sediment builds up behind dams, high sediment yields can limit project life expectancy and overall hydropower viability. In addition, episodically delivered sediment from landslides can make quantifying sediment loads difficult. These factors, combined with remote access, limit the critical data needed to effectively evaluate development decisions. In the summer of 2015, we conducted a basic survey to characterize the geomorphology, hydrology and ecology of 620 km of the Rio Maranon, Peru - a major tributary to the Amazon River, which flows north from the semi-arid Peruvian Andes - prior to its dissection by several large hydropower dams. Here we present one component of this larger study: a first-order analysis of potential sediment inputs to the Rio Maranon, Peru. To evaluate sediment delivery and storage in this system, we used high-resolution Google Earth imagery to delineate landslides, combined with high-resolution imagery from a DJI Phantom 3 drone, flown at alluvial fan inputs to the river in the field. Because hillslope-derived sediment inputs from headwater tributaries are important to overall ecosystem health in large river systems, our study has the potential to contribute to understanding the impacts of large Andean dams on sediment connectivity to the Amazon basin.
An Algorithm to Atmospherically Correct Visible and Thermal Airborne Imagery
NASA Technical Reports Server (NTRS)
Rickman, Doug L.; Luvall, Jeffrey C.; Schiller, Stephen; Arnold, James E. (Technical Monitor)
2000-01-01
The program Watts implements a system of physically based models developed by the authors, described elsewhere, for the removal of atmospheric effects in multispectral imagery. The band range we treat covers the visible, near IR and the thermal IR. Input to the program begins with atmospheric models specifying transmittance and path radiance. The system also requires the sensor's spectral response curves and knowledge of the scanner's geometric definition. Radiometric characterization of the sensor during data acquisition is also necessary. While the authors contend that active calibration is critical for serious analytical efforts, we recognize that most remote sensing systems, either airborne or space borne, do not as yet attain that minimal level of sophistication. Therefore, Watts will also use semi-active calibration where necessary and available. All of the input is then reduced to common terms, in terms of the physical units. From this it is then practical to convert raw sensor readings into geophysically meaningful units. There are a large number of intricate details necessary to bring an algorithm of this type to fruition and even to use the program. Further, at this stage of development the authors are uncertain as to the optimal presentation or minimal analytical techniques which users of this type of software must have. Therefore, Watts permits users to break out and analyze the input in various ways. Implemented in REXX under OS/2, the program is designed with attention to the probability that it will be ported to other systems and other languages. Further, as it is in REXX, it is relatively simple for anyone who is literate in any computer language to open the code and modify it to meet their needs. The authors have employed Watts in their research addressing precision agriculture and urban heat islands.
Systems Analysis and Design for Decision Support Systems on Economic Feasibility of Projects
NASA Astrophysics Data System (ADS)
Balaji, S. Arun
2010-11-01
This paper discusses the need for development of Decision Support System (DSS) software for assessing the economic feasibility of projects in Rwanda, Africa. The various economic theories needed and the corresponding formulae to compute the payback period, internal rate of return, and benefit-cost ratio of projects are clearly given in this paper. The paper also presents the systems flow chart needed to build the system in any higher-level computing language. The various input requirements from the projects and the output needed by the decision makers are also included, and the data dictionary used for the input and output data structures is explained.
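The three feasibility measures named above can be sketched in a few lines of code; the cash-flow series and discount rate below are hypothetical illustration values, and a constant annual discount rate is assumed:

```python
import numpy as np

def payback_period(cash_flows):
    """Years until the cumulative (undiscounted) cash flow turns non-negative."""
    cumulative = np.cumsum(cash_flows)
    year = int(np.argmax(cumulative >= 0))
    if cumulative[year] < 0:
        raise ValueError("investment never recovered")
    return year

def irr(cash_flows):
    """Internal rate of return: the r for which NPV(r) = 0.

    NPV is a polynomial in x = 1/(1+r), so its roots give candidate IRRs.
    """
    # np.roots wants the highest-degree coefficient first: c_n x^n + ... + c_0.
    roots = np.roots(cash_flows[::-1])
    real = roots[np.isreal(roots)].real
    x = real[real > 0]
    return 1.0 / x[0] - 1.0

def benefit_cost_ratio(cash_flows, rate):
    """Present value of benefits (inflows) over present value of costs."""
    t = np.arange(len(cash_flows))
    pv = np.asarray(cash_flows, dtype=float) / (1.0 + rate) ** t
    return pv[pv > 0].sum() / -pv[pv < 0].sum()

# Hypothetical project: 1000 invested now, 500 returned each year for 3 years.
cf = [-1000.0, 500.0, 500.0, 500.0]
print(payback_period(cf))              # recovered after 2 years
print(irr(cf))                         # ~23% per year
print(benefit_cost_ratio(cf, 0.10))    # ~1.24 at a 10% discount rate
```
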
Development of an ecohydrological salt marsh model
Terrestrial nitrogen input to coastal waters is a critical water quality problem nationwide. Even in systems well described experimentally, a clear understanding of process-level hydrological and biogeochemical controls can be difficult to ascertain from data alone. For examp...
Evaluation of FEM engineering parameters from insitu tests
DOT National Transportation Integrated Search
2001-12-01
The study looked critically at insitu test methods (SPT, CPT, DMT, and PMT) as a means for developing finite element constitutive model input parameters. The first phase of the study examined insitu test derived parameters with laboratory triaxial te...
78 FR 45598 - Industry Forums on the Next ITS Strategic Plan; Notice of Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-29
... of critical thinking tools designed to draw out information to identify and validate focus areas for... inputs through an online tool, IdeaScale at http://itsstrategicplan.ideascale.com . The first facilitated...
NASA Technical Reports Server (NTRS)
Young, J. W.; Schy, A. A.; Johnson, K. G.
1977-01-01
An analytical method has been developed for predicting critical control inputs for which nonlinear rotational coupling may cause sudden jumps in aircraft response. The analysis includes the effect of aerodynamics which are nonlinear in angle of attack. The method involves the simultaneous solution of two polynomials in roll rate, whose coefficients are functions of angle of attack and the control inputs. Results obtained using this procedure are compared with calculated time histories to verify the validity of the method for predicting jump-like instabilities.
Sensitivity of Rainfall-runoff Model Parametrization and Performance to Potential Evaporation Inputs
NASA Astrophysics Data System (ADS)
Jayathilake, D. I.; Smith, T. J.
2017-12-01
Many watersheds of interest are confronted with insufficient data and poor process understanding. Therefore, understanding the relative importance of input data types and the impact of different qualities on model performance, parameterization, and fidelity is critically important to improving hydrologic models. In this paper, the change in model parameterization and performance are explored with respect to four different potential evapotranspiration (PET) products of varying quality. For each PET product, two widely used, conceptual rainfall-runoff models are calibrated with multiple objective functions to a sample of 20 basins included in the MOPEX data set and analyzed to understand how model behavior varied. Model results are further analyzed by classifying catchments as energy- or water-limited using the Budyko framework. The results demonstrated that model fit was largely unaffected by the quality of the PET inputs. However, model parameterizations were clearly sensitive to PET inputs, as their production parameters adjusted to counterbalance input errors. Despite this, changes in model robustness were not observed for either model across the four PET products, although robustness was affected by model structure.
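The Budyko-style screening used to separate the basins can be sketched as follows; the aridity-index threshold of 1 and the sample basin values are illustrative assumptions, not data from the MOPEX sample:

```python
def classify_budyko(mean_annual_pet, mean_annual_precip):
    """Classify a basin by its aridity index, PET/P.

    PET/P > 1: atmospheric demand exceeds supply -> water-limited.
    PET/P < 1: supply exceeds demand             -> energy-limited.
    """
    aridity = mean_annual_pet / mean_annual_precip
    regime = "water-limited" if aridity > 1.0 else "energy-limited"
    return aridity, regime

# Hypothetical basins (mm/yr), for illustration only.
basins = {
    "humid_basin": (700.0, 1400.0),     # (PET, P)
    "semiarid_basin": (1500.0, 600.0),
}

for name, (pet, p) in basins.items():
    aridity, regime = classify_budyko(pet, p)
    print(f"{name}: PET/P = {aridity:.2f} -> {regime}")
```

Errors in the PET product shift the aridity index directly, which is one reason parameterizations compensate differently in water-limited versus energy-limited basins.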
Analysis and Simple Circuit Design of Double Differential EMG Active Electrode.
Guerrero, Federico Nicolás; Spinelli, Enrique Mario; Haberman, Marcelo Alejandro
2016-06-01
In this paper we present an analysis of the voltage amplifier needed for double differential (DD) sEMG measurements and a novel, very simple circuit for implementing DD active electrodes. The three-input amplifier that standalone DD active electrodes require is inherently different from a differential amplifier, and general knowledge about its design is scarce in the literature. First, the figures of merit of the amplifier are defined through a decomposition of its input signal into three orthogonal modes. This analysis reveals a mode containing EMG crosstalk components that the DD electrode should reject. Then, the effect of finite input impedance is analyzed. Because there are three terminals, minimum bounds for interference rejection ratios due to electrode and input impedance unbalances with two degrees of freedom are obtained. Finally, a novel circuit design is presented, including only a quadruple operational amplifier and a few passive components. This design is nearly as simple as the branched electrode and much simpler than the three instrumentation amplifier design, while providing robust EMG crosstalk rejection and better input impedance using unity gain buffers for each electrode input. The interference rejection limits of this input stage are analyzed. An easily replicable implementation of the proposed circuit is described, together with a parameter design guideline to adjust it to specific needs. The electrode is compared with the established alternatives, and sample sEMG signals are obtained, acquired on different body locations with dry contacts, successfully rejecting interference sources.
Meeting the challenge of food and energy security.
Karp, Angela; Richter, Goetz M
2011-06-01
Growing crops for bioenergy or biofuels is increasingly viewed as conflicting with food production. However, energy use continues to rise and food production requires fuel inputs, which have increased with intensification. Focussing on the question of food or fuel is thus not helpful. The bigger, more pertinent, challenge is how the increasing demands for food and energy can be met in the future, particularly when water and land availability will be limited. Energy crop production systems differ greatly in environmental impact. The use of high-input food crops for liquid transport fuels (first-generation biofuels) needs to be phased out and replaced by the use of crop residues and low-input perennial crops (second/advanced-generation biofuels) with multiple environmental benefits. More research effort is needed to improve yields of biomass crops grown on lower grade land, and maximum value should be extracted through the exploitation of co-products and integrated biorefinery systems. Policy must continually emphasize the changes needed and tie incentives to improved greenhouse gas reduction and environmental performance of biofuels.
Short- and long-term impact of critical illness on relatives: literature review.
Paul, Fiona; Rattray, Janice
2008-05-01
This paper is a report of a literature review undertaken to identify the short- and long-term impact of critical illness on relatives. Patients in intensive care can experience physical and psychological consequences, and their relatives may also experience such effects. Although it is recognized that relatives have specific needs, it is not clear whether these needs are always met and whether further support is required, particularly after intensive care. The following databases were searched for the period 1950-2007: Medline, British Nursing Index and Archive, EMBASE, CINAHL, PsycINFO and EBM Reviews--Cochrane Central Register of Clinical Trials. Search terms focused on adult relatives of critically ill adult patients during and after intensive care. Recurrent topics were categorized to structure the review, i.e. 'relatives' needs', 'meeting relatives' needs', 'interventions', 'satisfaction', 'psychological outcomes' and 'coping'. Studies have mainly identified relatives' immediate needs using the Critical Care Family Needs Inventory. There are few studies of interventions to meet relatives' needs and the short- and long-term effects of critical illness on relatives. Despite widespread use of the Critical Care Family Needs Inventory, factors such as local or cultural differences may influence relatives' needs. Relatives may also have unidentified needs, and these needs should be explored. Limited research has been carried out into interventions to meet relatives' needs and the effects of critical illness on their well-being, yet some relatives may experience negative psychological consequences far beyond the acute phase of the illness.
Input filter compensation for switching regulators
NASA Technical Reports Server (NTRS)
Lee, F. C.; Kelkar, S. S.
1982-01-01
The problems caused by the interaction between the input filter, output filter, and the control loop are discussed. The input filter design is made more complicated because of the need to avoid performance degradation and also stay within the weight and loss limitations. Conventional input filter design techniques are then discussed. The concept of pole zero cancellation is reviewed; this concept is the basis for an approach to control the peaking of the output impedance of the input filter and thus mitigate some of the problems caused by the input filter. The proposed approach for control of the peaking of the output impedance of the input filter is to use a feedforward loop working in conjunction with feedback loops, thus forming a total state control scheme. The design of the feedforward loop for a buck regulator is described. A possible implementation of the feedforward loop design is suggested.
Nonsinusoidal Beta Oscillations Reflect Cortical Pathophysiology in Parkinson's Disease.
Cole, Scott R; van der Meij, Roemer; Peterson, Erik J; de Hemptinne, Coralie; Starr, Philip A; Voytek, Bradley
2017-05-03
Oscillations in neural activity play a critical role in neural computation and communication. There is intriguing new evidence that the nonsinusoidal features of the oscillatory waveforms may inform underlying physiological and pathophysiological characteristics. Time-domain waveform analysis approaches stand in contrast to traditional Fourier-based methods, which alter or destroy subtle waveform features. Recently, it has been shown that the waveform features of oscillatory beta (13-30 Hz) events, a prominent motor cortical oscillation, may reflect near-synchronous excitatory synaptic inputs onto cortical pyramidal neurons. Here we analyze data from invasive human primary motor cortex (M1) recordings from patients with Parkinson's disease (PD) implanted with a deep brain stimulator (DBS) to test the hypothesis that the beta waveform becomes less sharp with DBS, suggesting that M1 input synchrony may be decreased. We find that, in PD, M1 beta oscillations have sharp, asymmetric, nonsinusoidal features, specifically asymmetries in the ratio between the sharpness of the beta peaks compared with the troughs. This waveform feature is nearly perfectly correlated with beta-high gamma phase-amplitude coupling (r = 0.94), a neural index previously shown to track PD-related motor deficit. Our results suggest that the pathophysiological beta generator is altered by DBS, smoothing out the beta waveform. This has implications not only for the interpretation of the physiological mechanism by which DBS reduces PD-related motor symptoms, but more broadly for our analytic toolkit in general. That is, the often-overlooked time-domain features of oscillatory waveforms may carry critical physiological information about neural processes and dynamics. SIGNIFICANCE STATEMENT To better understand the neural basis of cognition and disease, we need to understand how groups of neurons interact to communicate with one another.
For example, there is evidence that parkinsonian bradykinesia and rigidity may arise from an oversynchronization of afferents to the motor cortex, and that these symptoms are treatable using deep brain stimulation. Here we show that the waveform shape of beta (13-30 Hz) oscillations, which may reflect input synchrony onto the cortex, is altered by deep brain stimulation. This suggests that mechanistic inferences regarding physiological and pathophysiological neural communication may be made from the temporal dynamics of oscillatory waveform shape. Copyright © 2017 the authors 0270-6474/17/374830-11$15.00/0.
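The peak-versus-trough sharpness ratio described above can be sketched as follows. The extremum detection and the fixed-width sharpness window are one common formulation; the paper's exact window and preprocessing are assumptions here.

```python
import numpy as np

def local_extrema(x):
    """Indices of simple local maxima and minima of a 1-D signal."""
    mid = x[1:-1]
    peaks = np.where((mid > x[:-2]) & (mid > x[2:]))[0] + 1
    troughs = np.where((mid < x[:-2]) & (mid < x[2:]))[0] + 1
    return peaks, troughs

def sharpness(x, idx, width=5):
    """Mean deviation of each extremum from samples `width` steps away
    (one common sharpness definition; the exact window is assumed)."""
    idx = idx[(idx >= width) & (idx < len(x) - width)]
    return np.mean(np.abs(2 * x[idx] - x[idx - width] - x[idx + width]))

def peak_trough_ratio(x, width=5):
    """Ratio of mean peak sharpness to mean trough sharpness;
    approximately 1 for a sinusoid, > 1 for sharp-peaked waveforms."""
    peaks, troughs = local_extrema(x)
    return sharpness(x, peaks, width) / sharpness(x, troughs, width)
```

For a pure sinusoid the ratio is close to 1 by symmetry, while a waveform with sharp peaks and broad troughs (e.g. exp(sin t)) yields a ratio well above 1.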
Input Scanners: A Growing Impact In A Diverse Marketplace
NASA Astrophysics Data System (ADS)
Marks, Kevin E.
1989-08-01
Just as newly invented photographic processes revolutionized the printing industry at the turn of the century, electronic imaging has affected almost every computer application today. To completely emulate traditional mechanical means of information handling, computer-based systems must be able to capture graphic images. Thus, there is a widespread need for the electronic camera, the digitizer, the input scanner. This paper will review how various types of input scanners are being used in many diverse applications. The following topics will be covered: - Historical overview of input scanners - New applications for scanners - Impact of scanning technology on select markets - Scanning systems issues
Reliability of Beam Loss Monitor Systems for the Large Hadron Collider
NASA Astrophysics Data System (ADS)
Guaglio, G.; Dehning, B.; Santoni, C.
2005-06-01
The increase of beam energy and beam intensity, together with the use of superconducting magnets, opens new failure scenarios and brings new criticalities for the whole accelerator protection system. For the LHC beam loss protection system, the failure rate and the availability requirements have been evaluated using the Safety Integrity Level (SIL) approach. A downtime cost evaluation is used as input for the SIL approach. The most critical systems, which contribute to the final SIL value, are the dump system, the interlock system, the beam loss monitors system, and the energy monitor system. The Beam Loss Monitors System (BLMS) is critical for short and intense particle losses at 7 TeV and is assisted by the Fast Beam Current Decay Monitors at 450 GeV. At medium and longer loss timescales it is assisted by other systems, such as the quench protection system and the cryogenic system. For BLMS, hardware and software have been evaluated in detail. The reliability input figures have been collected using historical data from the SPS, using temperature and radiation damage experimental data, as well as using standard databases. All the data has been processed by reliability software (Isograph). The analysis spans from component data to the system configuration.
MediLink: a wearable telemedicine system for emergency and mobile applications.
Koval, T; Dudziak, M
1999-01-01
The practical needs of the medical professional faced with critical care or emergency situations differ from those working in many environments where telemedicine and mobile computing have been introduced and tested. One constructive criticism of the telemedicine initiative has been to question what positive benefits are gained from videoconferencing, paperless transactions, and online access to patient records. With a goal of producing a positive answer to such questions, an architecture for multipurpose mobile telemedicine applications has been developed. The core technology is based upon a wearable personal computer with a smart-card interface coupled with speech, pen, video input and wireless intranet connectivity. The TransPAC system with the MediLink software system is designed to provide an integrated solution for a broad range of health care functions where mobile and hands-free or limited-access systems are preferred or necessary and where the capabilities of other mobile devices are insufficient or inappropriate. Structured and noise-resistant speech-to-text interfacing plus the use of a web browser-like display, accessible through either a flatpanel, standard, or headset monitor, gives the beltpack TransPAC computer the functions of a complete desktop, including PCMCIA card interfaces for internet connectivity and a secure smartcard with 16-bit microprocessor and upwards of 64K memory. The card acts to provide user access control for security, user custom configuration of applications, display, and vocabulary, and memory to diminish the need for PC-server communications while in an active session. TransPAC is being implemented for EMT and ER staff usage.
Snyder, Claire F.; Jensen, Roxanne E.; Segal, Jodi B.; Wu, Albert W.
2013-01-01
Patient-centered outcomes research (PCOR) aims to improve care quality and patient outcomes by providing information that patients, clinicians, and family members need regarding treatment alternatives, and emphasizing patient input to inform the research process. PCOR capitalizes on available data sources and generates new evidence to provide timely and relevant information and can be conducted using prospective data collection, disease registries, electronic medical records, aggregated results from prior research, and administrative claims. Given PCOR’s emphasis on the patient perspective, methods to incorporate patient-reported outcomes (PROs) are critical. PROs are defined by the U.S. Food & Drug Administration as “Any report coming directly from patients… about a health condition and its treatment.” However, PROs have not routinely been collected in a way that facilitates their use in PCOR. Electronic medical records, disease registries, and administrative data have only rarely collected, or been linked to, PROs. Recent technological developments facilitate the electronic collection of PROs and linkage of PRO data, offering new opportunities for putting the patient perspective in PCOR. This paper describes the importance of and methods for using PROs for PCOR. We (1) define PROs; (2) identify how PROs can be used in PCOR, and the critical role of electronic data methods for facilitating the use of PRO data in PCOR; (3) outline the challenges and key unanswered questions that need to be addressed for the routine use of PROs in PCOR; and (4) discuss policy and research interventions to accelerate the integration of PROs with clinical data. PMID:23774513
Assessing risk based on uncertain avalanche activity patterns
NASA Astrophysics Data System (ADS)
Zeidler, Antonia; Fromm, Reinhard
2015-04-01
Avalanches may affect critical infrastructure and may cause great economic losses. The planning horizon of infrastructures, e.g. hydropower generation facilities, reaches well into the future. Based on the results of previous studies on the effect of changing meteorological parameters (precipitation, temperature) on avalanche activity, we assume that there will be a change in the risk pattern in future. Decision makers need to understand what the future might bring to best formulate their mitigation strategies. Therefore, we explore commercial risk software to calculate risk for the coming years that might help in decision processes. The software @RISK is known to many larger companies, and therefore we explore its capabilities to include avalanche risk simulations in order to guarantee comparability of different risks. In a first step, we develop a model for a hydropower generation facility that reflects the problem of changing avalanche activity patterns in future by selecting relevant input parameters and assigning likely probability distributions. The uncertain input variables include the probability of avalanches affecting an object, the vulnerability of an object, the expected costs for repairing the object and the expected cost due to interruption. The crux is to find the distribution that best represents the input variables under changing meteorological conditions. Our focus is on including the uncertain probability of avalanches based on the analysis of past avalanche data and expert knowledge. In order to explore different likely outcomes, we base the analysis on three different climate scenarios (likely, worst case, baseline). For some variables, it is possible to fit a distribution to historical data, whereas in cases where the past dataset is insufficient or not available the software allows selecting from over 30 different distribution types.
The Monte Carlo simulation draws from the probability distributions of the uncertain variables, using all valid combinations of input-variable values to simulate the range of possible outcomes. In our case the output is the expected risk (Euro/year) for each object (e.g. water intake) considered and for the entire hydropower generation system. The output is again a distribution that is interpreted by the decision makers, as the final strategy depends on the needs and requirements of the end-user, which may be driven by personal preferences. In this presentation, we will show how we used the uncertain information on future avalanche activity in commercial risk software, thereby bringing the knowledge of natural hazard experts to decision makers.
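The Monte Carlo step above — sampling uncertain hit probability, vulnerability, and costs to estimate expected annual risk per object — can be sketched as follows. All distributions and parameter values here are illustrative assumptions, not the study's fitted inputs or the @RISK implementation.

```python
import random

def expected_annual_risk(n_trials=20_000, seed=42):
    """Monte Carlo sketch of expected avalanche risk (EUR/year) for a
    single object: risk = P(hit) * vulnerability * (repair + interruption),
    averaged over sampled scenarios. Distributions are assumptions."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        p_hit = rng.betavariate(2, 50)                # uncertain annual hit probability
        vulnerability = rng.uniform(0.1, 0.6)         # fraction of value lost per hit
        repair = rng.triangular(5e4, 5e5, 1.5e5)      # repair cost, EUR
        interruption = rng.triangular(1e4, 2e5, 5e4)  # interruption cost, EUR
        total += p_hit * vulnerability * (repair + interruption)
    return total / n_trials
```

Summing such per-object estimates gives the risk distribution for the whole hydropower system; swapping in distributions fitted to historical avalanche data or expert elicitation reproduces the workflow described above.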
NASA Technical Reports Server (NTRS)
Williams-Byrd, Julie; Arney, Dale C.; Hay, Jason; Reeves, John D.; Craig, Douglas
2016-01-01
NASA is transforming human spaceflight. The Agency is shifting from an exploration-based program with human activities in low Earth orbit (LEO) and targeted robotic missions in deep space to a more sustainable and integrated pioneering approach. Through pioneering, NASA seeks to address national goals to develop the capacity for people to work, learn, operate, live, and thrive safely beyond Earth for extended periods of time. However, pioneering space involves daunting technical challenges of transportation, maintaining health, and enabling crew productivity for long durations in remote, hostile, and alien environments. Prudent investments in capability and technology developments, based on mission need, are critical for enabling a campaign of human exploration missions. There are a wide variety of capabilities and technologies that could enable these missions, so it is a major challenge for NASA's Human Exploration and Operations Mission Directorate (HEOMD) to make knowledgeable portfolio decisions. It is critical for this pioneering initiative that these investment decisions are informed with a prioritization process that is robust and defensible. It is NASA's role to invest in targeted technologies and capabilities that would enable exploration missions even though specific requirements have not been identified. To inform these investment decisions, NASA's HEOMD has supported a variety of analysis activities that prioritize capabilities and technologies. These activities are often based on input from subject matter experts within the NASA community who understand the technical challenges of enabling human exploration missions. This paper will review a variety of processes and methods that NASA has used to prioritize and rank capabilities and technologies applicable to human space exploration. The paper will show the similarities in the various processes and showcase instances where customer-specified priorities force modifications to the process.
Specifically, this paper will describe the processes that the NASA Langley Research Center (LaRC) Technology Assessment and Integration Team (TAIT) has used for several years and how those processes have been customized to meet customer needs while staying robust and defensible. This paper will show how HEOMD uses these analysis results to assist with making informed portfolio investment decisions. The paper will also highlight which human exploration capabilities and technologies typically rank high regardless of the specific design reference mission. The paper will conclude by describing future capability and technology ranking activities that will continue to leverage subject matter expert (SME) input while also incorporating more model-based analysis.
AGING AND THE ENVIRONMENT: A RESEARCH FRAMEWORK.
This manuscript discusses the development of a research program on health effects and environmental exposures to older adults. It summarizes input to this process from experts and the public, and outlines the critical elements necessary to fully address issues of environmental p...
FACTORS INFLUENCING TOTAL DIETARY EXPOSURE OF YOUNG CHILDREN
A deterministic model was developed to identify critical input parameters to assess dietary intake of young children. The model was used as a framework for understanding important factors in data collection and analysis. Factors incorporated included transfer efficiencies of pest...
LANDSCAPE INDICATORS OF SURFACE WATER CONDITIONS
This task comprises three inter-related projects: 1) impervious surface mapping and evaluation of its impact; 2) detection of BMPs and estimation of their ability to reduce nutrient input into streams, and; 3) detection of isolated wetlands. Each subtask addresses critical is...
High-speed asynchronous data multiplexer/demultiplexer for high-density digital recorders
NASA Astrophysics Data System (ADS)
Berdugo, Albert; Small, Martin B.
1996-11-01
Modern High Density Digital Recorders are ideal devices for the storage of large amounts of digital and/or wideband analog data. Ruggedized versions of these recorders are currently available and are supporting many military and commercial flight test applications. However, in certain cases, the storage format becomes very critical, e.g., when a large number of data types are involved, or when channel-to-channel correlation is critical, or when the original data source must be accurately recreated during post mission analysis. A properly designed storage format will not only preserve data quality, but will yield the maximum storage capacity and record time for any given recorder family or data type. This paper describes a multiplex/demultiplex technique that formats multiple high speed data sources into a single, common format for recording. The method is compatible with many popular commercial recorder standards such as DCRsi, VLDS, and DLT. Types of input data typically include PCM, wideband analog data, video, aircraft data buses, avionics, voice, time code, and many others. The described method preserves tight data correlation with minimal data overhead. The described technique supports full reconstruction of the original input signals during data playback. Simultaneous real-time data recording and reconstruction are also supported. Output data correlation across channels is preserved for all types of data inputs.
Optimized distributed computing environment for mask data preparation
NASA Astrophysics Data System (ADS)
Ahn, Byoung-Sup; Bang, Ju-Mi; Ji, Min-Kyu; Kang, Sun; Jang, Sung-Hoon; Choi, Yo-Han; Ki, Won-Tai; Choi, Seong-Woon; Han, Woo-Sung
2005-11-01
As the critical dimension (CD) becomes smaller, various resolution enhancement techniques (RET) are widely adopted. In developing sub-100nm devices, the complexity of optical proximity correction (OPC) is severely increased and applied OPC layers are expanded to non-critical layers. The transformation of designed pattern data by the OPC operation adds complexity, which causes runtime overheads in subsequent steps such as mask data preparation (MDP) and collapses the existing design hierarchy. Therefore, many mask shops exploit the distributed computing method in order to reduce the runtime of mask data preparation rather than exploit the design hierarchy. Distributed computing uses a cluster of computers that are connected to a local network system. However, there are two things that limit the benefit of the distributed computing method in MDP. First, every sequential MDP job that uses the maximum number of available CPUs is not efficient compared to parallel MDP job execution, due to the input data characteristics. Second, the runtime enhancement over input cost is not sufficient, since the scalability of fracturing tools is limited. In this paper, we will discuss an optimum load balancing environment that is useful in increasing the uptime of a distributed computing system by assigning an appropriate number of CPUs for each input design data. We will also describe the distributed processing (DP) parameter optimization to obtain maximum throughput in MDP job processing.
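The load-balancing idea above — assigning CPUs per MDP job in proportion to its input data size rather than giving every sequential job the maximum — can be sketched as follows. The proportional-rounding scheme is an illustrative assumption, not the paper's optimization method.

```python
def assign_cpus(input_sizes, total_cpus):
    """Proportional CPU assignment for parallel MDP jobs: each job gets
    CPUs roughly in proportion to its input data size (at least one each
    where the budget allows). Rounding scheme is an assumption."""
    total = sum(input_sizes)
    alloc = [max(1, round(total_cpus * s / total)) for s in input_sizes]
    while sum(alloc) > total_cpus:   # trim the largest share first
        alloc[alloc.index(max(alloc))] -= 1
    while sum(alloc) < total_cpus:   # top up the smallest share first
        alloc[alloc.index(min(alloc))] += 1
    return alloc
```

For instance, three jobs with input sizes 100, 300, and 600 units sharing a 10-CPU cluster would receive 1, 3, and 6 CPUs respectively, so the largest fracturing job no longer serializes the cluster.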
NASA Astrophysics Data System (ADS)
Cicuttin, Andres; Colavita, Alberto; Cerdeira, Alberto; Fratnik, Fabio; Vacchi, Andrea
1997-02-01
In this report we describe a mixed analog-digital integrated circuit (IC) designed as the front-end electronics for silicon strip-detectors for space applications. In space, power consumption, compactness, and robustness become critical constraints for pre-amplifier design. The IC is a prototype with 32 complete channels, and it is intended for a large area particle tracker of a new generation of gamma ray telescopes. Each channel contains a charge sensitive amplifier, a pulse shaper, a discriminator and two digital buffers. The reference trip point of the discriminator is adjustable. This chip also has a custom PMOSFET transistor per channel, included in order to provide the high dynamic resistance needed to reverse-bias the strip diode. The digital part of the chip is used to store and serially shift out the state of the channels. There is also a storage buffer that allows the disabling of non-functioning channels if required by the data acquisition system. An input capacitance of 30 pF introduced at the input of the front-end produces less than 1000 electrons of RMS equivalent noise charge (ENC), for a total power dissipation of only 60 μW per channel. The chip was made using Orbit's 1.2 μm double poly, double metal n-well low noise CMOS process. The dimensions of the IC are 2400 μm × 8840 μm.
State of the art in nuclear telerobotics: focus on the man/machine connection
NASA Astrophysics Data System (ADS)
Greaves, Amna E.
1995-12-01
The interface between the human controller and remotely operated device is the crux of telerobotic investigation today. This human-to-machine connection is the means by which we communicate our commands to the device, as well as the medium for decision-critical feedback to the operator. The amount of information transferred through the user interface is growing. This can be seen as a direct result of our need to support added complexities, as well as a rapidly expanding domain of applications. A user interface, or UI, is therefore subject to increasing demands to present information in a meaningful manner to the user. Virtual reality and multi-degree-of-freedom input devices lend us the ability to augment the man/machine interface and handle burgeoning amounts of data in a more intuitive and anthropomorphically correct manner. Along with the aid of 3-D input and output devices, there are several visual tools that can be employed as part of a graphical UI that enhance and accelerate our comprehension of the data being presented. Thus an advanced UI that features these improvements would reduce the amount of fatigue on the teleoperator, increase his level of safety, facilitate learning, augment his control, and potentially reduce task time. This paper investigates the cutting edge concepts and enhancements that lead to the next generation of telerobotic interface systems.
NASA Astrophysics Data System (ADS)
Wang, Zhenming; Shi, Baoping; Kiefer, John D.; Woolery, Edward W.
2004-06-01
Musson's comments on our article, "Communicating with uncertainty: A critical issue with probabilistic seismic hazard analysis," are an example of myths and misunderstandings. We did not say that probabilistic seismic hazard analysis (PSHA) is a bad method, but we did say that it has some limitations that have significant implications. Our response to these comments follows. There is no consensus on exactly how to select seismological parameters and to assign weights in PSHA. This was one of the conclusions reached by a senior seismic hazard analysis committee [SSHAC, 1997] that included C. A. Cornell, founder of the PSHA methodology. The SSHAC report was reviewed by a panel of the National Research Council and was well accepted by seismologists and engineers. As an example of the lack of consensus, Toro and Silva [2001] produced seismic hazard maps for the central United States region that are quite different from those produced by Frankel et al. [2002] because they used different input seismological parameters and weights (see Table 1). We disagree with Musson's conclusion that "because a method may be applied badly on one occasion does not mean the method itself is bad." We do not say that the method is poor, but rather that those who use PSHA need to document their inputs and communicate them fully to the users. It seems that Musson is trying to create myth by suggesting his own methods should be used.
Robustness analysis of elastoplastic structure subjected to double impulse
NASA Astrophysics Data System (ADS)
Kanno, Yoshihiro; Takewaki, Izuru
2016-11-01
The double impulse has extensively been used to evaluate the critical response of an elastoplastic structure against a pulse-type input, including near-fault earthquake ground motions. In this paper, we propose a robustness assessment method for elastoplastic single-degree-of-freedom structures subjected to the double impulse input. Uncertainties in the initial velocity of the input, as well as the natural frequency and the strength of the structure, are considered. As fundamental properties of the structural robustness, we show monotonicity of the robustness measure with respect to the natural frequency. In contrast, we show that robustness is not necessarily improved even if the structural strength is increased. Moreover, the robustness preference between two structures with different values of structural strength can possibly reverse when the performance requirement is changed.
NASA Astrophysics Data System (ADS)
Collins, J. A.; Oldenburg, J.; Liu, M.; Pulsifer, P. L.; Kaufman, M.; Eicken, H.; Parsons, M. A.
2012-12-01
Knowledge of sea ice is critical to the hunting, whaling, and cultural activities of many Indigenous communities in Northern and Western Alaska. Experienced hunters have monitored seasonal changes of the sea ice over many years, giving them a unique expertise in assessing the current state of the sea ice as well as any anomalies in seasonal sea ice conditions. The Seasonal Ice Zone Observing Network (SIZONet), in collaboration with the Exchange for Local Observations and Knowledge of the Arctic (ELOKA), has developed an online application for collecting, storing, and analyzing sea ice observations contributed by local experts from coastal Alaskan communities. Here we present the current iteration of the application, outline future plans and discuss how the development process and resulting system have improved our collective understanding of sea ice processes and changes. The SIZONet application design is based on the needs of the research scientists responsible for entering observation data into the database, the needs of local sea ice experts contributing their observations and knowledge, and the information needs of Alaska coastal communities. Entry forms provide a variety of input methods, including menus, check boxes, and free text input. Input options strive to balance flexibility in capturing concepts and details with the need for analytical consistency. Currently, research staff at the University of Alaska Fairbanks use the application to enter observations received via written or electronic communications from local sea ice experts. Observation data include current weather conditions, snow and ice quantity and quality, and wildlife sighted or taken. Future plans call for direct use of the SIZONet interface by local sea ice experts as well as students, both as contributors to the data collection and as users seeking meaning in the data. 
This functionality is currently available to a limited number of community members as we extend the application to support specific roles for particular users (or groups of users); this role-based access will be necessary to support a diverse user population while maintaining the integrity of the data and protecting personal information, or the location of sensitive sites, captured in the data records. Additionally, future improvements to the interface will include the ability to upload photos and videos to capture visual records of the environment. The SIZONet application was developed to provide a robust interface for working with observational data. The contributed nature of the data, however, presents a unique set of collaborative benefits and challenges as we work towards the final implementation of the application. The successful partnership supporting the observation network is a direct function of the long-term relationships established between university-based researchers and community members.
Staib, Jennifer M; Della Valle, Rebecca; Knox, Dayan K
2018-07-01
In classical fear conditioning, a neutral conditioned stimulus (CS) is paired with an aversive unconditioned stimulus (US), which leads to a fear memory. If the CS is repeatedly presented without the US after fear conditioning, the formation of an extinction memory occurs, which inhibits fear memory expression. A previous study has demonstrated that selective cholinergic lesions in the medial septum and vertical limb of the diagonal bands of Broca (MS/vDBB) prior to fear and extinction learning disrupt contextual fear memory discrimination and acquisition of extinction memory. MS/vDBB cholinergic neurons project to a number of substrates that are critical for fear and extinction memory. However, it is currently unknown which of these efferent projections are critical for contextual fear memory discrimination and extinction memory. To address this, we induced cholinergic lesions in efferent targets of MS/vDBB cholinergic neurons. These included the dorsal hippocampus (dHipp), ventral hippocampus (vHipp), medial prefrontal cortex (mPFC), and in the mPFC and dHipp combined. None of these lesion groups exhibited deficits in contextual fear memory discrimination or extinction memory. However, vHipp cholinergic lesions disrupted auditory fear memory. Because MS/vDBB cholinergic neurons are the sole source of acetylcholine in the vHipp, these results suggest that MS/vDBB cholinergic input to the vHipp is critical for auditory fear memory. Taken together with previous findings, the results of this study suggest that MS/vDBB cholinergic neurons are critical for fear and extinction memory, though further research is needed to elucidate the role of MS/vDBB cholinergic neurons in these types of emotional memory. Copyright © 2018 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, T.
2000-07-01
The Write One, Run Many (WORM) site (worm.csirc.net) is the on-line home of the WORM language and is hosted by the Criticality Safety Information Resource Center (CSIRC) (www.csirc.net). The purpose of this web site is to create an on-line community for WORM users to gather, share, and archive WORM-related information. WORM is an embedded, functional programming language designed to facilitate the creation of input decks for computer codes that take standard ASCII text files as input. A functional programming language is one that emphasizes the evaluation of expressions, rather than execution of commands. The simplest and perhaps most common example of a functional language is a spreadsheet such as Microsoft Excel. The spreadsheet user specifies expressions to be evaluated, while the spreadsheet itself determines the commands to execute, as well as the order of execution/evaluation. WORM functions in a similar fashion and, as a result, is very simple to use and easy to learn. WORM improves the efficiency of today's criticality safety analyst by allowing: (1) input decks for parameter studies to be created quickly and easily; (2) calculations and variables to be embedded into any input deck, thus allowing for meaningful parameter specifications; (3) problems to be specified using any combination of units; and (4) complex mathematically defined models to be created. WORM is completely written in Perl. Running on all variants of UNIX, Windows, MS-DOS, MacOS, and many other operating systems, Perl is one of the most portable programming languages available. As such, WORM works on practically any computer platform.
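The embedded-expression idea can be sketched in a few lines. The following is a minimal Python analogue, not WORM itself: WORM is written in Perl, and the `{expr}` placeholder syntax and the `expand_deck` helper here are invented for illustration.

```python
import re

# Minimal Python analogue of an embedded, expression-evaluating input-deck
# preprocessor. The {expr} placeholder syntax is hypothetical, not WORM's.

def expand_deck(template, variables):
    """Replace each {expr} placeholder with the evaluated expression."""
    def _eval(match):
        # Evaluate with no builtins; only the study variables are visible.
        return str(eval(match.group(1), {"__builtins__": {}}, variables))
    return re.sub(r"\{([^{}]+)\}", _eval, template)

deck = "radius {r}\nvolume {4.0/3.0 * 3.14159265 * r**3}\n"

# A parameter study: one expanded deck per radius value.
decks = [expand_deck(deck, {"r": r}) for r in (1.0, 2.0)]
print(decks[0].splitlines()[0])   # radius 1.0
```

Varying only the `variables` dictionary regenerates a whole family of decks, which is the efficiency gain that point (1) in the abstract describes.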
Local and Long-Range Circuit Connections to Hilar Mossy Cells in the Dentate Gyrus
Sun, Yanjun; Grieco, Steven F.; Holmes, Todd C.
2017-01-01
Hilar mossy cells are the prominent glutamatergic cell type in the dentate hilus of the dentate gyrus (DG); they have been proposed to have critical roles in the DG network. To better understand how mossy cells contribute to DG function, we have applied new viral genetic and functional circuit mapping approaches to quantitatively map and compare local and long-range circuit connections of mossy cells and dentate granule cells in the mouse. The great majority of inputs to mossy cells consist of two parallel inputs from within the DG: an excitatory input pathway from dentate granule cells and an inhibitory input pathway from local DG inhibitory neurons. Mossy cells also receive a moderate degree of excitatory and inhibitory CA3 input from proximal CA3 subfields. Long-range inputs to mossy cells are numerically sparse, and they are readily identified only from the medial septum and the septofimbrial nucleus. In comparison, dentate granule cells receive most of their inputs from the entorhinal cortex. The granule cells receive significant synaptic inputs from the hilus and the medial septum, and they also receive direct inputs from both distal and proximal CA3 subfields, which has been underdescribed in the existing literature. Our slice-based physiological mapping studies further supported the identified circuit connections of mossy cells and granule cells. Together, our data suggest that hilar mossy cells are major local circuit integrators and they exert modulation of the activity of dentate granule cells as well as the CA3 region through “back-projection” pathways. PMID:28451637
Early Phonological and Lexical Development and Otitis Media: A Diary Study.
ERIC Educational Resources Information Center
Donahue, Mavis L.
1993-01-01
A child with chronic otitis media with effusion solved the problem of reduced and fluctuating auditory input with phonological selection and avoidance strategies that capitalized on prosodic cues. Findings illustrate the need to consider interactions among performance, input, and linguistic constraints to explain individual variation in language…
Experimental Optoelectronic Associative Memory
NASA Technical Reports Server (NTRS)
Chao, Tien-Hsin
1992-01-01
An optoelectronic associative memory responds to an input image by displaying one of M remembered images. Which image to display is determined by optoelectronic analog computation of the resemblance between the input image and each remembered image. The memory does not rely on precomputation and storage of an outer-product synapse matrix, reducing the size of the memory needed to store and process images.
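The resemblance-then-recall scheme can be sketched numerically. This is a software stand-in for the optoelectronic analog hardware; a normalized inner product is assumed as the resemblance measure, and the image sizes and contents are invented.

```python
import numpy as np

# Sketch of associative recall without an outer-product synapse matrix:
# score the probe against each of M stored images, output the best match.

rng = np.random.default_rng(0)
stored = rng.random((4, 8, 8))            # M = 4 remembered 8x8 images

def recall(memory, probe):
    flat = memory.reshape(len(memory), -1)
    p = probe.ravel()
    # Normalized inner product (cosine similarity) as the "resemblance".
    scores = flat @ p / (np.linalg.norm(flat, axis=1) * np.linalg.norm(p))
    return memory[np.argmax(scores)]

# A noisy version of image 2 should still recall image 2.
noisy = stored[2] + 0.1 * rng.standard_normal((8, 8))
out = recall(stored, noisy)
print(np.allclose(out, stored[2]))
```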
Proceedings: Sixth Annual Workshop on Meteorological and Environmental Inputs to Aviation Systems
NASA Technical Reports Server (NTRS)
Frost, W. (Editor); Camp, D. W. (Editor); Hershman, L. W. (Editor)
1983-01-01
The workshop addressed the interaction of the atmosphere with aviation systems, the better definition and implementation of services to operators, and the collection and interpretation of data for establishing operational criteria that relate the total meteorological inputs from the atmospheric sciences to the needs of aviation communities.
Genesis of an oak-fire science consortium
Grabner, K.W.; Stambaugh, M. C.; Guyette, R.P.; Dey, D. C.; Willson, G.D.; Dey, D. C.; Stambaugh, M. C.; Clark, S.L.; Schweitzer, C. J.
2012-01-01
With respect to fire management and practices, one of the most overlooked regions lies in the middle of the country. In this region there is a critical need for both recognition of fire’s importance and sharing of fire information and expertise. Recently we proposed and were awarded funding by the Joint Fire Science Program to initiate the planning phase for a regional fire consortium. The purpose of the consortium will be to promote the dissemination of fire information across the interior United States and to identify fire information needs of oak-dominated communities such as woodlands, forests, savannas, and barrens. Geographically, the consortium region will cover: 1) the Interior Lowland Plateau Ecoregion in Illinois, Indiana, central Kentucky and Tennessee; 2) the Missouri, Arkansas, and Oklahoma Ozarks; 3) the Ouachita Mountains of Arkansas and Oklahoma; and 4) the Cross Timbers Region in Texas, Oklahoma, and Kansas. This region coincides with the southwestern half of the Central Hardwoods Forest Region. The tasks of this consortium will be to disseminate fire information, connect fire professionals, and efficiently address fire issues within our region. If supported, the success and the future direction of the consortium will be driven by end-users, their input, and involvement.
Predicting species distributions for conservation decisions
Guisan, Antoine; Tingley, Reid; Baumgartner, John B; Naujokaitis-Lewis, Ilona; Sutcliffe, Patricia R; Tulloch, Ayesha I T; Regan, Tracey J; Brotons, Lluis; McDonald-Madden, Eve; Mantyka-Pringle, Chrystal; Martin, Tara G; Rhodes, Jonathan R; Maggini, Ramona; Setterfield, Samantha A; Elith, Jane; Schwartz, Mark W; Wintle, Brendan A; Broennimann, Olivier; Austin, Mike; Ferrier, Simon; Kearney, Michael R; Possingham, Hugh P; Buckley, Yvonne M
2013-01-01
Species distribution models (SDMs) are increasingly proposed to support conservation decision making. However, evidence of SDMs supporting solutions for on-ground conservation problems is still scarce in the scientific literature. Here, we show that successful examples exist but are still largely hidden in the grey literature, and thus less accessible for analysis and learning. Furthermore, the decision framework within which SDMs are used is rarely made explicit. Using case studies from biological invasions, identification of critical habitats, reserve selection and translocation of endangered species, we propose that SDMs may be tailored to suit a range of decision-making contexts when used within a structured and transparent decision-making process. To construct appropriate SDMs to more effectively guide conservation actions, modellers need to better understand the decision process, and decision makers need to provide feedback to modellers regarding the actual use of SDMs to support conservation decisions. This could be facilitated by individuals or institutions playing the role of ‘translators’ between modellers and decision makers. We encourage species distribution modellers to get involved in real decision-making processes that will benefit from their technical input; this strategy has the potential to better bridge theory and practice, and contribute to improve both scientific knowledge and conservation outcomes. PMID:24134332
Millimeter and Sub-millimeter High Resolution Spectroscopy: New Frontiers with ALMA
NASA Astrophysics Data System (ADS)
Ziurys, Lucy M.
2016-06-01
It is becoming increasingly clear that new laboratory data will be critical for the next decade of observations with the Atacama Large Millimeter Array (ALMA). The high spatial resolution offered by ALMA will probe new regions of molecular complexity, including the inner envelopes of evolved stars, regions dominated by UV radiation, and the densest cores of molecular clouds. New molecular lines will be discovered in the wide wavelength range covered by the ALMA bands, and high-resolution, gas-phase spectroscopy is needed to provide crucial “rest frequencies.” In particular, highly accurate methods that measure millimeter and sub-millimeter rotational transitions, such as direct absorption and Fourier transform mm-wave techniques, are important, especially when coupled to exotic molecular production schemes. Recent ALMA studies of SH+ and larger organic species have already demonstrated the need for laboratory measurements. New laboratory work will likely be required for circumstellar refractory molecules, radicals and ions generated near photon-dominated regions (PDRs), and large, organic-type species. This talk will give an overview of current contributions of laboratory spectroscopy to ALMA observations, summarize relevant spectroscopic techniques, and provide input into future prospects and directions.
NASA Astrophysics Data System (ADS)
French, J.
2015-12-01
Ports are vital to the global economy, but assessments of global exposure to flood risk have generally focused on major concentrations of population or asset values. Few studies have examined the impact of extreme inundation events on port operation and critical supply chains. Extreme water levels and recurrence intervals have conventionally been estimated via analysis of historic water level maxima, and these vary widely depending on the statistical assumptions made. This information is supplemented by near-term forecasts from operational surge-tide models, which give continuous water levels but at considerable computational cost. As part of a NERC Infrastructure and Risk project, we have investigated the impact of North Sea tidal surges on the Port of Immingham in eastern UK. The port handles the largest volume of bulk cargo in the UK and flows of coal and biomass that are critically important for national energy security. The port was partly flooded during a major tidal surge in 2013. This event highlighted the need for improved local forecasts of surge timing in relation to high water, with a better indication of flood depth and duration. We address this problem using a combination of data-driven and numerical hydrodynamic models. An Artificial Neural Network (ANN) is first used to predict the surge component of water level from meteorological data. The input vector comprises time-series of local wind (easterly and northerly wind stress) and pressure, as well as regional pressure and pressure gradients from stations between the Shetland Islands and the Humber estuary. The ANN achieves RMS errors of around 0.1 m and can generate short-range (~3 to 12 hour) forecasts given real-time input data feeds. It can also synthesize water level events for a wider range of tidal and meteorological forcing combinations than contained in the observational records. These are used to force Telemac2D numerical floodplain simulations using a LiDAR digital elevation model of the port.
Functional relationships between peak water level and surge 'shape' allow estimation of flood depths and durations for any location. Supplementing existing surge warning systems, our approach predicts the location and duration of flooding in detail, and allows port managers to take steps to minimize its impact on the most critical aspects of port operation.
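The data-driven half of this approach can be sketched as follows. A plain least-squares linear model stands in for the ANN, and the wind-stress and pressure features, their sensitivities, and the noise level are all synthetic, not Immingham observations.

```python
import numpy as np

# Toy stand-in for a meteorology-to-surge regression: local wind stress
# and pressure features in, surge height out. All data are synthetic.

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([
    rng.normal(0.0, 1.0, n),   # easterly wind stress (normalized)
    rng.normal(0.0, 1.0, n),   # northerly wind stress (normalized)
    rng.normal(0.0, 1.0, n),   # local pressure anomaly (normalized)
])
true_w = np.array([0.6, 0.2, -0.4])         # assumed sensitivities
y = X @ true_w + rng.normal(0.0, 0.05, n)   # "observed" surge (m)

# Fit with an intercept column and check the residual error.
A = np.hstack([X, np.ones((n, 1))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)
rms = np.sqrt(np.mean((A @ w - y) ** 2))
print(round(rms, 3))   # of order the noise level, well under 0.1 m
```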
NASA Astrophysics Data System (ADS)
Little, J. C.; Filz, G. M.
2016-12-01
As modern societies become more complex, critical interdependent infrastructure systems become more likely to fail under stress unless they are designed and implemented to be resilient. Hurricane Katrina clearly demonstrated the catastrophic and as yet unpredictable consequences of such failures. Resilient infrastructure systems maintain the flow of goods and services in the face of a broad range of natural and manmade hazards. In this presentation, we illustrate a generic computational framework to facilitate high-level decision-making about how to invest scarce resources most effectively to enhance resilience in coastal protection, transportation, and the economy of a region. Coastal Louisiana, our study area, has experienced the catastrophic effects of several land-falling hurricanes in recent years. In this project, we implement and further refine three process models (a coastal protection model, a transportation model, and an economic model) for the coastal Louisiana region. We upscale essential mechanistic features of the three detailed process models to the systems level and integrate the three reduced-order systems models in a modular fashion. We also evaluate the proposed approach in annual workshops with input from stakeholders. Based on stakeholder inputs, we derive a suite of goals, targets, and indicators for evaluating resilience at the systems level, and assess and enhance resilience using several deterministic scenarios. The unifying framework will be able to accommodate the different spatial and temporal scales that are appropriate for each model. We combine our generic computational framework, which encompasses the entire system of systems, with the targets and indicators needed to systematically meet our chosen resilience goals.
We will start with targets that focus on technical and economic systems, but future work will ensure that targets and indicators are extended to other dimensions of resilience including those in the environmental and social systems. The overall model can be used to optimize decision making in a probabilistic risk-based framework.
Bryophytes and Organic layers Control Uptake of Airborne Nitrogen in Low-N Environments.
Bähring, Alexandra; Fichtner, Andreas; Friedrich, Uta; von Oheimb, Goddert; Härdtle, Werner
2017-01-01
The effects of atmospheric nitrogen (N) deposition on ecosystem functioning largely depend on the retention of N in different ecosystem compartments, but accumulation and partitioning processes have rarely been quantified in long-term field experiments. In the present study we analysed for the first time decadal-scale flows and allocation patterns of N in a heathland ecosystem that has been subject to airborne N inputs over decades. Using a long-term 15N tracer experiment, we quantified N retention and flows to and between ecosystem compartments (above-ground/below-ground vascular biomass, moss layer, soil horizons, leachate). After 9 years, about 60% of the added 15N tracer remained in the N cycle of the ecosystem. The moss layer proved to be a crucial link between incoming N and its allocation to different ecosystem compartments (in terms of a short-term capture, but long-term release function). However, about 50% of the 15N captured and released by the moss layer was not compensated for by a corresponding increase in recovery rates in any other compartment, probably due to denitrification losses from the moss layer in the case of water saturation after rain events. The O-horizon proved to be the most important long-term sink for added 15N, as reflected by an increase in recovery rates from 18 to 40% within 8 years. Less than 2.1% of 15N was recovered in the podzol-B-horizon, suggesting that only negligible amounts of N were withdrawn from the N cycle of the ecosystem. Moreover, 15N recovery was low in the dwarf shrub above-ground biomass (<3.9% after 9 years) and in the leachate (about 0.03% within 1 year), indicating still conservative N cycles of the ecosystem, even after decades of N inputs beyond critical load thresholds. The continuous accumulation of reactive forms of airborne N suggests that critical load-estimates need to account for cumulative effects of N additions into ecosystems.
Threshold and multiple indicators for nitrogen saturation in subtropical forests.
Yu, Qian; Duan, Lei; Yu, Longfei; Chen, Xiao; Si, Gaoyue; Ke, Piaopiao; Ye, Zhixiang; Mulder, Jan
2018-06-11
The influence of nitrogen (N) deposition on forest ecosystems largely depends on the N status. Developing thresholds and practical indicators for N saturation in subtropical forests, with extremely high N deposition, would enhance both forest management and assessments of global N balance and carbon (C) sequestration. Here, we quantified the N mass balance and assessed current N status at a number of subtropical forest sites in South China, using N content, C/N ratio, and 15N natural abundance (δ15N) as potential indicators of N saturation. Among the studied sites, N deposition ranged from 13.8 to 113 kg N ha−1 yr−1 in throughfall, and was dominated by ammonium (NH4+). The threshold for N leaching in subtropical forest was first found to be 26-36 kg N ha−1 yr−1, which was 160% higher than in temperate forest (based on the prescribed minimum). This indicates that critical parameter inputs in global models of the impact of N deposition are in need of revision, based on specific ecosystem characteristics. We found a critical C/N ratio of 20 for the O/A horizon as an indicator of N saturation. Foliar N content and δ15N were positively correlated with N deposition and were well suited to indicate regional N status. The δ15N enrichment factor (Ɛfoli/Soil2, δ15Nfoliage − δ15NSoil2) was between −10‰ and −1‰, and showed a trend similar to those obtained from other regions with increasing N deposition. These results suggest that the enrichment factor could be used to investigate the influence of N deposition in forest ecosystems, regardless of spatial heterogeneity in δ15N of N input, soil N availability and geomorphology. Copyright © 2018 Elsevier Ltd. All rights reserved.
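The two indicators proposed here reduce to simple arithmetic, sketched below with invented sample values; the threshold direction (lower C/N indicating saturation) is an assumption for illustration, not stated explicitly in the abstract.

```python
# Arithmetic sketch of the two N-saturation indicators: the C/N ratio of
# the O/A horizon against the reported threshold of 20, and the d15N
# enrichment factor (foliage minus soil). Sample values are invented.

CN_THRESHOLD = 20.0

def n_saturated(cn_ratio):
    # Assumption: a lower C/N ratio (relatively more N) indicates saturation.
    return cn_ratio < CN_THRESHOLD

def enrichment_factor(d15n_foliage, d15n_soil):
    # Epsilon = d15N(foliage) - d15N(soil), in per mil.
    return d15n_foliage - d15n_soil

print(n_saturated(17.5))               # True: below the C/N threshold
print(enrichment_factor(-4.0, 2.5))    # -6.5, within the reported -10 to -1 range
```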
Nitrogen and harvest impact on warm-season grasses biomass yield
USDA-ARS?s Scientific Manuscript database
Perennial warm-season grasses have drawn interest as bioenergy feedstocks due to their high productivity with minimal amounts of inputs while producing multiple environmental benefits. Nitrogen (N) fertility and harvest timing are critical management practices when optimizing biomass yield of these ...
Pavement thickness design for local roads in Iowa : tech brief.
DOT National Transportation Integrated Search
2010-01-01
The main objectives of this research are to: 1) identify the most critical design input parameters, 2) determine the minimum pavement thickness, and 3) develop new pavement design and sensitivity analysis (PD&SA) software which can provide the most a...
75 FR 66739 - Technology Innovation Program (TIP) Seeks White Papers
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-29
... network analyses in the following areas--sustainable manufacturing models, resource management and... manufacturing, all endeavors require energy as input. Escalating energy demands throughout the world can lead to... such as: Technologies for improved manufacturing of critical components for alternative energy...
Statistical self-similarity of hotspot seamount volumes modeled as self-similar criticality
Tebbens, S.F.; Burroughs, S.M.; Barton, C.C.; Naar, D.F.
2001-01-01
The processes responsible for hotspot seamount formation are complex, yet the cumulative frequency-volume distribution of hotspot seamounts in the Easter Island/Salas y Gomez Chain (ESC) is found to be well-described by an upper-truncated power law. We develop a model for hotspot seamount formation where uniform energy input produces events initiated on a self-similar distribution of critical cells. We call this model Self-Similar Criticality (SSC). By allowing the spatial distribution of magma migration to be self-similar, the SSC model recreates the observed ESC seamount volume distribution. The SSC model may have broad applicability to other natural systems.
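One common form of the upper-truncated power law for a cumulative frequency-volume distribution is N(≥V) = C(V^−β − V_T^−β) for V ≤ V_T, which falls to zero at the truncation volume V_T. A short sketch with illustrative parameters (not the fitted ESC values):

```python
import numpy as np

# Upper-truncated power law for cumulative counts of events with size >= V:
#   N(>=V) = C * (V**-beta - VT**-beta),  valid for V <= VT.
# C, beta, and VT below are illustrative, not fitted to the ESC seamounts.

def n_ge(V, C=100.0, beta=1.0, VT=1000.0):
    V = np.asarray(V, dtype=float)
    return C * (V ** -beta - VT ** -beta)

V = np.logspace(0, 3, 50)          # volumes from 1 up to the truncation VT
N = n_ge(V)

print(bool(np.all(np.diff(N) < 0)))   # cumulative counts decrease with volume
print(abs(float(n_ge(1000.0))) < 1e-9)  # and vanish at the truncation volume
```

The truncation term is what distinguishes this from a pure power law: a log-log plot of N(≥V) bends downward as V approaches V_T instead of staying straight.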
Jackson, Terence
2011-01-01
There appears to be a gap between the billions of dollars invested in fighting HIV/AIDS and TB and the outcomes achieved. This in part can be attributed to the lack of attention in International Development to managing programmes and projects within complex levels of cross-cultural interactions. International Development often ignores management issues, yet Management Studies is left wanting through a lack of engagement with development issues including the fight against disease and poverty. This paper attempts to link these two disciplines towards mutual benefit, through a critical cross-cultural approach. It provides contextualization of international development policies/strategies; conceptualization of dominant paradigms; structural analysis of how a programme/project fits into the global governance structure; analysis of complexities and levels of cross-cultural interaction and their consequences and the process and implications of knowledge transfer across cultural distances. It concludes with implications for policy and practice, as well as what is needed from cross-disciplinary research. This includes how feedback loops can be strengthened from local to global, how indigenous knowledge may be better understood and integrated, how power relations within the global governance structure could be managed, how cross-cultural interaction could be better understood, and how knowledge transfer/sharing should be critically managed. Copyright © 2010 John Wiley & Sons, Ltd.
Fleisher, Linda; Erkoboni, Danielle; Halkyard, Katherine; Sykes, Emily; Norris, Marisol S.; Walker, Lorrie; Winston, Flaura
2017-01-01
Childhood death from vehicle crashes and the delivery of information about proper child restraint system (CRS) use continue to be critical public health issues. Safe Seat, a sequential, mixed-methods study, identified gaps in parental knowledge about and perceived challenges in the use of appropriate CRS and insights into the preferences of various technological approaches to deliver CRS education. Focus groups (eight groups with 21 participants) and a quantitative national survey (N = 1251) using MTurk were conducted. Although there were differences in the age, racial/ethnic background, and educational level between the focus group participants and the national sample, there was a great deal of consistency in the need for more timely and personalized information about CRS. The majority of parents did not utilize car seat check professionals although they expressed interest in and lack of knowledge about how to access these resources. Although there was some interest in an app that would be personalized and able to push just-in-time content (e.g., new guidelines, location and times of car seat checks), content that has sporadic relevance (e.g., initial installation) seemed more appropriate for a website. Stakeholder input is critical to guide the development and delivery of acceptable and useful child safety education. PMID:28954429
Accelerated Strength Testing of Thermoplastic Composites
NASA Technical Reports Server (NTRS)
Reeder, J. R.; Allen, D. H.; Bradley, W. L.
1998-01-01
Constant ramp strength tests on unidirectional thermoplastic composite specimens oriented in the 90 deg. direction were conducted at constant temperatures ranging from 149 C to 232 C. Ramp rates spanning 5 orders of magnitude were tested so that failures occurred in the range from 0.5 sec. to 24 hrs. (0.5 to 100,000 MPa/sec). Below 204 C, time-temperature superposition held allowing strength at longer times to be estimated from strength tests at shorter times but higher temperatures. The data indicated that a 50% drop in strength might be expected for this material when the test time is increased by 9 orders of magnitude. The shift factors derived from compliance data applied well to the strength results. To explain the link between compliance and strength, a viscoelastic fracture model was investigated. The model, which used compliance as input, was found to fit the strength data only if the critical fracture energy was allowed to vary with temperature reduced stress rate. This variation in the critical parameter severely limits its use in developing a robust time-dependent strength model. The significance of this research is therefore seen as providing both the indication that a more versatile acceleration method for strength can be developed and the evidence that such a method is needed.
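The superposition step itself is a shift along the log-time axis: strength data measured at a higher temperature are moved by a shift factor log(aT) to estimate long-time behaviour at the reference temperature. A minimal sketch with invented strengths and shift factor (not the measured thermoplastic data; the abstract's reported trend is roughly a 50% drop over 9 decades):

```python
import numpy as np

# Time-temperature superposition sketch: shift short-time, high-temperature
# data along log10(time) to extend the reference-temperature master curve.
# The strengths, slope, and shift factor below are invented for illustration.

log_t = np.linspace(0, 3, 4)              # log10(time to failure), seconds
ref_strength = 100.0 - 5.0 * log_t        # assumed reference-T strengths (MPa)

log_aT = 4.0                              # assumed shift for a hotter test
master_log_t = log_t + log_aT             # short-time data mapped to long times

# Extrapolate strength along the master curve out to the shifted times.
est_strength = 100.0 - 5.0 * master_log_t
print(float(est_strength.min()))          # 65.0: estimated strength 7 decades out
```

The same assumed slope is reused for the extrapolation, which is exactly the step that fails if the critical fracture energy varies with temperature-reduced stress rate, as the abstract cautions.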
More, S J; Hanlon, A; Marchewka, J; Boyle, L
2017-06-24
In recent years, 'private standards' in animal health and welfare have become increasingly common, and are often incorporated into quality assurance (QA) programmes. Here, we present an overview of the use of private animal health and welfare standards in QA programmes, and propose a generic framework to facilitate critical programme review. Private standards are being developed in direct response to consumer demand for QA, and offer an opportunity for product differentiation and a means to drive consumer choice. Nonetheless, a range of concerns have been raised, relating to the credibility of these standards, their potential as a discriminatory barrier to trade, the multiplicity of private standards that have been developed, the lack of consumer input and compliance costs. There is a need for greater scrutiny of private standards and of associated QA programmes. We propose a framework to clarify the primary programme goal(s) and measurable outputs relevant to animal health and welfare, the primary programme beneficiaries and to determine whether the programme is effective, efficient and transparent. This paper provides a theoretical overview, noting that this framework could be used as a tool directly for programme evaluation, or as a tool to assist with programme development and review. British Veterinary Association.
Critical current and linewidth reduction in spin-torque nano-oscillators by delayed self-injection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khalsa, Guru, E-mail: guru.khalsa@nist.gov; Stiles, M. D.; Grollier, J.
2015-06-15
Based on theoretical models, the dynamics of spin-torque nano-oscillators can be substantially modified by re-injecting the emitted signal into the input of the oscillator after some delay. Numerical simulations for vortex magnetic tunnel junctions show that, with reasonable parameters, this approach can decrease critical currents by as much as 25% and linewidths by a factor of 4. Analytical calculations, which agree well with simulations, demonstrate that these results can be generalized to any kind of spin-torque oscillator.
Taillefumier, Thibaud; Magnasco, Marcelo O
2013-04-16
Finding the first time a fluctuating quantity reaches a given boundary is a deceptively simple-looking problem of vast practical importance in physics, biology, chemistry, neuroscience, economics, and industrial engineering. Problems in which the bound to be traversed is itself a fluctuating function of time include widely studied problems in neural coding, such as neuronal integrators with irregular inputs and internal noise. We show that the probability p(t) that a Gauss-Markov process will first exceed the boundary at time t suffers a phase transition as a function of the roughness of the boundary, as measured by its Hölder exponent H. The critical value occurs when the roughness of the boundary equals the roughness of the process, so for diffusive processes the critical value is Hc = 1/2. For smoother boundaries, H > 1/2, the probability density is a continuous function of time. For rougher boundaries, H < 1/2, the probability is concentrated on a Cantor-like set of zero measure: the probability density becomes divergent, almost everywhere either zero or infinity. The critical point Hc = 1/2 corresponds to a widely studied case in the theory of neural coding, in which the external input integrated by a model neuron is a white-noise process, as in the case of uncorrelated but precisely balanced excitatory and inhibitory inputs. We argue that this transition corresponds to a sharp boundary between rate codes, in which the neural firing probability varies smoothly, and temporal codes, in which the neuron fires at sharply defined times regardless of the intensity of internal noise.
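The sharp distinction between smooth and rough boundaries can be illustrated numerically. Below is a minimal sketch (not the paper's analytical method) that finds the first-passage index of a discrete Brownian path against a smooth boundary (a constant level) and against a rough boundary with the same roughness as the process, here an independent random walk; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def first_passage(path, boundary):
    """Index of the first time `path` meets or exceeds `boundary`, or None."""
    hits = np.nonzero(path >= boundary)[0]
    return int(hits[0]) if hits.size else None

n = 10_000
# Diffusive process: discrete Brownian path started at 0.
walk = np.cumsum(rng.normal(0.0, 1.0, n))

# Smooth boundary (H > 1/2 regime analogue): a constant level.
smooth = np.full(n, 50.0)

# Rough boundary with the roughness of the process itself (H = 1/2):
# an independent Brownian path offset upward.
rough = 50.0 + np.cumsum(rng.normal(0.0, 1.0, n))

t_smooth = first_passage(walk, smooth)
t_rough = first_passage(walk, rough)
```

Repeating this over many realizations and histogramming the hitting times would show the smooth p(t) versus the increasingly singular concentration described in the abstract for rougher boundaries.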
High-Voltage-Input Level Translator Using Standard CMOS
NASA Technical Reports Server (NTRS)
Yager, Jeremy A.; Mojarradi, Mohammad M.; Vo, Tuan A.; Blalock, Benjamin J.
2011-01-01
A proposed integrated circuit would translate (1) a pair of input signals having a low differential potential and a possibly high common-mode potential into (2) a pair of output signals having the same low differential potential and a low common-mode potential. As used here, "low" and "high" refer to potentials that are, respectively, below or above the nominal supply potential (3.3 V) at which standard complementary metal oxide/semiconductor (CMOS) integrated circuits are designed to operate. The input common-mode potential could lie between 0 and 10 V; the output common-mode potential would be 2 V. This translation would make it possible to process the pair of signals by use of standard 3.3-V CMOS analog and/or mixed-signal (analog and digital) circuitry on the same integrated-circuit chip. A schematic of the circuit is shown in the figure. Standard 3.3-V CMOS circuitry cannot withstand input potentials greater than about 4 V. However, there are many applications that involve low-differential-potential, high-common-mode-potential input signal pairs and in which standard 3.3-V CMOS circuitry, which is relatively inexpensive, would be the most appropriate circuitry for performing other functions on the integrated-circuit chip that handles the high-potential input signals. Thus, there is a need to combine high-voltage input circuitry with standard low-voltage CMOS circuitry on the same integrated-circuit chip. The proposed circuit would satisfy this need. In the proposed circuit, the input signals would be coupled into both a level-shifting pair and a common-mode-sensing pair of CMOS transistors. The output of the level-shifting pair would be fed as input to a differential pair of transistors. The resulting differential current output would pass through six standoff transistors to be mirrored into an output branch by four heterojunction bipolar transistors. The mirrored differential current would be converted back to potential by a pair of diode-connected transistors, which, by virtue of being identical to the input transistors, would reproduce the input differential potential at the output.
"Diagnosing" Saudi health reforms: is NHIS the right "prescription"?
Al-Sharqi, Omar Zayan; Abdullah, Muhammad Tanweer
2013-01-01
This paper outlines the health context of the Kingdom of Saudi Arabia (KSA). It reviews health systems development in the KSA from 1925 through to contemporary New Health Insurance System (NHIS). It also examines the consistency of NHIS in view of the emerging challenges. This paper identifies the determinants and scope of contextual consistency. First, it indicates the need to evolve an indigenous, integrated, and comprehensive insurance system. Second, it highlights the access and equity gaps in service delivery across the rural and remote regions and suggests how to bring these under insurance coverage. Third, it suggests how inputs from both the public and private sectors should be harmonized - the "quality" of services in the private healthcare industry to be regulated by the state and international standards, its scope to be determined primarily by open-market dynamics and the public sector welfare-model to ensure "access" of all to essential health services. Fourth, it states the need to implement an evidence-based public health policy and bridge inherent gaps in policy design and personal-level lifestyles. Fifth, it points out the need to produce a viable infrastructure for health insurance. Because social research and critical reviews in the KSA health scenario are rare, this paper offers insights into the mainstream challenges of NHIS implementation and identifies the inherent weaknesses that need attention. It guides health policy makers, economists, planners, healthcare service managers, and even the insurance businesses, and points to key directions for similar research in future. Copyright © 2012 John Wiley & Sons, Ltd.
Assessing the need for communication training for specialists in poison information.
Planalp, Sally; Crouch, Barbara; Rothwell, Erin; Ellington, Lee
2009-07-01
Effective communication has been shown to be essential to physician-patient communication and may be even more critical for poison control center (PCC) calls because of the absence of visual cues, the need for quick and accurate information exchange, and possible suboptimal conditions such as call surges. Professionals who answer poison control calls typically receive extensive training in toxicology but very little formal training in communication. An instrument was developed to assess the perceived need for communication training for specialists in poison information (SPIs) with input from focus groups and a panel of experts. Requests to respond to an online questionnaire were made to PCCs throughout the United States and Canada. The 537 respondents were 70% SPIs or poison information providers (PIPs), primarily educated in nursing or pharmacy, working across the United States and Canada, and employed by their current centers an average of 10 years. SPIs rated communication skills as extremely important to securing positive outcomes for PCC calls even though they reported that their own training was not strongly focused on communication and existing training in communication was perceived as only moderately useful. Ratings of the usefulness of 21 specific training units were consistently high, especially for new SPIs but also for experienced SPIs. Directors rated the usefulness of training for experienced SPIs higher for 5 of the 21 challenges compared to the ratings of SPIs. Findings support the need for communication training for SPIs and provide an empirical basis for setting priorities in developing training units.
Climate Observations from Space
NASA Astrophysics Data System (ADS)
Briggs, Stephen
2016-07-01
The latest Global Climate Observing System (GCOS) Status Report on global climate observations, delivered to the UNFCCC COP21 in November 2016, showed how satellite data are critical for observations relating to climate. Of the 50 Essential Climate Variables (ECVs) identified by GCOS as necessary for understanding climate change, about half are derived only from satellite data while half of the remainder have a significant input from satellites. Hence data from Earth observing satellite systems are now a fundamental requirement for understanding the climate system and for managing the consequences of climate change. Following the Paris Agreement of COP21 this need is only greater. Not only will satellites have to continue to provide data for modelling and predicting climate change but also for a much wider range of actions relating to climate. These include better information on loss and damage, resilience, improved adaptation to change, and on mitigation including information on greenhouse gas emissions. In addition there is an emerging need for indicators of the risks associated with future climate change which need to be better quantified, allowing policy makers both to understand what decisions need to be taken, and to see the consequences of their actions. The presentation will set out some of the ways in which satellite data are important in all aspects of understanding, managing and predicting climate change and how they may be used to support future decisions by those responsible for policy related to managing climate change and its consequences.
Application of Blind Quantum Computation to Two-Party Quantum Computation
NASA Astrophysics Data System (ADS)
Sun, Zhiyuan; Li, Qin; Yu, Fang; Chan, Wai Hong
2018-06-01
Blind quantum computation (BQC) allows a client who has only limited quantum power to achieve quantum computation with the help of a remote quantum server and still keep the client's input, output, and algorithm private. Recently, Kashefi and Wallden extended BQC to achieve two-party quantum computation which allows two parties Alice and Bob to perform a joint unitary transform upon their inputs. However, in their protocol Alice has to prepare rotated single qubits and perform Pauli operations, and Bob needs to have a powerful quantum computer. In this work, we also utilize the idea of BQC to put forward an improved two-party quantum computation protocol in which the operations of both Alice and Bob are simplified since Alice only needs to apply Pauli operations and Bob is just required to prepare and encrypt his input qubits.
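Bob's step of preparing and encrypting his input qubits can be illustrated with the standard quantum one-time pad, in which a qubit is hidden by applying X^a Z^b with secret key bits (a, b). The statevector sketch below uses NumPy and is only an illustration of Pauli encryption, not an implementation of the two-party protocol itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# Single-qubit Pauli matrices.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def pauli_encrypt(psi, a, b):
    """Quantum one-time pad: apply X^a Z^b with secret key bits (a, b)."""
    return np.linalg.matrix_power(X, a) @ np.linalg.matrix_power(Z, b) @ psi

def pauli_decrypt(enc, a, b):
    # X and Z are self-inverse, so Z^b X^a exactly undoes X^a Z^b.
    return np.linalg.matrix_power(Z, b) @ np.linalg.matrix_power(X, a) @ enc

psi = np.array([0.6, 0.8], dtype=complex)   # an arbitrary input qubit state
a, b = rng.integers(0, 2, size=2)           # secret key bits
enc = pauli_encrypt(psi, a, b)
dec = pauli_decrypt(enc, a, b)              # recovers psi exactly
```

To an observer without (a, b), the encrypted qubit is maximally mixed, which is what keeps the client's input private from the server.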
Homeostasis in a feed forward loop gene regulatory motif.
Antoneli, Fernando; Golubitsky, Martin; Stewart, Ian
2018-05-14
The internal state of a cell is affected by inputs from the extra-cellular environment such as external temperature. If some output, such as the concentration of a target protein, remains approximately constant as inputs vary, the system exhibits homeostasis. Special sub-networks called motifs are unusually common in gene regulatory networks (GRNs), suggesting that they may have a significant biological function. Potentially, one such function is homeostasis. In support of this hypothesis, we show that the feed-forward loop GRN produces homeostasis. Here the inputs are subsumed into a single parameter that affects only the first node in the motif, and the output is the concentration of a target protein. The analysis uses the notion of infinitesimal homeostasis, which occurs when the input-output map has a critical point (zero derivative). In model equations such points can be located using implicit differentiation. If the second derivative of the input-output map also vanishes, the critical point is a chair: the output rises roughly linearly, then flattens out (the homeostasis region or plateau), and then starts to rise again. Chair points are a common cause of homeostasis. In more complicated equations or networks, numerical exploration would have to augment analysis. Thus, in terms of finding chairs, this paper presents a proof of concept. We apply this method to a standard family of differential equations modeling the feed-forward loop GRN, and deduce that chair points occur. This function determines the production of a particular mRNA and the resulting chair points are found analytically. The same method can potentially be used to find homeostasis regions in other GRNs. In the discussion and conclusion section, we also discuss why homeostasis in the motif may persist even when the rest of the network is taken into account. Copyright © 2018 Elsevier Ltd. All rights reserved.
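A chair point, a critical point of the input-output map where the first and second derivatives both vanish while the third does not, can be located symbolically in the way the abstract describes. The sketch below uses a purely illustrative cubic input-output map (not the paper's GRN model) and SymPy.

```python
import sympy as sp

I = sp.symbols('I', real=True)
# Illustrative input-output map (an assumption, not the paper's model):
# x(I) = I**3 - 3*I**2 + 3*I has x'(1) = x''(1) = 0 and x'''(1) != 0.
x = I**3 - 3*I**2 + 3*I

dx = sp.diff(x, I)
critical = sp.solve(dx, I)            # points where x'(I) = 0
chairs = [c for c in critical
          if sp.diff(x, I, 2).subs(I, c) == 0      # x''(c) = 0
          and sp.diff(x, I, 3).subs(I, c) != 0]    # x'''(c) != 0
print(chairs)  # chair point at I = 1
```

Near I = 1 the output rises, flattens into the homeostasis plateau, and rises again, which is exactly the chair shape described above.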
Power characteristics in GMAW: Experimental and numerical investigation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joensson, P.G.; Szekely, J.; Madigan, R.B.
1995-03-01
The voltage and power distributions in gas metal arc welding (GMAW) were studied both experimentally and numerically. The principal voltage drop takes place in the arc, which also constitutes the dominant power contribution. Within the arc, the dominating voltage contributions are from the arc column and the cathode fall, while the anode fall and the electrode regions are less significant. The power input to the arc column increases with both increasing current and increasing arc length. These results indicate that it is critical to control the arc length in order to control the power input to the system.
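The decomposition described above amounts to a simple power balance: arc power is the current times the sum of the column, cathode-fall, and anode-fall voltage drops. The numbers below are assumed, order-of-magnitude values for illustration only, not measurements from the study.

```python
# Illustrative GMAW power breakdown (all voltages are assumed values).
current = 250.0          # A
v_column = 12.0          # V, arc column drop (grows with arc length)
v_cathode_fall = 14.0    # V, cathode fall region (dominant with the column)
v_anode_fall = 2.0       # V, anode fall region (less significant)
v_electrode = 1.5        # V, electrode extension (less significant)

v_arc = v_column + v_cathode_fall + v_anode_fall
p_arc = current * v_arc                     # power delivered in the arc
p_total = current * (v_arc + v_electrode)   # total power input to the system
```

With these assumed drops the arc accounts for nearly all of the power, consistent with the abstract's conclusion that controlling arc length (and hence the column drop) controls the power input.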
Working for a sustainable future: healthcare leaders provide input for new model.
2003-06-01
With each tick of the clock, healthcare leaders are coming face to face with a pressing quandary: How can they best guide their organizations to success and sustainability in a rocky and ever-changing healthcare environment? A new "model of sustainability," developed with input from nine CEOs of top medical institutions, may provide some guidance. The model includes six leadership imperatives that underscore critical approaches to supporting the hospital of the future: Build strong organization-wide leadership, become the employer of choice, generate financial strength, redesign structures and processes, develop productive physician relationships, and engage consumers.
Computer program for preliminary design analysis of axial-flow turbines
NASA Technical Reports Server (NTRS)
Glassman, A. J.
1972-01-01
The program method is based on a mean-diameter flow analysis. Input design requirements include power or pressure ratio, flow, temperature, pressure, and speed. Turbine designs are generated for any specified number of stages and for any of three types of velocity diagrams (symmetrical, zero exit swirl, or impulse). Exit turning vanes can be included in the design. Program output includes inlet and exit annulus dimensions, exit temperature and pressure, total and static efficiencies, blading angles, and last-stage critical velocity ratios. The report presents the analysis method, a description of input and output with sample cases, and the program listing.
Neural Network Modeling for Gallium Arsenide IC Fabrication Process and Device Characteristics.
NASA Astrophysics Data System (ADS)
Creech, Gregory Lee, I.
This dissertation presents research focused on the utilization of neurocomputing technology to achieve enhanced yield and effective yield prediction in integrated circuit (IC) manufacturing. Artificial neural networks are employed to model complex relationships between material and device characteristics at critical stages of the semiconductor fabrication process. Whole wafer testing was performed on the starting substrate material and during wafer processing at four critical steps: Ohmic or Post-Contact, Post-Recess, Post-Gate and Final, i.e., at completion of fabrication. Measurements taken and subsequently used in modeling include, among others, doping concentrations, layer thicknesses, planar geometries, layer-to-layer alignments, resistivities, device voltages, and currents. The neural network architecture used in this research is the multilayer perceptron neural network (MLPNN). The MLPNN is trained in the supervised mode using the generalized delta learning rule. It has one hidden layer and uses continuous perceptrons. The research focuses on a number of different aspects. First is the development of inter-process stage models. Intermediate process stage models are created in a progressive fashion. Measurements of material and process/device characteristics taken at a specific processing stage and any previous stages are used as input to the model of the next processing stage characteristics. As the wafer moves through the fabrication process, measurements taken at all previous processing stages are used as input to each subsequent process stage model. Secondly, the development of neural network models for the estimation of IC parametric yield is demonstrated. Measurements of material and/or device characteristics taken at earlier fabrication stages are used to develop models of the final DC parameters. These characteristics are computed with the developed models and compared to acceptance windows to estimate the parametric yield. 
A sensitivity analysis is performed on the models developed during this yield estimation effort. This is accomplished by analyzing the total disturbance of network outputs due to perturbed inputs. When an input characteristic bears no, or little, statistical or deterministic relationship to the output characteristics, it can be removed as an input. Finally, neural network models are developed in the inverse direction. Characteristics measured after the final processing step are used as the input to model critical in-process characteristics. The modeled characteristics are used for whole wafer mapping and its statistical characterization. It is shown that this characterization can be accomplished with minimal in-process testing. The concepts and methodologies used in the development of the neural network models are presented. The modeling results are provided and compared to the actual measured values of each characteristic. An in-depth discussion of these results and ideas for future research are presented.
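A one-hidden-layer multilayer perceptron trained with the generalized delta rule (backpropagation), as described above, can be sketched in a few lines. The toy data, layer sizes, and learning rate below are invented for illustration; this is not the dissertation's actual model of wafer characteristics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for wafer data: 4 input characteristics -> 1 output parameter.
X = rng.normal(size=(64, 4))
y = (X @ np.array([0.5, -1.0, 0.25, 0.8]))[:, None] + 0.01 * rng.normal(size=(64, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of continuous (sigmoidal) perceptrons, linear output.
W1 = rng.normal(scale=0.5, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

lr = 0.1
losses = []
for _ in range(500):
    h = sigmoid(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y                       # output-layer error
    losses.append(float(np.mean(err**2)))
    # Generalized delta rule: propagate the error back through the layers.
    dW2 = h.T @ err / len(X)
    db2 = err.mean(axis=0)
    delta1 = (err @ W2.T) * h * (1 - h)  # hidden-layer deltas
    dW1 = X.T @ delta1 / len(X)
    db1 = delta1.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1
```

In the dissertation's progressive scheme, a model like this would be trained per process stage, with measurements from all earlier stages concatenated into the input vector.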
A Critical Realist Orientation to Learner Needs
ERIC Educational Resources Information Center
Ayers, David F.
2011-01-01
The objective of this essay is to propose critical realism as a philosophical middle way between two sets of ontological, epistemological, and methodological assumptions regarding learner needs. Key concepts of critical realism, a tradition in the philosophy of science, are introduced and applied toward an analysis of learner needs, resulting in…
Forecasting: Exercises to Enhance Learning from Business Simulations
ERIC Educational Resources Information Center
Clark, Timothy S.; Kent, Brian M.
2013-01-01
Forecasting the outputs of dynamic systems develops a richer understanding of relevant inputs and their interrelationships than merely observing them ex post. Academic business simulations foster students' development of this critical competency, but learning outcomes can be significantly augmented with relatively simple, complementary exercises…
Area XV Career Education Research & Planning. Final Report.
ERIC Educational Resources Information Center
Indian Hills Community Coll., Ottumwa, IA.
Critical issues in career education are addressed in this report of an advisory committee seeking input and making recommendations for career education implementation in Iowa. Recommendations addressing state, area, and local school district responsibilities are grouped into three main perspectives: planning, implementation, and evaluation. The…
Nitrogen in agricultural systems: Implications for conservation policy
USDA-ARS?s Scientific Manuscript database
Nitrogen is an important agricultural input that is critical for providing food to feed a growing world population. However, the introduction of large amounts of reactive nitrogen into the environment has a number of undesirable impacts on water, terrestrial, and atmospheric resources. Careful manage...
Synchronization properties of coupled chaotic neurons: The role of random shared input
NASA Astrophysics Data System (ADS)
Kumar, Rupesh; Bilal, Shakir; Ramaswamy, Ram
2016-06-01
Spike-time correlations of neighbouring neurons depend on their intrinsic firing properties as well as on the inputs they share. Studies have shown that periodically firing neurons, when subjected to random shared input, exhibit asynchronicity. Here, we study the effect of random shared input on the synchronization of weakly coupled chaotic neurons. The cases of so-called electrical and chemical coupling are both considered, and we observe a wide range of synchronization behaviour. When subjected to identical shared random input, there is a decrease in the threshold coupling strength needed for chaotic neurons to synchronize in-phase. The system also supports lag-synchronous states, and for these, we find that shared input can cause desynchronization. We carry out a master stability function analysis for a network of such neurons and show agreement with the numerical simulations. The contrasting role of shared random input for complete and lag synchronized neurons is useful in understanding spike-time correlations observed in many areas of the brain.
Eye-gaze and intent: Application in 3D interface control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schryver, J.C.; Goldberg, J.H.
1993-06-01
Computer interface control is typically accomplished with an input "device" such as a keyboard, mouse, trackball, etc. An input device translates a user's input actions, such as mouse clicks and key presses, into appropriate computer commands. To control the interface, the user must first convert intent into the syntax of the input device. A more natural means of computer control is possible when the computer can directly infer user intent, without need of intervening input devices. We describe an application of eye-gaze-contingent control of an interactive three-dimensional (3D) user interface. A salient feature of the user interface is natural input, with a heightened impression of controlling the computer directly by the mind. With this interface, input of rotation and translation are intuitive, whereas other abstract features, such as zoom, are more problematic to match with user intent. This paper describes successes with implementation to date, and ongoing efforts to develop a more sophisticated intent inferencing methodology.
The $19.95 Solution to Large Group Telephone Interviews with Special Speakers.
ERIC Educational Resources Information Center
Robinson, George H.
1998-01-01
Describes an inexpensive solution for holding large-group telephone interviews, listing the equipment needed (record control, telephone, phone line with modular jack, portable amplifier with microphone-level input jack, audio cable with jack and plug compatible with the microphone input jack on the amplifier) and providing directions for setup.…
An Automated Program Testing Methodology and its Implementation.
1980-01-01
correctly on its input data; for each software system, the number of assertions violated defines an "error function" over the input space of the program. This removes the need to examine a program's output directly. It is important to choose test cases which uncover errors early in the development cycle.
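The idea of counting violated assertions as an "error function" over a program's input space can be sketched as follows. The programs and assertions here are hypothetical stand-ins, not the methodology's actual test harness.

```python
def error_function(output, assertions):
    """Count violated assertions: the 'error function' at one input point."""
    return sum(1 for check in assertions if not check(output))

def program(x):            # hypothetical program under test
    return x * x

def buggy_program(x):      # a faulty variant, for contrast
    return x * x - 1

assertions = [
    lambda out: out >= 0,            # squares are non-negative
    lambda out: out == int(out),     # integer inputs give integer outputs
]

# Evaluate the error function over a few sample inputs.
correct_errors = [error_function(program(x), assertions) for x in (-2, 0, 3)]
buggy_errors = [error_function(buggy_program(x), assertions) for x in (-2, 0, 3)]
```

A nonzero error function at any sampled input flags a defect without a human ever inspecting the raw output, which is the labor-saving point of the approach.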
ERIC Educational Resources Information Center
Blandford, A. E.; Smith, P. R.
1986-01-01
Describes the style of design of computer simulations developed by Computer Assisted Teaching Unit at Queen Mary College with reference to user interface, input and initialization, input data vetting, effective display screen use, graphical results presentation, and need for hard copy. Procedures and problems relating to academic involvement are…
Ethanol or Biodiesel? A Systems-Analysis Decision
ERIC Educational Resources Information Center
Dinan, Frank; Stabler, Tom
2008-01-01
This case study stresses the need to broadly consider an entire system, including all of the energy inputs and outputs involved, to determine the real efficiency of that system. It also asks its student audience to consider the role that scientific input plays in policy decision-making processes. It emphasizes that, despite the importance of this…
Accurate, up-to-date information describing Nr inputs by source is needed for effective Nr management and for guiding Nr research. Here we present a new synthesis of spatial data describing present Nr inputs to terrestrial and aquatic ecosystems across the conterminous US to hel...
Context-based virtual metrology
NASA Astrophysics Data System (ADS)
Ebersbach, Peter; Urbanowicz, Adam M.; Likhachev, Dmitriy; Hartig, Carsten; Shifrin, Michael
2018-03-01
Hybrid and data feed-forward methodologies are well established for advanced optical process control solutions in high-volume semiconductor manufacturing. Appropriate information from previous measurements, transferred into advanced optical model(s) at following step(s), provides enhanced accuracy and precision of the measured topographic (thicknesses, critical dimensions, etc.) and material parameters. In some cases, hybrid or feed-forward data are missing or invalid for individual dies or for a whole wafer. We focus on approaches of virtual metrology to re-create hybrid or feed-forward data inputs in high-volume manufacturing. We discuss reconstruction of missing data inputs based on various interpolation and extrapolation schemes that use information about a wafer's process history. Moreover, we demonstrate a data reconstruction approach based on machine learning techniques utilizing an optical model and measured spectra. Finally, we investigate metrics that allow one to assess the error margin of virtual data inputs.
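The interpolation-based reconstruction of missing feed-forward inputs can be sketched directly. The wafer measurements below are invented for illustration, and a production system would interpolate over the full wafer map or process history rather than a single axis.

```python
import numpy as np

# Assumed film-thickness measurements (nm) along a sequence of dies,
# with two missing values (NaN) standing in for invalid feed-forward data.
thickness = np.array([101.2, 100.8, np.nan, 100.1, np.nan, 99.5])
idx = np.arange(len(thickness))
known = ~np.isnan(thickness)

# Re-create the missing inputs by linear interpolation from neighbours.
reconstructed = thickness.copy()
reconstructed[~known] = np.interp(idx[~known], idx[known], thickness[known])
```

Comparing such reconstructed values against later real measurements is one way to build the error-margin metrics mentioned in the abstract.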
Universal Skills and Competencies for Geoscientists
NASA Astrophysics Data System (ADS)
Mosher, S.
2015-12-01
Geoscience students worldwide face a changing future workforce, but all geoscience work has universal cross-cutting skills and competencies that are critical for success. A recent Geoscience Employers Workshop, and employers' input on the "Future of Undergraduate Geoscience Education" survey, identified three major areas. Geoscience work requires spatial and temporal (3D & 4D) thinking, understanding that the Earth is a system of interacting parts and processes, and geoscience reasoning and synthesis. Thus, students need to be able to solve problems in the context of an open and dynamic system, recognizing that most geoscience problems have no clear, unambiguous answers. Students must learn to manage uncertainty, work by analogy and inference, and make predictions with limited data. Being able to visualize and solve problems in 3D, incorporate the element of time, and understand scale is critical. Additionally, students must learn how to tackle problems using real data, including understanding the problem's context, identifying appropriate questions to ask, and determining how to proceed. Geoscience work requires integration of quantitative, technical, and computational skills and the ability to be intellectually flexible in applying skills to new situations. Students need experience using high-level math and computational methods to solve geoscience problems, including probability and statistics to understand risk. Increasingly important is the ability to use "Big Data", GIS, visualization, and modeling tools. Employers also agree that a strong field component in geoscience education is important. Success as a geoscientist also requires non-technical skills. Because most work environments involve working on projects with a diverse team, students need experience with project management in team settings, including goal setting, conflict resolution, time management, and being both leader and follower.
Written and verbal scientific communication, as well as public speaking and listening skills, are important. Success also depends on interpersonal skills and professionalism, including business acumen, risk management, ethical conduct, and leadership. A global perspective is increasingly important, including cultural literacy and understanding societal relevance.
A pilot study comparing mouse and mouse-emulating interface devices for graphic input.
Kanny, E M; Anson, D K
1991-01-01
Adaptive interface devices make it possible for individuals with physical disabilities to use microcomputers and thus perform many tasks that they would otherwise be unable to accomplish. Special equipment is available that purports to allow functional access to the computer for users with disabilities. As technology moves from purely keyboard applications to include graphic input, it will be necessary for assistive interface devices to support graphics as well as text entry. Headpointing systems that emulate the mouse in combination with on-screen keyboards are of particular interest to persons with severe physical impairment such as high-level quadriplegia. Two such systems currently on the market are the HeadMaster and the Free Wheel. The authors have conducted a pilot study comparing graphic input speed using the mouse and two headpointing interface systems on the Macintosh computer. The study used a single-subject design with six able-bodied subjects to establish a baseline for comparison with persons with severe disabilities. These preliminary results indicated that the HeadMaster was nearly as effective as the mouse and that it was superior to the Free Wheel for graphics input. This pilot study, however, revealed several experimental design problems that need to be addressed to make the study more robust. It also demonstrated the need to include the evaluation of text input so that the effectiveness of the interface devices with text and graphic input could be compared.
NASA Technical Reports Server (NTRS)
Aggarwal, Arun K.
1993-01-01
The computer program SASHBEAN (Sikorsky Aircraft Spherical Roller High Speed Bearing Analysis) analyzes and predicts the operating characteristics of a Single Row, Angular Contact, Spherical Roller Bearing (SRACSRB). The program runs on an IBM or IBM compatible personal computer, and for a given set of input data analyzes the bearing design for its ring deflections (axial and radial), roller deflections, contact areas and stresses, induced axial thrust, rolling element and cage rotation speeds, lubrication parameters, fatigue lives, and amount of heat generated in the bearing. The dynamic loading of rollers due to centrifugal forces and gyroscopic moments, which becomes quite significant at high speeds, is fully considered in this analysis. For a known application and its parameters, the program is also capable of performing steady-state and time-transient thermal analyses of the bearing system. The steady-state analysis capability allows the user to estimate the expected steady-state temperature map in and around the bearing under normal operating conditions. On the other hand, the transient analysis feature provides the user a means to simulate the 'lost lubricant' condition and predict a time-temperature history of various critical points in the system. The bearing's 'time-to-failure' estimate may also be made from this (transient) analysis by considering the bearing as failed when a certain temperature limit is reached in the bearing components. The program is fully interactive and allows the user to get started and access most of its features with a minimum of training. For the most part, the program is menu driven, and adequate help messages are provided to guide a new user through various menu options and data input screens. All input data, both for mechanical and thermal analyses, are read through graphical input screens, thereby eliminating any need of a separate text editor/word processor to edit/create data files.
Provision is also available to select and view the contents of output files on the monitor screen if no paper printouts are required. A separate volume (Volume-2) of this documentation describes, in detail, the underlying mathematical formulations, assumptions, and solution algorithms of this program.
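The "time-to-failure" idea behind the transient analysis can be illustrated with a single-node lumped-capacitance model: integrate the heat balance until the node temperature crosses a limit. This is a generic sketch with assumed parameter values, not SASHBEAN's actual multi-node thermal formulation:

```python
# Illustrative lumped-capacitance transient model (not SASHBEAN's actual
# formulation): one bearing node with heat generation Q, convective loss
# to ambient, and thermal capacitance C. All parameter values are assumed.

def time_to_limit(q_gen, h, c_th, t_amb, t_limit, dt=1.0, t_max=2e5):
    """Euler-integrate dT/dt = (Q - h*(T - T_amb)) / C until T reaches
    t_limit; returns elapsed time in seconds, or None if never reached."""
    temp, elapsed = t_amb, 0.0
    while elapsed < t_max:
        temp += dt * (q_gen - h * (temp - t_amb)) / c_th
        elapsed += dt
        if temp >= t_limit:
            return elapsed
    return None  # steady state stays below the limit

# 'Lost lubricant' scenario: heat generation doubles, so the node heats up
# toward a higher steady state (Q/h + T_amb) and crosses the failure limit.
normal = time_to_limit(q_gen=500.0, h=5.0, c_th=2.0e4, t_amb=25.0, t_limit=150.0)
lost_lube = time_to_limit(q_gen=1000.0, h=5.0, c_th=2.0e4, t_amb=25.0, t_limit=150.0)
print(normal, lost_lube)
```

Under normal heat generation the steady-state temperature (125 °C here) stays below the limit, so no failure time is reported; with doubled heat generation the limit is crossed in finite time.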
The development of a physiotherapy continence promotion program using a customer focus.
Chiarelli, Pauline; Cockburn, Jill
1999-01-01
Health promotion programs provide information, health education, and opportunities to develop the skills that people need to make healthy choices. The current climate of health care practice also directs its focus to the needs and wants of health care consumers. This entails active input from the target group. The present study used focus groups in an attempt to ensure input from women in the early postpartum period into the development of a postpartum continence promotion program. The focus groups revealed anomalies in women's perceived susceptibility to, and knowledge about, urinary incontinence and pelvic floor exercises, while highlighting other areas of need. Focus groups proved an invaluable tool in the development of a more effective physiotherapy continence promotion program.
Advances in EPA’s Rapid Exposure and Dosimetry Project (Interagency Alternatives Assessment Webinar)
Estimates of human and ecological exposures are required as critical input to risk-based prioritization and screening of chemicals. The CSS Rapid Exposure and Dosimetry project seeks to develop the data, tools, and evaluation approaches required to generate rapid and scientifical...
USDA-ARS?s Scientific Manuscript database
Successful hydrological model predictions depend on appropriate framing of scale and the spatial-temporal accuracy of input parameters describing soil hydraulic properties. Saturated soil hydraulic conductivity (Ksat) is one of the most important properties influencing water movement through soil un...
Fernandez, Fernando R.; Malerba, Paola; White, John A.
2015-01-01
The presence of voltage fluctuations arising from synaptic activity is a critical component in models of gain control, neuronal output gating, and spike rate coding. The degree to which individual neuronal input-output functions are modulated by voltage fluctuations, however, is not well established across different cortical areas. Additionally, the extent and mechanisms of input-output modulation through fluctuations have been explored largely in simplified models of spike generation, and with limited consideration for the role of non-linear and voltage-dependent membrane properties. To address these issues, we studied fluctuation-based modulation of input-output responses in medial entorhinal cortical (MEC) stellate cells of rats, which express strong sub-threshold non-linear membrane properties. Using in vitro recordings, dynamic clamp and modeling, we show that the modulation of input-output responses by random voltage fluctuations in stellate cells is significantly limited. In stellate cells, a voltage-dependent increase in membrane resistance at sub-threshold voltages mediated by Na+ conductance activation limits the ability of fluctuations to elicit spikes. Similarly, in exponential leaky integrate-and-fire models using a shallow voltage-dependence for the exponential term that matches stellate cell membrane properties, a low degree of fluctuation-based modulation of input-output responses can be attained. These results demonstrate that fluctuation-based modulation of input-output responses is not a universal feature of neurons and can be significantly limited by subthreshold voltage-gated conductances. PMID:25909971
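The exponential leaky integrate-and-fire (EIF) model referenced above can be sketched in a few lines: a leak term, an exponential spike-initiation term with sharpness ΔT, and a fluctuating input current. All parameter values here (time constant, thresholds, noise scale) are illustrative assumptions, not the fitted stellate-cell values from the study:

```python
import numpy as np

# Minimal exponential integrate-and-fire (EIF) neuron driven by a mean
# current plus Gaussian current fluctuations. Parameters are illustrative
# assumptions only.
def eif_spike_count(i_mean, i_std, delta_t=3.0, seed=0):
    rng = np.random.default_rng(seed)
    tau, e_l, v_t, v_reset, v_spike = 10.0, -70.0, -50.0, -70.0, 0.0  # ms, mV
    dt, steps = 0.05, 40000  # 2 s of simulated time
    v, spikes = e_l, 0
    for _ in range(steps):
        # white-noise current, scaled so its effect is dt-independent
        noise = i_std * rng.standard_normal() / np.sqrt(dt)
        dv = (-(v - e_l) + delta_t * np.exp((v - v_t) / delta_t)
              + i_mean + noise) / tau
        v += dt * dv
        if v >= v_spike:       # spike detected: reset membrane potential
            spikes += 1
            v = v_reset
    return spikes

quiet = eif_spike_count(i_mean=5.0, i_std=0.0)    # subthreshold, no noise
noisy = eif_spike_count(i_mean=5.0, i_std=50.0)   # same mean, fluctuations added
print(quiet, noisy)
```

With the same subthreshold mean drive, the noiseless neuron never fires while voltage fluctuations elicit spikes; the abstract's point is that the size of this fluctuation-driven effect depends on subthreshold voltage-gated conductances.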
Wang, Yi; Li, Chunyue; Tu, Cong; Hoyt, Greg D; DeForest, Jared L; Hu, Shuijin
2017-12-31
Intensive tillage and high inputs of chemicals are frequently used in conventional agriculture management, which critically depresses soil properties and causes soil erosion and nonpoint source pollution. Conservation practices, such as no-tillage and organic farming, have potential to enhance soil health. However, the long-term impact of no-tillage and organic practices on soil microbial diversity and community structure has not been fully understood, particularly in humid, warm climate regions such as the southeast USA. We hypothesized that organic inputs will lead to greater microbial diversity and a more stable microbial community, and that the combination of no-tillage and organic inputs will maximize soil microbial diversity. We conducted a long-term experiment in the southern Appalachian mountains of North Carolina, USA to test these hypotheses. The results showed that soil microbial diversity and community structure diverged under different management regimes after long term continuous treatments. Organic input dominated the effect of management practices on soil microbial properties, although no-tillage practice also exerted significant impacts. Both no-tillage and organic inputs significantly promoted soil microbial diversity and community stability. The combination of no-tillage and organic management increased soil microbial diversity over the conventional tillage and led to a microbial community structure more similar to the one in an adjacent grassland. These results indicate that effective management through reducing tillage and increasing organic C inputs can enhance soil microbial diversity and community stability. Copyright © 2017 Elsevier B.V. All rights reserved.
Guide for Commenting on NEEDS and IPM
Find on this page a document intended to provide guidance on submitting clear, concise, and impactful comments on NEEDS (National Electric Energy Data System), other inputs to the Integrated Planning Model (IPM), or outputs from IPM.
Probalistic Criticality Consequence Evaluation (SCPB:N/A)
DOE Office of Scientific and Technical Information (OSTI.GOV)
P. Gottlieb; J.W. Davis; J.R. Massari
1996-09-04
This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development (WPD) department with the objective of providing a comprehensive, conservative estimate of the consequences of the criticality which could possibly occur as the result of commercial spent nuclear fuel emplaced in the underground repository at Yucca Mountain. The consequences of criticality are measured principally in terms of the resulting changes in radionuclide inventory as a function of the power level and duration of the criticality. The purpose of this analysis is to extend the prior estimates of increased radionuclide inventory (Refs. 5.52 and 5.54), for both internal and external criticality. This analysis, and similar estimates and refinements to be completed before the end of fiscal year 1997, will be provided as input to Total System Performance Assessment-Viability Assessment (TSPA-VA) to demonstrate compliance with the repository performance objectives.
Zhang, Xiaoyu; Ju, Han; Penney, Trevor B; VanDongen, Antonius M J
2017-01-01
Humans instantly recognize a previously seen face as "familiar." To deepen our understanding of familiarity-novelty detection, we simulated biologically plausible neural network models of generic cortical microcircuits consisting of spiking neurons with random recurrent synaptic connections. NMDA receptor (NMDAR)-dependent synaptic plasticity was implemented to allow for unsupervised learning and bidirectional modifications. Network spiking activity evoked by sensory inputs consisting of face images altered synaptic efficacy, which resulted in the network responding more strongly to a previously seen face than a novel face. Network size determined how many faces could be accurately recognized as familiar. When the simulated model became sufficiently complex in structure, multiple familiarity traces could be retained in the same network by forming partially-overlapping subnetworks that differ slightly from each other, thereby resulting in a high storage capacity. Fisher's discriminant analysis was applied to identify critical neurons whose spiking activity predicted familiar input patterns. Intriguingly, as sensory exposure was prolonged, the selected critical neurons tended to appear at deeper layers of the network model, suggesting recruitment of additional circuits in the network for incremental information storage. We conclude that generic cortical microcircuits with bidirectional synaptic plasticity have an intrinsic ability to detect familiar inputs. This ability does not require a specialized wiring diagram or supervision and can therefore be expected to emerge naturally in developing cortical circuits.
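The core idea above, that unsupervised plasticity in a random recurrent network makes the network respond more strongly to a previously seen input, can be caricatured with a rate-based Hebbian sketch. Network size and learning rate are arbitrary assumptions; the study itself uses spiking neurons with NMDAR-dependent plasticity, not this simplification:

```python
import numpy as np

# Toy familiarity detector: Hebbian potentiation on a weak random recurrent
# weight matrix strengthens the network's response to a "seen" pattern.
rng = np.random.default_rng(42)
n = 200
w = 0.01 * rng.standard_normal((n, n))      # weak random recurrent weights

def response(weights, pattern):
    """Network drive evoked by a pattern: norm of the recurrent input."""
    return float(np.linalg.norm(weights @ pattern))

familiar = rng.standard_normal(n)
familiar /= np.linalg.norm(familiar)        # unit-norm input patterns
novel = rng.standard_normal(n)
novel /= np.linalg.norm(novel)

baseline = response(w, familiar)            # response before any exposure
w += 0.5 * np.outer(familiar, familiar)     # Hebbian potentiation ("seen")
print(response(w, familiar), response(w, novel))
```

After a single exposure, the potentiated weights amplify the familiar pattern far more than a novel one, which is the signature the study's Fisher-discriminant analysis detects in spiking activity.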
NASA Astrophysics Data System (ADS)
Weldeab, S.
2014-12-01
Understanding of the last interglacial (LIG) is critical for the assessment of long-term impact of global warming on the Atlantic meridional overturning circulation (AMOC) and climate. Relative to the last millennium, air temperature over Greenland and eustatic sea level during the LIG were higher by 8±4°C and 4-8 m, respectively, with a considerable oscillation in the rate of meltwater input (NEEM Community Members, Nature, v. 493, p. 489; Kopp et al., Nature, v. 462, p. 863). The impact of millennial-scale LIG meltwater input on the AMOC and global climate is, however, less understood. Here we present a highly resolved, benthic foraminiferal multi-proxy record from the eastern equatorial Atlantic. The record shows that the LIG was punctuated by at least two episodes of reduced AMOC whose impact on the global climate varied considerably. While the event between 126,000 and 123,800 years ago lacks imprints on available global climate records, the AMOC perturbation between 129,000 and 128,000 years ago provides a causative link to a rapid increase of atmospheric CO2, peak air warming over Antarctica, and a slowdown of the rate of global monsoon intensification. We suggest that the rate of meltwater input into the North Atlantic and the size of the remnant Greenland ice sheet were critical in determining the degree of AMOC reduction and its effect on the interhemispheric climate.
Yousefi, Mohammad; Mahdavi Damghani, Abdolmajid; Khoramivafa, Mahmud
2016-04-01
The aims of this study were to determine the energy requirement and global warming potential (GWP) of low and high input wheat production systems in western Iran. For this purpose, data were collected from 120 wheat farms using questionnaires administered via face-to-face interviews. Results showed that total energy input and output were 60,000 and 180,000 MJ ha(-1) in high input systems and 14,000 and 56,000 MJ ha(-1) in low input wheat production systems, respectively. The highest shares of total input energy in high input systems were recorded for electricity power, N fertilizer, and diesel fuel at 36, 18, and 13 %, respectively, while the highest shares of input energy in low input systems were observed for N fertilizer, diesel fuel, and seed at 32, 31, and 27 %. Energy use efficiency in high input systems (3.03) was lower than that of low input systems (3.94). Total CO2, N2O, and CH4 emissions in high input systems were 1981.25, 31.18, and 1.87 kg ha(-1), respectively. These amounts were 699.88, 0.02, and 0.96 kg ha(-1) in low input systems. In high input wheat production systems, total GWP was 11686.63 kg CO2eq ha(-1). This amount was 725.89 kg CO2eq ha(-1) in low input systems. The results show that 1 ha of a high input system produces a greenhouse effect roughly 17 times that of a low input system. Therefore, high input production systems need efficient and sustainable management to reduce environmental crises such as climate change.
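The reported ratios can be checked with simple arithmetic (figures copied from the abstract; the small mismatches with the published 3.03, 3.94, and "17 times" values reflect rounding in the abstract's inputs):

```python
# Back-of-envelope check of the reported energy and GWP ratios, using the
# rounded figures given in the abstract.
systems = {
    "high input": {"energy_in": 60_000, "energy_out": 180_000,  # MJ/ha
                   "gwp": 11686.63},                            # kg CO2eq/ha
    "low input":  {"energy_in": 14_000, "energy_out": 56_000,
                   "gwp": 725.89},
}
for name, s in systems.items():
    s["efficiency"] = s["energy_out"] / s["energy_in"]  # output/input energy
    print(f"{name}: energy use efficiency = {s['efficiency']:.2f}")

ratio = systems["high input"]["gwp"] / systems["low input"]["gwp"]
print(f"GWP ratio (high/low) = {ratio:.1f}")  # ≈16, close to the '17 times'
                                              # stated in the abstract
```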
NASA Technical Reports Server (NTRS)
Kellogg, W. W.
1975-01-01
A study was conducted to identify the sequence of processes that lead from some change in solar input to the earth to a change in tropospheric circulation and weather. Topics discussed include: inputs from the sun, the solar wind, and the magnetosphere; bremsstrahlung, ionizing radiation, cirrus clouds, thunderstorms, wave propagation, and gravity waves.
Can Non-Interactive Language Input Benefit Young Second-Language Learners?
ERIC Educational Resources Information Center
Au, Terry Kit-fong; Chan, Winnie Wailan; Cheng, Liao; Siegel, Linda S.; Tso, Ricky Van Yip
2015-01-01
To fully acquire a language, especially its phonology, children need linguistic input from native speakers early on. When interaction with native speakers is not always possible--e.g. for children learning a second language that is not the societal language--audios are commonly used as an affordable substitute. But does such non-interactive input…
A throughfall collection method using mixed bed ion exchange resin columns
Mark E. Fenn; Mark A. Poth; Michael J. Arbaugh
2002-01-01
Measurement of ionic deposition in throughfall is a widely used method for measuring deposition inputs to the forest floor. Many studies have been published, providing a large database of throughfall deposition inputs to forests. However, throughfall collection and analysis is labor intensive and expensive because of the large number of replicate collectors needed and...
Developing fire management mixes for fire program planning
Armando González-Cabán; Patricia B. Shinkle; Thomas J. Mills
1986-01-01
Evaluating economic efficiency of fire management program options requires information on the firefighting inputs, such as vehicles and crews, that would be needed to execute the program option selected. An algorithm was developed to translate automatically dollars allocated to type of firefighting inputs to numbers of units, using a set of weights for a specific fire...
Prototype Development of a Tradespace Analysis Tool for Spaceflight Medical Resources.
Antonsen, Erik L; Mulcahy, Robert A; Rubin, David; Blue, Rebecca S; Canga, Michael A; Shah, Ronak
2018-02-01
The provision of medical care in exploration-class spaceflight is limited by mass, volume, and power constraints, as well as limitations of available skillsets of crewmembers. A quantitative means of exploring the risks and benefits of inclusion or exclusion of onboard medical capabilities may help to inform the development of an appropriate medical system. A pilot project was designed to demonstrate the utility of an early tradespace analysis tool for identifying high-priority resources geared toward properly equipping an exploration mission medical system. Physician subject matter experts identified resources, tools, and skillsets required, as well as associated criticality scores of the same, to meet terrestrial, U.S.-specific ideal medical solutions for conditions concerning for exploration-class spaceflight. A database of diagnostic and treatment actions and resources was created based on this input and weighed against the probabilities of mission-specific medical events to help identify common and critical elements needed in a future exploration medical capability. Analysis of repository data demonstrates the utility of a quantitative method of comparing various medical resources and skillsets for future missions. Directed database queries can provide detailed comparative estimates concerning likelihood of resource utilization within a given mission and the weighted utility of tangible and intangible resources. This prototype tool demonstrates one quantitative approach to the complex needs and limitations of an exploration medical system. While this early version identified areas for refinement in future version development, more robust analysis tools may help to inform the development of a comprehensive medical system for future exploration missions. Antonsen EL, Mulcahy RA, Rubin D, Blue RS, Canga MA, Shah R. Prototype development of a tradespace analysis tool for spaceflight medical resources. Aerosp Med Hum Perform. 2018; 89(2):108-114.
Models, measurement, and strategies in developing critical-thinking skills.
Brunt, Barbara A
2005-01-01
Health care professionals must use critical-thinking skills to solve increasingly complex problems. Educators need to help nurses develop their critical-thinking skills to maintain and enhance their competence. This article reviews various models of critical thinking, as well as methods used to evaluate critical thinking. Specific educational strategies to develop nurses' critical-thinking skills are discussed. Additional research studies are needed to determine how the process of nursing practice can nurture and develop critical-thinking skills, and which strategies are most effective in developing and evaluating critical thinking.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-22
... (Operation Enduring Freedom/ Operation Iraqi Freedom Veterans Health Needs Assessment) Activity; Comment...: Operation Enduring Freedom/Operation Iraqi Freedom Veterans Health Needs Assessment, VA Form 10-21091. OMB... 10-21091 is used to gather input from returning war zone veterans to identify their needs, concerns...
NASA Astrophysics Data System (ADS)
Pioldi, Fabio; Ferrari, Rosalba; Rizzi, Egidio
2016-02-01
The present paper deals with the seismic modal dynamic identification of frame structures by a refined Frequency Domain Decomposition (rFDD) algorithm, autonomously formulated and implemented within MATLAB. First, the output-only identification technique is outlined analytically and then employed to characterize all modal properties. Synthetic response signals generated prior to the dynamic identification are adopted as input channels, in view of assessing a necessary condition for the procedure's efficiency. Initially, the algorithm is verified on canonical input from random excitation. Then, modal identification is attempted successfully with given seismic input, taken as base excitation, including both strong-motion data and single and multiple input ground motions. Unlike earlier attempts that investigated the role of seismic response signals in the Time Domain, this paper considers the identification analysis in the Frequency Domain. Results turn out highly consistent with the target values, with quite limited errors in the modal estimates, including the damping ratios, which range from roughly 1% to 10%. Both seismic excitation and high damping values, which remain critical even for well-spaced modes, violate traditional FDD assumptions; this demonstrates the robustness of the developed algorithm. Through original strategies and arrangements, the paper shows that a comprehensive rFDD modal dynamic identification of frames under seismic input is feasible, even with concomitant high damping.
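For readers unfamiliar with the underlying technique, classical FDD can be sketched on synthetic two-channel data: average the output cross-spectral matrix over segments, take an SVD at each frequency bin, and peak-pick the first singular value. This is only the textbook FDD idea (signal, noise level, and segment sizes are assumptions); the paper's refined rFDD adds further machinery for seismic input and high damping:

```python
import numpy as np

# Minimal frequency domain decomposition (FDD) sketch: SVD of the output
# cross-spectral matrix per frequency bin, then peak-picking.
rng = np.random.default_rng(1)
fs, n_seg, seg = 64, 64, 256          # sample rate, segments, segment length
f0 = 8.0                              # "modal" frequency, placed on an FFT bin
t = np.arange(n_seg * seg) / fs
mode = np.sin(2 * np.pi * f0 * t)
outputs = np.stack([mode + 0.5 * rng.standard_normal(t.size),
                    0.7 * mode + 0.5 * rng.standard_normal(t.size)])

# Average the cross-spectral matrix G(f) over segments, then SVD per bin.
freqs = np.fft.rfftfreq(seg, d=1 / fs)
g = np.zeros((freqs.size, 2, 2), dtype=complex)
for k in range(n_seg):
    x = np.fft.rfft(outputs[:, k * seg:(k + 1) * seg], axis=1)
    g += np.einsum('if,jf->fij', x, x.conj()) / n_seg
s1 = np.linalg.svd(g, compute_uv=False)[:, 0]   # first singular value per bin

f_peak = freqs[np.argmax(s1)]         # peak-pick: estimated modal frequency
print(f_peak)
```

The first singular value peaks at the modal frequency even with noisy channels, which is the property output-only identification exploits.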
Variance-based interaction index measuring heteroscedasticity
NASA Astrophysics Data System (ADS)
Ito, Keiichi; Couckuyt, Ivo; Poles, Silvia; Dhaene, Tom
2016-06-01
This work is motivated by the need to deal with models with high-dimensional input spaces of real variables. One way to tackle high-dimensional problems is to identify interaction or non-interaction among input parameters. We propose a new variance-based sensitivity interaction index that can detect and quantify interactions among the input variables of mathematical functions and computer simulations. The computation is very similar to that of Sobol' first-order sensitivity indices. The proposed interaction index can quantify the relative importance of input variables in interaction. Furthermore, detection of non-interaction for screening can be done with as few as 4n + 2 function evaluations, where n is the number of input variables. Using the interaction indices based on heteroscedasticity, the original function may be decomposed into a set of lower-dimensional functions which may then be analyzed separately.
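The Sobol' first-order indices that the proposed computation resembles can be sketched with a pick-and-freeze (Saltelli-style) estimator on a toy function with a known interaction. This illustrates the classical indices the abstract builds on, not the paper's heteroscedasticity-based interaction index itself; the test function and sample size are assumptions:

```python
import numpy as np

# Pick-and-freeze estimator for first-order Sobol' indices on
# f(x) = x0 + x1*x2: additive in x0, with an x1-x2 interaction.
rng = np.random.default_rng(7)

def f(x):
    return x[:, 0] + x[:, 1] * x[:, 2]

n, d = 500_000, 3
a = rng.uniform(size=(n, d))
b = rng.uniform(size=(n, d))
fa, fb = f(a), f(b)
var = np.var(np.concatenate([fa, fb]))

s = np.empty(d)
for i in range(d):
    ab = a.copy()
    ab[:, i] = b[:, i]                   # resample only input x_i
    s[i] = np.mean(fb * (f(ab) - fa)) / var

print(s, s.sum())   # sum of first-order indices < 1 reveals interaction
```

For this function the analytic indices are 12/19 ≈ 0.63 for x0 and 3/19 ≈ 0.16 each for x1 and x2; their sum falling short of 1 is the variance left to the x1-x2 interaction, the quantity an interaction index aims to attribute.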
There is a critical opportunity in the field of nanoscience to compare and integrate information across diverse fields of study through informatics (i.e., nanoinformatics). This paper is one in a series of articles on the data curation process in nanoinformatics (nanocuration). O...
EPA Exposure Research and the ExpoCast Project: New Methods and New Data (NIEHS Exposome webinar)
Estimates of human and ecological exposures are required as critical input to risk-based prioritization and screening of thousands of chemicals. In a 2009 commentary in Environmental Health Perspectives, Shelden and Hubal proposed that “Novel statistical and informatic approaches...
USDA-ARS?s Scientific Manuscript database
Deployment of biomass feedstock production systems in marginal lands with minimal external inputs is being recommended for sustainable feedstock supply. While nitrogen is critical for plant growth, injudicious application of fertilizer nitrogen in such marginal lands could magnify the existing non-p...
Concentrating on Affective Feedforward in Online Tutoring
ERIC Educational Resources Information Center
Chen, Ya-Ting; Chou, Yung-Hsin; Cowan, John
2014-01-01
With considerable input from the student voice, the paper centres on a detailed account of the experiences of Western academic, tutoring Eastern students online to develop their critical thinking skills. From their online experiences together as tutor and students, the writers present a considered case for the main emphasis in facilitative online…
ERIC Educational Resources Information Center
Jones, Frankie S.
2007-01-01
This qualitative study explored how collaborative technologies influence the informal learning experiences of virtual team members. Inputs revealed as critical to virtual informal learning were integrated, collaborative technological systems; positive relationships and trust; and organizational support and virtual team management. These inputs…
ERIC Educational Resources Information Center
Navracsics, Judit
2014-01-01
According to the critical period hypothesis, the earlier the acquisition of a second language starts, the better. Owing to the plasticity of the brain, up until a certain age a second language can be acquired successfully according to this view. Early second language learners are commonly said to have an advantage over later ones especially in…
1986-04-14
[OCR-garbled program-phase chart; recoverable headings: CONCEPT DEFINITION, DEVELOPMENT/TEST, OPERATION AND MAINTENANCE; TRACK PROJECTED PROGRAMS; REVIEW CRITICAL ISSUES; PREPARE INPUTS TO PMO] ...development and beyond, evaluation criteria must include quantitative goals (the desired value) and thresholds (the value beyond which the charac
Dispersion modeling tools have traditionally provided critical information for air quality management decisions, but have been used recently to provide exposure estimates to support health studies. However, these models can be challenging to implement, particularly in near-road s...
Flight Termination Systems Commonality Standard
2014-09-01
[OCR-garbled glossary and figure-list fragments; recoverable items:] input level is increased 60 dB from threshold. Corona: a visible electric discharge resulting from a partial electric breakdown of gases as the... Figure 4-10, Electrostatic Discharge Test... check channel telemetry output (EFTR); CDR: critical design review; CFS: flight self-discharge in amp-hrs; CR: remaining capacity for determining
Elementary Social Studies: Alaska Curriculum Guide. Second Edition.
ERIC Educational Resources Information Center
Alaska State Dept. of Education, Juneau. Office of Curriculum Services.
This guide represents a synthesis of input from many sources, both Alaskan and national. The critical components of a social studies education (knowledge, democratic beliefs and values, and skills) are incorporated throughout the guide which also features the concepts of justice, equality, responsibility, rule of law, freedom, diversity, privacy,…
Yanjun Su; Qinghua Guo; Danny L. Fry; Brandon M. Collins; Maggi Kelly; Jacob P. Flanagan; John J. Battles
2016-01-01
Abstract. Accurate vegetation mapping is critical for natural resources management, ecological analysis, and hydrological modeling, among other tasks. Remotely sensed multispectral and hyperspectral imageries have proved to be valuable inputs to the vegetation mapping process, but they can provide only limited vegetation structure...
L.H. Pardo; C.T. Driscoll; C.L. Goodale
2011-01-01
This publication provides a scientific synthesis of the current state of research and knowledge about the response of terrestrial and aquatic ecosystems to nitrogen (N) inputs (N deposition or N additions), and, where possible, identifies critical loads for atmospheric N deposition. It also targets policy makers and resource managers who are seeking a scientific basis...
USDA-ARS?s Scientific Manuscript database
Irradiance, CO2, and temperature are critical inputs for photosynthesis and crop growth. They are also environmental parameters which growers can control in protected horticulture production systems. We evaluated the photosynthetic response of 13 herbaceous ornamentals (Begonia × hiemalis, Begonia...
Feed efficiency - how should it be used for the cow herd?
USDA-ARS?s Scientific Manuscript database
In cows, the most critical factor influencing the output component of efficiency is reproductive rate, and not necessarily weight gain. Thus benefits of selecting animals with desirable measures of feed efficiency on cow efficiency remain to be determined. The feed input component of cow efficiency...
Processing Instruction: A Review of Issues
ERIC Educational Resources Information Center
Rasuki, Muhlisin
2017-01-01
This paper provides a critical review of Processing Instruction (PI). This type of instructional option was specifically designed to help second/foreign language (L2) learners grasp meaning manifested in the use of particular grammatical forms in a target language effectively through the provision of input. In this way, PI attempts to help…
Cover crop, N-rate impacts on corn yield and soil N
USDA-ARS's Scientific Manuscript database
Nitrogen fertilizer is a significant input expense for producers, as conversion of stable nitrogen into plant available reactive forms such as NH4 or NO3 is energy intensive and costly. These reactive forms of nitrogen (Nr), critical for crop production, can escape from agricultural systems into sur...
Estimates of human and ecological exposures are required as critical input to risk-based prioritization and screening of chemicals. This project seeks to develop the data, tools, and evaluation approaches required to generate rapid and scientifically-defensible exposure predictio...
Department of Defense meteorological and environmental inputs to aviation systems
NASA Technical Reports Server (NTRS)
Try, P. D.
1983-01-01
Recommendations based on need, cost, and achievement of flight safety are offered, and the re-evaluation of weather parameters needed for safe landing operations that lead to reliable and consistent automated observation capabilities are considered.
A Measure Approximation for Distributionally Robust PDE-Constrained Optimization Problems
Kouri, Drew Philip
2017-12-19
In numerous applications, scientists and engineers acquire varied forms of data that partially characterize the inputs to an underlying physical system. This data is then used to inform decisions such as controls and designs. Consequently, it is critical that the resulting control or design is robust to the inherent uncertainties associated with the unknown probabilistic characterization of the model inputs. In this work, we consider optimal control and design problems constrained by partial differential equations with uncertain inputs. We do not assume a known probabilistic model for the inputs, but rather we formulate the problem as a distributionally robust optimization problem where the outer minimization problem determines the control or design, while the inner maximization problem determines the worst-case probability measure that matches desired characteristics of the data. We analyze the inner maximization problem in the space of measures and introduce a novel measure approximation technique, based on the approximation of continuous functions, to discretize the unknown probability measure. Finally, we prove consistency of our approximated min-max problem and conclude with numerical results.
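At toy scale, the outer-min / inner-max structure described in this abstract can be sketched by discretizing the unknown measure into a finite list of candidate probability vectors and grid-searching the control. The quadratic loss, scenario values, and candidate measures below are invented for illustration and are far simpler than the paper's PDE-constrained setting:

```python
def robust_control(xis, candidate_ps, zs):
    """Minimize over the control z the worst-case expected loss, where the
    worst case is taken over a finite (discretized) set of candidate
    probability vectors on the scenarios xis."""
    def loss(z, xi):
        return (z - xi) ** 2  # toy quadratic mismatch, stand-in for a PDE output

    best = None
    for z in zs:
        # inner maximization: worst-case expectation over candidate measures
        worst = max(sum(p_i * loss(z, xi) for p_i, xi in zip(p, xis))
                    for p in candidate_ps)
        # outer minimization: keep the control with the smallest worst case
        if best is None or worst < best[1]:
            best = (z, worst)
    return best
```

With scenarios {0, 1} and candidate measures including both point masses, the worst case at control z is max(z², (1 − z)²), so the robust optimum sits at z = 0.5.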
NASA Astrophysics Data System (ADS)
Majumder, Himadri; Maity, Kalipada
2018-03-01
Shape memory alloys have a unique capability to return to their original shape after physical deformation when heat, thermo-mechanical, or magnetic load is applied. In this experimental investigation, desirability function analysis (DFA), a multi-attribute decision-making method, was used to find the optimum input parameter setting during wire electrical discharge machining (WEDM) of Ni-Ti shape memory alloy. Four critical machining parameters, namely pulse on time (TON), pulse off time (TOFF), wire feed (WF) and wire tension (WT), were taken as machining inputs for the experiments to optimize three interconnected responses: cutting speed, kerf width, and surface roughness. The input parameter combination TON = 120 μs, TOFF = 55 μs, WF = 3 m/min and WT = 8 kg-F was found to produce the optimum results. The optimum process parameters for each desired response were also attained using Taguchi's signal-to-noise ratio. A confirmation test validated the optimum machining parameter combination, affirming that DFA is a competent approach for selecting optimum input parameters for the desired response quality in WEDM of Ni-Ti shape memory alloy.
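The DFA recipe — map each response onto a 0–1 desirability scale (larger-the-better for cutting speed, smaller-the-better for kerf width and surface roughness) and combine the scales by a geometric mean — can be sketched as follows. The linear transformations and data-driven bounds are illustrative assumptions, not the study's actual desirability functions:

```python
import numpy as np

def desirability_larger(y, lo, hi):
    # larger-the-better response (e.g., cutting speed)
    return np.clip((y - lo) / (hi - lo), 0.0, 1.0)

def desirability_smaller(y, lo, hi):
    # smaller-the-better response (e.g., kerf width, surface roughness)
    return np.clip((hi - y) / (hi - lo), 0.0, 1.0)

def composite_desirability(rows):
    """rows: one (cutting_speed, kerf, roughness) tuple per experimental run.
    Returns the composite desirability (geometric mean) of each run."""
    speed, kerf, ra = (np.array(c, float) for c in zip(*rows))
    d = np.vstack([
        desirability_larger(speed, speed.min(), speed.max()),
        desirability_smaller(kerf, kerf.min(), kerf.max()),
        desirability_smaller(ra, ra.min(), ra.max()),
    ])
    return d.prod(axis=0) ** (1.0 / d.shape[0])  # geometric mean per run
```

The run with the highest composite desirability would be selected as the optimum parameter combination; a run that dominates on all three responses scores 1.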
Hypersonic Vehicle Trajectory Optimization and Control
NASA Technical Reports Server (NTRS)
Balakrishnan, S. N.; Shen, J.; Grohs, J. R.
1997-01-01
Two classes of neural networks have been developed for the study of hypersonic vehicle trajectory optimization and control. The first one is called an 'adaptive critic'. The uniqueness and main features of this approach are that: (1) they need no external training; (2) they allow variability of initial conditions; and (3) they can serve as feedback control. This is used to solve a 'free final time' two-point boundary value problem that maximizes the mass at the rocket burn-out while satisfying the pre-specified burn-out conditions in velocity, flightpath angle, and altitude. The second neural network is a recurrent network. An interesting feature of this network formulation is that when its inputs are the coefficients of the dynamics and control matrices, the network outputs are the Kalman sequences (with a quadratic cost function); the same network is also used for identifying the coefficients of the dynamics and control matrices. Consequently, we can use it to control a system whose parameters are uncertain. Numerical results are presented which illustrate the potential of these methods.
Challenges for Preclinical Investigations of Human Biofield Modalities
Gronowicz, Gloria; Bengston, William
2015-01-01
Preclinical models for studying the effects of the human biofield have great potential to advance our understanding of human biofield modalities, which include external qigong, Johrei, Reiki, therapeutic touch, healing touch, polarity therapy, pranic healing, and other practices. A short history of Western biofield studies using preclinical models is presented and demonstrates numerous and consistent examples of human biofields significantly affecting biological systems both in vitro and in vivo. Methodological issues arising from these studies and practical solutions in experimental design are presented. Important questions still left unanswered with preclinical models include variable reproducibility, dosing, intentionality of the practitioner, best preclinical systems, and mechanisms. Input from the biofield practitioners in the experimental design is critical to improving experimental outcomes; however, the development of standard criteria for uniformity of practice and for inclusion of multiple practitioners is needed. Research in human biofield studies involving preclinical models promises a better understanding of the mechanisms underlying the efficacy of biofield therapies and will be important in guiding clinical protocols and integrating treatments with conventional medical therapies. PMID:26665042
Effects of human fatigue on speech signals
NASA Astrophysics Data System (ADS)
Stamoulis, Catherine
2004-05-01
Cognitive performance may be significantly affected by fatigue. In the case of critical personnel, such as pilots, monitoring human fatigue is essential to ensure safety and success of a given operation. One of the modalities that may be used for this purpose is speech, which is sensitive to respiratory changes and increased muscle tension of vocal cords, induced by fatigue. Age, gender, vocal tract length, physical and emotional state may significantly alter speech intensity, duration, rhythm, and spectral characteristics. In addition to changes in speech rhythm, fatigue may also affect the quality of speech, such as articulation. In a noisy environment, detecting fatigue-related changes in speech signals, particularly subtle changes at the onset of fatigue, may be difficult. Therefore, in a performance-monitoring system, speech parameters which are significantly affected by fatigue need to be identified and extracted from input signals. For this purpose, a series of experiments was performed under slowly varying cognitive load conditions and at different times of the day. The results of the data analysis are presented here.
A Lay Ethics Quest for Technological Futures: About Tradition, Narrative and Decision-Making.
van der Burg, Simone
2016-01-01
Making better choices about future technologies that are being researched or developed is an important motivator behind lay ethics interventions. However, in practice, they do not always succeed in serving that goal. In particular, authors who have noted that lay ethicists sometimes take recourse to well-known themes stemming from old, even 'archetypical', stories have been criticized for making too little room for agency and decision-making in their approach. This paper aims to contribute to a reflection on how lay ethics can acquire more practical relevance. It will use resources in narrative ethics to suggest that in order to be relevant for action, facilitators of lay ethics interventions need to invite participants to engage in a narrative quest. As part of a quest, lay ethicists should be asked to (1) reflect on a specific question or choice, (2) use diverse (imaginative) input which is informative about the heterogeneity of viewpoints that are defended in society and (3) argue for their standpoints.
Seismic event classification system
Dowla, F.U.; Jarpe, S.P.; Maurer, W.
1994-12-13
In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities. 21 figures.
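The pre-processing chain described (time-frequency distribution → binary map → magnitude of the 2-D FFT, which removes dependence on shifts) can be sketched as below. The window length, hop size, and median threshold are illustrative choices, and a simple short-time Fourier magnitude stands in for whatever time-frequency distribution the patented system actually uses:

```python
import numpy as np

def shift_invariant_representation(signal, win=64, hop=32, threshold=None):
    """Sketch of the SONN pre-processing: time-frequency distribution,
    binarization, then |2-D FFT|, which is invariant to circular shifts
    of the binary map."""
    # Short-time Fourier magnitude as a simple time-frequency distribution.
    frames = [signal[i:i + win] for i in range(0, len(signal) - win + 1, hop)]
    tfd = np.abs(np.array([np.fft.rfft(f * np.hanning(win)) for f in frames]))
    # Binarize: keep time-frequency cells above the median magnitude.
    thr = np.median(tfd) if threshold is None else threshold
    binary = (tfd > thr).astype(float)
    # Magnitude of the 2-D FFT gives the shift-invariant representation.
    return np.abs(np.fft.fft2(binary))
```

The output array would then be fed to a Kohonen or ART self-organizing network as its input vector.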
Memory attacks on device-independent quantum cryptography.
Barrett, Jonathan; Colbeck, Roger; Kent, Adrian
2013-01-04
Device-independent quantum cryptographic schemes aim to guarantee security to users based only on the output statistics of any components used, and without the need to verify their internal functionality. Since this would protect users against untrustworthy or incompetent manufacturers, sabotage, or device degradation, this idea has excited much interest, and many device-independent schemes have been proposed. Here we identify a critical weakness of device-independent protocols that rely on public communication between secure laboratories. Untrusted devices may record their inputs and outputs and reveal information about them via publicly discussed outputs during later runs. Reusing devices thus compromises the security of a protocol and risks leaking secret data. Possible defenses include securely destroying or isolating used devices. However, these are costly and often impractical. We propose other more practical partial defenses as well as a new protocol structure for device-independent quantum key distribution that aims to achieve composable security in the case of two parties using a small number of devices to repeatedly share keys with each other (and no other party).
The phonological loop as a buffer store: An update.
Baddeley, Alan D; Hitch, Graham J
2018-05-30
We regard our multicomponent model of working memory as reflecting a hierarchy of buffer stores with buffer storage providing an effective way of combining information from two or more streams that may differ in either the speed of input or in the features coded. We illustrate this through the case of the phonological loop component of the model. We discuss its gradual development through a combination of evidence from mainstream cognition and neuropsychology with the need for more detailed modelling of issues such as the representation of serial order. A brief account follows of the application, beyond the laboratory and clinic, of the concept of a phonological loop and the methods designed to study it. We then discuss some criticisms of the overall multicomponent model, concluding with a discussion of the major contribution made by neuropsychological evidence to its development together with some suggestions as to comparative lack of influence from more recent studies based on neuro-imaging. Copyright © 2018 Elsevier Ltd. All rights reserved.
Seismic event classification system
Dowla, Farid U.; Jarpe, Stephen P.; Maurer, William
1994-01-01
In the computer interpretation of seismic data, the critical first step is to identify the general class of an unknown event. For example, the classification might be: teleseismic, regional, local, vehicular, or noise. Self-organizing neural networks (SONNs) can be used for classifying such events. Both Kohonen and Adaptive Resonance Theory (ART) SONNs are useful for this purpose. Given the detection of a seismic event and the corresponding signal, computation is made of: the time-frequency distribution, its binary representation, and finally a shift-invariant representation, which is the magnitude of the two-dimensional Fourier transform (2-D FFT) of the binary time-frequency distribution. This pre-processed input is fed into the SONNs. These neural networks are able to group events that look similar. The ART SONN has an advantage in classifying the event because the types of cluster groups do not need to be pre-defined. The results from the SONNs together with an expert seismologist's classification are then used to derive event classification probabilities.
Tchumatchenko, Tatjana; Clopath, Claudia
2014-01-01
Oscillations play a critical role in cognitive phenomena and have been observed in many brain regions. Experimental evidence indicates that classes of neurons exhibit properties that could promote oscillations, such as subthreshold resonance and electrical gap junctions. Typically, these two properties are studied separately but it is not clear which is the dominant determinant of global network rhythms. Our aim is to provide an analytical understanding of how these two effects destabilize the fluctuation-driven state, in which neurons fire irregularly, and lead to an emergence of global synchronous oscillations. Here we show how the oscillation frequency is shaped by single neuron resonance, electrical and chemical synapses. The presence of both gap junctions and subthreshold resonance are necessary for the emergence of oscillations. Our results are in agreement with several experimental observations such as network responses to oscillatory inputs and offer a much-needed conceptual link connecting a collection of disparate effects observed in networks. PMID:25405458
Risk Mitigation for the Development of the New Ariane 5 On-Board Computer
NASA Astrophysics Data System (ADS)
Stransky, Arnaud; Chevalier, Laurent; Dubuc, Francois; Conde-Reis, Alain; Ledoux, Alain; Miramont, Philippe; Johansson, Leif
2010-08-01
In the frame of the Ariane 5 production, some equipment will become obsolete and need to be redesigned and redeveloped. This is the case for the On-Board Computer, which has to be completely redesigned and re-qualified by RUAG Space, as well as all its on-board software and associated development tools by ASTRIUM ST. This paper presents this obsolescence treatment, which started in 2007 under an ESA contract, in the frame of the ACEP and ARTA accompaniment programmes, and is very critical not only in technical terms but also from a schedule point of view: it gives the context and overall development plan, and details the risk mitigation actions agreed with ESA, especially those related to the development of the input/output ASIC, and also the on-board software porting and revalidation strategy. The efficiency of these risk mitigation actions has been proven by the outcome schedule; this development constitutes an up-to-date case for good practices, including experience reports and feedback for future developments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelsen, Nicholas H.; Kolb, James D.; Kulkarni, Akshay G.
Mechanical component response to shock environments must be predictable in order to ensure reliability and safety. Whether the shock input results from accidental drops during transportation or from projectile impact scenarios, the system must irreversibly transition into a safe state that is incapable of triggering the component. With this critical need in mind, the 2017 Nuclear Weapons Summer Product Realization Institute (NW SPRINT) program objective sought the design of a passive shock failsafe with emphasis on additively manufactured (AM) components. Team Advanced and Exploratory (A&E) responded to the challenge by designing and delivering multiple passive shock sensing mechanisms that activate within a prescribed mechanical shock threshold. These AM failsafe designs were tuned and validated using analytical and computational techniques including the shock response spectrum (SRS) and finite element analysis (FEA). After rapid prototyping, the devices underwent physical shock tests conducted on Sandia drop tables to experimentally verify performance. Keywords: Additive manufacturing, dynamic system, failsafe, finite element analysis, mechanical shock, NW SPRINT, shock response spectrum
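A shock response spectrum, one of the validation tools named above, can be computed by driving a damped single-degree-of-freedom oscillator with the base acceleration at each candidate natural frequency and recording the peak response. The semi-implicit Euler integrator and Q value below are illustrative choices, not the Sandia analysis workflow:

```python
import numpy as np

def shock_response_spectrum(accel, dt, freqs, Q=10.0):
    """Maximax SRS sketch: for each natural frequency fn, integrate a damped
    SDOF oscillator base-driven by `accel` and record the peak absolute
    response acceleration."""
    zeta = 1.0 / (2.0 * Q)  # damping ratio from quality factor
    srs = []
    for fn in freqs:
        wn = 2.0 * np.pi * fn
        x, v, peak = 0.0, 0.0, 0.0
        for a in accel:
            # relative-motion SDOF: x'' + 2*zeta*wn*x' + wn^2*x = -a
            v += dt * (-a - 2.0 * zeta * wn * v - wn * wn * x)
            x += dt * v
            # absolute response acceleration of the mass = -(2*zeta*wn*v + wn^2*x) - (-a) - a
            peak = max(peak, abs(wn * wn * x + 2.0 * zeta * wn * v))
        srs.append(peak)
    return np.array(srs)
```

For a short half-sine base pulse, the SRS is small well below the pulse's characteristic frequency and approaches the peak input level at high frequencies, which is the usual sanity check on such an implementation.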
Zhang, Zhen; Ma, Cheng; Zhu, Rong
2016-10-14
High integration of multi-functional instruments raises a critical issue in temperature control that is challenging due to its spatial-temporal complexity. This paper presents a multi-input multi-output (MIMO) self-tuning temperature sensing and control system for efficiently modulating the temperature environment within a multi-module instrument. The smart system ensures that the internal temperature of the instrument converges to a target without the need of a system model, thus making the control robust. The system consists of a fully-connected proportional-integral-derivative (PID) neural network (FCPIDNN) and an on-line self-tuning module. The experimental results show that the presented system can effectively control the internal temperature under various mission scenarios, in particular, it is able to self-reconfigure upon actuator failure. The system provides a new scheme for a complex and time-variant MIMO control system which can be widely applied for the distributed measurement and control of the environment in instruments, integration electronics, and house constructions.
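The MIMO FCPIDNN controller itself is beyond a short sketch, but the single-loop PID building block it generalizes can be illustrated on a hypothetical first-order thermal plant. All plant constants and gains below are invented for illustration and are not taken from the paper:

```python
def simulate_pid(setpoint=50.0, ambient=20.0, steps=4000, dt=0.1,
                 kp=2.0, ki=0.5, kd=0.1, tau=30.0, gain=1.0):
    """Single-loop PID sketch driving a first-order thermal plant toward a
    temperature setpoint; the integral term removes the steady-state error.
    Returns the final plant temperature."""
    temp, integral, prev_err = ambient, 0.0, setpoint - ambient
    for _ in range(steps):
        err = setpoint - temp
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = max(0.0, kp * err + ki * integral + kd * deriv)  # heater power >= 0
        # first-order plant: relaxes toward ambient, heated by the control input
        temp += dt * ((ambient - temp) / tau + gain * u / tau)
        prev_err = err
    return temp
```

The paper's system replaces the fixed gains with a fully-connected PID neural network whose weights are tuned on-line, which is what makes the control model-free and robust to actuator failure.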
Steidl, Stephan; Wang, Huiling; Wise, Roy A
2014-01-01
Cholinergic input to the ventral tegmental area (VTA) is known to contribute to reward. Although it is known that the pedunculopontine tegmental nucleus (PPTg) provides an important source of excitatory input to the dopamine system, the specific role of PPTg cholinergic input to the VTA in cocaine reward has not been previously determined. We used a diphtheria toxin conjugated to urotensin-II (Dtx::UII), the endogenous ligand for urotensin-II receptors expressed by PPTg cholinergic but not glutamatergic or GABAergic cells, to lesion cholinergic PPTg neurons. Dtx::UII toxin infusion resulted in the loss of 95.78 (±0.65)% of PPTg cholinergic cells but did not significantly alter either cocaine or heroin self-administration or the development of cocaine or heroin conditioned place preferences. Thus, cholinergic cells originating in PPTg do not appear to be critical for the rewarding effects of cocaine or of heroin.
Dendritic nonlinearities are tuned for efficient spike-based computations in cortical circuits.
Ujfalussy, Balázs B; Makara, Judit K; Branco, Tiago; Lengyel, Máté
2015-12-24
Cortical neurons integrate thousands of synaptic inputs in their dendrites in highly nonlinear ways. It is unknown how these dendritic nonlinearities in individual cells contribute to computations at the level of neural circuits. Here, we show that dendritic nonlinearities are critical for the efficient integration of synaptic inputs in circuits performing analog computations with spiking neurons. We developed a theory that formalizes how a neuron's dendritic nonlinearity that is optimal for integrating synaptic inputs depends on the statistics of its presynaptic activity patterns. Based on their in vivo presynaptic population statistics (firing rates, membrane potential fluctuations, and correlations due to ensemble dynamics), our theory accurately predicted the responses of two different types of cortical pyramidal cells to patterned stimulation by two-photon glutamate uncaging. These results reveal a new computational principle underlying dendritic integration in cortical neurons by suggesting a functional link between cellular and systems-level properties of cortical circuits.
Dust inputs and bacteria influence dissolved organic matter in clear alpine lakes.
Mladenov, N; Sommaruga, R; Morales-Baquero, R; Laurion, I; Camarero, L; Diéguez, M C; Camacho, A; Delgado, A; Torres, O; Chen, Z; Felip, M; Reche, I
2011-07-26
Remote lakes are usually unaffected by direct human influence, yet they receive inputs of atmospheric pollutants, dust, and other aerosols, both inorganic and organic. In remote, alpine lakes, these atmospheric inputs may influence the pool of dissolved organic matter, a critical constituent for the biogeochemical functioning of aquatic ecosystems. Here, to assess this influence, we evaluate factors related to aerosol deposition, climate, catchment properties, and microbial constituents in a global dataset of 86 alpine and polar lakes. We show significant latitudinal trends in dissolved organic matter quantity and quality, and uncover new evidence that this geographic pattern is influenced by dust deposition, flux of incident ultraviolet radiation, and bacterial processing. Our results suggest that changes in land use and climate that result in increasing dust flux, ultraviolet radiation, and air temperature may act to shift the optical quality of dissolved organic matter in clear, alpine lakes. © 2011 Macmillan Publishers Limited. All rights reserved.
Attention Enhances Synaptic Efficacy and Signal-to-Noise in Neural Circuits
Briggs, Farran; Mangun, George R.; Usrey, W. Martin
2013-01-01
Attention is a critical component of perception. However, the mechanisms by which attention modulates neuronal communication to guide behavior are poorly understood. To elucidate the synaptic mechanisms of attention, we developed a sensitive assay of attentional modulation of neuronal communication. In alert monkeys performing a visual spatial attention task, we probed thalamocortical communication by electrically stimulating neurons in the lateral geniculate nucleus of the thalamus while simultaneously recording shock-evoked responses from monosynaptically connected neurons in primary visual cortex. We found that attention enhances neuronal communication by (1) increasing the efficacy of presynaptic input in driving postsynaptic responses, (2) increasing synchronous responses among ensembles of postsynaptic neurons receiving independent input, and (3) decreasing redundant signals between postsynaptic neurons receiving common input. These results demonstrate that attention finely tunes neuronal communication at the synaptic level by selectively altering synaptic weights, enabling enhanced detection of salient events in the noisy sensory milieu. PMID:23803766
Integrated input protection against discharges for Micro Pattern Gas Detectors readout ASICs
NASA Astrophysics Data System (ADS)
Fiutowski, T.; Dąbrowski, W.; Koperny, S.; Wiącek, P.
2017-02-01
Immunity against possible random discharges inside the active detector volume of MPGDs is one of the key aspects that should be addressed in the design of the front-end electronics. This issue becomes particularly critical for systems with high channel counts and high-density readout employing front-end electronics built as multichannel ASICs implemented in modern CMOS technologies, for which the breakdown voltages are in the range of a few volts. The paper presents the design of various input protection structures integrated in an ASIC manufactured in a 350 nm CMOS process, and test results using an electrical circuit to mimic discharges in the detectors.
Why differentiating between health system support and health system strengthening is needed
Chee, Grace; Pielemeier, Nancy; Lion, Ann; Connor, Catherine
2013-01-01
There is increasing recognition that efforts to improve global health cannot be achieved without stronger health systems. Interpretation of health system strengthening (HSS) has varied widely, however, with much of the focus to date on alleviating input constraints, whereas less attention has been given to other performance drivers. It is important to distinguish activities that support the health system from ones that strengthen the health system. Supporting the health system can include any activity that improves services, from distributing mosquito nets to procuring medicines. These activities improve outcomes primarily by increasing inputs. Strengthening the health system is accomplished by more comprehensive changes to performance drivers such as policies and regulations, organizational structures, and relationships across the health system to motivate changes in behavior and/or allow more effective use of resources to improve multiple health services. Even organizations that have made significant investments in health systems have not provided guidance on what HSS entails. While both supporting and strengthening are important and necessary, it is nonetheless important to make a distinction. If activities fail to produce improvements in system performance because they were incorrectly labeled as system strengthening, the value of HSS investments could quickly be discredited. Not distinguishing supportive activities from strengthening ones will lead to unmet expectations of stronger health systems, as well as neglect of critical system strengthening activities. Distinguishing between these two types of activities will improve programming impact. Copyright © 2012 John Wiley & Sons, Ltd. PMID:22777839
NASA Technical Reports Server (NTRS)
Hendricks, R. C.
1994-01-01
A computer program, GASP, has been written to calculate the thermodynamic and transport properties of argon, carbon dioxide, carbon monoxide, fluorine, methane, neon, nitrogen, and oxygen. GASP accepts any two of pressure, temperature, or density as input. In addition, entropy and enthalpy are possible inputs. Outputs are temperature, density, pressure, entropy, enthalpy, specific heats, expansion coefficient, sonic velocity, viscosity, thermal conductivity, and surface tension. A special technique is provided to estimate the thermal conductivity near the thermodynamic critical point. GASP is a group of FORTRAN subroutines. The user typically would write a main program that invoked GASP to provide only the described outputs. Subroutines are structured so that the user may call only those subroutines needed for a particular calculation. Allowable pressures range from 0.1 atmosphere to 100 to 1,000 atmospheres, depending on the fluid. Similarly, allowable temperatures range from the triple point of each substance up to 300 K to 2000 K, depending on the substance. The GASP package was developed to be used with heat transfer and fluid flow applications. It is particularly useful in applications of cryogenic fluids. Some problems associated with the liquefaction, storage, and gasification of liquefied natural gas and liquefied petroleum gas can also be studied using GASP. This program is written in FORTRAN IV for batch execution and is available for implementation on IBM 7000 series computers. GASP was developed in 1971.
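GASP's calling pattern — supply any two of pressure, temperature, and density and recover the full state — can be mimicked with the ideal-gas law as a stand-in equation of state. The real program uses fluid-specific equations of state valid into the cryogenic and near-critical regions, so the relation and specific gas constants below are only illustrative:

```python
# Specific gas constants in J/(kg*K) for a few of GASP's fluids (ideal-gas stand-in).
R_SPECIFIC = {"nitrogen": 296.8, "oxygen": 259.8, "argon": 208.1}

def state_from_pair(fluid, P=None, T=None, rho=None):
    """Given exactly two of P [Pa], T [K], rho [kg/m^3], return all three
    using P = rho * R * T as a placeholder equation of state."""
    R = R_SPECIFIC[fluid]
    if sum(v is not None for v in (P, T, rho)) != 2:
        raise ValueError("provide exactly two of P, T, rho")
    if P is None:
        P = rho * R * T
    elif T is None:
        T = P / (rho * R)
    else:
        rho = P / (R * T)
    return P, T, rho
```

A real GASP-style package would dispatch each (input pair, fluid) combination to the appropriate equation-of-state subroutine rather than a single algebraic relation.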
Turbomachinery Forced Response Prediction System (FREPS): User's Manual
NASA Technical Reports Server (NTRS)
Morel, M. R.; Murthy, D. V.
1994-01-01
The turbomachinery forced response prediction system (FREPS), version 1.2, is capable of predicting the aeroelastic behavior of axial-flow turbomachinery blades. This document is meant to serve as a guide in the use of the FREPS code with specific emphasis on its use at NASA Lewis Research Center (LeRC). A detailed explanation of the aeroelastic analysis and its development is beyond the scope of this document, and may be found in the references. FREPS has been developed by the NASA LeRC Structural Dynamics Branch. The manual is divided into three major parts: an introduction, the preparation of input, and the procedure to execute FREPS. Part 1 includes a brief background on the necessity of FREPS, a description of the FREPS system, the steps needed to be taken before FREPS is executed, an example input file with instructions, presentation of the geometric conventions used, and the input/output files employed and produced by FREPS. Part 2 contains a detailed description of the command names needed to create the primary input file that is required to execute the FREPS code. Also, Part 2 has an example data file to aid the user in creating their own input files. Part 3 explains the procedures required to execute the FREPS code on the Cray Y-MP, a computer system available at the NASA LeRC.
ITS benefits : 2003 data needs survey
DOT National Transportation Integrated Search
2003-09-01
The 2003 Data Needs survey was the first to use a series of web-based survey forms to allow ITS : stakeholders to provide input regarding ITS evaluation priorities. Survey participants were asked : to rate ITS application areas based on their assessm...
Alternate Models of Needs Assessment: Selecting the Right One for Your Organization.
ERIC Educational Resources Information Center
Leigh, Doug; Watkins, Ryan; Platt, William A.; Kaufman, Roger
2000-01-01
Defines needs assessment and compares different models in terms of levels (mega, macro, micro) and process and input. Recommends assessment of strengths and weakness of a model before using it in human resource development. (SK)
Critical dynamics on a large human Open Connectome network
NASA Astrophysics Data System (ADS)
Ódor, Géza
2016-12-01
Extended numerical simulations of threshold models have been performed on a human brain network with N = 836,733 connected nodes available from the Open Connectome Project. While in the case of simple threshold models a sharp discontinuous phase transition without any critical dynamics arises, variable threshold models exhibit extended power-law scaling regions. This is attributed to the fact that Griffiths effects, stemming from the topological or interaction heterogeneity of the network, can become relevant if the input sensitivity of nodes is equalized. I have studied the effects of link directness, as well as the consequence of inhibitory connections. Nonuniversal power-law avalanche size and time distributions have been found with exponents agreeing with the values obtained in electrode experiments of the human brain. The dynamical critical region occurs in an extended control parameter space without the assumption of self-organized criticality.
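A minimal relative-threshold dynamics of the kind described, in which normalizing the drive by node degree equalizes the input sensitivity of nodes, can be sketched as follows. The graph, threshold, and seeding choices are illustrative and orders of magnitude smaller than the 836,733-node connectome:

```python
import numpy as np

def threshold_dynamics(adj, k=0.25, steps=200, seed=0):
    """Relative-threshold model sketch: a node activates if the fraction of
    its active neighbors exceeds k. Returns the number of active nodes at
    each step until activity dies out or `steps` is reached."""
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    deg = adj.sum(axis=1)
    deg[deg == 0] = 1.0  # avoid division by zero for isolated nodes
    state = rng.random(n) < 0.05  # sparse random seed of initial activity
    activity = []
    for _ in range(steps):
        drive = adj @ state / deg  # per-node fraction of active inputs
        state = drive > k
        activity.append(int(state.sum()))
        if not state.any():
            break
    return activity
```

Avalanche statistics would then be collected over many seeded runs; a simple absolute-threshold variant (drive not normalized by degree) could be compared against this one to probe the discontinuous-versus-Griffiths distinction the abstract describes.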
Growth and yield model application in tropical rain forest management
James Atta-Boateng; John W., Jr. Moser
2000-01-01
Analytical tools are needed to evaluate the impact of management policies on the sustainable use of rain forest. Optimal decisions concerning the level of management inputs require accurate predictions of output at all relevant input levels. Using growth data from 40 1-hectare permanent plots obtained from the semi-deciduous forest of Ghana, a system of 77 differential...
ERIC Educational Resources Information Center
Lloyd, David; Norrie, Fiona
2004-01-01
Despite increased engagement of Indigenous representatives as participants on consultative panels charged with processes of natural resource management, concerns have been raised by both Indigenous representatives and management agencies regarding the ability of Indigenous people to have quality input into the decisions these processes produce. In…
The NASTRAN User's Manual (Level 15)
NASA Technical Reports Server (NTRS)
Mccormick, C. W. (Editor)
1972-01-01
The User's manual for the NASA Structural Analysis (NASTRAN) program is presented. The manual contains all information needed to solve problems with NASTRAN. The volume is instructional and encyclopedic. The manual includes instruction in structural modeling techniques, instruction in input preparation, and information to assist the interpretation of the output. Descriptions of all input data cards, restart procedures, and diagnostic messages are developed.
Spectrum Situational Awareness Capability: The Military Need and Potential Implementation Issues
2006-10-01
Briefing-slide excerpt: sensor-system and EW-system frequency management; allied battlespace spectrum management; restricted frequency lists; frequency allocation tables; civil frequency use; and the data inputs to the negotiation and allocation process.
NASA Technical Reports Server (NTRS)
1976-01-01
Inputs from prospective LANDSAT-C data users are requested to aid NASA in defining LANDSAT-C mission and data requirements and in making decisions regarding the scheduling of satellite operations and ground data processing operations. Design specifications, multispectral band scanner performance characteristics, satellite schedule operations, and types of available data products are briefly described.
Fusion of Hard and Soft Information in Nonparametric Density Estimation
2015-06-10
Report excerpt: density estimation is needed for generating input densities to simulation and stochastic optimization models, in analysis of simulation output, and when instantiating probability models. We adopt a constrained maximum-likelihood approach.
Garden, Derek L. F.; Rinaldi, Arianna
2016-01-01
Key points: We establish experimental preparations for optogenetic investigation of glutamatergic input to the inferior olive. Neurones in the principal olivary nucleus receive monosynaptic extra‐somatic glutamatergic input from the neocortex. Glutamatergic inputs to neurones in the inferior olive generate bidirectional postsynaptic potentials (PSPs), with a fast excitatory component followed by a slower inhibitory component. Small conductance calcium‐activated potassium (SK) channels are required for the slow inhibitory component of glutamatergic PSPs and oppose temporal summation of inputs at intervals ≤ 20 ms. Active integration of synaptic input within the inferior olive may play a central role in control of olivo‐cerebellar climbing fibre signals. Abstract: The inferior olive plays a critical role in motor coordination and learning by integrating diverse afferent signals to generate climbing fibre inputs to the cerebellar cortex. While it is well established that climbing fibre signals are important for motor coordination, the mechanisms by which neurones in the inferior olive integrate synaptic inputs and the roles of particular ion channels are unclear. Here, we test the hypothesis that neurones in the inferior olive actively integrate glutamatergic synaptic inputs. We demonstrate that optogenetically activated long‐range synaptic inputs to the inferior olive, including projections from the motor cortex, generate rapid excitatory potentials followed by slower inhibitory potentials. Synaptic projections from the motor cortex preferentially target the principal olivary nucleus. We show that inhibitory and excitatory components of the bidirectional synaptic potentials are dependent upon AMPA (GluA) receptors, are GABAA independent, and originate from the same presynaptic axons.
Consistent with models that predict active integration of synaptic inputs by inferior olive neurones, we find that the inhibitory component is reduced by blocking large conductance calcium‐activated potassium channels with iberiotoxin, and is abolished by blocking small conductance calcium‐activated potassium channels with apamin. Summation of excitatory components of synaptic responses to inputs at intervals ≤ 20 ms is increased by apamin, suggesting a role for the inhibitory component of glutamatergic responses in temporal integration. Our results indicate that neurones in the inferior olive implement novel rules for synaptic integration and suggest new principles for the contribution of inferior olive neurones to coordinated motor behaviours. PMID:27767209
Sensory-evoked perturbations of locomotor activity by sparse sensory input: a computational study
Brownstone, Robert M.
2015-01-01
Sensory inputs from muscle, cutaneous, and joint afferents project to the spinal cord, where they are able to affect ongoing locomotor activity. Activation of sensory input can initiate or prolong bouts of locomotor activity depending on the identity of the sensory afferent activated and the timing of the activation within the locomotor cycle. However, the mechanisms by which afferent activity modifies locomotor rhythm and the distribution of sensory afferents to the spinal locomotor networks have not been determined. Considering the many sources of sensory inputs to the spinal cord, determining this distribution would provide insights into how sensory inputs are integrated to adjust ongoing locomotor activity. We asked whether a sparsely distributed set of sensory inputs could modify ongoing locomotor activity. To address this question, several computational models of locomotor central pattern generators (CPGs) that were mechanistically diverse and generated locomotor-like rhythmic activity were developed. We show that sensory inputs restricted to a small subset of the network neurons can perturb locomotor activity in the same manner as seen experimentally. Furthermore, we show that an architecture with sparse sensory input improves the capacity to gate sensory information by selectively modulating sensory channels. These data demonstrate that sensory input to rhythm-generating networks need not be extensively distributed. PMID:25673740
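The idea that sparse sensory input can perturb an ongoing rhythm can be sketched with a generic two-cell half-centre rate model (an illustrative sketch only: this is not one of the paper's CPG models, and all parameters, the pulse shape, and the mutual-inhibition-plus-adaptation structure are assumptions):

```python
def half_center_cpg(t_end=100.0, dt=0.01, pulse_time=None, pulse_amp=5.0):
    """Minimal two-cell half-centre oscillator: mutual inhibition plus slow
    adaptation, with an optional brief, strong 'sensory' pulse delivered to
    only one of the two cells (sparse input).  Returns both activity traces.
    """
    x = [0.6, 0.1]          # firing-rate activities of the two cells
    a = [0.0, 0.0]          # slow adaptation variables
    tau_x, tau_a = 1.0, 20.0
    w_inh, w_adapt, drive = 3.0, 2.5, 1.0
    xs = ([], [])
    for step in range(int(round(t_end / dt))):
        t = step * dt
        inp = [drive, drive]
        # Sparse sensory input: a 1-time-unit pulse to cell 0 only.
        if pulse_time is not None and pulse_time <= t < pulse_time + 1.0:
            inp[0] += pulse_amp
        new_x = list(x)
        for i in (0, 1):
            j = 1 - i
            u = inp[i] - w_inh * max(x[j], 0.0) - w_adapt * a[i]
            new_x[i] = x[i] + dt / tau_x * (-x[i] + max(u, 0.0))
            a[i] += dt / tau_a * (-a[i] + x[i])
        x = new_x
        xs[0].append(x[0])
        xs[1].append(x[1])
    return xs
```

Running the model with and without the pulse shows how input restricted to a single cell alters the subsequent trajectory of the whole network, which is the qualitative point the paper makes for sparsely distributed afferents.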
Canfield, Christina; Taylor, Debi; Nagy, Kimberly; Strauser, Claire; VanKerkhove, Karen; Wills, Stephanie; Sawicki, Patricia; Sorrell, Jeanne
2016-05-01
The term spirituality is highly subjective. No common or universally accepted definition for the term exists. Without a clear definition, each nurse must reconcile his or her own beliefs within a framework mutually suitable for both nurse and patient. To examine individual critical care nurses' definitions of spirituality, their comfort in providing spiritual care to patients, and their perceived need for education in providing this care. Individual interviews with 30 nurses who worked in a critical care unit at a large Midwestern teaching hospital. Nurses generally feel comfortable providing spiritual care to critically ill patients but need further education about multicultural considerations. Nurses identified opportunities to address spiritual needs throughout a patient's stay but noted that these needs are usually not addressed until the end of life. A working definition for spirituality in health care was developed: that part of a person that gives meaning and purpose to the person's life; belief in a higher power that may inspire hope, seek resolution, and transcend physical and conscious constraints. ©2016 American Association of Critical-Care Nurses.
Geriatric Training Needs of Nursing-Home Physicians
ERIC Educational Resources Information Center
Lubart, Emily; Segal, Refael; Rosenfeld, Vera; Madjar, Jack; Kakuriev, Michael; Leibovitz, Arthur
2009-01-01
Medical care in nursing homes is not provided by board-licensed geriatricians; it mainly comes from physicians in need of educational programs in the field of geriatrics. Such programs, based on curriculum guidelines, should be developed. The purpose of this study was to seek input from nursing home physicians on their perceived needs for training…
Differences between Employees' and Supervisors' Evaluations of Work Performance and Support Needs
ERIC Educational Resources Information Center
Bennett, Kyle; Frain, Michael; Brady, Michael P.; Rosenberg, Howard; Surinak, Tricia
2009-01-01
Assessment systems are needed that are sensitive to employees' work performance as well as their need for support, while incorporating the input from both employees and their supervisors. This study examined the correspondence of one such evaluation system, the Job Observation and Behavior Scale (JOBS) and the JOBS: Opportunity for…
The present status and problems in document retrieval systems: document input type retrieval system
NASA Astrophysics Data System (ADS)
Inagaki, Hirohito
Office automation (OA) has brought many changes: many documents are now maintained in electronic filing systems, so efficient document retrieval systems are needed to extract useful information from them. Current document retrieval systems use simple word matching, syntactic matching, or semantic matching to obtain high retrieval efficiency. On the other hand, document retrieval systems using special hardware devices, such as ISSP, were developed aiming at high-speed retrieval. Since these systems accept only a single sentence or keywords as input, it is difficult to express the searcher's request fully. We demonstrate a document-input retrieval system, which directly accepts a whole document as the query and searches for similar documents in a document database.
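The core idea of a document-input retrieval system, scoring whole documents for similarity to a query document rather than matching a few keywords, can be sketched with a bag-of-words cosine similarity (an illustrative sketch: the paper's actual matching method and the ISSP hardware are not described at this level, and the whitespace tokenization here is a naive assumption):

```python
import math
from collections import Counter

def cosine_sim(doc_a, doc_b):
    """Bag-of-words cosine similarity between two documents."""
    ta, tb = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(ta[w] * tb[w] for w in ta)
    na = math.sqrt(sum(v * v for v in ta.values()))
    nb = math.sqrt(sum(v * v for v in tb.values()))
    return dot / (na * nb) if na and nb else 0.0

def rank_similar(query_doc, corpus):
    """Return corpus indices ranked by similarity to the query document."""
    scores = [(cosine_sim(query_doc, d), i) for i, d in enumerate(corpus)]
    return [i for s, i in sorted(scores, reverse=True)]

corpus = [
    "document retrieval with keywords",
    "cooking pasta recipes",
    "electronic document filing system retrieval",
]
ranking = rank_similar("electronic filing and document retrieval", corpus)
```

Because the whole query document contributes to the score, the searcher's request is expressed by example rather than compressed into a handful of keywords.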
Delpierre, Nicolas; Berveiller, Daniel; Granda, Elena; Dufrêne, Eric
2016-04-01
Although the analysis of flux data has increased our understanding of the interannual variability of carbon inputs into forest ecosystems, we still know little about the determinants of wood growth. Here, we aimed to identify which drivers control the interannual variability of wood growth in a mesic temperate deciduous forest. We analysed a 9-yr time series of carbon fluxes and aboveground wood growth (AWG), reconstructed at a weekly time-scale through the combination of dendrometer and wood density data. Carbon inputs and AWG anomalies appeared to be uncorrelated from the seasonal to interannual scales. More than 90% of the interannual variability of AWG was explained by a combination of the growth intensity during a first 'critical period' of the wood growing season, occurring close to the seasonal maximum, and the timing of the first summer growth halt. Both atmospheric and soil water stress exerted a strong control on the interannual variability of AWG at the study site, despite its mesic conditions, whilst not affecting carbon inputs. Carbon sink activity, not carbon inputs, determined the interannual variations in wood growth at the study site. Our results provide a functional understanding of the dependence of radial growth on precipitation observed in dendrological studies. © 2015 The Authors. New Phytologist © 2015 New Phytologist Trust.
NEURAL NETWORK INTERACTIONS AND INGESTIVE BEHAVIOR CONTROL DURING ANOREXIA
Watts, Alan G.; Salter, Dawna S.; Neuner, Christina M.
2007-01-01
Many models have been proposed over the years to explain how motivated feeding behavior is controlled. One of the most compelling is based on the original concepts of Eliot Stellar whereby sets of interosensory and exterosensory inputs converge on a hypothalamic control network that can either stimulate or inhibit feeding. These inputs arise from information originating in the blood, the viscera, and the telencephalon. In this manner the relative strengths of the hypothalamic stimulatory and inhibitory networks at a particular time dictate how an animal feeds. Anorexia occurs when the balance within the networks consistently favors the restraint of feeding. This article discusses experimental evidence supporting a model whereby the increases in plasma osmolality that result from drinking hypertonic saline activate pathways projecting to neurons in the paraventricular nucleus of the hypothalamus (PVH) and lateral hypothalamic area (LHA). These neurons constitute the hypothalamic controller for ingestive behavior, and receive a set of afferent inputs from regions of the brain that process sensory information that is critical for different aspects of feeding. Important sets of inputs arise in the arcuate nucleus, the hindbrain, and in the telencephalon. Anorexia is generated in dehydrated animals by way of osmosensitive projections to the behavior control neurons in the PVH and LHA, rather than by actions on their afferent inputs. PMID:17531275
Kuhnen, Shirley; Stibuski, Rudinei Butka; Honorato, Luciana Aparecida; Pinheiro Machado Filho, Luiz Carlos
2015-01-01
Simple Summary: This study provides the characteristics of the conventional high input (C-HI), conventional low input (C-LI), and organic low input (O-LI) pasture-based production systems used in Southern Brazil, and their consequences on production and milk quality. C-HI farms had larger farms and herds, annual pasture with higher inputs and milk yield, whereas O-LI had smaller farms and herds, perennial pastures with lowest input and milk yields; C-LI was in between. O-LI farms may contribute to eco-system services, but low milk yield is a major concern. Hygienic and microbiological milk quality was poor for all farms and needs to be improved. Abstract: Pasture-based dairy production is used widely on family dairy farms in Southern Brazil. This study investigates conventional high input (C-HI), conventional low input (C-LI), and organic low input (O-LI) pasture-based systems and their effects on quantity and quality of the milk produced. We conducted technical site visits and interviews monthly over one year on 24 family farms (n = 8 per type). C-HI farms had the greatest total area (28.9 ha), greatest percentage of area with annual pasture (38.7%), largest number of lactating animals (26.2) and greatest milk yield per cow (22.8 kg·day−1). O-LI farms had the largest perennial pasture area (52.3%), with the greatest botanical richness during all seasons. Area of perennial pasture was positively correlated with number of species consumed by the animals (R2 = 0.74). Milk from O-LI farms had higher levels of fat and total solids only during the winter. Hygienic and microbiological quality of the milk was poor for all farms and needs to be improved. C-HI farms had high milk yield related to high input, C-LI had intermediate characteristics and O-LI utilized a year round perennial pasture as a strategy to diminish the use of supplements in animal diets, which is an important aspect in ensuring production sustainability. PMID:26479369
Blank, Jos L T; van Hulst, Bart L
2017-02-17
Well-trained, well-distributed and productive health workers are crucial for access to high-quality, cost-effective healthcare. Because neither a shortage nor a surplus of health workers is wanted, policymakers use workforce planning models to get information on future labour markets and adjust policies accordingly. A neglected topic of workforce planning models is productivity growth, which has an effect on future demand for labour. However, calculating productivity growth for specific types of input is not as straightforward as it seems. This study shows how to calculate factor technical change (FTC) for specific types of input. The paper first theoretically derives FTCs from technical change in a consistent manner. FTC differs from a ratio of output and input, in that it deals with the multi-input, multi-output character of the production process in the health sector. Furthermore, it takes into account substitution effects between different inputs. An application of the calculation of FTCs is given for the Dutch hospital industry for the period 2003-2011. A translog cost function is estimated and used to calculate technical change and FTC for individual inputs, especially specific labour inputs. The results show that technical change increased by 2.8% per year in Dutch hospitals during 2003-2011. FTC differs amongst the various inputs. The FTC of nursing personnel increased by 3.2% per year, implying that fewer nurses were needed to let demand meet supply on the labour market. Sensitivity analyses show consistent results for the FTC of nurses. Productivity growth, especially of individual outputs, is a neglected topic in workforce planning models. FTC is a productivity measure that is consistent with technical change and accounts for substitution effects. An application to the Dutch hospital industry shows that the FTC of nursing personnel outpaced technical change during 2003-2011. 
The optimal input mix changed, resulting in fewer nurses being needed to let demand meet supply on the labour market. Policymakers should consider using more detailed and specific data on the nature of technical change when forecasting the future demand for health workers.
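The notion of technical change as a downward drift of the cost function over time can be illustrated with a deliberately simplified sketch (a Cobb-Douglas cost function with a time trend rather than the study's full translog specification with input prices and second-order terms; the data below are synthetic, with a 2.8%/yr rate built in to echo the reported figure):

```python
import math

def _solve(A, b):
    """Solve a small linear system A x = b by Gaussian elimination."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_cost_trend(costs, outputs, years):
    """OLS fit of ln C = b0 + b1*ln Y + b2*t.  Here -b2 is the annual rate
    of technical change (cost reduction at fixed output); the translog
    model in the study generalises this form."""
    X = [[1.0, math.log(y), float(t)] for y, t in zip(outputs, years)]
    z = [math.log(c) for c in costs]
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    Xtz = [sum(r[i] * zi for r, zi in zip(X, z)) for i in range(3)]
    return _solve(XtX, Xtz)

# Synthetic hospital-style data, years 0..8 standing in for 2003-2011.
years = list(range(9))
outputs = [100, 104, 101, 110, 115, 112, 120, 126, 130]
costs = [math.exp(2.0 + 0.8 * math.log(y) - 0.028 * t)
         for y, t in zip(outputs, years)]
b0, b1, b2 = fit_cost_trend(costs, outputs, years)
```

Factor-specific technical change (FTC) then further decomposes this aggregate drift by input type, accounting for substitution between inputs, which a single cost-trend coefficient cannot do.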
Microgravity Disturbance Predictions in the Combustion Integrated Rack
NASA Astrophysics Data System (ADS)
Just, M.; Grodsinsky, Carlos M.
2002-01-01
This paper will focus on the approach used to characterize microgravity disturbances in the Combustion Integrated Rack (CIR), currently scheduled for launch to the International Space Station (ISS) in 2005. Microgravity experiments contained within the CIR are extremely sensitive to vibratory and transient disturbances originating on-board and off-board the rack. Therefore, several techniques are implemented to isolate the critical science locations from external vibration. A combined testing and analysis approach is utilized to predict the resulting microgravity levels at the critical science location. The major topics to be addressed are: 1) CIR Vibration Isolation Approaches, 2) Disturbance Sources and Characterization, 3) Microgravity Predictive Modeling, 4) Science Microgravity Requirements, 5) Microgravity Control, and 6) On-Orbit Disturbance Measurement. The CIR is using the Passive Rack Isolation System (PaRIS) to isolate the rack from offboard rack disturbances. By utilizing this system, CIR is connected to the U.S. Lab module structure by either 13 or 14 umbilical lines and 8 spring/damper isolators. Some on-board CIR disturbers are locally isolated by grommets or wire ropes. CIR's environmental and science on board support equipment such as air circulation fans, pumps, water flow, air flow, solenoid valves, and computer hard drives cause disturbances within the rack. These disturbers along with the rack structure must be characterized to predict whether the on-orbit vibration levels during experimentation exceed the specified science microgravity vibration level requirements. Both vibratory and transient disturbance conditions are addressed. Disturbance levels/analytical inputs are obtained for each individual disturber in a "free floating" condition in the Glenn Research Center (GRC) Microgravity Emissions Lab (MEL). Flight spare hardware is tested on an Orbital Replacement Unit (ORU) basis.
Based on test and analysis, maximum disturbance level allocations are developed for each ORU. The worst-case disturbances are input into an on-orbit analytical dynamic model of the rack. These models include both NASTRAN and MATLAB Simulink models, which include eigenvector and frequency inputs of the rack rigid body modes, the rack umbilical modes, and the rack's structural modes. The disturbance areas and science locations need to be modeled accurately to give valid predictions. The analytically determined microgravity vibration levels are compared to the CIR science requirements contained in the FCF Science Requirements Envelope Document (SRED). The predicted levels will be compared with the on-orbit measurements provided by the Space Acceleration Measurement System (SAMS) sensor, which is to be mounted on the CIR optics bench.
Rapid Diagnostics of Onboard Sequences
NASA Technical Reports Server (NTRS)
Starbird, Thomas W.; Morris, John R.; Shams, Khawaja S.; Maimone, Mark W.
2012-01-01
Keeping track of sequences onboard a spacecraft is challenging. When reviewing Event Verification Records (EVRs) of sequence executions on the Mars Exploration Rover (MER), operators often found themselves wondering which version of a named sequence the EVR corresponded to. The lack of this information drastically impacts the operators' diagnostic capabilities as well as their situational awareness with respect to the commands the spacecraft has executed, since the EVRs do not provide argument values or explanatory comments. Having this information immediately available can be instrumental in diagnosing critical events and can significantly enhance the overall safety of the spacecraft. This software provides auditing capability that can eliminate that uncertainty while diagnosing critical conditions. Furthermore, the Restful interface provides a simple way for sequencing tools to automatically retrieve binary compiled sequence SCMFs (Space Command Message Files) on demand. It also enables developers to change the underlying database, while maintaining the same interface to the existing applications. The logging capabilities are also beneficial to operators when they are trying to recall how they solved a similar problem many days ago: this software enables automatic recovery of SCMF and RML (Robot Markup Language) sequence files directly from the command EVRs, eliminating the need for people to find and validate the corresponding sequences. To address the lack of auditing capability for sequences onboard a spacecraft during earlier missions, extensive logging support was added on the Mars Science Laboratory (MSL) sequencing server. This server is responsible for generating all MSL binary SCMFs from RML input sequences. The sequencing server logs every SCMF it generates into a MySQL database, as well as the high-level RML file and dictionary name inputs used to create the SCMF.
The SCMF is then indexed by a hash value that is automatically included in all command EVRs by the onboard flight software. Second, both the binary SCMF result and the RML input file can be retrieved simply by specifying the hash to a Restful web interface. This interface enables command line tools as well as large sophisticated programs to download the SCMF and RMLs on-demand from the database, enabling a vast array of tools to be built on top of it. One such command line tool can retrieve and display RML files, or annotate a list of EVRs by interleaving them with the original sequence commands. This software has been integrated with the MSL sequencing pipeline where it will serve sequences useful in diagnostics, debugging, and situational awareness throughout the mission.
Computer Security: Improvements Needed to Reduce Risk to Critical Federal Operations and Assets
2001-11-09
COMPUTER SECURITY: Improvements Needed to Reduce Risk to Critical Federal Operations and Assets. Statement of Robert F. Dacey, Director, Information... The benefits have been enormous. Vast amounts of information are now literally at our fingertips, facilitating research on virtually every topic
Impacts of vegetation change on groundwater recharge
NASA Astrophysics Data System (ADS)
Bond, W. J.; Verburg, K.; Smith, C. J.
2003-12-01
Vegetation change is the accepted cause of increasing river salt concentrations and the salinisation of millions of hectares of farm land in Australia. Replacement of perennial native vegetation by annual crops and pastures following European settlement has altered the water balance causing increased groundwater recharge and mobilising the naturally saline groundwater. The Redesigning Agriculture for Australian Landscapes Program, of which the work described here is a part, was established to develop agricultural practices that are more attuned to the delicate water balance described above. Results of field measurements will be presented that contrast the water balance characteristics of native vegetation with those of conventional agricultural plants, and indicate the functional characteristics required of new agricultural practices to reduce recharge. New agricultural practices may comprise different management of current crops and pastures, or may involve introducing totally new species. In either case, long-term testing is required to examine their impact on recharge over a long enough climate record to encompass the natural variability of rainfall that is characteristic of most Australian farming regions. Field experimentation therefore needs to be complemented and extended by computer simulation. This requires a modelling approach that is more robust than conventional crop modelling because (a) it needs to be sensitive enough to predict small changes in the residual recharge term, (b) it needs to be able to simulate a variety of vegetation in different sequences, (c) it needs to be able to simulate continuously for several decades of input data, and (d) it therefore needs to be able to simulate the period between crops, which often has a critical impact on recharge. The APSIM simulation framework will be used to illustrate these issues and to explore the effect of different vegetation combinations on recharge.
Lucchini, Roberto G; Hashim, Dana; Acquilla, Sushma; Basanets, Angela; Bertazzi, Pier Alberto; Bushmanov, Andrey; Crane, Michael; Harrison, Denise J; Holden, William; Landrigan, Philip J; Luft, Benjamin J; Mocarelli, Paolo; Mazitova, Nailya; Melius, James; Moline, Jacqueline M; Mori, Koji; Prezant, David; Reibman, Joan; Reissman, Dori B; Stazharau, Alexander; Takahashi, Ken; Udasin, Iris G; Todd, Andrew C
2017-01-07
The disasters at Seveso, Three Mile Island, Bhopal, Chernobyl, the World Trade Center (WTC) and Fukushima had historic health and economic sequelae for large populations of workers, responders and community members. Comparative data from these events were collected to derive indications for future preparedness. Information from the primary sources and a literature review addressed: i) exposure assessment; ii) exposed populations; iii) health surveillance; iv) follow-up and research outputs; v) observed physical and mental health effects; vi) treatment and benefits; and vii) outreach activities. Exposure assessment was conducted in Seveso, Chernobyl and Fukushima, although none benefited from a timely or systematic strategy, yielding immediate and sequential measurements after the disaster. Identification of exposed subjects was overall underestimated. Health surveillance, treatment and follow-up research were implemented in Seveso, Chernobyl, Fukushima, and at the WTC, mostly focusing on the workers and responders, and to a lesser extent on residents. Exposure-related physical and mental health consequences were identified, indicating the need for a long-term health care of the affected populations. Fukushima has generated the largest scientific output so far, followed by the WTCHP and Chernobyl. Benefits programs and active outreach figured prominently in only the WTC Health Program. The analysis of these programs yielded the following lessons: 1) Know who was there; 2) Have public health input to the disaster response; 3) Collect health and needs data rapidly; 4) Take care of the affected; 5) Emergency preparedness; 6) Data driven, needs assessment, advocacy. Given the long-lasting health consequences of natural and man-made disasters, health surveillance and treatment programs are critical for management of health conditions, and emergency preparedness plans are needed to prevent or minimize the impact of future threats.
NASA Astrophysics Data System (ADS)
Liu, Qianqian; Chai, Fei; Dugdale, Richard; Chao, Yi; Xue, Huijie; Rao, Shivanesh; Wilkerson, Frances; Farrara, John; Zhang, Hongchun; Wang, Zhengui; Zhang, Yinglong
2018-06-01
An open source coupled physical-biogeochemical model is developed for San Francisco Bay (SFB) to study nutrient cycling and plankton dynamics as well as to assist ecosystem based management and risk assessment. The biogeochemical model in this study is based on the Carbon, Silicate and Nitrogen Ecosystem (CoSiNE) model, and coupled to the unstructured grid, Semi-Implicit Cross-scale Hydroscience Integrated System Model (SCHISM). The SCHISM-CoSiNE model reproduces the spatial and temporal variability in nutrients and plankton biomass, and its physical and biogeochemical performance is successfully tested using comparisons with shipboard and fixed station observations. The biogeochemical characteristics of the SFB during wet and dry years are investigated by changing the input of the major rivers. River discharges from the Sacramento and San Joaquin Rivers affect the phytoplankton biomass in North SFB through both advection and dilution of nutrient (including ammonium, NH4) concentrations in the river. The reduction in residence time caused by increased inflows can result in decreased biomass accumulation, while the corresponding reduction in NH4 concentration favors the growth of biomass. In addition, the model is used to make a series of sensitivity experiments to examine the response of SFB to changes in 1) nutrient loading from rivers and wastewater treatment plants (WWTPs), 2) a parameter (ψ) defining NH4 inhibition of nitrate (NO3) uptake by phytoplankton, 3) bottom grazing and 4) suspended sediment concentration. The model results show that changes in NH4 input from rivers or WWTPs affect the likelihood of phytoplankton blooms via NH4 inhibition and that the choice of ψ is critical. Bottom grazing simulated here as increased plankton mortality demonstrates the potential for bivalve reduction of chlorophyll biomass and the need to include bivalve grazing in future models. 
Furthermore, the model demonstrates the need to include sediments and their contribution to turbidity and availability of light. This biogeochemical model is suitable for other estuaries with similar ecological issues and anthropogenic stressors.
NASA Astrophysics Data System (ADS)
Djomo, S. Njakou; Knudsen, M. T.; Andersen, M. S.; Hermansen, J. E.
2017-11-01
There is an ongoing debate regarding the influence of the source location of pollution on the fate of pollutants and their subsequent impacts. Several methods have been developed to derive site-dependent characterization factors (CFs) for use in life-cycle assessment (LCA). Consistent, precise, and accurate estimates of CFs are crucial for establishing long-term, sustainable air pollution abatement policies. We reviewed currently available studies on the regionalization of non-toxic air pollutants in LCA. We also extracted and converted data into indices for analysis. We showed that CFs can distinguish between emissions occurring in different locations, and that the different methods used to derive CFs map locations consistently from very sensitive to less sensitive. Seasonal variations are less important for the computation of CFs for acidification and eutrophication, but they are relevant for the calculation of CFs for tropospheric ozone formation. Large intra-country differences in estimated CFs suggest that an abatement policy relying on quantitative estimates based upon a single method may have undesirable outcomes. Within-country differences in estimates of CFs for acidification and eutrophication are the results of the models used, category definitions, soil sensitivity factors, background emission concentration, critical loads database, and input data. Striking features in these studies were the lack of CFs for countries outside Europe, the USA, Japan, and Canada, and the lack of quantification of uncertainties. Parameter and input data uncertainties are well quantified, but the uncertainty associated with the choice of category indicator is rarely quantified and this can be significant. Although CFs are scientifically robust, further refinements are needed before they can be integrated in LCA. Future research should include uncertainty analyses, and should develop a consensus model for CFs.
CFs for countries outside Europe, Japan, Canada and the USA are urgently needed.
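The index construction mentioned above ("extracted and converted data into indices") can be sketched as a simple normalization. The CF values below are purely illustrative, not data from the reviewed studies:

```python
import numpy as np

# Hypothetical site-dependent CFs for four locations from two different
# methods (e.g., kg SO2-eq per kg NOx emitted); values are illustrative only.
cf_method_a = np.array([0.8, 1.6, 0.3, 2.4])
cf_method_b = np.array([0.5, 1.1, 0.2, 1.9])

# Convert CFs to dimensionless indices by normalizing to each method's mean,
# so sensitivity rankings can be compared across methods on a common scale.
idx_a = cf_method_a / cf_method_a.mean()
idx_b = cf_method_b / cf_method_b.mean()

# The review's finding: different methods rank locations consistently
# from less sensitive to very sensitive.
print(np.argsort(idx_a), np.argsort(idx_b))
```

Under this normalization, the two hypothetical methods order the locations identically even though their absolute CF values differ, which is the consistency property the review reports.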
xLPR Sim Editor 1.0 User's Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mariner, Paul E.
2017-03-01
The United States Nuclear Regulatory Commission, in cooperation with the Electric Power Research Institute, contracted Sandia National Laboratories to develop the framework of a probabilistic fracture mechanics assessment code called xLPR (Extremely Low Probability of Rupture) Version 2.0. The purpose of xLPR is to evaluate degradation mechanisms in piping systems at nuclear power plants and to predict the probability of rupture. This report is a user's guide for xLPR Sim Editor 1.0, a graphical user interface for creating and editing the xLPR Version 2.0 input file and for creating, editing, and using the xLPR Version 2.0 database files. The xLPR Sim Editor provides a user-friendly way to change simulation options and input values, select input datasets from xLPR databases, identify the inputs needed for a simulation, and create and modify an input file for xLPR.
Novel Models of Visual Topographic Map Alignment in the Superior Colliculus
El-Ghazawi, Tarek A.; Triplett, Jason W.
2016-01-01
The establishment of precise neuronal connectivity during development is critical for sensing the external environment and informing appropriate behavioral responses. In the visual system, many connections are organized topographically, which preserves the spatial order of the visual scene. The superior colliculus (SC) is a midbrain nucleus that integrates visual inputs from the retina and primary visual cortex (V1) to regulate goal-directed eye movements. In the SC, topographically organized inputs from the retina and V1 must be aligned to facilitate integration. Previously, we showed that retinal input instructs the alignment of V1 inputs in the SC in a manner dependent on spontaneous neuronal activity; however, the mechanism of activity-dependent instruction remains unclear. To begin to address this gap, we developed two novel computational models of visual map alignment in the SC that incorporate distinct activity-dependent components. First, a Correlational Model assumes that V1 inputs achieve alignment with established retinal inputs through simple correlative firing mechanisms. A second Integrational Model assumes that V1 inputs contribute to the firing of SC neurons during alignment. Both models accurately replicate in vivo findings in wild-type, transgenic, and combination mutant mouse models, suggesting that either activity-dependent mechanism is plausible. In silico experiments reveal distinct behaviors in response to weakening retinal drive, providing insight into how the system governing map alignment behaves under each activity-dependent strategy. Overall, we describe novel computational frameworks of visual map alignment that accurately model many aspects of the in vivo process and propose experiments to test them. PMID:28027309
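The Correlational Model idea can be illustrated with a minimal sketch: V1-to-SC weights adapt by a simple Hebbian rule while retinal input drives SC through an already established retinotopic map, and correlated wave-like activity aligns the two maps. All parameters and the activity pattern are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
n_units = 20          # retinal / V1 units, topographic positions in [0, 1]
n_sc = 20             # SC neurons, positions in [0, 1]
pos = np.linspace(0, 1, n_units)
sc_pos = np.linspace(0, 1, n_sc)

# Established retinotopy: each retinal unit drives SC via a fixed Gaussian kernel.
ret_w = np.exp(-((pos[:, None] - sc_pos[None, :]) ** 2) / (2 * 0.05 ** 2))

# V1 -> SC weights start unorganized.
v1_w = rng.random((n_units, n_sc)) * 0.1

lr = 0.05
for _ in range(2000):
    center = rng.random()                                   # locus of a spontaneous wave
    act = np.exp(-((pos - center) ** 2) / (2 * 0.1 ** 2))   # shared retinal & V1 activity
    sc_act = act @ ret_w                                    # SC response driven by retina
    v1_w += lr * np.outer(act, sc_act)                      # Hebbian correlation rule
    v1_w /= v1_w.sum(axis=1, keepdims=True)                 # keep each row's weights bounded

# Each V1 unit's strongest SC target should now match its topographic position.
aligned = sc_pos[v1_w.argmax(axis=1)]
print(np.corrcoef(pos, aligned)[0, 1])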
Lee, Taehee; Kim, Uhnoh
2012-04-01
In the mammalian somatic system, peripheral inputs from cutaneous and deep receptors ascend via different subcortical channels and terminate in largely separate regions of the primary somatosensory cortex (SI). How these inputs are processed in SI and then projected back to the subcortical relay centers is critical for understanding how SI may regulate somatic information processing in the subcortex. Although it is now relatively well understood how SI cutaneous areas project to the subcortical structures, little is known about the descending projections from SI areas processing deep somatic input. We examined this issue by using the rodent somatic system as a model. In rat SI, deep somatic input is processed mainly in the dysgranular zone (DSZ) enclosed by the cutaneous barrel subfields. By using biotinylated dextran amine (BDA) as anterograde tracer, we characterized the topography of corticostriatal and corticofugal projections arising in the DSZ. The DSZ projections terminate mainly in the lateral subregions of the striatum that are also known as the target of certain SI cutaneous areas. This suggests that SI processing of deep and cutaneous information may be integrated, to a certain degree, in this striatal region. By contrast, at both thalamic and prethalamic levels as far as the spinal cord, descending projections from DSZ terminate in areas largely distinguishable from those that receive input from SI cutaneous areas. These subcortical targets of DSZ include not only the sensory but also motor-related structures, suggesting that SI processing of deep input may engage in regulating somatic and motor information flow between the cortex and periphery. Copyright © 2011 Wiley-Liss, Inc.
Neuromuscular mechanisms and neural strategies in the control of time-varying muscle contractions.
Erimaki, Sophia; Agapaki, Orsalia M; Christakos, Constantinos N
2013-09-01
The organization of the neural input to motoneurons that underlies time-varying muscle force is assumed to depend on muscle transfer characteristics and on neural strategies or control modes utilizing sensory signals. We jointly addressed these interlinked issues, which had previously been studied only individually and partially, for sinusoidal (range 0.5-5.0 Hz) force-tracking contractions of a human finger muscle. Using spectral and correlation analyses of the target signal, force signal, and motor unit (MU) discharges, we studied 1) patterns of such discharges, allowing inferences on the motoneuronal input; 2) the transformation of MU population activity (EMG) into quasi-sinusoidal force; and 3) the relation of the force oscillation to the target, carrying information on the input's organization. A broad view of force control mechanisms and strategies emerged. Specifically, synchronized MU and EMG modulations, reflecting a frequency-modulated motoneuronal input, accompanied the force variations. Gain and delay drops between EMG modulation and force oscillation, critical for the appropriate organization of this input, occurred with increasing target frequency. According to our analyses, gain compensation was achieved primarily through rhythmical activation/deactivation of higher-threshold MUs and secondarily through the adaptation of the input's strength expected during tracking tasks. However, the input's timing was not adapted to delay behaviors and seemed to depend on the control modes employed. Thus, for low-frequency targets, the force oscillation was highly coherent with, but led, the target; this timing error is compatible with predictive feedforward control partly based on the target's derivatives. In contrast, the force oscillation was weakly coherent, but in phase, with high-frequency targets, suggesting control based mainly on the target's rhythm.
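The coherence and timing-error measures used in this kind of analysis can be sketched with synthetic signals. The sampling rate, lead time, and noise level below are arbitrary illustrative choices, not the study's recordings:

```python
import numpy as np
from scipy import signal

fs = 1000.0                                   # sampling rate (Hz), illustrative
t = np.arange(0, 20, 1 / fs)
f_target = 1.0                                # low-frequency sinusoidal target (Hz)
lead = 0.05                                   # force leads target by 50 ms (hypothetical)

rng = np.random.default_rng(1)
target = np.sin(2 * np.pi * f_target * t)
force = np.sin(2 * np.pi * f_target * (t + lead)) + 0.2 * rng.standard_normal(t.size)

# Coherence quantifies the linear coupling between target and force; the
# cross-spectrum phase at the target frequency gives the timing error.
f, coh = signal.coherence(target, force, fs=fs, nperseg=5000)
_, pxy = signal.csd(target, force, fs=fs, nperseg=5000)
k = np.argmin(np.abs(f - f_target))

print(f"coherence at {f_target} Hz: {coh[k]:.2f}")
print(f"phase at {f_target} Hz: {np.degrees(np.angle(pxy[k])):.1f} deg")
```

For this synthetic low-frequency case the coherence is near 1 and the phase is a few tens of degrees, mirroring the paper's "highly coherent but leading" pattern for low-frequency targets.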
An Integrative Review of the Concealed Connection: Nurse Educators' Critical Thinking.
Raymond, Christy; Profetto-McGrath, Joanne; Myrick, Florence; Strean, William B
2017-11-01
The role of nurse educators in the development of students' critical thinking has been overlooked despite the emphasized need for effective teaching methods. An integrative review was performed to examine both quantitative and qualitative research published from 2000 to 2015 related to nurse educators' critical thinking. Many barriers and facilitators at the individual, interpersonal, and contextual levels affected nurse educators' critical thinking. Various tools have been used to measure nurse educators' critical thinking. This review also highlighted the continued lack of a consensus definition of critical thinking and the limited presence of conceptual models to guide the use of critical thinking in nursing education. Continued examination of nurse educators' critical thinking is needed, given the limited number of studies that have been completed. Much remains to be explored, including conceptualizations of critical thinking and confirmation of the emerging themes identified in this review. [J Nurs Educ. 2017;56(11):648-654.]. © 2017 Raymond, Profetto-McGrath, Myrick, et al.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-21
... meeting basic human needs with the skills they need to lead positive, productive, and contributing lives... guidance to NIFA in restructuring the program and assisting NIFA leadership in more fully addressing...
What exactly is Universal Grammar, and has anyone seen it?
Dąbrowska, Ewa
2015-01-01
Universal Grammar (UG) is a suspect concept. There is little agreement on what exactly is in it; and the empirical evidence for it is very weak. This paper critically examines a variety of arguments that have been put forward as evidence for UG, focussing on the three most powerful ones: universality (all human languages share a number of properties), convergence (all language learners converge on the same grammar in spite of the fact that they are exposed to different input), and poverty of the stimulus (children know things about language which they could not have learned from the input available to them). I argue that these arguments are based on premises which are either false or unsubstantiated. Languages differ from each other in profound ways, and there are very few true universals, so the fundamental crosslinguistic fact that needs explaining is diversity, not universality. A number of recent studies have demonstrated the existence of considerable differences in adult native speakers’ knowledge of the grammar of their language, including aspects of inflectional morphology, passives, quantifiers, and a variety of more complex constructions, so learners do not in fact converge on the same grammar. Finally, the poverty of the stimulus argument presupposes that children acquire linguistic representations of the kind postulated by generative grammarians; constructionist grammars such as those proposed by Tomasello, Goldberg and others can be learned from the input. We are the only species that has language, so there must be something unique about humans that makes language learning possible. The extent of crosslinguistic diversity and the considerable individual differences in the rate, style and outcome of acquisition suggest that it is more promising to think in terms of a language-making capacity, i.e., a set of domain-general abilities, rather than an innate body of knowledge about the structural properties of the target system. PMID:26157406
Nutrient supply and mercury dynamics in marine ecosystems: A conceptual model
Chen, Celia Y.; Hammerschmidt, Chad R.; Mason, Robert P.; Gilmour, Cynthia C.; Sunderland, Elsie M.; Greenfield, Ben K.; Buckman, Kate L.; Lamborg, Carl H.
2013-01-01
There is increasing interest and concern over the impacts of mercury (Hg) inputs to marine ecosystems. One of the challenges in assessing these effects is that the cycling and trophic transfer of Hg are strongly linked to other contaminants and disturbances. In addition to Hg, a major problem facing coastal waters is the impacts of elevated nutrient, particularly nitrogen (N), inputs. Increases in nutrient loading alter coastal ecosystems in ways that should change the transport, transformations and fate of Hg, including increases in fixation of organic carbon and deposition to sediments, decreases in the redox status of sediments and changes in fish habitat. In this paper we present a conceptual model which suggests that increases in loading of reactive N to marine ecosystems might alter Hg dynamics, decreasing bioavailability and trophic transfer. This conceptual model is most applicable to coastal waters, but may also be relevant to the pelagic ocean. We present information from case studies that both support and challenge this conceptual model, including marine observations across a nutrient gradient; results of a nutrient-trophic transfer Hg model for pelagic and coastal ecosystems; observations of Hg species and nutrients from coastal sediments in the northeastern U.S.; and an analysis of fish Hg concentrations in estuaries under different nutrient loadings. These case studies suggest that changes in nutrient loading can impact Hg dynamics in coastal and open ocean ecosystems. Unfortunately, none of the case studies is comprehensive; each only addresses a portion of the conceptual model and has limitations. Nevertheless, our conceptual model has important management implications. Many estuaries near developed areas are impaired due to elevated nutrient inputs. Widespread efforts are underway to control N loading and restore coastal ecosystem function.
An unintended consequence of nutrient control measures could be to exacerbate problems associated with Hg contamination. Additional focused research and monitoring are needed to critically examine the link between nutrient supply and Hg contamination of marine waters. PMID:22749872
Stochastic techno-economic analysis of alcohol-to-jet fuel production.
Yao, Guolin; Staples, Mark D; Malina, Robert; Tyner, Wallace E
2017-01-01
Alcohol-to-jet (ATJ) is one of the technically feasible biofuel technologies. It produces jet fuel from sugary, starchy, and lignocellulosic biomass, such as sugarcane, corn grain, and switchgrass, via fermentation of sugars to ethanol or other alcohols. This study assesses the ATJ biofuel production pathway for these three biomass feedstocks, and advances existing techno-economic analyses of biofuels in three ways. First, we incorporate technical uncertainty for all by-products and co-products through statistical linkages between conversion efficiencies and input and output levels. Second, future price uncertainty is based on case-by-case time-series estimation, and a local sensitivity analysis is conducted with respect to each uncertain variable. Third, breakeven price distributions are developed to communicate the inherent uncertainty in the breakeven price. This research also considers uncertainties in utility input requirements and in fuel and by-product outputs, as well as price uncertainties for all major inputs, products, and co-products. All analyses are done from the perspective of a private firm. The stochastic dominance results of the net present value (NPV) and breakeven price distributions show that sugarcane is the lowest-cost feedstock over the entire range of uncertainty, with the least risk, followed by corn grain and switchgrass, with mean breakeven jet fuel prices of $0.96/L ($3.65/gal), $1.01/L ($3.84/gal), and $1.38/L ($5.21/gal), respectively. The variation of revenues from by-products in the corn grain pathway can significantly impact its profitability. Sensitivity analyses show that technical uncertainty significantly impacts breakeven price and NPV distributions; it is critical in determining the economic performance of the ATJ fuel pathway and needs to be considered in future economic analyses.
With the distribution of breakeven prices, potential investors can apply whatever risk preferences they like to determine an appropriate bid or breakeven price that matches their risk profile.
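The breakeven-price-distribution approach can be sketched with a toy Monte Carlo. The distributions below are invented for illustration and do not reproduce the paper's calibrated inputs:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # Monte Carlo draws

# Hypothetical input distributions (illustrative, not the paper's values).
feedstock_cost = rng.normal(0.45, 0.05, n)   # $ per L of alcohol intermediate
conversion     = rng.normal(0.60, 0.04, n)   # L jet fuel per L intermediate (technical uncertainty)
opex           = rng.normal(0.15, 0.02, n)   # other operating cost, $ per L jet fuel
coproduct_rev  = rng.normal(0.10, 0.03, n)   # by-product credit, $ per L jet fuel

# Breakeven price: the jet fuel price at which the producer just breaks even,
# reduced here to a per-litre cost balance for illustration.
breakeven = feedstock_cost / conversion + opex - coproduct_rev

print(f"mean breakeven: ${breakeven.mean():.2f}/L")
print(f"5th-95th percentile: ${np.percentile(breakeven, 5):.2f}-{np.percentile(breakeven, 95):.2f}/L")
```

Reporting the full distribution rather than a point estimate is what lets an investor apply their own risk preference, e.g. bidding at the 75th percentile rather than the mean.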
Phillips, Holly N; Blenkmann, Alejandro; Hughes, Laura E; Kochen, Silvia; Bekinschtein, Tristan A; Cam-Can; Rowe, James B
2016-09-01
We propose that sensory inputs are processed in terms of optimised predictions and prediction error signals within hierarchical neurocognitive models. The combination of non-invasive brain imaging and generative network models has provided support for hierarchical frontotemporal interactions in oddball tasks, including recent identification of a temporal expectancy signal acting on prefrontal cortex. However, these studies are limited by the need to invert magnetoencephalographic or electroencephalographic sensor signals to localise activity from cortical 'nodes' in the network, or to infer neural responses from indirect measures such as the fMRI BOLD signal. To overcome this limitation, we examined frontotemporal interactions estimated from direct cortical recordings from two human participants with cortical electrode grids (electrocorticography - ECoG). Their frontotemporal network dynamics were compared to those identified by magnetoencephalography (MEG) in forty healthy adults. All participants performed the same auditory oddball task with standard tones interspersed with five deviant tone types. We normalised post-operative electrode locations to standardised anatomic space, to compare across modalities, and inverted the MEG to cortical sources using the estimated lead field from subject-specific head models. A mismatch negativity signal in frontal and temporal cortex was identified in all subjects. Generative models of the electrocorticographic and magnetoencephalographic data were separately compared using the free-energy estimate of the model evidence. Model comparison confirmed the same critical features of hierarchical frontotemporal networks in each patient as in the group-wise MEG analysis. These features included bilateral, feedforward and feedback frontotemporal modulated connectivity, in addition to an asymmetric expectancy driving input on left frontal cortex. 
The invasive ECoG provides an important step in construct validation of the use of neural generative models of MEG, which in turn enables generalisation to larger populations. Together, they give convergent evidence for the hierarchical interactions in frontotemporal networks for expectation and processing of sensory inputs. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.
Nutrient supply and mercury dynamics in marine ecosystems: a conceptual model.
Driscoll, Charles T; Chen, Celia Y; Hammerschmidt, Chad R; Mason, Robert P; Gilmour, Cynthia C; Sunderland, Elsie M; Greenfield, Ben K; Buckman, Kate L; Lamborg, Carl H
2012-11-01
There is increasing interest and concern over the impacts of mercury (Hg) inputs to marine ecosystems. One of the challenges in assessing these effects is that the cycling and trophic transfer of Hg are strongly linked to other contaminants and disturbances. In addition to Hg, a major problem facing coastal waters is the impacts of elevated nutrient, particularly nitrogen (N), inputs. Increases in nutrient loading alter coastal ecosystems in ways that should change the transport, transformations and fate of Hg, including increases in fixation of organic carbon and deposition to sediments, decreases in the redox status of sediments and changes in fish habitat. In this paper we present a conceptual model which suggests that increases in loading of reactive N to marine ecosystems might alter Hg dynamics, decreasing bioavailability and trophic transfer. This conceptual model is most applicable to coastal waters, but may also be relevant to the pelagic ocean. We present information from case studies that both support and challenge this conceptual model, including marine observations across a nutrient gradient; results of a nutrient-trophic transfer Hg model for pelagic and coastal ecosystems; observations of Hg species and nutrients from coastal sediments in the northeastern U.S.; and an analysis of fish Hg concentrations in estuaries under different nutrient loadings. These case studies suggest that changes in nutrient loading can impact Hg dynamics in coastal and open ocean ecosystems. Unfortunately, none of the case studies is comprehensive; each only addresses a portion of the conceptual model and has limitations. Nevertheless, our conceptual model has important management implications. Many estuaries near developed areas are impaired due to elevated nutrient inputs. Widespread efforts are underway to control N loading and restore coastal ecosystem function.
An unintended consequence of nutrient control measures could be to exacerbate problems associated with Hg contamination. Additional focused research and monitoring are needed to critically examine the link between nutrient supply and Hg contamination of marine waters. Copyright © 2012 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilder, Julie; Mancini, Giulio M.; Wakabi, Timothy
A survey was designed to query former biorisk management (BRM) trainees in the East Africa region about their practices post-training and their perceived future training needs. A subset of those surveyed had been trained as BRM trainers. The survey was conducted to obtain a baseline of BRM practices that can serve as a benchmark for performance monitoring, to identify priorities for future BRM training, and to gauge local BRM trainers' abilities to deliver effective training. The survey revealed that less than 50% of the respondents could identify evidence of a BRM system in their institute. Coaching and mentoring by BRM experts was identified as being of highest benefit to enable success as BRM practitioners. Local trainers reached 1538 trainees in the previous year and reported that trainings positively correlated with desired BRM behavior. Acknowledgements: The authors wish to sincerely thank all of the former biorisk management trainees in East Africa who agreed to participate in this survey. Their candid and honest input was extremely insightful. We also thank Lora Grainger (06826) and Ben Brodsky (Manager, 06824) for careful and critical review of the report. We are grateful for the financial support of the Defense Threat Reduction Agency, Cooperative Biological Engagement Program.
Moving across scales: Challenges and opportunities in upscaling carbon fluxes
NASA Astrophysics Data System (ADS)
Naithani, K. J.
2016-12-01
Light use efficiency (LUE) type models are commonly used to upscale terrestrial C fluxes and estimate regional and global C budgets. Model parameters are often estimated for each land cover type (LCT) using flux observations from one or more eddy covariance towers, and then spatially extrapolated by integrating land cover, meteorological, and remotely sensed data. Decisions regarding the type of input data (spatial resolution of land cover data, spatial and temporal length of flux data), the representation of landscape structure (land use vs. disturbance regime), and the type of modeling framework (common risk vs. hierarchical) all influence the estimated CO2 fluxes and the associated uncertainties, but are rarely considered together. This work presents a synthesis of past and present efforts for upscaling CO2 fluxes and associated uncertainties in the ChEAS (Chequamegon Ecosystem Atmosphere Study) region in northern Wisconsin and the Upper Peninsula of Michigan. This work highlights two key future research needs. First, the characterization of uncertainties due to all of the abovementioned factors reflects only a (hopefully relevant) subset of the overall uncertainties. Second, interactions among these factors are likely critical, but are poorly represented by the tower network at landscape scales. Yet, results indicate significant spatial and temporal heterogeneity of uncertainty in CO2 fluxes, which can inform carbon management efforts and prioritize data needs.
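A light-use-efficiency model of the kind described above can be sketched as follows. The maximum efficiency and the ramp-function bounds are hypothetical, loosely MODIS-style values, not the parameters estimated in this work:

```python
import numpy as np

def lue_gpp(par, fapar, tmin, vpd,
            eps_max=1.8,                 # g C per MJ APAR, hypothetical LCT-specific value
            tmin_range=(-8.0, 8.0),      # deg C ramp for the temperature scalar
            vpd_range=(650.0, 3100.0)):  # Pa ramp for the VPD scalar
    """GPP (g C m^-2 d^-1) as max efficiency down-regulated by climate scalars.

    GPP = eps_max * f(Tmin) * f(VPD) * fAPAR * PAR, with linear 0-1 ramps.
    """
    f_t = np.clip((tmin - tmin_range[0]) / (tmin_range[1] - tmin_range[0]), 0, 1)
    f_v = np.clip((vpd_range[1] - vpd) / (vpd_range[1] - vpd_range[0]), 0, 1)
    return eps_max * f_t * f_v * fapar * par

# One grid cell, one day: PAR in MJ m^-2 d^-1, fAPAR from remote sensing.
print(lue_gpp(par=10.0, fapar=0.7, tmin=5.0, vpd=1200.0))
```

Upscaling then amounts to evaluating this function per grid cell with LCT-specific `eps_max` values fitted at the towers, which is why the choices of land cover data and parameter-estimation framework propagate directly into the flux estimates and their uncertainty.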
In defense of abstract conceptual representations.
Binder, Jeffrey R
2016-08-01
An extensive program of research in the past 2 decades has focused on the role of modal sensory, motor, and affective brain systems in storing and retrieving concept knowledge. This focus has led in some circles to an underestimation of the need for more abstract, supramodal conceptual representations in semantic cognition. Evidence for supramodal processing comes from neuroimaging work documenting a large, well-defined cortical network that responds to meaningful stimuli regardless of modal content. The nodes in this network correspond to high-level "convergence zones" that receive broadly crossmodal input and presumably process crossmodal conjunctions. It is proposed that highly conjunctive representations are needed for several critical functions, including capturing conceptual similarity structure, enabling thematic associative relationships independent of conceptual similarity, and providing efficient "chunking" of concept representations for a range of higher order tasks that require concepts to be configured as situations. These hypothesized functions account for a wide range of neuroimaging results showing modulation of the supramodal convergence zone network by associative strength, lexicality, familiarity, imageability, frequency, and semantic compositionality. The evidence supports a hierarchical model of knowledge representation in which modal systems provide a mechanism for concept acquisition and serve to ground individual concepts in external reality, whereas broadly conjunctive, supramodal representations play an equally important role in concept association and situation knowledge.