Parallel sort with a ranged, partitioned key-value store in a high performance computing environment
Bent, John M.; Faibish, Sorin; Grider, Gary; Torres, Aaron; Poole, Stephen W.
2016-01-26
Improved sorting techniques are provided that perform a parallel sort using a ranged, partitioned key-value store in a high performance computing (HPC) environment. A plurality of input data files comprising unsorted key-value data in a partitioned key-value store are sorted. The partitioned key-value store comprises a range server for each of a plurality of ranges. Each input data file has an associated reader thread. Each reader thread reads the unsorted key-value data in the corresponding input data file and performs a local sort of the unsorted key-value data to generate sorted key-value data. A plurality of sorted, ranged subsets of each of the sorted key-value data are generated based on the plurality of ranges. Each sorted, ranged subset corresponds to a given one of the ranges and is provided to one of the range servers corresponding to the range of the sorted, ranged subset. Each range server sorts the received sorted, ranged subsets and provides a sorted range. A plurality of the sorted ranges are concatenated to obtain a globally sorted result.
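The sketch below illustrates the sort pattern this abstract describes: each reader performs a local sort, routes sorted subsets to range servers by key range, each range server merges its subsets, and the sorted ranges are concatenated. It is a single-process Python illustration with assumed function names and fixed range boundaries, not the patented HPC implementation.

```python
# Illustrative sketch of a range-partitioned parallel sort (assumed names and fixed
# boundaries; not the patented implementation, which targets a partitioned KV store).
from bisect import bisect_right
from concurrent.futures import ThreadPoolExecutor

def reader_local_sort(records):
    """Each 'reader thread' locally sorts its unsorted key-value records."""
    return sorted(records, key=lambda kv: kv[0])

def split_into_ranges(sorted_records, boundaries):
    """Split one reader's sorted output into per-range sorted subsets."""
    subsets = [[] for _ in range(len(boundaries) + 1)]
    for key, value in sorted_records:
        subsets[bisect_right(boundaries, key)].append((key, value))
    return subsets

def range_server_merge(subsets_for_range):
    """Each 'range server' merges the sorted subsets routed to its range."""
    return sorted((kv for subset in subsets_for_range for kv in subset),
                  key=lambda kv: kv[0])

def parallel_sort(input_files, boundaries):
    with ThreadPoolExecutor() as pool:
        locally_sorted = list(pool.map(reader_local_sort, input_files))
        partitioned = [split_into_ranges(s, boundaries) for s in locally_sorted]
        per_range = [[p[i] for p in partitioned] for i in range(len(boundaries) + 1)]
        sorted_ranges = list(pool.map(range_server_merge, per_range))
    # Concatenating the per-range results yields the globally sorted output.
    return [kv for r in sorted_ranges for kv in r]

if __name__ == "__main__":
    files = [[(5, "e"), (1, "a")], [(9, "i"), (3, "c")], [(7, "g"), (2, "b")]]
    print(parallel_sort(files, boundaries=[4, 8]))
```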
Mobile source reference material for activity data collection from the Emissions Inventory Improvement Program (EIIP). Provides complete methods for collecting key inputs to onroad mobile and nonroad mobile emissions models.
76 FR 64083 - Loveland Area Projects-2025 Power Marketing Initiative Proposal
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-17
..., with amendments to key marketing plan principles. This Federal Register notice initiates Western's... published in the Federal Register (51 FR 4012, January 31, 1986) and provided the marketing plan principles... provided customers the opportunity to review current marketing plan principles and provide informal input...
ERIC Educational Resources Information Center
Lopez-Catalan, Blanca; Bañuls, Victor A.
2017-01-01
Purpose: The purpose of this paper is to present the results of a national-level Delphi study carried out in Spain, aimed at providing inputs for higher education administrators and decision makers about key e-learning trends for supporting postgraduate courses. Design/methodology/approach: The ranking of the e-learning trends is based on a…
Balancing Authority Cooperation Concepts - Intra-Hour Scheduling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunsaker, Matthew; Samaan, Nader; Milligan, Michael
2013-03-29
The overall objective of this study was to understand, on an Interconnection-wide basis, the effects of intra-hour scheduling compared to hourly scheduling. Moreover, the study sought to understand how the benefits of intra-hour scheduling would change by altering the input assumptions in different scenarios. This report describes the results of three separate scenarios with differing key assumptions, comparing production costs between hourly scheduling and 10-minute scheduling. The different scenarios were chosen to provide insight into how the estimated benefits might change by altering input assumptions. Several key assumptions differed across the three scenarios; however, most assumptions were similar and/or unchanged among the scenarios.
Entropy-as-a-Service: Unlocking the Full Potential of Cryptography.
Vassilev, Apostol; Staples, Robert
2016-09-01
Securing the Internet requires strong cryptography, which depends on the availability of good entropy for generating unpredictable keys and accurate clocks. Attacks abusing weak keys or old inputs portend challenges for the Internet. EaaS is a novel architecture providing entropy and timestamps from a decentralized root of trust, scaling gracefully across diverse geopolitical locales and remaining trustworthy unless much of the collective is compromised.
Lifelong Learning in Public Libraries in 12 European Union Countries: Policy and Considerations
ERIC Educational Resources Information Center
Stanziola, Javier
2010-01-01
Public libraries have traditionally provided key inputs to support lifelong learning. More recently, significant social and technological changes have challenged this sector to redefine their role in this field. For most public libraries in Europe this has meant continuing their role as providers of information and advice while increasing services…
76 FR 71015 - Pick-Sloan Missouri Basin Program-Eastern Division-2021 Power Marketing Initiative
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-16
... (45 FR 71860, October 30, 1980) and provided the marketing plan principles used to market P-SMBP--ED... opportunity to review current Marketing Plan principles and provide informal input to Western for consideration in the 2021 PMI proposal. Key Marketing Plan principles discussed with firm power customers...
MST radar transmitter control and monitor system
NASA Technical Reports Server (NTRS)
Brosnahan, J. W.
1983-01-01
A generalized transmitter control and monitor card was developed using the Intel 8031 (8051 family) microprocessor. The design was generalized so that this card can be utilized for virtually any control application with only firmware changes. The block diagram appears in Figure 2. The card provides for local control using a 16-key keypad (up to 64 keys are supported). The local display is four digits of 7-segment LEDs. The display can indicate the status of all major system parameters and provide voltage readout for the analog signal inputs. The card can be populated with only the chips required for a given application. Fully populated, the card has two RS-232 serial ports for computer communications. It has a total of 48 TTL parallel lines that can be defined as either inputs or outputs in groups of four. A total of 32 analog inputs with a 0-5 volt range are supported. In addition, a real-time clock/calendar is available if required. A total of 16 kbytes of ROM and 16 kbytes of RAM are available for programming. This card can be the basis of virtually any monitor or control system with appropriate software.
Entropy-as-a-Service: Unlocking the Full Potential of Cryptography
Vassilev, Apostol; Staples, Robert
2016-01-01
Securing the Internet requires strong cryptography, which depends on the availability of good entropy for generating unpredictable keys and accurate clocks. Attacks abusing weak keys or old inputs portend challenges for the Internet. EaaS is a novel architecture providing entropy and timestamps from a decentralized root of trust, scaling gracefully across diverse geopolitical locales and remaining trustworthy unless much of the collective is compromised. PMID:28003687
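A minimal sketch of the client-side idea in this architecture: mix remotely supplied entropy with locally generated entropy before deriving key material, so neither source has to be trusted alone. The endpoint URL is hypothetical, and the real EaaS protocol also covers timestamps and attestation, which are not shown.

```python
# Sketch of mixing remote and local entropy before key derivation (hypothetical URL;
# the actual EaaS protocol, including timestamps and attestation, is not reproduced).
import hashlib
import os
import urllib.request

def fetch_remote_entropy(url="https://eaas.example/entropy", n=32):
    """Hypothetical endpoint returning n bytes of entropy."""
    with urllib.request.urlopen(url, timeout=5) as resp:
        return resp.read(n)

def derive_seed():
    local = os.urandom(32)
    try:
        remote = fetch_remote_entropy()
    except OSError:
        remote = b""          # degrade gracefully if the service is unreachable
    # Hashing the concatenation keeps the seed strong if either source is good.
    return hashlib.sha256(local + remote).digest()

if __name__ == "__main__":
    print(derive_seed().hex())
```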
Program for creating an operating system generation cross reference index (SGINDEX)
NASA Technical Reports Server (NTRS)
Barth, C. W.
1972-01-01
A computer program that collects key data from the Stage Two input of the OS/360 system and prepares a formatted listing of the collected index entries is discussed. The program eliminates manual paging through system output by providing a comprehensive cross-reference.
1992 NASA Life Support Systems Analysis workshop
NASA Technical Reports Server (NTRS)
Evanich, Peggy L.; Crabb, Thomas M.; Gartrell, Charles F.
1992-01-01
The 1992 Life Support Systems Analysis Workshop was sponsored by NASA's Office of Aeronautics and Space Technology (OAST) to integrate the inputs from, disseminate information to, and foster communication among NASA, industry, and academic specialists. The workshop continued discussion and definition of key issues identified in the 1991 workshop, including: (1) modeling and experimental validation; (2) definition of systems analysis evaluation criteria; (3) integration of modeling at multiple levels; and (4) assessment of process control modeling approaches. Through both the 1991 and 1992 workshops, NASA has continued to seek input from industry and university chemical process modeling and analysis experts, and to introduce and apply new systems analysis approaches to life support systems. The workshop included technical presentations, discussions, and interactive planning, with sufficient time allocated for discussion of both technology status and technology development recommendations. Key personnel currently involved with life support technology developments from NASA, industry, and academia provided input to the status and priorities of current and future systems analysis methods and requirements.
Intelligent Visual Input: A Graphical Method for Rapid Entry of Patient-Specific Data
Bergeron, Bryan P.; Greenes, Robert A.
1987-01-01
Intelligent Visual Input (IVI) provides a rapid, graphical method of data entry for both expert system interaction and medical record keeping purposes. Key components of IVI include: a high-resolution graphic display; an interface supportive of rapid selection, i.e., one utilizing a mouse or light pen; algorithm simplification modules; and intelligent graphic algorithm expansion modules. A prototype IVI system, designed to facilitate entry of physical exam findings, is used to illustrate the potential advantages of this approach.
Development of a Stochastically-driven, Forward Predictive Performance Model for PEMFCs
NASA Astrophysics Data System (ADS)
Harvey, David Benjamin Paul
A one-dimensional multi-scale coupled, transient, and mechanistic performance model for a PEMFC membrane electrode assembly has been developed. The model explicitly includes each of the 5 layers within a membrane electrode assembly and solves for the transport of charge, heat, mass, species, dissolved water, and liquid water. Key features of the model include the use of a multi-step implementation of the HOR reaction on the anode, agglomerate catalyst sub-models for both the anode and cathode catalyst layers, a unique approach that links the composition of the catalyst layer to key properties within the agglomerate model and the implementation of a stochastic input-based approach for component material properties. The model employs a new methodology for validation using statistically varying input parameters and statistically-based experimental performance data; this model represents the first stochastic input driven unit cell performance model. The stochastic input driven performance model was used to identify optimal ionomer content within the cathode catalyst layer, demonstrate the role of material variation in potential low performing MEA materials, provide explanation for the performance of low-Pt loaded MEAs, and investigate the validity of transient-sweep experimental diagnostic methods.
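The abstract describes propagating statistically varying inputs through a performance model. The sketch below shows only that general pattern: sample material properties from assumed distributions and push them through a placeholder voltage model to obtain a distribution of outputs. The model form and distributions are illustrative assumptions, not the authors' multi-scale MEA model.

```python
# Toy Monte Carlo propagation of stochastic material-property inputs through a
# placeholder performance model (assumed distributions; not the authors' MEA model).
import numpy as np

rng = np.random.default_rng(3)
n_virtual_cells = 1000

ionomer_fraction = rng.normal(0.30, 0.03, n_virtual_cells)
porosity = rng.normal(0.45, 0.05, n_virtual_cells)
pt_loading = rng.normal(0.20, 0.02, n_virtual_cells)   # mg/cm^2, illustrative

def cell_voltage(ionomer, eps, pt, current_density=1.0):
    """Placeholder model: a Tafel-like kinetic term minus simple transport penalties."""
    kinetic = 0.9 - 0.07 * np.log(current_density / np.clip(pt, 1e-3, None))
    ohmic = 0.08 * current_density * 0.3 / np.clip(ionomer, 1e-3, None)
    mass_transport = 0.05 * current_density * 0.45 / np.clip(eps, 1e-3, None)
    return kinetic - ohmic - mass_transport

voltages = cell_voltage(ionomer_fraction, porosity, pt_loading)
print(f"mean {voltages.mean():.3f} V, 5th-95th percentile "
      f"{np.percentile(voltages, 5):.3f}-{np.percentile(voltages, 95):.3f} V")
```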
Input from Key Stakeholders in the National Security Technology Incubator
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
This report documents the input from key stakeholders of the National Security Technology Incubator (NSTI) in developing a new technology incubator and related programs for southern New Mexico. The technology incubator is being developed as part of the National Security Preparedness Project (NSPP), funded by a Department of Energy (DOE)/National Nuclear Security Administration (NNSA) grant. This report includes identification of key stakeholders as well as a description and analysis of their input for the development of an incubator.
Replacing Fortran Namelists with JSON
NASA Astrophysics Data System (ADS)
Robinson, T. E., Jr.
2017-12-01
Maintaining a log of input parameters for a climate model is very important to understanding potential causes for answer changes during the development stages. Additionally, since modern Fortran is now interoperable with C, a more modern approach to software infrastructure to include code written in C is necessary. Merging these two separate facets of climate modeling requires a quality control for monitoring changes to input parameters and model defaults that can work with both Fortran and C. JSON will soon replace namelists as the preferred key/value pair input in the GFDL model. By adding a JSON parser written in C into the model, the input can be used by all functions and subroutines in the model, errors can be handled by the model instead of by the internal namelist parser, and the values can be output into a single file that is easily parsable by readily available tools. Input JSON files can handle all of the functionality of a namelist while being portable between C and Fortran. Fortran wrappers using unlimited polymorphism are crucial to allow for simple and compact code which avoids the need for many subroutines contained in an interface. Errors can be handled with more detail by providing information about location of syntax errors or typos. The output JSON provides a ground truth for values that the model actually uses by providing not only the values loaded through the input JSON, but also any default values that were not included. This kind of quality control on model input is crucial for maintaining reproducibility and understanding any answer changes resulting from changes in the input.
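A minimal Python sketch of the input-handling pattern the abstract describes: read user-supplied key/value pairs, report the location of syntax errors, merge over model defaults, and write back a single file recording every value actually used. The parameter names and file names are hypothetical; the GFDL implementation is a C parser with Fortran wrappers.

```python
# Sketch of the JSON key/value input pattern described above (hypothetical keys and
# file names; the GFDL model's actual parser is written in C with Fortran wrappers).
import json

DEFAULTS = {"dt_atmos": 1800, "do_radiation": True, "albedo": 0.3}

def load_input(path):
    """Read user-supplied key/value pairs; point at the location of syntax errors."""
    with open(path) as f:
        try:
            return json.load(f)
        except json.JSONDecodeError as err:
            raise SystemExit(f"{path}: syntax error at line {err.lineno}, column {err.colno}")

def resolve_parameters(user_values):
    """Merge user values over defaults, rejecting unknown (likely mistyped) keys."""
    unknown = set(user_values) - set(DEFAULTS)
    if unknown:
        raise SystemExit(f"unknown input parameters: {sorted(unknown)}")
    return {**DEFAULTS, **user_values}

def write_ground_truth(params, path="input_used.json"):
    """Record every value the model actually used, defaults included."""
    with open(path, "w") as f:
        json.dump(params, f, indent=2, sort_keys=True)

if __name__ == "__main__":
    with open("input.json", "w") as f:
        json.dump({"dt_atmos": 900}, f)
    params = resolve_parameters(load_input("input.json"))
    write_ground_truth(params)
    print(params)
```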
Excessive input of nitrogen to coastal waters leads to eutrophication and hypoxia that reduce biodiversity and impair key ecosystem services provided by benthic communities; for example, fish and shellfish production, bioturbation, nutrient cycling, and water filtration. Hypoxia ...
Linking Deep-Water Prey Fields with Odontocete Population Structure and Behavior
2015-09-30
potentially mitigate beaked whale responses to disturbance, providing direct input data to PCOD models for beaked whales • Leverage previous...principles of cetacean foraging ecology and responses to disturbance • Identify key prey metrics for future analyses and incorporation into PCOD
Summary of the key features of seven biomathematical models of human fatigue and performance.
Mallis, Melissa M; Mejdal, Sig; Nguyen, Tammy T; Dinges, David F
2004-03-01
Biomathematical models that quantify the effects of circadian and sleep/wake processes on the regulation of alertness and performance have been developed in an effort to predict the magnitude and timing of fatigue-related responses in a variety of contexts (e.g., transmeridian travel, sustained operations, shift work). This paper summarizes key features of seven biomathematical models reviewed as part of the Fatigue and Performance Modeling Workshop held in Seattle, WA, on June 13-14, 2002. The Workshop was jointly sponsored by the National Aeronautics and Space Administration, U.S. Department of Defense, U.S. Army Medical Research and Materiel Command, Office of Naval Research, Air Force Office of Scientific Research, and U.S. Department of Transportation. An invitation was sent to developers of seven biomathematical models that were commonly cited in scientific literature and/or supported by government funding. On acceptance of the invitation to attend the Workshop, developers were asked to complete a survey of the goals, capabilities, inputs, and outputs of their biomathematical models of alertness and performance. Data from the completed surveys were summarized and juxtaposed to provide a framework for comparing features of the seven models. Survey responses revealed that models varied greatly relative to their reported goals and capabilities. While all modelers reported that circadian factors were key components of their capabilities, they differed markedly with regard to the roles of sleep and work times as input factors for prediction: four of the seven models had work time as their sole input variable(s), while the other three models relied on various aspects of sleep timing for model input. Models also differed relative to outputs: five sought to predict results from laboratory experiments, field, and operational data, while two models were developed without regard to predicting laboratory experimental results. All modelers provided published papers describing their models, with three of the models being proprietary. Although all models appear to have been fundamentally influenced by the two-process model of sleep regulation by Borbély, there is considerable diversity among them in the number and type of input and output variables, and their stated goals and capabilities.
Summary of the key features of seven biomathematical models of human fatigue and performance
NASA Technical Reports Server (NTRS)
Mallis, Melissa M.; Mejdal, Sig; Nguyen, Tammy T.; Dinges, David F.
2004-01-01
BACKGROUND: Biomathematical models that quantify the effects of circadian and sleep/wake processes on the regulation of alertness and performance have been developed in an effort to predict the magnitude and timing of fatigue-related responses in a variety of contexts (e.g., transmeridian travel, sustained operations, shift work). This paper summarizes key features of seven biomathematical models reviewed as part of the Fatigue and Performance Modeling Workshop held in Seattle, WA, on June 13-14, 2002. The Workshop was jointly sponsored by the National Aeronautics and Space Administration, U.S. Department of Defense, U.S. Army Medical Research and Materiel Command, Office of Naval Research, Air Force Office of Scientific Research, and U.S. Department of Transportation. METHODS: An invitation was sent to developers of seven biomathematical models that were commonly cited in scientific literature and/or supported by government funding. On acceptance of the invitation to attend the Workshop, developers were asked to complete a survey of the goals, capabilities, inputs, and outputs of their biomathematical models of alertness and performance. Data from the completed surveys were summarized and juxtaposed to provide a framework for comparing features of the seven models. RESULTS: Survey responses revealed that models varied greatly relative to their reported goals and capabilities. While all modelers reported that circadian factors were key components of their capabilities, they differed markedly with regard to the roles of sleep and work times as input factors for prediction: four of the seven models had work time as their sole input variable(s), while the other three models relied on various aspects of sleep timing for model input. Models also differed relative to outputs: five sought to predict results from laboratory experiments, field, and operational data, while two models were developed without regard to predicting laboratory experimental results. All modelers provided published papers describing their models, with three of the models being proprietary. CONCLUSIONS: Although all models appear to have been fundamentally influenced by the two-process model of sleep regulation by Borbely, there is considerable diversity among them in the number and type of input and output variables, and their stated goals and capabilities.
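Both records above trace the surveyed tools back to Borbély's two-process model. The sketch below shows only that common structure, a homeostatic process S and a circadian process C combined into an alertness estimate; the functional forms are the standard exponential/sinusoidal ones and the constants are illustrative assumptions, not any of the seven models' fitted parameters.

```python
# Structural sketch of the two-process model referenced above (standard forms;
# time constants and amplitudes are illustrative assumptions, not fitted values).
import math

def homeostatic_S(hours_awake, S0=0.2, upper=1.0, tau_rise=18.2):
    """Process S: sleep pressure rising exponentially toward an asymptote while awake."""
    return upper - (upper - S0) * math.exp(-hours_awake / tau_rise)

def circadian_C(clock_hour, amplitude=0.3, acrophase_hour=18.0):
    """Process C: a sinusoidal circadian drive peaking in the early evening."""
    return amplitude * math.cos(2 * math.pi * (clock_hour - acrophase_hour) / 24.0)

def predicted_alertness(clock_hour, hours_awake):
    """Higher values mean more alert: circadian drive minus accumulated sleep pressure."""
    return circadian_C(clock_hour) - homeostatic_S(hours_awake)

if __name__ == "__main__":
    for hour in (8, 14, 20, 2):                 # assuming wake-up at 07:00
        awake = (hour - 7) % 24
        print(f"{hour:02d}:00  alertness = {predicted_alertness(hour, awake):+.2f}")
```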
NASA Technical Reports Server (NTRS)
Meyn, Larry A.
2018-01-01
One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package, RotorCraft Optimization Tools (RCOTOOLS), is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multi-disciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files in order to identify text strings that are used to identify specific variables as optimization input and response variables. This paper provides an overview of RCOTOOLS and its use
Industrial wood productivity in the United States, 1900-1998
Peter J. Ince
2000-01-01
The productivity of U.S. wood and paper product output in terms of wood input is computed and displayed in graphs. Background tables provide supporting data. The productivity trend parallels trends in the recovered paper utilization rate. Recycling and wood residue use are key factors in productivity gains.
The HTM Spatial Pooler-A Neocortical Algorithm for Online Sparse Distributed Coding.
Cui, Yuwei; Ahmad, Subutai; Hawkins, Jeff
2017-01-01
Hierarchical temporal memory (HTM) provides a theoretical framework that models several key computational principles of the neocortex. In this paper, we analyze an important component of HTM, the HTM spatial pooler (SP). The SP models how neurons learn feedforward connections and form efficient representations of the input. It converts arbitrary binary input patterns into sparse distributed representations (SDRs) using a combination of competitive Hebbian learning rules and homeostatic excitability control. We describe a number of key properties of the SP, including fast adaptation to changing input statistics, improved noise robustness through learning, efficient use of cells, and robustness to cell death. In order to quantify these properties we develop a set of metrics that can be directly computed from the SP outputs. We show how the properties are met using these metrics and targeted artificial simulations. We then demonstrate the value of the SP in a complete end-to-end real-world HTM system. We discuss the relationship with neuroscience and previous studies of sparse coding. The HTM spatial pooler represents a neurally inspired algorithm for learning sparse representations from noisy data streams in an online fashion.
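A toy sketch of the spatial pooler step summarized above: compute the feedforward overlap between a binary input and each column's potential connections, pick the top-k columns as a sparse distributed representation, and strengthen the winners' connections to active input bits. The sizes, sparsity, and learning rule are simplified assumptions, not the reference HTM implementation.

```python
# Toy spatial-pooler step: overlap scoring, k-winners-take-all, crude Hebbian update
# (simplified assumptions; not the reference HTM implementation).
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_columns, active_columns = 64, 128, 5

# Each column starts with a random binary set of potential feedforward connections.
connections = (rng.random((n_columns, n_inputs)) < 0.3).astype(np.int8)

def spatial_pool(binary_input, learn=True):
    """Return the indices of winning columns; optionally reinforce their connections."""
    overlaps = connections @ binary_input              # feedforward overlap per column
    winners = np.argsort(overlaps)[-active_columns:]   # k-winners-take-all -> sparse code
    if learn:
        # Strengthen winners' connections to the currently active input bits.
        connections[np.ix_(winners, np.flatnonzero(binary_input))] = 1
    return winners

if __name__ == "__main__":
    pattern = (rng.random(n_inputs) < 0.2).astype(np.int8)
    print(sorted(spatial_pool(pattern)))
```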
NASA Astrophysics Data System (ADS)
Arsenault, Louis-François; Neuberg, Richard; Hannah, Lauren A.; Millis, Andrew J.
2017-11-01
We present a supervised machine learning approach to the inversion of Fredholm integrals of the first kind as they arise, for example, in the analytic continuation problem of quantum many-body physics. The approach provides a natural regularization for the ill-conditioned inverse of the Fredholm kernel, as well as an efficient and stable treatment of constraints. The key observation is that the stability of the forward problem permits the construction of a large database of outputs for physically meaningful inputs. Applying machine learning to this database generates a regression function of controlled complexity, which returns approximate solutions for previously unseen inputs; the approximate solutions are then projected onto the subspace of functions satisfying relevant constraints. Under standard error metrics the method performs as well or better than the Maximum Entropy method for low input noise and is substantially more robust to increased input noise. We suggest that the methodology will be similarly effective for other problems involving a formally ill-conditioned inversion of an integral operator, provided that the forward problem can be efficiently solved.
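The sketch below follows the workflow the abstract outlines, with ridge regression standing in for the authors' learned model: solve the stable forward problem many times to build a database of (input, output) pairs, fit a regression from noisy outputs back to inputs, then project the estimate onto a simple non-negativity constraint. The kernel, discretization, and regression choice are illustrative assumptions.

```python
# Database-plus-regression inversion of a discretized Fredholm integral (illustrative
# kernel, grid, and ridge regression; not the authors' method).
import numpy as np

rng = np.random.default_rng(1)
n_grid, n_data, n_samples = 50, 30, 2000
x = np.linspace(0.0, 1.0, n_grid)
t = np.linspace(0.0, 1.0, n_data)

# Discretized kernel for g(t) = integral of K(t, x) f(x) dx; the forward map is stable.
K = np.exp(-5.0 * (t[:, None] - x[None, :]) ** 2) * (x[1] - x[0])

# Database of physically plausible inputs (sums of positive Gaussians) and noisy outputs.
centers = rng.uniform(0.2, 0.8, (n_samples, 2))
widths = rng.uniform(0.03, 0.15, (n_samples, 2))
F = sum(np.exp(-((x - centers[:, [i]]) ** 2) / (2 * widths[:, [i]] ** 2)) for i in range(2))
G = F @ K.T + 0.01 * rng.standard_normal((n_samples, n_data))

# Ridge regression from outputs g back to inputs f regularizes the ill-posed inverse.
lam = 1e-3
W = np.linalg.solve(G.T @ G + lam * np.eye(n_data), G.T @ F)

# Previously unseen input: estimate it from its measurement and clip to enforce f >= 0.
f_true = np.exp(-((x - 0.5) ** 2) / (2 * 0.08 ** 2))
f_est = np.clip((K @ f_true) @ W, 0.0, None)
print("relative error:", np.linalg.norm(f_est - f_true) / np.linalg.norm(f_true))
```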
NGNP Infrastructure Readiness Assessment: Consolidation Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brian K Castle
2011-02-01
The Next Generation Nuclear Plant (NGNP) project supports the development, demonstration, and deployment of high temperature gas-cooled reactors (HTGRs). The NGNP project is being reviewed by the Nuclear Energy Advisory Council (NEAC) to provide input to the DOE, which will make a recommendation to the Secretary of Energy on whether or not to continue with Phase 2 of the NGNP project. The NEAC review will be based, in part, on the infrastructure readiness assessment, which is an assessment of industry's current ability to provide specified components for the FOAK NGNP, meet quality assurance requirements, transport components, have the necessary workforce in place, and have the necessary construction capabilities. AREVA and Westinghouse were contracted to perform independent assessments of industry's capabilities because of their experience with nuclear supply chains, which is a result of their experiences with the EPR and AP-1000 reactors. Both vendors produced infrastructure readiness assessment reports that identified key components and categorized these components into three groups based on their ability to be deployed in the FOAK plant. The NGNP project has several programs that are developing key components and capabilities. For these components, the NGNP project has provided input to properly assess infrastructure readiness.
Challenging Misconceptions about Student Ratings of Instruction. IDEA Paper #58
ERIC Educational Resources Information Center
Benton, Stephen L.; Ryalls, Kenneth R.
2016-01-01
Data from student ratings of instruction (SRI) are used ubiquitously as a key element in providing instructors with valuable feedback and evaluators with critical student input. Nonetheless, calls for the elimination of SRI continue to appear in academic journals and higher education periodicals. This paper brings to bear the huge body of research…
Coral Reef Remote Sensing Using Simulated VIIRS and LDCM Imagery
NASA Technical Reports Server (NTRS)
Estep, Leland; Spruce, Joseph P.; Blonski, Slawomir; Moore, Roxzana
2008-01-01
The Rapid Prototyping Capability (RPC) node at NASA Stennis Space Center, MS, was used to simulate NASA next-generation sensor imagery over well-known coral reef areas: Looe Key, FL, and Kaneohe Bay, HI. The objective was to assess the degree to which next-generation sensor systems, the Visible/Infrared Imager/Radiometer Suite (VIIRS) and the Landsat Data Continuity Mission (LDCM), might provide key input to the National Oceanographic and Atmospheric Administration (NOAA) Integrated Coral Observing Network (ICON)/Coral Reef Early Warning System (CREWS) Decision Support Tool (DST). The DST data layers produced from the simulated imagery concerned water quality and benthic classification map layers. The water optical parameters of interest were chlorophyll (Chl) and the absorption coefficient (a). The input imagery used by the RPC for simulation included spaceborne (Hyperion) and airborne (AVIRIS) hyperspectral data. Specific field data to complement and aid in validation of the overflight data was used when available. The results of the experiment show that the next-generation sensor systems are capable of providing valuable data layer resources to NOAA's ICON/CREWS DST.
Coral Reef Remote Sensing using Simulated VIIRS and LDCM Imagery
NASA Technical Reports Server (NTRS)
Estep, Leland; Spruce, Joseph P.
2007-01-01
The Rapid Prototyping Capability (RPC) node at NASA Stennis Space Center, MS, was used to simulate NASA next-generation sensor imagery over well-known coral reef areas: Looe Key, FL, and Kaneohe Bay, HI. The objective was to assess the degree to which next-generation sensor systems, the Visible/Infrared Imager/Radiometer Suite (VIIRS) and the Landsat Data Continuity Mission (LDCM), might provide key input to the National Oceanographic and Atmospheric Administration (NOAA) Integrated Coral Observing Network (ICON)/Coral Reef Early Warning System (CREWS) Decision Support Tool (DST). The DST data layers produced from the simulated imagery concerned water quality and benthic classification map layers. The water optical parameters of interest were chlorophyll (Chl) and the absorption coefficient (a). The input imagery used by the RPC for simulation included spaceborne (Hyperion) and airborne (AVIRIS) hyperspectral data. Specific field data to complement and aid in validation of the overflight data was used when available. The results of the experiment show that the next-generation sensor systems are capable of providing valuable data layer resources to NOAA's ICON/CREWS DST.
Food choice as a key management strategy for functional gastrointestinal symptoms.
Gibson, Peter R; Shepherd, Susan J
2012-05-01
Recognition of food components that induce functional gut symptoms in patients with functional bowel disorders (FBD) has been challenging. Food directly or indirectly provides considerable afferent input into the enteric nervous system. There is an altered relationship between the afferent input and perception/efferent response in FBD. Defining the nature of food-related stimuli may provide a means of minimizing such an input and gut symptoms. Using this premise, reducing the intake of FODMAPs (fermentable oligo-, di-, and mono-saccharides and polyols), poorly absorbed short-chain carbohydrates that, by virtue of their small molecular size and rapid fermentability, will distend the intestinal lumen with liquid and gas, improves symptoms in the majority of patients. Well-developed methodologies to deliver the diet via dietician-led education are available. Another abundant source of afferent input is natural and added food chemicals (such as salicylates, amines, and glutamates). Studies are needed to assess the efficacy of the low food chemical dietary approach. A recent placebo-controlled trial of FODMAP-poor gluten provided the first valid evidence that non-celiac gluten intolerance might actually exist, but its prevalence and underlying mechanisms require elucidation. Food choice via the low FODMAP and potentially other dietary strategies is now a realistic and efficacious therapeutic approach for functional gut symptoms.
2007-05-01
National Association of Clean Water Agencies Shelly Foston Meridian Institute Michael Gritzuk Pima County (AZ) Wastewater Management Department Genevieve...agencies to assist small and medium systems, and it has helped fund and develop a variety of Web casts and security trainings. Although drinking water...trainings, conference calls, Web casts, and other communications; (2) provide administrative support; (3) provide technical support; and (4
Regulation of spatial selectivity by crossover inhibition.
Cafaro, Jon; Rieke, Fred
2013-04-10
Signals throughout the nervous system diverge into parallel excitatory and inhibitory pathways that later converge on downstream neurons to control their spike output. Converging excitatory and inhibitory synaptic inputs can exhibit a variety of temporal relationships. A common motif is feedforward inhibition, in which an increase (decrease) in excitatory input precedes a corresponding increase (decrease) in inhibitory input. The delay of inhibitory input relative to excitatory input originates from an extra synapse in the circuit shaping inhibitory input. Another common motif is push-pull or "crossover" inhibition, in which increases (decreases) in excitatory input occur together with decreases (increases) in inhibitory input. Primate On midget ganglion cells receive primarily feedforward inhibition and On parasol cells receive primarily crossover inhibition; this difference provides an opportunity to study how each motif shapes the light responses of cell types that play a key role in visual perception. For full-field stimuli, feedforward inhibition abbreviated and attenuated responses of On midget cells, while crossover inhibition, though plentiful, had surprisingly little impact on the responses of On parasol cells. Spatially structured stimuli, however, could cause excitatory and inhibitory inputs to On parasol cells to increase together, adopting a temporal relation very much like that for feedforward inhibition. In this case, inhibitory inputs substantially abbreviated a cell's spike output. Thus inhibitory input shapes the temporal stimulus selectivity of both midget and parasol ganglion cells, but its impact on responses of parasol cells depends strongly on the spatial structure of the light inputs.
On a two-pass scheme without a faraday mirror for free-space relativistic quantum cryptography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kravtsov, K. S.; Radchenko, I. V.; Korol'kov, A. V.
2013-05-15
The stability of destructive interference independent of the input polarization and the state of a quantum communication channel in fiber optic systems used in quantum cryptography plays a principal role in providing the security of communicated keys. A novel optical scheme is proposed that can be used both in relativistic quantum cryptography for communicating keys in open space and for communicating them over fiber optic lines. The scheme ensures stability of destructive interference and admits simple automatic balancing of a fiber interferometer.
Oldenhuis, Hilbrand KE; de Groot, Martijn; Polstra, Louis; Velthuijsen, Hugo; van Gemert-Pijnen, Julia EWC
2017-01-01
Background The combination of self-tracking and persuasive eCoaching in automated interventions is a new and promising approach for healthy lifestyle management. Objective The aim of this study was to identify key components of self-tracking and persuasive eCoaching in automated healthy lifestyle interventions that contribute to their effectiveness on health outcomes, usability, and adherence. A secondary aim was to identify the way in which these key components should be designed to contribute to improved health outcomes, usability, and adherence. Methods The scoping review methodology proposed by Arskey and O’Malley was applied. Scopus, EMBASE, PsycINFO, and PubMed were searched for publications dated from January 1, 2013 to January 31, 2016 that included (1) self-tracking, (2) persuasive eCoaching, and (3) healthy lifestyle intervention. Results The search resulted in 32 publications, 17 of which provided results regarding the effect on health outcomes, 27 of which provided results regarding usability, and 13 of which provided results regarding adherence. Among the 32 publications, 27 described an intervention. The most commonly applied persuasive eCoaching components in the described interventions were personalization (n=24), suggestion (n=19), goal-setting (n=17), simulation (n=17), and reminders (n=15). As for self-tracking components, most interventions utilized an accelerometer to measure steps (n=11). Furthermore, the medium through which the user could access the intervention was usually a mobile phone (n=10). The following key components and their specific design seem to influence both health outcomes and usability in a positive way: reduction by setting short-term goals to eventually reach long-term goals, personalization of goals, praise messages, reminders to input self-tracking data into the technology, use of validity-tested devices, integration of self-tracking and persuasive eCoaching, and provision of face-to-face instructions during implementation. In addition, health outcomes or usability were not negatively affected when more effort was requested from participants to input data into the technology. The data extracted from the included publications provided limited ability to identify key components for adherence. However, one key component was identified for both usability and adherence, namely the provision of personalized content. Conclusions This scoping review provides a first overview of the key components in automated healthy lifestyle interventions combining self-tracking and persuasive eCoaching that can be utilized during the development of such interventions. Future studies should focus on the identification of key components for effects on adherence, as adherence is a prerequisite for an intervention to be effective. PMID:28765103
Predicting yields from Appalachian red oak logs and lumber
Daniel E. Dunmire
1971-01-01
One utilization problem is in pinpointing how to efficiently and effectively recover usable parts from logs, bolts, and lumber. Yields, which are output divided by input, provide a key to managers who make processing decisions. Research results are applied to indicate yields of graded lumber and dimension stock from graded Appalachian red oak (group) logs. How to...
A comprehensive evaluation of input data-induced uncertainty in nonpoint source pollution modeling
NASA Astrophysics Data System (ADS)
Chen, L.; Gong, Y.; Shen, Z.
2015-11-01
Watershed models have been used extensively for quantifying nonpoint source (NPS) pollution, but few studies have been conducted on the error-transitivity from different input data sets to NPS modeling. In this paper, the effects of four input data, including rainfall, digital elevation models (DEMs), land use maps, and the amount of fertilizer, on NPS simulation were quantified and compared. A systematic input-induced uncertainty was investigated using watershed model for phosphorus load prediction. Based on the results, the rain gauge density resulted in the largest model uncertainty, followed by DEMs, whereas land use and fertilizer amount exhibited limited impacts. The mean coefficient of variation for errors in single rain gauges-, multiple gauges-, ASTER GDEM-, NFGIS DEM-, land use-, and fertilizer amount information was 0.390, 0.274, 0.186, 0.073, 0.033 and 0.005, respectively. The use of specific input information, such as key gauges, is also highlighted to achieve the required model accuracy. In this sense, these results provide valuable information to other model-based studies for the control of prediction uncertainty.
Stereoscopic Feature Tracking System for Retrieving Velocity of Surface Waters
NASA Astrophysics Data System (ADS)
Zuniga Zamalloa, C. C.; Landry, B. J.
2017-12-01
The present work is concerned with the surface velocity retrieval of flows using a stereoscopic setup and finding the correspondence in the images via feature tracking (FT). The feature tracking provides a key benefit of substantially reducing the level of user input. In contrast to other commonly used methods (e.g., normalized cross-correlation), FT does not require the user to prescribe interrogation window sizes and removes the need for masking when specularities are present. The results of the current FT methodology are comparable to those obtained via Large Scale Particle Image Velocimetry while requiring little to no user input which allowed for rapid, automated processing of imagery.
Segmentation and learning in the quantitative analysis of microscopy images
NASA Astrophysics Data System (ADS)
Ruggiero, Christy; Ross, Amy; Porter, Reid
2015-02-01
In material science and bio-medical domains the quantity and quality of microscopy images is rapidly increasing and there is a great need to automatically detect, delineate and quantify particles, grains, cells, neurons and other functional "objects" within these images. These are challenging problems for image processing because of the variability in object appearance that inevitably arises in real world image acquisition and analysis. One of the most promising (and practical) ways to address these challenges is interactive image segmentation. These algorithms are designed to incorporate input from a human operator to tailor the segmentation method to the image at hand. Interactive image segmentation is now a key tool in a wide range of applications in microscopy and elsewhere. Historically, interactive image segmentation algorithms have tailored segmentation on an image-by-image basis, and information derived from operator input is not transferred between images. But recently there has been increasing interest to use machine learning in segmentation to provide interactive tools that accumulate and learn from the operator input over longer periods of time. These new learning algorithms reduce the need for operator input over time, and can potentially provide a more dynamic balance between customization and automation for different applications. This paper reviews the state of the art in this area, provides a unified view of these algorithms, and compares the segmentation performance of various design choices.
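The toy sketch below shows the basic interaction loop the review describes: an operator labels a few pixels ("scribbles"), a classifier is learned from those labels, and the rest of the image is segmented automatically. The synthetic image, the scribble locations, and the nearest-class-mean rule are illustrative assumptions standing in for the learned models surveyed in the paper.

```python
# Toy scribble-driven interactive segmentation: learn from operator-labeled pixels,
# then classify the rest of the image (synthetic data and a nearest-mean rule as
# stand-ins for the learned models surveyed above).
import numpy as np

rng = np.random.default_rng(5)
image = np.concatenate([rng.normal(0.3, 0.05, (64, 32)),    # dark "background" half
                        rng.normal(0.7, 0.05, (64, 32))], axis=1)

# Hypothetical operator scribbles: a few labeled pixels for each class.
object_samples = image[30:34, 40:44].ravel()      # scribble inside the bright object
background_samples = image[30:34, 4:8].ravel()    # scribble on the background

# Nearest-class-mean rule learned from the scribbles; a learning-based tool would
# accumulate such operator input across many images instead of starting fresh.
obj_mean, bg_mean = object_samples.mean(), background_samples.mean()
mask = np.abs(image - obj_mean) < np.abs(image - bg_mean)
print("object pixels found:", int(mask.sum()), "of", mask.size)
```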
ERIC Educational Resources Information Center
Csizér, Kata; Tankó, Gyula
2017-01-01
Apart from L2 motivation, self-regulation is also increasingly seen as a key variable in L2 learning in many foreign language learning contexts because classroom-centered instructive language teaching might not be able to provide sufficient input for students. Therefore, taking responsibility and regulating the learning processes and positive…
David M. Bell; Eric J. Ward; A. Christopher Oishi; Ram Oren; Paul G. Flikkema; James S. Clark; David Whitehead
2015-01-01
Uncertainties in ecophysiological responses to environment, such as the impact of atmospheric and soil moisture conditions on plant water regulation, limit our ability to estimate key inputs for ecosystem models. Advanced statistical frameworks provide coherent methodologies for relating observed data, such as stem sap flux density, to unobserved processes, such as...
Development of a GIS interface for WEPP Model application to Great Lakes forested watersheds
J. R. Frankenberger; S. Dun; D. C. Flanagan; J. Q. Wu; W. J. Elliot
2011-01-01
This presentation will highlight efforts on development of a new online WEPP GIS interface, targeted toward application in forested regions bordering the Great Lakes. The key components and algorithms of the online GIS system will be outlined. The general procedures used to provide input to the WEPP model and to display model output will be demonstrated.
Integrated Technology Assessment Center (ITAC) Update
NASA Technical Reports Server (NTRS)
Taylor, J. L.; Neely, M. A.; Curran, F. M.; Christensen, E. R.; Escher, D.; Lovell, N.; Morris, Charles (Technical Monitor)
2002-01-01
The Integrated Technology Assessment Center (ITAC) has developed a flexible systems analysis framework to identify long-term technology needs, quantify payoffs for technology investments, and assess the progress of ASTP-sponsored technology programs in the hypersonics area. For this, ITAC has assembled an experienced team representing a broad sector of the aerospace community and developed a systematic assessment process complete with supporting tools. Concepts for transportation systems are selected based on relevance to the ASTP and integrated concept models (ICM) of these concepts are developed. Key technologies of interest are identified and projections are made of their characteristics with respect to their impacts on key aspects of the specific concepts of interest. Both the models and technology projections are then fed into the ITAC's probabilistic systems analysis framework in ModelCenter. This framework permits rapid sensitivity analysis, single point design assessment, and a full probabilistic assessment of each concept with respect to both embedded and enhancing technologies. Probabilistic outputs are weighed against metrics of interest to ASTP using a multivariate decision making process to provide inputs for technology prioritization within the ASTP. ITAC program is currently finishing the assessment of a two-stage-to-orbit (TSTO), rocket-based combined cycle (RBCC) concept and a TSTO turbine-based combined cycle (TBCC) concept developed by the team with inputs from NASA. A baseline all rocket TSTO concept is also being developed for comparison. Boeing has recently submitted a performance model for their Flexible Aerospace System Solution for Tomorrow (FASST) concept and the ISAT program will provide inputs for a single-stage-to-orbit (SSTO) TBCC based concept in the near-term. Both of these latter concepts will be analyzed within the ITAC framework over the summer. This paper provides a status update of the ITAC program.
Multilevel Modeling in Psychosomatic Medicine Research
Myers, Nicholas D.; Brincks, Ahnalee M.; Ames, Allison J.; Prado, Guillermo J.; Penedo, Frank J.; Benedict, Catherine
2012-01-01
The primary purpose of this manuscript is to provide an overview of multilevel modeling for Psychosomatic Medicine readers and contributors. The manuscript begins with a general introduction to multilevel modeling. Multilevel regression modeling at two-levels is emphasized because of its prevalence in psychosomatic medicine research. Simulated datasets based on some core ideas from the Familias Unidas effectiveness study are used to illustrate key concepts including: communication of model specification, parameter interpretation, sample size and power, and missing data. Input and key output files from Mplus and SAS are provided. A cluster randomized trial with repeated measures (i.e., three-level regression model) is then briefly presented with simulated data based on some core ideas from a cognitive behavioral stress management intervention in prostate cancer. PMID:23107843
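The sketch below fits a two-level random-intercept regression to simulated clustered data, mirroring the kind of simulated example the overview describes; it uses statsmodels rather than the Mplus and SAS input files the paper provides, and all variable names are hypothetical.

```python
# Two-level random-intercept model on simulated clustered data (hypothetical variables;
# the paper itself supplies Mplus and SAS input/output files).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n_groups, n_per_group = 30, 20
group = np.repeat(np.arange(n_groups), n_per_group)
u = rng.normal(0, 0.5, n_groups)                    # level-2 random intercepts
x = rng.normal(0, 1, n_groups * n_per_group)        # level-1 predictor
y = 2.0 + 0.8 * x + u[group] + rng.normal(0, 1, n_groups * n_per_group)

data = pd.DataFrame({"outcome": y, "predictor": x, "cluster": group})
fit = smf.mixedlm("outcome ~ predictor", data, groups=data["cluster"]).fit()
print(fit.summary())   # fixed effects plus the between-cluster variance component
```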
Round-Robin approach to data flow optimization
NASA Technical Reports Server (NTRS)
Witt, J.
1978-01-01
A large data base, circular in structure, was required (for the Voyager Mission to Jupiter/Saturn) with the capability to completely update the data every four hours during high activity periods. The data is stored in key ordered format for retrieval but is not input in key order. Existing access methods for large data bases with rapid data replacement by keys become inefficient as the volume of data being replaced grows. The Round-Robin method was developed to alleviate this problem. The Round-Robin access method allows rapid updating of the data with continuous self cleaning where the oldest data (by key) is deleted and the newest data (by key) is kept regardless of the order of input.
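A simplified sketch of the access pattern the abstract describes: a fixed-capacity store kept in key order for retrieval, where inserting new (higher-key) data automatically evicts the oldest (lowest-key) entries regardless of arrival order. This is an illustration of the idea only, not the Voyager-era implementation.

```python
# Simplified self-cleaning, key-ordered circular store (an illustration of the idea,
# not the Voyager-era access method).
import bisect

class RoundRobinStore:
    def __init__(self, capacity):
        self.capacity = capacity
        self.keys, self.values = [], []          # kept in key order for retrieval

    def insert(self, key, value):
        i = bisect.bisect_left(self.keys, key)   # data need not arrive in key order
        self.keys.insert(i, key)
        self.values.insert(i, value)
        if len(self.keys) > self.capacity:       # evict the oldest data (lowest key)
            self.keys.pop(0)
            self.values.pop(0)

    def lookup(self, key):
        i = bisect.bisect_left(self.keys, key)
        if i < len(self.keys) and self.keys[i] == key:
            return self.values[i]
        return None

if __name__ == "__main__":
    store = RoundRobinStore(capacity=3)
    for k in (40, 10, 30, 20, 50):
        store.insert(k, f"record-{k}")
    print(store.keys)                            # only the newest keys remain: [30, 40, 50]
```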
Fourier-Mellin moment-based intertwining map for image encryption
NASA Astrophysics Data System (ADS)
Kaur, Manjit; Kumar, Vijay
2018-03-01
In this paper, a robust image encryption technique that utilizes Fourier-Mellin moments and an intertwining logistic map is proposed. The Fourier-Mellin moment-based intertwining logistic map has been designed to overcome the issue of low sensitivity to the input image. A Multi-objective Non-Dominated Sorting Genetic Algorithm (NSGA-II) based on Reinforcement Learning (MNSGA-RL) has been used to optimize the required parameters of the intertwining logistic map. Fourier-Mellin moments are used to make the secret keys more secure. Thereafter, permutation and diffusion operations are carried out on the input image using the secret keys. The performance of the proposed image encryption technique has been evaluated on five well-known benchmark images and compared with seven well-known existing encryption techniques. The experimental results reveal that the proposed technique outperforms the others in terms of entropy, correlation analysis, unified average changing intensity (UACI), and number of pixel change rate (NPCR). The simulation results reveal that the proposed technique provides a high level of security and robustness against various types of attacks.
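For orientation, the sketch below shows a generic chaos-based permutation/diffusion step driven by a plain logistic map keystream. It is not the paper's intertwining logistic map, its Fourier-Mellin key construction, or its NSGA-II parameter optimization; it only illustrates how a chaotic orbit can drive pixel permutation and XOR diffusion.

```python
# Generic chaos-based permutation/diffusion step using a plain logistic map
# (illustrative only; not the paper's intertwining map or key construction).
import numpy as np

def logistic_keystream(length, x0=0.3141592, r=3.99):
    """Iterate the logistic map and quantize its orbit into a byte keystream."""
    x, out = x0, np.empty(length, dtype=np.uint8)
    for i in range(length):
        x = r * x * (1.0 - x)
        out[i] = int(x * 256) % 256
    return out

def encrypt(image_bytes, x0):
    flat = image_bytes.ravel()
    perm = np.argsort(logistic_keystream(flat.size, x0 + 1e-6))   # permutation stage
    ks = logistic_keystream(flat.size, x0)
    return flat[perm] ^ ks                                        # diffusion stage

if __name__ == "__main__":
    img = np.arange(16, dtype=np.uint8)
    print(encrypt(img, x0=0.25))
```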
Ranson, Kent; Law, Tyler J; Bennett, Sara
2010-06-01
Donor funding for health systems financing (HSF) research is inadequate and often poorly aligned with national priorities. This study aimed to generate consensus about a core set of research issues that urgently require attention in order to facilitate policy development. There were three key inputs into the priority setting process: key-informant interviews with health policy makers, researchers, community and civil society representatives across twenty-four low- and middle-income countries in four regions; an overview of relevant reviews to identify research completed to date; and inputs from 12 key informants (largely researchers) at a consultative workshop. Nineteen priority research questions emerged from key-informant interviews. The overview of reviews was instructive in showing which health financing topics have had comparatively little written about them, despite being identified as important by key informants. The questions ranked as most important at the consultative workshop were: It is hoped that this work on HSF research priorities will complement calls for increased health systems research and evaluation by providing specific suggestions as to where new and existing research resources can best be invested. The list of high priority HSF research questions is being communicated to research funders and researchers in order to seek to influence global patterns of HSF research funding and activity. A "bottom up" approach to setting global research priorities such as that employed here should ensure that priorities are more sensitive to user needs. Copyright 2010 Elsevier Ltd. All rights reserved.
Lentferink, Aniek J; Oldenhuis, Hilbrand Ke; de Groot, Martijn; Polstra, Louis; Velthuijsen, Hugo; van Gemert-Pijnen, Julia Ewc
2017-08-01
The combination of self-tracking and persuasive eCoaching in automated interventions is a new and promising approach for healthy lifestyle management. The aim of this study was to identify key components of self-tracking and persuasive eCoaching in automated healthy lifestyle interventions that contribute to their effectiveness on health outcomes, usability, and adherence. A secondary aim was to identify the way in which these key components should be designed to contribute to improved health outcomes, usability, and adherence. The scoping review methodology proposed by Arskey and O'Malley was applied. Scopus, EMBASE, PsycINFO, and PubMed were searched for publications dated from January 1, 2013 to January 31, 2016 that included (1) self-tracking, (2) persuasive eCoaching, and (3) healthy lifestyle intervention. The search resulted in 32 publications, 17 of which provided results regarding the effect on health outcomes, 27 of which provided results regarding usability, and 13 of which provided results regarding adherence. Among the 32 publications, 27 described an intervention. The most commonly applied persuasive eCoaching components in the described interventions were personalization (n=24), suggestion (n=19), goal-setting (n=17), simulation (n=17), and reminders (n=15). As for self-tracking components, most interventions utilized an accelerometer to measure steps (n=11). Furthermore, the medium through which the user could access the intervention was usually a mobile phone (n=10). The following key components and their specific design seem to influence both health outcomes and usability in a positive way: reduction by setting short-term goals to eventually reach long-term goals, personalization of goals, praise messages, reminders to input self-tracking data into the technology, use of validity-tested devices, integration of self-tracking and persuasive eCoaching, and provision of face-to-face instructions during implementation. In addition, health outcomes or usability were not negatively affected when more effort was requested from participants to input data into the technology. The data extracted from the included publications provided limited ability to identify key components for adherence. However, one key component was identified for both usability and adherence, namely the provision of personalized content. This scoping review provides a first overview of the key components in automated healthy lifestyle interventions combining self-tracking and persuasive eCoaching that can be utilized during the development of such interventions. Future studies should focus on the identification of key components for effects on adherence, as adherence is a prerequisite for an intervention to be effective. ©Aniek J Lentferink, Hilbrand KE Oldenhuis, Martijn de Groot, Louis Polstra, Hugo Velthuijsen, Julia EWC van Gemert-Pijnen. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 01.08.2017.
Space station data system analysis/architecture study. Task 3: Trade studies, DR-5, volume 1
NASA Technical Reports Server (NTRS)
1985-01-01
The primary objective of Task 3 is to provide additional analysis and insight necessary to support key design/programmatic decision for options quantification and selection for system definition. This includes: (1) the identification of key trade study topics; (2) the definition of a trade study procedure for each topic (issues to be resolved, key inputs, criteria/weighting, methodology); (3) conduct tradeoff and sensitivity analysis; and (4) the review/verification of results within the context of evolving system design and definition. The trade study topics addressed in this volume include space autonomy and function automation, software transportability, system network topology, communications standardization, onboard local area networking, distributed operating system, software configuration management, and the software development environment facility.
NASA Astrophysics Data System (ADS)
Li, Xianye; Meng, Xiangfeng; Wang, Yurong; Yang, Xiulun; Yin, Yongkai; Peng, Xiang; He, Wenqi; Dong, Guoyan; Chen, Hongyi
2017-09-01
A multiple-image encryption method is proposed that is based on row scanning compressive ghost imaging, (t, n) threshold secret sharing, and phase retrieval in the Fresnel domain. In the encryption process, after wavelet transform and Arnold transform of the target image, the ciphertext matrix can be first detected using a bucket detector. Based on a (t, n) threshold secret sharing algorithm, the measurement key used in the row scanning compressive ghost imaging can be decomposed and shared into two pairs of sub-keys, which are then reconstructed using two phase-only mask (POM) keys with fixed pixel values, placed in the input plane and transform plane 2 of the phase retrieval scheme, respectively; and the other POM key in the transform plane 1 can be generated and updated by the iterative encoding of each plaintext image. In each iteration, the target image acts as the input amplitude constraint in the input plane. During decryption, each plaintext image possessing all the correct keys can be successfully decrypted by measurement key regeneration, compression algorithm reconstruction, inverse wavelet transformation, and Fresnel transformation. Theoretical analysis and numerical simulations both verify the feasibility of the proposed method.
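The (t, n) threshold secret sharing used above is typically realized with Shamir's polynomial scheme; a minimal sketch over a prime field follows. It illustrates only the share/reconstruct step, not the authors' ghost-imaging measurement-key decomposition or the phase-retrieval stages.

```python
# Minimal Shamir (t, n) threshold secret sharing over a prime field (illustrates the
# sharing primitive only; the paper's ghost-imaging pipeline is not reproduced).
import random

PRIME = 2**127 - 1   # a Mersenne prime large enough for small integer secrets

def make_shares(secret, t, n):
    """Split `secret` into n shares; any t of them suffice to reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, k, PRIME) for k, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from any t shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

if __name__ == "__main__":
    shares = make_shares(secret=123456789, t=3, n=5)
    print(reconstruct(random.sample(shares, 3)))   # any 3 of the 5 shares recover it
```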
Organization of ESOMM-2014 Conference
2015-09-30
protected) LONG-TERM GOALS Connect key players of the marine mammal science community with associated regulators and other stakeholders...implementation of noise as an indicator in the European Union’s Marine Strategy Framework Directive. The ESOMM special issue of the Aquatic Mammals Journal...provided input from many relevant projects on marine mammal research. Some examples are: SOCAL-BRS, 3S-BRS, AUTEC-BRS, MOCHA, PCAD, IOGP-JIP, CET-Map, CET
Attributing Crop Production in the United States Using Artificial Neural Network
NASA Astrophysics Data System (ADS)
Ma, Y.; Zhang, Z.; Pan, B.
2017-12-01
Crop production plays a key role in supporting life and the economy and in shaping the environment. It is influenced on one hand by natural factors, including precipitation, temperature, and energy, and on the other hand shaped by the investment of fertilizers, pesticides, and human labor. Successfully attributing crop production to different factors can help optimize resources and improve productivity. Based on meteorological records from the National Centers for Environmental Prediction and state-wise crop production related data provided by the United States Department of Agriculture Economic Research Service, an artificial neural network was constructed to connect crop production with precipitation and temperature anomalies, capital input, labor input, energy input, pesticide consumption, and fertilizer consumption. Sensitivity analyses were carried out to attribute their specific influence on crop production for each grid. Results confirmed that the listed factors can generally determine crop production. Different states respond differently to perturbation of the inputs. Their spatial distribution is visualized and discussed.
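A compact sketch of the approach the abstract outlines: fit a small feedforward network relating input factors to a production-like response, then perturb each input in turn to estimate its sensitivity. The synthetic data, network size, and one-at-a-time perturbation scheme are assumptions; the USDA and NCEP datasets are not reproduced here.

```python
# Small neural network plus one-at-a-time sensitivity analysis on synthetic data
# (assumed data and network size; no USDA/NCEP data reproduced).
import numpy as np

rng = np.random.default_rng(42)
factors = ["precip", "temp", "capital", "labor", "energy", "pesticide", "fertilizer"]
X = rng.normal(0, 1, (500, len(factors)))
true_w = np.array([0.8, -0.5, 0.6, 0.3, 0.2, 0.1, 0.7])
y = np.tanh(X @ true_w) + 0.05 * rng.normal(size=500)      # synthetic "production"

# One hidden layer trained by plain gradient descent on mean squared error.
W1 = rng.normal(0, 0.5, (len(factors), 8))
b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, 8)
b2 = 0.0
lr = 0.05
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)
    err = h @ W2 + b2 - y
    gW2, gb2 = h.T @ err / len(y), err.mean()
    gh = np.outer(err, W2) * (1 - h ** 2)
    gW1, gb1 = X.T @ gh / len(y), gh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

def predict(Xin):
    return np.tanh(Xin @ W1 + b1) @ W2 + b2

# Perturb each factor by +1 standard deviation and record the mean response change.
base = predict(X).mean()
for i, name in enumerate(factors):
    Xp = X.copy()
    Xp[:, i] += 1.0
    print(f"{name:10s} sensitivity = {predict(Xp).mean() - base:+.3f}")
```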
Consumer involvement in systematic reviews of comparative effectiveness research.
Kreis, Julia; Puhan, Milo A; Schünemann, Holger J; Dickersin, Kay
2013-12-01
The Institute of Medicine recently recommended that comparative effectiveness research (CER) should involve input from consumers. While systematic reviews are a major component of CER, little is known about consumer involvement. To explore current approaches to involving consumers in US-based and key international organizations and groups conducting or commissioning systematic reviews ('organizations'). In-depth, semi-structured interviews with key informants and review of organizations' websites. Seventeen highly regarded US-based and international (Cochrane Collaboration, Campbell Collaboration) organizations. Organizations that usually involve consumers (seven of 17 in our sample) involve them at a programmatic level in the organization or in individual reviews through one-time consultation or on-going collaboration. For example, consumers may suggest topics, provide input on the key questions of the review, provide comments on draft protocols and reports, serve as co-authors or on an advisory group. Organizations involve different types of consumers (individual patients, consumer advocates, families and caregivers), recruiting them mainly through patient organizations and consumer networks. Some offer training in research methods, and one developed training for researchers on how to involve consumers. Little formal evaluation of the effects of consumer involvement is being carried out. Consumers are currently involved in systematic reviews in a variety of ways and for various reasons. Assessing which approaches are most effective in achieving different aims of consumer involvement is now required to inform future recommendations on consumer involvement in CER. © 2012 John Wiley & Sons Ltd.
ERIC Educational Resources Information Center
Neuman, Susan B.; Koskinen, Patricia
1992-01-01
Analyzes whether comprehensible input via captioned television influences acquisition of science vocabulary and concepts using 129 bilingual seventh and eighth graders. Finds that comprehensible input is a key ingredient in language acquisition and reading development. (MG)
A low power low noise analog front end for portable healthcare system
NASA Astrophysics Data System (ADS)
Yanchao, Wang; Keren, Ke; Wenhui, Qin; Yajie, Qin; Ting, Yi; Zhiliang, Hong
2015-10-01
The presented analog front end (AFE) used to process human bio-signals consists of a chopping instrumentation amplifier (IA), a chopping-spike filter and a programmable gain and bandwidth amplifier. The capacitor-coupled input of the AFE can reject the DC electrode offset. The power consumption of the current-feedback-based IA is reduced by adopting a capacitor divider in the input and feedback network. Besides, the IA's input thermal noise is decreased by utilizing complementary CMOS input pairs, which offer higher transconductance. Fabricated in Global Foundry 0.35 μm CMOS technology, the chip consumes 3.96 μA from a 3.3 V supply. The measured input noise is 0.85 μVrms (0.5-100 Hz) and the achieved noise efficiency factor is 6.48. Project supported by the Science and Technology Commission of Shanghai Municipality (No. 13511501100), the State Key Laboratory Project of China (No. 11MS002), and the State Key Laboratory of ASIC & System, Fudan University.
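As a rough plausibility check on the reported figure of merit, the sketch below applies the commonly used noise efficiency factor definition, NEF = Vrms,in · sqrt(2·Itot / (π·UT·4kT·BW)), to the numbers quoted in the abstract; the temperature and bandwidth conventions are assumptions, so the result only approximately reproduces the reported 6.48.

```python
# Back-of-the-envelope check of the reported noise efficiency factor (NEF).
# Values are from the abstract; the bandwidth convention is an assumption.
import math

k = 1.38e-23          # Boltzmann constant, J/K
T = 300.0             # assumed temperature, K
U_T = 0.0259          # thermal voltage at ~300 K, V
V_noise = 0.85e-6     # integrated input noise, Vrms (0.5-100 Hz)
I_total = 3.96e-6     # total supply current, A
BW = 100.0 - 0.5      # noise bandwidth, Hz (assumed equal to the signal band)

nef = V_noise * math.sqrt(2 * I_total / (math.pi * U_T * 4 * k * T * BW))
print(f"NEF ~ {nef:.2f}")   # ~6.5, close to the reported 6.48
```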
NASA Astrophysics Data System (ADS)
Bondareva, A. P.; Cheremkhin, P. A.; Evtikhiev, N. N.; Krasnov, V. V.; Starikov, S. N.
Scheme of optical image encryption with digital information input and dynamic encryption key based on two liquid crystal spatial light modulators and operating with spatially-incoherent monochromatic illumination is experimentally implemented. Results of experiments on images optical encryption and numerical decryption are presented. Satisfactory decryption error of 0.20÷0.27 is achieved.
Engbers, Jordan D T; Anderson, Dustin; Asmara, Hadhimulya; Rehak, Renata; Mehaffey, W Hamish; Hameed, Shahid; McKay, Bruce E; Kruskic, Mirna; Zamponi, Gerald W; Turner, Ray W
2012-02-14
Encoding sensory input requires the expression of postsynaptic ion channels to transform key features of afferent input to an appropriate pattern of spike output. Although Ca(2+)-activated K(+) channels are known to control spike frequency in central neurons, Ca(2+)-activated K(+) channels of intermediate conductance (KCa3.1) are believed to be restricted to peripheral neurons. We now report that cerebellar Purkinje cells express KCa3.1 channels, as evidenced through single-cell RT-PCR, immunocytochemistry, pharmacology, and single-channel recordings. Furthermore, KCa3.1 channels coimmunoprecipitate and interact with low voltage-activated Cav3.2 Ca(2+) channels at the nanodomain level to support a previously undescribed transient voltage- and Ca(2+)-dependent current. As a result, subthreshold parallel fiber excitatory postsynaptic potentials (EPSPs) activate Cav3 Ca(2+) influx to trigger a KCa3.1-mediated regulation of the EPSP and subsequent after-hyperpolarization. The Cav3-KCa3.1 complex provides powerful control over temporal summation of EPSPs, effectively suppressing low frequencies of parallel fiber input. KCa3.1 channels thus contribute to a high-pass filter that allows Purkinje cells to respond preferentially to high-frequency parallel fiber bursts characteristic of sensory input.
NASA Astrophysics Data System (ADS)
Pan, Minqiang; Zhong, Yujian
2018-01-01
Porous structures can effectively enhance heat transfer efficiency. A micro vaporizer using oriented linear cutting copper fiber sintered felt is proposed in this work. Multiple long cutting copper fibers are first fabricated with a multi-tooth tool and then sintered together in parallel to form uniform-thickness metal fiber sintered felts with oriented microchannels. The temperature rise response and thermal conversion efficiency are experimentally investigated to evaluate the influences of porosity, surface structure, feed flow rate and input power on the evaporation characteristics. It is indicated that the temperature rise response of water is mainly affected by input power and feed flow rate. High input power and low feed flow rate give a better temperature rise response of water. Porosity, rather than surface structure, plays an important role in the temperature rise response of water at a relatively high input power. The thermal conversion efficiency is dominated by the input power and surface structure. The oriented linear cutting copper fiber sintered felts of three different porosities show better thermal conversion efficiency than the oriented linear copper wire sintered felt when the input power is less than 115 W. All the sintered felts have almost the same thermal conversion performance at a high input power.
NASA Astrophysics Data System (ADS)
Lumb, D.
2016-07-01
Athena has been selected by ESA for its second large mission opportunity of the Cosmic Visions programme, to address the theme of the Hot and Energetic Universe. Following the submission of a proposal from the community, the technical and programmatic aspects of the mission design were reviewed in ESA's Concurrent Design Facility. The proposed concept was deemed to be technically feasible, but with potential constraints from cost and schedule. Two parallel industry study contracts have been conducted to explore these conclusions more thoroughly, with the key aim of providing consolidated inputs to a Mission Consolidation Review that was conducted in April-May 2016. This MCR has recommended a baseline design, which allows the agency to solicit proposals for a community-provided payload. Key design aspects arising from the studies are described, and the new reference design is summarised.
The reform of home care services in Ontario: opportunity lost or lesson learned?
Randall, Glen
2007-06-01
With the release of the Romanow Commission report, Canadian governments are poised to consider the creation of a national home care program. If occupational and physical therapists are to have input in shaping such a program, they will need to learn from lost opportunities of the past. This paper provides an overview of recent reforms to home care in Ontario with an emphasis on rehabilitation services. Data were collected from documents and 28 key informant interviews with rehabilitation professionals. Home care in Ontario has evolved in a piecemeal manner without rehabilitation professionals playing a prominent role in program design. Rehabilitation services play a critical role in facilitating hospital discharges, minimizing readmissions, and improving the quality of peoples' lives. Canadians will benefit if occupational and physical therapists seize the unique opportunity before them to provide meaningful input into creating a national home care program.
NASA Technical Reports Server (NTRS)
Briggs, Maxwell; Schifer, Nicholas
2011-01-01
Test hardware used to validate net heat prediction models. Problem: Net Heat Input cannot be measured directly during operation. Net heat input is a key parameter needed in prediction of efficiency for convertor performance. Efficiency = Electrical Power Output (Measured) divided by Net Heat Input (Calculated). Efficiency is used to compare convertor designs and trade technology advantages for mission planning.
Chief of Naval Air Training Automated Management Information System (CAMIS). User’s Guide.
1982-04-01
display. This display allows the user to insert, update, delete, or analyze various data elements, or generate reports. The Flight Schedule Input...instructor, student, and aircraft utilization. Additionally, it provides a means to delete any erroneous sortie information previously entered. 5...appear: 10 Technical Report 121 VT-## FLIGHT SCHEDULE PROGRAM 1. ADD NEW FLIGHT SCHEDULE DATA 2. DELETE ERRONEOUS SORTIES PREVIOUSLY ENTERED KEY IN
XML-Based Generator of C++ Code for Integration With GUIs
NASA Technical Reports Server (NTRS)
Hua, Hook; Oyafuso, Fabiano; Klimeck, Gerhard
2003-01-01
An open source computer program has been developed to satisfy a need for simplified organization of structured input data for scientific simulation programs. Typically, such input data are parsed in from a flat American Standard Code for Information Interchange (ASCII) text file into computational data structures. Also typically, when a graphical user interface (GUI) is used, there is a need to completely duplicate the input information while providing it to a user in a more structured form. Heretofore, the duplication of the input information has entailed duplication of software efforts and increases in susceptibility to software errors because of the concomitant need to maintain two independent input-handling mechanisms. The present program implements a method in which the input data for a simulation program are completely specified in an Extensible Markup Language (XML)-based text file. The key benefit of XML is that it stores input data in a structured manner. More importantly, XML allows not just storing of data but also describing what each of the data items is. The XML file contains information useful for rendering the data by other applications. The program then generates data structures in the C++ language that are to be used in the simulation program. In this method, all input data are specified in one place only, and it is easy to integrate the data structures into both the simulation program and the GUI. XML-to-C is useful in two ways: (1) as an executable, it generates the corresponding C++ classes, and (2) as a library, it automatically fills the objects with the input data values.
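As an illustration of the general idea (this is not the actual XML-to-C tool), the sketch below parses a small XML specification of input parameters and emits a matching C++ struct; the tag names and the type mapping are assumptions made for the example.

```python
# Illustrative sketch of XML-driven code generation: parse a small XML spec of
# input parameters and emit a corresponding C++ struct. Tag names and the type
# mapping are assumptions for this example, not the XML-to-C tool's schema.
import xml.etree.ElementTree as ET

spec = """
<inputs name="SimulationInput">
  <param name="num_steps"   type="int"    default="1000"/>
  <param name="time_step"   type="double" default="0.01"/>
  <param name="output_file" type="string" default="out.dat"/>
</inputs>
"""

cpp_types = {"int": "int", "double": "double", "string": "std::string"}

root = ET.fromstring(spec)
lines = ["#include <string>", "", f"struct {root.get('name')} {{"]
for p in root.findall("param"):
    ctype = cpp_types[p.get("type")]
    default = p.get("default")
    if p.get("type") == "string":
        default = f'"{default}"'
    lines.append(f"    {ctype} {p.get('name')} = {default};")
lines.append("};")
print("\n".join(lines))
```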
Nutrients in estuaries--an overview and the potential impacts of climate change.
Statham, Peter J
2012-09-15
The fate and cycling of macronutrients introduced into estuaries depend upon a range of interlinked processes. Hydrodynamics and morphology in combination with freshwater inflow control the freshwater flushing time, and the timescale for biogeochemical processes to operate that include microbial activity, particle-dissolved phase interactions, and benthic exchanges. In some systems atmospheric inputs and exchanges with coastal waters can also be important. Climate change will affect nutrient inputs and behaviour through modifications to temperature, wind patterns, the hydrological cycle, and sea level rise. Resulting impacts include: (1) inundation of freshwater systems; (2) changes in stratification, flushing times and phytoplankton productivity; (3) increased coastal storm activity; and (4) changes in species and ecosystem function. A combination of continuing high inputs of nutrients through human activity and climate change is anticipated to lead to enhanced eutrophication in the future. The most obvious impacts of increasing global temperature will be in sub-arctic systems where permafrost zones will be reduced in combination with enhanced inputs from glacial systems. Improved process understanding in several key areas including cycling of organic N and P, benthic exchanges, resuspension, impact of bio-irrigation, particle interactions, submarine groundwater discharges, and rates and magnitude of bacterially-driven recycling processes, is needed. Development of high frequency in situ nutrient analysis systems will provide data to improve predictive models that need to incorporate a wider variety of key factors, although the complexity of estuarine systems makes such modelling a challenge. However, overall a more holistic approach is needed to effectively understand, predict and manage the impact of macronutrients on estuaries. Copyright © 2011 Elsevier B.V. All rights reserved.
Plowright, Alleyn T; Johnstone, Craig; Kihlberg, Jan; Pettersson, Jonas; Robb, Graeme; Thompson, Richard A
2012-01-01
In drug discovery, the central process of constructing and testing hypotheses, carefully conducting experiments and analysing the associated data for new findings and information is known as the design-make-test-analyse cycle. Each step relies heavily on the inputs and outputs of the other three components. In this article we report our efforts to improve and integrate all parts to enable smooth and rapid flow of high quality ideas. Key improvements include enhancing multi-disciplinary input into 'Design', increasing the use of knowledge and reducing cycle times in 'Make', providing parallel sets of relevant data within ten working days in 'Test' and maximising the learning in 'Analyse'. Copyright © 2011 Elsevier Ltd. All rights reserved.
Fast implementation of length-adaptive privacy amplification in quantum key distribution
NASA Astrophysics Data System (ADS)
Zhang, Chun-Mei; Li, Mo; Huang, Jing-Zheng; Patcharapong, Treeviriyanupab; Li, Hong-Wei; Li, Fang-Yi; Wang, Chuan; Yin, Zhen-Qiang; Chen, Wei; Keattisak, Sripimanwat; Han, Zhen-Fu
2014-09-01
Post-processing is indispensable in quantum key distribution (QKD), which is aimed at sharing secret keys between two distant parties. It mainly consists of key reconciliation and privacy amplification, which are used for sharing identical keys and for distilling unconditionally secure secret keys, respectively. In this paper, we focus on speeding up the privacy amplification process by choosing a simple multiplicative universal class of hash functions. By constructing an optimal multiplication algorithm based on four basic multiplication algorithms, we give a fast software implementation of length-adaptive privacy amplification. “Length-adaptive” indicates that the implementation of privacy amplification automatically adapts to different lengths of input blocks. When the lengths of the input blocks are 1 Mbit and 10 Mbit, the speed of privacy amplification can be as fast as 14.86 Mbps and 10.88 Mbps, respectively. Thus, it is practical for GHz or even higher repetition frequency QKD systems.
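A minimal sketch of the privacy-amplification step with a multiplicative universal hash of the multiply-shift family is shown below; the specific hash class and the optimized multiplication algorithm used in the paper may differ, and the block lengths here are illustrative.

```python
# Minimal sketch of privacy amplification with a multiplicative universal hash
# (multiply-shift family): compress an n-bit reconciled key block to l secret
# bits. The hash class and lengths are illustrative, not the paper's exact ones.
import secrets

def multiply_shift_hash(x: int, a: int, n: int, l: int) -> int:
    """Hash an n-bit integer x to l bits using an odd random multiplier a."""
    return ((a * x) % (1 << n)) >> (n - l)

n = 1_000_000                      # length of the reconciled key block (bits)
l = 600_000                        # target length after privacy amplification
x = secrets.randbits(n)            # stand-in for the reconciled key
a = secrets.randbits(n) | 1        # random odd multiplier (the hash "seed")

final_key = multiply_shift_hash(x, a, n, l)
print(final_key.bit_length() <= l) # True: output fits in l bits
```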
Symmetric encryption algorithms using chaotic and non-chaotic generators: A review
Radwan, Ahmed G.; AbdElHaleem, Sherif H.; Abd-El-Hafiz, Salwa K.
2015-01-01
This paper summarizes the symmetric image encryption results of 27 different algorithms, which include substitution-only, permutation-only or both phases. The cores of these algorithms are based on several discrete chaotic maps (Arnold’s cat map and a combination of three generalized maps), one continuous chaotic system (Lorenz) and two non-chaotic generators (fractals and chess-based algorithms). Each algorithm has been analyzed by the correlation coefficients between pixels (horizontal, vertical and diagonal), differential attack measures, Mean Square Error (MSE), entropy, sensitivity analyses and the 15 standard tests of the National Institute of Standards and Technology (NIST) SP-800-22 statistical suite. The analyzed algorithms include a set of new image encryption algorithms based on non-chaotic generators, using substitution only (fractals), permutation only (chess-based), or both. Moreover, two different permutation scenarios are presented where the permutation phase has or does not have a relationship with the input image through an ON/OFF switch. Different encryption-key lengths and complexities are provided, from short to long keys, to resist brute-force attacks. In addition, the sensitivities of these different techniques to a one-bit change in the input parameters of the substitution key as well as the permutation key are assessed. Finally, a comparative discussion of this work versus much recent research with respect to the generators used, type of encryption, and analyses is presented to highlight the strengths and added contribution of this paper. PMID:26966561
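For illustration, the sketch below computes two of the listed evaluation metrics, horizontal adjacent-pixel correlation and MSE, on arrays standing in for a plain image and a cipher image; the random "cipher" is only a placeholder for real encryption output.

```python
# Sketch of two of the listed metrics: horizontal adjacent-pixel correlation
# and mean square error (MSE). The random "cipher" array is a placeholder for
# the output of an actual encryption algorithm.
import numpy as np

rng = np.random.default_rng(1)
plain = rng.integers(0, 256, size=(256, 256)).astype(float)
cipher = rng.integers(0, 256, size=(256, 256)).astype(float)  # placeholder

def horizontal_correlation(img):
    x, y = img[:, :-1].ravel(), img[:, 1:].ravel()
    return np.corrcoef(x, y)[0, 1]

def mse(a, b):
    return np.mean((a - b) ** 2)

print("horizontal correlation (cipher):", horizontal_correlation(cipher))
print("MSE(plain, cipher):", mse(plain, cipher))
```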
Intraglomerular inhibition shapes the strength and temporal structure of glomerular output.
Shao, Zuoyi; Puche, Adam C; Liu, Shaolin; Shipley, Michael T
2012-08-01
Odor signals are transmitted to the olfactory bulb by olfactory nerve (ON) synapses onto mitral/tufted cells (MCs) and external tufted cells (ETCs). ETCs, in turn, provide feedforward excitatory input to MCs. MC and ETCs are also regulated by inhibition: intraglomerular and interglomerular inhibitory circuits act at MC and ETC apical dendrites; granule cells (GCs) inhibit MC lateral dendrites via the MC→GC→MC circuit. We investigated the contribution of intraglomerular inhibition to MC and ETCs responses to ON input. ON input evokes initial excitation followed by early, strongly summating inhibitory postsynaptic currents (IPSCs) in MCs; this is followed by prolonged, intermittent IPSCs. The N-methyl-d-aspartate receptor antagonist dl-amino-5-phosphovaleric acid, known to suppress GABA release by GCs, reduced late IPSCs but had no effect on early IPSCs. In contrast, selective intraglomerular block of GABA(A) receptors eliminated all early IPSCs and caused a 5-fold increase in ON-evoked MC spiking and a 10-fold increase in response duration. ETCs also receive intraglomerular inhibition; blockade of inhibition doubled ETC spike responses. By reducing ETC excitatory drive and directly inhibiting MCs, intraglomerular inhibition is a key factor shaping the strength and temporal structure of MC responses to sensory input. Sensory input generates an intraglomerular excitation-inhibition sequence that limits MC spike output to a brief temporal window. Glomerular circuits may dynamically regulate this input-output window to optimize MC encoding across sniff-sampled inputs.
Rule-Based Expert Systems in the Command Estimate: An Operational Perspective
1990-06-01
control measures. 5. Prepare COA statement(s) and sketch(es). The key inputs for developing courses of action are the DFD process of IPB, data stores...mission, or a change of information provides new direction to this process for that particular operation." Formal scientific analysis of the command...30 5. Delivery of outside news. This feature contributes to the commander's insatiable need for current information. Artificial intelligence and rule
ERIC Educational Resources Information Center
Noonan, James H.; Vavra, Malissa C.
2007-01-01
Data from a variety of sources about crime in schools and colleges and characteristics of the people who commit these offenses provide key input in developing theories and operational applications that can help combat crime in this nation's schools, colleges, and universities. Given the myriad of data available, the objective of this study is to…
Estimation of biophysical properties of upland Sitka spruce (Picea sitchensis) plantations
NASA Technical Reports Server (NTRS)
Green, Robert M.
1993-01-01
It is widely accepted that estimates of forest above-ground biomass are required as inputs to forest ecosystem models, and that SAR data have the potential to provide such information. This study describes relationships between polarimetric radar backscatter and key biophysical properties of a coniferous plantation in upland central Wales, U.K. Over the test site, topography was relatively complex and was expected to influence the amount of radar backscatter.
Encryption and decryption using FPGA
NASA Astrophysics Data System (ADS)
Nayak, Nikhilesh; Chandak, Akshay; Shah, Nisarg; Karthikeyan, B.
2017-11-01
In this paper, we are performing multiple cryptography methods on a set of data and comparing their outputs. Here AES algorithm and RSA algorithm are used. Using AES Algorithm an 8 bit input (plain text) gets encrypted using a cipher key and the result is displayed on tera term (serially). For simulation a 128 bit input is used and operated with a 128 bit cipher key to generate encrypted text. The reverse operations are then performed to get decrypted text. In RSA Algorithm file handling is used to input plain text. This text is then operated on to get the encrypted and decrypted data, which are then stored in a file. Finally the results of both the algorithms are compared.
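A software sketch of the described 128-bit AES round trip is shown below, using the PyCryptodome library rather than an FPGA; it mirrors only the data flow (128-bit block, 128-bit cipher key), not the hardware design or the serial display.

```python
# Software sketch of the 128-bit AES encrypt/decrypt round trip, using
# PyCryptodome instead of an FPGA. Key and plaintext values are placeholders.
from Crypto.Cipher import AES

key = bytes.fromhex("000102030405060708090a0b0c0d0e0f")   # 128-bit cipher key
plaintext = b"16-byte message!"                            # one 128-bit block

ciphertext = AES.new(key, AES.MODE_ECB).encrypt(plaintext)
recovered = AES.new(key, AES.MODE_ECB).decrypt(ciphertext)

print(ciphertext.hex())
print(recovered == plaintext)   # True: decryption reverses encryption
```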
Dudal, Sherri; Staack, Roland F; Stoellner, Daniela; Fjording, Marianne Scheel; Vieser, Eva; Pascual, Marie-Hélène; Brudny-Kloeppel, Margarete; Golob, Michaela
2014-05-01
The bioanalytical scientist plays a key role in the project team for the drug development of biotherapeutics from the discovery to the marketing phase. Information from the project team members is required for assay development and sample analysis during the discovery, preclinical and clinical phases of the project and input is needed from the bioanalytical scientist to help data interpretation. The European Bioanalysis Forum target team 20 discussed many of the gaps in information and communication between the bioanalytical scientist and project team members as a base for providing a perspective on the bioanalytical scientist's role and interactions within the project team.
Alternative management and funding options for aeronautics programs, Task 1
NASA Technical Reports Server (NTRS)
1975-01-01
Research and technology will be at lower program levels with basic military research for aviation decreasing as fewer aircraft programs are initiated and the present new aircraft programs move into the prototype and production status. The key question is can industry take on the management and financing role and meet the criteria and characteristics considered essential for a viable research and technology program. The criteria for evaluating alternative approaches include an examination of the nature of the product to be provided, responsiveness to changing needs, efficiency in terms of costs, ability to provide leadership, and to provide impartial and independent evaluation of approaches, and to provide technological inputs for regulating functions.
A high speed sequential decoder
NASA Technical Reports Server (NTRS)
Lum, H., Jr.
1972-01-01
The performance and theory of operation for the High Speed Hard Decision Sequential Decoder are delineated. The decoder is a forward error correction system which is capable of accepting data from binary-phase-shift-keyed and quadriphase-shift-keyed modems at input data rates up to 30 megabits per second. Test results show that the decoder is capable of maintaining a composite error rate of 0.00001 at an input Eb/N0 of 5.6 dB. This performance has been obtained with minimum circuit complexity.
Tukey, David S; Lee, Michelle; Xu, Duo; Eberle, Sarah E; Goffer, Yossef; Manders, Toby R; Ziff, Edward B; Wang, Jing
2013-07-09
Pain and natural rewards such as food elicit different behavioral effects. Both pain and rewards, however, have been shown to alter synaptic activities in the nucleus accumbens (NAc), a key component of the brain reward system. Mechanisms by which external stimuli regulate plasticity at NAc synapses are largely unexplored. Medium spiny neurons (MSNs) from the NAc receive excitatory glutamatergic inputs and modulatory dopaminergic and cholinergic inputs from a variety of cortical and subcortical structures. Glutamate inputs to the NAc arise primarily from prefrontal cortex, thalamus, amygdala, and hippocampus, and different glutamate projections provide distinct synaptic and ultimately behavioral functions. The family of vesicular glutamate transporters (VGLUTs 1-3) plays a key role in the uploading of glutamate into synaptic vesicles. VGLUT1-3 isoforms have distinct expression patterns in the brain, but the effects of external stimuli on their expression patterns have not been studied. In this study, we use a sucrose self-administration paradigm for natural rewards, and spared nerve injury (SNI) model for chronic pain. We examine the levels of VGLUTs (1-3) in synaptoneurosomes of the NAc in these two behavioral models. We find that chronic pain leads to a decrease of VGLUT1, likely reflecting decreased projections from the cortex. Pain also decreases VGLUT3 levels, likely representing a decrease in projections from GABAergic, serotonergic, and/or cholinergic interneurons. In contrast, chronic consumption of sucrose increases VGLUT3 in the NAc, possibly reflecting an increase from these interneuron projections. Our study shows that natural rewards and pain have distinct effects on the VGLUT expression pattern in the NAc, indicating that glutamate inputs to the NAc are differentially modulated by rewards and pain.
Stable isotopes and Digital Elevation Models to study nutrient inputs in high-Arctic lakes
NASA Astrophysics Data System (ADS)
Calizza, Edoardo; Rossi, David; Costantini, Maria Letizia; Careddu, Giulio; Rossi, Loreto
2016-04-01
Ice cover, run-off from the watershed, aquatic and terrestrial primary productivity, and guano deposition from birds are key factors controlling nutrient and organic matter inputs in high-Arctic lakes. All these factors are expected to be significantly affected by climate change. Quantifying these controls is a key baseline step to understand what combination of factors subtends the biological productivity in Arctic lakes and will drive their ecological response to environmental change. Based on Digital Elevation Models, drainage maps, and C and N elemental content and stable isotope analysis in sediments, aquatic vegetation and a dominant macroinvertebrate species (Lepidurus arcticus Pallas 1973) belonging to Tvillingvatnet, Storvatnet and Kolhamna, three lakes located in North Spitsbergen (Svalbard), we propose an integrated approach for the analysis of (i) nutrient and organic matter inputs in lakes; (ii) the role of catchment hydro-geomorphology in determining inter-lake differences in the isotopic composition of sediments; and (iii) effects of diverse nutrient inputs on the isotopic niche of Lepidurus arcticus. Given its high run-off and large catchment, organic deposits in Tvillingvatnet were dominated by terrestrial inputs, whereas inputs were mainly of aquatic origin in Storvatnet, a lowland lake with low potential run-off. In Kolhamna, organic deposits seem to be dominated by inputs from birds, which actually colonise the area. Isotopic signatures were similar between samples within each lake, representing precise tracers for studies on the effect of climate change on biogeochemical cycles in lakes. The isotopic niche of L. arcticus reflected differences in sediments between lakes, suggesting a bottom-up effect of the hydro-geomorphology characterizing each lake on the nutrients assimilated by this species. The presented approach proved to be an effective research pathway for the identification of factors subtending nutrient and organic matter inputs and transfer within each water body, as well as for the modelling of expected changes in nutrient content associated with changes in the isotopic composition of sediments. Key words: nitrogen; carbon; sediment; biogeochemical cycle; climate change; hydro-ecology; isotopic niche; Svalbard
Finke, Mareike; Sandmann, Pascale; Bönitz, Hanna; Kral, Andrej; Büchner, Andreas
2016-01-01
Single-sided deaf subjects with a cochlear implant (CI) provide the unique opportunity to compare central auditory processing of the electrical input (CI ear) and the acoustic input (normal-hearing, NH, ear) within the same individual. In these individuals, sensory processing differs between their two ears, while cognitive abilities are the same irrespectively of the sensory input. To better understand perceptual-cognitive factors modulating speech intelligibility with a CI, this electroencephalography study examined the central-auditory processing of words, the cognitive abilities, and the speech intelligibility in 10 postlingually single-sided deaf CI users. We found lower hit rates and prolonged response times for word classification during an oddball task for the CI ear when compared with the NH ear. Also, event-related potentials reflecting sensory (N1) and higher-order processing (N2/N4) were prolonged for word classification (targets versus nontargets) with the CI ear compared with the NH ear. Our results suggest that speech processing via the CI ear and the NH ear differs both at sensory (N1) and cognitive (N2/N4) processing stages, thereby affecting the behavioral performance for speech discrimination. These results provide objective evidence for cognition to be a key factor for speech perception under adverse listening conditions, such as the degraded speech signal provided from the CI. © 2016 S. Karger AG, Basel.
Word recognition and phonetic structure acquisition: Possible relations
NASA Astrophysics Data System (ADS)
Morgan, James
2002-05-01
Several accounts of possible relations between the emergence of the mental lexicon and acquisition of native language phonological structure have been propounded. In one view, acquisition of word meanings guides infants' attention toward those contrasts that are linguistically significant in their language. In the opposing view, native language phonological categories may be acquired from statistical patterns of input speech, prior to and independent of learning at the lexical level. Here, a more interactive account will be presented, in which phonological structure is modeled as emerging consequentially from the self-organization of perceptual space underlying word recognition. A key prediction of this model is that early native language phonological categories will be highly context specific. Data bearing on this prediction will be presented which provide clues to the nature of infants' statistical analysis of input.
Successful partnerships are the key to improving Aboriginal health.
Bailey, Sandra; Hunt, Jennifer
2012-06-01
Partnership is a process that must be recognised as a fundamental part of any strategy for improving health outcomes for Aboriginal people. Addressing the inequities in health outcomes between Aboriginal people and other Australians will require a sustained, coordinated and well-informed approach that works to a set of goals and targets developed with input from the Aboriginal community. Partnerships provide the most effective mechanism for obtaining this essential input from Aboriginal communities and their representative organisations, enabling Aboriginal people to have an influence at all stages of the health-care process. Within the health sector, effective partnerships harness the efforts of governments and the expertise of Aboriginal Community Controlled Health Services, which offer the most effective means of delivering comprehensive primary health care to Aboriginal people.
OBIST methodology incorporating modified sensitivity of pulses for active analogue filter components
NASA Astrophysics Data System (ADS)
Khade, R. H.; Chaudhari, D. S.
2018-03-01
In this paper, an oscillation-based built-in self-test method is used to diagnose catastrophic and parametric faults in integrated circuits. Sallen-Key low-pass filter and high-pass filter circuits with different gains are used to investigate defects. Variations in seven parameters of the operational amplifier (OP-AMP), namely gain, input impedance, output impedance, slew rate, input bias current, input offset current and input offset voltage, as well as catastrophic and parametric defects in components outside the OP-AMP, are introduced in the circuit and the simulation results are analysed. The oscillator output signal is converted to pulses, which are used to generate a signature of the circuit. The signature and pulse count change with the type of fault present in the circuit under test (CUT). The change in oscillation frequency is observed for fault detection. The designer has the flexibility to predefine the tolerance band of the cut-off frequency and the range of pulses for which the circuit should be accepted. The fault coverage depends upon the required tolerance band of the CUT. We propose a modification of the sensitivity of the parameter (pulses) to avoid test escapes and enhance yield. Results show that the method provides 100% fault coverage for catastrophic faults.
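The accept/reject decision described above can be sketched as a simple tolerance check on the measured oscillation frequency and pulse count; the numeric bands below are placeholders, not the values used in the paper.

```python
# Sketch of the pass/fail decision: the circuit under test is accepted only if
# its measured oscillation frequency and pulse count fall inside designer-
# defined tolerance bands. All numeric values are placeholders.
def cut_passes(freq_hz, pulses, f_nominal=10_000.0, f_tol=0.05,
               pulse_range=(950, 1050)):
    freq_ok = abs(freq_hz - f_nominal) <= f_tol * f_nominal
    pulses_ok = pulse_range[0] <= pulses <= pulse_range[1]
    return freq_ok and pulses_ok

print(cut_passes(10_120.0, 1003))   # True  -> within tolerance, accept
print(cut_passes(11_400.0, 1003))   # False -> frequency shift flags a fault
```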
NASA Astrophysics Data System (ADS)
Jiang, Yao; Li, Tie-Min; Wang, Li-Ping
2015-09-01
This paper investigates the stiffness modeling of compliant parallel mechanism (CPM) based on the matrix method. First, the general compliance matrix of a serial flexure chain is derived. The stiffness modeling of CPMs is next discussed in detail, considering the relative positions of the applied load and the selected displacement output point. The derived stiffness models have simple and explicit forms, and the input, output, and coupling stiffness matrices of the CPM can easily be obtained. The proposed analytical model is applied to the stiffness modeling and performance analysis of an XY parallel compliant stage with input and output decoupling characteristics. Then, the key geometrical parameters of the stage are optimized to obtain the minimum input decoupling degree. Finally, a prototype of the compliant stage is developed and its input axial stiffness, coupling characteristics, positioning resolution, and circular contouring performance are tested. The results demonstrate the excellent performance of the compliant stage and verify the effectiveness of the proposed theoretical model. The general stiffness models provided in this paper will be helpful for performance analysis, especially in determining coupling characteristics, and the structure optimization of the CPM.
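A minimal planar sketch of the matrix method referred to above is given below: the output compliance of a serial flexure chain is the sum of each hinge's local compliance transformed to the output point, C = sum_i J_i C_i J_i^T, and the stiffness matrix is its inverse. The hinge compliances and offsets are placeholders, not the parameters of the XY stage studied in the paper.

```python
# Minimal planar sketch of the matrix method for a serial flexure chain:
# C_out = sum_i J_i C_i J_i^T, K_out = inv(C_out). Values are placeholders.
import numpy as np

def transfer(dx, dy):
    """Planar transform of a hinge compliance to the output point,
    where (dx, dy) is the position of the output point relative to the hinge."""
    return np.array([[1.0, 0.0, -dy],
                     [0.0, 1.0,  dx],
                     [0.0, 0.0,  1.0]])

# local compliances of two flexure hinges, coordinates (x, y, theta)
C1 = np.diag([2e-7, 5e-8, 1e-3])
C2 = np.diag([2e-7, 5e-8, 1e-3])
offsets = [(0.03, 0.0), (0.06, 0.0)]   # output point relative to each hinge, m

C_out = sum(transfer(dx, dy) @ C @ transfer(dx, dy).T
            for (dx, dy), C in zip(offsets, (C1, C2)))
K_out = np.linalg.inv(C_out)           # output stiffness matrix
print(K_out)
```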
Physical layer security in fiber-optic MIMO-SDM systems: An overview
NASA Astrophysics Data System (ADS)
Guan, Kyle; Cho, Junho; Winzer, Peter J.
2018-02-01
Fiber-optic transmission systems provide large capacities over enormous distances but are vulnerable to simple eavesdropping attacks at the physical layer. We classify key-based and keyless encryption and physical layer security techniques and discuss them in the context of optical multiple-input-multiple-output space-division multiplexed (MIMO-SDM) fiber-optic communication systems. We show that MIMO-SDM not only increases system capacity, but also ensures the confidentiality of information transmission. Based on recent numerical and experimental results, we review how the unique channel characteristics of MIMO-SDM can be exploited to provide various levels of physical layer security.
2015-01-01
of the Actuary cheerfully provided key input for our analysis, no matter the time pressure. We also thank our RAND colleagues David Knapp and... Science Board Task Force on Human Resources Strategy (Defense Science Board, 2000), the Defense Advisory Committee on Military Compensation (2006...DoD Actuary. The final task was to analyze the cost savings and change in government outlays during the transition to the steady state, including
1990-09-01
channel. Erosion susceptibility, similar to spillway evaluation, must emphasize rock-mass rating or classification systems (e.g. rippability) which, when...recommends site-specific "proof of concept" testing of an Erosion Probability Index (EPI) based on rock-mass rippability rating and lithostratigraphic...and rock-mass parameters that provide key input parameters to Weaver's (1975) Rippability Rating (RR) scheme (or Bieniawski's (1974) Rock Mass Rating
Using Natural Language to Enhance Mission Effectiveness
NASA Technical Reports Server (NTRS)
Trujillo, Anna C.; Meszaros, Erica
2016-01-01
The availability of highly capable, yet relatively cheap, unmanned aerial vehicles (UAVs) is opening up new areas of use for hobbyists and for professional-related activities. The driving function of this research is allowing a non-UAV pilot, an operator, to define and manage a mission. This paper describes the preliminary usability measures of an interface that allows an operator to define the mission using speech to make inputs. An experiment was conducted to begin to enumerate the efficacy and user acceptance of using voice commands to define a multi-UAV mission and to provide high-level vehicle control commands such as "takeoff." The primary independent variable was input type - voice or mouse. The primary dependent variables consisted of the correctness of the mission parameter inputs and the time needed to make all inputs. Other dependent variables included NASA-TLX workload ratings and subjective ratings on a final questionnaire. The experiment required each subject to fill in an online form that contained comparable required information that would be needed for a package dispatcher to deliver packages. For each run, subjects typed in a simple numeric code for the package code. They then defined the initial starting position, the delivery location, and the return location using either pull-down menus or voice input. Voice input was accomplished using CMU Sphinx4-5prealpha for speech recognition. They then inputted the length of the package. These were the option fields. The subject had the system "Calculate Trajectory" and then "Takeoff" once the trajectory was calculated. Later, the subject used "Land" to finish the run. After the voice and mouse input blocked runs, subjects completed a NASA-TLX. At the conclusion of all runs, subjects completed a questionnaire asking them about their experience in inputting the mission parameters, and starting and stopping the mission using mouse and voice input. In general, the usability of voice commands is acceptable. With a relatively well-defined and simple vocabulary, the operator can input the vast majority of the mission parameters using simple, intuitive voice commands. However, voice input may be more applicable to initial mission specification rather than for critical commands such as the need to land immediately due to time and feedback constraints. It would also be convenient to retrieve relevant mission information using voice input. Therefore, further on-going research is looking at using intent from operator utterances to provide the relevant mission information to the operator. The information displayed will be inferred from the operator's utterances just before key phrases are spoken. Linguistic analysis of the context of verbal communication provides insight into the intended meaning of commonly heard phrases such as "What's it doing now?" Analyzing the semantic sphere surrounding these common phrases enables us to predict the operator's intent and supply the operator's desired information to the interface. This paper also describes preliminary investigations into the generation of the semantic space of UAV operation and the success at providing information to the interface based on the operator's utterances.
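A minimal sketch of the command handling implied by the description is given below: recognized utterances such as "calculate trajectory", "takeoff" and "land" are mapped to mission actions, with mission-parameter fields mirroring those listed in the abstract; the speech recognizer itself (CMU Sphinx in the study) is out of scope here.

```python
# Sketch of mapping recognized utterances to mission actions. The mission
# fields mirror those named in the abstract; the dispatch logic is illustrative.
mission = {"package_code": None, "start": None, "delivery": None,
           "return": None, "package_length": None}

def calculate_trajectory():
    print("trajectory calculated for", mission)

def takeoff():
    print("takeoff")

def land():
    print("land")

commands = {"calculate trajectory": calculate_trajectory,
            "takeoff": takeoff,
            "land": land}

def handle_utterance(text):
    action = commands.get(text.strip().lower())
    if action:
        action()
    else:
        print("unrecognized command:", text)

handle_utterance("Calculate Trajectory")
handle_utterance("Takeoff")
handle_utterance("Land")
```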
Nursing acceptance of a speech-input interface: a preliminary investigation.
Dillon, T W; McDowell, D; Norcio, A F; DeHaemer, M J
1994-01-01
Many new technologies are being developed to improve the efficiency and productivity of nursing staffs. User acceptance is a key to the success of these technologies. In this article, the authors present a discussion of nursing acceptance of computer systems, review the basic design issues for creating a speech-input interface, and report preliminary findings of a study of nursing acceptance of a prototype speech-input interface. Results of the study showed that the 19 nursing subjects expressed acceptance of the prototype speech-input interface.
Sensory neurons that detect stretch and nutrients in the digestive system
Williams, Erika K.; Chang, Rui B.; Strochlic, David E.; Umans, Benjamin D.; Lowell, Bradford B.; Liberles, Stephen D.
2016-01-01
SUMMARY Neural inputs from internal organs are essential for normal autonomic function. The vagus nerve is a key body-brain connection that monitors the digestive, cardiovascular, and respiratory systems. Within the gastrointestinal tract, vagal sensory neurons detect gut hormones and organ distension. Here, we investigate the molecular diversity of vagal sensory neurons and their roles in sensing gastrointestinal inputs. Genetic approaches allowed targeted investigation of gut-to-brain afferents involved in homeostatic responses to ingested nutrients (GPR65 neurons) and mechanical distension of the stomach and intestine (GLP1R neurons). Optogenetics, in vivo ganglion imaging, and genetically guided anatomical mapping provide direct links between neuron identity, peripheral anatomy, central anatomy, conduction velocity, response properties in vitro and in vivo, and physiological function. These studies clarify the roles of vagal afferents in mediating particular gut hormone responses. Moreover, genetic control over gut-to-brain neurons provides a molecular framework for understanding neural control of gastrointestinal physiology. PMID:27238020
Spatio-temporal variation of coarse woody debris input in woodland key habitats in central Sweden
Mari Jonsson; Shawn Fraver; Bengt Gunnar. Jonsson
2011-01-01
The persistence of many saproxylic (wood-living) species depends on a readily available supply of coarse woody debris (CWD). Most studies of CWD inputs address stand-level patterns, despite the fact that many saproxylic species depend on landscape-level supplies of CWD. In the present study we used dated CWD inputs (tree mortality events) at each of 14 Norway spruce (...
Rethinking key–value store for parallel I/O optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kougkas, Anthony; Eslami, Hassan; Sun, Xian-He
2015-01-26
Key-value stores are being widely used as the storage system for large-scale internet services and cloud storage systems. However, they are rarely used in HPC systems, where parallel file systems are the dominant storage solution. In this study, we examine the architecture differences and performance characteristics of parallel file systems and key-value stores. We propose using key-value stores to optimize overall Input/Output (I/O) performance, especially for workloads that parallel file systems cannot handle well, such as the cases with intense data synchronization or heavy metadata operations. We conducted experiments with several synthetic benchmarks, an I/O benchmark, and a real application. We modeled the performance of these two systems using collected data from our experiments, and we provide a predictive method to identify which system offers better I/O performance given a specific workload. The results show that we can optimize the I/O performance in HPC systems by utilizing key-value stores.
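The following is an illustrative sketch, not the authors' benchmark, of the workload class the study targets (many small, metadata-heavy writes): it times a file-per-record layout against Python's built-in dbm key-value store as a stand-in; real parallel file systems and production key-value stores behave very differently at HPC scale.

```python
# Illustrative sketch: many small writes via one-file-per-record versus a
# simple key-value store (Python's built-in dbm). Record sizes are placeholders.
import dbm, os, tempfile, time

records = {f"key{i:06d}": os.urandom(256) for i in range(5000)}

with tempfile.TemporaryDirectory() as d:
    t0 = time.perf_counter()
    for k, v in records.items():                       # one file per record
        with open(os.path.join(d, k), "wb") as f:
            f.write(v)
    t_files = time.perf_counter() - t0

    t0 = time.perf_counter()
    db = dbm.open(os.path.join(d, "store"), "c")       # key-value store
    for k, v in records.items():
        db[k] = v
    db.close()
    t_kv = time.perf_counter() - t0

print(f"file-per-record: {t_files:.3f} s, key-value store: {t_kv:.3f} s")
```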
Ponce Wong, Ruben D; Hellman, Randall B; Santos, Veronica J
2014-01-01
Upper-limb amputees rely primarily on visual feedback when using their prostheses to interact with others or objects in their environment. A constant reliance upon visual feedback can be mentally exhausting and does not suffice for many activities when line-of-sight is unavailable. Upper-limb amputees could greatly benefit from the ability to perceive edges, one of the most salient features of 3D shape, through touch alone. We present an approach for estimating edge orientation with respect to an artificial fingertip through haptic exploration using a multimodal tactile sensor on a robot hand. Key parameters from the tactile signals for each of four exploratory procedures were used as inputs to a support vector regression model. Edge orientation angles ranging from -90 to 90 degrees were estimated with an 85-input model having an R² of 0.99 and RMS error of 5.08 degrees. Electrode impedance signals provided the most useful inputs by encoding spatially asymmetric skin deformation across the entire fingertip. Interestingly, sensor regions that were not in direct contact with the stimulus provided particularly useful information. Methods described here could pave the way for semi-autonomous capabilities in prosthetic or robotic hands during haptic exploration, especially when visual feedback is unavailable.
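A hedged sketch of the regression step is shown below: a support vector regression model mapping tactile features to edge orientation; the feature vectors are synthetic placeholders, not the 85 sensor-derived inputs used in the study.

```python
# Sketch of support vector regression for edge-orientation estimation.
# Features are synthetic placeholders, not the study's tactile-sensor inputs.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
angles = rng.uniform(-90.0, 90.0, size=300)            # ground-truth edge angles
X = np.column_stack([np.sin(np.radians(angles)),       # stand-in features that
                     np.cos(np.radians(angles)),       # vary with the contact
                     rng.normal(scale=0.05, size=(300, 3))])

model = SVR(kernel="rbf", C=10.0, epsilon=0.5).fit(X, angles)
pred = model.predict(X)
rmse = np.sqrt(np.mean((pred - angles) ** 2))
print(f"training RMSE: {rmse:.2f} degrees")
```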
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaufmann, John R.; Hand, James R.; Halverson, Mark A.
This report evaluates how and when to best integrate renewable energy requirements into building energy codes. The basic goals were to: (1) provide a rough guide of where we’re going and how to get there; (2) identify key issues that need to be considered, including a discussion of various options with pros and cons, to help inform code deliberations; and (3) to help foster alignment among energy code-development organizations. The authors researched current approaches nationally and internationally, conducted a survey of key stakeholders to solicit input on various approaches, and evaluated the key issues related to integration of renewable energy requirements and various options to address those issues. The report concludes with recommendations and a plan to engage stakeholders. This report does not evaluate whether the use of renewable energy should be required on buildings; that question involves a political decision that is beyond the scope of this report.
Transported Geothermal Energy Technoeconomic Screening Tool - Calculation Engine
Liu, Xiaobing
2016-09-21
This calculation engine estimates technoeconomic feasibility for transported geothermal energy projects. The TGE screening tool (geotool.exe) takes input from an input file (input.txt) and lists results in an output file (output.txt). Both the input and output files are in the same folder as geotool.exe. To use the tool, the input file containing adequate information about the case should be prepared in the format explained below and placed in the same folder as geotool.exe. Then geotool.exe can be executed, which will generate an output.txt file in the same folder containing all key calculation results. The format and content of the output file are explained below as well.
Automatic oscillator frequency control system
NASA Technical Reports Server (NTRS)
Smith, S. F. (Inventor)
1985-01-01
A frequency control system makes an initial correction of the frequency of its own timing circuit after comparison against a frequency of known accuracy and then sequentially checks and corrects the frequencies of several voltage controlled local oscillator circuits. The timing circuit initiates the machine cycles of a central processing unit which applies a frequency index to an input register in a modulo-sum frequency divider stage and enables a multiplexer to clock an accumulator register in the divider stage with a cyclical signal derived from the oscillator circuit being checked. Upon expiration of the interval, the processing unit compares the remainder held as the contents of the accumulator against a stored zero error constant and applies an appropriate correction word to a correction stage to shift the frequency of the oscillator being checked. A signal from the accumulator register may be used to drive a phase plane ROM and, with periodic shifts in the applied frequency index, to provide frequency shift keying of the resultant output signal. Interposition of a phase adder between the accumulator register and phase plane ROM permits phase shift keying of the output signal by periodic variation in the value of a phase index applied to one input of the phase adder.
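A conceptual sketch of the check-and-correct loop is given below: over a fixed interval the divider accumulates cycles of the oscillator under test, the count is compared against the expected count (the zero-error constant), and a proportional correction word retunes the oscillator before the next check; the gains and counts are illustrative assumptions rather than the patented circuit's values.

```python
# Conceptual sketch of the check-and-correct loop. Interval, correction gain
# and frequencies are illustrative assumptions, not the patented circuit's.
def check_and_correct(freq_actual, freq_target, interval_s=0.1, hz_per_lsb=0.5,
                      steps=5):
    expected = round(freq_target * interval_s)        # expected cycle count
    for step in range(steps):
        counted = round(freq_actual * interval_s)     # accumulator contents
        error_counts = counted - expected             # deviation from zero error
        correction_word = round(error_counts / (interval_s * hz_per_lsb))
        freq_actual -= correction_word * hz_per_lsb   # retune the oscillator
        print(f"step {step}: {freq_actual:.1f} Hz ({error_counts:+d} counts)")
    return freq_actual

check_and_correct(freq_actual=1_000_037.0, freq_target=1_000_000.0)
```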
Team performance in the Italian NHS: the role of reflexivity.
Urbini, Flavio; Callea, Antonino; Chirumbolo, Antonio; Talamo, Alessandra; Ingusci, Emanuela; Ciavolino, Enrico
2018-04-09
Purpose The purpose of this paper is twofold: first, to investigate the goodness of the input-process-output (IPO) model in order to evaluate work team performance within the Italian National Health Care System (NHS); and second, to test the mediating role of reflexivity as an overarching process factor between input and output. Design/methodology/approach The Italian version of the Aston Team Performance Inventory was administered to 351 employees working in teams in the Italian NHS. Mediation analyses with latent variables were performed via structural equation modeling (SEM); the significance of total, direct, and indirect effect was tested via bootstrapping. Findings Underpinned by the IPO framework, the results of SEM supported mediational hypotheses. First, the application of the IPO model in the Italian NHS showed adequate fit indices, showing that the process mediates the relationship between input and output factors. Second, reflexivity mediated the relationship between input and output, influencing some aspects of team performance. Practical implications The results provide useful information for HRM policies improving process dimensions of the IPO model via the mediating role of reflexivity as a key role in team performance. Originality/value This study is one of a limited number of studies that applied the IPO model in the Italian NHS. Moreover, no study has yet examined the role of reflexivity as a mediator between input and output factors in the IPO model.
Radiometric calibration of the Earth observing system's imaging sensors
NASA Technical Reports Server (NTRS)
Slater, P. N.
1987-01-01
Philosophy, requirements, and methods of calibration of multispectral space sensor systems as applicable to the Earth Observing System (EOS) are discussed. Vicarious methods for calibration of low spatial resolution systems, with respect to the Advanced Very High Resolution Radiometer (AVHRR), are then summarized. Finally, a theoretical introduction is given to a new vicarious method of calibration using the ratio of diffuse-to-global irradiance at the Earth's surfaces as the key input. This may provide an additional independent method for in-flight calibration.
A pipelined FPGA implementation of an encryption algorithm based on genetic algorithm
NASA Astrophysics Data System (ADS)
Thirer, Nonel
2013-05-01
With the evolution of digital data storage and exchange, it is essential to protect confidential information from any unauthorized access. High-performance encryption algorithms have been developed and implemented in software and hardware. Many methods to attack the cipher text have also been developed. In recent years, the genetic algorithm has gained much interest in the cryptanalysis of cipher texts and also in encryption ciphers. This paper analyses the possibility of using the genetic algorithm as a multiple key sequence generator for an AES (Advanced Encryption Standard) cryptographic system, and of using a three-stage pipeline (with four main blocks: Input data, AES Core, Key generator, Output data) to provide fast encryption and storage/transmission of a large amount of data.
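As a toy illustration of using a genetic algorithm as a key-sequence generator, the sketch below evolves candidate 128-bit AES keys with crossover and mutation under a simple bit-balance fitness; the fitness function and GA parameters are assumptions for the example, not those of the FPGA design.

```python
# Toy genetic algorithm evolving candidate 128-bit keys. The fitness function
# and parameters are illustrative assumptions, not the paper's design.
import random

random.seed(0)
KEY_BITS = 128

def fitness(key):
    """Prefer keys whose ones-count is close to balanced (64 ones)."""
    return -abs(bin(key).count("1") - KEY_BITS // 2)

def crossover(a, b):
    point = random.randrange(1, KEY_BITS)
    mask = (1 << point) - 1
    return (a & mask) | (b & ~mask & ((1 << KEY_BITS) - 1))

def mutate(key, rate=0.02):
    for bit in range(KEY_BITS):
        if random.random() < rate:
            key ^= 1 << bit
    return key

population = [random.getrandbits(KEY_BITS) for _ in range(20)]
for _ in range(50):                                   # evolve the key sequence
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    population = parents + [mutate(crossover(random.choice(parents),
                                             random.choice(parents)))
                            for _ in range(10)]

best = max(population, key=fitness)
print(f"candidate key: {best:032x}")
```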
RNA-SeQC: RNA-seq metrics for quality control and process optimization.
DeLuca, David S; Levin, Joshua Z; Sivachenko, Andrey; Fennell, Timothy; Nazaire, Marc-Danie; Williams, Chris; Reich, Michael; Winckler, Wendy; Getz, Gad
2012-06-01
RNA-seq, the application of next-generation sequencing to RNA, provides transcriptome-wide characterization of cellular activity. Assessment of sequencing performance and library quality is critical to the interpretation of RNA-seq data, yet few tools exist to address this issue. We introduce RNA-SeQC, a program which provides key measures of data quality. These metrics include yield, alignment and duplication rates; GC bias, rRNA content, regions of alignment (exon, intron and intragenic), continuity of coverage, 3'/5' bias and count of detectable transcripts, among others. The software provides multi-sample evaluation of library construction protocols, input materials and other experimental parameters. The modularity of the software enables pipeline integration and the routine monitoring of key measures of data quality such as the number of alignable reads, duplication rates and rRNA contamination. RNA-SeQC allows investigators to make informed decisions about sample inclusion in downstream analysis. In summary, RNA-SeQC provides quality control measures critical to experiment design, process optimization and downstream computational analysis. See www.genepattern.org to run online, or www.broadinstitute.org/rna-seqc/ for a command line tool.
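A toy illustration (not RNA-SeQC itself) of two of the listed metrics, duplication rate and GC content, computed over a few stand-in read sequences, is shown below; the real tool derives these from aligned BAM files.

```python
# Toy computation of duplication rate and GC content over stand-in reads.
# RNA-SeQC itself computes these and many more metrics from aligned BAM files.
reads = ["ACGTACGTAA", "ACGTACGTAA", "GGGCCCGGTA", "ATATATATCG", "GGGCCCGGTA"]

duplication_rate = 1 - len(set(reads)) / len(reads)
gc_content = (sum(r.count("G") + r.count("C") for r in reads)
              / sum(len(r) for r in reads))

print(f"duplication rate: {duplication_rate:.2f}")
print(f"GC content:       {gc_content:.2f}")
```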
DOT National Transportation Integrated Search
2000-03-01
Seven Key ITS Application Goals emerged from the document review, key contact interviews and input from attendees at ITS committee meetings. They were to use the ITS applications to improve the overall safety of the transportation network, to improve...
Agriculture and Climate Change in Global Scenarios: Why Don't the Models Agree
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, Gerald; van der Mensbrugghe, Dominique; Ahammad, Helal
Agriculture is unique among economic sectors in the nature of impacts from climate change. The production activity that transforms inputs into agricultural outputs makes direct use of weather inputs. Previous studies of the impacts of climate change on agriculture have reported substantial differences in outcomes of key variables such as prices, production, and trade. These divergent outcomes arise from differences in model inputs and model specification. The goal of this paper is to review climate change results and underlying determinants from a model comparison exercise with 10 of the leading global economic models that include significant representation of agriculture. By providing common productivity drivers that include climate change effects, differences in model outcomes are reduced. All models show higher prices in 2050 because of negative productivity shocks from climate change. The magnitude of the price increases, and the adaptation responses, differ significantly across the various models. Substantial differences exist in the structural parameters affecting demand, area, and yield, and should be a topic for future research.
Zhou, Li; Liu, Ming-Zhe; Li, Qing; Deng, Juan; Mu, Di; Sun, Yan-Gang
2017-03-21
Serotonergic neurons play key roles in various biological processes. However, circuit mechanisms underlying tight control of serotonergic neurons remain largely unknown. Here, we systematically investigated the organization of long-range synaptic inputs to serotonergic neurons and GABAergic neurons in the dorsal raphe nucleus (DRN) of mice with a combination of viral tracing, slice electrophysiological, and optogenetic techniques. We found that DRN serotonergic neurons and GABAergic neurons receive largely comparable synaptic inputs from six major upstream brain areas. Upon further analysis of the fine functional circuit structures, we found both bilateral and ipsilateral patterns of topographic connectivity in the DRN for the axons from different inputs. Moreover, the upstream brain areas were found to bidirectionally control the activity of DRN serotonergic neurons by recruiting feedforward inhibition or via a push-pull mechanism. Our study provides a framework for further deciphering the functional roles of long-range circuits controlling the activity of serotonergic neurons in the DRN. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.
Voltage Amplifier Based on Organic Electrochemical Transistor.
Braendlein, Marcel; Lonjaret, Thomas; Leleux, Pierre; Badier, Jean-Michel; Malliaras, George G
2017-01-01
Organic electrochemical transistors (OECTs) are receiving a great deal of attention as amplifying transducers for electrophysiology. A key limitation of this type of transistor, however, lies in the fact that its output is a current, while most electrophysiology equipment requires a voltage input. A simple circuit is built and modeled that uses a drain resistor to produce a voltage output. It is shown that operating the OECT in the saturation regime provides increased sensitivity while maintaining linear signal transduction. It is demonstrated that this circuit provides high-quality recordings of the human heart using readily available electrophysiology equipment, paving the way for the use of OECTs in the clinic.
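A minimal numerical sketch of the drain-resistor readout idea mentioned above: the OECT's drain current develops a voltage across a series resistor, giving a voltage output with a small-signal gain of roughly −gm·RD. The supply voltage, resistor value, and transconductance below are illustrative assumptions, not the paper's measured values.

```python
# Drain-resistor readout sketch: output voltage and approximate stage gain.
V_DD = 0.6        # supply voltage (V), assumed
R_D = 1.0e3       # drain resistor (ohm), assumed
g_m = 5.0e-3      # OECT transconductance (S), assumed

def output_voltage(drain_current_a: float) -> float:
    """Voltage at the drain node for a given drain current."""
    return V_DD - drain_current_a * R_D

# Small-signal voltage gain of the stage is approximately -g_m * R_D.
print(f"output for 0.2 mA drain current: {output_voltage(0.2e-3):.3f} V")
print(f"approximate small-signal gain: {-g_m * R_D:.1f} V/V")
```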
High-sensitivity DPSK receiver for high-bandwidth free-space optical communication links.
Juarez, Juan C; Young, David W; Sluz, Joseph E; Stotts, Larry B
2011-05-23
A high-sensitivity modem and a high-dynamic-range optical automatic gain controller (OAGC) have been developed to provide maximum link margin and to overcome the dynamic nature of free-space optical links. A sensitivity of -48.9 dBm (10 photons per bit) at 10 Gbps was achieved employing a return-to-zero differential phase shift keying based modem and a commercial Reed-Solomon forward error correction system. Low-noise optical gain was provided by an OAGC with a noise figure of 4.1 dB (including system-required input losses) and a dynamic range of greater than 60 dB.
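As a quick sanity check (not a calculation taken from the paper), the figure of roughly 10 photons per bit follows from −48.9 dBm at 10 Gbps if one assumes a carrier wavelength near 1550 nm:

```python
# Back-of-the-envelope photons-per-bit check, assuming a ~1550 nm carrier.
h = 6.626e-34          # Planck constant (J*s)
c = 2.998e8            # speed of light (m/s)
wavelength = 1550e-9   # assumed carrier wavelength (m)
bit_rate = 10e9        # bits per second
power_dbm = -48.9      # reported receiver sensitivity

power_w = 1e-3 * 10 ** (power_dbm / 10)
photon_energy = h * c / wavelength
photons_per_bit = power_w / photon_energy / bit_rate
print(f"{photons_per_bit:.1f} photons per bit")   # ~10
```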
Keyword Extraction from Multiple Words for Report Recommendations in Media Wiki
NASA Astrophysics Data System (ADS)
Elakiya, K.; Sahayadhas, Arun
2017-03-01
This paper addresses the problem of multiple-word search, with the goal of using a multiple-word query to retrieve a relevant wiki page that is then recommended to the end user. The existing system provides a link to a wiki page only for a single keyword available in Wikipedia, so it is difficult to obtain the correct result when the search input contains multiple keywords or a sentence. We introduce a ‘FastStringSearch’ technique that provides efficient search with multiple keywords and increases the flexibility for the end user to easily obtain the expected content.
Academic Primer Series: Five Key Papers about Team Collaboration Relevant to Emergency Medicine.
Gottlieb, Michael; Grossman, Catherine; Rose, Emily; Sanderson, William; Ankel, Felix; Swaminathan, Anand; Chan, Teresa M
2017-02-01
Team collaboration is essential for success in both academic and clinical environments. Often, team collaboration is not explicitly taught during medical school or even residency, and must be learned during one's early career. In this article, we aim to summarize five key papers about team collaboration for early career clinician educators. We conducted a consensus-building process among the writing team to generate a list of key papers that describe the importance or significance of team collaboration, seeking input from social media sources. The authors then used a three-round voting methodology akin to a Delphi study to determine the most important papers from the initially generated list. The five most important papers on the topic of team collaboration, as determined by this mixed group of junior faculty members and faculty developers, are presented in this paper. For each included publication, a summary was provided along with its relevance to junior faculty members and faculty developers. Five key papers about team collaboration are presented in this publication. These papers provide a foundational background to help junior faculty members with collaborating in teams both clinically and academically. This list may also inform senior faculty and faculty developers about the needs of junior faculty members.
Measuring Changes in the Economics of Medical Practice.
Fleming, Christopher; Rich, Eugene; DesRoches, Catherine; Reschovsky, James; Kogan, Rachel
2015-08-01
Since the latter third of the twentieth century, researchers have estimated production and cost functions for physician practices. Today, those attempting to measure the inputs and outputs of physician practice must account for many recent changes in models of care delivery. In this paper, we review practice inputs and outputs as typically described in research on the economics of medical practice, and consider the implications of the changing organization of medical practice and nature of physician work. This evolving environment has created conceptual challenges regarding the appropriate measures of output from physician work, as well as which inputs should be measured. Likewise, the increasing complexity of physician practice organizations has introduced challenges to finding the appropriate data sources for measuring these constructs. Both these conceptual and data challenges pose measurement issues that must be overcome to study the economics of modern medical practice. Despite these challenges, there are several promising initiatives involving data sharing at the organizational level that could provide a starting point for developing the needed new data sources and metrics for physician inputs and outputs. However, additional efforts will be required to establish data collection approaches and measurements applicable to smaller and single-specialty practices. Overcoming these measurement and data challenges will be key to supporting policy-relevant research on the changing economics of medical practice.
WRF/CMAQ AQMEII3 Simulations of U.S. Regional-Scale Ozone: Sensitivity to Processes and Inputs
Chemical boundary conditions are a key input to regional-scale photochemical models. In this study, performed during the third phase of the Air Quality Model Evaluation International Initiative (AQMEII3), we perform annual simulations over North America with chemical boundary con...
Coats, Heather; Paganelli, Tia; Starks, Helene; Lindhorst, Taryn; Starks Acosta, Anne; Mauksch, Larry; Doorenbos, Ardith
2017-03-01
There is a known shortage of trained palliative care professionals, and an even greater shortage of professionals who have been trained through interprofessional curricula. As part of an institutional Palliative Care Training Center grant, a core team of interprofessional palliative care academic faculty and staff completed a state-wide palliative care educational assessment to determine the needs for an interprofessional palliative care training program. The purpose of this article is to describe the process and results of our community needs assessment of interprofessional palliative care educational needs in Washington state. We approached the needs assessment through a cross-sectional descriptive design by using mixed-method inquiry. Each phase incorporated a variety of settings and subjects. The assessment incorporated multiple phases with diverse methodological approaches: a preparatory phase-identifying key informants; Phase I-key informant interviews; Phase II-survey; and Phase III-steering committee endorsement. The multiple phases of the needs assessment helped create a conceptual framework for the Palliative Care Training Center and developed an interprofessional palliative care curriculum. The input from key informants at multiple phases also allowed us to define priority needs and to refine an interprofessional palliative care curriculum. This curriculum will provide an interprofessional palliative care educational program that crosses disciplinary boundaries to integrate knowledge that is beneficial for all palliative care clinicians. The input from a range of palliative care clinicians and professionals at every phase of the needs assessment was critical for creating an interprofessional palliative care curriculum.
Nutrient inputs from the watershed and coastal eutrophication in the Florida Keys
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaPointe, B.E.; Clark, M.W.
1992-12-01
Widespread use of septic tanks in the Florida Keys increases the nutrient concentrations of limestone ground waters that discharge into shallow nearshore waters, resulting in coastal eutrophication. This study characterizes watershed nutrient inputs, transformations, and effects along a land-sea gradient stratified into four ecosystems that occur with increasing distance from land: manmade canal systems, seagrass meadows, patch reefs, and offshore bank reefs. Soluble reactive phosphorus (SRP), the primary limiting nutrient, was significantly elevated in canal systems, while dissolved inorganic nitrogen (DIN; NH₄⁺ and NO₃⁻)
Using a Personal Device to Strengthen Password Authentication from an Untrusted Computer
NASA Astrophysics Data System (ADS)
Mannan, Mohammad; van Oorschot, P. C.
Keylogging and phishing attacks can extract user identity and sensitive account information for unauthorized access to users' financial accounts. Most existing or proposed solutions are vulnerable to session hijacking attacks. We propose a simple approach to counter these attacks, which cryptographically separates a user's long-term secret input from (typically untrusted) client PCs; a client PC performs most computations but has access only to temporary secrets. The user's long-term secret (typically short and low-entropy) is input through an independent personal trusted device such as a cellphone. The personal device provides a user's long-term secrets to a client PC only after encrypting the secrets using a pre-installed, "correct" public key of a remote service (the intended recipient of the secrets). The proposed protocol (
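For illustration only, a minimal sketch of the device-side step described above: the personal device encrypts the user's long-term secret under the service's pre-installed public key before anything reaches the untrusted client PC. It uses RSA-OAEP from the Python cryptography package; the function name, demo key generation, and padding choice are assumptions for the example, not the authors' protocol.

```python
# Device-side sealing of the long-term secret under the service's public key.
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding, rsa

def encrypt_long_term_secret(secret: bytes, service_public_key_pem: bytes) -> bytes:
    """Run on the personal trusted device: seal the secret for the remote service."""
    public_key = serialization.load_pem_public_key(service_public_key_pem)
    return public_key.encrypt(
        secret,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )

# Demo only: generate a stand-in service key pair so the sketch runs end to end.
service_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
pem = service_key.public_key().public_bytes(
    serialization.Encoding.PEM, serialization.PublicFormat.SubjectPublicKeyInfo)
blob = encrypt_long_term_secret(b"correct horse battery staple", pem)
print(len(blob))   # the untrusted client PC only ever sees this ciphertext
```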
CALiPER Exploratory Study: Accounting for Uncertainty in Lumen Measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bergman, Rolf; Paget, Maria L.; Richman, Eric E.
2011-03-31
With a well-defined and shared understanding of uncertainty in lumen measurements, testing laboratories can better evaluate their processes, contributing to greater consistency and credibility of lighting testing, a key component of the U.S. Department of Energy (DOE) Commercially Available LED Product Evaluation and Reporting (CALiPER) program. Reliable lighting testing is a crucial underlying factor contributing toward the success of many energy-efficient lighting efforts, such as the DOE GATEWAY demonstrations, Lighting Facts Label, ENERGY STAR® energy efficient lighting programs, and many others. Uncertainty in measurements is inherent to all testing methodologies, including photometric and other lighting-related testing. Uncertainty exists for all equipment, processes, and systems of measurement in individual as well as combined ways. A major issue with testing and the resulting accuracy of the tests is the uncertainty of the complete process. Individual equipment uncertainties are typically identified, but their relative value in practice and their combined value with other equipment and processes in the same test are elusive concepts, particularly for complex types of testing such as photometry. The total combined uncertainty of a measurement result is important for repeatable and comparative measurements for light emitting diode (LED) products in comparison with other technologies as well as competing products. This study provides a detailed and step-by-step method for determining uncertainty in lumen measurements, working closely with related standards efforts and key industry experts. This report uses the structure proposed in the Guide to the Expression of Uncertainty in Measurement (GUM) for evaluating and expressing uncertainty in measurements. The steps of the procedure are described and a spreadsheet format adapted for integrating sphere and goniophotometric uncertainty measurements is provided for entering parameters, ordering the information, calculating intermediate values and, finally, obtaining expanded uncertainties. Using this basis and examining each step of the photometric measurement and calibration methods, mathematical uncertainty models are developed. Determination of estimated values of input variables is discussed. Guidance is provided for the evaluation of the standard uncertainties of each input estimate, covariances associated with input estimates and the calculation of the result measurements. With this basis, the combined uncertainty of the measurement results and, finally, the expanded uncertainty can be determined.
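A minimal numerical sketch, under assumed component values, of the GUM-style combination the report describes: independent standard uncertainties are combined in quadrature and multiplied by a coverage factor (k = 2) to obtain the expanded uncertainty.

```python
# Combine assumed, independent relative standard uncertainties in quadrature.
import math

components = {
    "detector calibration": 0.008,          # relative standard uncertainties (assumed)
    "sphere spatial non-uniformity": 0.005,
    "stray light": 0.003,
    "electrical power measurement": 0.002,
}

combined = math.sqrt(sum(u ** 2 for u in components.values()))
expanded = 2.0 * combined   # coverage factor k = 2 (~95% confidence)
print(f"combined standard uncertainty: {combined:.4f} (relative)")
print(f"expanded uncertainty (k=2):    {expanded:.4f} (relative)")
```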
Performance and Usability of Various Robotic Arm Control Modes from Human Force Signals
Mick, Sébastien; Cattaert, Daniel; Paclet, Florent; Oudeyer, Pierre-Yves; de Rugy, Aymar
2017-01-01
Elaborating an efficient and usable mapping between input commands and output movements is still a key challenge for the design of robotic arm prostheses. In order to address this issue, we present and compare three different control modes, assessing them in terms of performance as well as general usability. Using an isometric force transducer as the command device, these modes convert the force input signal into either a position or a velocity vector, whose magnitude is linearly or quadratically related to the force input magnitude. With the robotic arm from the open source 3D-printed Poppy Humanoid platform simulating a mobile prosthesis, an experiment was carried out with eighteen able-bodied subjects performing a 3-D target-reaching task using each of the three modes. The subjects were given questionnaires to evaluate the quality of their experience with each mode, providing an assessment of their global usability in the context of the task. According to performance metrics and questionnaire results, the velocity control modes were found to perform better than the position control mode in terms of accuracy and quality of control as well as user satisfaction and comfort. Subjects also seemed to favor quadratic velocity control over linear (proportional) velocity control, even though these two modes were not clearly distinguishable from one another in terms of performance and usability. These results highlight the need to take into account user experience as one of the key criteria for the design of control modes intended to operate limb prostheses. PMID:29118699
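A hedged sketch of the three command mappings compared in the study, force to position, force to velocity (linear), and force to velocity (quadratic in force magnitude); the gains and example input are illustrative assumptions, not values from the experiment.

```python
# Three illustrative force-to-command mappings for an isometric force input.
import numpy as np

K_POS, K_LIN, K_QUAD = 0.02, 0.05, 0.08   # assumed gains

def position_mode(force: np.ndarray) -> np.ndarray:
    """Target position offset proportional to the force vector."""
    return K_POS * force

def linear_velocity_mode(force: np.ndarray) -> np.ndarray:
    """Velocity command whose magnitude is linear in force magnitude."""
    return K_LIN * force

def quadratic_velocity_mode(force: np.ndarray) -> np.ndarray:
    """Velocity command whose magnitude is quadratic in force magnitude."""
    magnitude = np.linalg.norm(force)
    if magnitude == 0.0:
        return np.zeros_like(force)
    return K_QUAD * magnitude * force   # direction preserved, |v| ~ |f|^2

f = np.array([1.0, 0.5, 0.0])           # example force reading (arbitrary units)
print(position_mode(f), linear_velocity_mode(f), quadratic_velocity_mode(f))
```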
NASA Astrophysics Data System (ADS)
Xu, R.; Tian, H.; Pan, S.; Yang, J.; Lu, C.; Zhang, B.
2016-12-01
Human activities have caused significant perturbations of the nitrogen (N) cycle, resulting in about a 21% increase of atmospheric N2O concentration since the pre-industrial era. This large increase is mainly caused by intensive agricultural activities, including the application of nitrogen fertilizer and the expansion of leguminous crops. Substantial efforts have been made to quantify global and regional N2O emissions from agricultural soils in the last several decades using a wide variety of approaches, such as ground-based observations, atmospheric inversion, and process-based models. However, large uncertainties exist in those estimates as well as in the methods themselves. In this study, we used a coupled biogeochemical model (DLEM) to estimate the magnitude and the spatial and temporal patterns of N2O emissions from global croplands in the past five decades (1961-2012). To estimate uncertainties associated with input data and model parameters, we have implemented a number of simulation experiments with DLEM, accounting for key parameter values that affect calculation of N2O fluxes (i.e., maximum nitrification and denitrification rates, N fixation rate, and the adsorption coefficient for soil ammonium and nitrate), different sets of input data including climate, land management practices (i.e., nitrogen fertilizer types, application rates and timings, with/without irrigation), N deposition, and land use and land cover change. This work provides a robust estimate of global N2O emissions from agricultural soils as well as identifies key gaps and limitations in the existing model and data that need to be investigated in the future.
Kupferschmidt, David A; Lovinger, David M
2015-01-01
Cortical inputs to the dorsolateral striatum (DLS) are dynamically regulated during skill learning and habit formation, and are dysregulated in disorders characterized by impaired action control. Therefore, a mechanistic investigation of the processes regulating corticostriatal transmission is key to understanding DLS-associated circuit function, behaviour and pathology. Presynaptic GABAB and group II metabotropic glutamate (mGlu2/3) receptors exert marked inhibitory control over corticostriatal glutamate release in the DLS, yet the signalling pathways through which they do so are unclear. We developed a novel approach using the genetically encoded calcium (Ca2+) indicator GCaMP6 to assess presynaptic Ca2+ in corticostriatal projections to the DLS. Using simultaneous photometric presynaptic Ca2+ and striatal field potential recordings, we report that relative to P/Q-type Ca2+ channels, N-type channels preferentially contributed to evoked presynaptic Ca2+ influx in motor cortex projections to, and excitatory transmission in, the DLS. Activation of GABAB or mGlu2/3 receptors inhibited both evoked presynaptic Ca2+ transients and striatal field potentials. mGlu2/3 receptor-mediated depression did not require functional N-type Ca2+ channels, but was attenuated by blockade of P/Q-type channels. These findings reveal presynaptic mechanisms of inhibitory modulation of corticostriatal function that probably contribute to the selection and shaping of behavioural repertoires. Key points Plastic changes at cortical inputs to the dorsolateral striatum (DLS) underlie skill learning and habit formation, so characterizing the mechanisms by which these inputs are regulated is important for understanding the neural basis of action control. We developed a novel approach using the genetically encoded calcium (Ca2+) indicator GCaMP6 and brain slice photometry to assess evoked presynaptic Ca2+ transients in cortical inputs to the DLS and study their regulation by GABAB and mGlu2/3 receptors. GABAB and mGlu2/3 receptor activation caused clear reductions in electrical stimulus-evoked presynaptic Ca2+ transients in corticostriatal inputs to the DLS. Functional P/Q-type voltage-gated Ca2+ channels were required for the normal inhibitory action of corticostriatal mGlu2/3 receptors. We provide direct evidence of presynaptic Ca2+ inhibition by G protein-coupled receptors at corticostriatal projections. PMID:25781000
Xarray: multi-dimensional data analysis in Python
NASA Astrophysics Data System (ADS)
Hoyer, Stephan; Hamman, Joe; Maussion, Fabien
2017-04-01
xarray (http://xarray.pydata.org) is an open source project and Python package that provides a toolkit and data structures for N-dimensional labeled arrays, which are the bread and butter of modern geoscientific data analysis. Key features of the package include label-based indexing and arithmetic, interoperability with the core scientific Python packages (e.g., pandas, NumPy, Matplotlib, Cartopy), out-of-core computation on datasets that don't fit into memory, a wide range of input/output options, and advanced multi-dimensional data manipulation tools such as group-by and resampling. In this contribution we will present the key features of the library and demonstrate its great potential for a wide range of applications, from (big-)data processing on super computers to data exploration in front of a classroom.
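A short usage sketch of the xarray features highlighted above (label-based indexing, broadcasting arithmetic, and resampling along a labeled time dimension); the dataset here is synthetic.

```python
# Synthetic labeled N-dimensional array and a few typical xarray operations.
import numpy as np
import pandas as pd
import xarray as xr

time = pd.date_range("2017-01-01", periods=365, freq="D")
temps = xr.DataArray(
    20 + 5 * np.random.randn(365, 3, 4),
    dims=("time", "lat", "lon"),
    coords={"time": time, "lat": [10.0, 20.0, 30.0],
            "lon": [0.0, 90.0, 180.0, 270.0]},
    name="temperature",
)

january = temps.sel(time="2017-01")               # label-based indexing
monthly_mean = temps.resample(time="1M").mean()   # resampling along time
anomaly = temps - temps.mean("time")              # broadcasting arithmetic
print(january.sizes, monthly_mean.sizes, float(anomaly.mean()))
```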
How to Display Hazards and other Scientific Data Using Google Maps
NASA Astrophysics Data System (ADS)
Venezky, D. Y.; Fee, J. M.
2007-12-01
The U.S. Geological Survey's (USGS) Volcano Hazard Program (VHP) is launching a map-based interface to display hazards information using the Google® Map API (Application Program Interface). Map-based interfaces provide a synoptic view of data, making patterns easier to detect and allowing users to quickly ascertain where hazards are in relation to major population and infrastructure centers. Several map-based interfaces are now simple to run on a web server, providing ideal platforms for sharing information with colleagues, emergency managers, and the public. There are three main steps to making data accessible on a map-based interface: formatting the input data, plotting the data on the map, and customizing the user interface. The presentation, "Creating Geospatial RSS and ATOM feeds for Map-based Interfaces" (Fee and Venezky, this session), reviews key features for map input data. Join us for this presentation on how to plot data in a geographic context and then format the display with images, custom markers, and links to external data. Examples will show how the VHP Volcano Status Map was created and how to plot a field trip with driving directions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Belencan, Helen L.; Guevara, Karen C.; Spears, Terrel J.
2013-07-01
The Department of Energy, Office of Environmental Management (DOE EM) program has invested in site specific advisory boards since 1994. These boards have served as a portal to the communities surrounding the DOE sites, provided a key avenue for public involvement, and have actively engaged in providing input and feedback that has informed clean up and priority decisions made by EM. Although the EM program has made considerable progress in completing its mission, work will continue for decades, including work at the Savannah River Site (SRS). It is reasonable to assume the advisory boards will continue in their role providing input and feedback to EM. The SRS Citizen Advisory Board (CAB) formed in 1994 and has issued 298 recommendations through September 2012. Although the effectiveness of the board is not measured by the number of recommendations issued, the recommendations themselves serve to illustrate the areas in which the CAB is particularly interested, and offer insight to the overall effectiveness of the CAB as a means for public participation in the EM decision making process. (authors)
Uncertainty Analysis for a Jet Flap Airfoil
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Cruz, Josue
2006-01-01
An analysis of variance (ANOVA) study was performed to quantify the potential uncertainties of lift and pitching moment coefficient calculations from a computational fluid dynamics code, relative to an experiment, for a jet flap airfoil configuration. Uncertainties due to a number of factors including grid density, angle of attack and jet flap blowing coefficient were examined. The ANOVA software produced a numerical model of the input coefficient data, as functions of the selected factors, to a user-specified order (linear, 2-factor interaction, quadratic, or cubic). Residuals between the model and actual data were also produced at each of the input conditions, and uncertainty confidence intervals (in the form of Least Significant Differences or LSD) for experimental, computational, and combined experimental/computational data sets were computed. The LSD bars indicate the smallest resolvable differences in the functional values (lift or pitching moment coefficient) attributable solely to changes in the independent variable, given just the input data points from the selected data sets. The software also provided a collection of diagnostics which evaluate the suitability of the input data set for use within the ANOVA process, and which examine the behavior of the resultant data, possibly suggesting transformations which should be applied to the data to reduce the LSD. The results illustrate some of the key features of, and results from, the uncertainty analysis studies, including the use of both numerical (continuous) and categorical (discrete) factors, the effects of the number and range of the input data points, and the effects of the number of factors considered simultaneously.
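For orientation, a small worked example (with made-up values, not the study's data) of the Least Significant Difference quantity referred to above, computed from a residual mean squared error and a replicate count:

```python
# Illustrative LSD calculation: smallest resolvable difference between
# factor-level means given residual error. All numbers are assumptions.
import math
from scipy import stats

mse = 4.0e-4        # mean squared error of the fitted response model (assumed)
n_per_level = 5     # replicates per factor level (assumed)
df_error = 20       # error degrees of freedom (assumed)
alpha = 0.05

t_crit = stats.t.ppf(1 - alpha / 2, df_error)
lsd = t_crit * math.sqrt(2 * mse / n_per_level)
print(f"LSD at {1 - alpha:.0%} confidence: {lsd:.4f} (in coefficient units)")
```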
NASA Astrophysics Data System (ADS)
Baine, G. C., II; Caffrey, J. M.
2016-02-01
The estuarine system at Grand Bay National Estuarine Research Reserve in Mississippi is a near-pristine wetland home to a diversity of flora and fauna. While seasonal fluctuations in water quality are well understood, less is known about the coupled dynamics of water quality and phytoplankton production. Light availability and nutrient levels are key factors regulating phytoplankton. Previous studies have revealed Grand Bay to be limited primarily by nitrogen rather than phosphorus or light. Since then, extended phosphate inputs from the neighboring Mississippi Phosphates fertilizer plant have occurred, provoking the question: will the phosphate inputs affect the growth and structure of the phytoplankton communities? This study is investigating the effects of inputs of an array of nutrients (ammonium, nitrate, silicon, and phosphate) on phytoplankton growth, community structure, and production over an annual cycle. Phytoplankton production is being monitored by accumulation of biomass (chlorophyll a concentration) and C14 incorporation. We are also evaluating changes in the phytoplankton community composition using Flowcam imaging over the course of the incubation. Currently, the summer months have shown nitrogen limitation as previously observed, with little difference between nitrate and ammonium additions. Flowcam images have revealed increases in ciliate abundance in all treatments. C14 experiments show significant decreases in efficiency for all treatments compared to the initial condition; however, there is no significant variation among treatments. The results of this study will provide a strong foundation for understanding the nature of phytoplankton response to various nutrient inputs in Grand Bay.
Experimental investigation of nonlinear characteristics of a smart fluid damper
NASA Astrophysics Data System (ADS)
Rahman, Mahmudur; Ong, Zhi Chao; Chong, Wen Tong; Julai, Sabariah; Ahamed, Raju
2018-05-01
Smart fluids, a class of smart materials, are used to form controllable dampers in vibration control applications. The magnetorheological (MR) fluid damper is a well-known smart fluid damper reputed to provide a high damping force with low power input. However, the force/velocity response of the MR damper is significantly nonlinear, and proper characterization is required for optimal implementation in structural vibration control. In this study, an experimental investigation is carried out to test the damping characteristics of an MR damper. Dynamic testing is performed with a long-stroke MR damper, model no. RD-80410-1 from Lord Corporation, on a universal testing machine (UTM). The force responses of the MR damper are measured under different stroke lengths, velocities, and current inputs, and their performance is analyzed. This study will play a key role in implementing MR dampers in many structural vibration control applications.
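As background only, a minimal sketch of a simple Bingham-type force model that is often used to describe MR damper force/velocity nonlinearity; it is not the model or the parameter values identified in this study.

```python
# Bingham-type MR damper sketch: viscous term plus a current-dependent
# yield (friction) term. All parameter values are illustrative assumptions.
import numpy as np

def bingham_force(velocity: np.ndarray, current_a: float,
                  c0: float = 800.0, fy_per_amp: float = 400.0,
                  f_offset: float = 50.0) -> np.ndarray:
    """Damper force (N) for piston velocity (m/s) and coil current (A)."""
    yield_force = f_offset + fy_per_amp * current_a
    return c0 * velocity + yield_force * np.sign(velocity)

v = np.linspace(-0.2, 0.2, 5)
print(bingham_force(v, current_a=0.5))
```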
Bengtson Nash, Susan; Rintoul, Stephen R; Kawaguchi, So; Staniland, Iain; van den Hoff, John; Tierney, Megan; Bossi, Rossana
2010-09-01
In order to investigate the extent to which Perfluorinated Contaminants (PFCs) have permeated the Southern Ocean food web to date, a range of Antarctic, sub-Antarctic and Antarctic-migratory biota were analysed for key ionic PFCs. Based upon the geographical distribution pattern and ecology of biota with detectable vs. non-detectable PFC burdens, an evaluation of the potential contributory roles of alternative system input pathways is made. Our analytical findings, together with previous reports, reveal only the occasional occurrence of PFCs in migratory biota and vertebrate predators with foraging ranges extending into or north of the Antarctic Circumpolar Current (ACC). Geographical contamination patterns observed correspond most strongly with those expected from delivery via hydrospheric transport as governed by the unique oceanographic features of the Southern Ocean. We suggest that hydrospheric transport will form a slow, but primary, input pathway of PFCs to the Antarctic region. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
Shewchuk, Richard; O'Connor, Stephen J
2002-01-01
This article describes a process that can be used for eliciting and systematically organizing perceptions held by key stakeholders. An example using a limited sample of older Medicare recipients is developed to illustrate how this approach can be used. Internally, a nominal group technique (NGT) meeting was conducted to identify an array of health care issues that were perceived as important by this group. These perceptions were then used as stimuli to develop an unforced card sort task. Data from the card sorts were analyzed using multidimensional scaling and hierarchical cluster analysis to demonstrate how qualitative input of participants can be organized. The results of these analyses are described to illustrate an example of an interpretive framework that might be used when seeking input from relevant constituents. Suggestions for how this process might be extended to health care planning/marketing efforts are provided.
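A hedged sketch of the analysis sequence the abstract describes, a dissimilarity matrix from card sorts analysed with multidimensional scaling and hierarchical clustering; the data, dimensionality, and cluster count are illustrative assumptions.

```python
# MDS + hierarchical clustering on a synthetic card-sort dissimilarity matrix.
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
n_items = 12
# Symmetric dissimilarity matrix (how often items landed in different piles);
# here it is random, for illustration only.
d = rng.random((n_items, n_items))
dissimilarity = (d + d.T) / 2
np.fill_diagonal(dissimilarity, 0.0)

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissimilarity)

tree = linkage(squareform(dissimilarity), method="average")
clusters = fcluster(tree, t=3, criterion="maxclust")
print(coords.shape, clusters)
```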
Input-Based Grammar Pedagogy: A Comparison of Two Possibilities
ERIC Educational Resources Information Center
Marsden, Emma
2005-01-01
This article presents arguments for using listening and reading activities as an option for techniques in grammar pedagogy. It describes two possible approaches: Processing Instruction (PI) and Enriched Input (EI), and examples of their key features are included in the appendices. The article goes on to report on a classroom based quasi-experiment…
NASA Technical Reports Server (NTRS)
Dash, S. M.; Pergament, H. S.
1978-01-01
The basic code structure is discussed, including the overall program flow and a brief description of all subroutines. Instructions on the preparation of input data, definitions of key FORTRAN variables, sample input and output, and a complete listing of the code are presented.
NASA Astrophysics Data System (ADS)
Stackhouse, P. W., Jr.; Cox, S. J.; Mikovitz, J. C.; Zhang, T.; Gupta, S. K.
2016-12-01
The NASA/GEWEX Surface Radiation Budget (SRB) project produces, validates and analyzes shortwave and longwave surface and top of atmosphere radiative fluxes for the 1983-near present time period. The current release 3.0/3.1 consists of 1x1 degree radiative fluxes (available at gewex-srb.larc.nasa.gov) and is produced using the International Satellite Cloud Climatology Project (ISCCP) DX product for pixel level radiance and cloud information. This ISCCP DX product is subsampled to 30 km. ISCCP is currently recalibrating and reprocessing their entire data series, to be released as the H product series, with its highest resolution at 10km pixel resolution. The nine-fold increase in number of pixels will allow SRB to produce a higher resolution gridded product (e.g. 0.5 degree or higher), as well as the production of pixel-level fluxes. Other key input improvements include a detailed aerosol history using the Max Planck Institute Aerosol Climatology (MAC), temperature and moisture profiles from HIRS, and new topography, surface type, and snow/ice maps. Here we present results for the improved GEWEX Shortwave and Longwave algorithm (GSW and GLW) with new ISCCP data (for at least 5 years, 2005-2009), various other improved input data sets and incorporation of many additional internal SRB model improvements. We assess the radiative fluxes from new SRB products and contrast these at various resolutions. All these fluxes are compared to both surface measurements and to CERES SYN1Deg and EBAF data products for assessment of the effect of improvements. The SRB data produced will be released as part of the Release 4.0 Integrated Product that shares key input and output quantities with other GEWEX global products providing estimates of the Earth's global water and energy cycle (i.e., ISCCP, SeaFlux, LandFlux, NVAP, etc.).
Distinguishing Representations as Origin and Representations as Input: Roles for Individual Neurons
Edwards, Jonathan C. W.
2016-01-01
It is widely perceived that there is a problem in giving a naturalistic account of mental representation that deals adequately with the issue of meaning, interpretation, or significance (semantic content). It is suggested here that this problem may arise partly from the conflation of two vernacular senses of representation: representation-as-origin and representation-as-input. The flash of a neon sign may in one sense represent a popular drink, but to function as a representation it must provide an input to a ‘consumer’ in the street. The arguments presented draw on two principles – the neuron doctrine and the need for a venue for ‘presentation’ or ‘reception’ of a representation at a specified site, consistent with the locality principle. It is also argued that domains of representation cannot be defined by signal traffic, since they can be expected to include ‘null’ elements based on non-firing cells. In this analysis, mental representations-as-origin are distributed patterns of cell firing. Each firing cell is given semantic value in its own right – some form of atomic propositional significance – since different axonal branches may contribute to integration with different populations of signals at different downstream sites. Representations-as-input are patterns of local co-arrival of signals in the form of synaptic potentials in dendrites. Meaning then draws on the relationships between active and null inputs, forming ‘scenarios’ comprising a molecular combination of ‘premises’ from which a new output with atomic propositional significance is generated. In both types of representation, meaning, interpretation or significance pivots on events in an individual cell. (This analysis only applies to ‘occurrent’ representations based on current neural activity.) The concept of representations-as-input emphasizes the need for an internal ‘consumer’ of a representation and the dependence of meaning on the co-relationships involved in an input interaction between signals and consumer. The acceptance of this necessity provides a basis for resolving the problem that representations appear both as distributed (representation-as-origin) and as local (representation-as-input). The key implications are that representations in the brain are massively multiple both in series and in parallel, and that individual cells play specific semantic roles. These roles are discussed in relation to traditional concepts of ‘gnostic’ cell types. PMID:27746760
Corcoran, Jennifer M.; Knight, Joseph F.; Gallant, Alisa L.
2013-01-01
Wetland mapping at the landscape scale using remotely sensed data requires both affordable data and an efficient, accurate classification method. Random forest classification offers several advantages over traditional land cover classification techniques, including a bootstrapping technique to generate robust estimations of outliers in the training data, as well as the capability of measuring classification confidence. Though the random forest classifier can generate complex decision trees with a multitude of input data and still not run a high risk of overfitting, there is a great need to reduce computational and operational costs by including only key input data sets without sacrificing a significant level of accuracy. Our main questions for this study site in Northern Minnesota were: (1) how does classification accuracy and confidence of mapping wetlands compare using different remote sensing platforms and sets of input data; (2) what are the key input variables for accurate differentiation of upland, water, and wetlands, including wetland type; and (3) which datasets and seasonal imagery yield the best accuracy for wetland classification. Our results show the key input variables include terrain (elevation and curvature) and soils descriptors (hydric), along with an assortment of remotely sensed data collected in the spring (satellite visible, near infrared, and thermal bands; satellite normalized vegetation index and Tasseled Cap greenness and wetness; and horizontal-horizontal (HH) and horizontal-vertical (HV) polarization using L-band satellite radar). We undertook this exploratory analysis to inform decisions by natural resource managers charged with monitoring wetland ecosystems and to aid in designing a system for consistent operational mapping of wetlands across landscapes similar to those found in Northern Minnesota.
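A hedged sketch, with placeholder data and feature names, of a random-forest wetland classifier in the spirit of the study: terrain, soils, and multi-season spectral/radar predictors feeding scikit-learn's RandomForestClassifier, with out-of-bag accuracy as a confidence check and per-variable importances.

```python
# Random-forest classification sketch with placeholder predictors and labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

feature_names = ["elevation", "curvature", "hydric_soil", "spring_nir",
                 "spring_thermal", "ndvi", "tc_wetness", "radar_hh", "radar_hv"]
rng = np.random.default_rng(42)
X = rng.random((500, len(feature_names)))      # placeholder predictor values
y = rng.integers(0, 3, size=500)               # 0=upland, 1=water, 2=wetland

clf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
clf.fit(X, y)

print(f"out-of-bag accuracy: {clf.oob_score_:.2f}")
for name, importance in sorted(zip(feature_names, clf.feature_importances_),
                               key=lambda p: -p[1]):
    print(f"{name:>14s}: {importance:.3f}")
```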
Automated Rock Identification for Future Mars Exploration Missions
NASA Technical Reports Server (NTRS)
Gulick, V. C.; Morris, R. L.; Gazis, P.; Bishop, J. L.; Alena, R.; Hart, S. D.; Horton, A.
2003-01-01
A key task for human or robotic explorers on the surface of Mars is choosing which particular rock or mineral samples should be selected for more intensive study. The usual challenges of such a task are compounded by the lack of sensory input available to a suited astronaut or the limited downlink bandwidth available to a rover. Additional challenges facing a human mission include limited surface time and the similarities in appearance of important minerals (e.g. carbonates, silicates, salts). Yet the choice of which sample to collect is critical. To address this challenge we are developing science analysis algorithms to interface with a Geologist's Field Assistant (GFA) device that will allow robotic or human remote explorers to better sense and explore their surroundings during limited surface excursions. We aim for our algorithms to interpret spectral and imaging data obtained by various sensors. The algorithms, for example, will identify key minerals, rocks, and sediments from mid-IR, Raman, and visible/near-IR spectra as well as from high resolution and microscopic images to help interpret data and to provide high-level advice to the remote explorer. A top-level system will consider multiple inputs from raw sensor data output by imagers and spectrometers (visible/near-IR, mid-IR, and Raman) as well as human opinion to identify rock and mineral samples.
The effect of long-term changes in plant inputs on soil carbon stocks
NASA Astrophysics Data System (ADS)
Georgiou, K.; Li, Z.; Torn, M. S.
2017-12-01
Soil organic carbon (SOC) is the largest actively-cycling terrestrial reservoir of C and an integral component of thriving natural and managed ecosystems. C input interventions (e.g., litter removal or organic amendments) are common in managed landscapes and present an important decision for maintaining healthy soils in sustainable agriculture and forestry. Furthermore, climate and land-cover change can also affect the amount of plant C inputs that enter the soil through changes in plant productivity, allocation, and rooting depth. Yet, the processes that dictate the response of SOC to such changes in C inputs are poorly understood and inadequately represented in predictive models. Long-term litter manipulations are an invaluable resource for exploring key controls of SOC storage and validating model representations. Here we explore the response of SOC to long-term changes in plant C inputs across a range of biomes and soil types. We synthesize and analyze data from long-term litter manipulation field experiments, and focus our meta-analysis on changes to total SOC stocks, microbial biomass carbon, and mineral-associated (`protected') carbon pools and explore the relative contribution of above- versus below-ground C inputs. Our cross-site data comparison reveals that divergent SOC responses are observed between forest sites, particularly for treatments that increase C inputs to the soil. We explore trends among key variables (e.g., microbial biomass to SOC ratios) that inform soil C model representations. The assembled dataset is an important benchmark for evaluating process-based hypotheses and validating divergent model formulations.
Challenges of Moving IPG into Production
NASA Technical Reports Server (NTRS)
Schulbach, Cathy
2004-01-01
Over the past 5-6 years, NASA has been developing the Information Power Grid and has a persistent testbed currently based on GT2.4.2. This presentation will begin with an overview of IPG status and services, discuss key milestones in IPG development, and present early as well as expected applications. The presentation will discuss some of the issues encountered in developing a grid including the tension between providing centralized and distributed computing. These issues also affect how the grid is moved into full production. Finally, the presentation will provide current plans for moving IPG into full production, including gaining broad user input, developing acceptance criteria from the production operations group, planning upgrades, and training users.
Surgical center: challenges and strategies for nurses in managerial activities.
Martins, Fabiana Zerbieri; Dall'Agnoll, Clarice Maria
2017-02-23
Analyze the challenges and strategies of nurses performing managerial activities in a surgical center. Exploratory, descriptive study with a qualitative approach, involving six nurses by means of the Focus Group Technique, between April and August 2013. Data were submitted to thematic content analysis. The main challenges noted were deficiency of material resources, communication noise, adequacy of personnel downsizing, and relationships with the multidisciplinary team. Key strategies include construction of co-management spaces to promote integration among professionals, conflict resolution and exchange of knowledge. Managerial activities involve the promotion of dialogic moments to coordinate the different processes in the surgical center to provide inputs to expand safety and quality of services provided.
A microprocessor-based control system for the Vienna PDS microdensitometer
NASA Technical Reports Server (NTRS)
Jenkner, H.; Stoll, M.; Hron, J.
1984-01-01
The Motorola Exorset 30 system, based on a Motorola 6809 microprocessor, which serves as the control processor for the microdensitometer, is presented. User communication and instrument control are implemented in this system; data transmission to a host computer is provided via standard interfaces. The Vienna PDS system (VIPS) software was developed in BASIC and M6809 assembler. It provides efficient user interaction via function keys and argument input in a menu-oriented environment. All parameters can be stored on, and retrieved from, minifloppy disks, making it possible to set up large scanning tasks. Extensive user information includes continuously updated status and coordinate displays, as well as a real-time graphic display during scanning.
Encrypting Digital Camera with Automatic Encryption Key Deletion
NASA Technical Reports Server (NTRS)
Oakley, Ernest C. (Inventor)
2007-01-01
A digital video camera includes an image sensor capable of producing a frame of video data representing an image viewed by the sensor, an image memory for storing video data such as previously recorded frame data in a video frame location of the image memory, a read circuit for fetching the previously recorded frame data, an encryption circuit having an encryption key input connected to receive the previously recorded frame data from the read circuit as an encryption key, an un-encrypted data input connected to receive the frame of video data from the image sensor and an encrypted data output port, and a write circuit for writing a frame of encrypted video data received from the encrypted data output port of the encryption circuit to the memory and overwriting the video frame location storing the previously recorded frame data.
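An illustrative software sketch (not the patented hardware circuit) of the idea described above: each new frame is encrypted with key material taken from the previously recorded frame, which is then overwritten. AES-GCM and the SHA-256 key derivation are assumptions made so the example runs with the Python cryptography package.

```python
# Encrypt the current frame under a key derived from the previous frame,
# then overwrite the previous frame so the old key material disappears.
import os
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_frame(new_frame: bytes, previous_frame: bytes) -> bytes:
    key = hashlib.sha256(previous_frame).digest()   # 256-bit key from prior frame
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, new_frame, None)

previous = os.urandom(1024)        # stand-in for the previously recorded frame
frame = os.urandom(1024)           # stand-in for the sensor's current frame
ciphertext = encrypt_frame(frame, previous)
previous = frame                   # overwrite: the old key material is discarded
print(len(ciphertext))
```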
NASA Astrophysics Data System (ADS)
Bulthuis, Kevin; Arnst, Maarten; Pattyn, Frank; Favier, Lionel
2017-04-01
Uncertainties in sea-level rise projections are mostly due to uncertainties in Antarctic ice-sheet predictions (IPCC AR5 report, 2013), because key parameters related to the current state of the Antarctic ice sheet (e.g. sub-ice-shelf melting) and future climate forcing are poorly constrained. Here, we propose to improve the predictions of Antarctic ice-sheet behaviour using new uncertainty quantification methods. As opposed to ensemble modelling (Bindschadler et al., 2013) which provides a rather limited view on input and output dispersion, new stochastic methods (Le Maître and Knio, 2010) can provide deeper insight into the impact of uncertainties on complex system behaviour. Such stochastic methods usually begin with deducing a probabilistic description of input parameter uncertainties from the available data. Then, the impact of these input parameter uncertainties on output quantities is assessed by estimating the probability distribution of the outputs by means of uncertainty propagation methods such as Monte Carlo methods or stochastic expansion methods. The use of such uncertainty propagation methods in glaciology may be computationally costly because of the high computational complexity of ice-sheet models. This challenge emphasises the importance of developing reliable and computationally efficient ice-sheet models such as the f.ETISh ice-sheet model (Pattyn, 2015), a new fast thermomechanical coupled ice sheet/ice shelf model capable of handling complex and critical processes such as the marine ice-sheet instability mechanism. Here, we apply these methods to investigate the role of uncertainties in sub-ice-shelf melting, calving rates and climate projections in assessing Antarctic contribution to sea-level rise for the next centuries using the f.ETISh model. We detail the methods and show results that provide nominal values and uncertainty bounds for future sea-level rise as a reflection of the impact of the input parameter uncertainties under consideration, as well as a ranking of the input parameter uncertainties in the order of the significance of their contribution to uncertainty in future sea-level rise. In addition, we discuss how limitations posed by the available information (poorly constrained data) pose challenges that motivate our current research.
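A minimal Monte Carlo propagation sketch of the kind of uncertainty quantification described above: uncertain inputs are sampled from assumed probability distributions, pushed through a model, and the spread of the output is summarized. The toy response function and all distributions are placeholders; the study itself uses the f.ETISh ice-sheet model.

```python
# Monte Carlo uncertainty propagation through a stand-in response function.
import numpy as np

rng = np.random.default_rng(1)
n_samples = 10_000

# Assumed input uncertainties (illustrative units and ranges)
melt_rate = rng.lognormal(mean=np.log(10.0), sigma=0.5, size=n_samples)  # m/yr
calving_factor = rng.uniform(0.5, 1.5, size=n_samples)
forcing_warming = rng.normal(3.0, 1.0, size=n_samples)                   # deg C

def toy_sea_level_contribution(melt, calving, warming):
    """Stand-in for an ice-sheet model run; returns sea-level rise in metres."""
    return 0.02 * melt ** 0.5 * calving + 0.05 * np.maximum(warming, 0.0)

slr = toy_sea_level_contribution(melt_rate, calving_factor, forcing_warming)
print(f"median: {np.median(slr):.2f} m, 5-95%: "
      f"{np.percentile(slr, 5):.2f}-{np.percentile(slr, 95):.2f} m")
```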
Noble gas as tracers for CO2 deep input in petroleum reservoirs
NASA Astrophysics Data System (ADS)
Pujol, Magali; Stuart, Finlay; Gilfillan, Stuart; Montel, François; Masini, Emmanuel
2016-04-01
The sub-salt hydrocarbon reservoirs in the deep offshore part of the Atlantic Ocean passive margins are a new key target for frontier oil and gas exploration. Type I source rocks locally rich in TOC (Total Organic Carbon), combined with an important secondary connected porosity of carbonate reservoirs overlain by an impermeable salt layer, give rise to reservoirs with high petroleum potential. However, some target structures have been found to be mainly filled with CO2-rich fluids. The δ13C of the CO2 is generally between -9 and -4 permil, compatible with a deep source (metamorphic or mantle). Understanding the origin of the CO2 and the relative timing of its input into reservoir layers in regard to the geodynamic context appears to be a key issue for CO2 risk evaluation. The inertness and ubiquity of noble gases in crustal fluids make them powerful tools to trace the origin and migration of mixed fluids (Ballentine and Burnard, 2002). The isotopic signature of He, Ne and Ar and the elemental pattern (He to Xe) of reservoir fluid from pressurized bottom-hole samples provide an insight into fluid source influences at each reservoir depth. Three main end-members can be mixed into reservoir fluids (e.g. Gilfillan et al., 2008): an atmospheric signature due to aquifer recharge, a radiogenic component from organic fluid ± metamorphic influence, and mantle input. Their relative fractionation provides insights into the nature of fluid transport (Burnard et al., 2012) and its relative migration timing. In the studied offshore passive-margin reservoirs, from both sides of the South Atlantic margin, a strong MORB-like magmatic CO2 influence is clear. Hence, CO2 charge must have occurred during or after lithospheric break-up. The CO2 charge history appears to be complex and in some cases requires several inputs to generate the observed noble gas pattern. Combining the knowledge obtained from noble gases (origin, relative timing, number of charges) with organic geochemical and thermodynamic understanding of the fluid, in the context of the geodynamic setting, helps us to unravel the complex fluid history of these deep environments. Ballentine, C.J. and Burnard, P.G. (2002). Rev. Mineral. Geochem., vol. 47, pp. 481-538. Burnard, P. et al. (2012). EPSL, vol. 341, pp. 68-78. Gilfillan, S.M.V. et al. (2008). GCA, vol. 72, pp. 1174-1198.
MRAS: A Close but Understudied Member of the RAS Family.
Young, Lucy C; Rodriguez-Viciana, Pablo
2018-01-08
MRAS is the closest relative to the classical RAS oncoproteins and shares most regulatory and effector interactions. However, it also has unique functions, including its ability to function as a phosphatase regulatory subunit when in complex with SHOC2 and protein phosphatase 1 (PP1). This phosphatase complex regulates a crucial step in the activation cycle of RAF kinases and provides a key coordinate input required for efficient ERK pathway activation and transformation by RAS. MRAS mutations rarely occur in cancer but deregulated expression may play a role in tumorigenesis in some settings. Activating mutations in MRAS (as well as SHOC2 and PP1) do occur in the RASopathy Noonan syndrome, underscoring a key role for MRAS within the RAS-ERK pathway. MRAS also has unique roles in cell migration and differentiation and has properties consistent with a key role in the regulation of cell polarity. Further investigations should shed light on what remains a relatively understudied RAS family member. Copyright © 2018 Cold Spring Harbor Laboratory Press; all rights reserved.
Estimating pesticide runoff in small streams.
Schriever, Carola A; von der Ohe, Peter C; Liess, Matthias
2007-08-01
Surface runoff is one of the most important pathways for pesticides to enter surface waters. Mathematical models are employed to characterize its spatio-temporal variability within landscapes, but they must be simple owing to the limited availability and low resolution of data at this scale. This study aimed to validate a simplified spatially-explicit model that is developed for the regional scale to calculate the runoff potential (RP). The RP is a generic indicator of the magnitude of pesticide inputs into streams via runoff. The underlying runoff model considers key environmental factors affecting runoff (precipitation, topography, land use, and soil characteristics), but predicts losses of a generic substance instead of any one pesticide. We predicted and evaluated RP for 20 small streams. RP input data were extracted from governmental databases. Pesticide measurements from a triennial study were used for validation. Measured pesticide concentrations were standardized by the applied mass per catchment and the water solubility of the relevant compounds. The maximum standardized concentration per site and year (runoff loss, R(Loss)) provided a generalized measure of observed pesticide inputs into the streams. Average RP explained 75% (p<0.001) of the variance in R(Loss). Our results imply that the generic indicator can give an adequate estimate of runoff inputs into small streams, wherever data of similar resolution are available. Therefore, we suggest RP for a first quick and cost-effective location of potential runoff hot spots at the landscape level.
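A hedged sketch of the validation step the abstract describes: measured peak concentrations are standardized by applied mass and water solubility to give a runoff loss, which is then regressed against the predicted runoff potential (RP). The exact standardization formula and all data values below are assumptions for illustration.

```python
# Standardize measured peaks and regress against predicted runoff potential.
import numpy as np
from scipy import stats

# Placeholder data for a handful of stream sites
peak_conc_ug_l = np.array([0.8, 2.5, 0.3, 5.1, 1.2])            # measured peaks
applied_mass_kg = np.array([120.0, 300.0, 45.0, 410.0, 160.0])  # per catchment
solubility_mg_l = np.array([90.0, 33.0, 500.0, 10.0, 60.0])
rp = np.array([0.15, 0.42, 0.08, 0.55, 0.22])                   # predicted RP

r_loss = peak_conc_ug_l / (applied_mass_kg * solubility_mg_l)   # assumed form
result = stats.linregress(rp, np.log10(r_loss))
print(f"variance explained: {result.rvalue ** 2:.2f}, p = {result.pvalue:.3g}")
```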
The human motor neuron pools receive a dominant slow‐varying common synaptic input
Negro, Francesco; Yavuz, Utku Şükrü
2016-01-01
Key points: Motor neurons in a pool receive both common and independent synaptic inputs, although the proportion and role of their common synaptic input is debated. Classic correlation techniques between motor unit spike trains do not measure the absolute proportion of common input and have limitations as a result of the non-linearity of motor neurons. We propose a method that for the first time allows an accurate quantification of the absolute proportion of low frequency common synaptic input (<5 Hz) to motor neurons in humans. We applied the proposed method to three human muscles and determined experimentally that they receive a similar large amount (>60%) of common input, irrespective of their different functional and control properties. These results increase our knowledge about the role of common and independent input to motor neurons in force control. Abstract: Motor neurons receive both common and independent synaptic inputs. This observation is classically based on the presence of a significant correlation between pairs of motor unit spike trains. The functional significance of different relative proportions of common input across muscles, individuals and conditions is still debated. One of the limitations in our understanding of correlated input to motor neurons is that it has not been possible so far to quantify the absolute proportion of common input with respect to the total synaptic input received by the motor neurons. Indeed, correlation measures of pairs of output spike trains only allow for relative comparisons. In the present study, we report for the first time an approach for measuring the proportion of common input in the low frequency bandwidth (<5 Hz) to a motor neuron pool in humans. This estimate is based on a phenomenological model and the theoretical fitting of the experimental values of coherence between the permutations of groups of motor unit spike trains. We demonstrate the validity of this theoretical estimate with several simulations. Moreover, we applied this method to three human muscles: the abductor digiti minimi, tibialis anterior and vastus medialis. Despite these muscles having different functional roles and control properties, as confirmed by the results of the present study, we estimate that their motor pools receive a similar and large (>60%) proportion of common low frequency oscillations with respect to their total synaptic input. These results suggest that the central nervous system provides a large amount of common input to motor neuron pools, in a similar way to that for muscles with different functional and control properties. PMID:27151459
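A rough illustration (with synthetic spike trains, not the study's data or its model fitting) of the coherence measurement underlying the method described above: composite spike trains built from two disjoint groups of motor units are compared in the low-frequency band.

```python
# Coherence between two composite spike trains sharing a slow common drive.
import numpy as np
from scipy.signal import coherence

fs = 1000                      # Hz
t = np.arange(0, 60, 1 / fs)
common_drive = 0.5 * (1 + np.sin(2 * np.pi * 1.0 * t))   # slow shared input

def spike_train(gain=0.02):
    # Bernoulli spiking whose rate follows the common drive plus private noise.
    rate = gain * common_drive + 0.01 * np.random.rand(t.size)
    return (np.random.rand(t.size) < rate).astype(float)

group_a = sum(spike_train() for _ in range(10))   # composite train, group A
group_b = sum(spike_train() for _ in range(10))   # composite train, group B

f, coh = coherence(group_a, group_b, fs=fs, nperseg=4096)
low = f < 5
print(f"mean coherence below 5 Hz: {coh[low].mean():.2f}")
```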
The role of outputs and outcomes in purchaser accountability: reflecting on New Zealand experiences.
Cumming, J; Scott, C D
1998-10-01
Recent reforms in a number of countries' health systems have led to the separation of funder, purchaser and provider roles and the strengthening of funders' and purchasers' positions relative to providers. One of the aims of such reforms is to improve accountability. This paper reports on experiences in New Zealand where, in addition to improving the accountability of providers, purchaser accountability has also been a key policy issue. Attempts have been made in New Zealand to develop a funder-purchaser accountability framework based on a mix of outcomes, outputs and inputs. This paper discusses the roles that each might play in contracts and accountability relationships between funders and purchasers. The paper concludes that holding purchasers accountable for outcomes is likely to prove difficult and controversial, because of problems of attribution and because New Zealand funders in recent years have played an important role in determining the priority outputs and inputs which must be purchased. The paper suggests that accountability is more appropriate at the output and process level, in addition to holding purchasers accountable for the ways in which they make decisions and undertake contracting roles. Holding purchasers accountable for purchasing outputs and processes, however, requires greater commitment on the part of the funder to setting priorities more clearly; specifying the range and level of outputs to be purchased and the terms of access to those services; and funding services to this level. The international attention currently being paid to the development of practice guidelines and priority criteria also suggests that holding purchasers accountable for a form of inputs may become an increasingly common practice in future. From 1 July 1998, New Zealand will introduce a priority criteria system for determining access to elective surgery; accountability is thus becoming focused on inputs in the form of patient characteristics. This approach will greatly assist in promoting accountability.
Exploring extended scope of practice in dietetics: A systems approach.
Ryan, Dominique; Pelly, Fiona; Purcell, Elizabeth
2017-09-01
The aim of this study was to explore health professionals' perceptions of an extended scope of a practice clinic, and develop a framework using a systems approach to facilitate extended scope models across various health settings. A qualitative investigation using semi-structured interviews with four health professionals involved in an extended scope dietitian-led gastroenterology clinic in a hospital in regional Queensland was conducted. A case study design was utilised to investigate interviewees' perceptions of the clinic. Participants were conveniently, purposively sampled. Transcript analysis involved a descriptive analytical approach. Interviewee responses were coded and categorised into themes, and investigator triangulation was used to ensure consistency between individual analyses. A secondary interpretative analysis was conducted where relationships between key themes were mapped to the Systems Engineering Initiative for Patient Safety work system model. Interviewees identified various factors as vital inputs to the work system. These were categorised into the four key elements: stakeholder support, resources, planning and the dietitian. Clinic outcomes were categorised into the impact on four key groups: patients, the dietitian, the multidisciplinary team and the health system. Mapping of the relationships between inputs and outcomes resulted in an implementation framework for extended scope of practice. Extended scope of practice in dietetics may provide positive outcomes for various stakeholders. However, further development of extended scope roles for dietitians requires increased advocacy and support from governments, professional bodies, training institutions and dietitians. We have developed an implementation framework which can be utilised by health professionals interested in embracing an extended scope model of care. © 2016 Dietitians Association of Australia.
Powers, Christina M; Mills, Karmann A; Morris, Stephanie A; Klaessig, Fred; Gaheen, Sharon; Lewinski, Nastassja
2015-01-01
Summary There is a critical opportunity in the field of nanoscience to compare and integrate information across diverse fields of study through informatics (i.e., nanoinformatics). This paper is one in a series of articles on the data curation process in nanoinformatics (nanocuration). Other articles in this series discuss key aspects of nanocuration (temporal metadata, data completeness, database integration), while the focus of this article is on the nanocuration workflow, or the process of identifying, inputting, and reviewing nanomaterial data in a data repository. In particular, the article discusses: 1) the rationale and importance of a defined workflow in nanocuration, 2) the influence of organizational goals or purpose on the workflow, 3) established workflow practices in other fields, 4) current workflow practices in nanocuration, 5) key challenges for workflows in emerging fields like nanomaterials, 6) examples to make these challenges more tangible, and 7) recommendations to address the identified challenges. Throughout the article, there is an emphasis on illustrating key concepts and current practices in the field. Data on current practices in the field are from a group of stakeholders active in nanocuration. In general, the development of workflows for nanocuration is nascent, with few individuals formally trained in data curation or utilizing available nanocuration resources (e.g., ISA-TAB-Nano). Additional emphasis on the potential benefits of cultivating nanomaterial data via nanocuration processes (e.g., capability to analyze data from across research groups) and providing nanocuration resources (e.g., training) will likely prove crucial for the wider application of nanocuration workflows in the scientific community. PMID:26425437
The Advanced Technology Microwave Sounder (ATMS): A New Operational Sensor Series
NASA Technical Reports Server (NTRS)
Kim, Edward; Lyu, Cheng-H Joseph; Leslie, R. Vince; Baker, Neal; Mo, Tsan; Sun, Ninghai; Bi, Li; Anderson, Mike; Landrum, Mike; DeAmici, Giovanni;
2012-01-01
ATMS is a new satellite microwave sounding sensor designed to provide operational weather agencies with atmospheric temperature and moisture profile information for global weather forecasting and climate applications. ATMS will continue the microwave sounding capabilities first provided by its predecessors, the Microwave Sounding Unit (MSU) and Advanced Microwave Sounding Unit (AMSU). The first ATMS was launched October 28, 2011 on board the Suomi National Polar-orbiting Partnership (S-NPP) satellite. Microwave soundings by themselves are the highest-impact input data used by Numerical Weather Prediction (NWP) models, and ATMS, when combined with the Cross-track Infrared Sounder (CrIS), forms the Cross-track Infrared and Microwave Sounding Suite (CrIMSS). The microwave soundings help meet NWP sounding requirements under cloudy sky conditions and provide key profile information near the surface.
Integrative Data Analysis of Multi-Platform Cancer Data with a Multimodal Deep Learning Approach.
Liang, Muxuan; Li, Zhizhong; Chen, Ting; Zeng, Jianyang
2015-01-01
Identification of cancer subtypes plays an important role in revealing useful insights into disease pathogenesis and advancing personalized therapy. The recent development of high-throughput sequencing technologies has enabled the rapid collection of multi-platform genomic data (e.g., gene expression, miRNA expression, and DNA methylation) for the same set of tumor samples. Although numerous integrative clustering approaches have been developed to analyze cancer data, few of them are particularly designed to exploit both deep intrinsic statistical properties of each input modality and complex cross-modality correlations among multi-platform input data. In this paper, we propose a new machine learning model, called multimodal deep belief network (DBN), to cluster cancer patients from multi-platform observation data. In our integrative clustering framework, relationships among inherent features of each single modality are first encoded into multiple layers of hidden variables, and then a joint latent model is employed to fuse common features derived from multiple input modalities. A practical learning algorithm, called contrastive divergence (CD), is applied to infer the parameters of our multimodal DBN model in an unsupervised manner. Tests on two available cancer datasets show that our integrative data analysis approach can effectively extract a unified representation of latent features to capture both intra- and cross-modality correlations, and identify meaningful disease subtypes from multi-platform cancer data. In addition, our approach can identify key genes and miRNAs that may play distinct roles in the pathogenesis of different cancer subtypes. Among those key miRNAs, we found that the expression level of miR-29a is highly correlated with survival time in ovarian cancer patients. These results indicate that our multimodal DBN based data analysis approach may have practical applications in cancer pathogenesis studies and provide useful guidelines for personalized cancer therapy.
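The multimodal DBN described above is built from stacked restricted Boltzmann machines trained with contrastive divergence. The sketch below shows only that basic building block, a single Bernoulli-Bernoulli RBM updated with CD-1; the dimensions and data are placeholders, and a full multimodal model would train one such stack per platform (expression, miRNA, methylation) before fusing them in a joint hidden layer.

```python
# Sketch of a single RBM layer trained with one step of contrastive divergence (CD-1).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b, c, lr=0.05):
    """One CD-1 parameter update for a Bernoulli-Bernoulli RBM."""
    ph0 = sigmoid(v0 @ W + c)                          # P(h=1 | data)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)   # sample hidden states
    pv1 = sigmoid(h0 @ W.T + b)                        # reconstruction of the visibles
    ph1 = sigmoid(pv1 @ W + c)
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)
    b += lr * (v0 - pv1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return W, b, c

n_visible, n_hidden = 200, 50                          # e.g. 200 binarized features
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b, c = np.zeros(n_visible), np.zeros(n_hidden)
data = (rng.random((500, n_visible)) < 0.3).astype(float)   # placeholder samples

for epoch in range(10):
    for start in range(0, len(data), 50):
        W, b, c = cd1_update(data[start:start + 50], W, b, c)
```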
Preliminary design for a reverse Brayton cycle cryogenic cooler
NASA Technical Reports Server (NTRS)
Swift, Walter L.
1993-01-01
A long life, single stage, reverse Brayton cycle cryogenic cooler is being developed for applications in space. The system is designed to provide 5 W of cooling at a temperature of 65 Kelvin with a total cycle input power of less than 200 watts. Key features of the approach include high speed, miniature turbomachines; an all metal, high performance, compact heat exchanger; and a simple, high frequency, three phase motor drive. In Phase 1, a preliminary design of the system was performed. Analyses and trade studies were used to establish the thermodynamic performance of the system and the performance specifications for individual components. Key mechanical features for components were defined and assembly layouts for the components and the system were prepared. Critical materials and processes were identified. Component and brassboard system level tests were conducted at cryogenic temperatures. The system met the cooling requirement of 5 W at 65 K. The system was also operated over a range of cooling loads from 0.5 W at 37 K to 10 W at 65 K. Input power to the system was higher than target values. The heat exchanger and inverter met or exceeded their respective performance targets. The compressor/motor assembly was marginally below its performance target. The turboexpander met its aerodynamic efficiency target, but overall performance was below target because of excessive heat leak. The heat leak will be reduced to an acceptable value in the engineering model. The results of Phase 1 indicate that the 200 watt input power requirement can be met with state-of-the-art technology in a system which has very flexible integration requirements and negligible vibration levels.
Damani, Zaheed; MacKean, Gail; Bohm, Eric; DeMone, Brie; Wright, Brock; Noseworthy, Tom; Holroyd-Leduc, Jayna; Marshall, Deborah A
2016-10-18
Policy dialogues are critical for developing responsive, effective, sustainable, evidence-informed policy. Our multidisciplinary team, including researchers, physicians and senior decision-makers, comprehensively evaluated The Winnipeg Central Intake Service, a single-entry model in Winnipeg, Manitoba, to improve patient access to hip/knee replacement surgery. We used the evaluation findings to develop five evidence-informed policy directions to help improve access to scheduled clinical services across Manitoba. Using guiding principles of public participation processes, we hosted a policy roundtable meeting to engage stakeholders and use their input to refine the policy directions. Here, we report on the use and input of a policy roundtable meeting and its role in contributing to the development of evidence-informed policy. Our evidence-informed policy directions focused on formal measurement/monitoring of quality, central intake as a preferred model for service delivery, provincial scope, transparent processes/performance indicators, and patient choice of provider. We held a policy roundtable meeting and used outcomes of facilitated discussions to refine these directions. Individuals from our team and six stakeholder groups across Manitoba participated (n = 44), including patients, family physicians, orthopaedic surgeons, surgical office assistants, Winnipeg Central Intake team, and administrators/managers. We developed evaluation forms to assess the meeting process, and collected decision-maker partners' perspectives on the value of the policy roundtable meeting and use of policy directions to improve access to scheduled clinical services after the meeting, and again 15 months later. We analyzed roundtable and evaluation data using thematic analysis to identify key themes. Four key findings emerged. First, participants supported all policy directions, with revisions and key implementation considerations identified. Second, participants felt the policy roundtable meeting achieved its purpose (to engage stakeholders, elicit feedback, refine policy directions). Third, our decision-maker partners' expectations of the policy roundtable meeting were exceeded; they re-affirmed its value and described the refined policy directions as foundational to establishing the vocabulary, vision and framework for improving access to scheduled clinical services in Manitoba. Finally, our adaptation of key design elements was conducive to discussion of issues surrounding access to care. Our policy roundtable process was an effective tool for acquiring broad input from stakeholders, refining policy directions and forming the necessary consensus starting points to move towards evidence-informed policy.
Daniel J. Miller; Kelly M. Burnett
2008-01-01
Debris flows are important geomorphic agents in mountainous terrains that shape channel environments and add a dynamic element to sediment supply and channel disturbance. Identification of channels susceptible to debris-flow inputs of sediment and organic debris, and quantification of the likelihood and magnitude of those inputs, are key tasks for characterizing...
Parallel processing of afferent olfactory sensory information
Vaaga, Christopher E.
2016-01-01
Key points: The functional synaptic connectivity between olfactory receptor neurons and principal cells within the olfactory bulb is not well understood. One view suggests that mitral cells, the primary output neuron of the olfactory bulb, are solely activated by feedforward excitation. Using focal, single glomerular stimulation, we demonstrate that mitral cells receive direct, monosynaptic input from olfactory receptor neurons. Compared to external tufted cells, mitral cells have a prolonged afferent‐evoked EPSC, which serves to amplify the synaptic input. The properties of presynaptic glutamate release from olfactory receptor neurons are similar between mitral and external tufted cells. Our data suggest that afferent input enters the olfactory bulb in a parallel fashion. Abstract: Primary olfactory receptor neurons terminate in anatomically and functionally discrete cortical modules known as olfactory bulb glomeruli. The synaptic connectivity and postsynaptic responses of mitral and external tufted cells within the glomerulus may involve both direct and indirect components. For example, it has been suggested that sensory input to mitral cells is indirect through feedforward excitation from external tufted cells. We also observed feedforward excitation of mitral cells with weak stimulation of the olfactory nerve layer; however, focal stimulation of an axon bundle entering an individual glomerulus revealed that mitral cells receive monosynaptic afferent inputs. Although external tufted cells had a 4.1‐fold larger peak EPSC amplitude, integration of the evoked currents showed that the synaptic charge was 5‐fold larger in mitral cells, reflecting the prolonged response in mitral cells. Presynaptic afferents onto mitral and external tufted cells had similar quantal amplitude and release probability, suggesting that the larger peak EPSC in external tufted cells was the result of more synaptic contacts. The results of the present study indicate that the monosynaptic afferent input to mitral cells depends on the strength of odorant stimulation. The enhanced spiking that we observed in response to brief afferent input provides a mechanism for amplifying sensory information and contrasts with the transient response in external tufted cells. These parallel input paths may have discrete functions in processing olfactory sensory input. PMID:27377344
Nearest private query based on quantum oblivious key distribution
NASA Astrophysics Data System (ADS)
Xu, Min; Shi, Run-hua; Luo, Zhen-yu; Peng, Zhen-wan
2017-12-01
Nearest private query is a special private query which involves two parties, a user and a data owner, where the user has a private input (e.g., an integer) and the data owner has a private data set, and the user wants to query which element in the owner's private data set is the nearest to his input without revealing their respective private information. In this paper, we first present a quantum protocol for nearest private query, which is based on quantum oblivious key distribution (QOKD). Compared to related classical protocols, our protocol offers higher security and better feasibility, and therefore better prospects for application.
Eye-gaze and intent: Application in 3D interface control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schryver, J.C.; Goldberg, J.H.
1993-06-01
Computer interface control is typically accomplished with an input "device" such as a keyboard, mouse, trackball, etc. An input device translates a user's input actions, such as mouse clicks and key presses, into appropriate computer commands. To control the interface, the user must first convert intent into the syntax of the input device. A more natural means of computer control is possible when the computer can directly infer user intent, without need of intervening input devices. We describe an application of eye-gaze-contingent control of an interactive three-dimensional (3D) user interface. A salient feature of the user interface is natural input, with a heightened impression of controlling the computer directly by the mind. With this interface, input of rotation and translation are intuitive, whereas other abstract features, such as zoom, are more problematic to match with user intent. This paper describes successes with implementation to date, and ongoing efforts to develop a more sophisticated intent inferencing methodology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua Joseph
2015-10-01
RAVEN is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. The initial development was aimed to provide dynamic risk analysis capabilities to the Thermo-Hydraulic code RELAP-7, currently under development at the Idaho National Laboratory (INL). Although the initial goal has been fully accomplished, RAVEN is now a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating agnostically with any system code. This agnosticism includes providing Application Programming Interfaces (APIs). These APIs are used to allow RAVEN to interact with any code as long as all the parameters that need to be perturbed are accessible by input files or via python interfaces. RAVEN is capable of investigating the system response, and investigating the input space using Monte Carlo, Grid, or Latin Hypercube sampling schemes, but its strength is focused toward system feature discovery, such as limit surfaces, separating regions of the input space leading to system failure, using dynamic supervised learning techniques. The development of RAVEN started in 2012, when, within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, the need to provide a modern risk evaluation framework became stronger. RAVEN's principal assignment is to provide the necessary software and algorithms in order to employ the concept developed by the Risk Informed Safety Margin Characterization (RISMC) program. RISMC is one of the pathways defined within the Light Water Reactor Sustainability (LWRS) program. In the RISMC approach, the goal is not just the identification of the frequency of an event potentially leading to a system failure, but the closeness (or not) to key safety-related events. Hence, the approach is interested in identifying and increasing the safety margins related to those events. A safety margin is a numerical value quantifying the probability that a safety metric (e.g. for an important process such as peak pressure in a pipe) is exceeded under certain conditions. The initial development of RAVEN has been focused on providing dynamic risk assessment capability to RELAP-7, currently under development at the INL and, likely, the future replacement of the RELAP5-3D code. Most of the capabilities that have been implemented with RELAP-7 as the principal focus are easily deployable for other system codes. For this reason, several side activities are currently ongoing for coupling RAVEN with software such as RELAP5-3D, etc. The aim of this document is to explain the input requirements, focusing on the input structure.
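The workflow RAVEN automates (sample the input space, run the system code, and learn the limit surface separating failure from success) can be illustrated with a minimal stand-alone sketch. The placeholder model, input ranges, and failure threshold below are assumptions for illustration only and do not come from the RAVEN documentation.

```python
# Hedged illustration of a sampling-plus-limit-surface workflow:
# Latin Hypercube sampling of a 2D input space, a stand-in system model,
# and a classifier whose decision boundary approximates the limit surface.
import numpy as np
from scipy.stats import qmc
from sklearn.svm import SVC

sampler = qmc.LatinHypercube(d=2, seed=1)
unit = sampler.random(n=200)
# Scale to assumed physical ranges, e.g. power ramp [0, 120] % and delay [0, 600] s.
x = qmc.scale(unit, l_bounds=[0.0, 0.0], u_bounds=[120.0, 600.0])

def peak_pressure(inputs):                  # stand-in for the coupled system code
    ramp, delay = inputs[:, 0], inputs[:, 1]
    return 10.0 + 0.08 * ramp + 0.01 * delay

failed = peak_pressure(x) > 20.0            # safety metric exceeded (assumed limit)

clf = SVC(kernel="rbf").fit(x, failed)
# clf's decision boundary is a cheap surrogate of the limit surface; new input points
# can be classified without re-running the expensive system code.
```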
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua Joseph
2016-02-01
RAVEN is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. The initial development was aimed to provide dynamic risk analysis capabilities to the Thermo-Hydraulic code RELAP-7, currently under development at the Idaho National Laboratory (INL). Although the initial goal has been fully accomplished, RAVEN is now a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating agnostically with any system code. This agnosticism includes providing Application Programming Interfaces (APIs). These APIs are used to allow RAVEN to interact with any code as long as all the parameters that need to be perturbed are accessible by input files or via python interfaces. RAVEN is capable of investigating the system response, and investigating the input space using Monte Carlo, Grid, or Latin Hypercube sampling schemes, but its strength is focused toward system feature discovery, such as limit surfaces, separating regions of the input space leading to system failure, using dynamic supervised learning techniques. The development of RAVEN started in 2012, when, within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, the need to provide a modern risk evaluation framework became stronger. RAVEN's principal assignment is to provide the necessary software and algorithms in order to employ the concept developed by the Risk Informed Safety Margin Characterization (RISMC) program. RISMC is one of the pathways defined within the Light Water Reactor Sustainability (LWRS) program. In the RISMC approach, the goal is not just the identification of the frequency of an event potentially leading to a system failure, but the closeness (or not) to key safety-related events. Hence, the approach is interested in identifying and increasing the safety margins related to those events. A safety margin is a numerical value quantifying the probability that a safety metric (e.g. for an important process such as peak pressure in a pipe) is exceeded under certain conditions. The initial development of RAVEN has been focused on providing dynamic risk assessment capability to RELAP-7, currently under development at the INL and, likely, the future replacement of the RELAP5-3D code. Most of the capabilities that have been implemented with RELAP-7 as the principal focus are easily deployable for other system codes. For this reason, several side activities are currently ongoing for coupling RAVEN with software such as RELAP5-3D, etc. The aim of this document is to explain the input requirements, focusing on the input structure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua Joseph
2017-03-01
RAVEN is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. The initial development was aimed to provide dynamic risk analysis capabilities to the Thermo-Hydraulic code RELAP-7, currently under development at the Idaho National Laboratory (INL). Although the initial goal has been fully accomplished, RAVEN is now a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating agnostically with any system code. This agnosticism includes providing Application Programming Interfaces (APIs). These APIs are used to allow RAVEN to interact with any code as long as all the parameters that need to be perturbed are accessible by input files or via python interfaces. RAVEN is capable of investigating the system response, and investigating the input space using Monte Carlo, Grid, or Latin Hypercube sampling schemes, but its strength is focused toward system feature discovery, such as limit surfaces, separating regions of the input space leading to system failure, using dynamic supervised learning techniques. The development of RAVEN started in 2012, when, within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, the need to provide a modern risk evaluation framework became stronger. RAVEN's principal assignment is to provide the necessary software and algorithms in order to employ the concept developed by the Risk Informed Safety Margin Characterization (RISMC) program. RISMC is one of the pathways defined within the Light Water Reactor Sustainability (LWRS) program. In the RISMC approach, the goal is not just the identification of the frequency of an event potentially leading to a system failure, but the closeness (or not) to key safety-related events. Hence, the approach is interested in identifying and increasing the safety margins related to those events. A safety margin is a numerical value quantifying the probability that a safety metric (e.g. for an important process such as peak pressure in a pipe) is exceeded under certain conditions. The initial development of RAVEN has been focused on providing dynamic risk assessment capability to RELAP-7, currently under development at the INL and, likely, the future replacement of the RELAP5-3D code. Most of the capabilities that have been implemented with RELAP-7 as the principal focus are easily deployable for other system codes. For this reason, several side activities are currently ongoing for coupling RAVEN with software such as RELAP5-3D, etc. The aim of this document is to explain the input requirements, focusing on the input structure.
ASDTIC: A feedback control innovation
NASA Technical Reports Server (NTRS)
Lalli, V. R.; Schoenfeld, A. D.
1972-01-01
The ASDTIC (Analog Signal to Discrete Time Interval Converter) control subsystem provides precise output control of high performance aerospace power supplies. The key to ASDTIC operation is that it stably controls output by sensing output energy change as well as output magnitude. The ASDTIC control subsystem and control module were developed to improve power supply performance during static and dynamic input voltage and output load variations, to reduce output voltage or current regulation due to component variations or aging, to maintain a stable feedback control with variations in the loop gain or loop time constants, and to standardize the feedback control subsystem for power conditioning equipment.
Multiple-access phased array antenna simulator for a digital beam-forming system investigation
NASA Technical Reports Server (NTRS)
Kerczewski, Robert J.; Yu, John; Walton, Joanne C.; Perl, Thomas D.; Andro, Monty; Alexovich, Robert E.
1992-01-01
Future versions of data relay satellite systems are currently being planned by NASA. Being given consideration for implementation are on-board digital beamforming techniques which will allow multiple users to simultaneously access a single S-band phased array antenna system. To investigate the potential performance of such a system, a laboratory simulator has been developed at NASA's Lewis Research Center. This paper describes the system simulator, and in particular, the requirements, design and performance of a key subsystem, the phased array antenna simulator, which provides realistic inputs to the digital processor including multiple signals, noise, and nonlinearities.
Bertollo, David N; Alexander, Mary Jane; Shinn, Marybeth; Aybar, Jalila B
2007-06-01
This column describes the nonproprietary software Talker, used to adapt screening instruments to audio computer-assisted self-interviewing (ACASI) systems for low-literacy populations and other populations. Talker supports ease of programming, multiple languages, on-site scoring, and the ability to update a central research database. Key features include highly readable text display, audio presentation of questions and audio prompting of answers, and optional touch screen input. The scripting language for adapting instruments is briefly described as well as two studies in which respondents provided positive feedback on its use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pattison, Morgan
This is a 2017 update to the Solid-State Lighting R&D Plan, divided into two documents. The first document describes a list of suggested SSL priority research topics and the second document provides context and background, including information drawn from technical, market, and economic studies. Widely referenced by industry and government both here and abroad, these documents reflect SSL stakeholder inputs on key R&D topics that will improve efficacy, reduce cost, remove barriers to adoption, and add value for LED and OLED lighting solutions over the next three to five years, and discuss those applications that drive and prioritize the specific R&D.
Allocation of control rights in the PPP Project: a cooperative game model
NASA Astrophysics Data System (ADS)
Zhang, Yunhua; Feng, Jingchun; Yang, Shengtao
2017-06-01
Reasonable allocation of control rights is key to the success of Public-Private Partnership (PPP) projects. PPPs are services or ventures that are financed and operated through cooperation between governmental and private-sector actors and that involve sharing control rights between the two partners. Once a professional firm contributes capital and technology as a shareholder in the PPP project firm, the project's participants and input resources become more diverse, and the allocation of control rights becomes correspondingly more complicated. Reflecting this diversification of participants and input resources, the key participants are divided into professional firms and pure investors. Based on the market cost of repurchasing the different input resources, the cooperative game relationship between these two parties is analyzed, and on that basis an allocation model of the cooperative game for control rights is constructed to determine the optimal allocation ratio of control rights and to verify that the share of control rights is proportional to the cost of repurchase.
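To make the cooperative-game idea concrete, the sketch below applies one standard allocation rule, the Shapley value, to a toy characteristic function built from the repurchase costs of each party's input resources. This is an illustration of the general approach only; the paper's own allocation model may differ, and all numbers are hypothetical.

```python
# Shapley-value allocation over a toy characteristic function based on repurchase costs.
from itertools import combinations
from math import factorial

players = ["professional_firm", "pure_investor"]
repurchase_cost = {"professional_firm": 8.0, "pure_investor": 5.0}   # assumed units

def value(coalition):
    # Toy characteristic function: a coalition "saves" the market repurchase cost of
    # the resources its members bring, plus a synergy term when both parties join.
    v = sum(repurchase_cost[p] for p in coalition)
    return v + (3.0 if len(coalition) == 2 else 0.0)

def shapley(player):
    n = len(players)
    others = [p for p in players if p != player]
    total = 0.0
    for r in range(len(others) + 1):
        for s in combinations(others, r):
            weight = factorial(len(s)) * factorial(n - len(s) - 1) / factorial(n)
            total += weight * (value(set(s) | {player}) - value(set(s)))
    return total

shares = {p: shapley(p) for p in players}
grand = sum(shares.values())
print({p: round(v / grand, 3) for p, v in shares.items()})   # control-rights shares
```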
Modeling the Structure of Composite Supernova Remnants
NASA Astrophysics Data System (ADS)
Slane, Patrick
2015-09-01
The dynamical structure of a composite SNR, along with its broadband emission, provides crucial constraints on the ejecta mass and explosion energy, the properties of the pulsar that powers the associated wind nebula, and the ultimate fate of the particles that it injects. Of particular importance is the effect of asymmetries introduced through spatial variations in the ambient medium density and by rapid motion of the pulsar. Here we propose hydrodynamical and semi-analytical modeling of G21.5-0.9 and G292.0+1.8, SNRs for which deep Chandra observations have provided key input parameters for these models. We will derive ambient conditions and pulsar properties that lead to the observed morphology, broadband emission, and shock conditions in these important composite systems.
ESA's Soil Moisture and Ocean Salinity Mission - Contributing to Water Resource Management
NASA Astrophysics Data System (ADS)
Mecklenburg, S.; Kerr, Y. H.
2015-12-01
The Soil Moisture and Ocean Salinity (SMOS) mission, launched in November 2009, is the European Space Agency's (ESA) second Earth Explorer Opportunity mission. The scientific objectives of the SMOS mission directly respond to the need for global observations of soil moisture and ocean salinity, two key variables used in predictive hydrological, oceanographic and atmospheric models. SMOS observations also provide information on the characterisation of ice and snow covered surfaces and the sea ice effect on ocean-atmosphere heat fluxes and dynamics, which affects large-scale processes of the Earth's climate system. The focus of this paper will be on SMOS's contribution to support water resource management: SMOS surface soil moisture provides the input to derive root-zone soil moisture, which in turn provides the input for the drought index, an important monitoring prediction tool for plant available water. In addition to surface soil moisture, SMOS also provides observations on vegetation optical depth. Both parameters aid agricultural applications such as crop growth, yield forecasting and drought monitoring, and provide input for carbon and land surface modelling. SMOS data products are used in data assimilation and forecasting systems. Over land, assimilating SMOS derived information has shown to have a positive impact on applications such as NWP, stream flow forecasting and the analysis of net ecosystem exchange. Over ocean, both sea surface salinity and severe wind speed have the potential to increase the predictive skill on the seasonal and short- to medium-range forecast range. Operational users in particular in Numerical Weather Prediction and operational hydrology have put forward a requirement for soil moisture data to be available in near-real time (NRT). This has been addressed by developing a fast retrieval for a NRT level 2 soil moisture product based on Neural Networks, which will be available by autumn 2015. This paper will focus on presenting the above applications and used SMOS data products.
Advanced Terrain Representation for the Microticcit Workstation: System Maintenance Manual
1986-02-01
/* ... enter the password. */
/* Inputs:  passwd - password to compare user's entry to */
/* Outputs: TRUE - if password entered correctly ... */
#include "atrdefs.h"
#include "ctype.h"
extern char window[];   /* useable portion of screen */

getpw( passwd )
char passwd[];
{
    int c;
    ...
    /* blank input window */
    pcvgcp(&row, &col);
    curs_off();
    nchars = ntries = 0;
    len = strlen( passwd );
    pcvwca(len, ' ', REVIDEO);
    /* process keys till user ... */
Toda, Mitsuru; Opwora, Antony; Waweru, Evelyn; Noor, Abdisalan; Edwards, Tansy; Fegan, Greg; Molyneux, Catherine; Goodman, Catherine
2012-12-13
Equitable access to health care is a key health systems goal, and is a particular concern in low-income countries. In Kenya, public facilities are an important resource for the poor, but little is known on the equity of service provision. This paper assesses whether poorer areas have poorer health services by investigating associations between public facility characteristics and the poverty level of the area in which the facility is located. Data on facility characteristics were collected from a nationally representative sample of public health centers and dispensaries across all 8 provinces in Kenya. A two-stage cluster randomized sampling process was used to select facilities. Univariate associations between facility characteristics and socioeconomic status (SES) of the area in which the facility was located were assessed using chi-squared tests, equity ratios and concentration indices. Indirectly standardized concentration indices were used to assess the influence of SES on facility inputs and service availability while controlling for facility type, province, and remoteness. For most indicators, we found no indication of variation by SES. The clear exceptions were electricity and laboratory services which showed evidence of pro-rich inequalities, with equity ratios of 3.16 and 3.43, concentration indices of 0.09 (p<0.01) and 0.05 (p=0.01), and indirectly standardized concentration ratios of 0.07 (p<0.01) and 0.05 (p=0.01). There were also some indications of pro-rich inequalities for availability of drugs and qualified staff. The lack of evidence of inequality for other indicators does not imply that availability of inputs and services was invariably high; for example, while availability was close to 90% for water supply and family planning services, under half of facilities offered delivery services or outreach. The paper shows how local area poverty data can be combined with national health facility surveys, providing a tool for policy makers to assess the equity of input and service availability. There was little evidence of inequalities for most inputs and services, with the clear exceptions of electricity and laboratory services. However, efforts are required to improve the availability of key inputs and services across public facilities in all areas, regardless of SES.
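The concentration index reported above is a standard summary of socioeconomic inequality; a minimal sketch of its calculation is shown below, with facilities ranked by the poverty level of their area. The data are made up for illustration and are not the survey's figures.

```python
# Sketch of a concentration-index calculation for a facility input (e.g. electricity).
import numpy as np

def concentration_index(indicator, ses_score):
    """CI = 2*cov(indicator, fractional SES rank) / mean(indicator); >0 means pro-rich."""
    order = np.argsort(ses_score)                 # poorest areas first
    y = np.asarray(indicator, dtype=float)[order]
    n = len(y)
    rank = (np.arange(1, n + 1) - 0.5) / n        # fractional rank
    return 2.0 * np.cov(y, rank, bias=True)[0, 1] / y.mean()

has_electricity = np.array([0, 0, 1, 0, 1, 1, 1, 1])   # hypothetical facilities
area_wealth     = np.array([1, 2, 3, 4, 5, 6, 7, 8])   # low value = poorer area
print(round(concentration_index(has_electricity, area_wealth), 3))   # positive -> pro-rich
```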
Richard, Jennifer E.; Farkas, Imre; Anesten, Fredrik; Anderberg, Rozita H.; Dickson, Suzanne L.; Gribble, Fiona M.; Reimann, Frank; Jansson, John-Olov; Liposits, Zsolt
2014-01-01
The parabrachial nucleus (PBN) is a key nucleus for the regulation of feeding behavior. Inhibitory inputs from the hypothalamus to the PBN play a crucial role in the normal maintenance of feeding behavior, because their loss leads to starvation. Viscerosensory stimuli result in neuronal activation of the PBN. However, the origin and neurochemical identity of the excitatory neuronal input to the PBN remain largely unexplored. Here, we hypothesize that hindbrain glucagon-like peptide 1 (GLP-1) neurons provide excitatory inputs to the PBN, activation of which may lead to a reduction in feeding behavior. Our data, obtained from mice expressing the yellow fluorescent protein in GLP-1-producing neurons, revealed that hindbrain GLP-1-producing neurons project to the lateral PBN (lPBN). Stimulation of lPBN GLP-1 receptors (GLP-1Rs) reduced the intake of chow and palatable food and decreased body weight in rats. It also activated lPBN neurons, reflected by an increase in the number of c-Fos-positive cells in this region. Further support for an excitatory role of GLP-1 in the PBN is provided by electrophysiological studies showing a remarkable increase in firing of lPBN neurons after Exendin-4 application. We show that within the PBN, GLP-1R activation increased gene expression of 2 energy balance regulating peptides, calcitonin gene-related peptide (CGRP) and IL-6. Moreover, nearly 70% of the lPBN GLP-1 fibers innervated lPBN CGRP neurons. Direct intra-lPBN CGRP application resulted in anorexia. Collectively, our molecular, anatomical, electrophysiological, pharmacological, and behavioral data provide evidence for a functional role of the GLP-1R for feeding control in the PBN. PMID:25116706
An 11 mW 2.4 GHz 0.18 µm CMOS Transceiver for Wireless Sensor Networks.
Hou, Bing; Chen, Hua; Wang, Zhiyu; Mo, Jiongjiong; Chen, Junli; Yu, Faxin; Wang, Wenbo
2017-01-24
In this paper, a low power transceiver for wireless sensor networks (WSN) is proposed. The system is designed with fully functional blocks including a receiver, a fractional-N frequency synthesizer, and a class-E transmitter, and it is optimized with a good balance among output power, sensitivity, power consumption, and silicon area. A transmitter and receiver (TX-RX) shared input-output matching network is used so that only one off-chip inductor is needed in the system. The power and area efficiency-oriented, fully-integrated frequency synthesizer is able to provide programmable output frequencies in the 2.4 GHz range while occupying a small silicon area. Implemented in a standard 0.18 μm RF Complementary Metal Oxide Semiconductor (CMOS) technology, the whole transceiver occupies a chip area of 0.5 mm² (1.2 mm² including bonding pads for a QFN package). Measurement results suggest that the design is able to work at amplitude shift keying (ASK)/on-off-keying (OOK) and FSK modes with up to 500 kbps data rate. With an input sensitivity of -60 dBm and an output power of 3 dBm, the receiver, transmitter and frequency synthesizer consumes 2.3 mW, 4.8 mW, and 3.9 mW from a 1.8 V supply voltage, respectively.
Mapping the structure of the world economy.
Lenzen, Manfred; Kanemoto, Keiichiro; Moran, Daniel; Geschke, Arne
2012-08-07
We have developed a new series of environmentally extended multi-region input-output (MRIO) tables with applications in carbon, water, and ecological footprinting, and Life-Cycle Assessment, as well as trend and key driver analyses. Such applications have recently been at the forefront of global policy debates, such as about assigning responsibility for emissions embodied in internationally traded products. The new time series was constructed using advanced parallelized supercomputing resources, and significantly advances the previous state of art because of four innovations. First, it is available as a continuous 20-year time series of MRIO tables. Second, it distinguishes 187 individual countries comprising more than 15,000 industry sectors, and hence offers unsurpassed detail. Third, it provides information just 1-3 years delayed therefore significantly improving timeliness. Fourth, it presents MRIO elements with accompanying standard deviations in order to allow users to understand the reliability of data. These advances will lead to material improvements in the capability of applications that rely on input-output tables. The timeliness of information means that analyses are more relevant to current policy questions. The continuity of the time series enables the robust identification of key trends and drivers of global environmental change. The high country and sector detail drastically improves the resolution of Life-Cycle Assessments. Finally, the availability of information on uncertainty allows policy-makers to quantitatively judge the level of confidence that can be placed in the results of analyses.
Acute Fasting Regulates Retrograde Synaptic Enhancement through a 4E-BP-Dependent Mechanism.
Kauwe, Grant; Tsurudome, Kazuya; Penney, Jay; Mori, Megumi; Gray, Lindsay; Calderon, Mario R; Elazouzzi, Fatima; Chicoine, Nicole; Sonenberg, Nahum; Haghighi, A Pejmun
2016-12-21
While beneficial effects of fasting on organismal function and health are well appreciated, we know little about the molecular details of how fasting influences synaptic function and plasticity. Our genetic and electrophysiological experiments demonstrate that acute fasting blocks retrograde synaptic enhancement that is normally triggered as a result of reduction in postsynaptic receptor function at the Drosophila larval neuromuscular junction (NMJ). This negative regulation critically depends on transcriptional enhancement of eukaryotic initiation factor 4E binding protein (4E-BP) under the control of the transcription factor Forkhead box O (Foxo). Furthermore, our findings indicate that postsynaptic 4E-BP exerts a constitutive negative input, which is counteracted by a positive regulatory input from the Target of Rapamycin (TOR). This combinatorial retrograde signaling plays a key role in regulating synaptic strength. Our results provide a mechanistic insight into how cellular stress and nutritional scarcity could acutely influence synaptic homeostasis and functional stability in neural circuits. Copyright © 2016 Elsevier Inc. All rights reserved.
Two paths to blame: Intentionality directs moral information processing along two distinct tracks.
Monroe, Andrew E; Malle, Bertram F
2017-01-01
There is broad consensus that features such as causality, mental states, and preventability are key inputs to moral judgments of blame. What is not clear is exactly how people process these inputs to arrive at such judgments. Three studies provide evidence that early judgments of whether or not a norm violation is intentional direct information processing along 1 of 2 tracks: if the violation is deemed intentional, blame processing relies on information about the agent's reasons for committing the violation; if the violation is deemed unintentional, blame processing relies on information about how preventable the violation was. Owing to these processing commitments, when new information requires perceivers to switch tracks, they must reconfigure their judgments, which results in measurable processing costs indicated by reaction time (RT) delays. These findings offer support for a new theory of moral judgment (the Path Model of Blame) and advance the study of moral cognition as hierarchical information processing. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Zimoń, Małgorzata; Sawko, Robert; Emerson, David; Thompson, Christopher
2017-11-01
Uncertainty quantification (UQ) is increasingly becoming an indispensable tool for assessing the reliability of computational modelling. Efficient handling of stochastic inputs, such as boundary conditions, physical properties or geometry, increases the utility of model results significantly. We discuss the application of non-intrusive generalised polynomial chaos techniques in the context of fluid engineering simulations. Deterministic and Monte Carlo integration rules are applied to a set of problems, including ordinary differential equations and the computation of aerodynamic parameters subject to random perturbations. In particular, we analyse acoustic wave propagation in a heterogeneous medium to study the effects of mesh resolution, transients, number and variability of stochastic inputs. We consider variants of multi-level Monte Carlo and perform a novel comparison of the methods with respect to numerical and parametric errors, as well as computational cost. The results provide a comprehensive view of the necessary steps in UQ analysis and demonstrate some key features of stochastic fluid flow systems.
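As a concrete illustration of the non-intrusive polynomial chaos idea discussed above, the sketch below propagates a single standard-normal input through a placeholder model using Gauss-Hermite quadrature and reads the mean and variance from the spectral coefficients. The model g and the expansion order are assumptions; a real study would replace g with the flow or acoustics solver evaluated at the sampled inputs.

```python
# Minimal non-intrusive generalised polynomial chaos sketch, one Gaussian input.
import numpy as np
from numpy.polynomial import hermite_e as H
from math import factorial, sqrt, pi

def g(xi):                          # placeholder model response vs. standard-normal input
    return np.exp(0.3 * xi) + 0.1 * xi**2

order = 4
nodes, weights = H.hermegauss(order + 1)
weights = weights / sqrt(2.0 * pi)            # normalise to an expectation under N(0,1)

# Spectral coefficients c_n = E[g(xi) He_n(xi)] / n!  (probabilists' Hermite basis)
coeffs = []
for n in range(order + 1):
    he_n = H.hermeval(nodes, [0] * n + [1])
    coeffs.append(np.sum(weights * g(nodes) * he_n) / factorial(n))

mean = coeffs[0]
variance = sum(factorial(n) * coeffs[n] ** 2 for n in range(1, order + 1))
print(mean, variance)
```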
A radial transmission line material measurement apparatus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warne, L.K.; Moyer, R.D.; Koontz, T.E.
1993-05-01
A radial transmission line material measurement sample apparatus (sample holder, offset short standards, measurement software, and instrumentation) is described which has been proposed, analyzed, designed, constructed, and tested. The purpose of the apparatus is to obtain accurate surface impedance measurements of lossy, possibly anisotropic, samples at low and intermediate frequencies (vhf and low uhf). The samples typically take the form of sections of the material coatings on conducting objects. Such measurements thus provide the key input data for predictive numerical scattering codes. Prediction of the sample surface impedance from the coaxial input impedance measurement is carried out by two techniques. The first is an analytical model for the coaxial-to-radial transmission line junction. The second is an empirical determination of the bilinear transformation model of the junction by the measurement of three full standards. The standards take the form of three offset shorts (and an additional lossy Salisbury load), which have also been constructed. The accuracy achievable with the device appears to be near one percent.
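The second (empirical) technique lends itself to a short numerical illustration: three standards with known reflection coefficients determine a bilinear (Mobius) map between true and measured values, which is then inverted for an unknown sample. The standard and measured values below are invented, and the actual calibration procedure in the apparatus may differ in detail.

```python
# Fit the bilinear map  m = (A*g + B) / (C*g + 1)  from three standards, then invert it.
import numpy as np

known    = np.array([-1.0 + 0j, -0.6 + 0.8j, -0.6 - 0.8j])      # standards (assumed)
measured = np.array([-0.82 + 0.05j, -0.44 + 0.71j, -0.55 - 0.69j])

# m*(C*g + 1) = A*g + B  ->  A*g + B - C*(g*m) = m : one linear equation per standard.
M = np.column_stack([known, np.ones(3, dtype=complex), -known * measured])
A, B, C = np.linalg.solve(M, measured)

def to_true(m_sample):
    """Invert the fitted bilinear map for a measured sample reflection coefficient."""
    return (m_sample - B) / (A - C * m_sample)

print(to_true(-0.3 + 0.4j))
```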
Kutejova, Eva; Sasai, Noriaki; Shah, Ankita; Gouti, Mina; Briscoe, James
2016-03-21
In the vertebrate neural tube, a morphogen-induced transcriptional network produces multiple molecularly distinct progenitor domains, each generating different neuronal subtypes. Using an in vitro differentiation system, we defined gene expression signatures of distinct progenitor populations and identified direct gene-regulatory inputs corresponding to locations of specific transcription factor binding. Combined with targeted perturbations of the network, this revealed a mechanism in which a progenitor identity is installed by active repression of the entire transcriptional programs of other neural progenitor fates. In the ventral neural tube, sonic hedgehog (Shh) signaling, together with broadly expressed transcriptional activators, concurrently activates the gene expression programs of several domains. The specific outcome is selected by repressive input provided by Shh-induced transcription factors that act as the key nodes in the network, enabling progenitors to adopt a single definitive identity from several initially permitted options. Together, the data suggest design principles relevant to many developing tissues. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Isotope evidence for agricultural extensification reveals how the world's first cities were fed.
Styring, Amy K; Charles, Michael; Fantone, Federica; Hald, Mette Marie; McMahon, Augusta; Meadow, Richard H; Nicholls, Geoff K; Patel, Ajita K; Pitre, Mindy C; Smith, Alexia; Sołtysiak, Arkadiusz; Stein, Gil; Weber, Jill A; Weiss, Harvey; Bogaard, Amy
2017-06-05
This study sheds light on the agricultural economy that underpinned the emergence of the first urban centres in northern Mesopotamia. Using δ13C and δ15N values of crop remains from the sites of Tell Sabi Abyad, Tell Zeidan, Hamoukar, Tell Brak and Tell Leilan (6500-2000 cal BC), we reveal that labour-intensive practices such as manuring/middening and water management formed an integral part of the agricultural strategy from the seventh millennium BC. Increased agricultural production to support growing urban populations was achieved by cultivation of larger areas of land, entailing lower manure/midden inputs per unit area (extensification). Our findings paint a nuanced picture of the role of agricultural production in new forms of political centralization. The shift towards lower-input farming most plausibly developed gradually at a household level, but the increased importance of land-based wealth constituted a key potential source of political power, providing the possibility for greater bureaucratic control and contributing to the wider societal changes that accompanied urbanization.
Sustainable fisheries management: Pacific salmon
Knudsen, E. Eric; Steward, Cleveland R.; MacDonald, Donald; Williams, Jack E.; Reiser, Dudley W.
1999-01-01
What has happened to the salmon resource in the Pacific Northwest? Who is responsible and what can be done to reverse the decline in salmon populations? The responsibility falls on everyone involved - fishermen, resource managers and concerned citizens alike - to take the steps necessary to ensure that salmon populations make a full recovery. This collection of papers examines the state of the salmon fisheries in the Pacific Northwest. They cover existing methods and supply model approaches for alternative solutions. The editors stress the importance of input from and cooperation with all parties involved to create a viable solution. Grass roots education and participation is the key to public support - and ultimately the success - of whatever management solutions are developed. A unique and valuable scientific publication, Sustainable Fisheries Management: Pacific Salmon clearly articulates the current state of the Pacific salmon resource, describes the key features of its management, and provides important guidance on how we can make the transition towards sustainable fisheries. The solutions presented in this book provide the basis of a strategy for sustainable fisheries, requiring society and governmental agencies to establish a shared vision, common policies, and a process for collaborative management.
E-cadherin junction formation involves an active kinetic nucleation process
Biswas, Kabir H.; Hartman, Kevin L.; Yu, Cheng-han; Harrison, Oliver J.; Song, Hang; Smith, Adam W.; Huang, William Y. C.; Lin, Wan-Chen; Guo, Zhenhuan; Padmanabhan, Anup; Troyanovsky, Sergey M.; Dustin, Michael L.; Shapiro, Lawrence; Honig, Barry; Zaidel-Bar, Ronen; Groves, Jay T.
2015-01-01
Epithelial (E)-cadherin-mediated cell−cell junctions play important roles in the development and maintenance of tissue structure in multicellular organisms. E-cadherin adhesion is thus a key element of the cellular microenvironment that provides both mechanical and biochemical signaling inputs. Here, we report in vitro reconstitution of junction-like structures between native E-cadherin in living cells and the extracellular domain of E-cadherin (E-cad-ECD) in a supported membrane. Junction formation in this hybrid live cell-supported membrane configuration requires both active processes within the living cell and a supported membrane with low E-cad-ECD mobility. The hybrid junctions recruit α-catenin and exhibit remodeled cortical actin. Observations suggest that the initial stages of junction formation in this hybrid system depend on the trans but not the cis interactions between E-cadherin molecules, and proceed via a nucleation process in which protrusion and retraction of filopodia play a key role. PMID:26290581
Asymmetric optical image encryption using Kolmogorov phase screens and equal modulus decomposition
NASA Astrophysics Data System (ADS)
Kumar, Ravi; Bhaduri, Basanta; Quan, Chenggen
2017-11-01
An asymmetric technique for optical image encryption is proposed using Kolmogorov phase screens (KPSs) and equal modulus decomposition (EMD). The KPSs are generated using the power spectral density of Kolmogorov turbulence. The input image is first randomized and then Fresnel propagated with distance d. Further, the output in the Fresnel domain is modulated with a random phase mask, and the gyrator transform (GT) of the modulated image is obtained with an angle α. The EMD is operated on the GT spectrum to get the complex images, Z1 and Z2. Among these, Z2 is reserved as a private key for decryption and Z1 is propagated through a medium consisting of four KPSs, located at specified distances, to get the final encrypted image. The proposed technique provides a large set of security keys and is robust against various potential attacks. Numerical simulation results validate the effectiveness and security of the proposed technique.
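As an illustration of the equal modulus decomposition step referred to in the abstract above, the following NumPy sketch splits a complex spectrum into two equal-modulus masks whose sum reproduces the original. The random auxiliary phase, array size, and function name are illustrative assumptions; the Kolmogorov phase screens, Fresnel propagation, and gyrator transform of the full scheme are not reproduced here.

```python
import numpy as np

def equal_modulus_decomposition(C, seed=None):
    """Split a complex field C into two masks of equal modulus whose sum is C.

    Minimal sketch of the EMD step: Z2 would serve as the private key and
    Z1 would continue through the rest of the encryption pipeline.
    """
    rng = np.random.default_rng(seed)
    A = np.abs(C)
    phi = np.angle(C)
    theta = rng.uniform(0.0, 2.0 * np.pi, size=C.shape)   # random auxiliary phase
    r = A / (2.0 * np.cos(phi - theta))                    # common modulus of both masks
    Z1 = r * np.exp(1j * theta)
    Z2 = r * np.exp(1j * (2.0 * phi - theta))
    return Z1, Z2

# quick self-check on a random complex "spectrum"
rng = np.random.default_rng(7)
C = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
Z1, Z2 = equal_modulus_decomposition(C, seed=7)
assert np.allclose(Z1 + Z2, C)                 # the two masks sum back to the original
assert np.allclose(np.abs(Z1), np.abs(Z2))     # and share the same modulus
```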
Hilty, Donald M; Hales, Deborah J; Briscoe, Greg; Benjamin, Sheldon; Boland, Robert J; Luo, John S; Chan, Carlyle H; Kennedy, Robert S; Karlinsky, Harry; Gordon, Daniel B; Yager, Joel; Yellowlees, Peter M
2006-01-01
This article provides a brief overview of important issues for educators regarding medical education and technology. The literature describes key concepts, prototypical technology tools, and model programs. A work group of psychiatric educators was convened three times by phone conference to discuss the literature. Findings were presented to and input was received from the 2005 Summit on Medical Student Education by APA and the American Directors of Medical Student Education in Psychiatry. Knowledge of, skills in, and attitudes toward medical informatics are important to life-long learning and modern medical practice. A needs assessment is a starting place, since student, faculty, institution, and societal factors bear consideration. Technology needs to "fit" into a curriculum in order to facilitate learning and teaching. Learning about computers and applying computer technology to education and clinical care are key steps in computer literacy for physicians.
E-cadherin junction formation involves an active kinetic nucleation process
Biswas, Kabir H.; Hartman, Kevin L.; Yu, Cheng -han; ...
2015-08-19
Epithelial (E)-cadherin-mediated cell–cell junctions play important roles in the development and maintenance of tissue structure in multicellular organisms. E-cadherin adhesion is thus a key element of the cellular microenvironment that provides both mechanical and biochemical signaling inputs. Here, we report in vitro reconstitution of junction-like structures between native E-cadherin in living cells and the extracellular domain of E-cadherin (E-cad-ECD) in a supported membrane. Junction formation in this hybrid live cell-supported membrane configuration requires both active processes within the living cell and a supported membrane with low E-cad-ECD mobility. The hybrid junctions recruit α-catenin and exhibit remodeled cortical actin. Observations suggest that the initial stages of junction formation in this hybrid system depend on the trans but not the cis interactions between E-cadherin molecules, and proceed via a nucleation process in which protrusion and retraction of filopodia play a key role.
E-cadherin junction formation involves an active kinetic nucleation process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Biswas, Kabir H.; Hartman, Kevin L.; Yu, Cheng -han
Epithelial (E)-cadherin-mediated cell–cell junctions play important roles in the development and maintenance of tissue structure in multicellular organisms. E-cadherin adhesion is thus a key element of the cellular microenvironment that provides both mechanical and biochemical signaling inputs. Here, we report in vitro reconstitution of junction-like structures between native E-cadherin in living cells and the extracellular domain of E-cadherin (E-cad-ECD) in a supported membrane. Junction formation in this hybrid live cell-supported membrane configuration requires both active processes within the living cell and a supported membrane with low E-cad-ECD mobility. The hybrid junctions recruit α-catenin and exhibit remodeled cortical actin. Observations suggest that the initial stages of junction formation in this hybrid system depend on the trans but not the cis interactions between E-cadherin molecules, and proceed via a nucleation process in which protrusion and retraction of filopodia play a key role.
The DRG shift: a new twist for ICD-10 preparation.
Long, Peri L
2012-06-01
Analysis of your specific business is a key component of ICD-10 implementation. An understanding of your organization's current reimbursement trends will go a long way to assessing and preparing for the impact of ICD-10 in your environment. If you cannot be prepared for each detailed scenario, remember that much of the analysis and resolution requires familiar coding, DRG analysis, and claims processing best practices. Now, they simply have the new twist of researching new codes and some new concepts. The news of a delay in the implementation compliance date, along with the release of grouper Version 29, should encourage your educational and business analysis efforts. This is a great opportunity to maintain open communication with the Centers for Medicare & Medicaid Services, Department of Health and Human Services, and Centers for Disease Control. This is also a key time to report any unusual or discrepant findings in order to provide input to the final rule.
NASA Astrophysics Data System (ADS)
French, N. H. F.; Lawrence, R. L.
2017-12-01
AmericaView is a nationwide partnership of remote sensing scientists who support the use of Landsat and other public domain remotely sensed data through applied remote sensing research, K-12 and higher STEM education, workforce development, and technology transfer. The national AmericaView program currently has active university-led members in 39 states, each of which has a "stateview" consortium consisting of some combination of university, agency, non-profit, and other members. This "consortium of consortia" has resulted in a strong and unique nationwide network of remote sensing practitioners. AmericaView has used this network to contribute to the USGS Requirements Capabilities & Analysis for Earth Observations. Participating states have conducted interviews of key remote sensing end users across the country to provide key input at the state and local level for the design and implementation of future U.S. moderate resolution Earth observations.
Method for routing events from key strokes in a multi-processing computer systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rhodes, D.A.; Rustici, E.; Carter, K.H.
1990-01-23
The patent describes a method of routing user input in a computer system which concurrently runs a plurality of processes. It comprises: generating keycodes representative of keys typed by a user; distinguishing generated keycodes by looking up each keycode in a routing table which assigns each possible keycode to an individual assigned process of the plurality of processes, one of which processes being a supervisory process; then, sending each keycode to its assigned process until a keycode assigned to the supervisory process is received; sending keycodes received subsequent to the keycode assigned to the supervisory process to a buffer; next, providing additional keycodes to the supervisory process from the buffer until the supervisory process has completed operation; and sending keycodes stored in the buffer to processes assigned therewith after the supervisory process has completed operation.
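A hypothetical, heavily simplified Python rendering of the routing idea in the claim follows. The routing-table contents, process names, and the assumption that the supervisory process has already completed before the buffer is drained are illustrative only and are not part of the patent text.

```python
from collections import deque

# Simplified sketch: each keycode is looked up in a routing table; once a keycode
# bound to the supervisory process arrives, later keycodes are buffered, and when
# the supervisory process finishes they are forwarded to their assigned processes.
def route_keycodes(keycodes, routing_table, supervisor="supervisor"):
    buffered, dispatched = deque(), []
    supervisory_active = False
    for code in keycodes:
        target = routing_table[code]
        if supervisory_active:
            buffered.append(code)              # held while the supervisor runs
        else:
            dispatched.append((code, target))
            if target == supervisor:
                supervisory_active = True      # divert subsequent input
    # supervisory process assumed complete: release held keycodes to their processes
    while buffered:
        code = buffered.popleft()
        dispatched.append((code, routing_table[code]))
    return dispatched

table = {"a": "editor", "b": "shell", "F1": "supervisor"}   # hypothetical assignments
print(route_keycodes(["a", "F1", "b", "a"], table))
```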
NASA Technical Reports Server (NTRS)
1993-01-01
Using chordic technology, a data entry operator can finger key combinations for text or graphics input. Because only one hand is needed, a disabled person may use it. Strain and fatigue are less than when using a conventional keyboard; input is faster, and the system can be learned in about an hour. Infogrip, Inc. developed chordic input technology with Stennis Space Center (SSC). (NASA is interested in potentially faster human/computer interaction on spacecraft as well as a low cost tactile/visual training system for the handicapped.) The company is now marketing the BAT as an improved system for both disabled and non-disabled computer operators.
Li, Tingting; Cheng, Zhengguo; Zhang, Le
2017-01-01
Since they can provide a natural and flexible description of the nonlinear dynamic behavior of complex systems, agent-based models (ABM) have been commonly used for immune system simulation. However, it is crucial for ABM to obtain an appropriate estimation for the key parameters of the model by incorporating experimental data. In this paper, a systematic procedure for immune system simulation by integrating the ABM and regression method under the framework of history matching is developed. A novel parameter estimation method by incorporating the experiment data for the simulator ABM during the procedure is proposed. First, we employ ABM as simulator to simulate the immune system. Then, the dimension-reduced type generalized additive model (GAM) is employed to train a statistical regression model by using the input and output data of ABM, and this model plays the role of an emulator during history matching. Next, we reduce the input space of parameters by introducing an implausibility measure to discard the implausible input values. At last, the estimation of model parameters is obtained using the particle swarm optimization algorithm (PSO) by fitting the experiment data among the non-implausible input values. The real Influenza A Virus (IAV) data set is employed to demonstrate the performance of our proposed method, and the results show that the proposed method not only has good fitting and predicting accuracy, but also offers favorable computational efficiency. PMID:29194393
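The implausibility-based space reduction described in this abstract can be sketched as follows. The emulator interface, the observation variance, and the conventional cutoff of 3 are assumptions made for illustration rather than the authors' implementation; the emulator stands in for the GAM trained on ABM input/output pairs.

```python
import numpy as np

def implausibility(pred_mean, pred_var, observed, obs_var):
    # Standard history-matching implausibility: standardized distance between
    # the emulator prediction and the observation.
    return np.abs(pred_mean - observed) / np.sqrt(pred_var + obs_var)

def non_implausible(candidates, emulator, observed, obs_var, cutoff=3.0):
    # Keep only parameter settings whose implausibility is below the cutoff.
    keep = []
    for x in candidates:
        mean, var = emulator(x)                 # emulator returns (mean, variance)
        if implausibility(mean, var, observed, obs_var) <= cutoff:
            keep.append(x)
    return keep

emu = lambda x: (2.0 * x, 0.1)                  # dummy emulator for illustration
print(non_implausible([0.5, 1.0, 4.0], emu, observed=2.0, obs_var=0.1))
```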
Understanding and Improving High-Performance I/O Subsystems
NASA Technical Reports Server (NTRS)
El-Ghazawi, Tarek A.; Frieder, Gideon; Clark, A. James
1996-01-01
This research program has been conducted in the framework of the NASA Earth and Space Science (ESS) evaluations led by Dr. Thomas Sterling. In addition to the many important research findings for NASA and the prestigious publications, the program helped orient the doctoral research of two students towards parallel input/output in high-performance computing. Further, the experimental results in the case of the MasPar were very useful and helpful to MasPar, with whose technical management the P.I. has had many interactions. The contributions of this program are drawn from three experimental studies conducted on different high-performance computing testbeds/platforms, and therefore presented in 3 different segments as follows: 1. Evaluating the parallel input/output subsystem of NASA high-performance computing testbeds, namely the MasPar MP-1 and MP-2; 2. Characterizing the physical input/output request patterns for NASA ESS applications, which used the Beowulf platform; and 3. Dynamic scheduling techniques for hiding I/O latency in parallel applications such as sparse matrix computations. This study also has been conducted on the Intel Paragon and has also provided an experimental evaluation for the Parallel File System (PFS) and parallel input/output on the Paragon. This report is organized as follows. The summary of findings discusses the results of each of the aforementioned 3 studies. Three appendices, each containing a key scholarly research paper that details the work in one of the studies, are included.
Li, Tingting; Cheng, Zhengguo; Zhang, Le
2017-12-01
Since they can provide a natural and flexible description of the nonlinear dynamic behavior of complex systems, agent-based models (ABM) have been commonly used for immune system simulation. However, it is crucial for ABM to obtain an appropriate estimation for the key parameters of the model by incorporating experimental data. In this paper, a systematic procedure for immune system simulation by integrating the ABM and regression method under the framework of history matching is developed. A novel parameter estimation method by incorporating the experiment data for the simulator ABM during the procedure is proposed. First, we employ ABM as simulator to simulate the immune system. Then, the dimension-reduced type generalized additive model (GAM) is employed to train a statistical regression model by using the input and output data of ABM, and this model plays the role of an emulator during history matching. Next, we reduce the input space of parameters by introducing an implausibility measure to discard the implausible input values. At last, the estimation of model parameters is obtained using the particle swarm optimization algorithm (PSO) by fitting the experiment data among the non-implausible input values. The real Influenza A Virus (IAV) data set is employed to demonstrate the performance of our proposed method, and the results show that the proposed method not only has good fitting and predicting accuracy, but also offers favorable computational efficiency.
Factors affecting sustainable dairy production: A case study from Uva Province of Sri Lanka
NASA Astrophysics Data System (ADS)
Wijethilaka, D.; De Silva, S.; Deshapriya, R. M. C.; Gunaratne, L. H. P.
2018-05-01
Dairy farming has been playing a key role in improving household incomes and food security for rural communities in Sri Lanka. Nevertheless, it has failed to meet the expected self-sufficiency. In 2015, Sri Lanka imported 51 percent of the national milk requirement, spending US$ 251 million from its debt-ridden economy. This paper aims to analyse the socio-economic characteristics of dairy farmers and the factors affecting dairy production efficiency in the Uva Province of Sri Lanka, an area of high potential comprising all the dairy value chain actors. A survey was conducted with farmers and with key informants from input suppliers, collectors, transporters, processors, sellers and support service providers. Results revealed that intensive farmers' milk yield per cow was only 7.97 L/day, which was 35% and 60% higher than the yields of semi-intensive and extensive farmers respectively. The highest profit of Rs. 53.30 per litre was earned by extensive farmers, whereas it was Rs. 47.63 for semi-intensive and Rs. 44.76 for intensive farmers, if family labour cost was not taken into account. The technical efficiency analysis revealed that 37.1% and 20% of the milk production of intensive and semi-intensive farmers, respectively, is lost to inefficiency and could be recovered without any additional inputs. The main factors affecting efficiency in milk production included farmers' socio-economic characteristics and farm characteristics. Based on the results it can be concluded that sustainable dairy production depends on farmer training, collectivizing farmers into farmer societies, culling unproductive male animals, and increasing the availability of and access to AI and other breeding programs, low-cost quality concentrate feed and other supplements; appropriate measures should therefore be taken to provide these conditions if Sri Lanka aims to achieve self-sufficiency in milk production.
Automated Cryocooler Monitor and Control System
NASA Technical Reports Server (NTRS)
Britcliffe, Michael J.; Hanscon, Theodore R.; Fowler, Larry E.
2011-01-01
A system was designed to automate cryogenically cooled low-noise amplifier systems used in the NASA Deep Space Network. It automates the entire operation of the system including cool-down, warm-up, and performance monitoring. The system is based on a single-board computer with custom software and hardware to monitor and control the cryogenic operation of the system. The system provides local display and control, and can be operated remotely via a Web interface. The system controller is based on a commercial single-board computer with onboard data acquisition capability. The commercial hardware includes a microprocessor, an LCD (liquid crystal display), seven LED (light emitting diode) displays, a seven-key keypad, an Ethernet interface, 40 digital I/O (input/output) ports, 11 A/D (analog to digital) inputs, four D/A (digital to analog) outputs, and an external relay board to control the high-current devices. The temperature sensors used are commercial silicon diode devices that provide a non-linear voltage output proportional to temperature. The devices are excited with a 10-microamp bias current. The system is capable of monitoring and displaying three temperatures. The vacuum sensors are commercial thermistor devices. The output of the sensors is a non-linear voltage proportional to vacuum pressure in the 1-Torr to 1-millitorr range. Two sensors are used. One measures the vacuum pressure in the cryocooler and the other the pressure at the input to the vacuum pump. The helium pressure sensor is a commercial device that provides a linear voltage output from 1 to 5 volts, corresponding to a gas pressure from 0 to 3.5 MPa (approximately 500 psig). Control of the vacuum process is accomplished with a commercial electrically operated solenoid valve. A commercial motor starter is used to control the input power of the compressor. The warm-up heaters are commercial power resistors sized to provide the appropriate power for the thermal mass of the particular system, and typically provide 50 watts of heat. There are four basic operating modes. "Cool" mode commands the system to cool to normal operating temperature. "Heat" mode is used to warm the device to a set temperature near room temperature. "Pump" mode is a maintenance function that allows the vacuum system to be operated alone to remove accumulated contaminants from the vacuum area. In "Off" mode, no power is applied to the system.
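As a rough illustration of reading such a non-linear silicon-diode temperature sensor, the sketch below interpolates temperature from a calibration curve. The voltage/temperature pairs are invented placeholders, not DSN calibration data, and the helper name is hypothetical.

```python
import numpy as np

# Hypothetical calibration curve for a silicon-diode sensor at 10 uA bias:
# voltage rises as the diode cools, so temperature is recovered by interpolation.
CAL_VOLTS = np.array([0.45, 0.55, 0.75, 0.95, 1.05, 1.15])    # diode voltage, V (increasing)
CAL_TEMPS = np.array([300.0, 250.0, 150.0, 60.0, 25.0, 10.0])  # corresponding temperature, K

def diode_temperature(voltage):
    # np.interp requires increasing x values; CAL_VOLTS already satisfies this.
    return float(np.interp(voltage, CAL_VOLTS, CAL_TEMPS))

print(diode_temperature(0.85))   # interpolates between the 150 K and 60 K calibration points
```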
Enhancing and Adapting Treatment Foster Care: Lessons Learned in Trying to Change Practice.
Murray, Maureen M; Southerland, Dannia; Farmer, Elizabeth M; Ballentine, Kess
2010-01-01
Evidence-based practices to improve outcomes for children with severe behavioral and emotional problems have received a great deal of attention in children's mental health. Therapeutic Foster Care (TFC), a residential intervention for youth with emotional or behavioral problems, is one of the few community-based programs that is considered to be evidence-based. However, as for most treatment approaches, the vast majority of existing programs do not deliver the evidence-based version. In an attempt to fill this gap and improve practice across a wide range of TFC agencies, we developed an enhanced model of TFC based on input from both practice and research. It includes elements associated with improved outcomes for youth in "usual care" TFC agencies as well as key elements from Chamberlain's evidence-based model. The current manuscript describes this "hybrid" intervention - Together Facing the Challenge - and discusses key issues in implementation. We describe the sample and settings, highlight key implementation strategies, and provide "lessons learned" to help guide others who may wish to change practice in existing agencies.
Aspen-triticale alleycropping system: effects of landscape position and fertilizer rate
W.L. Headlee; R.B. Hall; R.S. Jr. Zalesny
2010-01-01
Short-rotation woody crops offer several key advantages over other potential bioenergy feedstocks, particularly with regard to nutrient inputs and biomass storage. However, a key disadvantage is a lack of income for the grower early in the rotation. Alleycropping offers the opportunity to grow annual crops for income while the trees become established.
Reversible control of biofilm formation by Cellulomonas spp. in response to nitrogen availability.
Young, Jenna M; Leschine, Susan B; Reguera, Gemma
2012-03-01
The microbial degradation of cellulose contributes greatly to the cycling of carbon in terrestrial environments and feedbacks to the atmosphere, a process that is highly responsive to nitrogen inputs. Yet how key groups of cellulolytic microorganisms adaptively respond to the global conditions of nitrogen limitation and/or anthropogenic or climate nitrogen inputs is poorly understood. The actinobacterial genus Cellulomonas is of special interest because it incorporates the only species known to degrade cellulose aerobically and anaerobically. Furthermore, despite their inability to fix nitrogen, they are active decomposers in nitrogen-limited environments. Here we show that nitrogen limitation induced biofilm formation in Cellulomonas spp., a process that was coupled to carbon sequestration and storage in a curdlan-type biofilm matrix. The response was reversible and the curdlan matrix was solubilized and used as a carbon and energy source for biofilm dispersal once nitrogen sources became available. The biofilms attached strongly to cellulosic surfaces and, despite the growth limitation, produced cellulases and degraded cellulose more efficiently. The results show that biofilm formation is a competitive strategy for carbon and nitrogen acquisition and provide valuable insights linking nitrogen inputs to carbon sequestration and remobilization in terrestrial environments. © 2011 Society for Applied Microbiology and Blackwell Publishing Ltd.
Delevich, Kristen; Tucciarone, Jason; Huang, Z. Josh
2015-01-01
Although the medial prefrontal cortex (mPFC) is classically defined by its reciprocal connections with the mediodorsal thalamic nucleus (MD), the nature of information transfer between MD and mPFC is poorly understood. In sensory thalamocortical pathways, thalamic recruitment of feedforward inhibition mediated by fast-spiking, putative parvalbumin-expressing (PV) interneurons is a key feature that enables cortical neurons to represent sensory stimuli with high temporal fidelity. Whether a similar circuit mechanism is in place for the projection from the MD (a higher-order thalamic nucleus that does not receive direct input from the periphery) to the mPFC is unknown. Here we show in mice that inputs from the MD drive disynaptic feedforward inhibition in the dorsal anterior cingulate cortex (dACC) subregion of the mPFC. In particular, we demonstrate that axons arising from MD neurons directly synapse onto and excite PV interneurons that in turn mediate feedforward inhibition of pyramidal neurons in layer 3 of the dACC. This feedforward inhibition in the dACC limits the time window during which pyramidal neurons integrate excitatory synaptic inputs and fire action potentials, but in a manner that allows for greater flexibility than in sensory cortex. These findings provide a foundation for understanding the role of MD-PFC circuit function in cognition. PMID:25855185
2011-07-25
testing, the EFTR must be keyed with the same key used to encrypt the Enhanced Flight Termination Systems (EFTS) message. To ensure identical keys...required to verify the proper state. e. Procedure. (1) Pull up EFTS graphic user interface (GUI) (Figure 3). (2) Click "Receiver Power On...commanded mode steady state input currents will not exceed their specified values. Figure 3. EFTS GUI
Deng, Rongkang; Kao, Joseph P Y; Kanold, Patrick O
2017-05-09
GABAergic activity is important in neocortical development and plasticity. Because the maturation of GABAergic interneurons is regulated by neural activity, the source of excitatory inputs to GABAergic interneurons plays a key role in development. We show, by laser-scanning photostimulation, that layer 4 and layer 5 GABAergic interneurons in the auditory cortex in neonatal mice
Scaling of global input-output networks
NASA Astrophysics Data System (ADS)
Liang, Sai; Qi, Zhengling; Qu, Shen; Zhu, Ji; Chiu, Anthony S. F.; Jia, Xiaoping; Xu, Ming
2016-06-01
Examining scaling patterns of networks can help understand how structural features relate to the behavior of the networks. Input-output networks consist of industries as nodes and inter-industrial exchanges of products as links. Previous studies consider limited measures for node strengths and link weights, and also ignore the impact of dataset choice. We consider a comprehensive set of indicators in this study that are important in economic analysis, and also examine the impact of dataset choice, by studying input-output networks in individual countries and the entire world. Results show that Burr, Log-Logistic, Log-normal, and Weibull distributions can better describe scaling patterns of global input-output networks. We also find that dataset choice has limited impacts on the observed scaling patterns. Our findings can help examine the quality of economic statistics, estimate missing data in economic statistics, and identify key nodes and links in input-output networks to support economic policymaking.
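One possible way to reproduce the distribution comparison mentioned above is sketched below with SciPy. The choice of Burr Type XII (scipy's burr12) for "Burr", fixing the location parameter at zero, and ranking candidates by AIC are assumptions made for illustration, not the study's code.

```python
import numpy as np
from scipy import stats

# Fit candidate heavy-tailed distributions to a vector of node strengths (or link
# weights) and rank them by AIC; lower AIC indicates a better-fitting candidate.
def rank_distributions(strengths):
    candidates = {
        "burr": stats.burr12,           # Burr Type XII, one common reading of "Burr"
        "log-logistic": stats.fisk,     # SciPy's name for the log-logistic distribution
        "lognormal": stats.lognorm,
        "weibull": stats.weibull_min,
    }
    results = {}
    for name, dist in candidates.items():
        params = dist.fit(strengths, floc=0)               # location fixed at zero
        loglik = np.sum(dist.logpdf(strengths, *params))
        results[name] = 2 * len(params) - 2 * loglik       # AIC
    return sorted(results.items(), key=lambda kv: kv[1])

# synthetic positive "strengths" just to exercise the function
strengths = stats.lognorm.rvs(s=1.0, scale=1.0, size=500, random_state=0)
print(rank_distributions(strengths))
```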
DREAM-3D and the importance of model inputs and boundary conditions
NASA Astrophysics Data System (ADS)
Friedel, Reiner; Tu, Weichao; Cunningham, Gregory; Jorgensen, Anders; Chen, Yue
2015-04-01
Recent work on radiation belt 3D diffusion codes such as the Los Alamos "DREAM-3D" code has demonstrated the ability of such codes to reproduce realistic magnetospheric storm events in the relativistic electron dynamics - as long as sufficient "event-oriented" boundary conditions and code inputs such as wave powers, low energy boundary conditions, background plasma densities, and last closed drift shell (outer boundary) are available. In this talk we will argue that the main limiting factor in our modeling ability is no longer our inability to represent key physical processes that govern the dynamics of the radiation belts (radial, pitch angle and energy diffusion) but rather our limitations in specifying accurate boundary conditions and code inputs. We use DREAM-3D runs to show the sensitivity of the modeled outcomes to these boundary conditions and inputs, and also discuss alternate "proxy" approaches to obtain the required inputs from other (ground-based) sources.
Miller, Derek M; DeMayo, William M; Bourdages, George H; Wittman, Samuel R; Yates, Bill J; McCall, Andrew A
2017-04-01
The integration of inputs from vestibular and proprioceptive sensors within the central nervous system is critical to postural regulation. We recently demonstrated in both decerebrate and conscious cats that labyrinthine and hindlimb inputs converge onto vestibular nucleus neurons. The pontomedullary reticular formation (pmRF) also plays a key role in postural control, and additionally participates in regulating locomotion. Thus, we hypothesized that like vestibular nucleus neurons, pmRF neurons integrate inputs from the limb and labyrinth. To test this hypothesis, we recorded the responses of pmRF neurons to passive ramp-and-hold movements of the hindlimb and to whole-body tilts, in both decerebrate and conscious felines. We found that pmRF neuronal activity was modulated by hindlimb movement in the rostral-caudal plane. Most neurons in both decerebrate (83% of units) and conscious (61% of units) animals encoded both flexion and extension movements of the hindlimb. In addition, hindlimb somatosensory inputs converged with vestibular inputs onto pmRF neurons in both preparations. Pontomedullary reticular formation neurons receiving convergent vestibular and limb inputs likely participate in balance control by governing reticulospinal outflow.
Miller, Derek M.; DeMayo, William M.; Bourdages, George H.; Wittman, Samuel; Yates, Bill J.; McCall, Andrew A.
2017-01-01
The integration of inputs from vestibular and proprioceptive sensors within the central nervous system is critical to postural regulation. We recently demonstrated in both decerebrate and conscious cats that labyrinthine and hindlimb inputs converge onto vestibular nucleus neurons. The pontomedullary reticular formation (pmRF) also plays a key role in postural control, and additionally participates in regulating locomotion. Thus, we hypothesized that like vestibular nucleus neurons, pmRF neurons integrate inputs from the limb and labyrinth. To test this hypothesis, we recorded the responses of pmRF neurons to passive ramp-and-hold movements of the hindlimb and to whole-body tilts, in both decerebrate and conscious felines. We found that pmRF neuronal activity was modulated by hindlimb movement in the rostral-caudal plane. Most neurons in both decerebrate (83% of units) and conscious (61% of units) animals encoded both flexion and extension movements of the hindlimb. Additionally, hindlimb somatosensory inputs converged with vestibular inputs onto pmRF neurons in both preparations. Pontomedullary reticular formation neurons receiving convergent vestibular and limb inputs likely participate in balance control by governing reticulospinal outflow. PMID:28188328
Real Time Calibration Method for Signal Conditioning Amplifiers
NASA Technical Reports Server (NTRS)
Medelius, Pedro J. (Inventor); Mata, Carlos T. (Inventor); Eckhoff, Anthony (Inventor); Perotti, Jose (Inventor); Lucena, Angel (Inventor)
2004-01-01
A signal conditioning amplifier receives an input signal from an input such as a transducer. The signal is amplified and processed through an analog to digital converter and sent to a processor. The processor estimates the input signal provided by the transducer to the amplifier via a multiplexer. The estimated input signal is provided as a calibration voltage to the amplifier immediately following the receipt of the amplified input signal. The calibration voltage is amplified by the amplifier and provided to the processor as an amplified calibration voltage. The amplified calibration voltage is compared to the amplified input signal, and if a significant error exists, the gain and/or offset of the amplifier may be adjusted as necessary.
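A schematic numerical sketch of the feedback idea above (estimate the input, replay it as a calibration voltage, compare the two amplified values) follows. The gain and offset figures and the simple linear amplifier stand-in are assumptions for illustration, not the patented circuit or its firmware.

```python
# The processor estimates the transducer input from the amplified reading, replays
# that estimate as a calibration voltage, and compares the two amplified values;
# a significant error would trigger a gain/offset adjustment.
def calibration_error(measured, amplify, gain_model, offset_model):
    estimate = (measured - offset_model) / gain_model    # processor's input estimate
    check = amplify(estimate)                             # amplified calibration voltage
    return check - measured                               # large error -> adjust gain/offset

true_gain, true_offset = 100.0, 0.05                      # hypothetical amplifier behavior
amplify = lambda v: true_gain * v + true_offset           # stand-in for the real amplifier
measured = amplify(0.0123)                                # amplified transducer signal

# With a mismatched model the error is non-zero, signalling that adjustment is needed.
print(calibration_error(measured, amplify, gain_model=98.0, offset_model=0.0))
print(calibration_error(measured, amplify, gain_model=true_gain, offset_model=true_offset))
```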
40 CFR 97.76 - Additional requirements to provide heat input data.
Code of Federal Regulations, 2010 CFR
2010-07-01
... heat input data. 97.76 Section 97.76 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Monitoring and Reporting § 97.76 Additional requirements to provide heat input data. The owner or operator of... a flow system shall also monitor and report heat input rate at the unit level using the procedures...
40 CFR 97.76 - Additional requirements to provide heat input data.
Code of Federal Regulations, 2011 CFR
2011-07-01
... heat input data. 97.76 Section 97.76 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Monitoring and Reporting § 97.76 Additional requirements to provide heat input data. The owner or operator of... a flow system shall also monitor and report heat input rate at the unit level using the procedures...
40 CFR 97.76 - Additional requirements to provide heat input data.
Code of Federal Regulations, 2013 CFR
2013-07-01
... heat input data. 97.76 Section 97.76 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Monitoring and Reporting § 97.76 Additional requirements to provide heat input data. The owner or operator of... a flow system shall also monitor and report heat input rate at the unit level using the procedures...
40 CFR 97.76 - Additional requirements to provide heat input data.
Code of Federal Regulations, 2014 CFR
2014-07-01
... heat input data. 97.76 Section 97.76 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Monitoring and Reporting § 97.76 Additional requirements to provide heat input data. The owner or operator of... a flow system shall also monitor and report heat input rate at the unit level using the procedures...
40 CFR 97.76 - Additional requirements to provide heat input data.
Code of Federal Regulations, 2012 CFR
2012-07-01
... heat input data. 97.76 Section 97.76 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Monitoring and Reporting § 97.76 Additional requirements to provide heat input data. The owner or operator of... a flow system shall also monitor and report heat input rate at the unit level using the procedures...
Stokes, Elizabeth A; Wordsworth, Sarah; Staves, Julie; Mundy, Nicola; Skelly, Jane; Radford, Kelly; Stanworth, Simon J
2018-04-01
In an environment of limited health care resources, it is crucial for health care systems which provide blood transfusion to have accurate and comprehensive information on the costs of transfusion, incorporating not only the costs of blood products, but also their administration. Unfortunately, in many countries accurate costs for administering blood are not available. Our study aimed to generate comprehensive estimates of the costs of administering transfusions for the UK National Health Service. A detailed microcosting study was used to cost two key inputs into transfusion: transfusion laboratory and nursing inputs. For each input, data collection forms were developed to capture staff time, equipment, and consumables associated with each step in the transfusion process. Costing results were combined with costs of blood product wastage to calculate the cost per unit transfused, separately for different blood products. Data were collected in 2014/15 British pounds and converted to US dollars. A total of 438 data collection forms were completed by 74 staff. The cost of administering blood was $71 (£49) per unit for red blood cells, $84 (£58) for platelets, $55 (£38) for fresh-frozen plasma, and $72 (£49) for cryoprecipitate. Blood administration costs add substantially to the costs of the blood products themselves. These are frequently incurred costs; applying estimates to the blood components supplied to UK hospitals in 2015, the annual cost of blood administration, excluding blood products, exceeds $175 (£120) million. These results provide more accurate estimates of the total costs of transfusion than those previously available. © 2018 AABB.
Almagro, Bartolomé J; Sáenz-López, Pedro; Moreno, Juan A
2010-01-01
The purpose of this study was to test a motivational model of the coach-athlete relationship, based on self-determination theory and on the hierarchical model of intrinsic and extrinsic motivation. The sample comprised 608 athletes (ages 12-17 years) who completed the following measures: interest in athlete's input, praise for autonomous behavior, perceived autonomy, intrinsic motivation, and the intention to be physically active. Structural equation modeling results demonstrated that interest in athletes' input and praise for autonomous behavior predicted perceived autonomy, and perceived autonomy positively predicted intrinsic motivation. Finally, intrinsic motivation predicted the intention to be physically active in the future. The results are discussed in relation to the importance of the climate of autonomy support created by the coach on intrinsic motivation and adherence to sport by adolescent athletes. Further, the results provide information related to the possible objectives of future interventions for the education of coaches, with the goal of providing them with tools and strategies to favor the development of intrinsic motivation among their athletes. In conclusion, the climate of autonomy support created by the coach can predict the autonomy perceived by the athletes, which predicts the intrinsic motivation experienced by the athletes and, therefore, their adherence to athletic practice. Key points: Importance of the climate of autonomy support created by the coach on intrinsic motivation and adherence to sport by adolescent athletes. Interest in athletes' input and praise for autonomous behavior predicted perceived autonomy, and perceived autonomy positively predicted intrinsic motivation. Intrinsic motivation predicted the intention to be physically active in the future.
Almagro, Bartolomé J.; Sáenz-López, Pedro; Moreno, Juan A.
2010-01-01
The purpose of this study was to test a motivational model of the coach-athlete relationship, based on self-determination theory and on the hierarchical model of intrinsic and extrinsic motivation. The sample comprised 608 athletes (ages 12-17 years) who completed the following measures: interest in athlete's input, praise for autonomous behavior, perceived autonomy, intrinsic motivation, and the intention to be physically active. Structural equation modeling results demonstrated that interest in athletes' input and praise for autonomous behavior predicted perceived autonomy, and perceived autonomy positively predicted intrinsic motivation. Finally, intrinsic motivation predicted the intention to be physically active in the future. The results are discussed in relation to the importance of the climate of autonomy support created by the coach on intrinsic motivation and adherence to sport by adolescent athletes. Further, the results provide information related to the possible objectives of future interventions for the education of coaches, with the goal of providing them with tools and strategies to favor the development of intrinsic motivation among their athletes. In conclusion, the climate of autonomy support created by the coach can predict the autonomy perceived by the athletes, which predicts the intrinsic motivation experienced by the athletes and, therefore, their adherence to athletic practice. Key points: Importance of the climate of autonomy support created by the coach on intrinsic motivation and adherence to sport by adolescent athletes. Interest in athletes' input and praise for autonomous behavior predicted perceived autonomy, and perceived autonomy positively predicted intrinsic motivation. Intrinsic motivation predicted the intention to be physically active in the future. PMID:24149380
Li-Zn-Pb multi isotopic characterization of the Loire River Basin, France
NASA Astrophysics Data System (ADS)
Millot, R.; Desaulty, A.; Widory, D.; Bourrain, X.
2013-12-01
The Loire River in France is approximately 1010 km long and drains an area of 117 800 km2. Upstream, the Loire River flows following a south to north direction from the Massif Central down to the city of Orléans, 650 km from its source. The Loire River is one of the main European riverine inputs to the Atlantic Ocean. Over time, its basin has been exposed to numerous sources of anthropogenic metal pollution, such as metal mining, industry, agriculture and domestic inputs. The Loire River basin is thus an excellent study site to develop new isotope systematics for tracking anthropogenic sources of metal pollution (Zn and Pb) and also to investigate Li isotope tracing, which can provide key information on the nature of weathering processes at the Loire River Basin scale. Preliminary data show that Li-Zn-Pb concentrations and isotopic compositions span a wide range in river waters of the Loire River main stream and the main tributaries. There is a clear contrast between the headwaters upstream and rivers located downstream in the lowlands. In addition, one of the major tributaries within the Massif Central (the Allier River) is clearly influenced by inputs resulting from mineralizations and thermomineral waters. The results showed that, on their own, each of these isotope systematics reveals important information about the geogenic or anthropogenic origin of Li, Zn and Pb. Considered together, however, they provide a more integrated understanding of the overall budgets of these elements at the scale of the Loire River Basin.
Advancing the Implementation of Hydrologic Models as Web-based Applications
NASA Astrophysics Data System (ADS)
Dahal, P.; Tarboton, D. G.; Castronova, A. M.
2017-12-01
Advanced computer simulations are required to understand hydrologic phenomenon such as rainfall-runoff response, groundwater hydrology, snow hydrology, etc. Building a hydrologic model instance to simulate a watershed requires investment in data (diverse geospatial datasets such as terrain, soil) and computer resources, typically demands a wide skill set from the analyst, and the workflow involved is often difficult to reproduce. This work introduces a web-based prototype infrastructure in the form of a web application that provides researchers with easy to use access to complete hydrological modeling functionality. This includes creating the necessary geospatial and forcing data, preparing input files for a model by applying complex data preprocessing, running the model for a user defined watershed, and saving the results to a web repository. The open source Tethys Platform was used to develop the web app front-end Graphical User Interface (GUI). We used HydroDS, a webservice that provides data preparation processing capability to support backend computations used by the app. Results are saved in HydroShare, a hydrologic information system that supports the sharing of hydrologic data, model and analysis tools. The TOPographic Kinematic APproximation and Integration (TOPKAPI) model served as the example for which we developed a complete hydrologic modeling service to demonstrate the approach. The final product is a complete modeling system accessible through the web to create input files, and run the TOPKAPI hydrologic model for a watershed of interest. We are investigating similar functionality for the preparation of input to Regional Hydro-Ecological Simulation System (RHESSys). Key Words: hydrologic modeling, web services, hydrologic information system, HydroShare, HydroDS, Tethys Platform
Acute Radiation Risk and BRYNTRN Organ Dose Projection Graphical User Interface
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.; Hu, Shaowen; Nounu, Hateni N.; Kim, Myung-Hee
2011-01-01
The integration of human space applications risk projection models of organ dose and acute radiation risk has been a key problem. NASA has developed an organ dose projection model using the BRYNTRN with SUM DOSE computer codes, and a probabilistic model of Acute Radiation Risk (ARR). The codes BRYNTRN and SUM DOSE are a Baryon transport code and an output data processing code, respectively. The risk projection models of organ doses and ARR take the output from BRYNTRN as an input to their calculations. With a graphical user interface (GUI) to handle input and output for BRYNTRN, the response models can be connected easily and correctly to BRYNTRN. A GUI for the ARR and BRYNTRN Organ Dose (ARRBOD) projection code provides seamless integration of input and output manipulations, which are required for operations of the ARRBOD modules. The ARRBOD GUI is intended for mission planners, radiation shield designers, space operations in the mission operations directorate (MOD), and space biophysics researchers. BRYNTRN code operation requires extensive input preparation. Only a graphical user interface (GUI) can handle input and output for BRYNTRN to the response models easily and correctly. The purpose of the GUI development for ARRBOD is to provide seamless integration of input and output manipulations for the operations of projection modules (BRYNTRN, SLMDOSE, and the ARR probabilistic response model) in assessing the acute risk and the organ doses of significant Solar Particle Events (SPEs). The assessment of astronauts radiation risk from SPE is in support of mission design and operational planning to manage radiation risks in future space missions. The ARRBOD GUI can identify the proper shielding solutions using the gender-specific organ dose assessments in order to avoid ARR symptoms, and to stay within the current NASA short-term dose limits. The quantified evaluation of ARR severities based on any given shielding configuration and a specified EVA or other mission scenario can be made to guide alternative solutions for attaining determined objectives set by mission planners. The ARRBOD GUI estimates the whole-body effective dose, organ doses, and acute radiation sickness symptoms for astronauts, by which operational strategies and capabilities can be made for the protection of astronauts from SPEs in the planning of future lunar surface scenarios, exploration of near-Earth objects, and missions to Mars.
Enhancing the role of science in the decision-making of the European Union.
Allio, Lorenzo; Ballantine, Bruce; Meads, Richard
2006-02-01
Used well, science provides effective ways of identifying potential risks, protecting citizens, and using resources wisely. It enables government decisions to be based on evidence and provides a foundation for a rule-based framework that supports global trade. To ensure that the best available science becomes a key input in the decisions made by EU institutions, this abridged version of a working paper produced for the European Policy Centre, a leading, independent think tank, considers how science is currently used in the policy and decision-making processes of the EU, what the limitations of scientific evidence are, and how a risk assessment process based on scientific 'good practices' can be advantageous. Finally, the paper makes recommendations on how to improve the use of science by EU institutions.
Modeling of Melt-Infiltrated SiC/SiC Composite Properties
NASA Technical Reports Server (NTRS)
Mital, Subodh K.; Bednarcyk, Brett A.; Arnold, Steven M.; Lang, Jerry
2009-01-01
The elastic properties of a two-dimensional five-harness melt-infiltrated silicon carbide fiber reinforced silicon carbide matrix (MI SiC/SiC) ceramic matrix composite (CMC) were predicted using several methods. Methods used in this analysis are multiscale laminate analysis, micromechanics-based woven composite analysis, a hybrid woven composite analysis, and two- and three-dimensional finite element analyses. The elastic properties predicted are in good agreement with each other as well as with the available measured data. However, the various methods differ from each other in three key areas: (1) the fidelity provided, (2) the efforts required for input data preparation, and (3) the computational resources required. Results also indicate that efficient methods are also able to provide a reasonable estimate of local stress fields.
Solid-State Lighting 2017 Suggested Research Topics
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
2017-09-29
A 2017 update to the Solid-State Lighting R&D Plan that is divided into two documents. The first document describes a list of suggested SSL priority research topics and the second document provides context and background, including information drawn from technical, market, and economic studies. Widely referenced by industry and government both here and abroad, these documents reflect SSL stakeholder inputs on key R&D topics that will improve efficacy, reduce cost, remove barriers to adoption, and add value for LED and OLED lighting solutions over the next three to five years, and discuss those applications that drive and prioritize the specific R&D.
A modal parameter extraction procedure applicable to linear time-invariant dynamic systems
NASA Technical Reports Server (NTRS)
Kurdila, A. J.; Craig, R. R., Jr.
1985-01-01
Modal analysis has emerged as a valuable tool in many phases of the engineering design process. Complex vibration and acoustic problems in new designs can often be remedied through use of the method. Moreover, the technique has been used to enhance the conceptual understanding of structures by serving to verify analytical models. A new modal parameter estimation procedure is presented. The technique is applicable to linear, time-invariant systems and accommodates multiple input excitations. In order to provide a background for the derivation of the method, some modal parameter extraction procedures currently in use are described. Key features implemented in the new technique are elaborated upon.
The deep ocean under climate change
NASA Astrophysics Data System (ADS)
Levin, Lisa A.; Le Bris, Nadine
2015-11-01
The deep ocean absorbs vast amounts of heat and carbon dioxide, providing a critical buffer to climate change but exposing vulnerable ecosystems to combined stresses of warming, ocean acidification, deoxygenation, and altered food inputs. Resulting changes may threaten biodiversity and compromise key ocean services that maintain a healthy planet and human livelihoods. There exist large gaps in understanding of the physical and ecological feedbacks that will occur. Explicit recognition of deep-ocean climate mitigation and inclusion in adaptation planning by the United Nations Framework Convention on Climate Change (UNFCCC) could help to expand deep-ocean research and observation and to protect the integrity and functions of deep-ocean ecosystems.
Crew behavior and performance in space analog environments
NASA Technical Reports Server (NTRS)
Kanki, Barbara G.
1992-01-01
The objectives and the current status of the Crew Factors research program conducted at NASA-Ames Research Center are reviewed. The principal objectives of the program are to determine the effects of a broad class of input variables on crew performance and to provide guidance with respect to the design and management of crews assigned to future space missions. A wide range of research environments are utilized, including controlled experimental settings, high fidelity full mission simulator facilities, and fully operational field environments. Key group processes are identified, and preliminary data are presented on the effect of crew size, type, and structure on team performance.
Solid-State Lighting 2017 Suggested Research Topics
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
A 2017 update to the Solid-State Lighting R&D Plan that is divided into two documents. The first document describes a list of suggested SSL priority research topics and the second document provides context and background, including information drawn from technical, market, and economic studies. Widely referenced by industry and government both here and abroad, these documents reflect SSL stakeholder inputs on key R&D topics that will improve efficacy, reduce cost, remove barriers to adoption, and add value for LED and OLED lighting solutions over the next three to five years, and discuss those applications that drive and prioritize the specific R&D.
Pitot tube calculations with a TI-59
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, K.
Industrial plant and stack analysis dictates that flow measurements in ducts be accurate. This is usually accomplished by running a traverse with a pitot tube across the duct or flue. A traverse is a series of measurements taken at predetermined points across the duct. The values of these measurements are calculated into point flow rates and averaged. A program for the Texas Instruments TI-59 programmable calculator follows. The program will perform calculations for an infinite number of test points, both with the standard (combined impact type) pitot tube and the S-type (combined reverse type). The type of tube is selected by inputting an indicating value that triggers a flag in the program. To use the standard pitot tube, a 1 is input into key E. When the S-type is used, a zero is input into key E. The program output will note if the S-type had been used. Since most process systems are not at standard conditions (32/sup 0/F, 1 atm) the program will take this into account.
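The per-point calculation behind such a traverse can be sketched in a few lines of present-day code. The tube coefficients used here (about 0.99 for a standard pitot tube and about 0.84 for an S-type) are typical handbook values assumed for illustration, and the function is not a transcription of the TI-59 program.

```python
import math

# Convert a traverse of velocity pressures into an average duct velocity and a
# volumetric flow rate, switching the pitot coefficient by tube type.
def traverse_flow(velocity_pressures_pa, density_kg_m3, duct_area_m2, standard_tube=True):
    cp = 0.99 if standard_tube else 0.84                   # assumed typical coefficients
    velocities = [cp * math.sqrt(2.0 * dp / density_kg_m3)  # point velocity, m/s
                  for dp in velocity_pressures_pa]
    v_avg = sum(velocities) / len(velocities)
    return v_avg * duct_area_m2                             # volumetric flow, m^3/s

# example traverse: four points in Pa, air at 1.2 kg/m^3, a 0.5 m^2 duct, S-type tube
print(traverse_flow([120.0, 135.0, 128.0, 140.0], 1.2, 0.5, standard_tube=False))
```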
Key management of the double random-phase-encoding method using public-key encryption
NASA Astrophysics Data System (ADS)
Saini, Nirmala; Sinha, Aloka
2010-03-01
Public-key encryption has been used to encode the key of the encryption process. In the proposed technique, an input image has been encrypted by using the double random-phase-encoding method using extended fractional Fourier transform. The key of the encryption process has been encoded by using the Rivest-Shamir-Adleman (RSA) public-key encryption algorithm. The encoded key has then been transmitted to the receiver side along with the encrypted image. In the decryption process, first the encoded key has been decrypted using the secret key and then the encrypted image has been decrypted by using the retrieved key parameters. The proposed technique has an advantage over the double random-phase-encoding method because the problem associated with the transmission of the key has been eliminated by using public-key encryption. Computer simulation has been carried out to validate the proposed technique.
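The key-management flow (encode the encryption key with the receiver's RSA public key, recover it with the private key before image decryption) can be illustrated with toy numbers. The primes and the packed key parameter below are deliberately tiny and insecure, chosen only to keep the sketch self-contained; real use would rely on a vetted RSA implementation.

```python
# Toy RSA illustration of the key-management step (insecure parameters on purpose).
p, q = 61, 53
n, e = p * q, 17                          # receiver's public key (n, e)
d = pow(e, -1, (p - 1) * (q - 1))         # receiver's private exponent

key_parameter = 1234                       # stand-in for an encoded DRPE key parameter
ciphertext = pow(key_parameter, e, n)      # sender encodes the key with the public key
recovered = pow(ciphertext, d, n)          # receiver recovers it before decrypting the image
assert recovered == key_parameter
print(ciphertext, recovered)
```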
Model-free adaptive control of supercritical circulating fluidized-bed boilers
Cheng, George Shu-Xing; Mulkey, Steven L
2014-12-16
A novel 3-Input-3-Output (3×3) Fuel-Air Ratio Model-Free Adaptive (MFA) controller is introduced, which can effectively control key process variables including Bed Temperature, Excess O2, and Furnace Negative Pressure of combustion processes of advanced boilers. A novel 7-input-7-output (7×7) MFA control system is also described for controlling a combined 3-Input-3-Output (3×3) process of Boiler-Turbine-Generator (BTG) units and a 5×5 CFB combustion process of advanced boilers. Those boilers include Circulating Fluidized-Bed (CFB) Boilers and Once-Through Supercritical Circulating Fluidized-Bed (OTSC CFB) Boilers.
NASA Astrophysics Data System (ADS)
Whitehead, James Joshua
The analysis documented herein provides an integrated approach for the conduct of optimization under uncertainty (OUU) using Monte Carlo Simulation (MCS) techniques coupled with response surface-based methods for characterization of mixture-dependent variables. This novel methodology provides an innovative means of conducting optimization studies under uncertainty in propulsion system design. Analytic inputs are based upon empirical regression rate information obtained from design of experiments (DOE) mixture studies utilizing a mixed oxidizer hybrid rocket concept. Hybrid fuel regression rate was selected as the target response variable for optimization under uncertainty, with maximization of regression rate chosen as the driving objective. Characteristic operational conditions and propellant mixture compositions from experimental efforts conducted during previous foundational work were combined with elemental uncertainty estimates as input variables. Response surfaces for mixture-dependent variables and their associated uncertainty levels were developed using quadratic response equations incorporating single and two-factor interactions. These analysis inputs, response surface equations and associated uncertainty contributions were applied to a probabilistic MCS to develop dispersed regression rates as a function of operational and mixture input conditions within design space. Illustrative case scenarios were developed and assessed using this analytic approach including fully and partially constrained operational condition sets over all of design mixture space. In addition, optimization sets were performed across an operationally representative region in operational space and across all investigated mixture combinations. These scenarios were selected as representative examples relevant to propulsion system optimization, particularly for hybrid and solid rocket platforms. Ternary diagrams, including contour and surface plots, were developed and utilized to aid in visualization. The concept of Expanded-Durov diagrams was also adopted and adapted to this study to aid in visualization of uncertainty bounds. Regions of maximum regression rate and associated uncertainties were determined for each set of case scenarios. Application of response surface methodology coupled with probabilistic-based MCS allowed for flexible and comprehensive interrogation of mixture and operating design space during optimization cases. Analyses were also conducted to assess sensitivity of uncertainty to variations in key elemental uncertainty estimates. The methodology developed during this research provides an innovative optimization tool for future propulsion design efforts.
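A minimal sketch of propagating input uncertainty through a quadratic response surface by Monte Carlo sampling follows. All coefficients, the nominal mixture point, and the input standard deviations are invented placeholders rather than values from the study; the point is only to show how single- and two-factor-interaction terms feed a dispersed response estimate.

```python
import numpy as np

# Quadratic response surface with single- and two-factor interaction terms.
def response_surface(x, beta0, beta, beta_sq, beta_int):
    y = beta0 + beta @ x + beta_sq @ (x ** 2)
    y += sum(beta_int[i, j] * x[i] * x[j]
             for i in range(len(x)) for j in range(i + 1, len(x)))
    return y

rng = np.random.default_rng(0)
nominal = np.array([0.4, 0.4, 0.2])              # hypothetical mixture composition point
sigma = 0.02 * np.ones(3)                        # assumed elemental input uncertainty
beta0, beta = 1.0, np.array([0.8, 0.5, 0.3])     # made-up regression coefficients
beta_sq = np.array([-0.2, -0.1, 0.0])
beta_int = np.zeros((3, 3)); beta_int[0, 1] = 0.4

# Monte Carlo propagation: perturb the inputs and collect the dispersed response.
samples = rng.normal(nominal, sigma, size=(10000, 3))
rates = np.array([response_surface(x, beta0, beta, beta_sq, beta_int) for x in samples])
print(rates.mean(), rates.std())                 # dispersed "regression rate" statistics
```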
40 CFR 96.76 - Additional requirements to provide heat input data for allocations purposes.
Code of Federal Regulations, 2011 CFR
2011-07-01
... heat input data for allocations purposes. 96.76 Section 96.76 Protection of Environment ENVIRONMENTAL... to provide heat input data for allocations purposes. (a) The owner or operator of a unit that elects... also monitor and report heat input at the unit level using the procedures set forth in part 75 of this...
40 CFR 96.76 - Additional requirements to provide heat input data for allocations purposes.
Code of Federal Regulations, 2013 CFR
2013-07-01
... heat input data for allocations purposes. 96.76 Section 96.76 Protection of Environment ENVIRONMENTAL... to provide heat input data for allocations purposes. (a) The owner or operator of a unit that elects... also monitor and report heat input at the unit level using the procedures set forth in part 75 of this...
40 CFR 96.76 - Additional requirements to provide heat input data for allocations purposes.
Code of Federal Regulations, 2012 CFR
2012-07-01
... heat input data for allocations purposes. 96.76 Section 96.76 Protection of Environment ENVIRONMENTAL... to provide heat input data for allocations purposes. (a) The owner or operator of a unit that elects... also monitor and report heat input at the unit level using the procedures set forth in part 75 of this...
40 CFR 96.76 - Additional requirements to provide heat input data for allocations purposes.
Code of Federal Regulations, 2014 CFR
2014-07-01
... heat input data for allocations purposes. 96.76 Section 96.76 Protection of Environment ENVIRONMENTAL... to provide heat input data for allocations purposes. (a) The owner or operator of a unit that elects... also monitor and report heat input at the unit level using the procedures set forth in part 75 of this...
Plikus, Maksim V; Zhang, Zina; Chuong, Cheng-Ming
2006-01-01
Background Understanding research activity within any given biomedical field is important. Search outputs generated by MEDLINE/PubMed are not well classified and require lengthy manual citation analysis. Automation of citation analytics can be very useful and timesaving for both novices and experts. Results The PubFocus web server automates analysis of MEDLINE/PubMed search queries by enriching them with two widely used human factor-based bibliometric indicators of publication quality: journal impact factor and volume of forward references. In addition to providing basic volumetric statistics, PubFocus also prioritizes citations and evaluates authors' impact on the field of search. PubFocus also analyses the presence and occurrence of biomedical key terms within citations by utilizing controlled vocabularies. Conclusion We have developed a citation prioritisation algorithm based on journal impact factor, forward referencing volume, referencing dynamics, and author's contribution level. It can be applied either to the primary set of PubMed search results or to the subsets of these results identified through key terms from controlled biomedical vocabularies and ontologies. NCI (National Cancer Institute) thesaurus and MGD (Mouse Genome Database) mammalian gene orthology have been implemented for key terms analytics. PubFocus provides a scalable platform for the integration of multiple available ontology databases. PubFocus analytics can be adapted for input sources of biomedical citations other than PubMed. PMID:17014720
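A minimal sketch of the kind of weighted citation prioritisation described above is given below; the weights, normalisation, and record fields are illustrative assumptions, not the actual PubFocus algorithm.

```python
# Illustrative scoring of PubMed-style citations by journal impact factor,
# forward-citation volume, and author contribution; weights are arbitrary
# placeholders, not the PubFocus algorithm itself.
citations = [
    {"pmid": "111", "impact_factor": 30.0, "times_cited": 120, "author_rank": 1},
    {"pmid": "222", "impact_factor": 4.5,  "times_cited": 800, "author_rank": 3},
    {"pmid": "333", "impact_factor": 9.1,  "times_cited": 15,  "author_rank": 2},
]

def priority(c, w_if=0.4, w_cites=0.5, w_author=0.1):
    # Normalize each factor within the result set before weighting.
    max_if = max(x["impact_factor"] for x in citations)
    max_ct = max(x["times_cited"] for x in citations)
    return (w_if * c["impact_factor"] / max_if
            + w_cites * c["times_cited"] / max_ct
            + w_author / c["author_rank"])

for c in sorted(citations, key=priority, reverse=True):
    print(c["pmid"], round(priority(c), 3))
```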
Moreland, Leslie D; Gore, Fiona M; Andre, Nathalie; Cairncross, Sandy; Ensink, Jeroen H J
2016-08-01
There are significant gaps in information about the inputs required to effectively extend and sustain hygiene promotion activities to improve people's health outcomes through water, sanitation and hygiene (WASH) interventions. We sought to analyse current country and global trends in the use of key inputs required for effective and sustainable implementation of hygiene promotion to help guide hygiene promotion policy and decision-making after 2015. Data collected in response to the GLAAS 2013/2014 survey from 93 of 94 countries were included, and responses were analysed for 12 questions assessing the inputs and enabling environment for hygiene promotion under four thematic areas. Data from 20 of 23 External Support Agencies (ESAs), collected through self-administered surveys, were also included and analysed. Firstly, the data showed a large variation in the way in which hygiene promotion is defined and what constitutes key activities in this area. Secondly, challenges to implementing hygiene promotion are considerable: they include poor implementation of policies and plans, weak coordination mechanisms, human resource limitations and a lack of available hygiene promotion budget data. Despite the proven benefits of hand washing with soap, a critical hygiene-related factor in minimising infection, GLAAS 2013/2014 survey data showed that hygiene promotion remains a neglected component of WASH. Additional research to identify the context-specific strategies and inputs required to enhance the effectiveness of hygiene promotion at scale is needed. Improved data collection methods are also necessary to advance the availability and reliability of hygiene-specific information. © 2016 John Wiley & Sons Ltd.
Update on value-based medicine.
Brown, Melissa M; Brown, Gary C
2013-05-01
To update concepts in Value-Based Medicine, especially in view of the Patient Protection and Affordable Care Act. The Patient Protection and Affordable Care Act assures that some variant of Value-Based Medicine cost-utility analysis will play a key role in the healthcare system. It identifies the highest quality care, thereby maximizing the most efficacious use of healthcare resources and empowering patients and physicians. Standardization is critical for the creation and acceptance of a Value-Based Medicine cost-utility analysis information system, since 27 million different input variants can go into a cost-utility analysis. Key among such standards is the use of patient preferences (utilities), as patients best understand the quality of life associated with their health states. The inclusion of societal costs, versus direct medical costs alone, demonstrates that medical interventions are more cost effective and, in many instances, provide a net financial return-on-investment to society referent to the direct medical costs expended. Value-Based Medicine provides a standardized methodology, integrating critical patient quality-of-life preferences and societal costs, to allow the highest quality, most cost-effective care. Central to Value-Based Medicine is the concept that all patients deserve the interventions that provide the greatest patient value (improvement in quality of life and/or length of life).
Development of a core set of outcome measures for OAB treatment.
Foust-Wright, Caroline; Wissig, Stephanie; Stowell, Caleb; Olson, Elizabeth; Anderson, Anita; Anger, Jennifer; Cardozo, Linda; Cotterill, Nikki; Gormley, Elizabeth Ann; Toozs-Hobson, Philip; Heesakkers, John; Herbison, Peter; Moore, Kate; McKinney, Jessica; Morse, Abraham; Pulliam, Samantha; Szonyi, George; Wagg, Adrian; Milsom, Ian
2017-12-01
Standardized measures enable the comparison of outcomes across providers and treatments, giving valuable information for improving care quality and efficacy. The aim of this project was to define a minimum standard set of outcome measures and case-mix factors for evaluating the care of patients with overactive bladder (OAB). The International Consortium for Health Outcomes Measurement (ICHOM) convened an international working group (WG) of leading clinicians and patients to engage in a structured method for developing a core outcome set. Consensus was determined by a modified Delphi process, and discussions were supported by both literature review and patient input. The standard set measures outcomes of care for adults seeking treatment for OAB, excluding residents of long-term care facilities. The WG focused on the key outcome domains identified as most important to patients: symptom burden and bother, physical functioning, emotional health, impact of symptoms and treatment on quality of life, and success of treatment. Demographic information and case-mix factors that may affect these outcomes were also included. The standardized outcome set for evaluating clinical care is appropriate for use by all health providers caring for patients with OAB, regardless of specialty or geographic location, and provides key data for quality improvement activities and research.
The future of pediatric dentistry education and curricula: a Chilean perspective.
Mariño, Rodrigo; Ramos-Gómez, Francisco; Manton, David John; Onetto, Juan Eduardo; Hugo, Fernando; Feldens, Carlos Alberto; Bedi, Raman; Uribe, Sergio; Zillmann, Gisela
2016-07-18
A meeting was organised to consolidate a network of researchers and academics from Australia, Brazil, Chile, the UK and the USA, relating to Early Childhood Caries (ECC) and Dental Trauma (DT). As part of this meeting, a dedicated session was held on the future of paediatric dental education and curricula. Twenty-four paediatric dentistry (PD) academics, representing eight Chilean dental schools, and three international specialists (from Brazil and Latvia) participated in group discussions facilitated by five members of the ECC/DT International Collaborative Network. Data were collected from group discussions which followed themes developed as guides to identify key issues associated with paediatric dentistry education, training and research. Participants discussed current PD dental curricula in Chile, experiences in educating new cohorts of oral health care providers, and the outcomes of existing efforts in education and research in PD. They also identified challenges, opportunities and areas in need of further development. This paper provides an introspective analysis of the education and training of PD in Chile; describes the input provided by participants into pediatric dentistry education and curricula; and sets out some key priorities for action with suggested directions to best prepare the future dental workforce to maximise oral health outcomes for children. Immediate priorities for action in paediatric dentistry in Chile were proposed.
NASA Astrophysics Data System (ADS)
Handley, Heather K.; Turner, Simon; Afonso, Juan C.; Dosseto, Anthony; Cohen, Tim
2013-02-01
Quantifying the rates of landscape evolution in response to climate change is inhibited by the difficulty of dating the formation of continental detrital sediments. We present uranium isotope data for Cooper Creek palaeochannel sediments from the Lake Eyre Basin in semi-arid South Australia in order to attempt to determine the formation ages and hence residence times of the sediments. To calculate the amount of recoil loss of 234U, a key input parameter used in the comminution approach, we use two suggested methods (weighted geometric and surface area measurement with an incorporated fractal correction) and typical assumed input parameter values found in the literature. The calculated recoil loss factors and comminution ages are highly dependent on the method of recoil loss factor determination used and the chosen assumptions. To appraise the ramifications of the assumptions inherent in the comminution age approach and determine the individual and combined comminution age uncertainties associated with each variable, Monte Carlo simulations were conducted for a synthetic sediment sample. Using a reasonable associated uncertainty for each input factor and including variations in the source rock and measured (234U/238U) ratios, the total combined uncertainty on comminution age in our simulation (for both methods of recoil loss factor estimation) can amount to ±220-280 ka. The modelling shows that small changes in assumed input values translate into large effects on absolute comminution age. To improve the accuracy of the technique and provide meaningful absolute comminution ages, much tighter constraints are required on the assumptions for input factors such as the fraction of α-recoil lost 234Th and the initial (234U/238U) ratio of the source material. In order to be able to directly compare calculated comminution ages produced by different research groups, the standardisation of pre-treatment procedures, recoil loss factor estimation and assumed input parameter values is required. We suggest a set of input parameter values for such a purpose. Additional considerations for calculating comminution ages of sediments deposited within large, semi-arid drainage basins are discussed.
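A hedged sketch of propagating input uncertainties through the commonly used comminution-age relation by Monte Carlo simulation is shown below; the decay constant is the standard 234U value, while the activity ratios, recoil loss factor, and their uncertainties are illustrative placeholders, not the paper's measurements.

```python
import numpy as np

rng = np.random.default_rng(1)
LAMBDA_234 = np.log(2) / 245_250.0       # 234U decay constant, 1/yr

def comminution_age(a_meas, a0, f_alpha):
    """Commonly used comminution-age relation:
    a_meas = (1 - f_alpha) + (a0 - (1 - f_alpha)) * exp(-lambda234 * t)."""
    sec_eq = 1.0 - f_alpha
    with np.errstate(invalid="ignore", divide="ignore"):
        return -np.log((a_meas - sec_eq) / (a0 - sec_eq)) / LAMBDA_234

n = 200_000
# Illustrative input distributions (not the paper's values):
a_meas  = rng.normal(0.955, 0.002, n)     # measured (234U/238U) activity ratio
a0      = rng.normal(1.000, 0.005, n)     # assumed source-rock ratio
f_alpha = rng.normal(0.05, 0.01, n)       # recoil loss factor

ages = comminution_age(a_meas, a0, f_alpha) / 1e3   # in kyr
ages = ages[np.isfinite(ages) & (ages > 0)]         # drop unphysical draws
print(f"median age {np.median(ages):.0f} kyr, "
      f"68% interval [{np.percentile(ages, 16):.0f}, {np.percentile(ages, 84):.0f}] kyr")
```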
Honeywell optical investigations on FLASH program
NASA Astrophysics Data System (ADS)
O'Rourke, Ken; Peterson, Eric; Yount, Larry
1995-05-01
The increasing performance and life cycle cost reduction requirements placed on commercial and military transport aircraft are resulting in more complex, highly integrated aircraft control and management systems. The use of fiber optic data transmission media can make significant contributions in achieving these performance and cost goals. The Honeywell portion of Task 2A on the Fly-by-Light Advanced System Hardware (FLASH) program is evaluating a Primary Flight Control System (PFCS) using pilot and copilot inputs from Active Hand Controllers (AHC) which are optically linked to the Primary Flight Control Computers (PFCC). Customer involvement is an important element of the Task 2A activity. Establishing customer requirements and perspectives on productization of systems developed under FLASH is key to future product success. The Honeywell elements of the PFCS demonstrator provide a command path that is optically interfaced from crew inputs to commands for distributed, smart actuation subsystems. Optical communication architectures are implemented using several protocols including the new AS-1773A 20 Mbps data bus standard. The interconnecting fiber optic cable plant is provided by our Task 1A teammate McDonnell Douglas Aerospace (West). Fiber optic cable plant fabrication uses processes, tools, and materials reflecting necessary advances in manufacturing required to make fly-by-light avionics systems marketable.
ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations
NASA Astrophysics Data System (ADS)
Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai
2017-07-01
The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user-interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Application (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
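ExcelAutomat itself is implemented in VBA within MS Excel/LibreOffice Calc; purely to illustrate the kind of repetitive output-file parsing it automates, a short Python sketch is given below, where the file locations and the "SCF Done" pattern for Gaussian-style outputs are assumptions for the example.

```python
import glob
import re
import csv

# Illustrative batch extraction of final SCF energies from Gaussian-style
# output files into a spreadsheet-friendly CSV; the pattern and file layout
# are assumptions for this sketch, not part of ExcelAutomat.
pattern = re.compile(r"SCF Done:\s+E\([^)]+\)\s*=\s*(-?\d+\.\d+)")

rows = []
for path in sorted(glob.glob("outputs/*.log")):
    energy = None
    with open(path) as fh:
        for line in fh:
            match = pattern.search(line)
            if match:
                energy = float(match.group(1))   # keep the last occurrence
    rows.append({"file": path, "scf_energy_hartree": energy})

with open("parsed_energies.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=["file", "scf_energy_hartree"])
    writer.writeheader()
    writer.writerows(rows)
```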
An imaging-based stochastic model for simulation of tumour vasculature
NASA Astrophysics Data System (ADS)
Adhikarla, Vikram; Jeraj, Robert
2012-10-01
A mathematical model which reconstructs the structure of existing vasculature using patient-specific anatomical, functional and molecular imaging as input was developed. The vessel structure is modelled according to empirical vascular parameters, such as the mean vessel branching angle. The model is calibrated such that the resultant oxygen map modelled from the simulated microvasculature stochastically matches the input oxygen map to a high degree of accuracy (R2 ≈ 1). The calibrated model was successfully applied to preclinical imaging data. Starting from the anatomical vasculature image (obtained from contrast-enhanced computed tomography), a representative map of the complete vasculature was stochastically simulated as determined by the oxygen map (obtained from hypoxia [64Cu]Cu-ATSM positron emission tomography). The simulated microscopic vasculature and the calculated oxygenation map successfully represent the imaged hypoxia distribution (R2 = 0.94). The model elicits the parameters required to simulate vasculature consistent with imaging and provides a key mathematical relationship relating the vessel volume to the tissue oxygen tension. Apart from providing an excellent framework for visualizing the imaging gap between microscopic and macroscopic imaging, the model has the potential to be extended as a tool to study the dynamics between the tumour and the vasculature in a patient-specific manner and has an application in the simulation of anti-angiogenic therapies.
Asteroid Redirect Mission (ARM) Formulation Assessment and Support Team (FAST) Final Report
NASA Technical Reports Server (NTRS)
Mazanek, Daniel D.; Reeves, David M.; Abell, Paul A.; Asphaug, Erik; Abreu, Neyda M.; Bell, James F.; Bottke, William F.; Britt, Daniel T.; Campins, Humberto; Chodas, Paul W.;
2016-01-01
The Asteroid Redirect Mission (ARM) Formulation Assessment and Support Team (FAST) was a two-month effort, chartered by NASA, to provide timely inputs for mission requirement formulation in support of the Asteroid Redirect Robotic Mission (ARRM) Requirements Closure Technical Interchange Meeting held December 15-16, 2015, to assist in developing an initial list of potential mission investigations, and to provide input on potential hosted payloads and partnerships. The FAST explored several aspects of potential science benefits and knowledge gain from the ARM. Expertise from the science, engineering, and technology communities was represented in exploring lines of inquiry related to key characteristics of the ARRM reference target asteroid (2008 EV5) for engineering design purposes. Specific areas of interest included target origin, spatial distribution and size of boulders, surface geotechnical properties, boulder physical properties, and considerations for boulder handling, crew safety, and containment. In order to increase knowledge gain potential from the mission, opportunities for partnerships and accompanying payloads were also investigated. Potential investigations could be conducted to reduce mission risks and increase knowledge return in the areas of science, planetary defense, asteroid resources and in-situ resource utilization, and capability and technology demonstrations. This report represents the FAST's final product for the ARM.
ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations.
Laloo, Jalal Z A; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai
2017-07-01
The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user-interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Application (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
Mycorrhizal strategies for nitrogen acquisition have divergent effects on soil carbon
NASA Astrophysics Data System (ADS)
Wurzburger, N.; Brookshire, J.
2016-12-01
Most land plants acquire nitrogen (N) through associations with mycorrhizal fungi, but these symbioses employ contrasting strategies for N acquisition, which may lead to different stocks of soil carbon (C). Here we experimentally test the hypothesis that contrasting strategies for N acquisition by arbuscular (AM) and ectomycorrhizal (ECM) plants drive divergent patterns in soil decomposer activity and C loss. By employing a simple mesocosm system where we grew AM and ECM trees in 13C- and 15N-enriched organic matter, we quantified loss rates of soil C, uptake of N and net contributions of new plant C to soil. We found that AM trees promoted greater soil C loss relative to ECM trees and key mechanisms of N acquisition explained this pattern. AM trees were less dependent on biomass C to acquire N than ECM trees, and N uptake was correlated with soil C loss for AM, but not ECM trees. Further, while new plant C inputs stimulated soil C loss in both symbioses, we detected plant C inputs more frequently and measured higher rates of decomposer activity in soils colonized by AM relative to ECM trees. Together, our findings suggest that contrasting strategies of N acquisition by AM and ECM, including differences in stimulating decomposition, explain mycorrhizal effects on soil C. Our study provides experimental demonstration of the key mechanisms by which mycorrhizal strategies may give rise to broad patterns in soil C across terrestrial ecosystems.
Distributed usability evaluation of the Pennsylvania Cancer Atlas
Bhowmick, Tanuka; Robinson, Anthony C; Gruver, Adrienne; MacEachren, Alan M; Lengerich, Eugene J
2008-01-01
Background The Pennsylvania Cancer Atlas (PA-CA) is an interactive online atlas to help policy-makers, program managers, and epidemiologists with tasks related to cancer prevention and control. The PA-CA includes maps, graphs, and tables that are dynamically linked to support data exploration and decision-making with spatio-temporal cancer data. Our Atlas development process follows a user-centered design approach. To assess the usability of the initial versions of the PA-CA, we developed and applied a novel strategy for soliciting user feedback through multiple distributed focus groups and surveys. Our process of acquiring user feedback leverages an online web application (e-Delphi). In this paper we describe the PA-CA, detail how we have adapted the e-Delphi web application to support usability and utility evaluation of the PA-CA, and present the results of our evaluation. Results We report results from four sets of users. Each group provided structured individual and group assessments of the PA-CA as well as input on the kinds of users and applications for which it is best suited. Overall reactions to the PA-CA are quite positive. Participants did, however, provide a range of useful suggestions. Key suggestions focused on improving interaction functions, enhancing methods of temporal analysis, addressing data issues, and providing additional data displays and help functions. These suggestions were incorporated in each design and implementation iteration for the PA-CA and used to inform a set of web-atlas design principles. Conclusion For the Atlas, we find that a design that utilizes linked map, graph, and table views is understandable to and perceived to be useful by the target audience of cancer prevention and control professionals. However, it is clear that considerable variation in experience using maps and graphics exists and for those with less experience, integrated tutorials and help features are needed. In relation to our usability assessment strategy, we find that our distributed, web-based method for soliciting user input is generally effective. Advantages include the ability to gather information from users distributed in time and space and the relative anonymity of the participants while disadvantages include less control over when and how often participants provide input and challenges for obtaining rich input. PMID:18620565
MODELING OF HUMAN EXPOSURE TO IN-VEHICLE PM2.5 FROM ENVIRONMENTAL TOBACCO SMOKE
Cao, Ye; Frey, H. Christopher
2012-01-01
Environmental tobacco smoke (ETS) is estimated to be a significant contributor to in-vehicle human exposure to fine particulate matter of 2.5 µm or smaller (PM2.5). A critical assessment was conducted of a mass balance model for estimating PM2.5 concentration with smoking in a motor vehicle. Recommendations for the range of inputs to the mass-balance model are given based on a literature review. Sensitivity analysis was used to determine which inputs should be prioritized for data collection. Air exchange rate (ACH) and the deposition rate have wider relative ranges of variation than other inputs, representing inter-individual variability in operations, and inter-vehicle variability in performance, respectively. Cigarette smoking and emission rates, and vehicle interior volume, are also key inputs. The in-vehicle ETS mass balance model was incorporated into the Stochastic Human Exposure and Dose Simulation for Particulate Matter (SHEDS-PM) model to quantify the potential magnitude and variability of in-vehicle exposures to ETS. The in-vehicle exposure also takes into account near-road incremental PM2.5 concentration from on-road emissions. Results of the probabilistic study indicate that ETS is a key contributor to the in-vehicle average and high-end exposure. Factors that mitigate in-vehicle ambient PM2.5 exposure lead to higher in-vehicle ETS exposure, and vice versa. PMID:23060732
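A minimal single-compartment sketch of the kind of in-vehicle mass-balance model described above is shown below; all parameter values (cabin volume, air exchange rate, deposition rate, emission rate) are illustrative placeholders rather than SHEDS-PM inputs.

```python
import numpy as np

# Single-compartment mass balance for in-cabin PM2.5 with a smoking source:
# dC/dt = E/V + ACH*C_out - (ACH + k_dep)*C. All values are illustrative.
V        = 3.0        # cabin interior volume, m^3
ach      = 20.0       # air exchange rate, 1/h
k_dep    = 1.0        # particle deposition rate, 1/h
c_out    = 15.0       # near-road ambient PM2.5, ug/m^3
emission = 80_000.0   # cigarette PM2.5 emission rate while smoking, ug/h

dt = 1.0 / 3600.0                    # 1-second steps, in hours
t = np.arange(0.0, 0.5, dt)          # 30-minute trip
c = np.zeros_like(t)
c[0] = c_out

for i in range(1, len(t)):
    smoking = t[i] < 10.0 / 60.0                 # smoke for the first 10 minutes
    source = emission / V if smoking else 0.0
    dcdt = source + ach * c_out - (ach + k_dep) * c[i - 1]
    c[i] = c[i - 1] + dcdt * dt

print(f"peak in-cabin PM2.5 ~ {c.max():.0f} ug/m^3, "
      f"trip average ~ {c.mean():.0f} ug/m^3")
```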
Spike Triggered Covariance in Strongly Correlated Gaussian Stimuli
Aljadeff, Johnatan; Segev, Ronen; Berry, Michael J.; Sharpee, Tatyana O.
2013-01-01
Many biological systems perform computations on inputs that have very large dimensionality. Determining the relevant input combinations for a particular computation is often key to understanding its function. A common way to find the relevant input dimensions is to examine the difference in variance between the input distribution and the distribution of inputs associated with certain outputs. In systems neuroscience, the corresponding method is known as spike-triggered covariance (STC). This method has been highly successful in characterizing relevant input dimensions for neurons in a variety of sensory systems. So far, most studies used the STC method with weakly correlated Gaussian inputs. However, it is also important to use this method with inputs that have long range correlations typical of the natural sensory environment. In such cases, the stimulus covariance matrix has one (or more) outstanding eigenvalues that cannot be easily equalized because of sampling variability. Such outstanding modes interfere with analyses of statistical significance of candidate input dimensions that modulate neuronal outputs. In many cases, these modes obscure the significant dimensions. We show that the sensitivity of the STC method in the regime of strongly correlated inputs can be improved by an order of magnitude or more. This can be done by evaluating the significance of dimensions in the subspace orthogonal to the outstanding mode(s). Analyzing the responses of retinal ganglion cells probed with Gaussian noise, we find that taking into account outstanding modes is crucial for recovering relevant input dimensions for these neurons. PMID:24039563
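A compact sketch of the spike-triggered covariance calculation, with the outstanding stimulus mode projected out before eigen-analysis, is given below; the stimulus statistics and model neuron are synthetic stand-ins, not the retinal ganglion cell data.

```python
import numpy as np

rng = np.random.default_rng(2)
T, D = 50_000, 40

# Strongly correlated Gaussian stimulus: one outstanding covariance mode along u.
u = rng.normal(size=D); u /= np.linalg.norm(u)
stim = rng.normal(size=(T, D)) + 4.0 * rng.normal(size=(T, 1)) * u

# Synthetic neuron: spike probability driven by the variance along a hidden filter k.
k = rng.normal(size=D); k /= np.linalg.norm(k)
drive = (stim @ k) ** 2
spikes = rng.random(T) < drive / drive.max()

c_prior = np.cov(stim, rowvar=False)
c_spike = np.cov(stim[spikes], rowvar=False)

# Project out the outstanding prior mode before eigen-analysis of the STC difference.
w, v = np.linalg.eigh(c_prior)
outstanding = v[:, [-1]]                          # top prior eigenvector
P = np.eye(D) - outstanding @ outstanding.T       # projector onto orthogonal subspace

delta = P @ (c_spike - c_prior) @ P
evals, evecs = np.linalg.eigh(delta)
recovered = evecs[:, np.argmax(np.abs(evals))]
print(f"overlap with true filter: {abs(float(recovered @ k)):.3f}")
```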
Common Bolted Joint Analysis Tool
NASA Technical Reports Server (NTRS)
Imtiaz, Kauser
2011-01-01
Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified and results are provided in number format, as well as color-coded spelled-out words to draw user attention. The three key features of the software are robust technical content, innovative and user friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.
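The preload and margin-of-safety arithmetic such a tool steps through can be sketched as below, using the common torque-tension approximation T = K·F·d; the fastener properties, load factor, and factors of safety are illustrative values, not comBAT's database entries or its full methodology.

```python
# Illustrative bolted-joint preload and margin-of-safety calculation.
# Uses the common torque-tension approximation T = K * F_preload * d.
d        = 0.00635        # fastener nominal diameter, m (1/4 in)
A_t      = 20.5e-6        # tensile stress area, m^2 (roughly 1/4-20 UNC)
K        = 0.2            # nut factor (typical assumption for dry threads)
torque   = 7.0            # assembly torque, N*m
sigma_u  = 1100e6         # fastener ultimate tensile strength, Pa
P_ext    = 3000.0         # applied external tensile load per bolt, N
phi      = 0.25           # joint load factor (fraction of external load seen by bolt)
FS       = 1.4            # factor of safety on applied load

preload = torque / (K * d)                   # estimated preload from torque
bolt_load = preload + phi * FS * P_ext       # total bolt tension
allowable = sigma_u * A_t                    # ultimate tensile capability

ms_ultimate = allowable / bolt_load - 1.0    # margin of safety (>0 means positive margin)
gap_check = preload > (1.0 - phi) * FS * P_ext   # simplistic joint-separation check

print(f"preload ~ {preload:.0f} N, bolt load ~ {bolt_load:.0f} N")
print(f"ultimate margin of safety = {ms_ultimate:+.2f}, no gapping: {gap_check}")
```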
Diagnosable structured logic array
NASA Technical Reports Server (NTRS)
Whitaker, Sterling (Inventor); Miles, Lowell (Inventor); Gambles, Jody (Inventor); Maki, Gary K. (Inventor)
2009-01-01
A diagnosable structured logic array and associated process is provided. A base cell structure is provided comprising a logic unit comprising a plurality of input nodes, a plurality of selection nodes, and an output node, a plurality of switches coupled to the selection nodes, where each switch comprises a plurality of input lines, a selection line and an output line, a memory cell coupled to the output node, and a test address bus and a program control bus coupled to the plurality of input lines and the selection line of the plurality of switches. A state on each of the plurality of input nodes is verifiably loaded and read from the memory cell. A trusted memory block is provided. The associated process is provided for testing and verifying a plurality of truth table inputs of the logic unit.
Technology Benefit Estimator (T/BEST): User's Manual
NASA Technical Reports Server (NTRS)
Generazio, Edward R.; Chamis, Christos C.; Abumeri, Galib
1994-01-01
The Technology Benefit Estimator (T/BEST) system is a formal method to assess advanced technologies and quantify the benefit contributions for prioritization. T/BEST may be used to provide guidelines to identify and prioritize high payoff research areas, help manage research and limited resources, show the link between advanced concepts and the bottom line, i.e., accrued benefit and value, and to communicate credibly the benefits of research. The T/BEST software computer program is specifically designed to estimate the benefits, and benefit sensitivities, of introducing new technologies into existing propulsion systems. Key engine cycle, structural, fluid, mission and cost analysis modules are used to provide a framework for interfacing with advanced technologies. An open-ended, modular approach is used to allow for modification and addition of both key and advanced technology modules. T/BEST has a hierarchical framework that yields varying levels of benefit estimation accuracy that are dependent on the degree of input detail available. This hierarchical feature permits rapid estimation of technology benefits even when the technology is at the conceptual stage. As knowledge of the technology details increases, the accuracy of the benefit analysis increases. Included in T/BEST's framework are correlations developed from a statistical data base that is relied upon if there is insufficient information given in a particular area, e.g., fuel capacity or aircraft landing weight. Statistical predictions are not required if these data are specified in the mission requirements. The engine cycle, structural fluid, cost, noise, and emissions analyses interact with the default or user material and component libraries to yield estimates of specific global benefits: range, speed, thrust, capacity, component life, noise, emissions, specific fuel consumption, component and engine weights, pre-certification test, mission performance, engine cost, direct operating cost, life cycle cost, manufacturing cost, development cost, risk, and development time. Currently, T/BEST operates on stand-alone or networked workstations, and uses a UNIX shell or script to control the operation of interfaced FORTRAN based analyses. T/BEST's interface structure works equally well with non-FORTRAN or mixed software analysis. This interface structure is designed to maintain the integrity of the expert's analyses by interfacing with the expert's existing input and output files. Parameter input and output data (e.g., number of blades, hub diameters, etc.) are passed via T/BEST's neutral file, while copious data (e.g., finite element models, profiles, etc.) are passed via file pointers that point to the expert's analyses output files. In order to make the communications between T/BEST's neutral file and attached analyses codes simple, only two software commands, PUT and GET, are required. This simplicity permits easy access to all input and output variables contained within the neutral file. Both public domain and proprietary analyses codes may be attached with a minimal amount of effort, while maintaining full data and analysis integrity, and security. T/BEST's software framework, status, beginner-to-expert operation, interface architecture, analysis module addition, and key analysis modules are discussed. Representative examples of T/BEST benefit analyses are shown.
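The neutral-file exchange driven by just two commands, PUT and GET, can be sketched as follows; the JSON file format and function names here are assumptions for illustration, not T/BEST's actual FORTRAN interface.

```python
import json
from pathlib import Path

NEUTRAL_FILE = Path("neutral_file.json")   # stand-in for T/BEST's neutral file

def put(name, value):
    """Write a named parameter into the shared neutral file."""
    data = json.loads(NEUTRAL_FILE.read_text()) if NEUTRAL_FILE.exists() else {}
    data[name] = value
    NEUTRAL_FILE.write_text(json.dumps(data, indent=2))

def get(name, default=None):
    """Read a named parameter back out of the shared neutral file."""
    data = json.loads(NEUTRAL_FILE.read_text()) if NEUTRAL_FILE.exists() else {}
    return data.get(name, default)

# An engine-cycle module publishes outputs; a structures module consumes them.
put("number_of_blades", 24)
put("hub_diameter_m", 0.45)
put("fem_model_path", "results/fan_stage.bdf")   # pointer to copious data, not the data itself
print(get("number_of_blades"), get("fem_model_path"))
```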
Technology Benefit Estimator (T/BEST): User's manual
NASA Astrophysics Data System (ADS)
Generazio, Edward R.; Chamis, Christos C.; Abumeri, Galib
1994-12-01
The Technology Benefit Estimator (T/BEST) system is a formal method to assess advanced technologies and quantify the benefit contributions for prioritization. T/BEST may be used to provide guidelines to identify and prioritize high payoff research areas, help manage research and limited resources, show the link between advanced concepts and the bottom line, i.e., accrued benefit and value, and to communicate credibly the benefits of research. The T/BEST software computer program is specifically designed to estimate the benefits, and benefit sensitivities, of introducing new technologies into existing propulsion systems. Key engine cycle, structural, fluid, mission and cost analysis modules are used to provide a framework for interfacing with advanced technologies. An open-ended, modular approach is used to allow for modification and addition of both key and advanced technology modules. T/BEST has a hierarchical framework that yields varying levels of benefit estimation accuracy that are dependent on the degree of input detail available. This hierarchical feature permits rapid estimation of technology benefits even when the technology is at the conceptual stage. As knowledge of the technology details increases, the accuracy of the benefit analysis increases. Included in T/BEST's framework are correlations developed from a statistical data base that is relied upon if there is insufficient information given in a particular area, e.g., fuel capacity or aircraft landing weight. Statistical predictions are not required if these data are specified in the mission requirements. The engine cycle, structural fluid, cost, noise, and emissions analyses interact with the default or user material and component libraries to yield estimates of specific global benefits: range, speed, thrust, capacity, component life, noise, emissions, specific fuel consumption, component and engine weights, pre-certification test, mission performance, engine cost, direct operating cost, life cycle cost, manufacturing cost, development cost, risk, and development time. Currently, T/BEST operates on stand-alone or networked workstations, and uses a UNIX shell or script to control the operation of interfaced FORTRAN based analyses. T/BEST's interface structure works equally well with non-FORTRAN or mixed software analysis. This interface structure is designed to maintain the integrity of the expert's analyses by interfacing with the expert's existing input and output files. Parameter input and output data (e.g., number of blades, hub diameters, etc.) are passed via T/BEST's neutral file, while copious data (e.g., finite element models, profiles, etc.) are passed via file pointers that point to the expert's analyses output files. In order to make the communications between T/BEST's neutral file and attached analyses codes simple, only two software commands, PUT and GET, are required. This simplicity permits easy access to all input and output variables contained within the neutral file. Both public domain and proprietary analyses codes may be attached with a minimal amount of effort, while maintaining full data and analysis integrity, and security.
Lippert, Thomas; Bandelin, Jochen; Musch, Alexandra; Drewes, Jörg E; Koch, Konrad
2018-05-20
The performance of a novel ultrasonic flatbed reactor for sewage sludge pre-treatment was assessed for three different waste activated sludges. The study systematically investigated the impact of specific energy input (200-3,000 kJ/kg TS) on the degree of disintegration (DD_COD, i.e. the ratio between ultrasonically and maximum chemically solubilized COD) and methane production enhancement. The relationship between DD_COD and energy input was linear for all sludges tested. Methane yields were significantly increased for both low (200 kJ/kg TS) and high (2,000-3,000 kJ/kg TS) energy inputs, while intermediate inputs (400-1,000 kJ/kg TS) showed no significant improvement. High inputs additionally accelerated reaction kinetics, but were limited to similar gains as low inputs (max. 12%), despite the considerably higher DD_COD values. The energy balance was only positive for the 200 kJ/kg TS treatments, with a maximum energy recovery of 122%. Results suggest that floc deagglomeration rather than cell lysis (DD_COD = 1%-5% at 200 kJ/kg TS) is the key principle of energy-positive sludge sonication. Copyright © 2018 Elsevier Ltd. All rights reserved.
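The two quantities underlying the dose-response relation above can be computed as in the sketch below; the formulas are the standard definitions of specific energy input and degree of disintegration, and the numerical values are placeholders, not the paper's measurements.

```python
def specific_energy_kj_per_kg_ts(power_kw, time_s, volume_l, ts_g_per_l):
    """Specific energy input E_s = P*t / (V*TS), in kJ per kg of total solids."""
    ts_kg = volume_l * ts_g_per_l / 1000.0
    return power_kw * time_s / ts_kg

def degree_of_disintegration(scod_sonicated, scod_raw, scod_naoh_max):
    """DD_COD: ultrasonically solubilized COD relative to the chemical (NaOH) maximum, in %."""
    return 100.0 * (scod_sonicated - scod_raw) / (scod_naoh_max - scod_raw)

# Placeholder values for a waste activated sludge sample (mg COD/L, kW, s, L, g TS/L).
es = specific_energy_kj_per_kg_ts(power_kw=0.2, time_s=60.0, volume_l=1.0, ts_g_per_l=30.0)
dd = degree_of_disintegration(scod_sonicated=950.0, scod_raw=800.0, scod_naoh_max=9000.0)
print(f"E_s = {es:.0f} kJ/kg TS, DD_COD = {dd:.1f} %")
```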
Voice input/output capabilities at Perception Technology Corporation
NASA Technical Reports Server (NTRS)
Ferber, Leon A.
1977-01-01
Condensed resumes of key company personnel at the Perception Technology Corporation are presented. The staff possesses expertise in speech recognition, speech synthesis, speaker authentication, and language identification. Hardware and software engineers' capabilities are included.
Dombrowski, Julia C; Carey, James W; Pitts, Nicole; Craw, Jason; Freeman, Arin; Golden, Matthew R; Bertolli, Jeanne
2016-06-10
U.S. health departments have not historically used HIV surveillance data for disease control interventions with individuals, but advances in HIV treatment and surveillance are changing public health practice. Many U.S. health departments are in the early stages of implementing "Data to Care" programs to assist persons living with HIV (PLWH) with engaging in care, based on information collected for HIV surveillance. Stakeholder engagement is a critical first step for development of these programs. In Seattle-King County, Washington, the health department conducted interviews with HIV medical care providers and PLWH to inform its Data to Care program. This paper describes the key themes of these interviews and traces the evolution of the resulting program. Disease intervention specialists conducted individual, semi-structured qualitative interviews with 20 PLWH randomly selected from HIV surveillance who had HIV RNA levels >10,000 copies/mL in 2009-2010. A physician investigator conducted key informant interviews with 15 HIV medical care providers. Investigators analyzed de-identified interview transcripts, developed a codebook of themes, independently coded the interviews, and identified codes used most frequently as well as illustrative quotes for these key themes. We also trace the evolution of the program from 2010 to 2015. PLWH generally accepted the idea of the health department helping PLWH engage in care, and described how hearing about the treatment experiences of HIV seropositive peers would assist them with engagement in care. Although many physicians were supportive of the Data to Care concept, others expressed concern about potential health department intrusion on patient privacy and the patient-physician relationship. Providers emphasized the need for the health department to coordinate with existing efforts to improve patient engagement. As a result of the interviews, the Data to Care program in Seattle-King County was designed to incorporate an HIV-positive peer component and to ensure coordination with HIV care providers in the process of relinking patients to care. Health departments can build support for Data to Care efforts by gathering input of key stakeholders, such as HIV medical and social service providers, and coordinating with clinic-based efforts to re-engage patients in care.
Rail-to-rail differential input amplification stage with main and surrogate differential pairs
Britton, Jr., Charles Lanier; Smith, Stephen Fulton
2007-03-06
An operational amplifier input stage provides a symmetrical rail-to-rail input common-mode voltage without turning off either pair of complementary differential input transistors. Secondary, or surrogate, transistor pairs assume the function of the complementary differential transistors. The circuit also maintains essentially constant transconductance, constant slew rate, and constant signal-path supply current as it provides rail-to-rail operation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tauke-Pedretti, Anna; Skogen, Erik J; Vawter, Gregory A
An optical sampler includes a first and second 1×n optical beam splitters splitting an input optical sampling signal and an optical analog input signal into n parallel channels, respectively, a plurality of optical delay elements providing n parallel delayed input optical sampling signals, n photodiodes converting the n parallel optical analog input signals into n respective electrical output signals, and n optical modulators modulating the input optical sampling signal or the optical analog input signal by the respective electrical output signals, and providing n successive optical samples of the optical analog input signal. A plurality of output photodiodes and eADCs convert the n successive optical samples to n successive digital samples. The optical modulator may be a photodiode interconnected Mach-Zehnder Modulator. A method of sampling the optical analog input signal is disclosed.
Ethical practice in Telehealth and Telemedicine.
Chaet, Danielle; Clearfield, Ron; Sabin, James E; Skimming, Kathryn
2017-10-01
This article summarizes the report of the American Medical Association's (AMA) Council on Ethical and Judicial Affairs (CEJA) on ethical practice in telehealth and telemedicine. Through its reports and recommendations, CEJA is responsible for maintaining and updating the AMA Code of Medical Ethics (Code). CEJA reports are developed through an iterative process of deliberation with input from multiple stakeholders; report recommendations, once adopted by the AMA House of Delegates, become ethics policy of the AMA and are issued as Opinions in the Code. To provide enduring guidance for the medical profession as a whole, CEJA strives to articulate expectations for conduct that are as independent of specific technologies or models of practice as possible. The present report, developed at the request of the House of Delegates, provides broad guidance for ethical conduct relating to key issues in telehealth/telemedicine. The report and recommendations were debated at meetings of the House in June and November 2015; recommendations were adopted in June 2016 and published as Opinion E-1.2.12, Ethical Practice in Telemedicine, in November 2016. A summary of the key points of the recommendations can be found in Appendix A (online), and the full text of the opinion can be found in Appendix B (online).
Nonlinear detection for a high rate extended binary phase shift keying system.
Chen, Xian-Qing; Wu, Le-Nan
2013-03-28
The algorithm and the results of a nonlinear detector using a machine learning technique called support vector machine (SVM) on an efficient modulation system with high data rate and low energy consumption are presented in this paper. Simulation results showed that the performance achieved by the SVM detector is comparable to that of a conventional threshold decision (TD) detector. The two detectors detect the received signals together with the special impacting filter (SIF) that can improve the energy utilization efficiency. However, unlike the TD detector, the SVM detector concentrates not only on reducing the BER of the detector, but also on providing accurate posterior probability estimates (PPEs), which can be used as soft-inputs of the LDPC decoder. The complexity of this detector is considered in this paper by using four features and simplifying the decision function. In addition, a bandwidth efficient transmission is analyzed with both SVM and TD detector. The SVM detector is more robust to sampling rate than the TD detector. We find that the SVM is suitable for extended binary phase shift keying (EBPSK) signal detection and can provide accurate posterior probability for LDPC decoding.
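The detector's role as a source of posterior probabilities for soft-decision LDPC decoding can be illustrated with scikit-learn's probability-calibrated SVM; in the sketch below the four-feature representation of the SIF-filtered waveform is mocked with synthetic data, so the features and their statistics are assumptions, not the paper's signal model.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(3)

# Synthetic stand-in for the 4 features extracted from the SIF-filtered waveform.
n = 4000
bits = rng.integers(0, 2, n)
features = rng.normal(size=(n, 4)) + bits[:, None] * np.array([1.2, 0.8, 0.5, 0.3])

# Train on half the data, detect on the rest.
clf = SVC(kernel="rbf", probability=True)    # Platt-scaled posterior estimates
clf.fit(features[: n // 2], bits[: n // 2])

post = clf.predict_proba(features[n // 2:])[:, 1]     # P(bit = 1 | features)
hard = (post > 0.5).astype(int)
ber = np.mean(hard != bits[n // 2:])

# Log-likelihood ratios usable as soft inputs to an LDPC decoder.
eps = 1e-12
llr = np.log((post + eps) / (1.0 - post + eps))
print(f"hard-decision BER ~ {ber:.3f}, example LLRs: {llr[:5].round(2)}")
```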
Nonlinear Detection for a High Rate Extended Binary Phase Shift Keying System
Chen, Xian-Qing; Wu, Le-Nan
2013-01-01
The algorithm and the results of a nonlinear detector using a machine learning technique called support vector machine (SVM) on an efficient modulation system with high data rate and low energy consumption are presented in this paper. Simulation results showed that the performance achieved by the SVM detector is comparable to that of a conventional threshold decision (TD) detector. The two detectors detect the received signals together with the special impacting filter (SIF) that can improve the energy utilization efficiency. However, unlike the TD detector, the SVM detector concentrates not only on reducing the BER of the detector, but also on providing accurate posterior probability estimates (PPEs), which can be used as soft-inputs of the LDPC decoder. The complexity of this detector is considered in this paper by using four features and simplifying the decision function. In addition, a bandwidth efficient transmission is analyzed with both SVM and TD detector. The SVM detector is more robust to sampling rate than the TD detector. We find that the SVM is suitable for extended binary phase shift keying (EBPSK) signal detection and can provide accurate posterior probability for LDPC decoding. PMID:23539034
Marli: Mars Lidar for Global Wind Profiles and Aerosol Profiles from Orbit
NASA Technical Reports Server (NTRS)
Abshire, J. B.; Guzewich, S. D.; Smith, M. D.; Riris, H.; Sun, X.; Gentry, B. M.; Yu, A.; Allan, G. R.
2016-01-01
The Mars Exploration Analysis Group's Next Orbiter Science Analysis Group (NEXSAG) has recently identified atmospheric wind measurements as one of 5 top compelling science objectives for a future Mars orbiter. To date, only isolated lander observations of martian winds exist. Winds are the key variable for understanding atmospheric transport and answering fundamental questions about the three primary cycles of the martian climate: CO2, H2O, and dust. However, the lack of direct observations, and the imprecise and indirect inferences from temperature observations, leave many basic questions about the atmospheric circulation unanswered. In addition to addressing high priority science questions, direct wind observations from orbit would help validate 3D general circulation models (GCMs) while also providing key input to atmospheric reanalyses. The dust and CO2 cycles on Mars are partially coupled and their influences on the atmospheric circulation modify the global wind field. Dust absorbs solar infrared radiation and its variable spatial distribution forces changes in the atmospheric temperature and wind fields. Thus it is important to simultaneously measure the height-resolved wind and dust profiles. MARLI provides a unique capability to observe these variables continuously, day and night, from orbit.
Bitton, Asaf; Ratcliffe, Hannah L; Veillard, Jeremy H; Kress, Daniel H; Barkley, Shannon; Kimball, Meredith; Secci, Federica; Wong, Ethan; Basu, Lopa; Taylor, Chelsea; Bayona, Jaime; Wang, Hong; Lagomarsino, Gina; Hirschhorn, Lisa R
2017-05-01
Primary health care (PHC) has been recognized as a core component of effective health systems since the early part of the twentieth century. However, despite notable progress, there remains a large gap between what individuals and communities need, and the quality and effectiveness of care delivered. The Primary Health Care Performance Initiative (PHCPI) was established by an international consortium to catalyze improvements in PHC delivery and outcomes in low- and middle-income countries through better measurement and sharing of effective models and practices. PHCPI has developed a framework to illustrate the relationship between key financing, workforce, and supply inputs, and core primary health care functions of first-contact accessibility, comprehensiveness, coordination, continuity, and person-centeredness. The framework provides guidance for more effective assessment of current strengths and gaps in PHC delivery through a core set of 25 key indicators ("Vital Signs"). Emerging best practices that foster high-performing PHC system development are being codified and shared around low- and high-income countries. These measurement and improvement approaches provide countries and implementers with tools to assess the current state of their PHC delivery system and to identify where cross-country learning can accelerate improvements in PHC quality and effectiveness.
NASA Astrophysics Data System (ADS)
Cheng, Lara W. S.
Airport moving maps (AMMs) have been shown to decrease navigation errors, increase taxiing speed, and reduce workload when they depict airport layout, current aircraft position, and the cleared taxi route. However, current technologies are limited in their ability to depict the cleared taxi route due to the unavailability of datacomm or other means of electronically transmitting clearances from ATC to the flight deck. This study examined methods by which pilots can input ATC-issued taxi clearances to support taxi route depictions on the AMM. Sixteen general aviation (GA) pilots used a touchscreen monitor to input taxi clearances using two input layouts, softkeys and QWERTY, each with and without feedforward (graying out invalid inputs). QWERTY yielded more taxi route input errors than the softkeys layout. The presence of feedforward did not produce fewer taxi route input errors than in the non-feedforward condition. The QWERTY layout did reduce taxi clearance input times relative to the softkeys layout, but when feedforward was present this effect was observed only for the longer, 6-segment taxi clearances. It was observed that with the softkeys layout, feedforward reduced input times compared to non-feedforward but only for the 4-segment clearances. Feedforward did not support faster taxi clearance input times for the QWERTY layout. Based on the results and analyses of the present study, it is concluded that for taxi clearance inputs, (1) QWERTY remain the standard for alphanumeric inputs, and (2) feedforward be investigated further, with a focus on participant preference and performance of black-gray contrast of keys.
40 CFR 60.4176 - Additional requirements to provide heat input data.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 6 2011-07-01 2011-07-01 false Additional requirements to provide heat... requirements to provide heat input data. The owner or operator of a Hg Budget unit that monitors and reports Hg... monitor and report heat input rate at the unit level using the procedures set forth in part 75 of this...
NASA Astrophysics Data System (ADS)
Mao, Yaya; Wu, Chongqing; Liu, Bo; Ullah, Rahat; Tian, Feng
2017-12-01
We experimentally investigate the polarization insensitivity and cascadability of an all-optical wavelength converter for differential phase-shift keyed (DPSK) signals for the first time. The proposed wavelength converter is composed of a one-bit delay interferometer demodulation stage followed by a single semiconductor optical amplifier. The impact of input DPSK signal polarization fluctuation on receiver sensitivity for the converted signal is investigated. It is found that this scheme is almost insensitive to the state of polarization of the input DPSK signal. Furthermore, the cascadability of the converter is demonstrated in a two-path recirculating loop. Error-free transmission is achieved with 20 stage cascaded wavelength conversions over 2800 km, where the power penalty is <3.4 dB at a bit error rate of 10^-9.
A Tool and Application Programming Interface for Browsing Historical Geostationary Satellite Data
NASA Astrophysics Data System (ADS)
Chee, T.; Nguyen, L.; Minnis, P.; Spangenberg, D.; Ayers, J.
2013-12-01
Providing access to information is a key concern for NASA Langley Research Center. We describe a tool and method that allows end users to easily browse and access information that is otherwise difficult to acquire and manipulate. The tool described has as its core the application-programming interface that is made available to the public. One goal of the tool is to provide a demonstration to end users so that they can use the enhanced imagery as an input into their own work flows. This project builds upon NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite imagery accessible and easily searchable. As we see the increasing use of virtual supply chains that provide additional value at each link there is value in making satellite imagery available through a simple access method as well as allowing users to browse and view that imagery as they need rather than in a manner most convenient for the data provider.
NASA Astrophysics Data System (ADS)
Beijen, Michiel A.; Voorhoeve, Robbert; Heertjes, Marcel F.; Oomen, Tom
2018-07-01
Vibration isolation is essential for industrial high-precision systems to suppress external disturbances. The aim of this paper is to develop a general identification approach to estimate the frequency response function (FRF) of the transmissibility matrix, which is a key performance indicator for vibration isolation systems. The major challenge lies in obtaining a good signal-to-noise ratio in view of a large system weight. A non-parametric system identification method is proposed that combines floor and shaker excitations. Furthermore, a method is presented to analyze the input power spectrum of the floor excitations, both in terms of magnitude and direction. In turn, the input design of the shaker excitation signals is investigated to obtain sufficient excitation power in all directions with minimum experiment cost. The proposed methods are shown to provide an accurate FRF of the transmissibility matrix in three relevant directions on an industrial active vibration isolation system over a large frequency range. This demonstrates that, despite their heavy weight, industrial vibration isolation systems can be accurately identified using this approach.
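A single-input, single-output sketch of the non-parametric FRF (H1) estimation described above can be written with scipy's spectral estimators; the simulated isolator dynamics, excitation, and noise level below are placeholders standing in for the industrial floor-plus-shaker setup.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(4)
fs = 1000.0
n = 200_000

# Placeholder isolator dynamics: treat the floor-to-payload path as a 2nd-order
# low-pass (soft mount), and excite it with broadband floor/shaker motion.
b, a = signal.butter(2, 5.0, fs=fs)          # 5 Hz corner, stand-in for the real plant
u = rng.normal(size=n)                       # combined floor + shaker excitation
y = signal.lfilter(b, a, u) + 0.01 * rng.normal(size=n)   # payload response + sensor noise

# Non-parametric H1 estimate of the transmissibility FRF: T(f) = S_uy(f) / S_uu(f).
f, s_uu = signal.welch(u, fs=fs, nperseg=4096)
_, s_uy = signal.csd(u, y, fs=fs, nperseg=4096)
frf = s_uy / s_uu

# Compare against the true filter response at a few frequencies.
_, h_true = signal.freqz(b, a, worN=f, fs=fs)
for fi in (1.0, 5.0, 20.0):
    i = np.argmin(np.abs(f - fi))
    print(f"{f[i]:5.1f} Hz  |T_est| = {abs(frf[i]):.3f}  |T_true| = {abs(h_true[i]):.3f}")
```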
Preparing for influenza after 2009 H1N1: special considerations for pregnant women and newborns.
Rasmussen, Sonja A; Kissin, Dmitry M; Yeung, Lorraine F; MacFarlane, Kitty; Chu, Susan Y; Turcios-Ruiz, Reina M; Mitchell, Elizabeth W; Williams, Jennifer; Fry, Alicia M; Hageman, Jeffrey; Uyeki, Timothy M; Jamieson, Denise J
2011-06-01
Pregnant women and their newborn infants are at increased risk for influenza-associated complications, based on data from seasonal influenza and influenza pandemics. The Centers for Disease Control and Prevention (CDC) developed public health recommendations for these populations in response to the 2009 H1N1 pandemic. A review of these recommendations and information that was collected during the pandemic is needed to prepare for future influenza seasons and pandemics. The CDC convened a meeting entitled "Pandemic Influenza Revisited: Special Considerations for Pregnant Women and Newborns" on August 12-13, 2010, to gain input from experts and key partners on 4 main topics: antiviral prophylaxis and therapy, vaccine use, intrapartum/newborn (including infection control) issues, and nonpharmaceutical interventions and health care planning. Challenges to communicating recommendations regarding influenza to pregnant women and their health care providers were also discussed. After careful consideration of the available information and individual expert input, the CDC updated its recommendations for these populations for future influenza seasons and pandemics. Published by Mosby, Inc.
Neutrinoless double beta decay and chiral SU(3)
Cirigliano, Vincenzo; Dekens, Wouter Gerard; Graesser, Michael Lawrence; ...
2017-04-14
TeV-scale lepton number violation can affect neutrinoless double beta decay through dimension-9 ΔL=ΔI=2 operators involving two electrons and four quarks. Since the dominant effects within a nucleus are expected to arise from pion exchange, the π⁻→π⁺ee matrix elements of the dimension-9 operators are a key hadronic input. In this letter we provide estimates for the π⁻→π⁺ matrix elements of all Lorentz scalar ΔI=2 four-quark operators relevant to the study of TeV-scale lepton number violation. The analysis is based on chiral SU(3) symmetry, which relates the π⁻→π⁺ matrix elements of the ΔI=2 operators to the K⁰→K̄⁰ and K→ππ matrix elements of their ΔS=2 and ΔS=1 chiral partners, for which lattice QCD input is available. The inclusion of next-to-leading order chiral loop corrections to all symmetry relations used in the analysis makes our results robust at the 30% level or better, depending on the operator.
A trans-cultural comparison of the organisation of care at headache centres world-wide.
Bhola, Ria; Goadsby, Peter J
2011-02-01
The need to provide better outcomes for patients with headache, and to minimise the costs involved in doing so, has prompted the search for new modes of service delivery by exploring the service organisation and nursing role from various cultural, economic and global perspectives. This study was based on comparisons with the UK headache service up to 2007, the point at which this study was set up. This UK service was based at the National Hospital for Neurology and Neurosurgery (NHNN, UCLH Trust). Data were obtained from US headache centres in 2008 and from centres in Copenhagen, Bangkok, Sydney and Porto Alegre in 2009. A comparison of the key components of the services at all centres is presented, showing the team structure and size of each service. Prominent features at the centres included: team-working, regular meetings, educational input, good access and communication among team members, headache-trained neurologists, specialist nursing at most centres, and the input of psychological and physical therapists at some centres. The problems of tertiary headache care are very similar throughout the world and seem to transcend ethnic, cultural and economic considerations.
Molecular quantum cellular automata cell design trade-offs: latching vs. power dissipation.
Rahimi, Ehsan; Reimers, Jeffrey R
2018-06-20
The use of molecules to enact quantum cellular automata (QCA) cells has been proposed as a new way for performing electronic logic operations at sub-nm dimensions. A key question that arises concerns whether chemical or physical processes are to be exploited. The use of chemical reactions allows the state of a switch element to be latched in molecular form, making the output of a cell independent of its inputs, but costs energy to do the reaction. Alternatively, if purely electronic polarization is manipulated then no internal latching occurs, but no power is dissipated provided the fields from the inputs change slowly compared to the molecular response times. How these scenarios pan out is discussed by considering calculated properties of the 1,4-diallylbutane cation, a species often used as a paradigm for molecular electronic switching. Utilized are results from different calculation approaches that depict the ion either as a charge-localized mixed-valence compound functioning as a bistable switch, or else as an extremely polarizable molecule with a delocalized electronic structure. Practical schemes for using molecular cells in QCA and other devices emerge.
Finite Element-Based Mechanical Assessment of Bone Quality on the Basis of In Vivo Images.
Pahr, Dieter H; Zysset, Philippe K
2016-12-01
Beyond bone mineral density (BMD), bone quality designates the mechanical integrity of bone tissue. In vivo images based on X-ray attenuation, such as CT reconstructions, provide size, shape, and local BMD distribution and may be exploited as input for finite element analysis (FEA) to assess bone fragility. Further key input parameters of FEA are the material properties of bone tissue. This review discusses the main determinants of bone mechanical properties and emphasizes the added value, as well as the important assumptions underlying finite element analysis. Bone tissue is a sophisticated, multiscale composite material that undergoes remodeling but exhibits a rather narrow band of tissue mineralization. Mechanically, bone tissue behaves elastically under physiologic loads and yields by cracking beyond critical strain levels. Through adequate cell-orchestrated modeling, trabecular bone tunes its mechanical properties by volume fraction and fabric. With proper calibration, these mechanical properties may be incorporated in quantitative CT-based finite element analysis that has been validated extensively with ex vivo experiments and has been applied increasingly in clinical trials to assess treatment efficacy against osteoporosis.
Brainstem mechanisms underlying the cough reflex and its regulation.
Mutolo, Donatella
2017-09-01
Cough is a very important airway protective reflex. Cough-related inputs are conveyed to the caudal nucleus tractus solitarii (cNTS) that projects to the brainstem respiratory network. The latter is reconfigured to generate the cough motor pattern. A high degree of modulation is exerted on second-order neurons and the brainstem respiratory network by sensory inputs and higher brain areas. Two medullary structures proved to have key functions in cough production and to be strategic sites of action for centrally active drugs: the cNTS and the caudal ventral respiratory group (cVRG). Drugs microinjected into these medullary structures caused downregulation or upregulation of the cough reflex. The results suggest that inhibition and disinhibition are prominent regulatory mechanisms of this reflex and that both the cNTS and the cVRG are essential in the generation of the entire cough motor pattern. Studies on the basic neural mechanisms subserving the cough reflex may provide hints for novel therapeutic approaches. Different proposals for further investigations are advanced. Copyright © 2017 Elsevier B.V. All rights reserved.
Entangling the Whole by Beam Splitting a Part.
Croal, Callum; Peuntinger, Christian; Chille, Vanessa; Marquardt, Christoph; Leuchs, Gerd; Korolkova, Natalia; Mišta, Ladislav
2015-11-06
A beam splitter is a basic linear optical element appearing in many optics experiments and is frequently used as a continuous-variable entangler transforming a pair of input modes from a separable Gaussian state into an entangled state. However, a beam splitter is a passive operation that can create entanglement from Gaussian states only under certain conditions. One such condition is that the input light is suitably squeezed. We demonstrate, experimentally, that a beam splitter can create entanglement even from modes which do not possess such a squeezing provided that they are correlated to, but not entangled with, a third mode. Specifically, we show that a beam splitter can create three-mode entanglement by acting on two modes of a three-mode fully separable Gaussian state without entangling the two modes themselves. This beam splitter property is a key mechanism behind the performance of the protocol for entanglement distribution by separable states. Moreover, the property also finds application in collaborative quantum dense coding in which decoding of transmitted information is assisted by interference with a mode of the collaborating party.
Visual analytics of inherently noisy crowdsourced data on ultra high resolution displays
NASA Astrophysics Data System (ADS)
Huynh, Andrew; Ponto, Kevin; Lin, Albert Yu-Min; Kuester, Falko
The increasing prevalence of distributed human microtasking, or crowdsourcing, has followed the exponential increase in data collection capabilities. The large scale and distributed nature of these microtasks produce overwhelming amounts of information that is inherently noisy due to the nature of human input. Furthermore, these inputs create a constantly changing dataset, with additional information added on a daily basis. Methods to quickly visualize, filter, and understand this information over temporal and geospatial constraints are key to the success of crowdsourcing. This paper presents novel methods to visually analyze geospatial data collected through crowdsourcing on top of remote sensing satellite imagery. An ultra high resolution tiled display system is used to explore the relationship between human and satellite remote sensing data at scale. A case study is provided that evaluates the presented technique in the context of an archaeological field expedition. A team in the field communicated in real-time with and was guided by researchers in the remote visual analytics laboratory, who swiftly sifted through incoming crowdsourced data to identify target locations flagged as viable archaeological sites.
Uncertainty in Ecohydrological Modeling in an Arid Region Determined with Bayesian Methods
Yang, Junjun; He, Zhibin; Du, Jun; Chen, Longfei; Zhu, Xi
2016-01-01
In arid regions, water resources are a key forcing factor in ecosystem circulation, and soil moisture is the critical link that constrains plant and animal life on the soil surface and underground. Simulation of soil moisture in arid ecosystems is inherently difficult due to high variability. We assessed the applicability of the process-oriented CoupModel for forecasting of soil water relations in arid regions. We used vertical soil moisture profiling for model calibration. We determined that model-structural uncertainty constituted the largest error; the model did not capture the extremes of low soil moisture in the desert-oasis ecotone (DOE), particularly below 40 cm soil depth. Our results showed that total uncertainty in soil moisture prediction was improved when input and output data, parameter value array, and structure errors were characterized explicitly. Bayesian analysis was applied with prior information to reduce uncertainty. The need to provide independent descriptions of uncertainty analysis (UA) in the input and output data was demonstrated. Application of soil moisture simulation in arid regions will be useful for dune-stabilization and revegetation efforts in the DOE. PMID:26963523
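To make the Bayesian calibration step concrete, here is a minimal Python sketch of a random-walk Metropolis sampler fitted to a toy exponential soil-moisture dry-down model with synthetic observations; the model form, prior range, and noise level are assumptions for illustration and are not CoupModel or the study's data.

import numpy as np

rng = np.random.default_rng(0)

def soil_moisture(k, t):
    # Toy stand-in for a process model: exponential dry-down from saturation.
    return 0.35 * np.exp(-k * t)

t = np.arange(0, 30.0)
obs = soil_moisture(0.12, t) + rng.normal(0, 0.01, t.size)   # synthetic "observations"

def log_post(k, sigma=0.01):
    if k <= 0 or k > 1:
        return -np.inf                       # uniform prior on (0, 1]
    resid = obs - soil_moisture(k, t)
    return -0.5 * np.sum((resid / sigma) ** 2)

samples, k = [], 0.5
for _ in range(5000):                        # random-walk Metropolis sampler
    k_new = k + rng.normal(0, 0.02)
    if np.log(rng.uniform()) < log_post(k_new) - log_post(k):
        k = k_new
    samples.append(k)

print(np.mean(samples[1000:]), np.std(samples[1000:]))   # posterior mean and spread of k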
Wiedmann, Thomas O; Suh, Sangwon; Feng, Kuishuang; Lenzen, Manfred; Acquaye, Adolf; Scott, Kate; Barrett, John R
2011-07-01
Future energy technologies will be key for a successful reduction of man-made greenhouse gas emissions. With demand for electricity projected to increase significantly in the future, climate policy goals of limiting the effects of global atmospheric warming can only be achieved if power generation processes are profoundly decarbonized. Energy models, however, have ignored the fact that upstream emissions are associated with any energy technology. In this work we explore methodological options for hybrid life cycle assessment (hybrid LCA) to account for the indirect greenhouse gas (GHG) emissions of energy technologies using wind power generation in the UK as a case study. We develop and compare two different approaches using a multiregion input-output modeling framework - Input-Output-based Hybrid LCA and Integrated Hybrid LCA. The latter utilizes the full-sized Ecoinvent process database. We discuss significance and reliability of the results and suggest ways to improve the accuracy of the calculations. The comparison of hybrid LCA methodologies provides valuable insight into the availability and robustness of approaches for informing energy and environmental policy.
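As an illustration of the input-output step that underlies the IO-based hybrid LCA described above, the following minimal Python sketch computes direct plus indirect (embodied) emissions for one unit of final demand using the Leontief inverse; the three-sector coefficient matrix, emission intensities, and the demanded sector are made-up placeholders, not values from the study.

import numpy as np

# Illustrative 3-sector environmentally extended input-output (EEIO) calculation.
A = np.array([[0.10, 0.05, 0.02],    # technical coefficients: inputs per unit output
              [0.20, 0.10, 0.05],
              [0.05, 0.15, 0.10]])
f = np.array([0.8, 0.3, 0.1])        # direct GHG intensity of each sector (kg CO2e per unit output)
y = np.array([0.0, 1.0, 0.0])        # final demand: one unit from the second sector

x = np.linalg.solve(np.eye(3) - A, y)   # total (direct + indirect) output, x = (I - A)^-1 y
total_emissions = f @ x                  # emissions embodied in the demanded unit
print(x, total_emissions)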
Neural principles of memory and a neural theory of analogical insight
NASA Astrophysics Data System (ADS)
Lawson, David I.; Lawson, Anton E.
1993-12-01
Grossberg's principles of neural modeling are reviewed and extended to provide a neural level theory to explain how analogies greatly increase the rate of learning and can, in fact, make learning and retention possible. In terms of memory, the key point is that the mind is able to recognize and recall when it is able to match sensory input from new objects, events, or situations with past memory records of similar objects, events, or situations. When a match occurs, an adaptive resonance is set up in which the synaptic strengths of neurons are increased; thus a long term record of the new input is formed in memory. Systems of neurons called outstars and instars are presumably the underlying units that enable this to occur. Analogies can greatly facilitate learning and retention because they activate the outstars (i.e., the cells that are sampling the to-be-learned pattern) and cause the neural activity to grow exponentially by forming feedback loops. This increased activity ensures the boost in synaptic strengths of neurons, thus causing storage and retention in long-term memory (i.e., learning).
SNP ID-info: SNP ID searching and visualization platform.
Yang, Cheng-Hong; Chuang, Li-Yeh; Cheng, Yu-Huei; Wen, Cheng-Hao; Chang, Phei-Lang; Chang, Hsueh-Wei
2008-09-01
Many association studies report relationships between single nucleotide polymorphisms (SNPs), diseases and cancers without, however, giving a SNP ID. Here, we developed the SNP ID-info freeware to provide SNP IDs for input genetic and physical genome information. The program provides an "SNP-ePCR" function to generate the full sequence using primers and template inputs. In "SNPosition," the sequence from SNP-ePCR or from direct input is matched against SNP IDs from the SNP fasta-sequence. In the "SNP search" and "SNP fasta" functions, inputs of cytogenetic band, contig position, and keyword are accepted. Finally, the SNP ID neighboring environment for the inputs is fully visualized in order of contig position and marked with SNP and flanking hits. The SNP identification problems inherent in NCBI SNP BLAST are also avoided. In conclusion, SNP ID-info provides a visualized SNP ID environment for multiple inputs and assists systematic SNP association studies. The server and user manual are available at http://bio.kuas.edu.tw/snpid-info.
Whitehead, P G; Leckie, H; Rankinen, K; Butterfield, D; Futter, M N; Bussi, G
2016-12-01
Pathogens are an ongoing issue for catchment water management and quantifying their transport, loss and potential impacts at key locations, such as water abstractions for public supply and bathing sites, is an important aspect of catchment and coastal management. The Integrated Catchment Model (INCA) has been adapted to model the sources and sinks of pathogens and to capture the dominant dynamics and processes controlling pathogens in catchments. The model simulates the stores of pathogens in soils, sediments, rivers and groundwaters and can account for diffuse inputs of pathogens from agriculture, urban areas or atmospheric deposition. The model also allows for point source discharges from intensive livestock units or from sewage treatment works or any industrial input to river systems. Model equations are presented and the new pathogens model has been applied to the River Thames in order to assess total coliform (TC) responses under current and projected future land use. A Monte Carlo sensitivity analysis indicates that the input coliform estimates from agricultural sources and decay rates are the crucial parameters controlling pathogen behaviour. Whilst there are a number of uncertainties associated with the model that should be accounted for, INCA-Pathogens potentially provides a useful tool to inform policy decisions and manage pathogen loading in river systems. Copyright © 2016. Published by Elsevier B.V.
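As a hedged illustration of the kind of Monte Carlo sensitivity analysis mentioned above, the short Python sketch below propagates an uncertain diffuse coliform load and decay rate through a toy first-order in-stream decay model and ranks the inputs by rank correlation with the simulated concentration; the model and all parameter ranges are placeholders, not the INCA-Pathogens equations.

import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Toy first-order model: in-stream coliform concentration after a fixed travel time,
# driven by an uncertain diffuse (agricultural) load and an uncertain decay rate.
load = rng.lognormal(mean=np.log(1e6), sigma=0.5, size=n)   # organisms per day (assumed range)
decay = rng.uniform(0.1, 1.0, size=n)                       # first-order decay rate per day (assumed)
flow, travel_time = 5e4, 2.0                                # m3 per day and days, held fixed here

conc = load / flow * np.exp(-decay * travel_time)

def rank_corr(a, b):
    # Spearman rank correlation as a simple global sensitivity measure.
    ra, rb = np.argsort(np.argsort(a)), np.argsort(np.argsort(b))
    return np.corrcoef(ra, rb)[0, 1]

print(rank_corr(load, conc), rank_corr(decay, conc))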
Microlens array processor with programmable weight mask and direct optical input
NASA Astrophysics Data System (ADS)
Schmid, Volker R.; Lueder, Ernst H.; Bader, Gerhard; Maier, Gert; Siegordner, Jochen
1999-03-01
We present an optical feature extraction system with a microlens array processor. The system is suitable for online implementation of a variety of transforms such as the Walsh transform and DCT. Operating with incoherent light, our processor accepts direct optical input. Employing a sandwich-like architecture, we obtain a very compact design of the optical system. The key elements of the microlens array processor are a square array of 15 X 15 spherical microlenses on acrylic substrate and a spatial light modulator as transmissive mask. The light distribution behind the mask is imaged onto the pixels of a customized a-Si image sensor with adjustable gain. We obtain one output sample for each microlens image and its corresponding weight mask area as summation of the transmitted intensity within one sensor pixel. The resulting architecture is very compact and robust like a conventional camera lens while incorporating a high degree of parallelism. We successfully demonstrate a Walsh transform into the spatial frequency domain as well as the implementation of a discrete cosine transform with digitized gray values. We provide results showing the transformation performance for both synthetic image patterns and images of natural texture samples. The extracted frequency features are suitable for neural classification of the input image. Other transforms and correlations can be implemented in real-time allowing adaptive optical signal processing.
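The weighted-mask summation performed by the processor is, electronically, a dot product between each sub-image and a Walsh pattern. A minimal Python sketch of that view is given below; the 8x8 image size, the Sylvester-constructed Hadamard basis, and the neglect of negative mask weights (a real transmissive mask cannot be negative) are simplifying assumptions, not the optical implementation.

import numpy as np

def hadamard(n):
    # Sylvester construction of a 2^k x 2^k Hadamard (Walsh) matrix.
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

# Each microlens sub-image is weighted by one mask pattern and summed on one sensor
# pixel; numerically this is a dot product with one row of the 2-D Walsh basis.
image = np.random.rand(8, 8)             # stand-in for one microlens sub-image
W = np.kron(hadamard(8), hadamard(8))    # rows are 2-D Walsh patterns (outer products of 1-D rows)
coeffs = W @ image.ravel()               # one output sample per mask pattern
print(coeffs[:4])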
High Performance Input/Output for Parallel Computer Systems
NASA Technical Reports Server (NTRS)
Ligon, W. B.
1996-01-01
The goal of our project is to study the I/O characteristics of parallel applications used in Earth Science data processing systems such as Regional Data Centers (RDCs) or EOSDIS. Our approach is to study the runtime behavior of typical programs and the effect of key parameters of the I/O subsystem both under simulation and with direct experimentation on parallel systems. Our three year activity has focused on two items: developing a test bed that facilitates experimentation with parallel I/O, and studying representative programs from the Earth science data processing application domain. The Parallel Virtual File System (PVFS) has been developed for use on a number of platforms including the Tiger Parallel Architecture Workbench (TPAW) simulator, the Intel Paragon, a cluster of DEC Alpha workstations, and the Beowulf system (at CESDIS). PVFS provides considerable flexibility in configuring I/O in a UNIX-like environment. Access to key performance parameters facilitates experimentation. We have studied several key applications from levels 1, 2, and 3 of the typical RDC processing scenario including instrument calibration and navigation, image classification, and numerical modeling codes. We have also considered large-scale scientific database codes used to organize image data.
Taghadomi-Saberi, Saeedeh; Mas Garcia, Sílvia; Allah Masoumi, Amin; Sadeghi, Morteza; Marco, Santiago
2018-06-13
The quality and composition of bitter orange essential oils (EOs) strongly depend on the ripening stage of the citrus fruit. The concentration of volatile compounds, and consequently its organoleptic perception, varies. While this can be detected by trained humans, we propose an objective approach for assessing bitter oranges from the volatile composition of their EOs. The method is based on the combined use of headspace gas chromatography–mass spectrometry (HS-GC-MS) and artificial neural networks (ANN) for predictive modeling. Data obtained from the analysis of HS-GC-MS were preprocessed to select relevant peaks in the total ion chromatogram as input features for the ANN. Results showed that key volatile compounds have enough predictive power to accurately classify the EOs according to their ripening stage for different applications. A sensitivity analysis detected the key compounds needed to identify the ripening stage. This study provides a novel strategy for the quality control of bitter orange EO without subjective methods.
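A minimal Python sketch of the general pipeline (integrated peak areas as input features to a small neural network classifier of ripening stage) follows; the feature matrix, labels, and network size are random placeholders rather than the study's HS-GC-MS data or its ANN configuration.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier

# Placeholder data: rows = essential-oil samples, columns = selected TIC peak areas,
# labels = ripening stage (0 = unripe, 1 = semi-ripe, 2 = ripe). Real inputs would come
# from HS-GC-MS peak integration.
rng = np.random.default_rng(0)
X = rng.random((90, 12))
y = rng.integers(0, 3, size=90)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
scaler = StandardScaler().fit(X_train)

ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
ann.fit(scaler.transform(X_train), y_train)
print(ann.score(scaler.transform(X_test), y_test))   # held-out classification accuracy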
Zullig, Leah L; Granger, Bradi B; Bosworth, Hayden B
2016-01-01
Nonadherence to prescription medications is a common and costly problem with multiple contributing factors, spanning the dimensions of individual behavior change, psychology, medicine, and health policy, among others. Addressing the problem of medication nonadherence requires strategic input from key experts in a number of fields. The Medication Adherence Alliance is a group of key experts, predominately from the US, in the field of medication nonadherence. Members include representatives from consumer advocacy groups, community health providers, nonprofit groups, the academic community, decision-making government officials, and industry. In 2015, the Medication Adherence Alliance convened to review the current landscape of medication adherence. The group then established three working groups that will develop recommendations for shifting toward solutions-oriented science. From the perspective of the Medication Adherence Alliance, the objective of this commentary is to describe changes in the US landscape of medication adherence, framing the evolving field in the context of a recent think tank meeting of experts in the field of medication adherence.
Periodic components of hand acceleration/deceleration impulses during telemanipulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Draper, J.V.; Handel, S.
1994-01-01
Responsiveness is the ability of a telemanipulator to recreate user trajectories and impedance in time and space. For trajectory production, a key determinant of responsiveness is the ability of the system to accept user inputs, which are forces on the master handle generated by user hand acceleration/deceleration (a/d) impulses, and translate them into slave arm acceleration/deceleration. This paper presents observations of master controller a/d impulses during completion of a simple target acquisition task. Power spectral density functions (PSDFs) calculated from hand controller a/d impulses were used to assess impulse waveform. The relative contributions of frequency intervals ranging up to 25 Hz for three spatially different versions of the task were used to determine which frequencies were most important. The highest relative power was observed in frequencies between 1 Hz and 6 Hz. The key frequencies related to task difficulty were in the range from 2 Hz to 8 Hz. The results provide clues to the source of the performance inhibition.
Singh, Anoop; Pant, Deepak; Korres, Nicholas E; Nizami, Abdul-Sattar; Prasad, Shiv; Murphy, Jerry D
2010-07-01
Progressive depletion of conventional fossil fuels with increasing energy consumption and greenhouse gas (GHG) emissions have led to a move towards renewable and sustainable energy sources. Lignocellulosic biomass is available in massive quantities and provides enormous potential for bioethanol production. However, to ascertain optimal biofuel strategies, it is necessary to take into account environmental impacts from cradle to grave. Life cycle assessment (LCA) techniques allow detailed analysis of material and energy fluxes on regional and global scales. This includes indirect inputs to the production process and associated wastes and emissions, and the downstream fate of products in the future. At the same time if not used properly, LCA can lead to incorrect and inappropriate actions on the part of industry and/or policy makers. This paper aims to list key issues for quantifying the use of resources and releases to the environment associated with the entire life cycle of lignocellulosic bioethanol production. Copyright 2009 Elsevier Ltd. All rights reserved.
Wijesena, Naveen; Simmons, David K.
2017-01-01
Gastrulation was arguably the key evolutionary innovation that enabled metazoan diversification, leading to the formation of distinct germ layers and specialized tissues. Differential gene expression specifying cell fate is governed by the inputs of intracellular and/or extracellular signals. Beta-catenin/Tcf and the TGF-beta bone morphogenetic protein (BMP) provide critical molecular signaling inputs during germ layer specification in bilaterian metazoans, but there has been no direct experimental evidence for a specific role for BMP signaling during endomesoderm specification in the early branching metazoan Nematostella vectensis (an anthozoan cnidarian). Using forward transcriptomics, we show that beta-catenin/Tcf signaling and BMP2/4 signaling provide differential inputs into the cnidarian endomesodermal gene regulatory network (GRN) at the onset of gastrulation (24 h postfertilization) in N. vectensis. Surprisingly, beta-catenin/Tcf signaling and BMP2/4 signaling regulate a subset of common downstream target genes in the GRN in opposite ways, leading to the spatial and temporal differentiation of fields of cells in the developing embryo. Thus, we show that regulatory interactions between beta-catenin/Tcf signaling and BMP2/4 signaling are required for the specification and determination of different embryonic regions and the patterning of the oral–aboral axis in Nematostella. We also show functionally that the conserved “kernel” of the bilaterian heart mesoderm GRN is operational in N. vectensis, which reinforces the hypothesis that the endoderm and mesoderm in triploblastic bilaterians evolved from the bifunctional endomesoderm (gastrodermis) of a diploblastic ancestor, and that slow rhythmic contractions might have been one of the earliest functions of mesodermal tissue. PMID:28652368
Three-input majority logic gate and multiple input logic circuit based on DNA strand displacement.
Li, Wei; Yang, Yang; Yan, Hao; Liu, Yan
2013-06-12
In biomolecular programming, the properties of biomolecules such as proteins and nucleic acids are harnessed for computational purposes. The field has gained considerable attention due to the possibility of exploiting the massive parallelism that is inherent in natural systems to solve computational problems. DNA has already been used to build complex molecular circuits, where the basic building blocks are logic gates that produce single outputs from one or more logical inputs. We designed and experimentally realized a three-input majority gate based on DNA strand displacement. One of the key features of a three-input majority gate is that the three inputs have equal priority, and the output will be true if at least two of the three inputs are true. Our design consists of a central, circular DNA strand with three unique domains between which are identical joint sequences. Before inputs are introduced to the system, each domain and half of each joint is protected by one complementary ssDNA that displays a toehold for subsequent displacement by the corresponding input. With this design the relationship between any two domains is analogous to the relationship between inputs in a majority gate. Displacing two or more of the protection strands will expose at least one complete joint and return a true output; displacing none or only one of the protection strands will not expose a complete joint and will return a false output. Further, we designed and realized a complex five-input logic gate based on the majority gate described here. By controlling two of the five inputs the complex gate can realize every combination of OR and AND gates of the other three inputs.
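The logical behaviour of the gates, independent of the strand-displacement chemistry, can be checked in a few lines; the Python sketch below assumes the standard Boolean definitions of three- and five-input majority and shows how fixing two of five inputs yields OR and AND of the remaining three.

def maj3(a, b, c):
    # True when at least two of the three inputs are true.
    return (a + b + c) >= 2

def maj5(v, w, x, y, z):
    # True when at least three of the five inputs are true.
    return (v + w + x + y + z) >= 3

# Fixing two of the five inputs reconfigures the gate for the remaining three:
# both control inputs true -> OR(x, y, z); both false -> AND(x, y, z).
for x in (0, 1):
    for y in (0, 1):
        for z in (0, 1):
            assert maj5(1, 1, x, y, z) == bool(x or y or z)
            assert maj5(0, 0, x, y, z) == bool(x and y and z)
print("majority-gate truth-table checks passed")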
Active Learning Framework for Non-Intrusive Load Monitoring: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jin, Xin
2016-05-16
Non-Intrusive Load Monitoring (NILM) is a set of techniques that estimate the electricity usage of individual appliances from power measurements taken at a limited number of locations in a building. One of the key challenges in NILM is having too much data without class labels while being unable to label the data manually due to cost or time constraints. This paper presents an active learning framework that helps existing NILM techniques to overcome this challenge. Active learning is an advanced machine learning method that interactively queries a user for class label information. Unlike most existing NILM systems that heuristically request user inputs, the proposed method only needs minimally sufficient information from a user to build a compact and yet highly representative load signature library. Initial results indicate the proposed method can reduce the user inputs by up to 90% while still achieving similar disaggregation performance compared to a heuristic method. Thus, the proposed method can substantially reduce the burden on the user, improve the performance of a NILM system with limited user inputs, and overcome the key market barriers to the wide adoption of NILM technologies.
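The abstract does not specify the query strategy, so the following Python sketch shows a generic pool-based uncertainty-sampling loop of the kind an active-learning NILM framework might use; the features, labels, classifier, and query budget are placeholders, not the proposed method.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_pool = rng.random((500, 6))                               # placeholder load-signature features
y_pool = (X_pool[:, 0] + X_pool[:, 3] > 1.0).astype(int)    # hidden "true" appliance labels

# Seed the library with one labeled example per class, then query the least certain samples.
labeled = [int(np.flatnonzero(y_pool == 0)[0]), int(np.flatnonzero(y_pool == 1)[0])]
for _ in range(20):                                         # budget of 20 user queries
    clf = LogisticRegression().fit(X_pool[labeled], y_pool[labeled])
    uncertainty = np.abs(clf.predict_proba(X_pool)[:, 1] - 0.5)
    uncertainty[labeled] = np.inf                           # never re-query labeled samples
    labeled.append(int(np.argmin(uncertainty)))             # "ask the user" for this label

print(clf.score(X_pool, y_pool))                            # accuracy over the full pool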
Biometric Data Safeguarding Technologies Analysis and Best Practices
2011-12-01
The "fuzzy vault" scheme proposed by Juels and Sudan was designed to encrypt data such that it could be unlocked by similar but inexact matches... designed transform functions. Multifactor key generation combines a biometric with one or more other inputs, such as a... cooperative, off-angle iris images. Since the commercialized system is designed for images acquired from a specific, paired acquisition system
The deep ocean under climate change.
Levin, Lisa A; Le Bris, Nadine
2015-11-13
The deep ocean absorbs vast amounts of heat and carbon dioxide, providing a critical buffer to climate change but exposing vulnerable ecosystems to combined stresses of warming, ocean acidification, deoxygenation, and altered food inputs. Resulting changes may threaten biodiversity and compromise key ocean services that maintain a healthy planet and human livelihoods. There exist large gaps in understanding of the physical and ecological feedbacks that will occur. Explicit recognition of deep-ocean climate mitigation and inclusion in adaptation planning by the United Nations Framework Convention on Climate Change (UNFCCC) could help to expand deep-ocean research and observation and to protect the integrity and functions of deep-ocean ecosystems. Copyright © 2015, American Association for the Advancement of Science.
Solid-State Lighting 2017 Suggested Research Topics Supplement: Technology and Market Context
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
A 2017 update to the Solid-State Lighting R&D Plan that is divided into two documents. The first document describes a list of suggested SSL priority research topics and the second document provides context and background, including information drawn from technical, market, and economic studies. Widely referenced by industry and government both here and abroad, these documents reflect SSL stakeholder inputs on key R&D topics that will improve efficacy, reduce cost, remove barriers to adoption, and add value for LED and OLED lighting solutions over the next three to five years, and discuss those applications that drive and prioritize the specific R&D.
Taliotis, Constantinos; Taibi, Emanuele; Howells, Mark; Rogner, Holger; Bazilian, Morgan; Welsch, Manuel
2017-10-01
The generation mix of Cyprus has been dominated by oil products for decades. In order to conform with European Union and international legislation, a transformation of the supply system is called for. Energy system models can facilitate energy planning into the future, but a large volume of data is required to populate such models. The present data article provides information on key modelling assumptions and input data adopted with the aim of representing the electricity supply system of Cyprus in a separate research article. Data in regards to renewable energy technoeconomic characteristics and investment cost projections, fossil fuel price projections, storage technology characteristics and system operation assumptions are described in this article.
Dynamic Fuzzy Model Development for a Drum-type Boiler-turbine Plant Through GK Clustering
NASA Astrophysics Data System (ADS)
Habbi, Ahcène; Zelmat, Mimoun
2008-10-01
This paper discusses a TS fuzzy model identification method for an industrial drum-type boiler plant using the GK fuzzy clustering approach. The fuzzy model is constructed from a set of input-output data that covers a wide operating range of the physical plant. The reference data are generated using a complex first-principle-based mathematical model that describes the key dynamical properties of the boiler-turbine dynamics. The proposed fuzzy model is derived by means of a fuzzy clustering method with particular attention to structure flexibility and model interpretability issues. This may provide the basis for a new way to design model-based control and diagnosis mechanisms for the complex nonlinear plant.
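To illustrate how a Takagi-Sugeno fuzzy model obtained from clustering is evaluated, the Python sketch below blends local linear consequent models with normalized Gaussian rule activations; the rule centres, widths, and consequent coefficients are placeholders, and the GK clustering step itself (which would supply full cluster covariances) is omitted.

import numpy as np

# Placeholder TS fuzzy model with two rules; in practice the centres and widths would come
# from GK clustering of input-output data and the consequents from weighted least squares.
centres = np.array([[0.3, 0.2], [0.8, 0.7]])      # rule centres in the input space
widths = np.array([0.2, 0.25])                    # isotropic widths (GK would give full covariances)
A = np.array([[1.5, -0.4], [0.2, 0.9]])           # local linear consequent gains, one row per rule
b = np.array([0.1, -0.3])                         # consequent offsets

def ts_predict(x):
    mu = np.exp(-np.sum((x - centres) ** 2, axis=1) / (2 * widths ** 2))  # rule activations
    w = mu / mu.sum()                                                      # normalized firing strengths
    return w @ (A @ x + b)                                                 # blend of local linear models

print(ts_predict(np.array([0.5, 0.4])))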
Impact of data source on travel time reliability assessment.
DOT National Transportation Integrated Search
2014-08-01
Travel time reliability measures are becoming an increasingly important input to mobility and congestion management studies. In the case of the Maryland State Highway Administration, reliability measures are key elements in the agency's Annual ...
He, Yan-Lin; Xu, Yuan; Geng, Zhi-Qiang; Zhu, Qun-Xiong
2016-03-01
In this paper, a hybrid robust model based on an improved functional link neural network integrating with partial least square (IFLNN-PLS) is proposed. Firstly, an improved functional link neural network with small norm of expanded weights and high input-output correlation (SNEWHIOC-FLNN) was proposed for enhancing the generalization performance of FLNN. Unlike the traditional FLNN, the expanded variables of the original inputs are not directly used as the inputs in the proposed SNEWHIOC-FLNN model. The original inputs are attached to some small norm of expanded weights. As a result, the correlation coefficient between some of the expanded variables and the outputs is enhanced. The larger the correlation coefficient is, the more relevant the expanded variables tend to be. In the end, the expanded variables with larger correlation coefficient are selected as the inputs to improve the performance of the traditional FLNN. In order to test the proposed SNEWHIOC-FLNN model, three UCI (University of California, Irvine) regression datasets named Housing, Concrete Compressive Strength (CCS), and Yacht Hydro Dynamics (YHD) are selected. Then a hybrid model based on the improved FLNN integrating with partial least square (IFLNN-PLS) was built. In IFLNN-PLS model, the connection weights are calculated using the partial least square method but not the error back propagation algorithm. Lastly, IFLNN-PLS was developed as an intelligent measurement model for accurately predicting the key variables in the Purified Terephthalic Acid (PTA) process and the High Density Polyethylene (HDPE) process. Simulation results illustrated that the IFLNN-PLS could significantly improve the prediction performance. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
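A compact Python sketch of the idea behind the expansion-and-selection step follows: trigonometric functional-link expansion of the inputs, retention of the expanded terms most correlated with the output, and a weight fit; ordinary least squares stands in for the PLS step, and the synthetic data and number of retained terms are assumptions, not the paper's settings.

import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))
y = np.sin(np.pi * X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=200)

# Functional-link expansion: original inputs plus trigonometric expansions of each input.
expanded = [X]
for k in (1, 2):
    expanded += [np.sin(k * np.pi * X), np.cos(k * np.pi * X)]
Z = np.hstack(expanded)

# Keep only the expanded variables most correlated with the output (the selection idea);
# a plain least-squares fit stands in for the PLS weight estimation.
corr = np.array([abs(np.corrcoef(Z[:, j], y)[0, 1]) for j in range(Z.shape[1])])
keep = np.argsort(corr)[-8:]
Zs = np.column_stack([Z[:, keep], np.ones(len(y))])
w, *_ = np.linalg.lstsq(Zs, y, rcond=None)

pred = Zs @ w
print(np.sqrt(np.mean((pred - y) ** 2)))     # training RMSE of the reduced model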
Atmospheric Impacts of a Close Cometary Encounter
NASA Astrophysics Data System (ADS)
Aylett, Tasha; Chipperfield, Martyn; Diego Carrillo Sánchez, Juan; Feng, Wuhu; Forster, Piers; Plane, John
2017-04-01
Although a close encounter with a comet is extremely unlikely, a significant perturbation to the flux of Earth-bound dust from a comet's close passage could have huge implications for both the chemistry of the atmosphere and climate. For example, following the close passage of Comet Halley to Earth in A.D. 536, dark skies, reduced day lengths and a protracted global cooling were reported [1], for which an extraterrestrial disturbance is likely to be at least partly responsible. Indeed, the recent encounter of Comet Siding Spring with Mars provided evidence that the risks posed by such an event are significant [2]. We have run sensitivity simulations using the Whole Atmosphere Community Climate Model (WACCM) with an elevated Meteoric Input Function (MIF) to investigate such an encounter - specifically, Comet Halley in A.D. 536. The simple analytical model developed by Moorhead et al. [3] has been incorporated into an atmospheric chemical ablation model to provide the MIF of several meteoric species (Na, Fe, Si, Mg and S) in the mesosphere and lower thermosphere (70-120 km) for input into WACCM. Key effects of this additional input on the chemistry of the upper atmosphere and the metal layers have been explored in the simulations and effects on mesospheric and stratospheric ozone chemistry have been assessed. In addition to any effects on atmospheric chemistry, WACCM will also be used to provide insight into the impacts of a high dust flux on the Earth's climate. References [1] Stothers, R. B. (1984), Mystery Cloud of Ad-536, Nature, 307(5949), 344-345. [2] Schneider, N. M., et al. (2015), MAVEN IUVS observations of the aftermath of the Comet Siding Spring meteor shower on Mars, Geophys Res Lett, 42(12), 4755-4761. [3] Moorhead, A. V., P. A. Wiegert, and W. J. Cooke (2014), The meteoroid fluence at Mars due to Comet C/2013 A1 (Siding Spring), Icarus, 231, 13-21.
Evans, William D [Cupertino, CA
2009-02-24
A secure content object protects electronic documents from unauthorized use. The secure content object includes an encrypted electronic document, a multi-key encryption table having at least one multi-key component, an encrypted header and a user interface device. The encrypted document is encrypted using a document encryption key associated with a multi-key encryption method. The encrypted header includes an encryption marker formed by a random number followed by a derivable variation of the same random number. The user interface device enables a user to input a user authorization. The user authorization is combined with each of the multi-key components in the multi-key encryption key table and used to try to decrypt the encrypted header. If the encryption marker is successfully decrypted, the electronic document may be decrypted. Multiple electronic documents or a document and annotations may be protected by the secure content object.
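A simplified Python sketch of the header-marker idea follows; it assumes the "derivable variation" is a SHA-256 digest of the random number, derives candidate keys by hashing the user authorization with each multi-key component, and uses a toy SHA-256 counter keystream in place of a real cipher, so it illustrates the unlock check rather than the patented scheme.

import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher (SHA-256 counter keystream) standing in for the real cipher.
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        out += bytes(a ^ b for a, b in zip(data[i:i + 32], block))
    return bytes(out)

def make_header(header_key: bytes) -> bytes:
    r = secrets.token_bytes(16)
    marker = r + hashlib.sha256(r).digest()[:16]     # random number + derivable variation of it
    return keystream_xor(header_key, marker)

def try_unlock(encrypted_header: bytes, user_auth: bytes, multi_key_components: list) -> bool:
    for component in multi_key_components:
        candidate = hashlib.sha256(user_auth + component).digest()   # combine auth with table entry
        plain = keystream_xor(candidate, encrypted_header)
        r, tag = plain[:16], plain[16:]
        if hashlib.sha256(r).digest()[:16] == tag:   # marker checks out -> header decrypted
            return True
    return False

components = [secrets.token_bytes(8) for _ in range(3)]
auth = b"correct horse battery staple"
header = make_header(hashlib.sha256(auth + components[1]).digest())
print(try_unlock(header, auth, components), try_unlock(header, b"wrong", components))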
System and circuitry to provide stable transconductance for biasing
NASA Technical Reports Server (NTRS)
Garverick, Steven L. (Inventor); Yu, Xinyu (Inventor)
2012-01-01
An amplifier system can include an input amplifier configured to receive an analog input signal and provide an amplified signal corresponding to the analog input signal. A tracking loop is configured to employ delta modulation for tracking the amplified signal, the tracking loop providing a corresponding output signal. A biasing circuit is configured to adjust a bias current to maintain stable transconductance over temperature variations, the biasing circuit providing at least one bias signal for biasing at least one of the input amplifier and the tracking loop, whereby the circuitry receiving the at least one bias signal exhibits stable performance over the temperature variations. In another embodiment the biasing circuit can be utilized in other applications.
Hendrickson, Phillip J; Yu, Gene J; Song, Dong; Berger, Theodore W
2016-01-01
This paper describes a million-plus granule cell compartmental model of the rat hippocampal dentate gyrus, including excitatory, perforant path input from the entorhinal cortex, and feedforward and feedback inhibitory input from dentate interneurons. The model includes experimentally determined morphological and biophysical properties of granule cells, together with glutamatergic AMPA-like EPSP and GABAergic GABAA-like IPSP synaptic excitatory and inhibitory inputs, respectively. Each granule cell was composed of approximately 200 compartments having passive and active conductances distributed throughout the somatic and dendritic regions. Modeling excitatory input from the entorhinal cortex was guided by axonal transport studies documenting the topographical organization of projections from subregions of the medial and lateral entorhinal cortex, plus other important details of the distribution of glutamatergic inputs to the dentate gyrus. Information contained within previously published maps of this major hippocampal afferent were systematically converted to scales that allowed the topographical distribution and relative synaptic densities of perforant path inputs to be quantitatively estimated for inclusion in the current model. Results showed that when medial and lateral entorhinal cortical neurons maintained Poisson random firing, dentate granule cells expressed, throughout the million-cell network, a robust nonrandom pattern of spiking best described as a spatiotemporal "clustering." To identify the network property or properties responsible for generating such firing "clusters," we progressively eliminated from the model key mechanisms, such as feedforward and feedback inhibition, intrinsic membrane properties underlying rhythmic burst firing, and/or topographical organization of entorhinal afferents. Findings conclusively identified topographical organization of inputs as the key element responsible for generating a spatiotemporal distribution of clustered firing. These results uncover a functional organization of perforant path afferents to the dentate gyrus not previously recognized: topography-dependent clusters of granule cell activity as "functional units" or "channels" that organize the processing of entorhinal signals. This modeling study also reveals for the first time how a global signal processing feature of a neural network can evolve from one of its underlying structural characteristics.
NASA Technical Reports Server (NTRS)
Black, Jr., William C. (Inventor); Hermann, Theodore M. (Inventor)
1998-01-01
A current determiner having an output at which representations of input currents are provided having an input conductor for the input current and a current sensor supported on a substrate electrically isolated from one another but with the sensor positioned in the magnetic fields arising about the input conductor due to any input currents. The sensor extends along the substrate in a direction primarily perpendicular to the extent of the input conductor and is formed of at least a pair of thin-film ferromagnetic layers separated by a non-magnetic conductive layer. The sensor can be electrically connected to electronic circuitry formed in the substrate, including a nonlinearity adaptation circuit, to provide representations of the input currents of increased accuracy despite nonlinearities in the current sensor, and can include further current sensors in bridge circuits.
Inverter ratio failure detector
NASA Technical Reports Server (NTRS)
Wagner, A. P.; Ebersole, T. J.; Andrews, R. E. (Inventor)
1974-01-01
A failure detector which detects the failure of a dc to ac inverter is disclosed. The inverter under failureless conditions is characterized by a known linear relationship of its input and output voltages and by a known linear relationship of its input and output currents. The detector includes circuitry which is responsive to the detector's input and output voltages and which provides a failure-indicating signal only when the monitored output voltage is less, by a selected factor, than the expected output voltage for the monitored input voltage, based on the known voltage relationship. Similarly, the detector includes circuitry which is responsive to the input and output currents and provides a failure-indicating signal only when the input current exceeds by a selected factor the expected input current for the monitored output current, based on the known current relationship.
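A numeric illustration of the two detection rules follows; the voltage gain, current ratio, and tolerance factors in this Python sketch are arbitrary placeholders rather than values from the disclosure.

# Illustrative check of the two failure rules described above.
VOLTAGE_GAIN = 4.0      # expected V_out / V_in under failureless conditions (assumed)
CURRENT_RATIO = 0.30    # expected I_in / I_out under failureless conditions (assumed)
VOLT_FACTOR = 0.8       # flag if output voltage falls below 80% of expectation
CURR_FACTOR = 1.25      # flag if input current exceeds 125% of expectation

def inverter_failed(v_in, v_out, i_in, i_out):
    volt_fail = v_out < VOLT_FACTOR * (VOLTAGE_GAIN * v_in)
    curr_fail = i_in > CURR_FACTOR * (CURRENT_RATIO * i_out)
    return volt_fail or curr_fail

print(inverter_failed(28.0, 110.0, 3.0, 10.0))   # healthy: ~112 V and 3 A expected -> False
print(inverter_failed(28.0, 70.0, 3.0, 10.0))    # low output voltage -> failure flagged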
Dual Brushless Resolver Rate Sensor
NASA Technical Reports Server (NTRS)
Howard, David E. (Inventor)
1997-01-01
A resolver rate sensor is disclosed in which dual brushless resolvers are mechanically coupled to the same output shaft. Diverse inputs are provided to each resolver by providing the first resolver with a DC input and the second resolver with an AC sinusoidal input. A trigonometric identity, in which the sum of the squares of the sine and cosine components equals one, is used to advantage in providing a sensor of increased accuracy. The first resolver may have a fixed or variable DC input to permit dynamic adjustment of resolver sensitivity, thus permitting a wide range of coverage. In one embodiment of the invention the outputs of the first resolver are directly inputted into two separate multipliers and the outputs of the second resolver are inputted into the two separate multipliers after being demodulated in a pair of demodulator circuits. The multiplied signals are then added in an adder circuit to provide a direction-sensitive output. In another embodiment the outputs from the first resolver are modulated in separate modulator circuits and the outputs from the modulator circuits are used to excite the second resolver. The outputs from the second resolver are demodulated in separate demodulator circuits and added in an adder circuit to provide a direction-sensitive rate output.
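One way the sine/cosine identity can be exploited at the signal level is sketched below in Python: the DC-excited resolver contributes channels proportional to the time derivatives of sin(theta) and cos(theta), the demodulated AC-excited resolver contributes sin(theta) and cos(theta), and the combination cos*d(sin)/dt - sin*d(cos)/dt equals (sin^2 + cos^2) times the signed shaft rate; the waveforms are synthetic and the exact channel arrangement of the patented embodiments may differ.

import numpy as np

t = np.linspace(0, 1.0, 2001)
theta = 2 * np.pi * 1.5 * t + 0.8 * np.sin(2 * np.pi * 3 * t)   # arbitrary shaft motion

sin_th, cos_th = np.sin(theta), np.cos(theta)                    # demodulated AC-resolver channels
d_sin = np.gradient(sin_th, t)                                   # DC-excited resolver channels (rate-coupled)
d_cos = np.gradient(cos_th, t)

rate = cos_th * d_sin - sin_th * d_cos                           # direction-sensitive rate estimate
true_rate = np.gradient(theta, t)
print(np.max(np.abs(rate - true_rate)))                          # small numerical error only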
Subterranean Groundwater Nutrient Input to Coastal Oceans and Coral Reef Sustainability
NASA Astrophysics Data System (ADS)
Paytan, A.; Street, J. H.
2003-12-01
Coral reefs are often referred to as the tropical rain forests of the oceans because of their high productivity and biodiversity. Recent observations in coral reefs worldwide have shown clear degradation in water quality and coral reef health and diversity. The implications of this are severe, including tremendous economic losses, mostly through fishing and tourism. Nutrient loading has been implicated as one possible cause for the ecosystem decline. A previously unappreciated potential source of nutrient loading is submarine ground water discharge (SGW). Ground water in many cases has high nutrient content from sewage pollution and fertilizer application for agriculture and landscaping. To better understand the effect of this potential source of nutrient input and degrading water quality, we are exploring the contribution of SGW to the nutrient levels in coral reefs. A key to this approach is determining the amount and source of SGW that flows into the coast as well as its nutrient concentrations. The SGW flux and associated input of chemical dissolved load (nutrients, DOC, trace elements and other contaminants) is quantified using naturally occurring Ra isotopes. Radium isotopes have been shown to be excellent tracers for SGW inputs into estuaries and coastal areas (Moore, 1996; Hussain et al., 1999; Kerst et al., 2000). Measurements of Ra activity within the coral reef, the lagoons and the open waters adjacent to the reef provide valuable information regarding the input of Ra as well as nutrients and possibly pollutants from groundwater discharge. Through this analysis the effect of SGW on the delicate carbon and nutrient balance of the fragile coral reef ecosystem can be evaluated. In addition to quantifying the contribution of freshwater to the nutrient mass balance in the reef, information regarding the length of time a water parcel has remained in the near-shore region over the reef can be estimated using the Ra isotope quartet.
DOT National Transportation Integrated Search
2017-09-01
The mechanistic-empirical pavement design method requires the elastic resilient modulus as the key input for characterization of geomaterials. Current density-based QA procedures do not measure resilient modulus. Additionally, the density-based metho...
Managing the travel model process : small and medium-sized MPOs. Instructor guide.
DOT National Transportation Integrated Search
2013-09-01
The learning objectives of this course were to: explain fundamental travel model concepts; describe the model development process; identify key inputs and describe the quality control process; and identify and manage resources.
Managing the travel model process : small and medium-sized MPOs. Participant handbook.
DOT National Transportation Integrated Search
2013-09-01
The learning objectives of this course were to: explain fundamental travel model concepts; describe the model development process; identify key inputs and describe the quality control process; and identify and manage resources.
Methodology update for estimating volume to service flow ratio.
DOT National Transportation Integrated Search
2015-12-01
Volume/service flow ratio (VSF) is calculated by the Highway Performance Monitoring System (HPMS) software as an indicator of peak hour congestion. It is an essential input to the Kentucky Transportation Cabinet's (KYTC) key planning applications, ...
Using a Tablet PC in the German Classroom to Enliven Teacher Input
ERIC Educational Resources Information Center
Van Orden, Stephen
2006-01-01
Providing students with lively, authentic comprehensible input is one of the most important tasks of introductory German teachers. Using a Tablet PC can enable teachers to improve the quality of the comprehensible input they provide their students. This article describes how integrating a Tablet PC into daily teaching processes allows classroom…
Ferroelectric Based High Power Components for L-Band Accelerator Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kanareykin, Alex; Jing, Chunguang; Kostin, Roman
2018-01-16
We are developing a new electronic device to control the power in particle accelerators. The key technology is a new nanostructured material developed by Euclid that changes its properties with an applied electric field. Both superconducting and conventional accelerating structures require fast electronic control of the input rf power. A fast controllable phase shifter would allow, for example, the control of the rf power delivered to multiple accelerating cavities from a single power amplifier. Nonlinear ferroelectric microwave components can control the tuning or the input power coupling for rf cavities. Applying a bias voltage across a nonlinear ferroelectric changes its permittivity. This effect can be used to cause a phase change of a propagating rf signal or change the resonant frequency of a cavity. The key is the development of a low-loss, highly tunable ferroelectric material.
A note on scrap in the 1992 U.S. input-output tables
Swisko, George M.
2000-01-01
A key concern of industrial ecology and life cycle analysis is the disposal and recycling of scrap. One might conclude that the U.S. input-output tables are appropriate tools for analyzing scrap flows. Duchin, for instance, has suggested using input-output analysis for industrial ecology, indicating that input-output economics can trace the stocks and flows of energy and other materials from extraction through production and consumption to recycling or disposal. Lave and others use input-output tables to design life cycle assessment models for studying product design, materials use, and recycling strategies, even with the knowledge that these tables suffer from a lack of comprehensive and detailed data that may never be resolved. Although input-output tables can offer general guidance about the interdependence of economic and environmental processes, data reporting by industry and the economic concepts underlying these tables pose problems for rigorous material flow examinations. This is especially true for analyzing the output of scrap and scrap flows in the United States and estimating the amount of scrap that can be recycled. To show how data reporting has affected the values of scrap in recent input-output tables, this paper focuses on metal scrap generated in manufacturing. The paper also briefly discusses scrap that is not included in the input-output tables and some economic concepts that limit the analysis of scrap flows.
Vu, Cecilia; Rothman, Emily; Kistin, Caroline J; Barton, Kelly; Bulman, Barb; Budzak-Garza, Ann; Olson-Dorff, Denyse; Bair-Merritt, Megan H
The patient-centered medical home (PCMH) seeks to improve population health. However, PCMH models often focus on improving treatment of chronic diseases rather than on addressing psychosocial adversity. We sought to gather key stakeholder input about how PCMHs might feasibly and sustainably address psychosocial adversity within their patient populations. We conducted 25 semistructured interviews with key stakeholders, such as physicians, nurses, medical assistants, and patients. The audiorecorded interviews focused on participants' perceptions of the best ways to modify the PCMH to address patients' psychosocial adversity. To facilitate information gathering, a fictional patient case was presented. Analyses were conducted using a 3-stage content-analysis process. Participants identified provider-related and systems-level changes necessary for addressing these psychosocial adversities effectively. On the provider level, participants thought that practitioners should foster trusting relationships with patients and should be emotionally present as patients describe their life experiences. Participants also emphasized that providers need to have sensitive conversations about adversity and resilience. On a systems level, participants discussed that documentation must balance privacy and include relevant information in the medical record. In addition, care should be delivered not by a single provider but by a team that has a longitudinal relationship with the patient; this care team should include behavioral health support. Participants provided practical strategies and highlighted provider and systems level changes to adequately address patients' prior psychosocial adversity. Future studies need to assess the degree to which such a trauma-informed approach improves patient access, outcomes, and care quality, and reduces cost. Copyright © 2017 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Dodson, D. W.; Shields, N. L., Jr.
1978-01-01
The Experiment Computer Operating System (ECOS) of the Spacelab will allow the onboard Payload Specialist to command experiment devices and display information relative to the performance of experiments. Three candidate ECOS command and control service concepts were reviewed and laboratory data on operator performance was taken for each concept. The command and control service concepts evaluated included a dedicated operator's menu display from which all command inputs were issued, a dedicated command key concept with which command inputs could be issued from any display, and a multi-display concept in which command inputs were issued from several dedicated function displays. Advantages and disadvantages are discussed in terms of training, operational errors, task performance time, and subjective comments of system operators.
High input impedance amplifier
NASA Technical Reports Server (NTRS)
Kleinberg, Leonard L.
1995-01-01
High input impedance amplifiers are provided which reduce the input impedance solely to a capacitive reactance, or, in a somewhat more complex design, provide an extremely high, essentially infinite, capacitive reactance. In one embodiment, where the input impedance is reduced, in essence, to solely a capacitive reactance, an operational amplifier in a follower configuration is driven at its non-inverting input and a resistor of a predetermined magnitude is connected between the inverting and non-inverting inputs. A second embodiment eliminates the capacitance from the input by adding a second stage to the first embodiment. The second stage is a second operational amplifier in a non-inverting gain-stage configuration: the output of the first follower stage drives the non-inverting input of the second stage, and the output of the second stage is fed back to the non-inverting input of the first stage through a capacitor of a predetermined magnitude. These amplifiers, while generally useful, are particularly useful as sensor buffer amplifiers that may eliminate significant sources of error.
Sprague, Lori A.; Gronberg, Jo Ann M.
2013-01-01
Anthropogenic inputs of nitrogen and phosphorus to each county in the conterminous United States and to the watersheds of 495 surface-water sites studied as part of the U.S. Geological Survey National Water-Quality Assessment Program were quantified for the years 1992, 1997, and 2002. Estimates of inputs of nitrogen and phosphorus from biological fixation by crops (for nitrogen only), human consumption, crop production for human consumption, animal production for human consumption, animal consumption, and crop production for animal consumption for each county are provided in a tabular dataset. These county-level estimates were allocated to the watersheds of the surface-water sites to estimate watershed-level inputs from the same sources; these estimates also are provided in a tabular dataset, together with calculated estimates of net import of food and net import of feed and previously published estimates of inputs from atmospheric deposition, fertilizer, and recoverable manure. The previously published inputs are provided for each watershed so that final estimates of total anthropogenic nutrient inputs could be calculated. Estimates of total anthropogenic inputs are presented together with previously published estimates of riverine loads of total nitrogen and total phosphorus for reference.
Nutrient and phytoplankton analysis of a Mediterranean coastal area.
Sebastiá, M T; Rodilla, M
2013-01-01
Identifying and quantifying the key anthropogenic nutrient input sources are essential to adopting management measures that can target input for maximum effect in controlling the phytoplankton biomass. In this study, three systems characterized by distinctive main nutrient sources were sampled along a Mediterranean coast transect. These sources were groundwater discharge in the Ahuir area, the Serpis river discharge in the Venecia area, and a submarine wastewater outfall 1,900 m from the coast. The study area includes factors considered important in designating a coastal area as a sensitive area: it has significant nutrient sources, tourism is a major source of income in the region, and it includes an area of high water residence time (the Venecia area) which is affected by the harbor facilities and by wastewater discharges. We found that in the Ahuir and the submarine wastewater outfall areas, the effects of freshwater inputs were reduced because of a greater water exchange with the oligotrophic Mediterranean waters. On the other hand, in the Venecia area, the highest levels of nutrient concentration and phytoplankton biomass were attributed to the greatest water residence time. In this enclosed area, harmful dinoflagellates were detected (Alexandrium sp. and Dinophysis caudata). If the planned enlargement of the Gandia Harbor proceeds, it may increase the vulnerability of this system and provide the confined conditions needed for dinoflagellate blooms to develop. Management measures should first target phosphorus inputs, as phosphorus is the nutrient with the greatest limiting potential in the Venecia area and comes from a point source that is easier to control. Finally, we recommend that harbor environmental management plans include regular monitoring of water quality in adjacent waters to identify adverse phytoplankton community changes.
The Origin of DIRT (Detrital Input and Removal Treatments): the Legacy of Dr. Francis D. Hole
NASA Astrophysics Data System (ADS)
Townsend, K. L.; Lajtha, K.; Caldwell, B.; Sollins, P.
2007-12-01
Soil organic matter (SOM) plays a key role in the cycling and retention of nitrogen and carbon within soil. Both above- and belowground detrital inputs determine the nature and quantity of SOM. Studies on detrital impacts on SOM dynamics are underway at several LTER, ILTER and LTER-affiliated sites using a common experimental design, Detrital Input and Removal Treatments (DIRT). The concept for DIRT was originally based on experimental plots established at the University of Wisconsin Arboretum by Dr. Francis D. Hole in 1956 to study the effects of detrital inputs on pedogenesis. These plots are located on two forested sites and two prairie sites within the arboretum. Manipulations of the forested sites include double litter, no litter, and removal of the O and A horizons. Manipulations of the prairie sites include harvest, mulch, bare, and burn. These original treatments have largely been maintained since 1956. After 40 years of maintenance, there were significant differences in soil carbon between the double litter and no litter plots: soil carbon in the double litter plots had increased by nearly 30%, while in the no litter plots it had decreased by over 50%. The original DIRT plots are now 50 years old and have been re-sampled, where possible, for total carbon and nitrogen, labile and recalcitrant carbon fractions, net and gross nitrogen mineralization rates, and SOM bioavailability through CO2 respiration. The soils were fractionated by density to examine the role of carbon in each density fraction. The mean age of carbon in each fraction was determined by radiocarbon dating. This sampling and analysis is of special significance because it provides a glimpse into the future SOM trajectories for the new DIRT sites: Harvard Forest (MA), Bousson (PA), Andrews Experimental Forest (OR) and Sikfokut (Hungary).
Process-based Cost Estimation for Ramjet/Scramjet Engines
NASA Technical Reports Server (NTRS)
Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John
2003-01-01
Process-based cost estimation plays a key role in effecting cultural change that integrates distributed science, technology and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation bridging the methodologies of high-level parametric models and detailed bottom-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As product design progresses and matures, the lower-level, more detailed cost drivers can be re-accessed and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
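The abstract names the analytic hierarchy process (AHP) as one of the decision-analysis techniques built into the cost model. As an illustration only (the NASA GRC/Boeing implementation is not described here), a minimal sketch of the standard AHP weighting step, with a hypothetical pairwise comparison matrix over cost-risk, performance, and schedule criteria:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a pairwise comparison matrix via its principal
    eigenvector, plus Saaty's consistency index (CI)."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                       # normalize weights to sum to 1
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)  # 0 for a perfectly consistent matrix
    return w, ci

# Hypothetical criteria ordering: cost-risk, performance, schedule.
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
weights, ci = ahp_weights(A)
print(weights, ci)
```

The resulting weights could then multiply scores for each design alternative; the comparison values above are placeholders, not figures from the paper.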
Byrd, Kristin B.; Windham-Myers, Lisamarie; Leeuw, Thomas; Downing, Bryan D.; Morris, James T.; Ferner, Matthew C.
2016-01-01
Reducing uncertainty in data inputs at relevant spatial scales can improve tidal marsh forecasting models, and their usefulness in coastal climate change adaptation decisions. The Marsh Equilibrium Model (MEM), a one-dimensional mechanistic elevation model, incorporates feedbacks of organic and inorganic inputs to project elevations under sea-level rise scenarios. We tested the feasibility of deriving two key MEM inputs—average annual suspended sediment concentration (SSC) and aboveground peak biomass—from remote sensing data in order to apply MEM across a broader geographic region. We analyzed the precision and representativeness (spatial distribution) of these remote sensing inputs to improve understanding of our study region, a brackish tidal marsh in San Francisco Bay, and to test the applicable spatial extent for coastal modeling. We compared biomass and SSC models derived from Landsat 8, DigitalGlobe WorldView-2, and hyperspectral airborne imagery. Landsat 8-derived inputs were evaluated in a MEM sensitivity analysis. Biomass models were comparable although peak biomass from Landsat 8 best matched field-measured values. The Portable Remote Imaging Spectrometer SSC model was most accurate, although a Landsat 8 time series provided annual average SSC estimates. Landsat 8-measured peak biomass values were randomly distributed, and annual average SSC (30 mg/L) was well represented in the main channels (IQR: 29–32 mg/L), illustrating the suitability of these inputs across the model domain. Trend response surface analysis identified significant diversion between field and remote sensing-based model runs at 60 yr due to model sensitivity at the marsh edge (80–140 cm NAVD88), although at 100 yr, elevation forecasts differed less than 10 cm across 97% of the marsh surface (150–200 cm NAVD88). Results demonstrate the utility of Landsat 8 for landscape-scale tidal marsh elevation projections due to its comparable performance with the other sensors, temporal frequency, and cost. Integration of remote sensing data with MEM should advance regional projections of marsh vegetation change by better parameterizing MEM inputs spatially. Improving information for coastal modeling will support planning for ecosystem services, including habitat, carbon storage, and flood protection.
NASA Technical Reports Server (NTRS)
Jones, Denise R.
1990-01-01
A piloted simulation study was conducted comparing three different input methods for interfacing to a large-screen, multiwindow, whole-flight-deck display for management of transport aircraft systems. The thumball concept utilized a miniature trackball embedded in a conventional side-arm controller. The touch screen concept provided data entry through a capacitive touch screen. The voice concept utilized a speech recognition system with input through a head-worn microphone. No single input concept emerged as the most desirable method of interacting with the display. Subjective results, however, indicate that the voice concept was the most preferred method of data entry and had the most potential for future applications. The objective results indicate that, overall, the touch screen concept was the most effective input method. There were also significant differences in the time required to perform specific tasks depending on the input concept employed, with each concept providing better performance relative to a specific task. These results suggest that a system combining all three input concepts might provide the most effective method of interaction.
El Bcheraoui, Charbel; Palmisano, Erin B; Dansereau, Emily; Schaefer, Alexandra; Woldeab, Alexander; Moradi-Lakeh, Maziar; Salvatierra, Benito; Hernandez-Prado, Bernardo; Mokdad, Ali H
2017-01-01
The Salud Mesoamérica Initiative (SMI) is a three-operation strategy, and is a pioneer in the world of results-based aid (RBA) in terms of the success it has achieved in improving health system inputs following its initial operation. This success in meeting pre-defined targets is rare in the world of financial assistance for health. We investigated the influential aspects of SMI that could have contributed to its effectiveness in improving health systems, with the aim of providing international donors, bilateral organizations, philanthropies, and recipient countries with new perspectives that can help increase the effectiveness of future assistance for health, specifically in the arena of RBA. We used qualitative methods based on the criteria of relevance and effectiveness proposed by the Development Assistance Committee of the Organization for Economic Co-operation and Development. Our methods included document review, key informant interviews, a focus group discussion, and a partnership analysis. We drew a purposive sample of 113 key informants, comprising donors, representatives from the Inter-American Development Bank, ministries of health, technical assistance organizations, evaluation organizations, and health care providers. During May-October 2016, we interviewed these informants regarding the relevance and effectiveness of SMI. Themes emerged relative to the topics we investigated, and covered the design and the drivers of success of the initiative. The success is due to 1) the initiative's regional approach, which pressured recipient countries to compete toward meeting targets, 2) a robust and flexible design that incorporated the richness of input from stakeholders at all levels, 3) the design-embedded evaluation component that created a culture of accountability among recipient countries, and 4) the reflective knowledge environment that created a culture of evidence-based decision-making. A regional approach involving all appropriate stakeholders, based on knowledge sharing and embedded evaluation, can help ensure the effectiveness of future results-based aid programs for health in global settings.
Socio-economic exposure to natural disasters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marin, Giovanni, E-mail: giovanni.marin@uniurb.it; IRCrES - CNR, Research Institute on Sustainable Economic Growth, Via Corti 12, 20133 - Milano; SEEDS, Ferrara
Even though the correct assessment of risks is a key aspect of risk management analysis, we argue that limited effort has been devoted to the assessment of comprehensive measures of economic exposure at very low scale. For this reason, we aim to provide a series of suitable methodologies that yield a complete and detailed list of the exposure of economic activities to natural disasters. We use Input-Output models to provide information about several socio-economic variables, such as population density, employment density, firms' turnover and capital stock, that can be seen as direct and indirect socio-economic exposure to natural disasters. We then provide an application to the Italian context. These measures can be easily incorporated into risk assessment models to provide a clear picture of the disaster risk for local areas. - Highlights: • Ex ante assessment of economic exposure to disasters at very low geographical scale • Assessment of the cost of natural disasters in an ex-post perspective • IO model and spatial autocorrelation to get information on socio-economic variables • Indicators supporting risk assessment and risk management models.
Light adaptation alters the source of inhibition to the mouse retinal OFF pathway
Mazade, Reece E.
2013-01-01
Sensory systems must avoid saturation to encode a wide range of stimulus intensities. One way the retina accomplishes this is by using both dim-light-sensing rod and bright-light-sensing cone photoreceptor circuits. OFF cone bipolar cells are a key point in this process, as they receive both excitatory input from cones and inhibitory input from AII amacrine cells via the rod pathway. However, in addition to AII amacrine cell input, other inhibitory inputs from cone pathways also modulate OFF cone bipolar cell light signals. It is unknown how these inhibitory inputs to OFF cone bipolar cells change when switching between rod and cone pathways or whether all OFF cone bipolar cells receive rod pathway input. We found that one group of OFF cone bipolar cells (types 1, 2, and 4) receive rod-mediated inhibitory inputs that likely come from the rod-AII amacrine cell pathway, while another group of OFF cone bipolar cells (type 3) do not. In both cases, dark-adapted rod-dominant light responses showed a significant contribution of glycinergic inhibition, which decreased with light adaptation and was, surprisingly, compensated by an increase in GABAergic inhibition. As GABAergic input has distinct timing and spatial spread from glycinergic input, a shift from glycinergic to GABAergic inhibition could significantly alter OFF cone bipolar cell signaling to downstream OFF ganglion cells. Larger GABAergic input could reflect an adjustment of OFF bipolar cell spatial inhibition, which may be one mechanism that contributes to retinal spatial sensitivity in the light. PMID:23926034
Step-control of electromechanical systems
Lewis, Robert N.
1979-01-01
The response of an automatic control system to a general input signal is improved by applying a test input signal, observing the response to the test input signal and determining correctional constants necessary to provide a modified input signal to be added to the input to the system. A method is disclosed for determining correctional constants. The modified input signal, when applied in conjunction with an operating signal, provides a total system output exhibiting an improved response. This method is applicable to open-loop or closed-loop control systems. The method is also applicable to unstable systems, thus allowing controlled shut-down before dangerous or destructive response is achieved and to systems whose characteristics vary with time, thus resulting in improved adaptive systems.
[A Terahertz Spectral Database Based on Browser/Server Technique].
Zhang, Zhuo-yong; Song, Yue
2015-09-01
With the solution of key scientific and technical problems and the development of instrumentation, the application of terahertz technology in various fields has received more and more attention. Owing to its unique advantages, terahertz technology has shown a broad future in the fields of fast, non-damaging detection, as well as many other fields. Terahertz technology combined with other complementary methods can be used to cope with many difficult practical problems which could not be solved before. One of the critical points for further development of practical terahertz detection methods depends on a good and reliable terahertz spectral database. We recently developed a BS (browser/server)-based terahertz spectral database. We designed the main structure and main functions to fulfill practical requirements. The terahertz spectral database now includes more than 240 items, and the spectral information was collected from three sources: (1) collection and citation from other terahertz spectral databases abroad; (2) collection from published literature; and (3) spectral data measured in our laboratory. The present paper introduces the basic structure and fundamental functions of the terahertz spectral database developed in our laboratory. One of the key functions of this THz database is the calculation of optical parameters. Some optical parameters, including the absorption coefficient, refractive index, etc., can be calculated from the input THz time-domain spectra. The other main functions and searching methods of the browser/server-based terahertz spectral database are also discussed. The database search system provides users with convenient functions including user registration, inquiry, display of spectral figures and molecular structures, spectral matching, etc. The THz database system provides an on-line searching function for registered users. Registered users can compare an input THz spectrum with the spectra in the database; based on the resulting correlation coefficients, the search can be performed quickly and conveniently. Our terahertz spectral database can be accessed at http://www.teralibrary.com. The proposed terahertz spectral database is based on spectral information so far and will be improved in the future. We hope this terahertz spectral database can provide users with powerful, convenient, and highly efficient functions, and could promote broader applications of terahertz technology.
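The abstract notes that the database can derive optical parameters from input THz time-domain spectra. As an illustration only (the database's own algorithm is not described here), a minimal sketch of the standard THz-TDS extraction of refractive index and absorption coefficient from a reference and a sample waveform; the function name and the single-pass thin-slab transmission model are assumptions:

```python
import numpy as np

def thz_optical_params(t, e_ref, e_sam, d):
    """Estimate refractive index n(f) and absorption coefficient alpha(f) (1/m)
    from reference and sample THz time-domain waveforms.
    t is in seconds, d is the sample thickness in metres."""
    c = 2.998e8
    freq = np.fft.rfftfreq(len(t), t[1] - t[0])       # Hz
    H = np.fft.rfft(e_sam) / np.fft.rfft(e_ref)       # complex transfer function
    phase = np.unwrap(np.angle(H))                    # sign depends on FFT convention
    omega = 2.0 * np.pi * freq
    with np.errstate(divide="ignore", invalid="ignore"):
        n = 1.0 + c * np.abs(phase) / (omega * d)     # refractive index (DC bin is undefined)
        alpha = -(2.0 / d) * np.log(np.abs(H) * (n + 1.0) ** 2 / (4.0 * n))
    return freq, n, alpha
```

A real implementation would match the phase sign and Fresnel factor to the instrument geometry and handle echoes and low signal-to-noise bands.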
Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool
NASA Astrophysics Data System (ADS)
Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.
2018-06-01
Air quality has significantly improved in Europe over the past few decades. Nonetheless, we still find high measured concentrations, mainly in specific regions or cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement existing EU-wide policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support of regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification, and may highlight unexpected relationships between inputs and outputs. Thus, a policy-steered SHERPA module - predicting air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA) to quantify uncertainty in the model output, and (2) sensitivity analysis (SA) to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.
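By way of illustration (not the SHERPA implementation itself), a minimal sketch of how a Monte Carlo uncertainty analysis and a simple rank-correlation-based sensitivity ranking can be wired together; the toy model, parameter names, and distributions are all hypothetical:

```python
import numpy as np

def uncertainty_and_sensitivity(model, dists, n=10000, seed=0):
    """Monte Carlo UA/SA sketch: sample uncertain inputs, propagate them through
    `model`, and rank inputs by squared rank correlation with the output."""
    rng = np.random.default_rng(seed)
    names = list(dists)
    X = np.column_stack([dists[k](rng, n) for k in names])
    Y = np.array([model(dict(zip(names, row))) for row in X])
    ua = {"mean": Y.mean(), "std": Y.std(),
          "p05": np.percentile(Y, 5), "p95": np.percentile(Y, 95)}
    rank = lambda v: np.argsort(np.argsort(v))      # rank transform (monotonic effects)
    sa = {k: np.corrcoef(rank(X[:, i]), rank(Y))[0, 1] ** 2
          for i, k in enumerate(names)}
    return ua, sa

# Hypothetical concentration-response "model" and input distributions.
toy = lambda p: p["emission"] * p["transfer"] + p["background"]
ua, sa = uncertainty_and_sensitivity(
    toy,
    {"emission":   lambda rng, n: rng.normal(100, 20, n),
     "transfer":   lambda rng, n: rng.uniform(0.5, 1.5, n),
     "background": lambda rng, n: rng.normal(10, 2, n)})
print(ua, sa)
```

Variance-based indices (e.g., Sobol) would be the more rigorous SA choice; the rank-correlation proxy above is only meant to show the workflow.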
The Kinetics of Oxygen Atom Recombination in the Presence of Carbon Dioxide
NASA Astrophysics Data System (ADS)
Jamieson, C. S.; Garcia, R. M.; Pejakovic, D.; Kalogerakis, K.
2009-12-01
Understanding processes involving atomic oxygen is crucial for the study and modeling of composition, energy transfer, airglow, and transport dynamics in planetary atmospheres. Significant gaps and uncertainties exist in the understanding of these processes and often the relevant input from laboratory measurements is missing or outdated. We are conducting laboratory experiments to measure the rate coefficient for O + O + CO2 recombination and investigating the O2 excited states produced following the recombination. These measurements will provide key input for a quantitative understanding and reliable modeling of the atmospheres of the CO2 planets and their airglow. An excimer laser providing pulsed output at either 193 nm or 248 nm is employed to produce O atoms by dissociating carbon dioxide, nitrous oxide, or ozone. In an ambient-pressure background of CO2, O atoms recombine in a time scale of a few milliseconds. Detection of laser-induced fluorescence at 845 nm following two-photon excitation near 226 nm monitors the decay of the oxygen atom population. From the temporal evolution of the signal the recombination rate coefficient is extracted. Fluorescence spectroscopy is used to detect the products of O-atom recombination and subsequent relaxation in CO2. This work is supported by the US National Science Foundation’s (NSF) Planetary Astronomy Program. Rosanne Garcia’s participation was funded by the NSF Research Experiences for Undergraduates (REU) Program.
The CAWMSET Report: A Framework for Change
NASA Astrophysics Data System (ADS)
Budil, Kimberly S.
2001-04-01
In October 1998 the Commission on the Advancement of Women and Minorities in Science, Engineering and Technology Development (CAWMSET) was established by Congress through legislation developed and sponsored by Congresswoman Constance A. Morella (R-MD). The CAWMSET became a focal point for a grass-roots organization of women at the Lawrence Livermore National Laboratory (LLNL) and Sandia National Laboratories (SNL), California, working in collaboration with the Society of Women Engineers to improve the environment in our workplaces. With the encouragement of our Congresswoman, Ellen Tauscher (D-CA), we embarked on an effort to provide input to the Commission regarding the recruitment, advancement, and retention of women in the technical workforce, since the input the Commission had received was primarily focused on the educational pipeline. The release of the CAWMSET's final report this summer provided a framework to begin to work toward the overarching goal of an inclusive, supportive, and diverse scientific community and to help us devise strategies for our home organizations that will allow us to achieve this in the near future. The Commission's final recommendation was to create a follow-on organization to carry its work forward. Professional organizations like the American Physical Society can play a key role in helping to ensure that the CAWMSET report is acted upon, not filed and forgotten. I will discuss the findings of the CAWMSET as well as past and ongoing activities at LLNL and SNL in support of this effort.
Siano, Gabriel G; Montemurro, Milagros; Alcaráz, Mirta R; Goicoechea, Héctor C
2017-10-17
Higher-order data generation implies some automation challenges, which are mainly related to the hidden programming languages and electronic details of the equipment. When techniques and/or equipment hyphenation are the key to obtaining higher-order data, the required simultaneous control of them demands funds for new hardware, software, and licenses, in addition to very skilled operators. In this work, we present Design of Inputs-Outputs with Sikuli (DIOS), a free and open-source code program that provides a general framework for the design of automated experimental procedures without prior knowledge of programming or electronics. Basically, instruments and devices are considered as nodes in a network, and every node is associated both with physical and virtual inputs and outputs. Virtual components, such as graphical user interfaces (GUIs) of equipment, are handled by means of image recognition tools provided by Sikuli scripting language, while handling of their physical counterparts is achieved using an adapted open-source three-dimensional (3D) printer. Two previously reported experiments of our research group, related to fluorescence matrices derived from kinetics and high-performance liquid chromatography, were adapted to be carried out in a more automated fashion. Satisfactory results, in terms of analytical performance, were obtained. Similarly, advantages derived from open-source tools assistance could be appreciated, mainly in terms of lesser intervention of operators and cost savings.
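DIOS builds its GUI automation on Sikuli's image-recognition scripting. As a rough illustration of that style (not taken from the paper), a SikuliX/Jython-style sketch that waits for a hypothetical instrument GUI, starts a run, and names the exported file; the screenshot names, workflow, and timing are assumptions, and the script is meant to run inside the SikuliX environment, where click(), wait(), exists(), and type() are built-in:

```python
# Hypothetical screenshots captured from the instrument GUI:
#   "start_button.png"  - the acquisition start button
#   "export_dialog.png" - the file-name dialog shown when a run finishes

def run_acquisition(sample_id, timeout=30):
    wait("start_button.png", timeout)        # wait until the GUI is ready
    click("start_button.png")                # start the measurement
    if exists("export_dialog.png", timeout): # export the data when the run finishes
        type(sample_id + ".csv\n")           # name the output file and confirm
        return True
    return False

for i in range(3):                           # e.g., a short kinetic series
    run_acquisition("sample_%02d" % i)
```

Physical devices (the adapted 3D printer in the paper) would be driven by separate calls synchronized with steps like these.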
Reservoirs as hotspots of fluvial carbon cycling in peatland catchments.
Stimson, A G; Allott, T E H; Boult, S; Evans, M G
2017-02-15
Inland water bodies are recognised as dynamic sites of carbon processing, and lakes and reservoirs draining peatland soils are particularly important, due to the potential for high carbon inputs combined with long water residence times. A carbon budget is presented here for a water supply reservoir (catchment area ~9 km²) draining an area of heavily eroded upland peat in the South Pennines, UK. It encompasses a two year dataset and quantifies reservoir dissolved organic carbon (DOC), particulate organic carbon (POC) and aqueous carbon dioxide (CO2(aq)) inputs and outputs. The budget shows the reservoir to be a hotspot of fluvial carbon cycling, as with high levels of POC influx it acts as a net sink of fluvial carbon and has the potential for significant gaseous carbon export. The reservoir alternates between acting as a producer and consumer of DOC (a pattern linked to rainfall and temperature) which provides evidence for transformations between different carbon species. In particular, the budget data accompanied by 14C (radiocarbon) analyses provide evidence that POC-DOC transformations are a key process, occurring at rates which could represent at least ~10% of the fluvial carbon sink. To enable informed catchment management further research is needed to produce carbon cycle models more applicable to these environments, and on the implications of high POC levels for DOC composition. Copyright © 2016 Elsevier B.V. All rights reserved.
Influence of Boundary Conditions on Simulated U.S. Air Quality
One of the key inputs to regional-scale photochemical models frequently used in air quality planning and forecasting applications is chemical boundary conditions representing background pollutant concentrations originating outside the regional modeling domain. A number of studie...
Validation and augmentation of Inrix arterial travel time data using independent sources.
DOT National Transportation Integrated Search
2015-02-01
Travel time data is a key input to Intelligent Transportation Systems (ITS) applications. Advancement in vehicle tracking and identification technologies and the proliferation of location-aware and connected devices has made network-wide travel time da...
Evaluating cell phone data for AADT estimation : research project capsule.
DOT National Transportation Integrated Search
2016-04-01
Annual average daily traffic (AADT) is a key input in a transportation agency's roadway planning, design, operation, and maintenance activities, including air quality and safety assessments. AADT is required to be reported annually by a state...
Stakeholder Meetings on Black Carbon from Diesel Sources in the Russian Arctic
From January 28-February 1, 2013, EPA and its partners held meetings in Murmansk and Moscow with key Russian stakeholders to gather input into the project’s emissions inventory methodologies and potential pilot project ideas.
Barkhofen, Sonja; Bartley, Tim J; Sansoni, Linda; Kruse, Regina; Hamilton, Craig S; Jex, Igor; Silberhorn, Christine
2017-01-13
Sampling the distribution of bosons that have undergone a random unitary evolution is strongly believed to be a computationally hard problem. Key to outperforming classical simulations of this task is to increase both the number of input photons and the size of the network. We propose driven boson sampling, in which photons are input within the network itself, as a means to approach this goal. We show that the mean number of photons entering a boson sampling experiment can exceed one photon per input mode, while maintaining the required complexity, potentially leading to less stringent requirements on the input states for such experiments. When using heralded single-photon sources based on parametric down-conversion, this approach offers an ∼e-fold enhancement in the input state generation rate over scattershot boson sampling, reaching the scaling limit for such sources. This approach also offers a dramatic increase in the signal-to-noise ratio with respect to higher-order photon generation from such probabilistic sources, which removes the need for photon number resolution during the heralding process as the size of the system increases.
Influence of changing water sources and mineral chemistry on the everglades ecosystem
McCormick, P.V.; Harvey, J.W.; Crawford, E.S.
2011-01-01
Human influences during the previous century increased mineral inputs to the Florida Everglades by changing the sources and chemistry of surface inflows. Biogeochemical responses to this enrichment include changes in the availability of key limiting nutrients such as P, the potential for increased turnover of nutrient pools due to accelerated plant decomposition, and increased rates of mercury methylation associated with sulfate enrichment. Mineral enrichment has also been linked to the loss of sensitive macrophyte species, although dominant Everglades species appear tolerant of a broad range of mineral chemistry. Shifts in periphyton community composition and function provide an especially sensitive indicator of mineral enrichment. Understanding the influence of mineral chemistry on Everglades processes and biota may improve predictions of ecosystem responses to ongoing hydrologic restoration efforts and provide guidelines for protecting remaining mineral-poor areas of this peatland. Copyright © 2011 Taylor & Francis Group, LLC.
NASA Technical Reports Server (NTRS)
Hinton, David A.
2001-01-01
A ground-based system has been developed to demonstrate the feasibility of automating the process of collecting relevant weather data, predicting wake vortex behavior from a data base of aircraft, prescribing safe wake vortex spacing criteria, estimating system benefit, and comparing predicted and observed wake vortex behavior. This report describes many of the system algorithms, features, limitations, and lessons learned, as well as suggested system improvements. The system has demonstrated concept feasibility and the potential for airport benefit. Significant opportunities exist however for improved system robustness and optimization. A condensed version of the development lab book is provided along with samples of key input and output file types. This report is intended to document the technical development process and system architecture, and to augment archived internal documents that provide detailed descriptions of software and file formats.
Cantwell, George; Riesenhuber, Maximilian; Roeder, Jessica L; Ashby, F Gregory
2017-05-01
The field of computational cognitive neuroscience (CCN) builds and tests neurobiologically detailed computational models that account for both behavioral and neuroscience data. This article leverages a key advantage of CCN, namely, that it should be possible to interface different CCN models in a plug-and-play fashion, to produce a new and biologically detailed model of perceptual category learning. The new model was created from two existing CCN models: the HMAX model of visual object processing and the COVIS model of category learning. Using bitmap images as inputs and by adjusting only a couple of learning-rate parameters, the new HMAX/COVIS model provides impressively good fits to human category-learning data from two qualitatively different experiments that used different types of category structures and different types of visual stimuli. Overall, the model provides a comprehensive neural and behavioral account of basal ganglia-mediated learning. Copyright © 2017 Elsevier Ltd. All rights reserved.
Safety monitoring and reactor transient interpreter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hench, J. E.; Fukushima, T. Y.
1983-12-20
An apparatus which monitors a subset of control panel inputs in a nuclear reactor power plant, the subset being those indicators of plant status which are of a critical nature during an unusual event. A display (10) is provided for displaying primary information (14) as to whether the core is covered and likely to remain covered, including information as to the status of subsystems needed to cool the core and maintain core integrity. Secondary display information (18,20) is provided which can be viewed selectively for more detailed information when an abnormal condition occurs. The primary display information has messages (24) for prompting an operator as to which one of a number of pushbuttons (16) to press to bring up the appropriate secondary display (18,20). The apparatus utilizes a thermal-hydraulic analysis to more accurately determine key parameters (such as water level) from other measured parameters, such as power, pressure, and flow rate.
NASA Astrophysics Data System (ADS)
Stalcup, Bruce W.; Dennis, Phillip W.; Dydyk, Robert B.
1999-10-01
Litton PRC and Litton Data Systems Division are developing a system, the Imaged Document Optical Correlation and Conversion System (IDOCCS), to provide a total solution to the problem of managing and retrieving textual and graphic information from imaged document archives. At the heart of IDOCCS, optical correlation technology provides the search and retrieval of information from imaged documents. IDOCCS can be used to rapidly search for key words or phrases within the imaged document archives. In addition, IDOCCS can automatically compare an input document with the archived database to determine if it is a duplicate, thereby reducing the overall resources required to maintain and access the document database. Embedded graphics on imaged pages can also be exploited; e.g., imaged documents containing an agency's seal or logo can be singled out. In this paper, we present a description of IDOCCS as well as preliminary performance results and theoretical projections.
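IDOCCS performs this matching optically; as a purely digital stand-in for that search step (not the Litton implementation), a minimal FFT cross-correlation sketch that locates a keyword or logo template on a page image; the array sizes and the planted template are hypothetical:

```python
import numpy as np

def correlation_peak(page, template):
    """Locate a keyword or logo template in a scanned page image using
    FFT-based cross-correlation (a digital analogue of the optical correlator)."""
    th, tw = template.shape
    t = np.zeros_like(page, dtype=float)
    t[:th, :tw] = template - template.mean()     # zero-mean template, zero-padded
    corr = np.fft.ifft2(np.fft.fft2(page) * np.conj(np.fft.fft2(t))).real
    y, x = np.unravel_index(np.argmax(corr), corr.shape)
    return (y, x), corr[y, x]                    # peak location and strength

# Hypothetical usage: a random "page" with a template cut from it.
rng = np.random.default_rng(0)
page = rng.random((512, 512))
word = page[100:120, 200:260].copy()
print(correlation_peak(page, word))              # expected peak near (100, 200)
```

A production system would normalize the correlation and threshold it to decide matches and duplicates.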
Interplay of weak interactions in the atom-by-atom condensation of xenon within quantum boxes
Nowakowska, Sylwia; Wäckerlin, Aneliia; Kawai, Shigeki; Ivas, Toni; Nowakowski, Jan; Fatayer, Shadi; Wäckerlin, Christian; Nijs, Thomas; Meyer, Ernst; Björk, Jonas; Stöhr, Meike; Gade, Lutz H.; Jung, Thomas A.
2015-01-01
Condensation processes are of key importance in nature and play a fundamental role in chemistry and physics. Owing to size effects at the nanoscale, it is conceptually desired to experimentally probe the dependence of condensate structure on the number of constituents one by one. Here we present an approach to study a condensation process atom-by-atom with the scanning tunnelling microscope, which provides a direct real-space access with atomic precision to the aggregates formed in atomically defined ‘quantum boxes’. Our analysis reveals the subtle interplay of competing directional and nondirectional interactions in the emergence of structure and provides unprecedented input for the structural comparison with quantum mechanical models. This approach focuses on—but is not limited to—the model case of xenon condensation and goes significantly beyond the well-established statistical size analysis of clusters in atomic or molecular beams by mass spectrometry. PMID:25608225
Space Launch System Upper Stage Technology Assessment
NASA Technical Reports Server (NTRS)
Holladay, Jon; Hampton, Bryan; Monk, Timothy
2014-01-01
The Space Launch System (SLS) is envisioned as a heavy-lift vehicle that will provide the foundation for future beyond low-Earth orbit (LEO) exploration missions. Previous studies have been performed to determine the optimal configuration for the SLS and the applicability of commercial off-the-shelf in-space stages for Earth departure. Currently NASA is analyzing the concept of a Dual Use Upper Stage (DUUS) that will provide LEO insertion and Earth departure burns. This paper will explore candidate in-space stages based on the DUUS design for a wide range of beyond LEO missions. Mission payloads will range from small robotic systems up to human systems with deep space habitats and landers. Mission destinations will include cislunar space, Mars, Jupiter, and Saturn. Given these wide-ranging mission objectives, a vehicle-sizing tool has been developed to determine the size of an Earth departure stage based on the mission objectives. The tool calculates masses for all the major subsystems of the vehicle including propellant loads, avionics, power, engines, main propulsion system components, tanks, pressurization system and gases, primary structural elements, and secondary structural elements. The tool uses an iterative sizing algorithm to determine the resulting mass of the stage. Any input into one of the subsystem sizing routines or the mission parameters can be treated as a parametric sweep or as a distribution for use in Monte Carlo analysis. Taking these factors together allows for multi-variable, coupled analysis runs. To increase confidence in the tool, the results have been verified against two point-of-departure designs of the DUUS. The tool has also been verified against Apollo moon mission elements and other manned space systems. This paper will focus on trading key propulsion technologies including chemical, Nuclear Thermal Propulsion (NTP), and Solar Electric Propulsion (SEP). All of the key performance inputs and relationships will be presented and discussed in light of the various missions. For each mission there are several trajectory options and each will be discussed in terms of delta-v required and transit duration. Each propulsion system will be modeled, sized, and judged based on their applicability to the whole range of beyond LEO missions. Criteria for scoring will include the resulting dry mass of the stage, resulting propellant required, time to destination, and an assessment of key enabling technologies. In addition to the larger metrics, this paper will present the results of several coupled sensitivity studies. The ultimate goals of these tools and studies are to provide NASA with the most mass-, technology-, and cost-effective in-space stage for its future exploration missions.
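As a rough illustration of the iterative sizing loop described above (not the NASA tool itself), a minimal sketch that closes propellant and dry mass for an in-space stage using the rocket equation, with a single dry-mass fraction standing in for the subsystem sizing routines; the payload, delta-v, specific impulse, and mass fraction are hypothetical:

```python
import math

def size_stage(payload_kg, dv_ms, isp_s, dry_mass_fraction=0.12, tol=1.0):
    """Iteratively size a stage: propellant from the rocket equation, dry mass
    as a fixed fraction of propellant, repeat until the dry mass converges."""
    g0 = 9.80665
    dry = 1000.0                                  # initial dry-mass guess, kg
    while True:
        m_final = payload_kg + dry                # mass after the burn
        prop = m_final * (math.exp(dv_ms / (g0 * isp_s)) - 1.0)
        new_dry = dry_mass_fraction * prop
        if abs(new_dry - dry) < tol:
            return {"propellant_kg": prop, "dry_kg": new_dry,
                    "gross_kg": payload_kg + new_dry + prop}
        dry = new_dry

# Hypothetical trans-lunar-class burn with a chemical stage.
print(size_stage(payload_kg=30000, dv_ms=3200, isp_s=450))
```

The actual tool replaces the fixed mass fraction with subsystem-level sizing and wraps parameters in sweeps or Monte Carlo draws.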
Solid Geometric Modeling - The Key to Improved Materiel Acquisition from Concept to Deployment
1984-09-01
M. J. Reisinger, "The GIFT Code User Manual; Volume I, Introduction and Input Requirements (U)," BRL Report No. 1802, July 1975, AD# A078364. G. Kuehl, L. W. Bain, Jr., M. J. Reisinger, "The GIFT Code User Manual; Volume II, The Output Options (U)," USA ARRADCOM Report No. 02189, September 1979, AD# A078364. These results are plotted by a code called RunShot, written by L. M. Rybak, which takes input from GIFT and plots color shotlines on a
An accelerated training method for back propagation networks
NASA Technical Reports Server (NTRS)
Shelton, Robert O. (Inventor)
1993-01-01
The principal objective is to provide a training procedure for a feed forward, back propagation neural network which greatly accelerates the training process. A set of orthogonal singular vectors are determined from the input matrix such that the standard deviations of the projections of the input vectors along these singular vectors, as a set, are substantially maximized, thus providing an optimal means of presenting the input data. Novelty exists in the method of extracting from the set of input data, a set of features which can serve to represent the input data in a simplified manner, thus greatly reducing the time/expense to training the system.
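A minimal sketch of the preprocessing idea described in the abstract (projecting the training inputs onto singular vectors so that the projections, as a set, have maximal spread), assuming a plain SVD of the centered input matrix; this illustrates the general technique, not the patented procedure:

```python
import numpy as np

def svd_preprocess(X):
    """Project training inputs onto their singular vectors and scale each
    direction to unit standard deviation before back-propagation training."""
    Xc = X - X.mean(axis=0)                      # rows are training examples
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt.T                                # projections along singular vectors
    Z = Z / (s / np.sqrt(len(X) - 1))            # unit std. dev. per direction
    return Z, Vt                                 # train on Z; keep Vt for new inputs

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))                   # hypothetical training inputs
Z, Vt = svd_preprocess(X)
```

The network is then trained on Z rather than on the raw inputs, and new inputs are passed through the same projection and scaling.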
Topological Schemas of Cognitive Maps and Spatial Learning.
Babichev, Andrey; Cheng, Sen; Dabaghian, Yuri A
2016-01-01
Spatial navigation in mammals is based on building a mental representation of their environment, a cognitive map. However, both the nature of this cognitive map and its underpinning in neural structures and activity remain vague. A key difficulty is that these maps are collective, emergent phenomena that cannot be reduced to a simple combination of inputs provided by individual neurons. In this paper we suggest computational frameworks for integrating the spiking signals of individual cells into a spatial map, which we call schemas. We provide examples of four schemas defined by different types of topological relations that may be neurophysiologically encoded in the brain and demonstrate that each schema provides its own large-scale characteristics of the environment, the schema integrals. Moreover, we find that, in all cases, these integrals are learned at a rate which is faster than the rate of complete training of neural networks. Thus, the proposed schema framework differentiates between the cognitive aspect of spatial learning and the physiological aspect at the neural network level.
Exploring family physician stress
Lee, F. Joseph; Brown, Judith Belle; Stewart, Moira
2009-01-01
ABSTRACT OBJECTIVE To explore the nature of professional stress and the strategies used by family physicians to deal with this stress. DESIGN Qualitative study. SETTING Kitchener-Waterloo, Ont. PARTICIPANTS Ten key-informant family physicians. METHODS In-depth interviews were conducted with key informants. A total of 40 key informants were identified, based on selected criteria; 24 provided consent. The potential participants were rank-ordered for interviews to provide maximum variation in age, sex, and years in practice. Interviews were conducted, audiotaped, transcribed verbatim, and analyzed until thematic saturation was reached, as determined through an iterative process. This occurred after 10 in-depth interviews. Immersion and crystallization techniques were used. MAIN FINDINGS The participants described professional stresses and strategies at the personal, occupational, and health care system levels. Personal stressors included personality traits and the need to balance family and career, which were countered by biological, psychological, social, and spiritual strategies. Occupational stressors included challenging patients, high workload, time limitations, competency issues, challenges of documentation and practice management, and changing roles within the workplace. Occupational stressors were countered by strategies such as setting limits, participating in continuing medical education, soliciting support from colleagues and staff, making use of teams, improving patient-physician relationships, exploring new forms of remuneration, and scheduling appropriately. Stressors affecting the wider health care system included limited resources, imposed rules and regulations, lack of support from specialists, feeling undervalued, and financial concerns. CONCLUSION Family physicians face a multitude of challenges at personal, occupational, and health care system levels. A systems approach provides a new framework in which proactive strategies can augment more than one level of a system and, in contrast, reactive strategies can have negative inputs for different system levels. PMID:19282541
Wesolowski, Edwin A.
1996-01-01
Two separate studies to simulate the effects of discharging treated wastewater to the Red River of the North at Fargo, North Dakota, and Moorhead, Minnesota, have been completed. In the first study, the Red River at Fargo Water-Quality Model was calibrated and verified for ice-free conditions. In the second study, the Red River at Fargo Ice-Cover Water-Quality Model was verified for ice-cover conditions. To better understand and apply the Red River at Fargo Water-Quality Model and the Red River at Fargo Ice-Cover Water-Quality Model, the uncertainty associated with simulated constituent concentrations and property values was analyzed and quantified using the Enhanced Stream Water Quality Model-Uncertainty Analysis. The Monte Carlo simulation and first-order error analysis methods were used to analyze the uncertainty in simulated values for six constituents and properties at sites 5, 10, and 14 (upstream to downstream order). The constituents and properties analyzed for uncertainty are specific conductance, total organic nitrogen (reported as nitrogen), total ammonia (reported as nitrogen), total nitrite plus nitrate (reported as nitrogen), 5-day carbonaceous biochemical oxygen demand for ice-cover conditions and ultimate carbonaceous biochemical oxygen demand for ice-free conditions, and dissolved oxygen. Results are given in detail for both the ice-cover and ice-free conditions for specific conductance, total ammonia, and dissolved oxygen. The sensitivity and uncertainty of the simulated constituent concentrations and property values to input variables differ substantially between ice-cover and ice-free conditions. During ice-cover conditions, simulated specific-conductance values are most sensitive to the headwater-source specific-conductance values upstream of site 10 and the point-source specific-conductance values downstream of site 10. These headwater-source and point-source specific-conductance values also are the key sources of uncertainty. Simulated total ammonia concentrations are most sensitive to the point-source total ammonia concentrations at all three sites. Other input variables that contribute substantially to the variability of simulated total ammonia concentrations are the headwater-source total ammonia and the instream reaction coefficient for biological decay of total ammonia to total nitrite. Simulated dissolved-oxygen concentrations at all three sites are most sensitive to headwater-source dissolved-oxygen concentration. This input variable is the key source of variability for simulated dissolved-oxygen concentrations at sites 5 and 10. Headwater-source and point-source dissolved-oxygen concentrations are the key sources of variability for simulated dissolved-oxygen concentrations at site 14. During ice-free conditions, simulated specific-conductance values at all three sites are most sensitive to the headwater-source specific-conductance values. Headwater-source specific-conductance values also are the key source of uncertainty. The input variables to which total ammonia and dissolved oxygen are most sensitive vary from site to site and may or may not correspond to the input variables that contribute the most to the variability. The input variables that contribute the most to the variability of simulated total ammonia concentrations are point-source total ammonia, instream reaction coefficient for biological decay of total ammonia to total nitrite, and Manning's roughness coefficient.
The input variables that contribute the most to the variability of simulated dissolved-oxygen concentrations are reaeration rate, sediment oxygen demand rate, and headwater-source algae as chlorophyll a.
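For reference, a minimal sketch of the first-order error analysis used alongside the Monte Carlo method: each partial derivative is approximated by a finite difference, and the input standard deviations are combined as var(y) approximately equal to the sum of (df/dx_i)^2 * sigma_i^2; the toy model and numbers below are hypothetical, not values from the study:

```python
import numpy as np

def first_order_error(model, x0, sigmas, rel_step=1e-3):
    """First-order error analysis: finite-difference sensitivities combined
    with input standard deviations; returns total output std and variance shares."""
    x0 = np.asarray(x0, dtype=float)
    y0 = model(x0)
    contrib = []
    for i in range(len(x0)):
        h = rel_step * max(abs(x0[i]), 1.0)
        xp = x0.copy()
        xp[i] += h
        dfdx = (model(xp) - y0) / h              # sensitivity to input i
        contrib.append((dfdx * sigmas[i]) ** 2)  # variance contribution of input i
    contrib = np.array(contrib)
    return np.sqrt(contrib.sum()), contrib / contrib.sum()

# Hypothetical dissolved-oxygen-like response to (reaeration, SOD, chlorophyll-a).
f = lambda x: 8.0 + 0.5 * x[0] - 0.8 * x[1] - 0.02 * x[2]
total_std, shares = first_order_error(f, x0=[2.0, 1.5, 20.0], sigmas=[0.4, 0.3, 5.0])
print(total_std, shares)
```

The variance shares returned here play the role of the "key sources of variability" identified in the report.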
Knierim, James J; Neunuebel, Joshua P; Deshmukh, Sachin S
2014-02-05
The hippocampus receives its major cortical input from the medial entorhinal cortex (MEC) and the lateral entorhinal cortex (LEC). It is commonly believed that the MEC provides spatial input to the hippocampus, whereas the LEC provides non-spatial input. We review new data which suggest that this simple dichotomy between 'where' versus 'what' needs revision. We propose a refinement of this model, which is more complex than the simple spatial-non-spatial dichotomy. MEC is proposed to be involved in path integration computations based on a global frame of reference, primarily using internally generated, self-motion cues and external input about environmental boundaries and scenes; it provides the hippocampus with a coordinate system that underlies the spatial context of an experience. LEC is proposed to process information about individual items and locations based on a local frame of reference, primarily using external sensory input; it provides the hippocampus with information about the content of an experience.
Characterization of network structure in stereoEEG data using consensus-based partial coherence.
Ter Wal, Marije; Cardellicchio, Pasquale; LoRusso, Giorgio; Pelliccia, Veronica; Avanzini, Pietro; Orban, Guy A; Tiesinga, Paul He
2018-06-06
Coherence is a widely used measure to determine the frequency-resolved functional connectivity between pairs of recording sites, but this measure is confounded by shared inputs to the pair. To remove shared inputs, the 'partial coherence' can be computed by conditioning the spectral matrices of the pair on all other recorded channels, which involves the calculation of a matrix (pseudo-) inverse. It has so far remained a challenge to use the time-resolved partial coherence to analyze intracranial recordings with a large number of recording sites. For instance, calculating the partial coherence using a pseudoinverse method produces a high number of false positives when it is applied to a large number of channels. To address this challenge, we developed a new method that randomly aggregated channels into a smaller number of effective channels on which the calculation of partial coherence was based. We obtained a 'consensus' partial coherence (cPCOH) by repeating this approach for several random aggregations of channels (permutations) and only accepting those activations in time and frequency with a high enough consensus. Using model data we show that the cPCOH method effectively filters out the effect of shared inputs and performs substantially better than the pseudo-inverse. We successfully applied the cPCOH procedure to human stereotactic EEG data and demonstrated three key advantages of this method relative to alternative procedures. First, it reduces the number of false positives relative to the pseudo-inverse method. Second, it allows for titration of the amount of false positives relative to the false negatives by adjusting the consensus threshold, thus allowing the data-analyst to prioritize one over the other to meet specific analysis demands. Third, it substantially reduced the number of identified interactions compared to coherence, providing a sparser network of connections from which clear spatial patterns emerged. These patterns can serve as a starting point of further analyses that provide insight into network dynamics during cognitive processes. These advantages likely generalize to other modalities in which shared inputs introduce confounds, such as electroencephalography (EEG) and magneto-encephalography (MEG). Copyright © 2018. Published by Elsevier Inc.
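A minimal sketch of the cPCOH idea under stated assumptions: partial coherence is computed from the inverse of the cross-spectral matrix, the 'other' channels are randomly aggregated into a few effective channels, and the consensus is taken as the fraction of permutations exceeding a threshold. Time resolution is omitted, and the group count, permutation count, and threshold are placeholders rather than the values used in the paper:

```python
import numpy as np
from scipy.signal import csd

def spectral_matrix(data, fs, nperseg):
    """Cross-spectral density matrix S[f, i, j] for all channel pairs."""
    n_ch = data.shape[0]
    f, _ = csd(data[0], data[0], fs=fs, nperseg=nperseg)
    S = np.zeros((len(f), n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            _, S[:, i, j] = csd(data[i], data[j], fs=fs, nperseg=nperseg)
    return f, S

def partial_coherence(S, i, j):
    """Partial coherence of channels i and j, conditioned on all other channels,
    from the inverse of the spectral matrix at each frequency."""
    pcoh = np.zeros(S.shape[0])
    for k in range(S.shape[0]):
        G = np.linalg.inv(S[k])
        pcoh[k] = np.abs(G[i, j]) ** 2 / (np.abs(G[i, i]) * np.abs(G[j, j]))
    return pcoh

def consensus_partial_coherence(data, i, j, fs=1000, nperseg=256,
                                n_perm=20, n_groups=5, threshold=0.5):
    """cPCOH sketch: aggregate the remaining channels into random groups,
    recompute the partial coherence per permutation, and keep the consensus."""
    rng = np.random.default_rng(0)
    others = [c for c in range(data.shape[0]) if c not in (i, j)]
    runs = []
    for _ in range(n_perm):
        groups = np.array_split(rng.permutation(others), n_groups)
        agg = np.vstack([data[i], data[j]] +
                        [data[g].mean(axis=0) for g in groups if len(g)])
        f, S = spectral_matrix(agg, fs, nperseg)
        runs.append(partial_coherence(S, 0, 1))
    runs = np.array(runs)
    return f, (runs > threshold).mean(axis=0)    # fraction of permutations "active"

# Hypothetical usage: 12 channels of noise, 10 s at 1 kHz.
rng = np.random.default_rng(1)
data = rng.standard_normal((12, 10000))
f, cons = consensus_partial_coherence(data, i=0, j=1)
```

Thresholding the consensus fraction then trades false positives against false negatives, as described in the abstract.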
Optical image encryption method based on incoherent imaging and polarized light encoding
NASA Astrophysics Data System (ADS)
Wang, Q.; Xiong, D.; Alfalou, A.; Brosseau, C.
2018-05-01
We propose an incoherent encoding system for image encryption based on a polarized encoding method combined with incoherent imaging. Incoherent imaging is the core component of this proposal, in which the incoherent point-spread function (PSF) of the imaging system serves as the main key to encode the input intensity distribution through a convolution operation. An array of retarders and polarizers is placed on the input plane of the imaging structure to encrypt the polarization state of light based on Mueller polarization calculus. The proposal makes full use of the randomness of the polarization parameters and the incoherent PSF, so that a multidimensional key space is generated to deal with illegal attacks. Mueller polarization calculus and incoherent illumination of the imaging structure ensure that only intensity information is manipulated. Another key advantage is that the complicated processing and recording associated with a complex-valued signal are avoided. The encoded information is just an intensity distribution, which is advantageous for data storage and transmission because the information expansion that accompanies conventional encryption methods is also avoided. The decryption procedure can be performed digitally or using optoelectronic devices. Numerical simulation tests demonstrate the validity of the proposed scheme.
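As an illustration of the core encoding step only (the polarization encoding with retarders and polarizers is omitted, and none of the parameters below come from the paper), a minimal sketch in which a random incoherent PSF acts as the key, encryption is a convolution of intensities, and decryption is a regularized inverse filter:

```python
import numpy as np

def encode(image, psf):
    """Encode an intensity image by (circular) convolution with the PSF key."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * H))

def decode(cipher, psf, eps=1e-3):
    """Approximate decryption with a regularized (Wiener-style) inverse filter;
    recovery requires knowledge of the PSF key."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=cipher.shape)
    W = np.conj(H) / (np.abs(H) ** 2 + eps)
    return np.real(np.fft.ifft2(np.fft.fft2(cipher) * W))

rng = np.random.default_rng(1)
psf = rng.random((64, 64)); psf /= psf.sum()   # random non-negative PSF as the key
img = rng.random((64, 64))                     # stand-in for the input intensity
recovered = decode(encode(img, psf), psf)
```

Because everything stays real and non-negative, only intensity data is stored, which mirrors the storage advantage claimed in the abstract.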
High Capacity Single Table Performance Design Using Partitioning in Oracle or PostgreSQL
2012-03-01
[Front matter and figure-list residue from the source report: contents entries for "Additional Key Performance Indicators (KPIs)," "Conclusion," "List of Symbols, Abbreviations, and Acronyms," and "Distribution List"; a list of figures including Figure 7, "Time to seek and return one record"; an acronym list (ASM, Automatic Storage Management; CPU, central processing unit; I/O, input/output; KPIs, key performance indicators; OS, operating system); and a fragment of Section 4 noting that, in addition to pure response time, there are other key performance indicators.]
Using Monte Carlo Simulation to Prioritize Key Maritime Environmental Impacts of Port Infrastructure
NASA Astrophysics Data System (ADS)
Perez Lespier, L. M.; Long, S.; Shoberg, T.
2016-12-01
This study creates a Monte Carlo simulation model to prioritize key indicators of environmental impacts resulting from maritime port infrastructure. Data inputs are derived from LandSat imagery, government databases, and industry reports to create the simulation. Results are validated using subject matter experts and compared with those returned from time-series regression to determine goodness of fit. The Port of Prince Rupert, Canada is used as the location for the study.
Building capacity for health promotion--a case study from China.
Tang, Kwok-Cho; Nutbeam, Don; Kong, Lingzhi; Wang, Ruotao; Yan, Jun
2005-09-01
During the period 1997-2000 a technical assistance project to build capacity for community-based health promotion was implemented in seven cities and one province in China. The technical assistance project formed part of a much larger World Bank supported program to improve disease prevention capabilities in China, commonly known as Health VII. The technical assistance project was funded by the Australian Agency for International Development. It was designed to develop capacity within the Ministry of Health (MOH) and the cities and province in the management of community-based health promotion projects, as well as supporting institutional development and public health policy reform. There are some relatively unique features of this technical assistance which helped shape its implementation and impact. It sought to provide the Chinese MOH and the cities and province with an introduction to comprehensive health promotion strategies, in contrast to the more limited information, education and communication strategies. The project was provided on a continuing basis over 3 years through a single institution, rather than as a series of ad hoc consultancies by individuals. Teaching and learning processes were developmental, leading progressively to a greater degree of local Chinese input and management to ensure sustainability and maintenance of technical support for the project. Based on this experience, this paper presents a model for capacity building projects of this type. It describes the education, training and planning activities that were the key inputs to the project, as well as the limited available evidence on the impact of the project. It describes how the project evolved over time to meet the changing needs of the participants, specifically how the content of the project shifted from a risk-factor orientation to a settings-based focus, and the delivery of the project moved from an expert-led approach to a more participatory, problem based learning approach. In terms of impact, marked differences before and after the implementation of the training activities were identified in key areas for reform, in addition to the self reported positive change in knowledge, and a high level of participant satisfaction. Key lessons are summarized. Technical assistance projects of this kind benefit from continuity and a high level of coordination, the provision of culturally and linguistically appropriate teaching, and a clear understanding of the need to match workforce development with organizational/institutional development.
Multi-Modulator for Bandwidth-Efficient Communication
NASA Technical Reports Server (NTRS)
Gray, Andrew; Lee, Dennis; Lay, Norman; Cheetham, Craig; Fong, Wai; Yeh, Pen-Shu; King, Robin; Ghuman, Parminder; Hoy, Scott; Fisher, Dave
2009-01-01
A modulator circuit board has recently been developed to be used in conjunction with a vector modulator to generate any of a large number of modulations for bandwidth-efficient radio transmission of digital data signals at rates that can exceed 100 Mb/s. The modulations include quadrature phase-shift keying (QPSK), offset quadrature phase-shift keying (OQPSK), Gaussian minimum-shift keying (GMSK), and octonary phase-shift keying (8PSK) with square-root raised-cosine pulse shaping. The figure is a greatly simplified block diagram showing the relationship between the modulator board and the rest of the transmitter. The role of the modulator board is to encode the incoming data stream and to shape the resulting pulses, which are fed as inputs to the vector modulator. The combination of encoding and pulse shaping in a given application is chosen to maximize the bandwidth efficiency. The modulator board includes gallium arsenide serial-to-parallel converters at its input end. A complementary metal oxide/semiconductor (CMOS) field-programmable gate array (FPGA) performs the coding and modulation computations and utilizes parallel processing in doing so. The results of the parallel computation are combined and converted to pulse waveforms by use of gallium arsenide parallel-to-serial converters integrated with digital-to-analog converters. Without changing the hardware, one can configure the modulator to produce any of the designed combinations of coding and modulation by loading the appropriate bit configuration file into the FPGA.
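To make the encode-and-shape role of the board concrete, the following Python sketch shows Gray-mapped QPSK symbols and the half-symbol quadrature offset that turns them into OQPSK. This is only an illustrative baseband model, not the board's FPGA logic; the square-root raised-cosine filtering mentioned in the abstract is omitted, and the bit mapping and samples-per-symbol value are assumptions.

```python
import numpy as np

def qpsk_symbols(bits):
    """Gray-mapped QPSK: two bits per symbol select one of four phases."""
    b = np.asarray(bits).reshape(-1, 2)
    i = 1 - 2 * b[:, 0]              # bit 0 -> +1, bit 1 -> -1
    q = 1 - 2 * b[:, 1]
    return (i + 1j * q) / np.sqrt(2)

def oqpsk_baseband(bits, sps=8):
    """OQPSK: the same mapping as QPSK, but the quadrature rail is delayed
    by half a symbol so I and Q never switch at the same instant."""
    s = qpsk_symbols(bits)
    i = np.repeat(s.real, sps)       # rectangular pulses, sps samples/symbol
    q = np.repeat(s.imag, sps)
    return i[: -sps // 2] + 1j * q[sps // 2:]

# The shaped I/Q streams would then drive the vector modulator; square-root
# raised-cosine filtering of each rail is omitted from this sketch.
bits = np.random.default_rng(1).integers(0, 2, 64)
waveform = oqpsk_baseband(bits)
```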
Terrestrial litter inputs as determinants of food quality of organic matter in a forest stream
J.L. Meyer; C. Hax; J.B. Wallace; S.L. Eggert; J.R. Webster
2000-01-01
Inputs of leaf litter and other organic matter from the catchment exceed autochthonous production and provide an important food resource in most streams (WEBSTER & MEYER 1997, ANDERSON & SEDELL 1979). An experimental long-term exclusion of terrestrial litter inputs to a forested headwater stream (WALLACE et al. 1997) provided an opportunity to determine if the...
Jones, Richard; Jordan, Sue
2010-01-01
Background: In mental health nursing, Crisis Resolution and Home Treatment (CRHT) services are key components of the shift from in-patient to community care. CRHT has been developed mainly in urban settings, and deployment in more rural areas has not been examined. Aim: We aimed to evaluate CRHT services’ progress towards policy targets. Participants and Setting: All 18 CRHT teams in Wales were surveyed. Methods: A service profile questionnaire was distributed to team leaders. Findings: Fourteen of 18 teams responded in full. All but one were led by nurses, who formed the main professional group. All teams reported providing an alternative to hospital admission and assisting early discharge. With one exception, teams were ‘gatekeeping’ hospital beds. There was some divergence in clients seen, perceived impact of the service, operational hours, distances travelled, team structure, input of consultant psychiatrists and caseloads. We found some differences between the 8 urban teams and the 6 teams serving rural or mixed areas: rural teams travelled more, had fewer inpatient beds and less medical input (0.067 compared to 0.688 whole time equivalents). Most respondents felt that resource constraints were limiting further developments. Implications: Teams met standards for CRHT services in Wales; however, these are less onerous than those in England, particularly in relation to operational hours and staffing complement. As services develop, it will be important to ensure that rural and mixed areas receive the same level of input as urban areas. PMID:20502646
QKD Via a Quantum Wavelength Router Using Spatial Soliton
NASA Astrophysics Data System (ADS)
Kouhnavard, M.; Amiri, I. S.; Afroozeh, A.; Jalil, M. A.; Ali, J.; Yupapin, P. P.
2011-05-01
A system for continuous-variable quantum key distribution (QKD) via a wavelength router is proposed. The Kerr-type nonlinearity of light in the nonlinear microring resonator (NMRR) induces chaotic behavior. In the proposed system, chaotic signals are generated by an optical soliton or Gaussian pulse within an NMRR system. Parameters such as the input power, MRR radii, and coupling coefficients can be varied and play an important role in determining the results, in which continuous signals are generated and spread over the spectrum. Large-bandwidth optical soliton signals are generated by the input pulse propagating within the MRRs, forming a continuous wavelength or frequency range with large tunable channel capacity. The continuous-variable QKD is formed by using localized spatial soliton pulses via a quantum router and networks. The selected optical spatial pulse can be used to establish a secure communication network. The entangled photons generated by the chaotic signals are analyzed. Continuous entangled photons are generated by a polarization control unit incorporated into the MRRs, which is required to provide the continuous-variable QKD. The results show that such a system for simultaneous continuous-variable quantum cryptography can be used in mobile telephone handsets and networks. In this study, frequency bands of 500 MHz and 2.0 GHz and wavelengths of 775 nm, 2,325 nm and 1.55 μm are obtained for QKD use with input optical soliton and Gaussian beams, respectively.
Cooper, Emily A.; Norcia, Anthony M.
2015-01-01
The nervous system has evolved in an environment with structure and predictability. One of the ubiquitous principles of sensory systems is the creation of circuits that capitalize on this predictability. Previous work has identified predictable non-uniformities in the distributions of basic visual features in natural images that are relevant to the encoding tasks of the visual system. Here, we report that the well-established statistical distributions of visual features -- such as visual contrast, spatial scale, and depth -- differ between bright and dark image components. Following this analysis, we go on to trace how these differences in natural images translate into different patterns of cortical input that arise from the separate bright (ON) and dark (OFF) pathways originating in the retina. We use models of these early visual pathways to transform natural images into statistical patterns of cortical input. The models include the receptive fields and non-linear response properties of the magnocellular (M) and parvocellular (P) pathways, with their ON and OFF pathway divisions. The results indicate that there are regularities in visual cortical input beyond those that have previously been appreciated from the direct analysis of natural images. In particular, several dark/bright asymmetries provide a potential account for recently discovered asymmetries in how the brain processes visual features, such as violations of classic energy-type models. On the basis of our analysis, we expect that the dark/bright dichotomy in natural images plays a key role in the generation of both cortical and perceptual asymmetries. PMID:26020624
Characterization of motor units in behaving adult mice shows a wide primary range.
Ritter, Laura K; Tresch, Matthew C; Heckman, C J; Manuel, Marin; Tysseling, Vicki M
2014-08-01
The mouse is essential for genetic studies of motor function in both normal and pathological states. Thus it is important to consider whether the structure of motor output from the mouse is in fact analogous to that recorded in other animals. There is a striking difference in the basic electrical properties of mouse motoneurons compared with those in rats, cats, and humans. The firing evoked by injected currents produces a unique frequency-current (F-I) function that emphasizes recruitment of motor units at their maximum force. These F-I functions, however, were measured in anesthetized preparations that lacked two key components of normal synaptic input: high levels of synaptic noise and neuromodulatory inputs. Recent studies suggest that the alterations in the F-I function due to these two components are essential for recreating firing behavior of motor units in human subjects. In this study we provide the first data on firing patterns of motor units in the awake mouse, focusing on steady output in quiet stance. The resulting firing patterns did not match the predictions from the mouse F-I behaviors but instead revealed rate modulation across a remarkably wide range (10-60 Hz). The low end of the firing range may be due to changes in the F-I relation induced by synaptic noise and neuromodulatory inputs. The high end of the range may indicate that, unlike other species, quiet standing in the mouse involves recruitment of relatively fast-twitch motor units. Copyright © 2014 the American Physiological Society.
Methods, systems and apparatus for controlling operation of two alternating current (AC) machines
Gallegos-Lopez, Gabriel [Torrance, CA; Nagashima, James M [Cerritos, CA; Perisic, Milun [Torrance, CA; Hiti, Silva [Redondo Beach, CA
2012-02-14
A system is provided for controlling two AC machines. The system comprises a DC input voltage source that provides a DC input voltage, a voltage boost command control module (VBCCM), a five-phase PWM inverter module coupled to the two AC machines, and a boost converter coupled to the inverter module and the DC input voltage source. The boost converter is designed to supply a new DC input voltage to the inverter module having a value that is greater than or equal to a value of the DC input voltage. The VBCCM generates a boost command signal (BCS) based on modulation indexes from the two AC machines. The BCS controls the boost converter such that the boost converter generates the new DC input voltage in response to the BCS. When the two AC machines require additional voltage that exceeds the DC input voltage required to meet a combined target mechanical power required by the two AC machines, the BCS controls the boost converter to drive the new DC input voltage generated by the boost converter to a value greater than the DC input voltage.
Development of a traffic data input system in Arizona for the MEPDG.
DOT National Transportation Integrated Search
2013-10-01
Accurate traffic data is one of the key data elements required for the cost-effective design of all rehabilitation and reconstruction of pavement structures. This research study addresses the collection, preparation, and use of traffic data require...
Summary for Stakeholder Meetings on Black Carbon from Diesel Sources in the Russian Arctic
From January 28-February 1, 2013, EPA and its partners held meetings in Murmansk and Moscow with key Russian stakeholders to gather input into the project’s emissions inventory methodologies and potential pilot project ideas.
Human factors in cockpit input and display for data link.
DOT National Transportation Integrated Search
1971-01-01
Problems associated with the entry of air-ground-air messages via keyboard for transmission by Data Link are discussed. The ARINC proposal for a keyboard is presented, and an alternative method for coding keys is proposed for comparative eval...
Numerical simulation of intelligent compaction technology for construction quality control.
DOT National Transportation Integrated Search
2015-02-01
For efficiently updating models of large-scale structures, the response surface (RS) method based on radial basis functions (RBFs) is proposed to model the input-output relationship of structures. The key issues for applying the proposed method a...
Nano-catalysts: Key to the Greener Pathways Leading to Sustainability
Synthetic processes using alternative energy input in combination with nano-catalysts shorten reaction times and eliminate or minimize side-product formation. This concept is already finding acceptance in the syntheses of pharmaceuticals, fine chemicals, and polymers and may ...
The energy balance of the nighttime thermosphere
NASA Technical Reports Server (NTRS)
Glenar, D. A.
1977-01-01
The discrepancy between the input from the day hemisphere and the observed loss rates is discussed in terms of ion-neutral processes and gravity wave inputs. There has been considerable speculation as to the energy balance of the thermosphere and in particular about the fraction of the total energy input supplied by ultraviolet radiation. The problem is considerably simplified by considering the energy balance of the nighttime hemisphere alone. Sunrise and sunset vapor trail measurements provide data on the wind systems at the terminator boundary, and temperature measurements provide information on the vertical energy conduction. North-south winds from high latitude vapor trail measurements provide a measure of the energy input from auroral processes.
Jiao, Jian; Bae, Eun Ju; Bandyopadhyay, Gautam; Oliver, Jason; Marathe, Chaitra; Chen, Michael; Hsu, Jer-Yuan; Chen, Yu; Tian, Hui; Olefsky, Jerrold M; Saberi, Maziyar
2013-04-01
Gastrointestinal bypass surgeries that result in rerouting and subsequent exclusion of nutrients from the duodenum appear to rapidly alleviate hyperglycemia and hyperinsulinemia independent of weight loss. While the mechanism(s) responsible for normalization of glucose homeostasis remains to be fully elucidated, this rapid normalization coupled with the well-known effects of vagal inputs into glucose homeostasis suggests a neurohormonally mediated mechanism. Our results show that duodenal bypass surgery on obese, insulin-resistant Zucker fa/fa rats restored insulin sensitivity in both liver and peripheral tissues independent of body weight. Restoration of normoglycemia was attributable to an enhancement in key insulin-signaling molecules, including insulin receptor substrate-2, and substrate metabolism through a multifaceted mechanism involving activation of AMP-activated protein kinase and downregulation of key regulatory genes involved in both lipid and glucose metabolism. Importantly, while central nervous system-derived vagal nerves were not essential for restoration of insulin sensitivity, rapid normalization in hepatic gluconeogenic capacity and basal hepatic glucose production required intact vagal innervation. Lastly, duodenal bypass surgery selectively altered the tissue concentration of intestinally derived glucoregulatory hormone peptides in a segment-specific manner. The present data highlight and support the significance of vagal inputs and intestinal hormone peptides toward normalization of glucose and lipid homeostasis after duodenal bypass surgery.
Kwon, Tae-Ho; Kim, Jai-Eun; Kim, Ki-Doo
2018-05-14
In the field of communication, synchronization is always an important issue. The communication between a light-emitting diode (LED) array (LEA) and a camera is known as visual multiple-input multiple-output (MIMO), for which the data transmitter and receiver must be synchronized for seamless communication. In visual-MIMO, LEDs generally have a faster data rate than the camera. Hence, we propose an effective time-sharing-based synchronization technique with its color-independent characteristics providing the key to overcome this synchronization problem in visual-MIMO communication. We also evaluated the performance of our synchronization technique by varying the distance between the LEA and camera. A graphical analysis is also presented to compare the symbol error rate (SER) at different distances.
Making business decisions using trend information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prevette, S.S., Westinghouse Hanford, Richland, WA
1997-11-24
Performance measures, and the trend information that results from their analyses, can help managers in their decision-making process. The business decisions discussed are: assignment of limited resources, funding, and budget; contractor rewards/incentives; where to focus process improvement and reengineering efforts; when to ask "What happened?!"; and determining whether a previous decision was effectively implemented. Trending can provide an input for rational business decisions. The key element is determining whether or not a significant trend exists - segregating Common Cause from Special Cause. The control chart is the tool for accomplishing trending and determining if you are meeting your business objectives. Eliminate numerical targets; the goal is significant improvement. Profound Knowledge requires integrating data results with gut feeling.
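The control-chart test for separating common-cause from special-cause variation can be illustrated with a short sketch. This is a generic individuals (XmR) chart using 3-sigma limits estimated from the average moving range; the data and function name are hypothetical and not tied to the Hanford measures discussed above.

```python
import numpy as np

def xmr_signals(x):
    """Individuals (XmR) control chart: flag points outside 3-sigma limits,
    with sigma estimated from the average moving range (common-cause noise)."""
    x = np.asarray(x, dtype=float)
    sigma = np.abs(np.diff(x)).mean() / 1.128   # d2 constant for n = 2
    center = x.mean()
    lcl, ucl = center - 3 * sigma, center + 3 * sigma
    special = np.where((x < lcl) | (x > ucl))[0]
    return special, (lcl, center, ucl)

# A point outside the limits signals a special cause worth asking "what happened?"
signals, limits = xmr_signals([5, 6, 5, 7, 6, 5, 6, 14, 6, 5])
```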
Conceptual design of an advanced Stirling conversion system for terrestrial power generation
NASA Technical Reports Server (NTRS)
1988-01-01
A free-piston Stirling engine coupled to an electric generator or alternator with a nominal kWe power output, absorbing thermal energy from a nominal 100 square meter parabolic solar collector and supplying electric power to a utility grid, was identified. The results of the conceptual design study of an Advanced Stirling Conversion System (ASCS) were documented. The objectives are as follows: define the ASCS configuration; provide a manufacturability and cost evaluation; predict ASCS performance over the range of solar input required to produce power; estimate system and major component weights; define engine and electrical power conditioning control requirements; and define key technology needs, not ready by the late 1980s, for meeting efficiency, life, cost, and weight goals for the ASCS.
Kraus, Johanna M.; Pletcher, Leanna T.; Vonesh, James R.
2010-01-01
1. Cross-ecosystem movements of resources, including detritus, nutrients and living prey, can strongly influence food web dynamics in recipient habitats. Variation in resource inputs is thought to be driven by factors external to the recipient habitat (e.g. donor habitat productivity and boundary conditions). However, inputs of or by ‘active’ living resources may be strongly influenced by recipient habitat quality when organisms exhibit behavioural habitat selection when crossing ecosystem boundaries. 2. To examine whether behavioural responses to recipient habitat quality alter the relative inputs of ‘active’ living and ‘passive’ detrital resources to recipient food webs, we manipulated the presence of caged predatory fish and measured biomass, energy and organic content of inputs to outdoor experimental pools of adult aquatic insects, frog eggs, terrestrial plant matter and terrestrial arthropods. 3. Caged fish reduced the biomass, energy and organic matter donated to pools by tree frog eggs by ∼70%, but did not alter insect colonisation or passive allochthonous inputs of terrestrial arthropods and plant material. Terrestrial plant matter and adult aquatic insects provided the most energy and organic matter inputs to the pools (40–50%), while terrestrial arthropods provided the least (7%). Inputs of frog eggs were relatively small but varied considerably among pools and over time (3%, range = 0–20%). Absolute and proportional amounts varied by input type. 4. Aquatic predators can strongly affect the magnitude of active, but not passive, inputs, and the effect of recipient habitat quality on active inputs is variable. Furthermore, some active inputs (i.e. aquatic insect colonists) can provide similar amounts of energy and organic matter as passive inputs of terrestrial plant matter, which are well known to be important. Because inputs differ in quality and the trophic level they subsidise, proportional changes in input type could have strong effects on recipient food webs. 5. Cross-ecosystem resource inputs have previously been characterised as donor-controlled. However, control by the recipient food web could lead to greater feedback between resource flow and consumer dynamics than has been appreciated so far.
Optimizing Approximate Weighted Matching on Nvidia Kepler K40
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naim, Md; Manne, Fredrik; Halappanavar, Mahantesh
Matching is a fundamental graph problem with numerous applications in science and engineering. While algorithms for computing optimal matchings are difficult to parallelize, approximation algorithms on the other hand generally compute high quality solutions and are amenable to parallelization. In this paper, we present efficient implementations of the current best algorithm for half-approximate weighted matching, the Suitor algorithm, on the Nvidia Kepler K-40 platform. We develop four variants of the algorithm that exploit hardware features to address key challenges for a GPU implementation. We also experiment with different combinations of work assigned to a warp. Using an exhaustive set of 269 inputs, we demonstrate that the new implementation outperforms the previous best GPU algorithm by 10 to 100× for over 100 instances, and from 100 to 1000× for 15 instances. We also demonstrate up to 20× speedup relative to 2 threads, and up to 5× relative to 16 threads on an Intel Xeon platform with 16 cores for the same algorithm. The new algorithms and implementations provided in this paper will have a direct impact on several applications that repeatedly use matching as a key compute kernel. Further, algorithm designs and insights provided in this paper will benefit other researchers implementing graph algorithms on modern GPU architectures.
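For readers unfamiliar with the Suitor algorithm, a serial Python sketch is given below; it captures the propose-and-displace logic that the paper parallelizes on the GPU, but none of the four GPU variants or warp-scheduling details. The graph encoding and tie-breaking here are assumptions.

```python
def suitor_matching(adj):
    """Serial sketch of the Suitor half-approximate weighted matching.
    `adj` maps each vertex to a dict of {neighbor: edge weight}."""
    suitor = {u: None for u in adj}   # best proposer seen by each vertex
    ws = {u: 0.0 for u in adj}        # weight of that best proposal
    for start in adj:
        u = start
        while u is not None:
            # heaviest neighbour that would still accept a proposal from u
            best, best_w = None, 0.0
            for v, w in adj[u].items():
                if w > ws[v] and w > best_w:
                    best, best_w = v, w
            nxt = None
            if best is not None:
                nxt = suitor[best]            # displaced previous suitor
                suitor[best], ws[best] = u, best_w
            u = nxt                           # displaced vertex re-proposes
    # matched pairs are mutual suitors
    return {tuple(sorted((u, v))) for u, v in suitor.items()
            if v is not None and suitor[v] == u}

graph = {"a": {"b": 10.0, "c": 8.0},
         "b": {"a": 10.0, "c": 9.0},
         "c": {"a": 8.0, "b": 9.0}}
print(suitor_matching(graph))   # {('a', 'b')}
```

The result matches the greedy algorithm over locally dominant edges, which is what gives the 1/2-approximation guarantee.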
Beck, Susan L; Eaton, Linda H; Echeverria, Christina; Mooney, Kathi H
2017-10-01
SymptomCare@Home, an integrated symptom monitoring and management system, was designed as part of randomized clinical trials to help patients with cancer who receive chemotherapy in ambulatory clinics and often experience significant symptoms at home. An iterative design process was informed by chronic disease management theory and features of assessment and clinical decision support systems used in other diseases. Key stakeholders participated in the design process: nurse scientists, clinical experts, bioinformatics experts, and computer programmers. Especially important was input from end users, patients, and nurse practitioners participating in a series of studies testing the system. The system includes both a patient and clinician interface and fully integrates two electronic subsystems: a telephone computer-linked interactive voice response system and a Web-based Decision Support-Symptom Management System. Key features include (1) daily symptom monitoring, (2) self-management coaching, (3) alerting, and (4) nurse practitioner follow-up. The nurse practitioner is distinctively positioned to provide assessment, education, support, and pharmacologic and nonpharmacologic interventions to intensify management of poorly controlled symptoms at home. SymptomCare@Home is a model for providing telehealth. The system facilitates using evidence-based guidelines as part of a comprehensive symptom management approach. The design process and system features can be applied to other diseases and conditions.
Piezoelectric particle accelerator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kemp, Mark A.; Jongewaard, Erik N.; Haase, Andrew A.
2017-08-29
A particle accelerator is provided that includes a piezoelectric accelerator element, where the piezoelectric accelerator element includes a hollow cylindrical shape, and an input transducer, where the input transducer is disposed to provide an input signal to the piezoelectric accelerator element, where the input signal induces a mechanical excitation of the piezoelectric accelerator element, where the mechanical excitation is capable of generating a piezoelectric electric field proximal to an axis of the cylindrical shape, where the piezoelectric accelerator is configured to accelerate a charged particle longitudinally along the axis of the cylindrical shape according to the piezoelectric electric field.
Overview of Heat Addition and Efficiency Predictions for an Advanced Stirling Convertor
NASA Technical Reports Server (NTRS)
Wilson, Scott D.; Reid, Terry; Schifer, Nicholas; Briggs, Maxwell
2011-01-01
Past methods of predicting net heat input needed to be validated. The validation effort pursued several paths, including improving model inputs, using test hardware to provide validation data, and validating high-fidelity models. Validation test hardware provided direct measurement of net heat input for comparison to predicted values. The predicted value of net heat input was 1.7 percent less than the measured value, and initial calculations of measurement uncertainty were 2.1 percent (under review). Lessons learned during the validation effort were incorporated into the convertor modeling approach, which improved predictions of convertor efficiency.
A study of remote sensing as applied to regional and small watersheds. Volume 1: Summary report
NASA Technical Reports Server (NTRS)
Ambaruch, R.
1974-01-01
The accuracy of remotely sensed measurements to provide inputs to hydrologic models of watersheds is studied. A series of sensitivity analyses on continuous simulation models of three watersheds determined: (1) Optimal values and permissible tolerances of inputs to achieve accurate simulation of streamflow from the watersheds; (2) Which model inputs can be quantified from remote sensing, directly, indirectly or by inference; and (3) How accurate remotely sensed measurements (from spacecraft or aircraft) must be to provide a basis for quantifying model inputs within permissible tolerances.
A novel approach for connecting temporal-ontologies with blood flow simulations.
Weichert, F; Mertens, C; Walczak, L; Kern-Isberner, G; Wagner, M
2013-06-01
In this paper, an approach for developing a temporal domain ontology for biomedical simulations is introduced. The ideas are presented in the context of simulations of blood flow in aneurysms using the Lattice Boltzmann Method. The advantages of using ontologies are manifold: on the one hand, ontologies have been proven able to provide specialized medical knowledge, e.g., key parameters for simulations. On the other hand, based on a set of rules and the use of a reasoner, a system for checking the plausibility as well as tracking the outcome of medical simulations can be constructed. Likewise, results of simulations, including data derived from them, can be stored and communicated in a way that can be understood by computers. Later on, this set of results can be analyzed. At the same time, the ontologies provide a way to exchange knowledge between researchers. Lastly, this approach can be seen as a black-box abstraction of the internals of the simulation for the biomedical researcher as well. This approach is able to provide the complete parameter sets for simulations, part of the corresponding results and part of their analysis, as well as, e.g., geometry and boundary conditions. These inputs can be transferred to different simulation methods for comparison. Variations on the provided parameters can be used automatically to drive these simulations. Using a rule base, unphysical inputs or outputs of the simulation can be detected and communicated to the physician in a suitable and familiar way. An example of an instantiation of the blood flow simulation ontology and exemplary rules for plausibility checking are given. Copyright © 2013 Elsevier Inc. All rights reserved.
Chimberengwa, Pugie Tawanda; Masuka, Nyasha; Gombe, Notion Tafara; Tshimanga, Mufuta; Takundwa, Lucia; Bangure, Donewell
2015-01-01
Matabeleland South launched the malaria pre-elimination campaign in 2012, but provincial spraying coverage has failed to attain the 95% target, with some districts still encountering malaria outbreaks. A study was conducted to evaluate program performance against achieving malaria pre-elimination. A descriptive cross-sectional study was done in 5 districts carrying out IRS, using the logical framework involving inputs, process, outputs and outcome evaluation. Health workers recruited into the study included direct program implementers and district and provincial program managers. An interviewer-administered questionnaire, checklists, a key informant interview guide and a desk review of records were used to collect data. We enrolled 37 primary respondents and 5 key informants. Pre-elimination and Epidemic Preparedness and Response plans were absent in all districts. Shortages of inputs were reported by 97% of respondents, with districts receiving 80% of the requested budget. Insecticides were procured centrally at the national level. Spraying started late and districts failed to spray all targeted households by the end of December. The province is using makeshift camps with inappropriate evaporation ponds where liquid DDT waste is not safely accounted for. The provincial IRS coverage for 2011 was 84%. Challenges cited included food shortages for spraymen, late delivery of inputs and the poor state of IRS equipment. The province has failed to achieve malaria pre-elimination IRS coverage targets for the 2011/12 season. Financial and logistical challenges led to delays in the supply of program inputs and in the recruitment and training of sprayers. The province should establish camping infrastructure with standard evaporation ponds to minimise contamination of the environment.
The Construct of Attention in Schizophrenia
Luck, Steven J.; Gold, James M.
2008-01-01
Schizophrenia is widely thought to involve deficits of attention. However, the term attention can be defined so broadly that impaired performance on virtually any task could be construed as evidence for a deficit in attention, and this has slowed cumulative progress in understanding attention deficits in schizophrenia. To address this problem, we divide the general concept of attention into two distinct constructs: input selection, the selection of task-relevant inputs for further processing; and rule selection, the selective activation of task-appropriate rules. These constructs are closely tied to working memory, because input selection mechanisms are used to control the transfer of information into working memory and because working memory stores the rules used by rule selection mechanisms. These constructs are also closely tied to executive function, because executive systems are used to guide input selection and because rule selection is itself a key aspect of executive function. Within the domain of input selection, it is important to distinguish between the control of selection—the processes that guide attention to task-relevant inputs—and the implementation of selection—the processes that enhance the processing of the relevant inputs and suppress the irrelevant inputs. Current evidence suggests that schizophrenia involves a significant impairment in the control of selection but little or no impairment in the implementation of selection. Consequently, the CNTRICS participants agreed by consensus that attentional control should be a priority target for measurement and treatment research in schizophrenia. PMID:18374901
670 GHz Schottky Diode Based Subharmonic Mixer with CPW Circuits and 70 GHz IF
NASA Technical Reports Server (NTRS)
Chattopadhyay, Goutam (Inventor); Schlecht, Erich T. (Inventor); Lee, Choonsup (Inventor); Lin, Robert H. (Inventor); Gill, John J. (Inventor); Sin, Seth (Inventor); Mehdi, Imran (Inventor)
2014-01-01
A coplanar waveguide (CPW) based subharmonic mixer working at 670 GHz using GaAs Schottky diodes. One example of the mixer has an LO input, an RF input, and an IF output. Another possible mixer has an LO input, an IF input, and an RF output. Each input or output is connected to a coplanar waveguide with a matching network. A pair of antiparallel diodes provides a signal at twice the LO frequency, which is then mixed with a second signal to provide signals having sum and difference frequencies. The output signal of interest is received after passing through a bandpass filter tuned to the frequency range of interest.
Prediction of surface distress using neural networks
NASA Astrophysics Data System (ADS)
Hamdi, Hadiwardoyo, Sigit P.; Correia, A. Gomes; Pereira, Paulo; Cortez, Paulo
2017-06-01
Road infrastructures contribute to a healthy economy through the sustainable distribution of goods and services. A road network requires appropriately programmed maintenance treatments in order to keep road assets in good condition, providing maximum safety for road users under a cost-effective approach. Surface distress is the key element for identifying road condition and may be generated by many different factors. In this paper, a new data-driven approach is proposed to predict Surface Distress Index (SDI) values. The model is then applied using data obtained from the Integrated Road Management System (IRMS) database. Artificial Neural Networks (ANNs) are used to predict the SDI using input variables related to surface distress, i.e., crack area and width, pothole, rutting, patching and depression. The results show that the ANN is able to predict SDI with a high correlation factor (R2 = 0.996). Moreover, a sensitivity analysis was applied to the ANN model, revealing the influence of the most relevant input parameters for SDI prediction, namely rutting (59.8%), crack width (29.9%), crack area (5.0%), patching (3.0%), pothole (1.7%) and depression (0.3%).
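A minimal sketch of this kind of model, an MLP regressor over the six distress inputs followed by a permutation-based sensitivity analysis, is shown below using scikit-learn. The data here are synthetic stand-ins (the study uses IRMS records), and the network size, training settings, and importance method are assumptions rather than the authors' configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.inspection import permutation_importance

# Hypothetical stand-in data; the actual study draws these inputs from IRMS.
features = ["crack_area", "crack_width", "pothole",
            "rutting", "patching", "depression"]
rng = np.random.default_rng(0)
X = rng.random((500, len(features)))
sdi = 0.6 * X[:, 3] + 0.3 * X[:, 1] + 0.1 * X[:, 0] + 0.01 * rng.normal(size=500)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                                   random_state=0))
model.fit(X, sdi)

# Permutation importance as a simple sensitivity analysis over the inputs.
imp = permutation_importance(model, X, sdi, n_repeats=10, random_state=0)
for name, score in sorted(zip(features, imp.importances_mean),
                          key=lambda p: -p[1]):
    print(f"{name}: {score:.3f}")
```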
Implementation of a Trailing-Edge Flap Analysis Model in the NASA Langley CAMRAD.MOD1/Hires Program
NASA Technical Reports Server (NTRS)
Charles, Bruce
1999-01-01
Continual advances in rotorcraft performance, vibration and acoustic characteristics are being sought by rotary-wing vehicle manufacturers to improve efficiency, handling qualities and community noise acceptance of their products. The rotor system aerodynamic and dynamic behavior are among the key factors which must be addressed to meet the desired goals. Rotor aerodynamicists study how airload redistribution impacts performance and noise, and seek ways to achieve better airload distribution through changes in local aerodynamic response characteristics. One method currently receiving attention is the use of trailing-edge flaps mounted on the rotor blades to provide direct control of a portion of the spanwise lift characteristics. The following work describes the incorporation of a trailing-edge flap model in the CAMRAD.Mod1/HIRES comprehensive rotorcraft analysis code. The CAMRAD.Mod1/HIRES analysis consists of three separate executable codes. These include the comprehensive trim analysis, CAMRAD.Mod1, the Indicial Post-Processor, IPP, for high resolution airloads, and AIRFOIL, which produces the rotor airfoil tables from input airfoil section characteristics. The modifications made to these components permitting analysis of flapped rotor configurations are documented herein along with user instructions detailing the new input variables and operational notes.
Illusions of team working in health care.
West, Michael A; Lyubovnikova, Joanne
2013-01-01
The ubiquity and value of teams in healthcare are well acknowledged. However, in practice, healthcare teams vary dramatically in their structures and effectiveness in ways that can damage team processes and patient outcomes. The aim of this paper is to highlight these characteristics and to extrapolate several important aspects of teamwork that have a powerful impact on team effectiveness across healthcare contexts. The paper draws upon the literature from health services management and organisational behaviour to provide an overview of the current science of healthcare teams. Underpinned by the input-process-output framework of team effectiveness, team composition, team task, and organisational support are viewed as critical inputs that influence key team processes including team objectives, leadership and reflexivity, which in turn impact staff and patient outcomes. Team training interventions and care pathways can facilitate more effective interdisciplinary teamwork. The paper argues that the prevalence of the term "team" in healthcare makes the synthesis and advancement of the scientific understanding of healthcare teams a challenge. Future research therefore needs to better define the fundamental characteristics of teams in studies in order to ensure that findings based on real teams, rather than pseudo-like groups, are accumulated.
Metronome LKM: An open source virtual keyboard driver to measure experiment software latencies.
Garaizar, Pablo; Vadillo, Miguel A
2017-10-01
Experiment software is often used to measure reaction times gathered with keyboards or other input devices. In previous studies, the accuracy and precision of time stamps has been assessed through several means: (a) generating accurate square wave signals from an external device connected to the parallel port of the computer running the experiment software, (b) triggering the typematic repeat feature of some keyboards to get an evenly separated series of keypress events, or (c) using a solenoid handled by a microcontroller to press the input device (keyboard, mouse button, touch screen) that will be used in the experimental setup. Despite the advantages of these approaches in some contexts, none of them can isolate the measurement error caused by the experiment software itself. Metronome LKM provides a virtual keyboard to assess an experiment's software. Using this open source driver, researchers can generate keypress events using high-resolution timers and compare the time stamps collected by the experiment software with those gathered by Metronome LKM (with nanosecond resolution). Our software is highly configurable (in terms of keys pressed, intervals, SysRq activation) and runs on 2.6-4.8 Linux kernels.
Increasing the odds: applying emergentist theory in language intervention.
Poll, Gerard H
2011-10-01
This review introduces emergentism, which is a leading theory of language development that states that language ability is the product of interactions between the child's language environment and his or her learning capabilities. The review suggests ways in which emergentism provides a theoretical rationale for interventions that are designed to address developmental language delays in young children. A review of selected literature on emergentist theory and research is presented, with a focus on the acquisition of early morphology and syntax. A significant method for developing and testing emergentist theory, connectionist modeling, is described. Key themes from both connectionist and behavioral studies are summarized and applied with specific examples to language intervention techniques. A case study is presented to integrate elements of emergentism with language intervention. Evaluating the theoretical foundation for language interventions is an important step in evidence-based practice. This article introduces three themes in the emergentist literature that have implications for language intervention: (a) sufficiency of language input, (b) active engagement of the child with the input, and (c) factors that increase the odds for correctly mapping language form to meaning. Evidence supporting the importance of these factors in effective language intervention is presented, along with limitations in that evidence.
NASA Astrophysics Data System (ADS)
Waeles, Matthieu; Planquette, Hélène; Afandi, Imane; Delebecque, Nina; Bouthir, Fatimazohra; Donval, Anne; Shelley, Rachel U.; Auger, Pierre-Amaël; Riso, Ricardo D.; Tito de Morais, Luis
2016-05-01
In this study, we report the distributions of total dissolvable cadmium and particulate cadmium from 27 stations in southern Moroccan coastal waters (22°N-30°N), which is part of the North-West African upwelling system. These distributions were predominantly controlled by upwelling of the North Atlantic Central Waters (NACWs) and uptake by primary production. Atmospheric inputs and phosphogypsum slurry inputs from the phosphate industry at Jorf Lasfar (33°N), recently estimated as an important source of dissolved cadmium (240 t Cd yr-1), are at best of minor importance for the studied waters. Our study provides new insights into the mechanisms fractionating cadmium from phosphate. In the upper 30 m, the anomalies observed in terms of Cd:P ratios in both the particulate and total dissolvable fractions were related to an overall preferential uptake of phosphate. We show that the type of phytoplanktonic assemblage (diatoms versus dinoflagellates) is also a determinant of the fractionation intensity. In subsurface waters (30-60 m), a clear preferential release of P (versus Cd) was observed indicating that remineralization in Oxygen Minimum Zones is a key process in sequestering Cd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.
2014-02-01
This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences of each of the selected uncertain parameters, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as the failure rate of the Safety Relief Valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependent effect of cesium chemical form for different accident progressions. (auth)
NASA Astrophysics Data System (ADS)
Rothman, D. S.; Siraj, A.; Hughes, B.
2013-12-01
The international research community is currently in the process of developing new scenarios for climate change research. One component of these scenarios is the Shared Socio-economic Pathways (SSPs), which describe a set of possible future socioeconomic conditions. These are presented in narrative storylines with associated quantitative drivers. The core quantitative drivers include total population, average GDP per capita, educational attainment, and urbanization at the global, regional, and national levels. At the same time, there have been calls, particularly by the IAV community, for the SSPs to include additional quantitative information on other key social factors, such as income inequality, governance, health, and access to key infrastructures, which are discussed in the narratives. The International Futures system (IFs), based at the Pardee Center at the University of Denver, is able to provide forecasts of many of these indicators. IFs cannot use the SSP drivers as exogenous inputs, but we are able to create development pathways that closely reproduce the core quantitative drivers defined by the different SSPs, as well as incorporating assumptions on other key driving factors described in the qualitative narratives. In this paper, we present forecasts for additional quantitative indicators based upon the implementation of the SSP development pathways in IFs. These results will be of value to many researchers.
Estimation of end point foot clearance points from inertial sensor data.
Santhiranayagam, Braveena K; Lai, Daniel T H; Begg, Rezaul K; Palaniswami, Marimuthu
2011-01-01
Foot clearance parameters provide useful insight into tripping risks during walking. This paper proposes a technique for the estimation of key foot clearance parameters using inertial sensor (accelerometers and gyroscopes) data. Fifteen features were extracted from raw inertial sensor measurements, and a regression model was used to estimate two key foot clearance parameters: the first maximum vertical clearance (mx1) after toe-off and the Minimum Toe Clearance (MTC) of the swing foot. Comparisons are made against measurements obtained using an optoelectronic motion capture system (Optotrak) at 4 different walking speeds. General Regression Neural Networks (GRNN) were used to estimate the desired parameters from the sensor features. Eight subjects' foot clearance data were examined and a Leave-one-subject-out (LOSO) method was used to select the best model. The best average Root Mean Square Errors (RMSE) across all subjects obtained using all sensor features at the maximum speed for mx1 was 5.32 mm and for MTC was 4.04 mm. Further application of a hill-climbing feature selection technique resulted in 0.54-21.93% improvement in RMSE and required fewer input features. The results demonstrated that using raw inertial sensor data with regression models and feature selection could accurately estimate key foot clearance parameters.
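A GRNN is essentially a Gaussian-kernel-weighted average of training targets (a Nadaraya-Watson estimator), which can be sketched in a few lines. The feature values, smoothing width, and train/test split below are illustrative assumptions, not the study's data or tuning.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=1.0):
    """General Regression Neural Network: a Gaussian-kernel-weighted average
    of training targets (the Nadaraya-Watson estimator)."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    return (K @ y_train) / K.sum(axis=1)

# Hypothetical stand-in: 15 inertial-sensor features -> MTC in millimetres.
rng = np.random.default_rng(0)
X = rng.random((200, 15))
mtc = 20 + 5 * X[:, 0] - 3 * X[:, 1] + rng.normal(scale=0.5, size=200)
pred = grnn_predict(X[:150], mtc[:150], X[150:])
rmse = np.sqrt(np.mean((pred - mtc[150:]) ** 2))
print(f"RMSE on held-out samples: {rmse:.2f} mm")
```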
Prediction of Geomagnetic Activity and Key Parameters in High-Latitude Ionosphere-Basic Elements
NASA Technical Reports Server (NTRS)
Lyatsky, W.; Khazanov, G. V.
2007-01-01
Prediction of geomagnetic activity and related events in the Earth's magnetosphere and ionosphere is an important task of the Space Weather program. Prediction reliability is dependent on the prediction method and elements included in the prediction scheme. Two main elements are a suitable geomagnetic activity index and a coupling function -- the combination of solar wind parameters providing the best correlation between upstream solar wind data and geomagnetic activity. The appropriate choice of these two elements is imperative for any reliable prediction model. The purpose of this work was to elaborate on these two elements -- the appropriate geomagnetic activity index and the coupling function -- and investigate the opportunity to improve the reliability of the prediction of geomagnetic activity and other events in the Earth's magnetosphere. The new polar magnetic index of geomagnetic activity and the new version of the coupling function lead to a significant increase in the reliability of predicting geomagnetic activity and some key parameters, such as cross-polar cap voltage and total Joule heating in the high-latitude ionosphere, which play a very important role in the development of geomagnetic and other activity in the Earth's magnetosphere, and are widely used as key input parameters in modeling magnetospheric, ionospheric, and thermospheric processes.
Soil Organic Matter in Its Native State: Unravelling the Most Complex Biomaterial on Earth.
Masoom, Hussain; Courtier-Murias, Denis; Farooq, Hashim; Soong, Ronald; Kelleher, Brian P; Zhang, Chao; Maas, Werner E; Fey, Michael; Kumar, Rajeev; Monette, Martine; Stronks, Henry J; Simpson, Myrna J; Simpson, André J
2016-02-16
Since the isolation of soil organic matter in 1786, tens of thousands of publications have searched for its structure. Nuclear magnetic resonance (NMR) spectroscopy has played a critical role in defining soil organic matter, but traditional approaches remove key information such as the distribution of components at the soil-water interface and conformational information. Here a novel form of NMR with capabilities to study all physical phases, termed Comprehensive Multiphase NMR, is applied to analyze soil in its natural swollen state. The key structural components in soil organic matter are identified to be largely composed of macromolecular inputs from degrading biomass. Polar lipid heads and carbohydrates dominate the soil-water interface while lignin and microbes are arranged in a more hydrophobic interior. Lignin domains cannot be penetrated by aqueous solvents even at extreme pH, indicating they are the most hydrophobic environment in soil and are ideal for sequestering hydrophobic contaminants. Here, for the first time, a complete range of physical states of a whole soil can be studied. This provides a more detailed understanding of soil organic matter at the molecular level, which is itself key to developing the most efficient soil remediation and agricultural techniques and to better predicting carbon sequestration and climate change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoogmartens, Rob, E-mail: rob.hoogmartens@uhasselt.be; Van Passel, Steven, E-mail: steven.vanpassel@uhasselt.be; Van Acker, Karel, E-mail: karel.vanacker@lrd.kuleuven.be
Increasing interest in sustainability has led to the development of sustainability assessment tools such as Life Cycle Analysis (LCA), Life Cycle Costing (LCC) and Cost–Benefit Analysis (CBA). Due to methodological disparity of these three tools, conflicting assessment results generate confusion for many policy and business decisions. In order to interpret and integrate assessment results, the paper provides a framework that clarifies the connections and coherence between the included assessment methodologies. Building on this framework, the paper further focuses on key aspects to adapt any of the methodologies to full sustainability assessments. Aspects dealt with in the review are, for example, the reported metrics, the scope, data requirements, discounting, product- or project-related and approaches with respect to scarcity and labor requirements. In addition to these key aspects, the review shows that important connections exist: (i) the three tools can cope with social inequality, (ii) processes such as valuation techniques for LCC and CBA are common, (iii) Environmental Impact Assessment (EIA) is used as input in both LCA and CBA and (iv) LCA can be used in parallel with LCC. Furthermore, the most integrated sustainability approach combines elements of LCA and LCC to achieve the Life Cycle Sustainability Assessment (LCSA). The key aspects and the connections referred to in the review are illustrated with a case study on the treatment of end-of-life automotive glass. - Highlights: • Proliferation of assessment tools creates ambiguity and confusion. • The developed assessment framework clarifies connections between assessment tools. • Broadening LCA, key aspects are metric and data requirements. • Broadening LCC, key aspects are scope, time frame and discounting. • Broadening CBA, focus point, timespan, references, labor and scarcity are key.
Ming, Y; Peiwen, Q
2001-03-01
Understanding ultrasonic motor performance as a function of input parameters, such as the voltage amplitude, driving frequency and the preload on the rotor, is key to many applications and to the control of ultrasonic motors. This paper presents performance estimation of the piezoelectric rotary traveling wave ultrasonic motor as a function of input voltage amplitude, driving frequency and preload. The Love equation is used to derive the traveling wave amplitude on the stator surface. With the contact model of the distributed spring-rigid body between the stator and rotor, a two-dimensional analytical model of the rotary traveling wave ultrasonic motor is constructed. Then the performance in terms of steady rotation speed and stall torque is deduced. Using MATLAB and an iteration algorithm, we estimate the rotation speed and stall torque versus the input parameters respectively. The same experiments are completed with an optoelectronic tachometer and stand weight. Both estimation and experiment results reveal the pattern of performance variation as a function of the input parameters.
Improved Neural Networks with Random Weights for Short-Term Load Forecasting
Lang, Kun; Zhang, Mingyuan; Yuan, Yongbo
2015-01-01
An effective forecasting model for short-term load plays a significant role in promoting the management efficiency of an electric power system. This paper proposes a new forecasting model based on the improved neural networks with random weights (INNRW). The key is to introduce a weighting technique for the inputs of the model and use a novel neural network to forecast the daily maximum load. Eight factors are selected as the inputs. A mutual information weighting algorithm is then used to allocate different weights to the inputs. The neural network with random weights and kernels (KNNRW) is applied to approximate the nonlinear function between the selected inputs and the daily maximum load owing to its fast learning speed and good generalization performance. In an application to the daily load in Dalian, the result of the proposed INNRW is compared with several previously developed forecasting models. The simulation experiment shows that the proposed model performs the best overall in short-term load forecasting. PMID:26629825
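A minimal sketch of the two ideas named in this abstract, not the authors' INNRW/KNNRW code: inputs are weighted by their mutual information with the target, and a single-hidden-layer network with random hidden weights is fitted by solving the output weights analytically. Data, layer size, and all names are illustrative assumptions.

```python
# Minimal sketch (not the authors' exact INNRW/KNNRW formulation): weight each
# input by its mutual information with the target, then fit a network whose
# hidden weights are random and whose output weights come from least squares.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)

# Toy data: 8 candidate inputs, daily maximum load as target.
X = rng.normal(size=(500, 8))
y = 2.0 * X[:, 0] + np.sin(X[:, 1]) + 0.1 * rng.normal(size=500)

# 1) Mutual-information weighting of the inputs.
mi = mutual_info_regression(X, y, random_state=0)
w = mi / mi.sum()                      # normalized weights
Xw = X * w                             # weighted inputs

# 2) Neural network with random weights: random hidden layer, analytic output layer.
n_hidden = 50
W_in = rng.normal(size=(Xw.shape[1], n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(Xw @ W_in + b)                            # random hidden features
beta, *_ = np.linalg.lstsq(H, y, rcond=None)          # output weights

y_hat = H @ beta
print("training RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```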
Cell type-specific long-range connections of basal forebrain circuit.
Do, Johnny Phong; Xu, Min; Lee, Seung-Hee; Chang, Wei-Cheng; Zhang, Siyu; Chung, Shinjae; Yung, Tyler J; Fan, Jiang Lan; Miyamichi, Kazunari; Luo, Liqun; Dan, Yang
2016-09-19
The basal forebrain (BF) plays key roles in multiple brain functions, including sleep-wake regulation, attention, and learning/memory, but the long-range connections mediating these functions remain poorly characterized. Here we performed whole-brain mapping of both inputs and outputs of four BF cell types - cholinergic, glutamatergic, and parvalbumin-positive (PV+) and somatostatin-positive (SOM+) GABAergic neurons - in the mouse brain. Using rabies virus-mediated monosynaptic retrograde tracing to label the inputs and adeno-associated virus to trace axonal projections, we identified numerous brain areas connected to the BF. The inputs to different cell types were qualitatively similar, but the output projections showed marked differences. The connections to glutamatergic and SOM+ neurons were strongly reciprocal, while those to cholinergic and PV+ neurons were more unidirectional. These results reveal the long-range wiring diagram of the BF circuit with highly convergent inputs and divergent outputs and point to both functional commonality and specialization of different BF cell types.
PCDAQ, A Windows Based DAQ System
NASA Astrophysics Data System (ADS)
Hogan, Gary
1998-10-01
PCDAQ is a Windows NT based general DAQ/Analysis/Monte Carlo shell developed as part of the Proton Radiography project at LANL (Los Alamos National Laboratory). It has been adopted by experiments outside of the Proton Radiography project at Brookhaven National Laboratory (BNL) and at LANL. The program provides DAQ, Monte Carlo, and replay (disk file input) modes. Data can be read from hardware (CAMAC) or other programs (ActiveX servers). Future versions will read VME. User supplied data analysis routines can be written in Fortran, C++, or Visual Basic. Histogramming, testing, and plotting packages are provided. Histogram data can be exported to spreadsheets or analyzed in user supplied programs. Plots can be copied and pasted as bitmap objects into other Windows programs or printed. A text database keyed by the run number is provided. Extensive software control flags are provided so that the user can control the flow of data through the program. Control flags can be set either in script command files or interactively. The program can be remotely controlled and data accessed over the Internet through its ActiveX DCOM interface.
NASA Astrophysics Data System (ADS)
Lan, Ganhui; Tu, Yuhai
2016-05-01
Living systems have to constantly sense their external environment and adjust their internal state in order to survive and reproduce. Biological systems, from as complex as the brain to a single E. coli cell, have to process these data in order to make appropriate decisions. How do biological systems sense external signals? How do they process the information? How do they respond to signals? Through years of intense study by biologists, many key molecular players and their interactions have been identified in different biological machineries that carry out these signaling functions. However, an integrated, quantitative understanding of the whole system is still lacking for most cellular signaling pathways, not to mention the more complicated neural circuits. To study signaling processes in biology, the key thing to measure is the input-output relationship. The input is the signal itself, such as chemical concentration, external temperature, light (intensity and frequency), and more complex signals such as the face of a cat. The output can be protein conformational changes and covalent modifications (phosphorylation, methylation, etc), gene expression, cell growth and motility, as well as more complex output such as neuron firing patterns and behaviors of higher animals. Due to the inherent noise in biological systems, the measured input-output dependence is often noisy. These noisy data can be analysed by using powerful tools and concepts from information theory such as mutual information, channel capacity, and the maximum entropy hypothesis. This information theory approach has been successfully used to reveal the underlying correlations between key components of biological networks, to set bounds for network performance, and to understand possible network architecture in generating observed correlations. Although the information theory approach provides a general tool in analysing noisy biological data and may be used to suggest possible network architectures in preserving information, it does not reveal the underlying mechanism that leads to the observed input-output relationship, nor does it tell us much about which information is important for the organism and how biological systems use information to carry out specific functions. To do that, we need to develop models of the biological machineries, e.g. biochemical networks and neural networks, to understand the dynamics of biological information processes. This is a much more difficult task. It requires deep knowledge of the underlying biological network—the main players (nodes) and their interactions (links)—in sufficient detail to build a model with predictive power, as well as quantitative input-output measurements of the system under different perturbations (both genetic variations and different external conditions) to test the model predictions to guide further development of the model. Due to the recent growth of biological knowledge thanks in part to high throughput methods (sequencing, gene expression microarray, etc) and development of quantitative in vivo techniques such as various fluorescence technologies, these requirements are starting to be realized in different biological systems. The possible close interaction between quantitative experimentation and theoretical modeling has made systems biology an attractive field for physicists interested in quantitative biology.
In this review, we describe some of the recent work in developing a quantitative predictive model of bacterial chemotaxis, which can be considered as the hydrogen atom of systems biology. Using statistical physics approaches, such as the Ising model and Langevin equation, we study how bacteria, such as E. coli, sense and amplify external signals, how they keep a working memory of the stimuli, and how they use these data to compute the chemical gradient. In particular, we will describe how E. coli cells avoid cross-talk in a heterogeneous receptor cluster to keep a ligand-specific memory. We will also study the thermodynamic costs of adaptation for cells to maintain an accurate memory. The statistical physics based approach described here should be useful in understanding design principles for cellular biochemical circuits in general.
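For readers unfamiliar with the information-theoretic quantities mentioned above, the short sketch below estimates the mutual information between a noisy input-output pair with a plain 2-D histogram estimator. The toy signalling model and bin count are arbitrary choices for illustration, not taken from the review.

```python
# Illustrative only: estimate mutual information (in bits) between a noisy
# input-output pair using a simple 2-D histogram estimator.
import numpy as np

rng = np.random.default_rng(1)
s = rng.uniform(0.0, 1.0, size=100_000)          # input signal (e.g. ligand level)
r = s + 0.2 * rng.normal(size=s.size)            # noisy response

def mutual_information(x, y, bins=40):
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

print("I(s; r) ≈", round(mutual_information(s, r), 2), "bits")
```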
Variable Delay Element For Jitter Control In High Speed Data Links
Livolsi, Robert R.
2002-06-11
A circuit and method for decreasing the amount of jitter present at the receiver input of high speed data links, which uses a driver circuit for input from a high speed data link comprising a logic circuit having a first section (1) which provides data latches, a second section (2) which provides a circuit that generates a pre-distorted output and compensates for level-dependent jitter, having an OR function element and a NOR function element each of which is coupled to two inputs and to a variable delay element as an input which provides a bi-modal delay for pulse width pre-distortion, a third section (3) which provides a muxing circuit, and a fourth section (4) for clock distribution in the driver circuit. A fifth section is used for logic testing the driver circuit.
Dual physiological rate measurement instrument
NASA Technical Reports Server (NTRS)
Cooper, Tommy G. (Inventor)
1990-01-01
The object of the invention is to provide an instrument for converting a physiological pulse rate into a corresponding linear output voltage. The instrument which accurately measures the rate of an unknown rectangular pulse wave over an extended range of values comprises a phase-locked loop including a phase comparator, a filtering network, and a voltage-controlled oscillator, arranged in cascade. The phase comparator has a first input responsive to the pulse wave and a second input responsive to the output signal of the voltage-controlled oscillator. The comparator provides a signal dependent on the difference in phase and frequency between the signals appearing on the first and second inputs. A high-input impedance amplifier accepts an output from the filtering network and provides an amplified output DC signal to a utilization device for providing a measurement of the rate of the pulse wave.
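As a toy illustration of the locked behaviour described above: once the loop is in lock, the voltage-controlled oscillator tracks the incoming pulse rate, so for a linear VCO the filtered control voltage maps rate to voltage. The VCO constants below are invented for the example and are not taken from the patent.

```python
# Steady-state mapping from pulse rate to control voltage for a locked PLL with
# a linear VCO. f0 and the VCO gain are assumed values for illustration only.
f0_hz = 0.5            # VCO free-running frequency (assumed)
k_vco_hz_per_v = 2.0   # VCO gain (assumed)

def rate_to_voltage(pulse_rate_hz: float) -> float:
    """Control voltage once the loop has locked onto the pulse wave."""
    return (pulse_rate_hz - f0_hz) / k_vco_hz_per_v

for bpm in (40, 72, 180):              # physiological rates in beats per minute
    print(bpm, "bpm ->", round(rate_to_voltage(bpm / 60.0), 3), "V")
```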
Rui, Yichao; Murphy, Daniel V; Wang, Xiaoli; Hoyle, Frances C
2016-10-18
Rebuilding 'lost' soil carbon (C) is a priority in mitigating climate change and underpinning key soil functions that support ecosystem services. Microorganisms determine if fresh C input is converted into stable soil organic matter (SOM) or lost as CO2. Here we quantified if microbial biomass and respiration responded positively to addition of light fraction organic matter (LFOM, representing recent inputs of plant residue) in an infertile semi-arid agricultural soil. Field trial soil with different historical plant residue inputs [soil C content: control (tilled) = 9.6 t C ha-1 versus tilled + plant residue treatment (tilled + OM) = 18.0 t C ha-1] were incubated in the laboratory with a gradient of LFOM equivalent to 0 to 3.8 t C ha-1 (0 to 500% LFOM). Microbial biomass C significantly declined under increased rates of LFOM addition while microbial respiration increased linearly, leading to a decrease in the microbial C use efficiency. We hypothesise this was due to insufficient nutrients to form new microbial biomass as LFOM input increased the ratio of C to nitrogen, phosphorus and sulphur of soil. Increased CO2 efflux but constrained microbial growth in response to LFOM input demonstrated the difficulty for C storage in this environment.
Incorporating Grasslands into Cropping Systems: What are the Keys?
USDA-ARS?s Scientific Manuscript database
American agriculture in the 20th century has been shaped by social/political, economic, environmental and technological drivers. During this time, American agricultural systems became increasingly specialized and input driven resulting in agricultural production being dominated by ‘commodity crop p...
Highly efficient model updating for structural condition assessment of large-scale bridges.
DOT National Transportation Integrated Search
2015-02-01
For efficiently updating models of large-scale structures, the response surface (RS) method based on radial basis functions (RBFs) is proposed to model the input-output relationship of structures. The key issues for applying the proposed method a...
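A bare-bones sketch of the idea behind an RBF response surface, not the report's code: fit Gaussian radial basis functions to input-output samples of an expensive structural model and then query the cheap surrogate during model updating. Centres, widths, and the stand-in "model" are placeholder assumptions.

```python
# Gaussian RBF response surface fitted by least squares to samples of an
# expensive structural model; the surrogate is then evaluated in its place.
import numpy as np

rng = np.random.default_rng(2)

def expensive_model(x):                      # stand-in for a finite-element run
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

X = rng.uniform(-1, 1, size=(60, 2))         # sampled model-updating parameters
y = expensive_model(X)

centres, width = X, 0.5                      # one basis function per sample

def phi(A, B, width):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

weights, *_ = np.linalg.lstsq(phi(X, centres, width), y, rcond=None)

X_new = rng.uniform(-1, 1, size=(5, 2))
print("surrogate:", phi(X_new, centres, width) @ weights)
print("true     :", expensive_model(X_new))
```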
Brain Friendly School Libraries
ERIC Educational Resources Information Center
Sykes, Judith Anne
2006-01-01
This title gives concrete practical examples of how to align school library programs and instructional practice with the six key concepts of brain-compatible learning: increasing input to the brain; increasing experiential data; multiple source feedback; reducing threat; involving students in learning decision making; and interdisciplinary unit…
The EPA Control Strategy Tool (CoST) is a software tool for projecting potential future control scenarios, their effects on emissions and estimated costs. This tool uses the NEI and the Control Measures Dataset as key inputs. CoST outputs are projections of future control scenarios.
Teaching Additional Languages. Educational Practices Series 6.
ERIC Educational Resources Information Center
Judd, Elliot L.; Tan, Lihua; Walberg, Herbert J.
This booklet describes key principles of and research on teaching additional languages. The 10 chapters focus on the following: (1) "Comprehensible Input" (learners need exposure to meaningful, understandable language); (2) "Language Opportunities" (classroom activities should let students use natural and meaningful language with their…
Neural Correlates of Sensory Substitution in Vestibular Pathways Following Complete Vestibular Loss
Sadeghi, Soroush G.; Minor, Lloyd B.; Cullen, Kathleen E.
2012-01-01
Sensory substitution is the term typically used in reference to sensory prosthetic devices designed to replace input from one defective modality with input from another modality. Such devices allow an alternative encoding of sensory information that is no longer directly provided by the defective modality in a purposeful and goal-directed manner. The behavioral recovery that follows complete vestibular loss is impressive and has long been thought to take advantage of a natural form of sensory substitution in which head motion information is no longer provided by vestibular inputs, but instead by extra-vestibular inputs such as proprioceptive and motor efference copy signals. Here we examined the neuronal correlates of this behavioral recovery after complete vestibular loss in alert behaving monkeys (Macaca mulatta). We show for the first time that extra-vestibular inputs substitute for the vestibular inputs to stabilize gaze at the level of single neurons in the VOR premotor circuitry. The summed weighting of neck proprioceptive and efference copy information was sufficient to explain simultaneously observed behavioral improvements in gaze stability. Furthermore, by altering correspondence between intended and actual head movement we revealed a four-fold increase in the weight of neck motor efference copy signals consistent with the enhanced behavioral recovery observed when head movements are voluntary versus unexpected. Thus, taken together, our results provide direct evidence that the substitution by extra-vestibular inputs in vestibular pathways provides a neural correlate for the improvements in gaze stability that are observed following the total loss of vestibular inputs. PMID:23077054
FSCATT: Angular Dependence and Filter Options.
The input routines to the code have been completely rewritten to allow for a free-form input format. The input routines now provide self-consistency checks and diagnostics for the user's edification.
NASA Astrophysics Data System (ADS)
Zhuo, L.; Mekonnen, M. M.; Hoekstra, A. Y.
2014-06-01
Water Footprint Assessment is a fast-growing field of research, but as yet little attention has been paid to the uncertainties involved. This study investigates the sensitivity of and uncertainty in crop water footprint (in m3 t-1) estimates related to uncertainties in important input variables. The study focuses on the green (from rainfall) and blue (from irrigation) water footprint of producing maize, soybean, rice, and wheat at the scale of the Yellow River basin in the period 1996-2005. A grid-based daily water balance model at a 5 by 5 arcmin resolution was applied to compute green and blue water footprints of the four crops in the Yellow River basin in the period considered. The one-at-a-time method was carried out to analyse the sensitivity of the crop water footprint to fractional changes of seven individual input variables and parameters: precipitation (PR), reference evapotranspiration (ET0), crop coefficient (Kc), crop calendar (planting date with constant growing degree days), soil water content at field capacity (Smax), yield response factor (Ky) and maximum yield (Ym). Uncertainties in crop water footprint estimates related to uncertainties in four key input variables: PR, ET0, Kc, and crop calendar were quantified through Monte Carlo simulations. The results show that the sensitivities and uncertainties differ across crop types. In general, the water footprint of crops is most sensitive to ET0 and Kc, followed by the crop calendar. Blue water footprints were more sensitive to input variability than green water footprints. The smaller the annual blue water footprint is, the higher its sensitivity to changes in PR, ET0, and Kc. The uncertainties in the total water footprint of a crop due to combined uncertainties in climatic inputs (PR and ET0) were about ±20% (at the 95% confidence level). The effect of uncertainties in ET0 was dominant compared to that of PR. The uncertainties in the total water footprint of a crop as a result of combined key input uncertainties were on average ±30% (at the 95% confidence level).
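The study itself uses a grid-based daily water balance model; the sketch below only shows the Monte Carlo propagation idea on a drastically simplified relation, WF [m3 t-1] ≈ 10 × Kc × ET0 [mm] / Y [t ha-1], perturbing ET0 and Kc. All nominal values and uncertainty ranges are invented for illustration.

```python
# Highly simplified Monte Carlo propagation of input uncertainty to a crop
# water footprint; not the paper's daily water balance model.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

et0_mm = 650.0 * (1 + 0.10 * rng.standard_normal(n))   # seasonal ET0, ±10% (assumed)
kc = 1.05 * (1 + 0.15 * rng.standard_normal(n))         # crop coefficient, ±15% (assumed)
yield_t_ha = 5.5                                        # crop yield, held fixed here

wf_m3_per_t = 10.0 * kc * et0_mm / yield_t_ha
lo, mid, hi = np.percentile(wf_m3_per_t, [2.5, 50, 97.5])
print(f"median WF = {mid:.0f} m3/t, 95% interval = [{lo:.0f}, {hi:.0f}] m3/t")
print(f"relative uncertainty ≈ ±{(hi - lo) / (2 * mid):.0%}")
```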
Bridging Empirical and Physical Approaches for Landslide Monitoring and Early Warning
NASA Technical Reports Server (NTRS)
Kirschbaum, Dalia; Peters-Lidard, Christa; Adler, Robert; Kumar, Sujay; Harrison, Ken
2011-01-01
Rainfall-triggered landslides typically occur and are evaluated at local scales, using slope-stability models to calculate coincident changes in driving and resisting forces at the hillslope level in order to anticipate slope failures. Over larger areas, detailed high resolution landslide modeling is often infeasible due to difficulties in quantifying the complex interaction between rainfall infiltration and surface materials as well as the dearth of available in situ soil and rainfall estimates and accurate landslide validation data. This presentation will discuss how satellite precipitation and surface information can be applied within a landslide hazard assessment framework to improve landslide monitoring and early warning by considering two disparate approaches to landslide hazard assessment: an empirical landslide forecasting algorithm and a physical slope-stability model. The goal of this research is to advance near real-time landslide hazard assessment and early warning at larger spatial scales. This is done by employing high resolution surface and precipitation information within a probabilistic framework to provide more physically-based grounding to empirical landslide triggering thresholds. The empirical landslide forecasting tool, running in near real-time at http://trmm.nasa.gov, considers potential landslide activity at the global scale and relies on Tropical Rainfall Measuring Mission (TRMM) precipitation data and surface products to provide a near real-time picture of where landslides may be triggered. The physical approach considers how rainfall infiltration on a hillslope affects the in situ hydro-mechanical processes that may lead to slope failure. Evaluation of these empirical and physical approaches are performed within the Land Information System (LIS), a high performance land surface model processing and data assimilation system developed within the Hydrological Sciences Branch at NASA's Goddard Space Flight Center. LIS provides the capabilities to quantify uncertainty from model inputs and calculate probabilistic estimates for slope failures. Results indicate that remote sensing data can provide many of the spatiotemporal requirements for accurate landslide monitoring and early warning; however, higher resolution precipitation inputs will help to better identify small-scale precipitation forcings that contribute to significant landslide triggering. Future missions, such as the Global Precipitation Measurement (GPM) mission will provide more frequent and extensive estimates of precipitation at the global scale, which will serve as key inputs to significantly advance the accuracy of landslide hazard assessment, particularly over larger spatial scales.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alfonsi, Andrea; Rabiti, Cristian; Mandelli, Diego
2016-06-01
RAVEN is a software framework able to perform parametric and stochastic analysis based on the response of complex system codes. The initial development was aimed at providing dynamic risk analysis capabilities to the thermohydraulic code RELAP-7, currently under development at Idaho National Laboratory (INL). Although the initial goal has been fully accomplished, RAVEN is now a multi-purpose stochastic and uncertainty quantification platform, capable of communicating with any system code. In fact, the provided Application Programming Interfaces (APIs) allow RAVEN to interact with any code as long as all the parameters that need to be perturbed are accessible via input files or python interfaces. RAVEN is capable of investigating system response and exploring the input space using various sampling schemes such as Monte Carlo, grid, or Latin hypercube. However, RAVEN's strength lies in its system feature discovery capabilities such as: constructing limit surfaces, separating regions of the input space leading to system failure, and using dynamic supervised learning techniques. The development of RAVEN started in 2012 when, within the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program, the need to provide a modern risk evaluation framework arose. RAVEN's principal assignment is to provide the necessary software and algorithms in order to employ the concepts developed by the Risk Informed Safety Margin Characterization (RISMC) program. RISMC is one of the pathways defined within the Light Water Reactor Sustainability (LWRS) program. In the RISMC approach, the goal is not just to identify the frequency of an event potentially leading to a system failure, but the proximity (or lack thereof) to key safety-related events. Hence, the approach is interested in identifying and increasing the safety margins related to those events. A safety margin is a numerical value quantifying the probability that a safety metric (e.g. peak pressure in a pipe) is exceeded under certain conditions. Most of the capabilities, implemented with RELAP-7 as a principal focus, are easily deployable to other system codes. For this reason, several side activities have been carried out (e.g. RELAP5-3D, any MOOSE-based App, etc.) or are currently ongoing to couple RAVEN with several different software packages. The aim of this document is to provide a set of commented examples that can help the user to become familiar with the RAVEN code usage.
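The snippet below is not RAVEN's API; it is only a bare-bones Latin hypercube sampler of the kind the abstract mentions for exploring a system code's input space. Each sample falls in a distinct stratum of every input dimension; a driver script would then write the perturbed values into the code's input files. Parameter bounds are invented.

```python
# Bare-bones Latin hypercube sampling over a system code's input parameters.
import numpy as np

def latin_hypercube(n_samples, bounds, rng):
    """bounds: list of (low, high) tuples, one per input parameter."""
    d = len(bounds)
    strata = rng.permuted(np.tile(np.arange(n_samples), (d, 1)), axis=1).T
    u = (strata + rng.uniform(size=(n_samples, d))) / n_samples  # stratified uniforms
    lows = np.array([b[0] for b in bounds])
    highs = np.array([b[1] for b in bounds])
    return lows + u * (highs - lows)

rng = np.random.default_rng(4)
samples = latin_hypercube(8, [(280.0, 320.0), (1.0e5, 2.0e5)], rng)  # e.g. T, P
print(samples)
```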
What and how much do we eat? 24-hour dietary recall method.
Salvador Castell, Gemma; Serra-Majem, Lluis; Ribas-Barba, Lourdes
2015-02-26
Diet, along with lifestyle factors, is an important determinant of the health status of an individual and of a community. Dietary assessment at the population level provides us with key information on the frequency and distribution of possible inadequate diets and/or nutritional status. It is also useful as input into the elaboration of food and nutrition policies aiming to improve dietary habits and the health status of a community. This article reviews the characteristics, advantages and limitations of the 24-hour dietary recall method (24hDR), which is one of the most widely used tools in nutrition epidemiology to identify food, energy and nutrient intake in national nutrition surveys, cross-sectional studies, clinical trials and cohort studies as well as in the evaluation of individual dietary intake and Total Diet assessment. To reduce the key factors associated with bias, the importance of previously trained interviewers is highlighted, as well as the role of support materials and the contribution of novel technologies. Copyright AULA MEDICA EDICIONES 2015. Published by AULA MEDICA. All rights reserved.
Collaboration is key: The actual experience of disciplines working together in child care.
Garvis, Susanne; Kirkby, Jane; McMahon, Keryn; Meyer, Colleen
2016-03-01
Promoting young children's academic and developmental outcomes can no longer be achieved by the single efforts of one profession, but requires professionals to work together in inter-professional teams to understand the complexity of young children's lives. Collaboration in early childhood programs involves health professionals, educators, and other professionals sharing information, validating each other's roles, and providing input around which strategies promote positive outcomes for all children. There are, however, limited studies available within early childhood education on inter-disciplinary relationships between nurses and teachers. This paper helps to fill this void by exploring the relationship of an early childhood teacher and maternal and child health nurse working alongside one another in an Australian kindergarten. Through a narrative approach, a number of characteristics of the relationship were identified as key elements to a productive relationship. Findings are important for health professionals working with early childhood educators. By understanding the complexity within and between disciplines, professionals can work effectively to support young children and their families. © 2015 Wiley Publishing Asia Pty Ltd.
van Hemmen, J Leo
2014-10-01
This article analyzes the question of whether neuroscience allows for mathematical descriptions and whether an interaction between experimental and theoretical neuroscience can be expected to benefit both of them. It is argued that a mathematization of natural phenomena never happens by itself. First, appropriate key concepts must be found that are intimately connected with the phenomena one wishes to describe and explain mathematically. Second, the scale on, and not beyond, which a specific description can hold must be specified. Different scales allow for different conceptual and mathematical descriptions. This is the scaling hypothesis. Third, can a mathematical description be universally valid and, if so, how? Here we put forth the argument that universals also exist in theoretical neuroscience, that evolution proves the rule, and that theoretical neuroscience is a domain with still lots of space for new developments initiated by an intensive interaction with experiment. Finally, major insight is provided by a careful analysis of the way in which particular brain structures respond to perceptual input and in so doing induce action in an animal's surroundings.
It's time to swim! Zebrafish and the circadian clock.
Vatine, Gad; Vallone, Daniela; Gothilf, Yoav; Foulkes, Nicholas S
2011-05-20
The zebrafish represents a fascinating model for studying key aspects of the vertebrate circadian timing system. Easy access to early embryonic development has made this species ideal for investigating how the clock is first established during embryogenesis. In particular, the molecular basis for the functional development of the zebrafish pineal gland has received much attention. In addition to this dedicated clock and photoreceptor organ, and unlike the situation in mammals, the clocks in zebrafish peripheral tissues and even cell lines are entrainable by direct exposure to light thus providing unique insight into the function and evolution of the light input pathway. Finally, the small size, low maintenance costs and high fecundity of this fish together with the availability of genetic tools make this an attractive model for forward genetic analysis of the circadian clock. Here, we review the work that has established the zebrafish as a valuable clock model organism and highlight the key questions that will shape the future direction of research. Copyright © 2011 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
Grefsheim, Suzanne F; Rankin, Jocelyn A; Perry, Gerald J; McKibbon, K Ann
2008-04-01
Building on its 1995 research policy statement, the Medical Library Association (MLA) has issued a new research policy, The Research Imperative. This paper shares the background research that informed the new policy. Semi-structured interviews were conducted with fifty-one key informants representing various library types, functions, geographic locations, ages, and ethnicities. The grounded theory approach was used to analyze the resulting textual database. Additionally, to gather input from the membership as a whole, two open forums were held at MLA annual meetings. Key informant data indicated that the policy should provide roles for MLA in leadership, advocacy, collaboration, services, education, publishing, and development of a research agenda. Evidence-based library and information practice was emphasized. Six themes emerged to center the new policy: creation of a research culture, challenges, domains of research, research skills set, roles of stakeholders, and measurement of progress. Reflecting the interests and beliefs of the membership, The Research Imperative challenges MLA members to build a supportive culture that values and contributes to a research base that is recognized as an essential tool for future practice.
Distributed parametric amplifier for RZ-DPSK signal transmission system.
Xu, Xing; Zhang, Chi; Yuk, T I; Wong, Kenneth K Y
2012-08-13
We have experimentally demonstrated a single pump distributed parametric amplification (DPA) system for differential phase shift keying (DPSK) signals in a spool of dispersion-shifted fiber (DSF). The gain spectrum of single pump DPA is thoroughly investigated by both simulation and experiment, and a possible reference for the optimal input pump power and fiber length relationship is provided for DPA-based applications. Furthermore, the DPSK format is compared with on-off keying (OOK) within the DPA scheme. Eight WDM signal channels at 10-Gb/s are utilized, and approximately 0.5-dB power penalties at a bit-error rate (BER) of 10−9 are achieved for return-to-zero DPSK (RZ-DPSK), compared to larger than 1.5 dB with the OOK format. In order to improve the system power efficiency, at the receiver the pump is recycled by a photovoltaic cell and the converted energy can be used by potential low-power-consuming devices, i.e., sensors or small-scale electronic circuits. Additionally, with suitable components, the whole DPA concept could be directly applied to the 1.3-μm telecommunication window along the most commonly used single-mode fiber (SMF).
Systems and methods for improved telepresence
Anderson, Matthew O.; Willis, W. David; Kinoshita, Robert A.
2005-10-25
The present invention provides a modular, flexible system for deploying multiple video perception technologies. The telepresence system of the present invention is capable of allowing an operator to control multiple mono and stereo video inputs in a hands-free manner. The raw data generated by the input devices is processed into a common zone structure that corresponds to the commands of the user, and the commands represented by the zone structure are transmitted to the appropriate device. This modularized approach permits input devices to be easily interfaced with various telepresence devices. Additionally, new input devices and telepresence devices are easily added to the system and are frequently interchangeable. The present invention also provides a modular configuration component that allows an operator to define a plurality of views each of which defines the telepresence devices to be controlled by a particular input device. The present invention provides a modular flexible system for providing telepresence for a wide range of applications. The modularization of the software components combined with the generalized zone concept allows the systems and methods of the present invention to be easily expanded to encompass new devices and new uses.
Input design for identification of aircraft stability and control derivatives
NASA Technical Reports Server (NTRS)
Gupta, N. K.; Hall, W. E., Jr.
1975-01-01
An approach for designing inputs to identify stability and control derivatives from flight test data is presented. This approach is based on finding inputs which provide the maximum possible accuracy of derivative estimates. Two techniques of input specification are implemented for this objective - a time domain technique and a frequency domain technique. The time domain technique gives the control input time history and can be used for any allowable duration of test maneuver, including those where data lengths can only be of short duration. The frequency domain technique specifies the input frequency spectrum, and is best applied for tests where extended data lengths, much longer than the time constants of the modes of interest, are possible. These techniques are used to design inputs to identify parameters in longitudinal and lateral linear models of conventional aircraft. The constraints of aircraft response limits, such as on structural loads, are realized indirectly through a total energy constraint on the input. Tests with simulated data and theoretical predictions show that the new approaches give input signals which can provide more accurate parameter estimates than can conventional inputs of the same total energy. Results obtained indicate that the approach has been brought to the point where it should be used on flight tests for further evaluation.
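To make the frequency-domain idea concrete, the toy sketch below allocates a fixed total input power across candidate sinusoid frequencies so as to maximize the determinant of the Fisher information matrix for the two parameters of a simple first-order model G(jw) = K / (1 + j*w*tau). The report's aircraft models and criterion are more involved; every number and the model itself are illustrative assumptions.

```python
# Toy frequency-domain input design: pick the power split across candidate
# frequencies that maximizes det of the Fisher information, under a fixed
# total power (energy) constraint.
import itertools
import numpy as np

K, tau, sigma2 = 1.0, 0.5, 0.01                 # nominal parameters, noise variance
freqs = np.array([0.2, 1.0, 2.0, 5.0, 10.0])    # candidate frequencies (rad/s)
total_power = 1.0

def sensitivities(w):
    den = 1 + 1j * w * tau
    return np.array([1 / den,                    # dG/dK
                     -K * 1j * w / den ** 2])    # dG/dtau

def fisher(powers):
    M = np.zeros((2, 2))
    for w, p in zip(freqs, powers):
        g = sensitivities(w)
        M += p * np.real(np.outer(g, np.conj(g))) / sigma2
    return M

best = None
for alloc in itertools.product(np.linspace(0, 1, 11), repeat=len(freqs)):
    if abs(sum(alloc) - 1.0) > 1e-9:             # stay on the power simplex
        continue
    powers = total_power * np.array(alloc)
    d = np.linalg.det(fisher(powers))
    if best is None or d > best[0]:
        best = (d, powers)

print("best power split over", freqs.tolist(), "->", best[1])
```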
High frequency inductive lamp and power oscillator
Kirkpatrick, Douglas A.; Gitsevich, Aleksandr
2005-09-27
An oscillator includes an amplifier having an input and an output, a feedback network connected between the input of the amplifier and the output of the amplifier, the feedback network being configured to provide suitable positive feedback from the output of the amplifier to the input of the amplifier to initiate and sustain an oscillating condition, and a tuning circuit connected to the input of the amplifier, wherein the tuning circuit is continuously variable and consists of solid state electrical components with no mechanically adjustable devices including a pair of diodes connected to each other at their respective cathodes with a control voltage connected at the junction of the diodes. Another oscillator includes an amplifier having an input and an output, a feedback network connected between the input of the amplifier and the output of the amplifier, the feedback network being configured to provide suitable positive feedback from the output of the amplifier to the input of the amplifier to initiate and sustain an oscillating condition, and transmission lines connected to the input of the amplifier with an input pad and a perpendicular transmission line extending from the input pad and forming a leg of a resonant "T", and wherein the feedback network is coupled to the leg of the resonant "T".
Frequency and function in the basal ganglia: the origins of beta and gamma band activity.
Blenkinsop, Alexander; Anderson, Sean; Gurney, Kevin
2017-07-01
Neuronal oscillations in the basal ganglia have been observed to correlate with behaviours, although the causal mechanisms and functional significance of these oscillations remain unknown. We present a novel computational model of the healthy basal ganglia, constrained by single unit recordings from non-human primates. When the model is run using inputs that might be expected during performance of a motor task, the network shows emergent phenomena: it functions as a selection mechanism and shows spectral properties that match those seen in vivo. Beta frequency oscillations are shown to require pallido-striatal feedback, and occur with behaviourally relevant cortical input. Gamma oscillations arise in the subthalamic-globus pallidus feedback loop, and occur during movement. The model provides a coherent framework for the study of spectral, temporal and functional analyses of the basal ganglia and lays the foundation for an integrated approach to study basal ganglia pathologies such as Parkinson's disease in silico. Neural oscillations in the basal ganglia (BG) are well studied yet remain poorly understood. Behavioural correlates of spectral activity are well described, yet a quantitative hypothesis linking time domain dynamics and spectral properties to BG function has been lacking. We show, for the first time, that a unified description is possible by interpreting previously ignored structure in data describing globus pallidus interna responses to cortical stimulation. These data were used to expose a pair of distinctive neuronal responses to the stimulation. This observation formed the basis for a new mathematical model of the BG, quantitatively fitted to the data, which describes the dynamics in the data, and is validated against other stimulus protocol experiments. A key new result is that when the model is run using inputs hypothesised to occur during the performance of a motor task, beta and gamma frequency oscillations emerge naturally during static-force and movement, respectively, consistent with experimental local field potentials. This new model predicts that the pallido-striatum connection has a key role in the generation of beta band activity, and that the gamma band activity associated with motor task performance has its origins in the pallido-subthalamic feedback loop. The network's functionality as a selection mechanism also occurs as an emergent property, and closer fits to the data gave better selection properties. The model provides a coherent framework for the study of spectral, temporal and functional analyses of the BG and therefore lays the foundation for an integrated approach to study BG pathologies such as Parkinson's disease in silico. © 2017 The Authors. The Journal of Physiology © 2017 The Physiological Society.
Impact of inherent meteorology uncertainty on air quality model predictions
It is well established that there are a number of different classifications and sources of uncertainties in environmental modeling systems. Air quality models rely on two key inputs, namely, meteorology and emissions. When using air quality models for decision making, it is impor...
Public Engagement. IDRA Focus.
ERIC Educational Resources Information Center
IDRA Newsletter, 1996
1996-01-01
This newsletter includes six articles that examine key issues facing public schools and communities related to accountability, bilingual education, immigrant education, school finance, and school choice. In addressing these issues, articles focus on the importance of community involvement and input in local school reform efforts aimed at achieving…
Applications of Land Surface Temperature from Microwave Observations
USDA-ARS?s Scientific Manuscript database
Land surface temperature (LST) is a key input for physically-based retrieval algorithms of hydrological states and fluxes. Yet, it remains a poorly constrained parameter for global scale studies. The two main observational methods to remotely measure LST are based on thermal infrared (TIR) observation...
TxACOL workshop : Texas asphalt concrete overlay design and analysis system.
DOT National Transportation Integrated Search
2010-01-01
General Information: Two workshops were held, respectively, on Aug. 25 at Paris, TX and on Oct. 6 at Austin, TX; more than 30 representatives from TxDOT attended; introduction of TxACOL software, key input parameters, and related lab and field...
Residential air exchange rates (AERs) are a key determinant in the infiltration of ambient air pollution indoors. Population-based human exposure models using probabilistic approaches to estimate personal exposure to air pollutants have relied on input distributions from AER meas...
Application of the wavelet transform for speech processing
NASA Technical Reports Server (NTRS)
Maes, Stephane
1994-01-01
Speaker identification and word spotting will shortly play a key role in space applications. An approach based on the wavelet transform is presented that, in the context of the 'modulation model,' enables extraction of speech features which are used as input for the classification process.
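The abstract does not spell out the feature definition; a common, simple stand-in is the log energy of each subband of a discrete wavelet decomposition of a speech frame, which can then feed a classifier for speaker identification or word spotting. The sketch below uses PyWavelets; the frame, wavelet choice, and decomposition level are arbitrary assumptions.

```python
# Illustrative wavelet-based feature vector for a speech frame: log energy per
# subband of a multi-level discrete wavelet decomposition.
import numpy as np
import pywt

rng = np.random.default_rng(5)
frame = rng.normal(size=1024)            # stand-in for a 1024-sample speech frame

coeffs = pywt.wavedec(frame, "db4", level=5)           # multi-level DWT
features = np.log([np.sum(c ** 2) + 1e-12 for c in coeffs])
print("wavelet subband log-energies:", np.round(features, 2))
```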
Impacts of Lateral Boundary Conditions on U.S. Ozone Modeling Analyses
Chemical boundary conditions are a key input to regional-scale photochemical models. In this study, we perform annual simulations over North America with chemical boundary conditions prepared from two global models (GEOS-CHEM and Hemispheric CMAQ). Results indicate that the impac...
Sperry Univac speech communications technology
NASA Technical Reports Server (NTRS)
Medress, Mark F.
1977-01-01
Technology and systems for effective verbal communication with computers were developed. A continuous speech recognition system for verbal input, a word spotting system to locate key words in conversational speech, prosodic tools to aid speech analysis, and a prerecorded voice response system for speech output are described.
NASA Astrophysics Data System (ADS)
Cotrufo, M. F.
2017-12-01
Mineral-associated organic matter (MAOM) is the largest and most persistent pool of carbon in soil. Understanding and correctly modeling its dynamics is key to suggesting management practices that can augment soil carbon storage for climate change mitigation, as well as increase soil organic matter (SOM) stocks to support soil health on the long term. In the Microbial Efficiency Mineral Stabilization (MEMS) framework we proposed that, contrary to what was originally thought, this form of persistent SOM is derived from the labile components of plant inputs, through their efficient microbial processing. I will present results from several experiments using dual isotope labeling of plant inputs that largely confirm this view, and point to the key role of dissolved organic matter in MAOM formation and to the dynamic nature of the outer layer of MAOM. I will also show how we are incorporating this understanding into a new SOM model, which uses physically defined measurable pools rather than turnover-defined pools to forecast C cycling in soil.
High-throughput automated microfluidic sample preparation for accurate microbial genomics
Kim, Soohong; De Jonghe, Joachim; Kulesa, Anthony B.; Feldman, David; Vatanen, Tommi; Bhattacharyya, Roby P.; Berdy, Brittany; Gomez, James; Nolan, Jill; Epstein, Slava; Blainey, Paul C.
2017-01-01
Low-cost shotgun DNA sequencing is transforming the microbial sciences. Sequencing instruments are so effective that sample preparation is now the key limiting factor. Here, we introduce a microfluidic sample preparation platform that integrates the key steps in cells-to-sequencing-library sample preparation for up to 96 samples and reduces DNA input requirements 100-fold while maintaining or improving data quality. The general-purpose microarchitecture we demonstrate supports workflows with arbitrary numbers of reaction and clean-up or capture steps. By reducing the sample quantity requirements, we enabled low-input (∼10,000 cells) whole-genome shotgun (WGS) sequencing of Mycobacterium tuberculosis and soil micro-colonies with superior results. We also leveraged the enhanced throughput to sequence ∼400 clinical Pseudomonas aeruginosa libraries and demonstrate excellent single-nucleotide polymorphism detection performance that explained phenotypically observed antibiotic resistance. Fully-integrated lab-on-chip sample preparation overcomes technical barriers to enable broader deployment of genomics across many basic research and translational applications. PMID:28128213
Graphene-assisted multiple-input high-base optical computing
Hu, Xiao; Wang, Andong; Zeng, Mengqi; Long, Yun; Zhu, Long; Fu, Lei; Wang, Jian
2016-01-01
We propose graphene-assisted multiple-input high-base optical computing. We fabricate a nonlinear optical device based on a fiber pigtail cross-section coated with a single-layer graphene grown by chemical vapor deposition (CVD) method. An approach to implementing modulo 4 operations of three-input hybrid addition and subtraction of quaternary base numbers in the optical domain using multiple non-degenerate four-wave mixing (FWM) processes in graphene coated optical fiber device and (differential) quadrature phase-shift keying ((D)QPSK) signals is presented. We demonstrate 10-Gbaud modulo 4 operations of three-input quaternary hybrid addition and subtraction (A + B − C, A + C − B, B + C − A) in the experiment. The measured optical signal-to-noise ratio (OSNR) penalties for modulo 4 operations of three-input quaternary hybrid addition and subtraction (A + B − C, A + C − B, B + C − A) are measured to be less than 7 dB at a bit-error rate (BER) of 2 × 10−3. The BER performance as a function of the relative time offset between three signals (signal offset) is also evaluated showing favorable performance. PMID:27604866
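For clarity, the snippet below is only a digital reference for the arithmetic the optical gates implement: the three-input quaternary operations reduced modulo 4. The paper realizes these operations optically via four-wave mixing of (D)QPSK phases, which is not modelled here; the example triples are arbitrary.

```python
# Digital truth-table check for the three-input quaternary operations modulo 4.
ops = {
    "A+B-C": lambda a, b, c: (a + b - c) % 4,
    "A+C-B": lambda a, b, c: (a + c - b) % 4,
    "B+C-A": lambda a, b, c: (b + c - a) % 4,
}

for a, b, c in [(1, 2, 3), (3, 3, 1), (0, 2, 2)]:   # example quaternary digits
    row = ", ".join(f"{name} = {fn(a, b, c)}" for name, fn in ops.items())
    print(f"A={a} B={b} C={c}: {row}")
```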
Cerina, Federica; Zhu, Zhen; Chessa, Alessandro; Riccaboni, Massimo
2015-01-01
Production systems, traditionally analyzed as almost independent national systems, are increasingly connected on a global scale. Only recently becoming available, the World Input-Output Database (WIOD) is one of the first efforts to construct the global multi-regional input-output (GMRIO) tables. By viewing the world input-output system as an interdependent network where the nodes are the individual industries in different economies and the edges are the monetary goods flows between industries, we analyze respectively the global, regional, and local network properties of the so-called world input-output network (WION) and document its evolution over time. At global level, we find that the industries are highly but asymmetrically connected, which implies that micro shocks can lead to macro fluctuations. At regional level, we find that the world production is still operated nationally or at most regionally as the communities detected are either individual economies or geographically well defined regions. Finally, at local level, for each industry we compare the network-based measures with the traditional methods of backward linkages. We find that the network-based measures such as PageRank centrality and community coreness measure can give valuable insights into identifying the key industries. PMID:26222389
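As a tiny illustration of the PageRank centrality used above to flag key industries, the sketch below runs a standard power iteration on an invented weighted, directed "industry" network. The real analysis operates on the full WIOD tables, and the community coreness measure is not shown; labels, weights, and the damping factor are assumptions.

```python
# PageRank by power iteration on a toy weighted, directed industry network.
import numpy as np

labels = ["agri", "manuf", "energy", "services"]
# flows[i, j] = monetary goods flow from industry i to industry j (invented)
flows = np.array([[0.0, 5.0, 0.5, 1.0],
                  [1.0, 0.0, 2.0, 6.0],
                  [0.5, 4.0, 0.0, 3.0],
                  [0.5, 2.0, 1.0, 0.0]])

d = 0.85                                       # damping factor
P = flows / flows.sum(axis=1, keepdims=True)   # row-stochastic transition matrix
r = np.full(len(labels), 1 / len(labels))
for _ in range(100):
    r = (1 - d) / len(labels) + d * (P.T @ r)  # PageRank update

for name, score in sorted(zip(labels, r), key=lambda t: -t[1]):
    print(f"{name:9s} {score:.3f}")
```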
Using SMAP data to improve drought early warning over the US Great Plains
NASA Astrophysics Data System (ADS)
Fu, R.; Fernando, N.; Tang, W.
2015-12-01
A drought-prone region such as the Great Plains of the United States (US GP) requires credible and actionable drought early warning. Such information cannot simply be extracted from available climate forecasts because of their large uncertainties at regional scales and unclear connections to the needs of decision makers. In particular, current dynamic seasonal predictions and climate projections, such as those produced by the NOAA North American Multi-Model Ensemble experiment (NMME), are much more reliable for winter and spring than for the summer season over the US GP. To mitigate the weaknesses of dynamic predictions/projections, we have identified three key processes behind the spring-to-summer dry memory through observational studies, as the scientific basis for a statistical drought early warning system. This system uses percentile soil moisture anomalies in spring as a key input to provide a probabilistic summer drought early warning. The latter outperforms the dynamic prediction over the US Southern Plains and has been used by the Texas state water agency to support state drought preparedness. A main source of uncertainty for this drought early warning system is the soil moisture input obtained from the NOAA Climate Forecasting System (CFS). We are testing use of the beta version of NASA Soil Moisture Active Passive (SMAP) soil moisture data, along with the Soil Moisture and Ocean Salinity (SMOS) and the long-term Essential Climate Variable Soil Moisture (ECV-SM) data, to reduce this uncertainty. Preliminary results based on ECV-SM suggest that satellite-based soil moisture data could improve early warning of rainfall anomalies over the western US GP, where vegetation is less dense. The skill degrades over the eastern US GP where denser vegetation is found. We evaluate our SMAP-based drought early warning for the 2015 summer against observations.
Improving informed consent: Stakeholder views
Anderson, Emily E.; Newman, Susan B.; Matthews, Alicia K.
2017-01-01
Purpose: Innovation will be required to improve the informed consent process in research. We aimed to obtain input from key stakeholders—research participants and those responsible for obtaining informed consent—to inform potential development of a multimedia informed consent “app.” Methods: This descriptive study used a mixed-methods approach. Five 90-minute focus groups were conducted with volunteer samples of former research participants and researchers/research staff responsible for obtaining informed consent. Participants also completed a brief survey that measured background information and knowledge and attitudes regarding research and the use of technology. Established qualitative methods were used to conduct the focus groups and data analysis. Results: We conducted five focus groups with 41 total participants: three groups with former research participants (total n = 22), and two groups with researchers and research coordinators (total n = 19). Overall, individuals who had previously participated in research had positive views regarding their experiences. However, further discussion elicited that the informed consent process often did not meet its intended objectives. Findings from both groups are presented according to three primary themes: content of consent forms, experience of the informed consent process, and the potential of technology to improve the informed consent process. A fourth theme, need for lay input on informed consent, emerged from the researcher groups. Conclusions: Our findings add to previous research that suggests that the use of interactive technology has the potential to improve the process of informed consent. However, our focus-group findings provide additional insight that technology cannot replace the human connection that is central to the informed consent process. More research that incorporates the views of key stakeholders is needed to ensure that multimedia consent processes do not repeat the mistakes of paper-based consent forms. PMID:28949896
Improving informed consent: Stakeholder views.
Anderson, Emily E; Newman, Susan B; Matthews, Alicia K
2017-01-01
Innovation will be required to improve the informed consent process in research. We aimed to obtain input from key stakeholders-research participants and those responsible for obtaining informed consent-to inform potential development of a multimedia informed consent "app." This descriptive study used a mixed-methods approach. Five 90-minute focus groups were conducted with volunteer samples of former research participants and researchers/research staff responsible for obtaining informed consent. Participants also completed a brief survey that measured background information and knowledge and attitudes regarding research and the use of technology. Established qualitative methods were used to conduct the focus groups and data analysis. We conducted five focus groups with 41 total participants: three groups with former research participants (total n = 22), and two groups with researchers and research coordinators (total n = 19). Overall, individuals who had previously participated in research had positive views regarding their experiences. However, further discussion elicited that the informed consent process often did not meet its intended objectives. Findings from both groups are presented according to three primary themes: content of consent forms, experience of the informed consent process, and the potential of technology to improve the informed consent process. A fourth theme, need for lay input on informed consent, emerged from the researcher groups. Our findings add to previous research that suggests that the use of interactive technology has the potential to improve the process of informed consent. However, our focus-group findings provide additional insight that technology cannot replace the human connection that is central to the informed consent process. More research that incorporates the views of key stakeholders is needed to ensure that multimedia consent processes do not repeat the mistakes of paper-based consent forms.
NASA Technical Reports Server (NTRS)
Sturman, J.
1968-01-01
A stable input stage was designed for use with an integrated-circuit operational amplifier to provide improved performance as an instrumentation-type amplifier. The circuit provides high input impedance, stable gain, good common-mode rejection, very low drift, and low output impedance.
Time Triggered Ethernet System Testing Means and Method
NASA Technical Reports Server (NTRS)
Smithgall, William Todd (Inventor); Hall, Brendan (Inventor); Varadarajan, Srivatsan (Inventor)
2014-01-01
Methods and apparatus are provided for evaluating the performance of a Time Triggered Ethernet (TTE) system employing Time Triggered (TT) communication. A real TTE system under test (SUT) is provided, having real input elements that communicate using TT messages with output elements via one or more first TTE switches during a first time interval schedule established for the SUT. A simulation system is also provided, having input simulators that communicate using TT messages via one or more second TTE switches with the same output elements during a second time interval schedule established for the simulation system. The first and second time interval schedules are offset slightly so that messages from the input simulators, when present, arrive at the output elements prior to messages from the analogous real inputs, thereby having priority over messages from the real inputs and causing the system to operate based on the simulated inputs when present.
Integrated controls design optimization
Lou, Xinsheng; Neuschaefer, Carl H.
2015-09-01
A control system (207) for optimizing a chemical looping process of a power plant includes an optimizer (420), an income algorithm (230), a cost algorithm (225), and chemical looping process models. The process models are used to predict the process outputs from process input variables. Some of the process input and output variables are related to the income of the plant, and others are related to the cost of the plant operations. The income algorithm (230) provides an income input to the optimizer (420) based on a plurality of input parameters (215) of the power plant. The cost algorithm (225) provides a cost input to the optimizer (420) based on a plurality of output parameters (220) of the power plant. The optimizer (420) determines an optimized operating parameter solution based on at least one of the income input and the cost input, and supplies the optimized operating parameter solution to the power plant.
Dual Brushless Resolver Rate Sensor
NASA Technical Reports Server (NTRS)
Howard, David E. (Inventor)
1996-01-01
This invention relates to dual analog angular rate sensors implemented without the use of mechanical brushes. A resolver rate sensor is provided that includes two brushless resolvers mechanically coupled to the same output shaft, with the first resolver driven by a DC input and the second resolver driven by an AC sinusoidal input. The trigonometric identity in which the sum of the squares of the sine and cosine components equals one is used to advantage in providing a sensor of increased accuracy. The first resolver may have a fixed or variable DC input to permit dynamic adjustment of resolver sensitivity, thus permitting a wide range of coverage. The novelty and advantages of the invention reside in the excitation of a resolver with a DC signal and in the utilization of two resolvers and the trigonometric identity cos²(θ) + sin²(θ) = 1 to provide an accurate rate sensor that is sensitive to direction and accurate through zero rate.
High-performance ultra-low power VLSI analog processor for data compression
NASA Technical Reports Server (NTRS)
Tawel, Raoul (Inventor)
1996-01-01
An apparatus for data compression employing a parallel analog processor. The apparatus includes an array of processor cells with N columns and M rows wherein the processor cells have an input device, memory device, and processor device. The input device is used for inputting a series of input vectors. Each input vector is simultaneously input into each column of the array of processor cells in a pre-determined sequential order. An input vector is made up of M components, ones of which are input into ones of M processor cells making up a column of the array. The memory device is used for providing ones of M components of a codebook vector to ones of the processor cells making up a column of the array. A different codebook vector is provided to each of the N columns of the array. The processor device is used for simultaneously comparing the components of each input vector to corresponding components of each codebook vector, and for outputting a signal representative of the closeness between the compared vector components. A combination device is used to combine the signal output from each processor cell in each column of the array and to output a combined signal. A closeness determination device is then used for determining which codebook vector is closest to an input vector from the combined signals, and for outputting a codebook vector index indicating which of the N codebook vectors was the closest to each input vector input into the array.
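The patented device performs this comparison in parallel analog hardware; purely as a point of reference, the sketch below shows the equivalent software computation, a nearest-codebook (vector quantization) encoding, assuming squared-Euclidean distance as the closeness measure. Array shapes and the random data are illustrative, not part of the patent.

    import numpy as np

    def vq_encode(input_vectors, codebook):
        """Map each M-component input vector to the index of the closest of N
        codebook vectors, mirroring the column-parallel compare-and-combine
        behaviour described above.

        input_vectors : array of shape (num_vectors, M)
        codebook      : array of shape (N, M)
        returns       : array of codebook indices, shape (num_vectors,)
        """
        # Pairwise squared distances between every input vector and every codebook vector
        diffs = input_vectors[:, None, :] - codebook[None, :, :]
        dists = np.sum(diffs ** 2, axis=-1)
        return np.argmin(dists, axis=1)

    rng = np.random.default_rng(0)
    codebook = rng.normal(size=(8, 4))      # N = 8 codebook vectors, M = 4 components
    vectors = rng.normal(size=(100, 4))     # stream of input vectors
    indices = vq_encode(vectors, codebook)  # one codebook index per input vector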
Support to X-33/Reusable Launch Vehicle Technology Program
NASA Technical Reports Server (NTRS)
2000-01-01
The primary activities of Lee & Associates under the referenced Purchase Order have been in direct support of the X-33/Reusable Launch Vehicle Technology Program. An independent review was provided to evaluate the X-33 liquid hydrogen fuel tank failure, which recently occurred after testing of the starboard tank. The purpose of the investigation team was to assess the tank design modifications, evaluate the testing approach used by MSFC (Marshall Space Flight Center) in determining the flight worthiness of the tank, assess the structural integrity, and determine the cause of the failure of the tank. The approach taken to satisfy these objectives was for Lee & Associates to provide the expertise of Mr. Frank Key and Mr. Wayne Burton, who have relevant experience from past programs and strong backgrounds in the fields critical to the success of the program. Mr. Key and Mr. Burton participated in the NASA-established Failure Investigation Review Team to review the development and process data and to identify any design, testing, or manufacturing weaknesses and potential problem areas. This approach worked well in satisfying the objectives and providing the Review Team with valuable information, including the development of a fault tree. The detailed inputs were made orally in real time in the Review Team's daily meetings. The results of the investigation were presented to the MSFC Center Director by the team on February 15, 2000. Attached are four charts taken from that presentation, which include 1) an executive summary, 2) the most probable cause, 3) a technology assessment, and 4) technology recommendations for cryogenic tanks.
NASA Astrophysics Data System (ADS)
Sus, Oliver; Stengel, Martin; Stapelberg, Stefan; McGarragh, Gregory; Poulsen, Caroline; Povey, Adam C.; Schlundt, Cornelia; Thomas, Gareth; Christensen, Matthew; Proud, Simon; Jerg, Matthias; Grainger, Roy; Hollmann, Rainer
2018-06-01
We present here the key features of the Community Cloud retrieval for CLimate (CC4CL) processing algorithm. We focus on the novel features of the framework: the optimal estimation approach in general, explicit uncertainty quantification through rigorous propagation of all known error sources into the final product, and the consistency of our long-term, multi-platform time series provided at various resolutions, from 0.5 to 0.02°. By describing all key input data and processing steps, we aim to inform the user about important features of this new retrieval framework and its potential applicability to climate studies. We provide an overview of the retrieved and derived output variables. These are analysed for four, partly very challenging, scenes collocated with CALIOP (Cloud-Aerosol lidar with Orthogonal Polarization) observations in the high latitudes and over the Gulf of Guinea-West Africa. The results show that CC4CL provides very realistic estimates of cloud top height and cover for optically thick clouds but, where optically thin clouds overlap, returns a height between the two layers. CC4CL is a unique, coherent, multi-instrument cloud property retrieval framework applicable to passive sensor data of several EO missions. Through its flexibility, CC4CL offers the opportunity for combining a variety of historic and current EO missions into one dataset, which, compared to single sensor retrievals, is improved in terms of accuracy and temporal sampling.
Transfer Function Control for Biometric Monitoring System
NASA Technical Reports Server (NTRS)
Chmiel, Alan J. (Inventor); Grodinsky, Carlos M. (Inventor); Humphreys, Bradley T. (Inventor)
2015-01-01
A modular apparatus for acquiring biometric data may include circuitry operative to receive an input signal indicative of a biometric condition, the circuitry being configured to process the input signal according to a transfer function thereof and to provide a corresponding processed input signal. A controller is configured to provide at least one control signal to the circuitry to programmatically modify the transfer function of the modular system to facilitate acquisition of the biometric data.
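As a software analogue of the programmable transfer function described above (not the patented circuitry), the following sketch models a signal-conditioning channel whose transfer function a controller can reconfigure at run time. The choice of a Butterworth low-pass, the sampling rate, and the cutoff values are assumptions made only for illustration.

    import numpy as np
    from scipy.signal import butter, lfilter

    class ProgrammableChannel:
        """Illustrative software stand-in for a signal-conditioning channel whose
        transfer function (here a Butterworth low-pass) can be reprogrammed by a
        controller via a 'control signal' (the new cutoff frequency)."""

        def __init__(self, cutoff_hz, fs_hz, order=2):
            self.fs_hz = fs_hz
            self.set_transfer_function(cutoff_hz, order)

        def set_transfer_function(self, cutoff_hz, order=2):
            # Recompute the filter coefficients that define the transfer function
            self.b, self.a = butter(order, cutoff_hz, fs=self.fs_hz)

        def process(self, samples):
            return lfilter(self.b, self.a, samples)

    channel = ProgrammableChannel(cutoff_hz=5.0, fs_hz=250.0)   # e.g. a slow biometric band
    raw = np.random.default_rng(0).normal(size=1000)            # placeholder input signal
    conditioned = channel.process(raw)
    channel.set_transfer_function(cutoff_hz=40.0)               # controller reconfigures the channel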
A Simple Semaphore Signaling Technique for Ultra-High Frequency Spacecraft Communications
NASA Technical Reports Server (NTRS)
Butman, S.; Satorius, E.; Ilott, P.
2005-01-01
For planetary lander missions such as the upcoming Phoenix mission to Mars, the most challenging phase of the spacecraft-to-ground communications is during the critical phase termed entry, descent, and landing (EDL). At 8.4 GHz (X-band), the signals received by the largest Deep Space Network (DSN) antennas can be too weak for even 1 bit per second (bps) and therefore not able to communicate critical information to Earth. Fortunately, the lander's ultra-high frequency (UHF) link to an orbiting relay can meet the EDL requirements, but the data rate needs to be low enough to fit the capability of the UHF link during some or all of EDL. On Phoenix, the minimum data rate of the as-built UHF radio is 8 kbps and requires a signal level at the Odyssey orbiter of at least -120 dBm. For lower signaling levels, the effective data rate needs to be reduced, but without incurring the cost of rebuilding and requalifying the equipment. To address this scenario, a simple form of frequency-shift keying (FSK) has been devised by appropriately programming the data stream that is input to the UHF transceiver. This article describes this technique and provides performance estimates. Laboratory testing reveals that input signal levels at -140 dBm and lower can routinely be demodulated with the proposed signaling scheme, thereby providing a 20-dB and greater margin over the 8-kbps threshold.
A Simple Semaphore Signaling Technique for Ultra-High Frequency Spacecraft Communications
NASA Astrophysics Data System (ADS)
Butman, S.; Satorius, E.; Illott, P.
2005-11-01
For planetary lander missions such as the upcoming Phoenix mission to Mars, the most challenging phase of the spacecraft-to-ground communications is during the critical phase termed entry, descent, and landing (EDL). At 8.4 GHz (X-band), the signals received by the largest Deep Space Network (DSN) antennas can be too weak for even 1 bit per second (bps) and therefore not able to communicate critical information to Earth. Fortunately, the lander's ultra-high frequency (UHF) link to an orbiting relay can meet the EDL requirements, but the data rate needs to be low enough to fit the capability of the UHF link during some or all of EDL. On Phoenix, the minimum data rate of the as-built UHF radio is 8 kbps and requires a signal level at the Odyssey orbiter of at least minus 120 dBm. For lower signaling levels, the effective data rate needs to be reduced, but without incurring the cost of rebuilding and requalifying the equipment. To address this scenario, a simple form of frequency-shift keying (FSK) has been devised by appropriately programming the data stream that is input to the UHF transceiver. This article describes this technique and provides performance estimates. Laboratory testing reveals that input signal levels at minus 140 dBm and lower can routinely be demodulated with the proposed signaling scheme, thereby providing a 20-dB and greater margin over the 8-kbps threshold.
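The flight bit patterns are not given in these abstracts; the sketch below only illustrates the general idea both articles describe, namely building the radio's fixed-rate input data stream from two repeating patterns so that each low-rate semaphore symbol produces a distinguishable spectral signature after modulation. The pattern contents, their lengths, and the one-symbol-per-second framing are assumptions for illustration only.

    def semaphore_bitstream(symbols, bits_per_symbol=8000, pattern_a="01", pattern_b="0011"):
        """Build a fixed-rate (here 8 kbps) data stream in which each low-rate
        semaphore symbol is represented by a repeating bit pattern; the two
        patterns yield two distinguishable signatures, i.e. a crude FSK."""
        out = []
        for s in symbols:
            pattern = pattern_a if s == 0 else pattern_b
            reps = bits_per_symbol // len(pattern)
            out.append((pattern * reps)[:bits_per_symbol])
        return "".join(out)

    # One second per semaphore symbol at the radio's fixed 8 kbps input rate
    stream = semaphore_bitstream([0, 1, 1, 0])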
Mathers, Jonathan; Taylor, Rebecca; Parry, Jayne
2017-03-01
The Health Trainers Service is one of the few public health policies where a bespoke database, the Data Collection and Reporting System (DCRS), was developed to monitor performance. We seek to understand the context within which local services and staff have used the DCRS and to consider how this might influence interpretation of collected data. In-depth case studies of six local services purposively sampled to represent the range of service provider arrangements, including detailed interviews with key stakeholders (n = 118). Capturing detailed information on activity with clients was alien to many health trainers' work practices. This related to technical challenges, but it also ran counter to beliefs as to how a 'lay' service would operate. Interviewees noted the inadequacy of the dataset to capture all client impacts; that is, it did not enable them to input information about issues a client living in a deprived neighbourhood might experience and seek help to address. The utility of the DCRS may be compromised both by incomplete ascertainment of activity and by incorrect data inputted by some health trainers. The DCRS may also underestimate the effectiveness of the work health trainers have undertaken to address 'upstream' factors affecting client health.
Sensitivity Analysis as a Tool to assess Energy-Water Nexus in India
NASA Astrophysics Data System (ADS)
Priyanka, P.; Banerjee, R.
2017-12-01
Rapid urbanization, population growth, and related structural changes within the economy of a developing country act as stressors on energy and water demand, forming a well-established energy-water nexus. The energy-water nexus has been studied thoroughly at various spatial scales, viz. the city, river basin, and national levels, to guide different stakeholders in the sustainable management of energy and water. However, the temporal dimensions of the energy-water nexus at the national level have not been thoroughly investigated because of the unavailability of relevant time-series data. In this study we investigated the energy-water nexus at the national level using environmentally extended input-output tables for the Indian economy (2004-2013) as provided by the EORA database. A perturbation-based sensitivity analysis is proposed to highlight the critical nodes of interaction among economic sectors, which are in turn linked to the synergistic effects of energy and water consumption. Technology changes (interpreted as changes in the values of nodes) modify the interactions among economic sectors, and synergy is affected through direct as well as indirect effects. Indirect effects are not easily understood through preliminary examination of the data; hence sensitivity analysis within an input-output framework is important for understanding them. Furthermore, time-series data help in developing an understanding of the dynamics of synergistic effects. We identified the key sectors and technology changes for the Indian economy, which will provide better decision support for policy makers regarding the sustainable use of energy and water resources in India.
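A perturbation-based sensitivity within an input-output framework can be sketched as below, assuming a Leontief total-requirements formulation; the 3-sector coefficient matrix, water intensities, and final demands are made-up toy values, not EORA data.

    import numpy as np

    def total_requirements(A):
        """Leontief inverse L = (I - A)^-1 for a technical-coefficient matrix A."""
        n = A.shape[0]
        return np.linalg.inv(np.eye(n) - A)

    def perturbation_sensitivity(A, intensities, final_demand, i, j, delta=1e-4):
        """Finite-difference sensitivity of total (direct plus indirect) resource
        use to a small change in one technology coefficient A[i, j]."""
        base = intensities @ total_requirements(A) @ final_demand
        A_pert = A.copy()
        A_pert[i, j] += delta
        pert = intensities @ total_requirements(A_pert) @ final_demand
        return (pert - base) / delta

    # Toy 3-sector example with made-up coefficients
    A = np.array([[0.10, 0.20, 0.00],
                  [0.05, 0.10, 0.30],
                  [0.00, 0.10, 0.05]])
    water = np.array([2.0, 0.5, 1.2])      # water intensity per unit output
    demand = np.array([100.0, 50.0, 80.0])
    print(perturbation_sensitivity(A, water, demand, i=1, j=2))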
Nonlinear dynamic characteristics of dielectric elastomer membranes
NASA Astrophysics Data System (ADS)
Fox, Jason W.; Goulbourne, Nakhiah C.
2008-03-01
The dynamic response of dielectric elastomer membranes subject to time-varying voltage inputs for various initial inflation states is investigated. These results provide new insight into the differences observed between quasi-static and dynamic actuation and present a new challenge to modeling efforts. Dielectric elastomer membranes are a potentially enabling technology for soft robotics and biomedical devices such as implants and surgical tools. In this work, two key system parameters are varied: the chamber volume and the voltage signal offset. The chamber volume experiments reveal that increasing the size of the chamber onto which the membrane is clamped will increase the deformations as well as cause the membrane's resonance peaks to shift and change in number. For prestretched dielectric elastomer membranes at the smallest chamber volume, the maximum actuation displacement is 81 microns, while at the largest chamber volume, the maximum actuation displacement is 1431 microns. This corresponds to a 1767% increase in maximum pole displacement. In addition, actuating the membrane at the resonance frequencies provides hundreds of percent increase in strain compared to the quasi-static strain. Adding a voltage offset to the time-varying input signal causes the membrane to oscillate at two distinct frequencies rather than one and also presents a unique opportunity to increase the output displacement without electrically overloading the membrane. Experiments to capture the entire motion of the membrane reveal that classical membrane mode shapes are electrically generated although all points of the membrane do not pass through equilibrium at the same moments in time.
Biosurveillance applying scan statistics with multiple, disparate data sources.
Burkom, Howard S
2003-06-01
Researchers working on the Department of Defense Global Emerging Infections System (DoD-GEIS) pilot system, the Electronic Surveillance System for the Early Notification of Community-Based Epidemics (ESSENCE), have applied scan statistics for early outbreak detection using both traditional and nontraditional data sources. These sources include medical data indexed by International Classification of Disease, 9th Revision (ICD-9) diagnosis codes, as well as less-specific, but potentially timelier, indicators such as records of over-the-counter remedy sales and of school absenteeism. Early efforts employed the Kulldorff scan statistic as implemented in the SaTScan software of the National Cancer Institute. A key obstacle to this application is that the input data streams are typically based on time-varying factors, such as consumer behavior, rather than simply on the populations of the component subregions. We have used both modeling and recent historical data distributions to obtain background spatial distributions. Data analyses have provided guidance on how to condition and model input data to avoid excessive clustering. We have used this methodology in combining data sources for both retrospective studies of known outbreaks and surveillance of high-profile events of concern to local public health authorities. We have integrated the scan statistic capability into a Microsoft Access-based system in which we may include or exclude data sources, vary time windows separately for different data sources, censor data from subsets of individual providers or subregions, adjust the background computation method, and run retrospective or simulated studies.
The Nooksack-Abbotsford-Sumas (NAS) Transboundary Watershed, which spans a portion of the western interface of British Columbia, Washington State, and the Lummi Nation and Nooksack Tribal lands, supports agriculture, estuarine fisheries, diverse wildlife, and urban ...
Assessment of Important SPECIATE Profiles in EPA’s Emissions Modeling Platform and Current Data Gaps
The US Environmental Protection Agency (EPA)’s SPECIATE database contains speciation profiles for both particulate matter (PM) and volatile organic compounds (VOCs) that are key inputs for creating speciated emission inventories for air quality modeling. The objective of th...
Error characterization of microwave satellite soil moisture data sets using fourier analysis
USDA-ARS?s Scientific Manuscript database
Soil moisture is a key geophysical variable in hydrological and meteorological processes. Accurate and current observations of soil moisture over meso to global scales used as inputs to hydrological, weather and climate modelling will benefit the predictability and understanding of these processes. ...
Error characterization of microwave satellite soil moisture data sets using fourier analysis
USDA-ARS?s Scientific Manuscript database
Abstract: Soil moisture is a key geophysical variable in hydrological and meteorological processes. Accurate and current observations of soil moisture over mesoscale to global scales as inputs to hydrological, weather and climate modelling will benefit the predictability and understanding of these p...
Progress on 58 m2 Passive Resonant Ring Laser Gyroscope,
Pad; design of the optical-mechanical hardware to input the laser to the ring; investigations to insure against ZERODUR bar buckling associated with the...ring evacuation force; verification of ZERODUR physical properties which are key to this application, e.g. compressibility resulting from the usual
Exploring the Underlying Mechanisms of the Xenopus laevis Embryonic Cell Cycle.
Zhang, Kun; Wang, Jin
2018-05-31
The cell cycle is an indispensable process in proliferation and development. Despite significant efforts, global quantification and physical understanding are still challenging. In this study, we explored the mechanisms of the Xenopus laevis embryonic cell cycle by quantifying the underlying landscape and flux. We uncovered the Mexican hat landscape of the Xenopus laevis embryonic cell cycle with several local basins and barriers on the oscillation path. The local basins characterize the different phases of the Xenopus laevis embryonic cell cycle, and the local barriers represent the checkpoints. The checkpoint mechanism of the cell cycle is revealed by the landscape basins and barriers. While landscape shape determines the stabilities of the states on the oscillation path, the curl flux force determines the stability of the cell cycle flow. Replication is fundamental for biology of living cells. We quantify the input energy (through the entropy production) as the thermodynamic requirement for initiation and sustainability of single cell life (cell cycle). Furthermore, we also quantify curl flux originated from the input energy as the dynamical requirement for the emergence of a new stable phase (cell cycle). This can provide a new quantitative insight for the origin of single cell life. In fact, the curl flux originated from the energy input or nutrition supply determines the speed and guarantees the progression of the cell cycle. The speed of the cell cycle is a hallmark of cancer. We characterized the quality of the cell cycle by the coherence time and found it is supported by the flux and energy cost. We are also able to quantify the degree of time irreversibility by the cross correlation function forward and backward in time from the stochastic traces in the simulation or experiments, providing a way for the quantification of the time irreversibility and the flux. Through global sensitivity analysis upon landscape and flux, we can identify the key elements for controlling the cell cycle speed. This can help to design an effective strategy for drug discovery against cancer.
Trees, soils, and food security
Sanchez, P. A.; Buresh, R. J.; Leakey, R. R. B.
1997-01-01
Trees have a different impact on soil properties than annual crops, because of their longer residence time, larger biomass accumulation, and longer-lasting, more extensive root systems. In natural forests nutrients are efficiently cycled with very small inputs and outputs from the system. In most agricultural systems the opposite happens. Agroforestry encompasses the continuum between these extremes, and emerging hard data is showing that successful agroforestry systems increase nutrient inputs, enhance internal flows, decrease nutrient losses and provide environmental benefits, provided that the competition for growth resources between the tree and the crop component is well managed. The three main determinants for overcoming rural poverty in Africa are (i) reversing soil fertility depletion, (ii) intensifying and diversifying land use with high-value products, and (iii) providing an enabling policy environment for the smallholder farming sector. Agroforestry practices can improve food production in a sustainable way through their contribution to soil fertility replenishment. The use of organic inputs as a source of biologically-fixed nitrogen, together with deep nitrate that is captured by trees, plays a major role in nitrogen replenishment. The combination of commercial phosphorus fertilizers with available organic resources may be the key to increasing and sustaining phosphorus capital. High-value trees, 'Cinderella' species, can fit in specific niches on farms, thereby making the system ecologically stable and more rewarding economically, in addition to diversifying and increasing rural incomes and improving food security. In the most heavily populated areas of East Africa, where farm size is extremely small, the number of trees on farms is increasing as farmers seek to reduce labour demands, compatible with the drift of some members of the family into the towns to earn off-farm income. Contrary to the concept that population pressure promotes deforestation, there is evidence that demonstrates that there are conditions under which increasing tree planting is occurring on farms in the tropics through successful agroforestry as human population density increases.
Parameter Estimation for Thurstone Choice Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vojnovic, Milan; Yun, Seyoung
We consider the estimation accuracy of individual strength parameters of a Thurstone choice model when each input observation consists of a choice of one item from a set of two or more items (so called top-1 lists). This model accommodates the well-known choice models such as the Luce choice model for comparison sets of two or more items and the Bradley-Terry model for pair comparisons. We provide a tight characterization of the mean squared error of the maximum likelihood parameter estimator. We also provide similar characterizations for parameter estimators defined by a rank-breaking method, which amounts to deducing one or more pair comparisons from a comparison of two or more items, assuming independence of these pair comparisons, and maximizing a likelihood function derived under these assumptions. We also consider a related binary classification problem where each individual parameter takes value from a set of two possible values and the goal is to correctly classify all items within a prescribed classification error. The results of this paper shed light on how the parameter estimation accuracy depends on given Thurstone choice model and the structure of comparison sets. In particular, we found that for unbiased input comparison sets of a given cardinality, when in expectation each comparison set of given cardinality occurs the same number of times, for a broad class of Thurstone choice models, the mean squared error decreases with the cardinality of comparison sets, but only marginally according to a diminishing returns relation. On the other hand, we found that there exist Thurstone choice models for which the mean squared error of the maximum likelihood parameter estimator can decrease much faster with the cardinality of comparison sets. We report empirical evaluation of some claims and key parameters revealed by theory using both synthetic and real-world input data from some popular sport competitions and online labor platforms.
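As one concrete instance of the model class discussed, the sketch below fits the Bradley-Terry special case (pair comparisons) by maximum likelihood using the standard minorization-maximization updates; the toy win counts are illustrative and this is not the authors' estimator code.

    import numpy as np

    def bradley_terry_mle(n_items, pair_wins, iters=200):
        """Maximum-likelihood strengths for the Bradley-Terry model via
        minorization-maximization updates.

        pair_wins[(i, j)] = number of times item i beat item j
        """
        w = np.ones(n_items)
        for _ in range(iters):
            new_w = np.empty(n_items)
            for i in range(n_items):
                wins_i = sum(c for (a, b), c in pair_wins.items() if a == i)
                denom = 0.0
                for (a, b), c in pair_wins.items():
                    if a == i or b == i:
                        other = b if a == i else a
                        denom += c / (w[i] + w[other])
                new_w[i] = wins_i / denom if denom > 0 else w[i]
            w = new_w / new_w.sum()
        return w

    # Toy data: item 0 beats item 1 seven times out of ten, and so on
    wins = {(0, 1): 7, (1, 0): 3, (1, 2): 6, (2, 1): 4, (0, 2): 8, (2, 0): 2}
    print(bradley_terry_mle(3, wins))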
Efficient encoding of motion is mediated by gap junctions in the fly visual system.
Wang, Siwei; Borst, Alexander; Zaslavsky, Noga; Tishby, Naftali; Segev, Idan
2017-12-01
Understanding the computational implications of specific synaptic connectivity patterns is a fundamental goal in neuroscience. In particular, the computational role of ubiquitous electrical synapses operating via gap junctions remains elusive. In the fly visual system, the cells in the vertical-system network, which play a key role in visual processing, primarily connect to each other via axonal gap junctions. This network therefore provides a unique opportunity to explore the functional role of gap junctions in sensory information processing. Our information theoretical analysis of a realistic VS network model shows that within 10 ms following the onset of the visual input, the presence of axonal gap junctions enables the VS system to efficiently encode the axis of rotation, θ, of the fly's ego motion. This encoding efficiency, measured in bits, is near-optimal with respect to the physical limits of performance determined by the statistical structure of the visual input itself. The VS network is known to be connected to downstream pathways via a subset of triplets of the vertical system cells; we found that because of the axonal gap junctions, the efficiency of this subpopulation in encoding θ is superior to that of the whole vertical system network and is robust to a wide range of signal to noise ratios. We further demonstrate that this efficient encoding of motion by this subpopulation is necessary for the fly's visually guided behavior, such as banked turns in evasive maneuvers. Because gap junctions are formed among the axons of the vertical system cells, they only impact the system's readout, while maintaining the dendritic input intact, suggesting that the computational principles implemented by neural circuitries may be much richer than previously appreciated based on point neuron models. Our study provides new insights as to how specific network connectivity leads to efficient encoding of sensory stimuli.
NASA Astrophysics Data System (ADS)
Almehmadi, Fares S.; Chatterjee, Monish R.
2014-12-01
Using intensity feedback, the closed-loop behavior of an acousto-optic hybrid device under profiled beam propagation has been recently shown to exhibit wider chaotic bands potentially leading to an increase in both the dynamic range and sensitivity to key parameters that characterize the encryption. In this work, a detailed examination is carried out vis-à-vis the robustness of the encryption/decryption process relative to parameter mismatch for both analog and pulse code modulation signals, and bit error rate (BER) curves are used to examine the impact of additive white noise. The simulations with profiled input beams are shown to produce a stronger encryption key (i.e., much lower parametric tolerance thresholds) relative to simulations with uniform plane wave input beams. In each case, it is shown that the tolerance for key parameters drops by factors ranging from 10 to 20 times below those for uniform plane wave propagation. Results are shown to be at consistently lower tolerances for secure transmission of analog and digital signals using parameter tolerance measures, as well as BER performance measures for digital signals. These results hold out the promise for considerably greater information transmission security for such a system.
Streeter, K.A.; Baker-Herman, T.L.
2014-01-01
Phrenic motor neurons receive rhythmic synaptic inputs throughout life. Since even brief disruption in phrenic neural activity is detrimental to life, on-going neural activity may play a key role in shaping phrenic motor output. To test the hypothesis that spinal mechanisms sense and respond to reduced phrenic activity, anesthetized, ventilated rats received micro-injections of procaine in the C2 ventrolateral funiculus (VLF) to transiently (~30 min) block axon conduction in bulbospinal axons from medullary respiratory neurons that innervate one phrenic motor pool; during procaine injections, contralateral phrenic neural activity was maintained. Once axon conduction resumed, a prolonged increase in phrenic burst amplitude was observed in the ipsilateral phrenic nerve, demonstrating inactivity-induced phrenic motor facilitation (iPMF). Inhibition of tumor necrosis factor alpha (TNFα) and atypical PKC (aPKC) activity in spinal segments containing the phrenic motor nucleus impaired ipsilateral iPMF, suggesting a key role for spinal TNFα and aPKC in iPMF following unilateral axon conduction block. A small phrenic burst amplitude facilitation was also observed contralateral to axon conduction block, indicating crossed spinal phrenic motor facilitation (csPMF). csPMF was independent of spinal TNFα and aPKC. Ipsilateral iPMF and csPMF following unilateral withdrawal of phrenic synaptic inputs were associated with proportional increases in phrenic responses to chemoreceptor stimulation (hypercapnia), suggesting iPMF and csPMF increase phrenic dynamic range. These data suggest that local, spinal mechanisms sense and respond to reduced synaptic inputs to phrenic motor neurons. We hypothesize that iPMF and csPMF may represent compensatory mechanisms that assure adequate motor output is maintained in a physiological system in which prolonged inactivity ends life. PMID:24681155
Biometrics based key management of double random phase encoding scheme using error control codes
NASA Astrophysics Data System (ADS)
Saini, Nirmala; Sinha, Aloka
2013-08-01
In this paper, an optical security system has been proposed in which key of the double random phase encoding technique is linked to the biometrics of the user to make it user specific. The error in recognition due to the biometric variation is corrected by encoding the key using the BCH code. A user specific shuffling key is used to increase the separation between genuine and impostor Hamming distance distribution. This shuffling key is then further secured using the RSA public key encryption to enhance the security of the system. XOR operation is performed between the encoded key and the feature vector obtained from the biometrics. The RSA encoded shuffling key and the data obtained from the XOR operation are stored into a token. The main advantage of the present technique is that the key retrieval is possible only in the simultaneous presence of the token and the biometrics of the user which not only authenticates the presence of the original input but also secures the key of the system. Computational experiments showed the effectiveness of the proposed technique for key retrieval in the decryption process by using the live biometrics of the user.
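A minimal sketch of the XOR key-binding step described above follows; a simple repetition code stands in for the BCH encoder, and the shuffling and RSA layers are omitted, so this illustrates only the binding principle, not the proposed optical system. All names and sizes are assumptions.

    import os

    def xor_bytes(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    def repetition_encode(key, factor=3):
        """Stand-in for the BCH encoder: repeat each byte so that a limited
        number of biometric bit errors can later be corrected."""
        return bytes(b for byte in key for b in [byte] * factor)

    def repetition_decode(codeword, factor=3):
        """Bitwise majority-vote decode of the repetition code."""
        out = []
        for i in range(0, len(codeword), factor):
            chunk = codeword[i:i + factor]
            byte = 0
            for bit in range(8):
                ones = sum((c >> bit) & 1 for c in chunk)
                if ones * 2 > len(chunk):
                    byte |= 1 << bit
            out.append(byte)
        return bytes(out)

    # Enrolment: bind the encryption key to the (binarised) biometric feature vector
    key = os.urandom(8)
    features = os.urandom(24)                              # enrolment feature vector (illustrative)
    locked = xor_bytes(repetition_encode(key), features)   # stored in the token

    # Verification: a fresh, slightly noisy biometric reading unlocks the key
    fresh = bytearray(features)
    fresh[5] ^= 0x01                                       # one flipped bit from sensor noise
    recovered = repetition_decode(xor_bytes(locked, bytes(fresh)))
    assert recovered == key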
NASA Astrophysics Data System (ADS)
Foster, K.
1994-09-01
This document is a description of a computer program called Format( )MEDIC( )Input. The purpose of this program is to allow the user to quickly reformat wind velocity data in the Model Evaluation Database (MEDb) into a reasonable 'first cut' set of MEDIC input files (MEDIC.nml, StnLoc.Met, and Observ.Met). The user is cautioned that these resulting input files must be reviewed for correctness and completeness. This program will not format MEDb data into a Problem Station Library or Problem Metdata File. A description of how the program reformats the data is provided, along with a description of the required and optional user input and a description of the resulting output files. A description of the MEDb is not provided here but can be found in the RAS Division Model Evaluation Database Description document.
Stochastic analysis of multiphase flow in porous media: II. Numerical simulations
NASA Astrophysics Data System (ADS)
Abin, A.; Kalurachchi, J. J.; Kemblowski, M. W.; Chang, C.-M.
1996-08-01
The first paper (Chang et al., 1995b) of this two-part series described the stochastic analysis using a spectral/perturbation approach to analyze steady-state two-phase (water and oil) flow in a liquid-unsaturated, three-fluid-phase porous medium. In this paper, the results of the numerical simulations are compared with the closed-form expressions obtained using the perturbation approach. We present the solution to the one-dimensional, steady-state oil and water flow equations. The stochastic input processes are the spatially correlated log k, where k is the intrinsic permeability, and the soil retention parameter, α. These solutions are subsequently used in the numerical simulations to estimate the statistical properties of the key output processes. The comparison between the results of the perturbation analysis and numerical simulations showed a good agreement between the two methods over a wide range of log k variability with three different combinations of input stochastic processes of log k and the soil parameter α. The results clearly demonstrated the importance of considering the spatial variability of key subsurface properties under a variety of physical scenarios. The variability of both capillary pressure and saturation is affected by the type of input stochastic process used to represent the spatial variability. The results also demonstrated the applicability of perturbation theory in predicting the system variability and defining effective fluid properties through the ergodic assumption.
Bird, David A.
1983-01-01
A low-noise pulse conditioner is provided for driving electronic digital processing circuitry directly from differentially induced input pulses. The circuit uses a unique differential-to-peak detector circuit to generate a dynamic reference signal proportional to the input peak voltage. The input pulses are compared with the reference signal in an input network which operates in full differential mode with only a passive input filter. This reduces the introduction of circuit-induced noise, or jitter, generated in ground referenced input elements normally used in pulse conditioning circuits, especially speed transducer processing circuits.
Adaptive Neural Network Control for the Trajectory Tracking of the Furuta Pendulum.
Moreno-Valenzuela, Javier; Aguilar-Avelar, Carlos; Puga-Guzman, Sergio A; Santibanez, Victor
2016-12-01
The purpose of this paper is to introduce a novel adaptive neural network-based control scheme for the Furuta pendulum, which is a two degree-of-freedom underactuated system. Adaptation laws for the input and output weights are also provided. The proposed controller is able to guarantee tracking of a reference signal for the arm while the pendulum remains in the upright position. The key aspect of the derivation of the controller is the definition of an output function that depends on the position and velocity errors. The internal and external dynamics are rigorously analyzed, thereby proving the uniform ultimate boundedness of the error trajectories. By using real-time experiments, the new scheme is compared with other control methodologies, therein demonstrating the improved performance of the proposed adaptive algorithm.
NASA Technical Reports Server (NTRS)
Wilson, Emily L.; DiGregorio, A. J.; Riot, Vincent J.; Ammons, Mark S.; Bruner, WIlliam W.; Carter, Darrell; Mao, Jianping; Ramanathan, Anand; Strahan, Susan E.; Oman, Luke D.;
2017-01-01
We present a design for a 4U (20 cm × 20 cm × 10 cm) occultation-viewing laser heterodyne radiometer (LHR) that measures methane (CH4), carbon dioxide (CO2), and water vapor (H2O) in the limb and is designed for deployment on a 6U CubeSat. The LHR design collects sunlight that has undergone absorption by the trace gas and mixes it with a distributed feedback (DFB) laser centered at 1640 nm that scans across CO2, CH4, and H2O absorption features. Upper troposphere-lower stratosphere measurements of these gases provide key inputs to stratospheric circulation models: measuring stratospheric circulation and its variability is essential for projecting how climate change will affect stratospheric ozone.
Case studies in Bayesian microbial risk assessments.
Kennedy, Marc C; Clough, Helen E; Turner, Joanne
2009-12-21
The quantification of uncertainty and variability is a key component of quantitative risk analysis. Recent advances in Bayesian statistics make it ideal for integrating multiple sources of information, of different types and quality, and providing a realistic estimate of the combined uncertainty in the final risk estimates. We present two case studies related to foodborne microbial risks. In the first, we combine models to describe the sequence of events resulting in illness from consumption of milk contaminated with VTEC O157. We used Monte Carlo simulation to propagate uncertainty in some of the inputs to computer models describing the farm and pasteurisation process. Resulting simulated contamination levels were then assigned to consumption events from a dietary survey. Finally we accounted for uncertainty in the dose-response relationship and uncertainty due to limited incidence data to derive uncertainty about yearly incidences of illness in young children. Options for altering the risk were considered by running the model with different hypothetical policy-driven exposure scenarios. In the second case study we illustrate an efficient Bayesian sensitivity analysis for identifying the most important parameters of a complex computer code that simulated VTEC O157 prevalence within a managed dairy herd. This was carried out in 2 stages, first to screen out the unimportant inputs, then to perform a more detailed analysis on the remaining inputs. The method works by building a Bayesian statistical approximation to the computer code using a number of known code input/output pairs (training runs). We estimated that the expected total number of children aged 1.5-4.5 who become ill due to VTEC O157 in milk is 8.6 per year, with 95% uncertainty interval (0,11.5). The most extreme policy we considered was banning on-farm pasteurisation of milk, which reduced the estimate to 6.4 with 95% interval (0,11). In the second case study the effective number of inputs was reduced from 30 to 7 in the screening stage, and just 2 inputs were found to explain 82.8% of the output variance. A combined total of 500 runs of the computer code were used. These case studies illustrate the use of Bayesian statistics to perform detailed uncertainty and sensitivity analyses, integrating multiple information sources in a way that is both rigorous and efficient.
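The Monte Carlo propagation step of the first case study can be sketched as follows, assuming a generic exponential dose-response; all distributions and parameter values below are placeholders, not those of the VTEC O157 milk model described above.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Uncertain inputs (illustrative distributions and values only)
    concentration = rng.lognormal(mean=-2.0, sigma=1.0, size=n)     # organisms per ml
    serving_ml = rng.normal(loc=200, scale=50, size=n).clip(min=0)  # serving size
    dose = concentration * serving_ml

    # Exponential dose-response with an uncertain r parameter
    r = rng.uniform(1e-4, 1e-3, size=n)
    p_ill = 1.0 - np.exp(-r * dose)

    print("mean risk per serving:", p_ill.mean())
    print("95% uncertainty interval:", np.percentile(p_ill, [2.5, 97.5]))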
Wavelength meter having single mode fiber optics multiplexed inputs
Hackel, R.P.; Paris, R.D.; Feldman, M.
1993-02-23
A wavelength meter having a single mode fiber optics input is disclosed. The single mode fiber enables a plurality of laser beams to be multiplexed to form a multiplexed input to the wavelength meter. The wavelength meter can provide a determination of the wavelength of any one or all of the plurality of laser beams by suitable processing. Another aspect of the present invention is that one of the laser beams could be a known reference laser having a predetermined wavelength. Hence, the improved wavelength meter can provide an on-line calibration capability with the reference laser input as one of the plurality of laser beams.
Wavelength meter having single mode fiber optics multiplexed inputs
Hackel, Richard P.; Paris, Robert D.; Feldman, Mark
1993-01-01
A wavelength meter having a single mode fiber optics input is disclosed. The single mode fiber enables a plurality of laser beams to be multiplexed to form a multiplexed input to the wavelength meter. The wavelength meter can provide a determination of the wavelength of any one or all of the plurality of laser beams by suitable processing. Another aspect of the present invention is that one of the laser beams could be a known reference laser having a predetermined wavelength. Hence, the improved wavelength meter can provide an on-line calibration capability with the reference laser input as one of the plurality of laser beams.
NASA Technical Reports Server (NTRS)
Birchenough, Arthur G.
2003-01-01
Improvements in the efficiency and size of DC-DC converters have resulted from advances in components, primarily semiconductors, and improved topologies. One topology, which has shown very high potential in limited applications, is the Series Connected Boost Unit (SCBU), wherein a small DC-DC converter output is connected in series with the input bus to provide an output voltage equal to or greater than the input voltage. Since the DC-DC converter switches only a fraction of the power throughput, the overall system efficiency is very high. But this technique is limited to applications where the output is always greater than the input. The Series Connected Buck Boost Regulator (SCBBR) concept extends partial power processing technique used in the SCBU to operation when the desired output voltage is higher or lower than the input voltage, and the implementation described can even operate as a conventional buck converter to operate at very low output to input voltage ratios. This paper describes the operation and performance of an SCBBR configured as a bus voltage regulator providing 50 percent voltage regulation range, bus switching, and overload limiting, operating above 98 percent efficiency. The technique does not provide input-output isolation.
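The efficiency leverage of processing only part of the power can be illustrated with a small calculation, assuming an ideal pass-through path and a series converter that supplies only the voltage difference; the 90% converter efficiency and the 100 V to 110 V example are assumptions, not measured SCBU/SCBBR values.

    def scbu_efficiency(converter_eff, v_in, v_out):
        """Overall efficiency when only the series 'boost' fraction of the power
        is processed by the DC-DC converter and the rest passes straight through
        (an idealisation, not a statement about the hardware described above)."""
        processed_fraction = (v_out - v_in) / v_out  # share of output power from the converter
        return 1.0 / (1.0 + processed_fraction * (1.0 / converter_eff - 1.0))

    # Example: a 90%-efficient converter boosting a 100 V bus to 110 V
    print(scbu_efficiency(0.90, 100.0, 110.0))  # approximately 0.99 overall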
Doorenweerd, Camiel; van Haren, Merel M; Schermer, Maarten; Pieterse, Sander; van Nieukerken, Erik J
2014-01-01
We present an interactive key that is available online through any web browser without the need to install any additional software, making it an easily accessible tool for the larger public. The key can be found at http://identify.naturalis.nl/lithocolletinae. The key includes all 86 North-West European Lithocolletinae, a subfamily of smaller moths ("micro-moths") that is commonly not treated in field guides. The user can input data on several external morphological character systems in addition to distribution, host plant and even characteristics of the larval feeding traces to reach an identification. We expect that this will enable more people to contribute with reliable observation data on this group of moths and alleviate the workload of taxonomic specialists, allowing them to focus on other new keys or taxonomic work.
NASA Technical Reports Server (NTRS)
Myers, Jerry G.; Young, M.; Goodenow, Debra A.; Keenan, A.; Walton, M.; Boley, L.
2015-01-01
Model and simulation (MS) credibility is defined as the quality to elicit belief or trust in MS results. NASA-STD-7009 [1] delineates eight components (Verification, Validation, Input Pedigree, Results Uncertainty, Results Robustness, Use History, MS Management, People Qualifications) that address quantifying model credibility, and provides guidance to the model developers, analysts, and end users for assessing the MS credibility. Of the eight characteristics, input pedigree, or the quality of the data used to develop model input parameters, governing functions, or initial conditions, can vary significantly. These data quality differences have varying consequences across the range of MS applications. NASA-STD-7009 requires that the lowest input data quality be used to represent the entire set of input data when scoring the input pedigree credibility of the model. This requirement provides a conservative assessment of model inputs, and maximizes the communication of the potential level of risk of using model outputs. Unfortunately, in practice, this may result in overly pessimistic communication of the MS output, undermining the credibility of simulation predictions to decision makers. This presentation proposes an alternative assessment mechanism, utilizing results parameter robustness, also known as model input sensitivity, to improve the credibility scoring process for specific simulations.
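The difference between the conservative minimum-score rule and a sensitivity-weighted alternative of the kind proposed can be shown with a small sketch; the sub-scores and sensitivity weights below are invented for illustration, and the weighted rule is not part of NASA-STD-7009.

    def input_pedigree_score(subscores, method="min"):
        """Aggregate per-input pedigree sub-scores into one credibility score.
        'min' reflects the conservative lowest-input-quality rule; 'weighted' is
        an illustrative alternative that weights each input by a sensitivity
        estimate (the weights are assumptions)."""
        if method == "min":
            return min(s for s, _ in subscores)
        total_w = sum(w for _, w in subscores)
        return sum(s * w for s, w in subscores) / total_w

    # (pedigree sub-score, sensitivity weight) pairs for three model inputs
    inputs = [(4, 0.7), (3, 0.2), (1, 0.1)]
    print(input_pedigree_score(inputs, "min"))       # 1   (conservative)
    print(input_pedigree_score(inputs, "weighted"))  # 3.5 (sensitivity-weighted)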
Wilson, R; Abbott, J H
2018-04-01
To describe the construction and preliminary validation of a new population-based microsimulation model developed to analyse the health and economic burden and cost-effectiveness of treatments for knee osteoarthritis (OA) in New Zealand (NZ). We developed the New Zealand Management of Osteoarthritis (NZ-MOA) model, a discrete-time state-transition microsimulation model of the natural history of radiographic knee OA. In this article, we report on the model structure, derivation of input data, validation of baseline model parameters against external data sources, and validation of model outputs by comparison of the predicted population health loss with previous estimates. The NZ-MOA model simulates both the structural progression of radiographic knee OA and the stochastic development of multiple disease symptoms. Input parameters were sourced from NZ population-based data where possible, and from international sources where NZ-specific data were not available. The predicted distributions of structural OA severity and health utility detriments associated with OA were externally validated against other sources of evidence, and uncertainty resulting from key input parameters was quantified. The resulting lifetime and current population health-loss burden was consistent with estimates of previous studies. The new NZ-MOA model provides reliable estimates of the health loss associated with knee OA in the NZ population. The model structure is suitable for analysis of the effects of a range of potential treatments, and will be used in future work to evaluate the cost-effectiveness of recommended interventions within the NZ healthcare system.
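For readers unfamiliar with the model class, a minimal discrete-time state-transition microsimulation looks like the sketch below; the four states and the annual transition probabilities are placeholders, not NZ-MOA inputs.

    import numpy as np

    rng = np.random.default_rng(1)

    # Annual transition probabilities between illustrative knee-OA states
    # (none, mild, moderate, severe); values are placeholders only.
    P = np.array([[0.97, 0.03, 0.00, 0.00],
                  [0.00, 0.92, 0.08, 0.00],
                  [0.00, 0.00, 0.90, 0.10],
                  [0.00, 0.00, 0.00, 1.00]])

    def simulate_cohort(n_people=10_000, years=40):
        """Discrete-time state-transition microsimulation: each simulated person
        moves between states year by year according to the row of P for their
        current state."""
        states = np.zeros(n_people, dtype=int)            # everyone starts disease-free
        history = np.empty((years, n_people), dtype=int)
        for t in range(years):
            new_states = states.copy()
            for s in range(P.shape[0]):
                mask = states == s
                new_states[mask] = rng.choice(P.shape[0], size=int(mask.sum()), p=P[s])
            states = new_states
            history[t] = states
        return history

    history = simulate_cohort()
    print("share severe after 40 years:", (history[-1] == 3).mean())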
NASA Astrophysics Data System (ADS)
Gabrielse, C.; Nishimura, T.; Lyons, L. R.; Gallardo-Lacourt, B.; Deng, Y.; McWilliams, K. A.; Ruohoniemi, J. M.
2017-12-01
NASA's Heliophysics Decadal Survey put forth several imperative, Key Science Goals. The second goal communicates the urgent need to "Determine the dynamics and coupling of Earth's magnetosphere, ionosphere, and atmosphere and their response to solar and terrestrial inputs...over a range of spatial and temporal scales." Sun-Earth connections (called Space Weather) have strong societal impacts because extreme events can disturb radio communications and satellite operations. The field's current modeling capabilities of such Space Weather phenomena include large-scale, global responses of the Earth's upper atmosphere to various inputs from the Sun, but the meso-scale ( 50-500 km) structures that are much more dynamic and powerful in the coupled system remain uncharacterized. Their influences are thus far poorly understood. We aim to quantify such structures, particularly auroral flows and streamers, in order to create an empirical model of their size, location, speed, and orientation based on activity level (AL index), season, solar cycle (F10.7), interplanetary magnetic field (IMF) inputs, etc. We present a statistical study of meso-scale flow channels in the nightside auroral oval and polar cap using SuperDARN. These results are used to inform global models such as the Global Ionosphere Thermosphere Model (GITM) in order to evaluate the role of meso-scale disturbances on the fully coupled magnetosphere-ionosphere-thermosphere system. Measuring the ionospheric footpoint of magnetospheric fast flows, our analysis technique from the ground also provides a 2D picture of flows and their characteristics during different activity levels that spacecraft alone cannot.
NASA Astrophysics Data System (ADS)
Deng, H.; Wood, L.; Overeem, I.; Hutton, E.
2016-12-01
Submarine topography has a fundamental control on the movement of sediment gravity flows as well as the distribution, morphology, and internal heterogeneity of resultant overlying, healing-phase, deep-water reservoirs. Some of the most complex deep-water topography is generated through both destructive and constructive mass transport processes. A series of numerical models using Sedflux software have been constructed over high-resolution mass transport complex (MTC) top paleobathymetric surfaces mapped from 3D seismic data in offshore Morocco and offshore eastern Trinidad. Morocco's margin is characterized by large, extant rafted blocks and a flow-perpendicular fabric. Trinidad's margin is characterized by muddier, plastic flows and isolated extrusive diapiric buttresses. In addition, Morocco's margin is a dry, northern-latitude margin that lacks major river inputs, while Trinidad's margin is an equatorial, wet-climate margin that is fed by the Orinoco River and delta. These models quantitatively delineate the interaction of healing-phase gravity flows with the tops of two very different topographies and provide insights into healing-phase reservoir distribution and stratigraphic trap development. Slope roughness, curvature, and surface shape are measured relative to input points to quantify depositional surface character. A variety of sediment gravity flow types have been input and the resultant interval assessed for thickness and distribution relative to key topography parameters. Mathematical relationships are to be analyzed and compared with seismic data interpretation of healing-phase interval character, toward an improved model of gravity sedimentation and topography interactions.
Luo, Ye; Chamanzar, Maysamreza; Apuzzo, Aniello; Salas-Montiel, Rafael; Nguyen, Kim Ngoc; Blaize, Sylvain; Adibi, Ali
2015-02-11
The enhancement and confinement of electromagnetic radiation to nanometer scale have improved the performances and decreased the dimensions of optical sources and detectors for several applications including spectroscopy, medical applications, and quantum information. Realization of on-chip nanofocusing devices compatible with silicon photonics platform adds a key functionality and provides opportunities for sensing, trapping, on-chip signal processing, and communications. Here, we discuss the design, fabrication, and experimental demonstration of light nanofocusing in a hybrid plasmonic-photonic nanotaper structure. We discuss the physical mechanisms behind the operation of this device, the coupling mechanisms, and how to engineer the energy transfer from a propagating guided mode to a trapped plasmonic mode at the apex of the plasmonic nanotaper with minimal radiation loss. Optical near-field measurements and Fourier modal analysis carried out using a near-field scanning optical microscope (NSOM) show a tight nanofocusing of light in this structure to an extremely small spot of 0.00563(λ/(2n(rmax)))³ confined in 3D and an exquisite power input conversion of 92%. Our experiments also verify the mode selectivity of the device (low transmission of a TM-like input mode and high transmission of a TE-like input mode). A large field concentration factor (FCF) of about 4.9 is estimated from our NSOM measurement with a radius of curvature of about 20 nm at the apex of the nanotaper. The agreement between our theory and experimental results reveals helpful insights about the operation mechanism of the device, the interplay of the modes, and the gradual power transfer to the nanotaper apex.
Hybrid powertrain system including smooth shifting automated transmission
Beaty, Kevin D.; Nellums, Richard A.
2006-10-24
A powertrain system is provided that includes a prime mover and a change-gear transmission having an input, at least two gear ratios, and an output. The powertrain system also includes a power shunt configured to route power applied to the transmission by one of the input and the output to the other one of the input and the output. A transmission system and a method for facilitating shifting of a transmission system are also provided.
Multiplexer and time duration measuring circuit
Gray, Jr., James
1980-01-01
A multiplexer device is provided for multiplexing data in the form of randomly developed, variable width pulses from a plurality of pulse sources to a master storage. The device includes a first multiplexer unit which includes a plurality of input circuits each coupled to one of the pulse sources, with all input circuits being disabled when one input circuit receives an input pulse so that only one input pulse is multiplexed by the multiplexer unit at any one time.
NASA Astrophysics Data System (ADS)
Wang, Lijuan; Yan, Yong; Wang, Xue; Wang, Tao
2017-03-01
Input variable selection is an essential step in the development of data-driven models for environmental, biological and industrial applications. By eliminating irrelevant or redundant variables, input variable selection identifies a suitable subset of variables as the input of a model; it also simplifies the model structure and improves computational efficiency. This paper describes input variable selection procedures for data-driven models that measure liquid mass flowrate and gas volume fraction under two-phase flow conditions using Coriolis flowmeters. Three advanced input variable selection methods, namely partial mutual information (PMI), genetic algorithm-artificial neural network (GA-ANN) and tree-based iterative input selection (IIS), are applied in this study. Typical data-driven models incorporating support vector machines (SVM) are established individually based on the input candidates resulting from the selection methods. The validity of the selection outcomes is assessed through an output performance comparison of the SVM-based data-driven models and through sensitivity analysis. The validation and analysis results suggest that the input variables selected by the PMI algorithm provide more effective information for the models to measure liquid mass flowrate, while the IIS algorithm provides fewer but more effective variables for the models to predict gas volume fraction.
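To make the general pattern concrete, here is a minimal, hedged Python sketch of mutual-information-based input ranking followed by an SVM regression check; the synthetic data, feature count, and use of scikit-learn's mutual_info_regression are illustrative stand-ins, not the paper's PMI, GA-ANN, or IIS procedures.

```python
# Hypothetical sketch: rank candidate input variables by mutual information
# with the target, then validate an SVM regression model built on the
# top-ranked subset. Data and feature names are placeholders.
import numpy as np
from sklearn.feature_selection import mutual_info_regression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))                              # 8 candidate inputs
y = 2.0 * X[:, 0] - X[:, 3] + 0.1 * rng.normal(size=200)   # synthetic target

mi = mutual_info_regression(X, y, random_state=0)
ranked = np.argsort(mi)[::-1]        # most informative candidates first
subset = ranked[:3]                  # keep the top 3 variables

model = SVR(kernel="rbf", C=10.0)
score = cross_val_score(model, X[:, subset], y, cv=5, scoring="r2").mean()
print("selected columns:", subset, "cross-validated R^2:", round(score, 3))
```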
Rui, Yichao; Murphy, Daniel V.; Wang, Xiaoli; Hoyle, Frances C.
2016-01-01
Rebuilding ‘lost’ soil carbon (C) is a priority in mitigating climate change and underpinning key soil functions that support ecosystem services. Microorganisms determine whether fresh C input is converted into stable soil organic matter (SOM) or lost as CO2. Here we quantified whether microbial biomass and respiration responded positively to addition of light fraction organic matter (LFOM, representing recent inputs of plant residue) in an infertile semi-arid agricultural soil. Field trial soils with different historical plant residue inputs [soil C content: control (tilled) = 9.6 t C ha−1 versus tilled + plant residue treatment (tilled + OM) = 18.0 t C ha−1] were incubated in the laboratory with a gradient of LFOM equivalent to 0 to 3.8 t C ha−1 (0 to 500% LFOM). Microbial biomass C declined significantly under increased rates of LFOM addition while microbial respiration increased linearly, leading to a decrease in microbial C use efficiency. We hypothesise this was due to insufficient nutrients to form new microbial biomass, as LFOM input increased the ratio of C to nitrogen, phosphorus and sulphur in the soil. Increased CO2 efflux but constrained microbial growth in response to LFOM input demonstrates the difficulty of C storage in this environment. PMID:27752083
Image encryption based on nonlinear encryption system and public-key cryptography
NASA Astrophysics Data System (ADS)
Zhao, Tieyu; Ran, Qiwen; Chi, Yingying
2015-03-01
Recently, the optical asymmetric cryptosystem (OACS) has become a focus of discussion and concern among researchers. Some researchers have pointed out that OACS is not tenable because it misunderstands the concept of an asymmetric cryptosystem (ACS). We propose an improved cryptosystem that uses the RSA public-key algorithm on top of the existing OACS, and the new system conforms to the basic requirements of a public-key cryptosystem. At the beginning of the encryption process, the system produces an independent phase matrix and assigns it to the input image, which also conforms to a one-time-pad cryptosystem. The simulation results show the validity of the improved cryptosystem and its high robustness against attacks based on the phase retrieval technique.
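As a loose, non-authoritative analogy to the scheme's structure (a per-image random pad protected by an RSA public key), the following toy Python sketch may help; it uses textbook RSA with insecure parameters and a byte-wise XOR pad, and it is not the authors' optical implementation.

```python
# Toy sketch (not the authors' optical system): an RSA public key protects a
# randomly generated per-image pad, echoing the "independent phase matrix /
# one-time pad" idea in the abstract. Key sizes and the XOR pad are purely
# illustrative; real systems would use a vetted RSA library and padding.
import secrets

# Tiny textbook RSA key pair (insecure sizes, illustration only).
p, q = 61, 53
n = p * q                      # modulus
phi = (p - 1) * (q - 1)
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent (Python 3.8+)

def rsa_encrypt(m: int) -> int:
    return pow(m, e, n)

def rsa_decrypt(c: int) -> int:
    return pow(c, d, n)

# "Image" as bytes; a fresh random pad plays the role of the phase matrix.
image = bytes([10, 200, 31, 47, 99])
pad = bytes(secrets.randbelow(256) for _ in image)
cipher_image = bytes(a ^ b for a, b in zip(image, pad))

# Each pad byte is sent under the RSA public key (byte-wise, toy only).
encrypted_pad = [rsa_encrypt(b) for b in pad]

# The receiver recovers the pad with the private key and undoes the XOR.
recovered_pad = bytes(rsa_decrypt(c) for c in encrypted_pad)
assert bytes(a ^ b for a, b in zip(cipher_image, recovered_pad)) == image
```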
Planning for Success: Integrating Analysis with Decision Making.
ERIC Educational Resources Information Center
Goho, James; Webb, Ken
2003-01-01
Describes a successful strategic planning process at a large community college, which linked the analytic inputs of research with the authority and intuition of leaders. Reports key factors attributed to the process' success, including a collegial and organized structure, detailed project management plans, and confidence in the environmental scan.…
Consensus Building: A Key to School Transformation
ERIC Educational Resources Information Center
Baron, Daniel
2008-01-01
Consensus-based decision making can turn faculty meetings into meaningful and productive work sessions in which faculty members know that their input is respected and valued and important decisions are made. Reaching consensus has different meanings in different contexts. Decisions are made "by consensus" when the decision affects the entire…
Farming strategies to feed people, facilitate essential soil services, and fuel the economy
USDA-ARS?s Scientific Manuscript database
Perennial cellulosic biomass and food crop residues are important on-farm resources, which have become potential valuable sources of income as a harvestable commodity contributing to biofuel production demands. Inputs of carbon embedded in above-ground plant biomass are a key biological energy sourc...
Racial and Cultural Factors and Learning Transfer
ERIC Educational Resources Information Center
Closson, Rosemary
2013-01-01
Baldwin and Ford (1988) specifically include learner characteristics as one of three key inputs into the learning transfer process but infrequently (actually almost never) has race, ethnicity, or culture been included as a variable when describing trainee characteristics. For the most part one is left to speculate as to the potential influence…
Optimizing Indicator Choosing for Canal Control System and Simulation Study
USDA-ARS?s Scientific Manuscript database
One key problem for canal system control is how to select appropriate performance indicators and how to tune the controller with these indicators. A canal system is a multi-input and multi-output (MIMO) system, so judging control performance can be extremely complicated. In this paper, frequentl...
Turkish Student Science Teachers' Conceptions of Sustainable Development: A Phenomenography
ERIC Educational Resources Information Center
Kilinc, Ahmet; Aydin, Abdullah
2013-01-01
In creating a society whose citizens have sustainable lifestyles, education for sustainable development (ESD) plays a key role. However, the concept of sustainable development (SD) has developed independently from the input of educators; therefore, ESD presents current teachers with many challenges. At this point, understanding how stakeholders in…
Assumptions to the Annual Energy Outlook
2017-01-01
This report presents the major assumptions of the National Energy Modeling System (NEMS) used to generate the projections in the Annual Energy Outlook, including general features of the model structure, assumptions concerning energy markets, and the key input data and parameters that are the most significant in formulating the model results.
Chemical boundary conditions are a key input to regional-scale photochemical models. In this study, performed during the third phase of the Air Quality Model Evaluation International Initiative (AQMEII3), we perform annual simulations over North America with chemical boundary con...
Toward an inventory of nitrogen input to the United States
Accurate accounting of nitrogen inputs is increasingly necessary for policy decisions related to aquatic nutrient pollution. Here we synthesize available data to provide the first integrated estimates of the amount and uncertainty of nitrogen inputs to the United States. Abou...
Real-Time Biologically Inspired Action Recognition from Key Poses Using a Neuromorphic Architecture.
Layher, Georg; Brosch, Tobias; Neumann, Heiko
2017-01-01
Intelligent agents, such as robots, have to serve a multitude of autonomous functions. Examples include collision avoidance, navigation and route planning, active sensing of the environment, and interaction and non-verbal communication with people in the extended reach space. Here, we focus on recognizing the actions of a human agent based on a biologically inspired visual architecture for analyzing articulated movements. The proposed processing architecture builds upon coarsely segregated streams of sensory processing along different pathways that separately process form and motion information (Layher et al., 2014). Action recognition is performed in an event-based scheme by identifying representations of characteristic pose configurations (key poses) in an image sequence. In line with perceptual studies, key poses are selected without supervision using a feature-driven criterion that combines extrema in the motion energy with the horizontal and vertical extendedness of the body shape. Per-class representations of key pose frames are learned using a deep convolutional neural network consisting of 15 convolutional layers. The network is trained using the energy-efficient deep neuromorphic networks (Eedn) framework (Esser et al., 2016), which maps the trained synaptic weights onto the IBM Neurosynaptic System platform (Merolla et al., 2014). After the mapping, the trained network achieves real-time performance, processing input streams and classifying input images at about 1,000 frames per second while the computational stages consume only about 70 mW of energy (without spike transduction). Particularly for mobile robotic systems, such a low energy profile can be crucial in a variety of application scenarios. Cross-validation results are reported for two different datasets and compared to state-of-the-art action recognition approaches. The results demonstrate that (I) the presented approach is on par with other key-pose-based methods described in the literature, which select key pose frames by optimizing classification accuracy, (II) compared to training on the full set of frames, representations trained on key pose frames result in higher confidence in class assignments, and (III) key pose representations show promising generalization capabilities in a cross-dataset evaluation.
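The key-pose selection criterion described above (extrema of a score combining motion energy and body-shape extendedness) can be sketched as follows; the equal weighting, the silhouette input, and the local-extremum test are assumptions for illustration rather than the authors' exact formulation.

```python
# Hypothetical sketch of the key-pose selection criterion: combine
# frame-to-frame motion energy with the horizontal/vertical extent of a
# segmented body silhouette and keep frames at local extrema of the score.
import numpy as np

def motion_energy(prev: np.ndarray, curr: np.ndarray) -> float:
    """Sum of absolute pixel differences between consecutive frames."""
    return float(np.abs(curr.astype(float) - prev.astype(float)).sum())

def extendedness(silhouette: np.ndarray) -> float:
    """Width + height of the bounding box of the body silhouette."""
    ys, xs = np.nonzero(silhouette)
    if len(xs) == 0:
        return 0.0
    return float((xs.max() - xs.min()) + (ys.max() - ys.min()))

def select_key_poses(frames, silhouettes):
    """Return frame indices whose combined score is a local extremum."""
    scores = []
    for i in range(1, len(frames)):
        scores.append(motion_energy(frames[i - 1], frames[i])
                      + extendedness(silhouettes[i]))
    keys = []
    for i in range(1, len(scores) - 1):
        if scores[i] >= max(scores[i - 1], scores[i + 1]) or \
           scores[i] <= min(scores[i - 1], scores[i + 1]):
            keys.append(i + 1)           # +1 because scores start at frame 1
    return keys
```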
Happel, Max F K; Jeschke, Marcus; Ohl, Frank W
2010-08-18
Primary sensory cortex integrates sensory information from afferent feedforward thalamocortical projection systems and convergent intracortical microcircuits. Both input systems have been demonstrated to provide different aspects of sensory information. Here we have used high-density recordings of laminar current source density (CSD) distributions in primary auditory cortex of Mongolian gerbils in combination with pharmacological silencing of cortical activity and analysis of the residual CSD, to dissociate the feedforward thalamocortical contribution and the intracortical contribution to spectral integration. We found a temporally highly precise integration of both types of inputs when the stimulation frequency was in close spectral neighborhood of the best frequency of the measurement site, in which the overlap between both inputs is maximal. Local intracortical connections provide both directly feedforward excitatory and modulatory input from adjacent cortical sites, which determine how concurrent afferent inputs are integrated. Through separate excitatory horizontal projections, terminating in cortical layers II/III, information about stimulus energy in greater spectral distance is provided even over long cortical distances. These projections effectively broaden spectral tuning width. Based on these data, we suggest a mechanism of spectral integration in primary auditory cortex that is based on temporally precise interactions of afferent thalamocortical inputs and different short- and long-range intracortical networks. The proposed conceptual framework allows integration of different and partly controversial anatomical and physiological models of spectral integration in the literature.
40 CFR 60.4176 - Additional requirements to provide heat input data.
Code of Federal Regulations, 2010 CFR
2010-07-01
Compliance Times for Coal-Fired Electric Steam Generating Units; Monitoring and Reporting. § 60.4176 Additional requirements to provide heat input data. The owner or operator of a Hg Budget unit that monitors and reports Hg...
Self-Assembled Resonance Energy Transfer Keys for Secure Communication over Classical Channels.
Nellore, Vishwa; Xi, Sam; Dwyer, Chris
2015-12-22
Modern authentication and communication protocols increasingly use physical keys in lieu of conventional software-based keys for security. This shift is primarily driven by the ability to derive a unique, unforgeable signature from a physical key. The sole demonstration of an unforgeable key, thus far, has been through quantum key distribution, which suffers from limited communication distances and expensive infrastructure requirements. Here, we show a method for creating unclonable keys by molecular self-assembly of resonance energy transfer (RET) devices. It is infeasible to clone the RET key because of the inability to characterize the key using current technology, the large number of input-output combinations per key, and the variation of the key's response with time. However, the manufacturer can produce multiple identical devices, which enables inexpensive, secure authentication and communication over classical channels, and thus over any distance. Through a detailed experimental survey of the nanoscale keys, we demonstrate that legitimate users are successfully authenticated 99.48% of the time while false positives occur only 0.39% of the time, over two attempts. We estimate that a legitimate user would have a computational advantage of more than 10^340 years over an attacker. Our method enables the discovery of physical-key-based multiparty authentication and communication schemes that are both practical and possess unprecedented security.
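A generic challenge-response check of the kind such physical keys enable might look like the hedged Python sketch below; the enrollment table, noise tolerance, challenge count, and two-attempt policy are assumptions for illustration, not the authors' protocol.

```python
# Illustrative sketch (not the authors' protocol): authenticate a physical
# key by issuing random challenges and comparing measured responses with
# enrolled reference responses within a noise tolerance.
import random

def authenticate(measure, enrolled, n_challenges=16, tol=0.05, attempts=2):
    """measure(challenge) -> response; enrolled maps challenge -> reference."""
    for _ in range(attempts):
        challenges = random.sample(list(enrolled), n_challenges)
        matches = sum(abs(measure(c) - enrolled[c]) <= tol for c in challenges)
        if matches == n_challenges:
            return True              # every response matched on this attempt
    return False                     # both attempts failed
```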
Some conservative estimates in quantum cryptography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Molotkov, S. N.
2006-08-15
A relationship is established between the security of the BB84 quantum key distribution protocol and the forward and converse coding theorems for quantum communication channels. The upper bound Q_c ≈ 11% on the bit error rate compatible with secure key distribution is determined by solving the transcendental equation H(Q_c) = C̄(ρ)/2, where ρ is the density matrix of the input ensemble, C̄(ρ) is the classical capacity of a noiseless quantum channel, and H(Q) is the capacity of a classical binary symmetric channel with error rate Q.
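As a quick numerical check, if the classical capacity C̄(ρ) is taken to be 1 bit (an ideal qubit ensemble, an assumption for illustration) and H(Q) is read as the binary entropy of the error rate, the condition reduces to H(Q_c) = 1/2, and a simple bisection recovers the quoted ~11%:

```python
# Worked check of the ~11% bound under the stated assumptions:
# solve H(Q_c) = 1/2 with H the binary entropy, using bisection.
import math

def binary_entropy(q: float) -> float:
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

lo, hi = 1e-9, 0.5                    # H is increasing on (0, 1/2)
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if binary_entropy(mid) < 0.5:
        lo = mid
    else:
        hi = mid
print(round(0.5 * (lo + hi), 4))      # ≈ 0.1100, i.e. Q_c ≈ 11%
```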
Multi-Beam Approach for Accelerating Alignment and Calibration of HyspIRI-Like Imaging Spectrometers
NASA Technical Reports Server (NTRS)
Eastwood, Michael L.; Green, Robert O.; Mouroulis, Pantazis; Hochberg, Eric B.; Hein, Randall C.; Kroll, Linley A.; Geier, Sven; Coles, James B.; Meehan, Riley
2012-01-01
A paper describes an optical stimulus that produces more consistent results, and can be automated for unattended, routine generation of data analysis products needed by the integration and testing team assembling a high-fidelity imaging spectrometer system. One key attribute of the system is an arrangement of pick-off mirrors that provides multiple input beams (five in this implementation) to simultaneously provide stimulus light to several field angles along the field of view of the sensor under test, allowing one data set to contain all the information that previously required five data sets to be separately collected. This stimulus can also be fed by quickly reconfigured sources that ultimately provide three data set types that would previously be collected separately using three different setups: Spectral Response Function (SRF), Cross-track Response Function (CRF), and Along-track Response Function (ARF), respectively. This method also lends itself to expansion of the number of field points if less interpolation across the field of view is desirable. An absolute minimum of three is required at the beginning stages of imaging spectrometer alignment.
NASA Astrophysics Data System (ADS)
Cheok, Adrian David
Cuteness in interactive systems is a relatively new development, yet it has roots in many historical and cultural aesthetics. Symbols of cuteness abound in nature, as in creatures of neotenous proportions, which draw in the care and concern of a parent and care from a protector. We provide an in-depth look at the role of cuteness in interactive systems, beginning with its history. We particularly focus on the Japanese culture of Kawaii, which has made a large impact around the world, especially in entertainment, fashion, and animation. We then take the approach of defining cuteness in contemporary popular perception. User studies are presented that offer an in-depth understanding of the key perceptual elements identified as cute. This knowledge opens the possibility of creating a cute filter that can transform inputs and automatically produce cuter outputs. The development of cute social computing and entertainment projects is also discussed, providing an insight into the next generation of interactive systems that bring happiness and comfort to users of all ages and cultures through the soft power of cute.
DAKOTA Design Analysis Kit for Optimization and Terascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Dalbey, Keith R.; Eldred, Michael S.
2010-02-24
The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes (computational models) and iterative analysis methods. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and analysis of computational models on high performance computers. A user provides a set of DAKOTA commands in an input file and launches DAKOTA. DAKOTA invokes instances of the computational models, collects their results, and performs systems analyses. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, polynomial chaos, stochastic collocation, and epistemic methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as hybrid optimization, surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. Services for parallel computing, simulation interfacing, approximation modeling, fault tolerance, restart, and graphics are also included.
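As a rough, non-DAKOTA illustration of the driver pattern described above (an external iterator that invokes a simulation, collects responses, and reports a result), consider the generic Python sketch below; the function names and the simple grid study are placeholders and do not reflect DAKOTA's actual input syntax or APIs.

```python
# Generic illustration of an iterative-analysis driver (not DAKOTA's API):
# sweep input parameters, invoke the "simulation", and collect responses.
import itertools

def simulation(x1: float, x2: float) -> float:
    """Stand-in for an external computational model."""
    return (x1 - 1.0) ** 2 + (x2 + 0.5) ** 2

def parameter_study(grid1, grid2):
    results = []
    for x1, x2 in itertools.product(grid1, grid2):
        results.append(((x1, x2), simulation(x1, x2)))
    return min(results, key=lambda r: r[1])   # best point found in the sweep

best_inputs, best_response = parameter_study(
    [0.0, 0.5, 1.0, 1.5], [-1.0, -0.5, 0.0, 0.5])
print(best_inputs, best_response)
```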
NASA Astrophysics Data System (ADS)
Heynderickx, Daniel; Glover, Alexi
Operational space weather services rely heavily on reliable data streams from spacecraft and ground-based facilities, as well as from services providing processed data products. This event focuses on an unusual solar maximum viewed from several different perspectives, and as such highlights the important contribution of long-term archives in supporting space weather studies and services. We invite the space weather community to contribute to a discussion on the key topics listed below, with the aim of formulating recommendations and guidelines for policy makers, stakeholders, and data and service providers:
- facilitating access to and awareness of existing data resources
- establishing clear guidelines for space weather data archives, including data quality, interoperability, and metadata standards
- ensuring data ownership and terms of (re)use are clearly identified, so that this information can be taken into account when (potentially commercial) services are developed based on data provided without charge for scientific purposes only
All participants are invited to submit input for the discussion to the authors ahead of the Assembly. The outcome of the session will be formulated as a set of proposed panel recommendations.