Pe'er, Guy; Zurita, Gustavo A.; Schober, Lucia; Bellocq, Maria I.; Strer, Maximilian; Müller, Michael; Pütz, Sandro
2013-01-01
Landscape simulators are widely applied in landscape ecology for generating landscape patterns. These models can be divided into two categories: pattern-based models that generate spatial patterns irrespective of the processes that shape them, and process-based models that attempt to generate patterns based on the processes that shape them. The latter often tend toward complexity in an attempt to obtain high predictive precision, but are rarely used for generic or theoretical purposes. Here we show that a simple process-based simulator can generate a variety of spatial patterns, including realistic ones, typifying landscapes fragmented by anthropogenic activities. The model “G-RaFFe” generates roads and fields to reproduce the processes by which forests are converted into arable lands. For a selected level of habitat cover, three factors dominate its outcomes: the number of roads (accessibility), maximum field size (accounting for land ownership patterns), and maximum field disconnection (which enables fields to be detached from roads). We compared the performance of G-RaFFe to three other models: Simmap (neutral model), Qrule (fractal-based) and Dinamica EGO (with 4 model versions differing in complexity). A PCA-based analysis indicated G-RaFFe and Dinamica version 4 (most complex) to perform best in matching realistic spatial patterns, but an alternative analysis which considers model variability identified G-RaFFe and Qrule as performing best. We also found model performance to be affected by habitat cover and the actual land uses, the latter reflecting land ownership patterns. We suggest that simple process-based generators such as G-RaFFe can be used to generate spatial patterns as templates for theoretical analyses, as well as for gaining a better understanding of the relation between spatial processes and patterns. We suggest caution in applying neutral or fractal-based approaches, since spatial patterns that typify anthropogenic landscapes are often non-fractal in nature. PMID:23724108
A Search Algorithm for Generating Alternative Process Plans in Flexible Manufacturing System
NASA Astrophysics Data System (ADS)
Tehrani, Hossein; Sugimura, Nobuhiro; Tanimizu, Yoshitaka; Iwamura, Koji
The capabilities and complexity of manufacturing systems are increasing, driving the move toward an integrated manufacturing environment. Availability of alternative process plans is a key factor for the integration of design, process planning and scheduling. This paper describes an algorithm for the generation of alternative process plans by extending the existing framework of process plan networks. A class diagram is introduced for generating process plans and process plan networks from the viewpoint of integrated process planning and scheduling systems. An incomplete search algorithm is developed for generating and searching the process plan networks. The benefit of this algorithm is that the whole process plan network does not have to be generated before the search starts. The algorithm is therefore applicable to very large process plan networks and can search wide areas of the network based on user requirements. It can generate alternative process plans and select a suitable one based on the objective functions.
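A minimal sketch of the kind of incomplete, cost-guided search the abstract describes, over a toy process plan network; the dictionary graph, cost function and expansion cap below are illustrative assumptions, not the paper's data structures:

```python
import heapq

def search_process_plans(network, start, goal, cost_fn, max_expansions=10_000):
    """Best-first search over a process plan network.

    `network` maps a node (an intermediate machining state) to its
    successors; `cost_fn(a, b)` is the cost of process step a -> b.
    The search is deliberately incomplete: it stops after
    `max_expansions` expansions, so the whole network never has to be
    generated before searching begins.
    """
    frontier = [(0.0, [start])]
    expansions = 0
    while frontier and expansions < max_expansions:
        cost, path = heapq.heappop(frontier)
        node = path[-1]
        expansions += 1
        if node == goal:
            return cost, path
        for succ in network.get(node, []):
            if succ not in path:  # avoid revisiting states on this plan
                heapq.heappush(frontier, (cost + cost_fn(node, succ), path + [succ]))
    return None  # no plan found within the expansion budget

# Toy network: raw stock can be drilled or milled before finishing.
net = {"raw": ["drilled", "milled"], "drilled": ["finished"], "milled": ["finished"]}
print(search_process_plans(net, "raw", "finished", lambda a, b: 1.0))
```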
LISP based simulation generators for modeling complex space processes
NASA Technical Reports Server (NTRS)
Tseng, Fan T.; Schroer, Bernard J.; Dwan, Wen-Shing
1987-01-01
The development of a simulation assistant for modeling discrete event processes is presented. Included are an overview of the system, a description of the simulation generators, and a sample process generated using the simulation assistant.
Coherence-generating power of quantum dephasing processes
NASA Astrophysics Data System (ADS)
Styliaris, Georgios; Campos Venuti, Lorenzo; Zanardi, Paolo
2018-03-01
We provide a quantification of the capability of various quantum dephasing processes to generate coherence out of incoherent states. The measures defined, admitting computable expressions for any finite Hilbert-space dimension, are based on probabilistic averages and arise naturally from the viewpoint of coherence as a resource. We investigate how the capability of a dephasing process (e.g., a nonselective orthogonal measurement) to generate coherence depends on the relevant bases of the Hilbert space over which coherence is quantified and the dephasing process occurs, respectively. We extend our analysis to include those Lindblad time evolutions which, in the infinite-time limit, dephase the system under consideration and calculate their coherence-generating power as a function of time. We further identify specific families of such time evolutions that, although dephasing, have optimal (over all quantum processes) coherence-generating power for some intermediate time. Finally, we investigate the coherence-generating capability of random dephasing channels.
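As a schematic of the kind of measure described here (the paper fixes the exact coherence measure, probability measure and normalization), the coherence-generating power of a dephasing channel $\mathcal{E}$ with respect to a reference basis $B$ can be written as a probabilistic average of the coherence created from incoherent inputs:

$$
C_B(\mathcal{E}) \;=\; \int c_B\big(\mathcal{E}(\rho)\big)\, d\mu(\rho),
\qquad
c_B(\sigma) \;=\; \sum_{i \neq j} \big|\langle i|\sigma|j\rangle\big|^2 ,
$$

where $\mu$ is a probability measure over states $\rho$ diagonal in $B$, and $c_B$ is taken here, for illustration, as the squared 2-norm of the off-diagonal part of the output state.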
Testing Strategies for Model-Based Development
NASA Technical Reports Server (NTRS)
Heimdahl, Mats P. E.; Whalen, Mike; Rajan, Ajitha; Miller, Steven P.
2006-01-01
This report presents an approach for testing artifacts generated in a model-based development process. This approach divides the traditional testing process into two parts: requirements-based testing (validation testing) which determines whether the model implements the high-level requirements and model-based testing (conformance testing) which determines whether the code generated from a model is behaviorally equivalent to the model. The goals of the two processes differ significantly and this report explores suitable testing metrics and automation strategies for each. To support requirements-based testing, we define novel objective requirements coverage metrics similar to existing specification and code coverage metrics. For model-based testing, we briefly describe automation strategies and examine the fault-finding capability of different structural coverage metrics using tests automatically generated from the model.
D.C. - ARC plasma generator for nonequilibrium plasmachemical processes
NASA Astrophysics Data System (ADS)
Kvaltin, J.
1990-06-01
An analysis of the conditions for generating nonequilibrium plasma for plasmachemical processes is made, and the design of a d.c.-arc plasma generator based on an integral criterion is suggested. Measurements of potentials along the plasma column of this generator are presented.
Rius, Manuel; Bolea, Mario; Mora, José; Ortega, Beatriz; Capmany, José
2015-05-18
We experimentally demonstrate, for the first time, a chirped microwave pulse generator based on the processing of an incoherent optical signal by means of a nonlinear dispersive element. Different capabilities have been demonstrated, such as control of the time-bandwidth product and frequency tuning, increasing the flexibility of the generated waveform compared to coherent techniques. Moreover, the use of differential detection considerably mitigates the signal-to-noise ratio limitation associated with incoherent processing.
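A standard way to see how a dispersive element shapes the chirp (the paper's specific configuration may differ) is the Taylor expansion of the element's group delay around the optical carrier $\omega_0$:

$$
\tau(\omega) \;=\; \Phi_2\,(\omega-\omega_0) \;+\; \tfrac{1}{2}\,\Phi_3\,(\omega-\omega_0)^2 \;+\; \cdots
$$

With purely second-order dispersion $\Phi_2$ the frequency-to-time mapping is linear; the higher-order terms ($\Phi_3,\dots$) of a nonlinear dispersive element bend this mapping, which is what allows the envelope, chirp and time-bandwidth product of the generated microwave pulse to be tuned through the dispersion together with the power distribution of the incoherent source.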
Autonomous Agents for Dynamic Process Planning in the Flexible Manufacturing System
NASA Astrophysics Data System (ADS)
Nik Nejad, Hossein Tehrani; Sugimura, Nobuhiro; Iwamura, Koji; Tanimizu, Yoshitaka
Rapid changes in market demands and pressures of competition require manufacturers to maintain highly flexible manufacturing systems to cope with a complex manufacturing environment. This paper deals with the development of an agent-based architecture of dynamic systems for incremental process planning in manufacturing systems. In consideration of alternative manufacturing processes and machine tools, the process plans and the schedules of the manufacturing resources are generated incrementally and dynamically. A negotiation protocol is discussed in this paper to generate suitable process plans for the target products in real time and dynamically, based on the alternative manufacturing processes. The alternative manufacturing processes are represented by the process plan networks discussed in the previous paper, and suitable process plans are searched for and generated to cope with both dynamic changes of the product specifications and disturbances of the manufacturing resources. We combine the heuristic search algorithms of the process plan networks with the negotiation protocols in order to generate suitable process plans in the dynamic manufacturing environment.
NASA Astrophysics Data System (ADS)
Han, Xuesong; Li, Haiyan; Zhao, Fu
2017-07-01
Particle-fluid based surface generation has already become one of the most important materials processing technologies for many advanced materials such as optical crystals, ceramics and so on. Most particle-fluid based surface generation technologies involve two key processes: a chemical reaction, which is responsible for surface softening, and physical behavior, which is responsible for material removal/deformation. Presently, researchers cannot give a reasonable explanation of the complex process in particle-fluid based surface generation technology because of the small temporal-spatial scale and the concurrent influence of physical-chemical processes. The molecular dynamics (MD) method has already been proven to be a promising approach for constructing effective models of atomic-scale phenomena; it can serve as a predictive simulation tool for analyzing the complex surface generation mechanism and is employed in this research to study the essence of surface generation. The deformation and pile-up of water molecules is induced by the feeding of the abrasive particle, which justifies the property mutation of water at the nanometer scale. There is little silica molecule aggregation or material removal because the water layer greatly reduces the strength of the mechanical interaction between the particle and the material surface and minimizes stress concentration. Furthermore, a chemical effect is also observed at the interface: stable chemical bonds are generated between water and silica, leading to the formation of silanol, and the reaction rate changes with the amount of water molecules in the local environment. A novel ring structure is observed on the silica surface and is shown to favor chemical reaction with water molecules. The siloxane bond formation process quickly strengthens across the interface with the feeding of the abrasive particle because of the compressive stress resulting from the impacting behavior.
Hardware based redundant multi-threading inside a GPU for improved reliability
Sridharan, Vilas; Gurumurthi, Sudhanva
2015-05-05
A system and method for verifying computation output using computer hardware are provided. Instances of computation are generated and processed on hardware-based processors. As instances of computation are processed, each instance of computation receives a load accessible to other instances of computation. Instances of output are generated by processing the instances of computation. The instances of output are verified against each other in a hardware based processor to ensure accuracy of the output.
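A software analogue of the redundant multi-threading scheme in this patent abstract (the patent implements it in GPU hardware; the function below is only an illustrative sketch):

```python
def redundant_run(fn, args, copies=2):
    """Run the same computation as several independent instances and
    verify the outputs against each other, returning the agreed result."""
    outputs = [fn(*args) for _ in range(copies)]
    reference = outputs[0]
    if any(out != reference for out in outputs[1:]):
        raise RuntimeError("redundancy check failed: instances disagree")
    return reference

print(redundant_run(lambda a, b: a + b, (2, 3)))  # -> 5
```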
NASA Astrophysics Data System (ADS)
Li, Zhongyang; Wang, Silei; Wang, Mengtao; Yuan, Bin; Wang, Weishu
2017-10-01
Terahertz (THz) generation by difference frequency generation (DFG) processes with dual signal waves is theoretically analyzed. The dual signal waves are generated by an optical parametric oscillator (OPO) with periodically inverted KTiOPO4 (KTP) plates based on adhesive-free-bonded (AFB) technology. The phase-matching conditions in the same AFB KTP composite for the OPO generating signals and idlers and for the DFG generating the THz wave can be simultaneously satisfied by selecting the thickness of each KTP plate. Moreover, 4-order cascaded DFG processes can be realized in the same AFB KTP composite. The cascaded Stokes interaction processes generating THz photons and the cascaded anti-Stokes interaction processes consuming THz photons are investigated from coupled wave equations. Taking 3.106 THz, which lies in the vicinity of the polariton resonances, as an example, THz intensities and quantum conversion efficiencies are calculated. Compared with non-cascaded DFG processes, THz intensities at 3.106 THz in 4-order cascaded DFG processes increase by a factor of 5.56. When the pump intensity equals 20 MW mm⁻², a quantum conversion efficiency of 259% can be realized in 4-order cascaded DFG processes, which exceeds the Manley-Rowe limit.
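The role of cascading can be made explicit with a simple photon-counting argument (a back-of-the-envelope reading of the abstract, not the paper's coupled-wave analysis): in single-step DFG each converted pump photon yields at most one THz photon, so the quantum conversion efficiency obeys the Manley-Rowe bound, while an $n$-order cascade lets each pump photon emit up to $n$ THz photons through successive Stokes shifts:

$$
\eta_q \;=\; \frac{N_{\mathrm{THz}}}{N_{\mathrm{pump}}} \;\le\; 1
\quad (\text{single DFG}),
\qquad
\eta_q \;\le\; n
\quad (n\text{-order cascade}).
$$

This is why the reported 259% efficiency with 4-order cascading can exceed the Manley-Rowe limit without violating energy conservation: the energy efficiency remains smaller by the photon-energy ratio $\omega_{\mathrm{THz}}/\omega_{\mathrm{pump}}$.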
A Model of Generating Visual Place Cells Based on Environment Perception and Similar Measure.
Zhou, Yang; Wu, Dewei
2016-01-01
Generating visual place cells (VPCs) is an important problem in the field of bioinspired navigation. By analyzing the firing characteristics of biological place cells and the existing methods for generating VPCs, a model of generating visual place cells based on environment perception and a similarity measure is abstracted in this paper. The VPC generation process is divided into three phases: environment perception, similarity measurement, and recruitment of a new place cell. According to this process, a specific method for generating VPCs is presented. External reference landmarks are obtained based on local invariant image characteristics, and a similarity measure function is designed based on Euclidean distance and a Gaussian function. Simulations validate that the proposed method is effective. The firing characteristics of the generated VPCs are similar to those of biological place cells, and the VPCs' firing fields can be adjusted flexibly by changing the adjustment factor of the firing field (AFFF) and the firing rate threshold (FRT). PMID:27597859
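A minimal sketch of the similarity measure and firing rule the abstract outlines (Euclidean distance passed through a Gaussian, gated by a threshold); the exact feature vectors, the use of AFFF as the Gaussian width, and the recruitment rule are assumptions for illustration:

```python
import numpy as np

def similarity(view_a, view_b, sigma=1.0):
    """Gaussian of the Euclidean distance between two landmark feature vectors."""
    d = np.linalg.norm(np.asarray(view_a, float) - np.asarray(view_b, float))
    return float(np.exp(-d**2 / (2.0 * sigma**2)))

def vpc_firing_rate(current_view, cell_view, afff=1.0, frt=0.2):
    """Firing rate of one VPC; AFFF widens/narrows the field, FRT gates it."""
    rate = similarity(current_view, cell_view, sigma=afff)
    return rate if rate >= frt else 0.0

# A new place cell would be recruited when no existing cell fires above FRT.
print(vpc_firing_rate([0.0, 0.0], [0.5, 0.5]))
```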
AIRSAR Web-Based Data Processing
NASA Technical Reports Server (NTRS)
Chu, Anhua; Van Zyl, Jakob; Kim, Yunjin; Hensley, Scott; Lou, Yunling; Madsen, Soren; Chapman, Bruce; Imel, David; Durden, Stephen; Tung, Wayne
2007-01-01
The AIRSAR automated, Web-based data processing and distribution system is an integrated, end-to-end synthetic aperture radar (SAR) processing system. Designed to function under limited resources and rigorous demands, AIRSAR eliminates operational errors and provides for paperless archiving. It also provides a yearly tune-up of the processor on flight missions, as well as quality assurance with new radar modes and anomalous data compensation. The software fully integrates a Web-based SAR data-user request subsystem, a data processing system to automatically generate co-registered multi-frequency images from both polarimetric and interferometric data collection modes in 80/40/20 MHz bandwidth, an automated verification quality assurance subsystem, and an automatic data distribution system for use in the remote-sensing community. Features include Survey Automation Processing, in which the software can automatically generate a quick-look image from an entire 90-GB, 32-MB/s SAR raw data tape overnight without operator intervention. The software also allows product ordering and distribution via a Web-based user request system. To make AIRSAR more user friendly, it has been designed to let users search by entering the desired mission flight line (Missions Searching), or to search for any mission flight line by entering the desired latitude and longitude (Map Searching). For precision image automation processing, the software generates products according to each data processing request stored in the database via a queue management system. Users obtain automatically generated, co-registered multi-frequency images, with polarimetric and/or interferometric SAR data processed in ground and/or slant projection according to user processing requests for one of the 12 radar modes.
A Positive Generation Effect on Memory for Auditory Context
Overman, Amy A.; Richard, Alison G.; Stephens, Joseph D. W.
2016-01-01
Self-generation of information during memory encoding has large positive effects on subsequent memory for items, but mixed effects on memory for contextual information associated with items. A processing account of generation effects on context memory (Mulligan, 2004; Mulligan, Lozito, & Rosner, 2006) proposes that these effects depend on whether the generation task causes any shift in processing of the type of context features for which memory is being tested. Mulligan and colleagues have used this account to predict various negative effects of generation on context memory, but the account also predicts positive generation effects under certain circumstances. The present experiment provided a critical test of the processing account by examining how generation affected memory for auditory rather than visual context. Based on the processing account, we predicted that generation of rhyme words should enhance processing of auditory information associated with the words (i.e., voice gender) whereas generation of antonym words should have no effect. These predictions were confirmed, providing support to the processing account. PMID:27696145
Auto Code Generation for Simulink-Based Attitude Determination Control System
NASA Technical Reports Server (NTRS)
MolinaFraticelli, Jose Carlos
2012-01-01
This paper details the work done to auto generate C code from a Simulink-based Attitude Determination Control System (ADCS) to be used on target platforms. NASA Marshall engineers have developed an ADCS Simulink simulation to be used as a component of the flight software of a satellite. The generated code can be used for carrying out hardware-in-the-loop testing of satellite components in a convenient manner with easily tunable parameters. Due to the nature of embedded hardware components such as microcontrollers, this simulation code cannot be used directly, as it is, on the target platform and must first be converted into C code; this process is known as auto code generation. In order to generate C code from this simulation, it must be modified to follow specific standards set in place by the auto code generation process. Some of these modifications include changing certain simulation models into their atomic representations, which can introduce new complications into the simulation. The execution order of these models can change based on these modifications. Great care must be taken in order to maintain a working simulation that can also be used for auto code generation. After modifying the ADCS simulation for the auto code generation process, it is shown that the difference between the output data of the former and that of the latter is within acceptable bounds. Thus, it can be said that the process is a success, since all the output requirements are met. Based on these results, it can be argued that this generated C code can be used effectively by any desired platform as long as it follows the specific memory requirements established in the Simulink model.
On the design of Henon and logistic map-based random number generator
NASA Astrophysics Data System (ADS)
Magfirawaty; Suryadi, M. T.; Ramli, Kalamullah
2017-10-01
The key sequence is one of the main elements in a cryptosystem. The True Random Number Generator (TRNG) method is one approach to generating the key sequence. The randomness sources of TRNGs are divided into three main groups: electrical-noise based, jitter based and chaos based. The chaos-based approach utilizes a nonlinear dynamic system (continuous-time or discrete-time) as an entropy source. In this study, a new design of TRNG based on a discrete-time chaotic system is proposed, which is then simulated in LabVIEW. The principle of the design consists of combining 2D and 1D chaotic systems. A mathematical model is implemented for numerical simulations. We used a comparator process as a harvester method to obtain the series of random bits. Without any post-processing, the proposed design generated random bit sequences with high entropy values and passed all NIST 800.22 statistical tests.
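A minimal sketch of the combined 2D (Henon) + 1D (logistic) design with comparator harvesting, as the abstract describes; the parameter values, rescaling and exact comparator rule are illustrative assumptions (and a software simulation of chaos is of course a pseudo-random, not a true-random, source):

```python
def chaotic_bits(n, a=1.4, b=0.3, r=3.99, seed=(0.1, 0.1, 0.3)):
    """Harvest n bits by comparing a Henon map (2D) against a logistic map (1D)."""
    x, y, z = seed
    bits = []
    for _ in range(n):
        x, y = 1.0 - a * x * x + y, b * x   # Henon map update
        z = r * z * (1.0 - z)               # logistic map update
        # Henon x roughly spans [-1.5, 1.5]; rescale to [0, 1] before comparison
        bits.append(1 if (x + 1.5) / 3.0 > z else 0)
    return bits

print("".join(map(str, chaotic_bits(64))))
```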
Process-based Cost Estimation for Ramjet/Scramjet Engines
NASA Technical Reports Server (NTRS)
Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John
2003-01-01
Process-based cost estimation plays a key role in effecting cultural change that integrates distributed science, technology and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation bridging the methodologies of high-level parametric models and detailed bottom-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As the product design progresses and matures, the lower-level, more detailed cost drivers can be re-accessed and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
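One common way to realize a probabilistic structure of layered cost drivers is a Monte Carlo roll-up; the sketch below assumes independent triangular driver ranges purely for illustration and is not the NASA GRC/Boeing model:

```python
import random

def cost_risk(drivers, n_samples=10_000):
    """Monte Carlo roll-up of layered cost drivers.

    `drivers` is a list of (low, mode, high) triangular ranges, one per
    lower-level cost driver; as a design matures the ranges narrow and
    the cost-risk estimate tightens. Independence of the drivers is an
    illustrative simplification.
    """
    totals = sorted(
        sum(random.triangular(lo, hi, mode) for lo, mode, hi in drivers)
        for _ in range(n_samples)
    )
    return {"p50": totals[n_samples // 2], "p80": totals[int(0.8 * n_samples)]}

# Three cost drivers, in $M: (low, most likely, high)
print(cost_risk([(1.0, 1.5, 3.0), (0.5, 0.8, 2.0), (2.0, 2.5, 4.0)]))
```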
Mirus, B.B.; Ebel, B.A.; Heppner, C.S.; Loague, K.
2011-01-01
Concept development simulation with distributed, physics-based models provides a quantitative approach for investigating runoff generation processes across environmental conditions. Disparities within data sets employed to design and parameterize boundary value problems used in heuristic simulation inevitably introduce various levels of bias. The objective was to evaluate the impact of boundary value problem complexity on process representation for different runoff generation mechanisms. The comprehensive physics-based hydrologic response model InHM has been employed to generate base case simulations for four well-characterized catchments. The C3 and CB catchments are located within steep, forested environments dominated by subsurface stormflow; the TW and R5 catchments are located in gently sloping rangeland environments dominated by Dunne and Horton overland flows. Observational details are well captured within all four of the base case simulations, but the characterization of soil depth, permeability, rainfall intensity, and evapotranspiration differs for each. These differences are investigated through the conversion of each base case into a reduced case scenario, all sharing the same level of complexity. Evaluation of how individual boundary value problem characteristics impact simulated runoff generation processes is facilitated by quantitative analysis of integrated and distributed responses at high spatial and temporal resolution. Generally, the base case reduction causes moderate changes in discharge and runoff patterns, with the dominant process remaining unchanged. Moderate differences between the base and reduced cases highlight the importance of detailed field observations for parameterizing and evaluating physics-based models. Overall, similarities between the base and reduced cases indicate that the simpler boundary value problems may be useful for concept development simulation to investigate fundamental controls on the spectrum of runoff generation mechanisms.
Alterations in choice behavior by manipulations of world model.
Green, C S; Benson, C; Kersten, D; Schrater, P
2010-09-14
How to compute initially unknown reward values makes up one of the key problems in reinforcement learning theory, with two basic approaches being used. Model-free algorithms rely on the accumulation of substantial amounts of experience to compute the value of actions, whereas in model-based learning, the agent seeks to learn the generative process for outcomes from which the value of actions can be predicted. Here we show that (i) "probability matching"-a consistent example of suboptimal choice behavior seen in humans-occurs in an optimal Bayesian model-based learner using a max decision rule that is initialized with ecologically plausible, but incorrect beliefs about the generative process for outcomes and (ii) human behavior can be strongly and predictably altered by the presence of cues suggestive of various generative processes, despite statistically identical outcome generation. These results suggest human decision making is rational and model based and not consistent with model-free learning.
Error Generation in CATS-Based Agents
NASA Technical Reports Server (NTRS)
Callantine, Todd
2003-01-01
This research presents a methodology for generating errors from a model of nominally preferred correct operator activities, given a particular operational context, and maintaining an explicit link to the erroneous contextual information to support analyses. It uses the Crew Activity Tracking System (CATS) model as the basis for error generation. This report describes how the process works, and how it may be useful for supporting agent-based system safety analyses. The report presents results obtained by applying the error-generation process and discusses implementation issues. The research is supported by the System-Wide Accident Prevention Element of the NASA Aviation Safety Program.
Automated knowledge generation
NASA Technical Reports Server (NTRS)
Myler, Harley R.; Gonzalez, Avelino J.
1988-01-01
The general objectives of the NASA/UCF Automated Knowledge Generation Project were the development of an intelligent software system that could access CAD design databases, interpret them, and generate a diagnostic knowledge base in the form of a system model. The initial area of concentration is the diagnosis of the process control system using the Knowledge-based Autonomous Test Engineer (KATE) diagnostic system. A secondary objective was the study of general problems of automated knowledge generation. A prototype was developed, based on an object-oriented language (Flavors).
The role of optimization in the next generation of computer-based design tools
NASA Technical Reports Server (NTRS)
Rogan, J. Edward
1989-01-01
There is a close relationship between design optimization and the emerging new generation of computer-based tools for engineering design. With some notable exceptions, the development of these new tools has not taken full advantage of recent advances in numerical design optimization theory and practice. Recent work in the field of design process architecture has included an assessment of the impact of next-generation computer-based design tools on the design process. These results are summarized, and insights into the role of optimization in a design process based on these next-generation tools are presented. An example problem has been worked out to illustrate the application of this technique. The example problem - layout of an aircraft main landing gear - is one that is simple enough to be solved by many other techniques. Although the mathematical relationships describing the objective function and constraints for the landing gear layout problem can be written explicitly and are quite straightforward, an approximation technique has been used in the solution of this problem that can just as easily be applied to integrate supportability or producibility assessments using theory of measurement techniques into the design decision-making process.
NASA Astrophysics Data System (ADS)
Görgl, Richard; Brandstätter, Elmar
2017-01-01
The article presents an overview of what is possible nowadays in the field of laser materials processing. The state of the art along the complete process chain is shown, starting with the generation of a specific component's CAD data and continuing with automated motion path generation for the laser head carried by a CNC or robot system. Application examples from laser cladding and laser-based additive manufacturing are given.
Impact Assessment and Environmental Evaluation of Various Ammonia Production Processes
NASA Astrophysics Data System (ADS)
Bicer, Yusuf; Dincer, Ibrahim; Vezina, Greg; Raso, Frank
2017-05-01
In the current study, conventional-resource-based ammonia generation routes are comparatively studied through a comprehensive life cycle assessment. The selected ammonia generation options range from the most widely used steam methane reforming to partial oxidation of heavy oil. The chosen ammonia synthesis process is the most common commercially available Haber-Bosch process. The essential energy input for the methods comes from various conventional resources such as coal, nuclear, natural gas and heavy oil. Using the life cycle assessment methodology, the environmental impacts of the selected methods are identified and quantified from cradle to gate. The life cycle assessment outcomes of the conventional-resource-based ammonia production routes show that the nuclear electrolysis-based ammonia generation method yields the lowest global warming and climate change impacts, while the coal-based electrolysis option brings greater environmental burdens. The calculated greenhouse gas emission from nuclear-based electrolysis is 0.48 kg CO2 equivalent per kg of ammonia, while it is 13.6 kg CO2 per kg of ammonia for the coal-based electrolysis method.
NASA Technical Reports Server (NTRS)
Izygon, Michel E.
1992-01-01
The development process of the knowledge base for the generation of Test Libraries for Mission Operations Computer (MOC) Command Support focused on a series of information-gathering interviews. These knowledge capture sessions support the development of a prototype for evaluating the capabilities of INTUIT on such an application. The prototype includes functions related to POCC (Payload Operation Control Center) processing. It prompts the end-users for input through a series of panels and then generates the Meds associated with the initialization and the update of hazardous command tables for a POCC Processing TLIB.
Pulsed corona generation using a diode-based pulsed power generator
NASA Astrophysics Data System (ADS)
Pemen, A. J. M.; Grekhov, I. V.; van Heesch, E. J. M.; Yan, K.; Nair, S. A.; Korotkov, S. V.
2003-10-01
Pulsed plasma techniques serve a wide range of unconventional processes, such as gas and water processing, hydrogen production, and nanotechnology. Extending research on promising applications, such as pulsed corona processing, depends to a great extent on the availability of reliable, efficient and repetitive high-voltage pulsed power technology. Heavy-duty opening switches are the most critical components in high-voltage pulsed power systems with inductive energy storage. At the Ioffe Institute, an unconventional switching mechanism has been found, based on the fast recovery process in a diode. This article discusses the application of such a "drift-step-recovery-diode" for pulsed corona plasma generation. The principle of the diode-based nanosecond high-voltage generator will be discussed. The generator will be coupled to a corona reactor via a transmission-line transformer. The advantages of this concept, such as easy voltage transformation, load matching, switch protection and easy coupling with a dc bias voltage, will be discussed. The developed circuit is tested at both a resistive load and various corona reactors. Methods to optimize the energy transfer to a corona reactor have been evaluated. The impedance matching between the pulse generator and corona reactor can be significantly improved by using a dc bias voltage. At good matching, the corona energy increases and less energy reflects back to the generator. Matching can also be slightly improved by increasing the temperature in the corona reactor. More effective is to reduce the reactor pressure.
The Scope of Usage-Based Theory
Ibbotson, Paul
2013-01-01
Usage-based approaches typically draw on a relatively small set of cognitive processes, such as categorization, analogy, and chunking to explain language structure and function. The goal of this paper is to first review the extent to which the “cognitive commitment” of usage-based theory has had success in explaining empirical findings across domains, including language acquisition, processing, and typology. We then look at the overall strengths and weaknesses of usage-based theory and highlight where there are significant debates. Finally, we draw special attention to a set of culturally generated structural patterns that seem to lie beyond the explanation of core usage-based cognitive processes. In this context we draw a distinction between cognition permitting language structure vs. cognition entailing language structure. As well as addressing the need for greater clarity on the mechanisms of generalizations and the fundamental units of grammar, we suggest that integrating culturally generated structures within existing cognitive models of use will generate tighter predictions about how language works. PMID:23658552
Energy Optimization for a Weak Hybrid Power System of an Automobile Exhaust Thermoelectric Generator
NASA Astrophysics Data System (ADS)
Fang, Wei; Quan, Shuhai; Xie, Changjun; Tang, Xinfeng; Ran, Bin; Jiao, Yatian
2017-11-01
An integrated starter generator (ISG)-type hybrid electric vehicle (HEV) scheme is proposed based on the automobile exhaust thermoelectric generator (AETEG). An eddy current dynamometer is used to simulate the vehicle's dynamic cycle. A weak ISG hybrid bench test system is constructed to test the 48 V output from the power supply system, which is based on engine exhaust-based heat power generation. The thermoelectric power generation-based system must ultimately be tested when integrated into the ISG weak hybrid mixed power system. The test process is divided into two steps: comprehensive simulation and vehicle-based testing. The system's dynamic process is simulated for both conventional and thermoelectric powers, and the dynamic running process comprises four stages: starting, acceleration, cruising and braking. The quantity of fuel available and battery pack energy, which are used as target vehicle energy functions for comparison with conventional systems, are simplified into a single energy target function, and the battery pack's output current is used as the control variable in the thermoelectric hybrid energy optimization model. The system's optimal battery pack output current function is resolved when its dynamic operating process is considered as part of the hybrid thermoelectric power generation system. In the experiments, the system bench is tested using conventional power and hybrid thermoelectric power for the four dynamic operation stages. The optimal battery pack curve is calculated by functional analysis. In the vehicle, a power control unit is used to control the battery pack's output current and minimize energy consumption. Data analysis shows that the fuel economy of the hybrid power system under European Driving Cycle conditions is improved by 14.7% when compared with conventional systems.
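The optimization step can be pictured as choosing the battery pack's output current to minimize a single energy target (fuel energy plus battery energy) at each operating point. The sketch below is a toy version of that idea; the 48 V bus matches the paper, but the fuel-energy penalty, internal resistance and current range are invented numbers:

```python
def best_battery_current(demand_w, teg_w, currents,
                         v_bus=48.0, r_int=0.5, fuel_j_per_j=2.5):
    """Pick the battery output current (A) minimizing fuel + battery energy.

    Power not covered by the battery or the exhaust thermoelectric
    generator (TEG) must come from the engine, at an assumed fuel-energy
    cost per delivered joule; the I^2*R term models battery internal losses.
    """
    def cost(i):
        p_batt = v_bus * i - r_int * i * i          # net battery power after losses
        p_engine = max(demand_w - teg_w - p_batt, 0.0)
        return fuel_j_per_j * p_engine + v_bus * i  # single energy target

    return min(currents, key=cost)

# 5 kW demand, 300 W recovered by the TEG, candidate currents 0..40 A
print(best_battery_current(5000.0, 300.0, range(41)))  # -> 29 under these assumptions
```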
Grey Comprehensive Evaluation of Biomass Power Generation Project Based on Group Judgement
NASA Astrophysics Data System (ADS)
Xia, Huicong; Niu, Dongxiao
2017-06-01
The comprehensive evaluation of benefits is an important task at all stages of biomass power generation projects. This paper proposes an improved grey comprehensive evaluation method based on the triangular whitenization function. To improve the objectivity of the weight calculation result of the only-reference comparison judgment method, group judgment is introduced into the weighting process. In the grey comprehensive evaluation process, a number of experts are invited to estimate the benefit level of projects, and the basic estimations are optimized based on the minimum variance principle to improve the accuracy of the evaluation result. Taking a biomass power generation project as an example, the grey comprehensive evaluation result showed that the benefit level of this project was good. This example demonstrates the feasibility of the grey comprehensive evaluation method based on group judgment for benefit evaluation of biomass power generation projects.
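A minimal sketch of classification with triangular whitenization functions, the core of this kind of grey evaluation; the class boundaries, score scale and the max-membership decision rule are illustrative assumptions:

```python
def triangle_whiten(x, lo, mid, hi):
    """Triangular whitenization function: membership degree of value x
    in a grey class peaking at `mid` with support (lo, hi)."""
    if x <= lo or x >= hi:
        return 0.0
    return (x - lo) / (mid - lo) if x < mid else (hi - x) / (hi - mid)

def grey_class(score, classes):
    """Assign a benefit score to the grey class with the largest membership.

    `classes` maps class names to (lo, mid, hi) triples."""
    return max(classes, key=lambda k: triangle_whiten(score, *classes[k]))

classes = {"poor": (0, 2.5, 5), "fair": (2.5, 5, 7.5), "good": (5, 7.5, 10)}
print(grey_class(6.8, classes))  # -> 'good'
```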
Automated speech understanding: the next generation
NASA Astrophysics Data System (ADS)
Picone, J.; Ebel, W. J.; Deshmukh, N.
1995-04-01
Modern speech understanding systems merge interdisciplinary technologies from Signal Processing, Pattern Recognition, Natural Language, and Linguistics into a unified statistical framework. These systems, which have applications in a wide range of signal processing problems, represent a revolution in Digital Signal Processing (DSP). Once a field dominated by vector-oriented processors and linear algebra-based mathematics, DSP now relies on sophisticated statistical models implemented using a complex software paradigm. Such systems are now capable of understanding continuous speech input for vocabularies of several thousand words in operational environments. The current generation of deployed systems, based on small vocabularies of isolated words, will soon be replaced by a new technology offering natural language access to vast information resources such as the Internet, and providing completely automated voice interfaces for mundane tasks such as travel planning and directory assistance.
Simulation based optimization on automated fibre placement process
NASA Astrophysics Data System (ADS)
Lei, Shi
2018-02-01
In this paper, a software-simulation-based method (Autodesk TruPlan & TruFiber) is proposed to optimize the automated fibre placement (AFP) process. Different types of manufacturability analysis are introduced to predict potential defects. Advanced fibre path generation algorithms are compared with respect to geometrically different parts. Major manufacturing data have been taken into consideration prior to tool path generation to achieve a high success rate in manufacturing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saraswathy, P.; Dey, A.C.; Sarkar, S.K.
The Indian pursuit of gel generator technology for {sup 99m}Tc was driven mainly by three considerations, namely, (i) the well-established and reliable production of (n, gamma)-based {sup 99}Mo in several tens of GBq quantities in the research reactors in Trombay/Mumbai, India, (ii) the need for a relatively low-cost alternative technology to replace the solvent (MEK) extraction generator system in use in India since the 1970s and (iii) minimizing dependency on the weekly import of fission-produced {sup 99}Mo raw material required for alumina column generators. Extensive investigations on process standardisation for zirconium molybdate gel (ZMG) led to steady progress, achieved both in terms of process technology and the final performance of {sup 99m}Tc gel generators. The {sup 99m}Tc final product purity from the Indian gel system was comparable to that obtained from the gold-standard alumina column generators. Based on the feasibility established for reliable small-scale production, as well as satisfactory clinical experience with a number of gel generators used in collaborating hospital radiopharmacies, full-fledged mechanised processing facilities for handling up to 150 g of ZMG were set up. The indigenous design and development included setting up shielded plant facilities with pneumatic-driven as well as manual controls and special gadgets such as microwave heating of the zirconium molybdate cake, a dispenser for gel granules, loading of gel columns into pre-assembled generator housings, etc. A formal review of the safety features was carried out by the regulatory body and stage-wise clearance for processing low and medium level {sup 99}Mo activity was granted. Starting from around 70 GBq of {sup 99}Mo handling, the processing facilities have since been successfully operated at a level of 740 GBq of {sup 99}Mo, twice a month. In all, 18 batches of gel have been processed and 156 generators produced. The individual generator capacity was 15 to 30 GBq with an elution yield of nearly 75%. 129 generators were supplied to 11 user hospitals and the estimated number of clinical studies done is well over 5000. The salient aspects of the Indian experience have been reported in many a forum and shared with the IAEA through the on-going CRP. The detailed process know-how is available for technology transfer from BRIT, India. (author)
Single crystals and nonlinear process for outstanding vibration-powered electrical generators.
Badel, Adrien; Benayad, Abdelmjid; Lefeuvre, Elie; Lebrun, Laurent; Richard, Claude; Guyomar, Daniel
2006-04-01
This paper compares the performances of vibration-powered electrical generators using a piezoelectric ceramic and a piezoelectric single crystal associated with several power conditioning circuits. A new approach to piezoelectric power conversion based on nonlinear voltage processing is presented, leading to three novel high-performance power conditioning interfaces. Theoretical predictions and experimental results show that the nonlinear processing technique may increase the power harvested by a factor of 8 compared to standard techniques. Moreover, it is shown that, for a given energy harvesting technique, generators using single crystals deliver 20 times more power than generators using piezoelectric ceramics.
Rosnell, Tomi; Honkavaara, Eija
2012-01-01
The objective of this investigation was to develop and investigate methods for point cloud generation by image matching using aerial image data collected by quadrocopter type micro unmanned aerial vehicle (UAV) imaging systems. Automatic generation of high-quality, dense point clouds from digital images by image matching is a recent, cutting-edge step forward in digital photogrammetric technology. The major components of the system for point cloud generation are a UAV imaging system, an image data collection process using high image overlaps, and post-processing with image orientation and point cloud generation. Two post-processing approaches were developed: one of the methods is based on BAE Systems' SOCET SET classical commercial photogrammetric software and another is built using Microsoft®'s Photosynth™ service available on the Internet. Empirical testing was carried out in two test areas. Photosynth processing showed that it is possible to orient the images and generate point clouds fully automatically without any a priori orientation information or interactive work. The photogrammetric processing line provided dense and accurate point clouds that followed the theoretical principles of photogrammetry, but also some artifacts were detected. The point clouds from the Photosynth processing were sparser and noisier, which is to a large extent due to the fact that the method is not optimized for dense point cloud generation. Careful photogrammetric processing with self-calibration is required to achieve the highest accuracy. Our results demonstrate the high performance potential of the approach and that with rigorous processing it is possible to reach results that are consistent with theory. We also point out several further research topics. Based on theoretical and empirical results, we give recommendations for properties of imaging sensor, data collection and processing of UAV image data to ensure accurate point cloud generation. PMID:22368479
USDA-ARS?s Scientific Manuscript database
First-generation (i.e., corn-based) fuel ethanol production processes provide several advantages which could be synergistically applied to overcome limitations of second-generation biofuel processes from lignocellulose. These include resources such as equipment, manpower, nutrients, water, and heat....
NASA Astrophysics Data System (ADS)
Lu, Mark; Liang, Curtis; King, Dion; Melvin, Lawrence S., III
2005-11-01
Model-based optical proximity correction (OPC) has become an indispensable tool for achieving wafer pattern to design fidelity at current manufacturing process nodes. Most model-based OPC is performed considering the nominal process condition, with limited consideration of through-process manufacturing robustness. This study examines the use of off-target process models - models that represent non-nominal process states such as would occur with a dose or focus variation - to understand and manipulate the final pattern correction into a more process-robust configuration. The study first examines and validates the process of generating an off-target model, then examines the quality of the off-target model. Once the off-target model is proven, it is used to demonstrate methods of generating process-robust corrections. The concepts are demonstrated using a 0.13 μm logic gate process. Preliminary indications show success in both off-target model production and process-robust corrections. With these off-target models as tools, mask production cycle times can be reduced.
Declarative Business Process Modelling and the Generation of ERP Systems
NASA Astrophysics Data System (ADS)
Schultz-Møller, Nicholas Poul; Hølmer, Christian; Hansen, Michael R.
We present an approach to the construction of Enterprise Resource Planning (ERP) Systems, which is based on the Resources, Events and Agents (REA) ontology. This framework deals with processes involving exchange and flow of resources in a declarative, graphically-based manner describing what the major entities are rather than how they engage in computations. We show how to develop a domain-specific language on the basis of REA, and a tool which automatically can generate running web-applications. A main contribution is a proof-of-concept showing that business-domain experts can generate their own applications without worrying about implementation details.
Early stage hot spot analysis through standard cell base random pattern generation
NASA Astrophysics Data System (ADS)
Jeon, Joong-Won; Song, Jaewan; Kim, Jeong-Lim; Park, Seongyul; Yang, Seung-Hune; Lee, Sooryong; Kang, Hokyu; Madkour, Kareem; ElManhawy, Wael; Lee, SeungJo; Kwan, Joe
2017-04-01
Due to the limited availability of DRC-clean patterns during process and RET recipe development, OPC recipes are not tested with high pattern coverage. Varied kinds of patterns can help OPC engineers detect patterns sensitive to lithographic effects. Random pattern generation is needed to secure a robust OPC recipe. However, simple random patterns that do not reflect real product layout styles cannot cover patterning hotspots at production level, and it is not effective to use them for OPC optimization; it is therefore important to generate random patterns similar to real product patterns. This paper presents a strategy for generating random patterns based on design architecture information and preventing hotspots in the early process development stage through a tool called Layout Schema Generator (LSG). Using LSG, we generate standard-cell-based random patterns reflecting real design cell structure - fin pitch, gate pitch and cell height. The output standard cells from LSG are fed to an analysis methodology that assesses their hotspot severity by assigning a score according to their optical image parameters - NILS, MEEF, %PV band - so that potential hotspots can be defined by their ranking. This flow is demonstrated on Samsung 7nm technology, optimizing the OPC recipe early enough in process development to avoid using problematic patterns.
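A toy version of the severity-ranking step: combine each pattern's optical image parameters (NILS, MEEF, %PV band) into one score and rank. The weights, normalization and linear form are illustrative assumptions, not the paper's calibrated scoring:

```python
def hotspot_score(nils, meef, pv_band_pct, w_nils=0.4, w_meef=0.3, w_pv=0.3):
    """Composite lithographic-severity score for a generated pattern.

    Lower NILS, higher MEEF and larger %PV band all indicate a more
    hotspot-prone pattern, so NILS enters inverted."""
    return (w_nils * (1.0 / max(nils, 1e-6))
            + w_meef * meef
            + w_pv * pv_band_pct)

patterns = {"cellA": (2.1, 3.0, 4.5), "cellB": (1.2, 6.5, 9.0)}
ranked = sorted(patterns, key=lambda k: hotspot_score(*patterns[k]), reverse=True)
print(ranked)  # most hotspot-prone first -> ['cellB', 'cellA']
```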
Bolea, Mario; Mora, José; Ortega, Beatriz; Capmany, José
2012-03-12
A novel all-optical technique based on the incoherent processing of optical signals using high-order dispersive elements is analyzed for microwave arbitrary pulse generation. We show an approach which allows full reconfigurability of a pulse in terms of chirp, envelope and central frequency through proper control of the second-order dispersion and the incoherent optical source power distribution, achieving large time-bandwidth products.
An Ontology-Based Approach to Incorporate User-Generated Geo-Content Into Sdi
NASA Astrophysics Data System (ADS)
Deng, D.-P.; Lemmens, R.
2011-08-01
The Web is changing the way people share and communicate information because of the emergence of various Web technologies that enable people to contribute information on the Web. User-Generated Geo-Content (UGGC) is a potential resource of geographic information. Due to its different production methods, UGGC often does not fit formal geographic information models; there is a semantic gap between UGGC and formal geographic information. To integrate UGGC into geographic information, this study applies an ontology-based process to bridge this semantic gap. The process includes five steps: Collection, Extraction, Formalization, Mapping, and Deployment. In addition, this study applies the process to Twitter messages relevant to the Japan earthquake disaster. Using it, we extract disaster relief information from Twitter messages and develop a knowledge base for GeoSPARQL queries on disaster relief information.
Novel secret key generation techniques using memristor devices
NASA Astrophysics Data System (ADS)
Abunahla, Heba; Shehada, Dina; Yeun, Chan Yeob; Mohammad, Baker; Jaoude, Maguy Abi
2016-02-01
This paper proposes novel secret key generation techniques using memristor devices. The approach depends on using the initial profile of a memristor as a master key; session keys are then generated using the master key and other specified parameters. In contrast to existing memristor-based security approaches, the proposed development is cost effective and power efficient, since the operation can be achieved with a single device rather than a crossbar structure. An algorithm is suggested and demonstrated using a physics-based MATLAB model. It is shown that the generated keys can have dynamic size, which supports perfect security. Moreover, the proposed encryption and decryption technique using the memristor-generated keys outperforms Triple Data Encryption Standard (3DES) and Advanced Encryption Standard (AES) in terms of processing time. The paper also provides characterization results of a fabricated microscale Al/TiO2/Al memristor prototype to prove the concept of the proposed approach and to study the impact of process variations. The work proposed in this paper is a milestone towards System-on-Chip (SoC) memristor-based security.
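As an illustration of the key-derivation idea described above - a device-specific master key combined with session parameters to yield session keys - here is a minimal Python sketch. It assumes the memristor's initial profile has already been digitized into bytes; the HMAC construction and all names are hypothetical stand-ins, not the paper's algorithm.

```python
# Sketch of session-key derivation from a memristor "master key" profile.
# Hypothetical illustration only; the digitized device profile and the
# session parameters are assumed inputs, not the paper's actual scheme.
import hashlib
import hmac

def derive_session_key(master_profile: bytes, session_params: bytes,
                       key_len: int = 32) -> bytes:
    """Derive a session key from the device profile and session parameters."""
    # HMAC keyed with the master profile binds the key to the device state.
    return hmac.new(master_profile, session_params, hashlib.sha256).digest()[:key_len]

# Example: a digitized initial-resistance profile and a nonce-like parameter.
master = bytes.fromhex("a3f1c2d4e5b6978812345678deadbeef")
session_key = derive_session_key(master, b"session-0001")
print(session_key.hex())
```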
Ekins, Sean; Olechno, Joe; Williams, Antony J.
2013-01-01
Dispensing and dilution processes may profoundly influence estimates of biological activity of compounds. Published data show Ephrin type-B receptor 4 IC50 values obtained via tip-based serial dilution and dispensing versus acoustic dispensing with direct dilution differ by orders of magnitude with no correlation or ranking of datasets. We generated computational 3D pharmacophores based on data derived by both acoustic and tip-based transfer. The computed pharmacophores differ significantly depending upon dispensing and dilution methods. The acoustic dispensing-derived pharmacophore correctly identified active compounds in a subsequent test set where the tip-based method failed. Data from acoustic dispensing generates a pharmacophore containing two hydrophobic features, one hydrogen bond donor and one hydrogen bond acceptor. This is consistent with X-ray crystallography studies of ligand-protein interactions and automatically generated pharmacophores derived from this structural data. In contrast, the tip-based data suggest a pharmacophore with two hydrogen bond acceptors, one hydrogen bond donor and no hydrophobic features. This pharmacophore is inconsistent with the X-ray crystallographic studies and automatically generated pharmacophores. In short, traditional dispensing processes are another important source of error in high-throughput screening that impacts computational and statistical analyses. These findings have far-reaching implications in biological research. PMID:23658723
The neural component-process architecture of endogenously generated emotion
Kanske, Philipp; Singer, Tania
2017-01-01
Despite the ubiquity of endogenous emotions and their role in both resilience and pathology, the processes supporting their generation are largely unknown. We propose a neural component process model of endogenous generation of emotion (EGE) and test it in two functional magnetic resonance imaging (fMRI) experiments (N = 32/293) in which participants generated and regulated positive and negative emotions based on internal representations, using self-chosen generation methods. EGE activated nodes of the salience (SN), default mode (DMN) and frontoparietal control (FPCN) networks. The component processes implemented by these networks were established by investigating their functional associations, activation dynamics and integration. SN activation correlated with subjective affect, with midbrain nodes exclusively distinguishing between positive and negative affect intensity, showing dynamics consistent with the generation of core affect. The dorsomedial DMN, together with the ventral anterior insula, formed a pathway supporting multiple generation methods, with activation dynamics suggesting involvement in the generation of elaborated experiential representations. SN and DMN both coupled to the left frontal FPCN, which in turn was associated with both subjective affect and representation formation, consistent with the FPCN supporting the executive coordination of the generation process. These results provide a foundation for research into endogenous emotion in normal, pathological and optimal function. PMID:27522089
GENOPT 2016: Design of a generalization-based challenge in global optimization
NASA Astrophysics Data System (ADS)
Battiti, Roberto; Sergeyev, Yaroslav; Brunato, Mauro; Kvasov, Dmitri
2016-10-01
While comparing results on benchmark functions is a widely used practice for demonstrating the competitiveness of global optimization algorithms, fixed benchmarks can lead to a negative data mining process. To avoid this effect, the GENOPT contest uses benchmarks based on randomized function generators designed for scientific experiments, with fixed statistical characteristics but individual variation of the generated instances. The generators are available to participants for off-line tests and online tuning schemes, but the final competition is based on random seeds communicated in the last phase through a cooperative process. A brief presentation and discussion of the methods and results obtained in the framework of the GENOPT contest are given in this contribution.
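The following Python sketch illustrates the seeded-generator idea: each seed yields a distinct problem instance, while the family's statistical characteristics stay fixed. The quadratic-plus-sinusoid family here is a hypothetical example, not the actual GENOPT generator.

```python
# Sketch of a seeded benchmark generator: every seed yields a different
# instance, but the family's statistical characteristics are fixed.
import numpy as np

def make_instance(seed: int, dim: int = 10):
    rng = np.random.default_rng(seed)
    shift = rng.uniform(-5.0, 5.0, size=dim)   # random optimum location
    freq = rng.uniform(1.0, 3.0, size=dim)     # random oscillation frequencies

    def f(x):
        z = np.asarray(x) - shift
        return float(np.sum(z**2) + np.sum(np.sin(freq * z)**2))

    return f

f_train = make_instance(seed=42)    # available for off-line tuning
f_final = make_instance(seed=2016)  # seed communicated only at contest time
print(f_train(np.zeros(10)), f_final(np.zeros(10)))
```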
NASA Astrophysics Data System (ADS)
Zhou, Yuping; Zhang, Qi
2018-04-01
In the information environment, the digital and information processing of Li brocade patterns is an important means of revealing Li ethnic style and inheriting the national culture. In this paper, Adobe Illustrator CS3 and the Java language were used to apply "variation" processing to Li brocade patterns and to generate "Li brocade pattern mutant genes". The generation of pattern mutant genes includes color mutation, shape mutation, adding-and-missing transforms, and twisted transforms. The research shows that Li brocade pattern mutant genes can be generated using Adobe Illustrator CS3 together with image processing tools written in Java.
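A minimal Python sketch of the "pattern mutant gene" idea follows, using a channel permutation and brightness jitter as a stand-in color mutation; the paper's actual operations are implemented in Adobe Illustrator CS3 and Java, so everything below is illustrative.

```python
# Sketch of a color-mutation "gene": permute RGB channels and jitter
# brightness to create a pattern variant. Illustrative stand-in only.
import numpy as np

def color_mutate(pattern: np.ndarray, seed: int) -> np.ndarray:
    """Permute color channels and jitter brightness to create a variant."""
    rng = np.random.default_rng(seed)
    perm = rng.permutation(3)                  # mutate hue by channel swap
    gain = rng.uniform(0.8, 1.2)               # mutate brightness
    mutated = pattern[..., perm].astype(float) * gain
    return np.clip(mutated, 0, 255).astype(np.uint8)

pattern = np.zeros((64, 64, 3), dtype=np.uint8)
pattern[16:48, 16:48] = (200, 30, 60)          # toy geometric motif
variant = color_mutate(pattern, seed=5)
print(variant[32, 32])
```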
The Next Generation of HLA Image Products
NASA Astrophysics Data System (ADS)
Gaffney, N. I.; Casertano, S.; Ferguson, B.
2012-09-01
The Hubble Legacy Archive (HLA) is a project to add value to the Hubble Space Telescope data archive by producing and delivering science-ready drizzled data products and source lists derived from these products. We present a re-engineered pipeline, based on existing and improved algorithms, that aims to improve processing quality, cross-instrument portability, data flow management, and software maintenance. Initially, ACS, NICMOS, and WFPC2 data were combined using instrument-specific pipelines based on scripts developed to process the ACS GOODS data, with a separate set of scripts to generate Source Extractor and DAOPhot source lists. The new pipeline, initially designed for WFC3 data, isolates instrument-specific processing and is easily extendable to other instruments and to generating wide-area mosaics. Significant improvements have been made in image combination through improved alignment, source detection, and background equalization routines; the pipeline integrates improved alignment procedures, a better noise model, and source list generation within a single code base. Wherever practical, PyRAF-based routines have been replaced with non-IRAF Python libraries (e.g., NumPy and PyFITS). The data formats have been modified to handle better and more consistent propagation of information from individual exposures to the combined products. A new exposure layer stores the effective exposure time for each pixel on the sky, which is key to properly interpreting combined images built from diverse data that were not initially planned to be mosaicked. We have also worked to improve the validity of the metadata within the FITS headers of these products relative to standard IRAF/PyRAF processing: any keywords that pertain to individual exposures have been removed from the primary and extension headers and placed in a table extension for more direct and efficient perusal. This mechanism also allows more detailed information on the processing of individual images to be stored and propagated, providing a more hierarchical metadata storage system than key-value-pair FITS headers provide. In this poster we discuss the changes to pipeline processing and source list generation, the lessons learned that may be applicable to other archive projects, and our new metadata curation and preservation process.
MATTS- A Step Towards Model Based Testing
NASA Astrophysics Data System (ADS)
Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.
2016-08-01
In this paper we describe a model-based approach to testing on-board software and compare it with the traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity, driven by the need for autonomy, and serious demands on application robustness. In other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's? To solve these problems, the software engineering process has to be improved in at least two respects: 1) software design and 2) software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential but time- and resource-consuming activity in the software development process; generating a short but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can be partially automated. The basic idea behind the presented study was to start from a formal model (e.g., state machines), generate abstract test cases, and then convert them into concrete executable test cases (input and expected output pairs). The generated concrete test cases were applied to on-board software, and the results were collected and evaluated with respect to applicability, cost-efficiency, effectiveness at fault finding, and scalability.
On the Development of a Hospital-Patient Web-Based Communication Tool: A Case Study From Norway.
Granja, Conceição; Dyb, Kari; Bolle, Stein Roald; Hartvigsen, Gunnar
2015-01-01
Surgery cancellations are undesirable in hospital settings as they increase costs, reduce productivity and efficiency, and directly affect the patient. This paper addresses the problem of elective surgery cancellations at a North Norwegian university hospital. Based on a three-step methodology conducted at the hospital, the preoperative planning process was modeled taking into consideration the narratives of different health professions. From the analysis of the generated process models, it is concluded that in order to develop a useful patient-centered web-based communication tool, it is necessary to fully understand how hospitals plan and organize surgeries today. Moreover, process reengineering is required to generate a standard process that can serve as a tool for health ICT designers to define the requirements for a robust and useful system.
A CSP-Based Agent Modeling Framework for the Cougaar Agent-Based Architecture
NASA Technical Reports Server (NTRS)
Gracanin, Denis; Singh, H. Lally; Eltoweissy, Mohamed; Hinchey, Michael G.; Bohner, Shawn A.
2005-01-01
Cognitive Agent Architecture (Cougaar) is a Java-based architecture for large-scale distributed agent-based applications. A Cougaar agent is an autonomous software entity with behaviors that represent a real-world entity (e.g., a business process). A Cougaar-based Model Driven Architecture approach, currently under development, uses a description of the system's functionality (requirements) to automatically implement the system in Cougaar. The Communicating Sequential Processes (CSP) formalism is used for formal validation of the generated system. Two main agent components, a blackboard and a plugin, are modeled as CSP processes, and a set of channels represents the communication between the blackboard and individual plugins. The blackboard is represented as a CSP process that communicates with every agent in the collection. The developed CSP-based Cougaar modeling framework provides a starting point for a more complete formal verification of the automatically generated Cougaar code. Currently it is used to verify the behavior of an individual agent in terms of CSP properties and to analyze the corresponding Cougaar society.
A second generation 50 Mbps VLSI level zero processing system prototype
NASA Technical Reports Server (NTRS)
Harris, Jonathan C.; Shi, Jeff; Speciale, Nick; Bennett, Toby
1994-01-01
Level Zero Processing (LZP) generally refers to telemetry data processing functions performed at ground facilities to remove all communication artifacts from instrument data. These functions typically include frame synchronization, error detection and correction, packet reassembly and sorting, playback reversal, merging, time-ordering, overlap deletion, and production of annotated data sets. The Data Systems Technologies Division (DSTD) at Goddard Space Flight Center (GSFC) has been developing high-performance Very Large Scale Integration Level Zero Processing Systems (VLSI LZPS) since 1989. The first VLSI LZPS prototype demonstrated a 20 Megabits per second (Mbps) capability in 1992. With a new generation of high-density Application-Specific Integrated Circuits (ASICs) and a Mass Storage System (MSS) based on the High-performance Parallel Peripheral Interface (HiPPI), a second prototype has been built that achieves full 50 Mbps performance. This paper describes this second-generation LZPS prototype based upon VLSI technologies.
Influence of winding construction on starter-generator thermal processes
NASA Astrophysics Data System (ADS)
Grachev, P. Yu; Bazarov, A. A.; Tabachinskiy, A. S.
2018-01-01
Dynamic processes in starter-generators feature high winding overcurrents, which can lead to insulation overheating and faulty operation. For hybrid and electric vehicles, a new high-efficiency construction of induction machine windings is proposed. Stator thermal processes need to be considered in the most demanding operation modes. The article describes the construction features of the new compact stator windings, presents electromagnetic and thermal models of the processes in them, and explains the influence of the innovative construction on thermal processes. The models are based on the finite element method.
Integrable high order UWB pulse photonic generator based on cross phase modulation in a SOA-MZI.
Moreno, Vanessa; Rius, Manuel; Mora, José; Muriel, Miguel A; Capmany, José
2013-09-23
We propose and experimentally demonstrate a potentially integrable optical scheme to generate high-order UWB pulses. The technique is based on exploiting the cross phase modulation generated in an InGaAsP Mach-Zehnder interferometer containing integrated semiconductor optical amplifiers, and is also adaptable to different pulse modulation formats through an optical processing unit which allows control of the amplitude, polarity and time delay of the generated taps.
NASA Astrophysics Data System (ADS)
Xiao, Heng; Gou, Xiaolong; Yang, Suwen
2011-05-01
Thermoelectric (TE) power generation technology, due to its several advantages, is becoming a noteworthy research direction. Many researchers conduct their performance analysis and optimization of TE devices and related applications based on the generalized thermoelectric energy balance equations. These generalized TE equations involve the internal irreversibility of Joule heating inside the thermoelectric device and heat leakage through the thermoelectric couple leg. However, it is assumed that the thermoelectric generator (TEG) is thermally isolated from the surroundings except for the heat flows at the cold and hot junctions. Since the thermoelectric generator is a multi-element device in practice, being composed of many fundamental TE couple legs, the effect of heat transfer between the TE couple leg and the ambient environment is not negligible. In this paper, based on basic theories of thermoelectric power generation and thermal science, detailed modeling of a thermoelectric generator taking account of the phenomenon of energy loss from the TE couple leg is reported. The revised generalized thermoelectric energy balance equations considering the effect of heat transfer between the TE couple leg and the ambient environment have been derived. Furthermore, characteristics of a multi-element thermoelectric generator with irreversibility have been investigated on the basis of the new derived TE equations. In the present investigation, second-law-based thermodynamic analysis (exergy analysis) has been applied to the irreversible heat transfer process in particular. It is found that the existence of the irreversible heat convection process causes a large loss of heat exergy in the TEG system, and using thermoelectric generators for low-grade waste heat recovery has promising potential. The results of irreversibility analysis, especially irreversible effects on generator system performance, based on the system model established in detail have guiding significance for the development and application of thermoelectric generators, particularly for the design and optimization of TE modules.
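For reference, the generalized thermoelectric energy balance equations the abstract builds on take the following standard textbook form (before the paper's additional leg-to-ambient heat-loss term), with Seebeck coefficient α, thermal conductance K, internal resistance R, current I, and junction temperatures T_h and T_c:

```latex
\begin{aligned}
Q_h &= \alpha I T_h + K\,(T_h - T_c) - \tfrac{1}{2} I^{2} R \\
Q_c &= \alpha I T_c + K\,(T_h - T_c) + \tfrac{1}{2} I^{2} R \\
P   &= Q_h - Q_c = \alpha I\,(T_h - T_c) - I^{2} R
\end{aligned}
```

The Joule heat I²R is split equally between the junctions, and K(T_h − T_c) is the internal heat leakage; the paper's contribution is to add convective loss from the couple leg itself to this balance.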
Experiences on developing digital down conversion algorithms using Xilinx system generator
NASA Astrophysics Data System (ADS)
Xu, Chengfa; Yuan, Yuan; Zhao, Lizhi
2013-07-01
The Digital Down Conversion (DDC) algorithm is a classical signal processing method widely used in radar and communication systems. In this paper, the DDC function is implemented on an FPGA with the Xilinx System Generator tool. System Generator is an FPGA design tool provided by Xilinx Inc. and MathWorks Inc. that makes it convenient for designers to build and debug functions, especially complex algorithms. The development of the DDC function with System Generator shows that it is a fast and efficient tool for FPGA design.
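A compact Python/NumPy sketch of the textbook DDC structure (NCO mixing, low-pass filtering, decimation) that such a design maps onto FPGA blocks; sample rates and filter parameters below are illustrative assumptions, not those of the paper.

```python
# Sketch of a textbook digital down-conversion (DDC) chain: NCO mixing,
# anti-alias low-pass filtering, and decimation.
import numpy as np
from scipy import signal

fs = 1.0e6          # input sample rate (Hz), assumed
f_c = 200.0e3       # carrier to be shifted to baseband, assumed
decim = 8           # decimation factor

t = np.arange(4096) / fs
x = np.cos(2 * np.pi * f_c * t)              # example real input signal

nco = np.exp(-2j * np.pi * f_c * t)          # numerically controlled oscillator
baseband = x * nco                           # mix down to 0 Hz

lp = signal.firwin(101, cutoff=fs / (2 * decim), fs=fs)  # anti-alias FIR
filtered = signal.lfilter(lp, 1.0, baseband)
ddc_out = filtered[::decim]                  # decimate to the output rate
print(ddc_out.shape)
```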
Generating Systems Biology Markup Language Models from the Synthetic Biology Open Language.
Roehner, Nicholas; Zhang, Zhen; Nguyen, Tramy; Myers, Chris J
2015-08-21
In the context of synthetic biology, model generation is the automated process of constructing biochemical models based on genetic designs. This paper discusses the use cases for model generation in genetic design automation (GDA) software tools and introduces the foundational concepts of standards and model annotation that make this process useful. Finally, this paper presents an implementation of model generation in the GDA software tool iBioSim and provides an example of generating a Systems Biology Markup Language (SBML) model from a design of a 4-input AND sensor written in the Synthetic Biology Open Language (SBOL).
Ratcliffe, M B; Khan, J H; Magee, K M; McElhinney, D B; Hubner, C
2000-06-01
Using a Java-based intranet program (applet), we collected postoperative process data after coronary artery bypass grafting. A Java-based applet was developed and deployed on a hospital intranet. Briefly, the nurse entered patient process data using a point-and-click interface; the applet generated a nursing note, and the process data were saved in a Microsoft Access database. In 10 patients, this method was validated by comparison with a retrospective chart review. In 45 consecutive patients, weekly control charts were generated from the data. When aberrations from the pathway occurred, feedback was initiated to restore the goals of the critical pathway. The intranet data collection method was verified by the manual chart review with 98% sensitivity. The control charts for time to extubation, intensive care unit stay, and hospital stay showed a deviation from critical pathway goals after the first 20 patients; feedback modulation was associated with a return to critical pathway goals. Java-based applets are inexpensive and can collect accurate postoperative process data, identify critical pathway deviations, and allow timely feedback of process data.
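The control-chart logic is straightforward to reproduce. Below is a minimal Python sketch with fabricated values, using mean ± 3σ limits from a baseline period; the study's exact charting rules are not specified here, so this is purely illustrative.

```python
# Sketch of control-chart flagging for a pathway metric such as time to
# extubation: values outside mean +/- 3 sigma of a baseline are flagged.
import numpy as np

hours_to_extubation = np.array([6.1, 5.8, 7.0, 6.4, 5.9, 6.2, 9.8, 10.4])

center = hours_to_extubation[:6].mean()      # baseline from early patients
sigma = hours_to_extubation[:6].std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma

for i, v in enumerate(hours_to_extubation, start=1):
    flag = "DEVIATION" if (v > ucl or v < lcl) else "ok"
    print(f"patient {i:2d}: {v:5.1f} h  {flag}")
```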
NASA Astrophysics Data System (ADS)
Jelinek, H. J.
1986-01-01
This is the Final Report of Electronic Design Associates on its Phase I SBIR project. The purpose of the project was to develop a method for correcting helium speech, as experienced in diver-surface communication. The goal of the Phase I study was to design, prototype, and evaluate a real-time helium speech corrector system based upon digital signal processing techniques. The general approach was to develop hardware (an IBM PC board) to digitize helium speech and software (a LAMBDA computer based simulation) to translate the speech. As planned in the study proposal, this initial prototype may now be used to assess the expected performance of a self-contained real-time system using an identical algorithm. The Final Report details the work carried out to produce the prototype system, in four major project tasks: a signal processing scheme for converting helium speech to normal-sounding speech was generated; the scheme was simulated on a general-purpose (LAMBDA) computer, with actual helium speech supplied to the simulation and the converted speech generated; an IBM PC based 14-bit data input/output board was designed and built; and a bibliography of references on speech processing was generated.
Won, Jongsung; Cheng, Jack C P; Lee, Ghang
2016-03-01
Waste generated in construction and demolition processes comprised around 50% of the solid waste in South Korea in 2013. Many cases show that design validation based on building information modeling (BIM) is an effective means to reduce the amount of construction waste since construction waste is mainly generated due to improper design and unexpected changes in the design and construction phases. However, the amount of construction waste that could be avoided by adopting BIM-based design validation has been unknown. This paper aims to estimate the amount of construction waste prevented by a BIM-based design validation process based on the amount of construction waste that might be generated due to design errors. Two project cases in South Korea were studied in this paper, with 381 and 136 design errors detected, respectively during the BIM-based design validation. Each design error was categorized according to its cause and the likelihood of detection before construction. The case studies show that BIM-based design validation could prevent 4.3-15.2% of construction waste that might have been generated without using BIM. Copyright © 2015 Elsevier Ltd. All rights reserved.
Watanabe, Colin; Cuellar, Trinna L.; Haley, Benjamin
2016-01-01
Incorporating miRNA-like features into vector-based hairpin scaffolds has been shown to augment small RNA processing and RNAi efficiency. Defining an optimal, native hairpin context may therefore obviate the need for hairpin-specific targeting design schemes, which confound the movement of functional siRNAs into shRNA/artificial miRNA backbones, or for large-scale screens to identify efficacious sequences. Thus, we used quantitative cell-based assays to compare separate third-generation artificial miRNA systems, miR-E (based on miR-30a) and miR-3G (based on miR-16-2 and first described in this study), to widely adopted first- and second-generation formats in both Pol-II and Pol-III expression vector contexts. Despite their unique structures and strandedness, and in contrast to first- and second-generation RNAi triggers, the third-generation formats operated with remarkable similarity to one another, and strong silencing was observed with a significant fraction of the evaluated target sequences within either promoter context. By pairing an established siRNA design algorithm with the third-generation vectors we could readily identify targeting sequences that matched or exceeded the potency of those discovered through large-scale sensor-based assays. We find that third-generation hairpin systems enable the maximal level of siRNA function, likely through enhanced processing and accumulation of precisely defined guide RNAs. Therefore, we predict future gains in RNAi potency will come from improved hairpin expression and identification of optimal siRNA-intrinsic silencing properties rather than from further modification of these scaffolds. Consequently, third-generation systems should be the primary format for vector-based RNAi studies; miR-3G is advantageous due to its small expression cassette and simplified, cost-efficient cloning scheme. PMID:26786363
System and method for deriving a process-based specification
NASA Technical Reports Server (NTRS)
Hinchey, Michael Gerard (Inventor); Rouff, Christopher A. (Inventor); Rash, James Larry (Inventor)
2009-01-01
A system and method for deriving a process-based specification for a system is disclosed. The process-based specification is mathematically inferred from a trace-based specification. The trace-based specification is derived from a non-empty set of traces or natural language scenarios. The process-based specification is mathematically equivalent to the trace-based specification. Code is generated, if applicable, from the process-based specification. A process, or phases of a process, using the features disclosed can be reversed and repeated to allow for an interactive development and modification of legacy systems. The process is applicable to any class of system, including, but not limited to, biological and physical systems, electrical and electro-mechanical systems in addition to software, hardware and hybrid hardware-software systems.
NASA Astrophysics Data System (ADS)
Hua, H.; Manipon, G.; Starch, M.
2017-12-01
NASA's upcoming missions are expected to generate data volumes at least an order of magnitude larger than current missions, requiring significant increases in data processing, data rates, data volumes, and long-term archive capabilities. Consequently, new challenges are emerging that impact traditional data and software management approaches. At large scales, next-generation science data systems are exploring a move onto cloud computing paradigms to support these increased needs. New implications, such as costs, data movement, collocation of data systems and archives, and moving processing closer to the data, may result in changes to the stewardship, preservation, and provenance of science data and software. With more science data systems being on-boarded onto cloud computing facilities, we can expect more Earth science data records to be both generated and kept in the cloud. But at large scales, the cost of processing and storing global data may affect architectural and system designs: data systems will trade the cost of keeping data in the cloud against data life-cycle approaches that move "colder" data back to traditional on-premise facilities. How will this impact data citation and processing software stewardship? What are the impacts of cloud-based on-demand processing on reproducibility and provenance? Similarly, with more science processing software moving onto cloud, virtual machine, and container-based approaches, more opportunities arise for improved stewardship and preservation. But will the science community trust data reprocessed years or decades later? We also explore emerging questions about the stewardship of the science data system software that generates the science data records, both during and after the life of a mission.
Technical Note: Approximate Bayesian parameterization of a process-based tropical forest model
NASA Astrophysics Data System (ADS)
Hartig, F.; Dislich, C.; Wiegand, T.; Huth, A.
2014-02-01
Inverse parameter estimation of process-based models is a long-standing problem in many scientific disciplines. A key question for inverse parameter estimation is how to define the metric that quantifies how well model predictions fit to the data. This metric can be expressed by general cost or objective functions, but statistical inversion methods require a particular metric, the probability of observing the data given the model parameters, known as the likelihood. For technical and computational reasons, likelihoods for process-based stochastic models are usually based on general assumptions about variability in the observed data, and not on the stochasticity generated by the model. Only in recent years have new methods become available that allow the generation of likelihoods directly from stochastic simulations. Previous applications of these approximate Bayesian methods have concentrated on relatively simple models. Here, we report on the application of a simulation-based likelihood approximation for FORMIND, a parameter-rich individual-based model of tropical forest dynamics. We show that approximate Bayesian inference, based on a parametric likelihood approximation placed in a conventional Markov chain Monte Carlo (MCMC) sampler, performs well in retrieving known parameter values from virtual inventory data generated by the forest model. We analyze the results of the parameter estimation, examine its sensitivity to the choice and aggregation of model outputs and observed data (summary statistics), and demonstrate the application of this method by fitting the FORMIND model to field data from an Ecuadorian tropical forest. Finally, we discuss how this approach differs from approximate Bayesian computation (ABC), another method commonly used to generate simulation-based likelihood approximations. Our results demonstrate that simulation-based inference, which offers considerable conceptual advantages over more traditional methods for inverse parameter estimation, can be successfully applied to process-based models of high complexity. The methodology is particularly suitable for heterogeneous and complex data structures and can easily be adjusted to other model types, including most stochastic population and individual-based models. Our study therefore provides a blueprint for a fairly general approach to parameter estimation of stochastic process-based models.
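A toy Python sketch of the core idea follows: a parametric (synthetic) likelihood built from replicated stochastic simulations, embedded in a Metropolis sampler. The one-parameter Gaussian simulator is a stand-in for a model like FORMIND, and all numbers are illustrative.

```python
# Toy sketch of simulation-based parametric likelihood inside a Metropolis
# sampler: a Gaussian is fitted to replicated simulated summary statistics.
import numpy as np

rng = np.random.default_rng(0)
observed_summary = 3.0                      # e.g., an observed stand statistic

def simulate_summary(theta, n_rep=50):
    """Stochastic simulator: returns replicated summary statistics."""
    return theta + rng.normal(0.0, 1.0, size=n_rep)

def log_synthetic_likelihood(theta):
    sims = simulate_summary(theta)
    mu, sd = sims.mean(), sims.std(ddof=1)
    return -0.5 * ((observed_summary - mu) / sd) ** 2 - np.log(sd)

theta, chain = 0.0, []
logL = log_synthetic_likelihood(theta)
for _ in range(2000):
    prop = theta + rng.normal(0.0, 0.5)     # random-walk proposal
    logL_prop = log_synthetic_likelihood(prop)
    if np.log(rng.uniform()) < logL_prop - logL:   # flat prior assumed
        theta, logL = prop, logL_prop
    chain.append(theta)
print("posterior mean:", np.mean(chain[500:]))
```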
Murphy, Enda; King, Eoin A
2016-08-15
The strategic noise mapping process of the EU has now been ongoing for more than ten years. However, despite the fact that a significant volume of research has been conducted on the process and related issues there has been little change or innovation in how relevant authorities and policymakers are conducting the process since its inception. This paper reports on research undertaken to assess the possibility for smartphone-based noise mapping data to be integrated into the traditional strategic noise mapping process. We compare maps generated using the traditional approach with those generated using smartphone-based measurement data. The advantage of the latter approach is that it has the potential to remove the need for exhaustive input data into the source calculation model for noise prediction. In addition, the study also tests the accuracy of smartphone-based measurements against simultaneous measurements taken using traditional sound level meters in the field. Copyright © 2016 Elsevier B.V. All rights reserved.
Rühe, J
2017-09-26
In photolithographic processes, the light inducing the photochemical reactions is confined to a small volume, which enables direct writing of micro- and nanoscale features onto solid surfaces without the need of a predefined photomask. The direct writing process can be used to generate topographic patterns through photopolymerization or photo-cross-linking or can be employed to use light to generate chemical patterns on the surface with high spatial control, which would make such processes attractive for bioapplications. The prospects of maskless photolithography technologies with a focus on two-photon lithography and scanning-probe-based photochemical processes based on scanning near-field optical microscopy or beam pen lithography are discussed.
Automatic two- and three-dimensional mesh generation based on fuzzy knowledge processing
NASA Astrophysics Data System (ADS)
Yagawa, G.; Yoshimura, S.; Soneda, N.; Nakao, K.
1992-09-01
This paper describes the development of a novel automatic FEM mesh generation algorithm based on the fuzzy knowledge processing technique. A number of local nodal patterns are stored in a nodal pattern database of the mesh generation system. These nodal patterns are determined a priori based on certain theories or the past experience of experts in FEM analyses. For example, such human experts can determine nodal patterns suitable for stress concentration analyses of cracks, corners, holes and so on. Each nodal pattern possesses a membership function and a procedure for node placement according to this function. In the case of the nodal patterns for stress concentration regions, the membership function utilized in the fuzzy knowledge processing has two meanings, i.e. the “closeness” of a nodal location to each stress concentration field as well as the “nodal density”. This is attributed to the fact that a denser nodal pattern is required near a stress concentration field. What a user has to do in a practical mesh generation process is to choose several local nodal patterns properly and to designate the maximum nodal density of each pattern. After these simple operations by the user, the system automatically places the chosen nodal patterns in the analysis domain and on its boundary, and connects them smoothly by the fuzzy knowledge processing technique. Then triangular or tetrahedral elements are generated by means of the advancing front method. The key feature of the present algorithm is easy control of complex two- or three-dimensional nodal density distributions by means of the fuzzy knowledge processing technique. To demonstrate the fundamental performance of the present algorithm, a prototype system was constructed in an object-oriented language, Smalltalk-80, on a 32-bit microcomputer, Macintosh II. Mesh generation for several two- and three-dimensional domains with cracks, holes and junctions is presented by way of example.
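A minimal Python sketch of the membership-function idea: nodal spacing is blended between fine and coarse values according to "closeness" to a stress concentrator. The exponential form and the parameter values are illustrative assumptions, not the paper's exact functions.

```python
# Sketch of fuzzy-membership-driven nodal density: nodes are packed
# densely near a stress concentrator (e.g., a crack tip) and sparsely
# far away. Exponential membership is an illustrative choice.
import numpy as np

def membership(dist, scale=0.2):
    """Closeness to the stress-concentration field, in [0, 1]."""
    return np.exp(-dist / scale)

def nodal_spacing(dist, h_min=0.01, h_max=0.2):
    """Blend fine and coarse spacing by the membership value."""
    m = membership(dist)
    return m * h_min + (1.0 - m) * h_max

for d in [0.0, 0.1, 0.5, 1.0]:
    print(f"distance {d:.1f} -> spacing {nodal_spacing(d):.3f}")
```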
A new class of advanced oxidation processes (AOPs) based on sulfate radicals is being tested for the degradation of polycyclic aromatic hydrocarbons (PAHs) and polychlorinated biphenyls (PCBs) in aqueous solution. These AOPs are based on the generation of sulfate radicals through...
Note: Fully integrated 3.2 Gbps quantum random number generator with real-time extraction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Xiao-Guang; Nie, You-Qi; Liang, Hao
2016-07-15
We present a real-time and fully integrated quantum random number generator (QRNG) based on measuring laser phase fluctuations. A QRNG scheme based on laser phase fluctuations is notable for its capability of generating ultra-high-speed random numbers; however, the speed bottleneck of a practical QRNG lies in the limited speed of randomness extraction. To close the gap between fast randomness generation and slow post-processing, we propose a pipelined extraction algorithm based on Toeplitz matrix hashing and implement it in a high-speed field-programmable gate array. Further, all the QRNG components are integrated into a single module, including a compact and actively stabilized interferometer, high-speed data acquisition, and real-time data post-processing and transmission. The final generation rate of the QRNG module with real-time extraction can reach 3.2 Gbps.
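A Python/NumPy sketch of Toeplitz-matrix hashing as a randomness extractor: an m × n binary Toeplitz matrix, defined by n + m − 1 seed bits, compresses n raw bits to m nearly uniform bits via a GF(2) matrix-vector product. Sizes here are illustrative; the paper pipelines the equivalent operation in an FPGA.

```python
# Sketch of Toeplitz-matrix randomness extraction: compress n raw bits
# with min-entropy into m < n nearly uniform bits.
import numpy as np

rng = np.random.default_rng(1)
n, m = 256, 128                             # raw bits in, extracted bits out

seed_bits = rng.integers(0, 2, size=n + m - 1)   # defines the Toeplitz matrix
raw_bits = rng.integers(0, 2, size=n)            # stand-in for digitized phase noise

# Build the m x n binary Toeplitz matrix (constant along diagonals).
T = np.empty((m, n), dtype=np.uint8)
for i in range(m):
    T[i, :] = seed_bits[i:i + n][::-1]

extracted = T.dot(raw_bits) % 2                  # matrix-vector product over GF(2)
print(extracted[:16])
```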
TermGenie - a web-application for pattern-based ontology class generation.
Dietze, Heiko; Berardini, Tanya Z; Foulger, Rebecca E; Hill, David P; Lomax, Jane; Osumi-Sutherland, David; Roncaglia, Paola; Mungall, Christopher J
2014-01-01
Biological ontologies are continually growing and improving in response to requests for new classes (terms) from biocurators. These ontology requests can frequently create bottlenecks in the biocuration process, as ontology developers struggle to keep up while manually processing the requests and creating classes. TermGenie allows biocurators to generate new classes based on formally specified design patterns or templates. The system is web-based and can be accessed by any authorized curator through a web browser. Automated rules and reasoning engines are used to ensure validity, uniqueness and relationship to pre-existing classes. In the last 4 years the Gene Ontology TermGenie generated 4715 new classes, about 51.4% of all new classes created. The immediate generation of permanent identifiers proved not to be an issue, with only 70 (1.4%) obsoleted classes. TermGenie is a web-based class-generation system that complements traditional ontology development tools. All classes added through pre-defined templates are guaranteed to have OWL equivalence axioms, which are used for automatic classification and in some cases inter-ontology linkage. At the same time, the system is simple and intuitive and can be used by most biocurators without extensive training.
Problem Based Learning and the scientific process
NASA Astrophysics Data System (ADS)
Schuchardt, Daniel Shaner
This research project was developed to inspire students to use problem-based learning and the scientific process constructively to learn middle school science content. The student population in this study consisted of male and female seventh-grade students. Students were presented with authentic problems connected to physical and chemical properties of matter. The intent of the study was to have students use the scientific process of examining existing knowledge, generating learning issues or questions about the problems, and then developing a course of action to research and design experiments that model resolutions to the authentic problems. It was expected that students would improve their ability to engage actively with others in a problem-solving process and so achieve a deeper understanding of Michigan's 7th Grade Level Content Expectations, the Next Generation Science Standards, and a scientific process. Problem-based learning was statistically effective in students' learning of the scientific process, with students showing significant improvement from pretest to posttest scores. The teaching method of problem-based learning was effective for seventh-grade science students at Dowagiac Middle School.
DEVS Unified Process for Web-Centric Development and Testing of System of Systems
2008-05-20
gathering from the user. Further, methodologies have been developed to generate DEVS models from BPMN/BPEL-based and message-based requirement specifications. 3. BPMN/BPEL-based system specifications: Business Process Modeling Notation (BPMN) or Business Process Execution Language (BPEL) provide a ... information is stored in .wsdl and .bpel files for BPEL but in a proprietary format for BPMN. 4. DoDAF-based requirement specifications: Department of
Code of Federal Regulations, 2010 CFR
2010-07-01
... manufacturing processes in which PCBs are generated when the PCB level in products leaving any manufacturing... imported products when the PCB concentration of products being imported is greater than 2 µg/g for any... process waste disposal. (2) Whether determinations of compliance are based on actual monitoring of PCB...
Power generation by thermally assisted electroluminescence: like optical cooling, but different
NASA Astrophysics Data System (ADS)
Buckner, Benjamin D.; Heeg, Bauke
2008-02-01
Thermally assisted electroluminescence may provide a means to convert heat into electricity. In this process, termed thermophotonics, radiation from a hot light-emitting diode (LED) is converted to electricity by a photovoltaic (PV) cell. Novel analytical solutions to the equations governing such a system show that it combines physical characteristics of thermophotovoltaics (TPV) and of the inverse process of laser cooling. The flexibility of having both adjustable bias and load parameters may allow an optimized power generation system based on this concept to exceed the power throughput and efficiency of TPV systems. Such devices could function as efficient solar thermal, waste heat, and fuel-based generators.
Gao, Changwei; Liu, Xiaoming; Chen, Hai
2017-08-22
This paper focuses on the power fluctuations of the virtual synchronous generator (VSG) during transition processes. An improved virtual synchronous generator (IVSG) control strategy based on feed-forward compensation is proposed. An adjustable parameter of the compensation section can be modified to reduce the order of the system, effectively suppressing the power fluctuations of the VSG in transient processes. To verify the effectiveness of the proposed control strategy for distributed energy resource inverters, a simulation model was set up on the MATLAB/Simulink platform and a physical experiment platform was established. Simulation and experiment results demonstrate the effectiveness of the proposed IVSG control strategy.
Adaptive scallop height tool path generation for robot-based incremental sheet metal forming
NASA Astrophysics Data System (ADS)
Seim, Patrick; Möllensiep, Dennis; Störkle, Denis Daniel; Thyssen, Lars; Kuhlenkötter, Bernd
2016-10-01
Incremental sheet metal forming is an emerging process for the production of individualized products or prototypes in low batch sizes and with short times to market. In these processes, the desired shape is produced by the incremental inward motion of the workpiece-independent forming tool in the depth direction and its movement along the contour in the lateral direction. Given this mode of shape production, tool path generation is a key factor for, among other things, the resulting geometric accuracy, the resulting surface quality, and the working time. This paper presents an innovative tool path generation approach, based on a commercial milling CAM package, that considers both surface quality and working time. The approach offers the ability to define a specific scallop height, as an indicator of surface quality, for specific faces of a component. Moreover, it decreases the working time required to produce the entire component compared to a commercial software package without this adaptive capability. Different forming experiments were performed to verify the newly developed tool path generation. Above all, this approach serves to resolve the existing conflict between working time and surface quality in incremental sheet metal forming.
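For context, the standard constant-scallop geometry for a hemispherical tool of radius R relates the lateral stepover s between adjacent paths to the scallop height h they leave; adaptive path generation inverts this relation to pick s for a prescribed h. This is the textbook relation, not necessarily the paper's exact formulation:

```latex
h = R - \sqrt{R^{2} - \left(\tfrac{s}{2}\right)^{2}}
\quad\Longleftrightarrow\quad
s = 2\sqrt{h\,(2R - h)} \;\approx\; \sqrt{8Rh} \quad (h \ll R)
```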
Gombert, Andreas K; van Maris, Antonius J A
2015-06-01
Current fuel ethanol production using yeasts and starch or sucrose-based feedstocks is referred to as 1st generation (1G) ethanol production. These processes are characterized by the high contribution of sugar prices to the final production costs, by high production volumes, and by low profit margins. In this context, small improvements in the ethanol yield on sugars have a large impact on process economy. Three types of strategies used to achieve this goal are discussed: engineering free-energy conservation, engineering redox-metabolism, and decreasing sugar losses in the process. Whereas the two former strategies lead to decreased biomass and/or glycerol formation, the latter requires increased process and/or yeast robustness. Copyright © 2014 Elsevier Ltd. All rights reserved.
Digital processing with single electrons for arbitrary waveform generation of current
NASA Astrophysics Data System (ADS)
Okazaki, Yuma; Nakamura, Shuji; Onomitsu, Koji; Kaneko, Nobu-Hisa
2018-03-01
We demonstrate arbitrary waveform generation of current using a GaAs-based single-electron pump. In our experiment, a digital processing algorithm known as delta-sigma modulation is incorporated into single-electron pumping to generate a density-modulated single-electron stream, by which we demonstrate the generation of arbitrary waveforms of current including sinusoidal, square, and triangular waves with a peak-to-peak amplitude of approximately 10 pA and an output bandwidth ranging from dc to close to 1 MHz. The developed current generator can be used as the precise and calculable current reference required for measurements of current noise in low-temperature experiments.
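A Python sketch of first-order delta-sigma modulation of a target envelope into a binary pump/no-pump stream, the digital-processing step the abstract describes; the accumulator form and all parameters are illustrative assumptions.

```python
# Sketch of first-order delta-sigma modulation of a target waveform into
# a binary stream whose local pulse density tracks the target in [0, 1],
# as used to density-modulate single-electron pumping.
import numpy as np

def delta_sigma(target):
    """Return a 0/1 stream whose running density follows `target`."""
    out = np.zeros_like(target, dtype=np.uint8)
    acc = 0.0
    for k, x in enumerate(target):
        acc += x                 # integrate the requested density
        if acc >= 1.0:
            out[k] = 1           # emit one electron this cycle
            acc -= 1.0
    return out

n = 2000
t = np.arange(n)
target = 0.5 + 0.4 * np.sin(2 * np.pi * t / 500)   # desired current envelope
stream = delta_sigma(target)
# Averaging the stream over a window approximates the sine envelope.
print(stream[:40], stream.mean())
```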
Smoking Cessation Intervention on Facebook: Which Content Generates the Best Engagement?
Thrul, Johannes; Klein, Alexandra B; Ramo, Danielle E
2015-11-11
Social media offer a great opportunity to deliver smoking cessation treatment to young adults, but previous online and social media interventions targeting health behavior change have struggled with low participant engagement. We examined engagement generated by content based on the Transtheoretical Model of Behavior Change (TTM) in a motivationally tailored smoking cessation intervention on Facebook. This study aimed to identify which intervention content based on the TTM (Decisional Balance and 10 processes of change) generated the highest engagement among participants in pre-action stages of change (Precontemplation, Contemplation, and Preparation). Participants (N=79, 20% female, mean age 20.8) were assessed for readiness to quit smoking and assigned to one of 7 secret Facebook groups tailored to their stage of change. Daily postings to the groups based on TTM Decisional Balance and the 10 processes of change were made by research staff over 3 months. Engagement was operationalized as the number of participant comments to each post. TTM content-based predictors of number of comments were analyzed and stratified by baseline stage of change, using negative binomial regression analyses with and without zero inflation. A total of 512 TTM-based posts generated 630 individual comments. In Precontemplation and Contemplation groups, Decisional Balance posts generated above average engagement (P=.01 and P<.001). In Contemplation groups, posts based on the TTM processes Dramatic Relief and Self-Liberation resulted in below average engagement (P=.01 and P=.005). In Preparation groups, posts based on Consciousness Raising generated above average engagement (P=.009). Participant engagement decreased over time and differed between groups within Precontemplation and Contemplation stages, but was independent of day of the week and time of day the content was posted to the groups. No participant baseline characteristics significantly predicted engagement. Participants not ready to quit in the next 30 days (in Precontemplation or Contemplation) engaged most when prompted to think about the pros and cons of behavior change, while those in the Preparation stage engaged most when posts increased awareness about smoking and smoking cessation. Findings support tailoring intervention content to readiness to quit and suggest intervention components that may be most effective in generating high participant engagement on social media.
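Engagement analyses of this kind are commonly run as negative binomial regressions of comment counts on post attributes. The Python/statsmodels sketch below uses fabricated data purely to illustrate the model form; the covariates, dispersion parameter, and zero-inflation handling are not the study's actual specification.

```python
# Sketch of a negative binomial GLM for overdispersed comment counts.
# All data below are fabricated for illustration only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 200
is_decisional_balance = rng.integers(0, 2, size=n)      # post-content flag
days_in = rng.uniform(0, 90, size=n)                    # time into intervention

mu = np.exp(0.2 + 0.6 * is_decisional_balance - 0.01 * days_in)
comments = rng.poisson(mu * rng.gamma(2.0, 0.5, size=n))  # overdispersed counts

X = sm.add_constant(np.column_stack([is_decisional_balance, days_in]))
model = sm.GLM(comments, X, family=sm.families.NegativeBinomial(alpha=0.5))
print(model.fit().summary())
```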
ERIC Educational Resources Information Center
Fillingim, Jennifer Gale
2010-01-01
Contemporary mathematics education reform has placed increased emphasis on K-12 mathematics curriculum. Reform-based curricula, often referred to as "Standards-based" due to philosophical alignment with the NCTM Process Standards, have generated controversy among families, educators, and researchers. The mathematics education research…
Worklist handling in workflow-enabled radiological application systems
NASA Astrophysics Data System (ADS)
Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim; von Berg, Jens
2000-05-01
For the next generation of integrated information systems for health care applications, more emphasis has to be put on systems which, by design, support the reduction of cost, the increase of efficiency and the improvement of the quality of services. A substantial contribution to this will be the modeling, optimization, automation and enactment of processes in health care institutions. One of the perceived key success factors for the system integration of processes will be the application of workflow management, with workflow management systems as key technology components. In this paper we address workflow management in radiology. We focus on an important aspect of workflow management, the generation and handling of worklists, which automatically provide workflow participants with work items that reflect tasks to be performed. The display of worklists and the functions associated with work items are the visible part of an information system using a workflow management approach for its end-users. Appropriate worklist design and implementation will influence the user friendliness of a system and will largely determine work efficiency. Technically, current imaging department information system environments (modality-PACS-RIS installations) have taken a data-driven approach: worklists -- if present at all -- are generated from filtered views on application databases. In a future workflow-based approach, worklists will be generated by autonomous workflow services based on explicit process models and organizational models. This process-oriented approach will provide an integral view of entire health care processes or sub-processes. The paper describes the basic mechanisms of this approach and summarizes its benefits.
ERIC Educational Resources Information Center
Youqing, Chen
2006-01-01
Experience is an activity that arouses emotions and generates meanings based on vivid sensation and profound comprehension. It is emotional, meaningful, and personal, playing a key role in the course of forming and developing one's qualities. The psychological process of experience generation consists of such links as sensing things, arousing…
Stereo matching and view interpolation based on image domain triangulation.
Fickel, Guilherme Pinto; Jung, Claudio R; Malzbender, Tom; Samadani, Ramin; Culbertson, Bruce
2013-09-01
This paper presents a new approach for stereo matching and view interpolation problems based on triangular tessellations suitable for a linear array of rectified cameras. The domain of the reference image is initially partitioned into triangular regions using edge and scale information, aiming to place vertices along image edges and increase the number of triangles in textured regions. A region-based matching algorithm is then used to find an initial disparity for each triangle, and a refinement stage is applied to change the disparity at the vertices of the triangles, generating a piecewise linear disparity map. A simple post-processing procedure is applied to connect triangles with similar disparities generating a full 3D mesh related to each camera (view), which are used to generate new synthesized views along the linear camera array. With the proposed framework, view interpolation reduces to the trivial task of rendering polygonal meshes, which can be done very fast, particularly when GPUs are employed. Furthermore, the generated views are hole-free, unlike most point-based view interpolation schemes that require some kind of post-processing procedures to fill holes.
Real-time liquid-crystal atmosphere turbulence simulator with graphic processing unit.
Hu, Lifa; Xuan, Li; Li, Dayu; Cao, Zhaoliang; Mu, Quanquan; Liu, Yonggang; Peng, Zenghui; Lu, Xinghai
2009-04-27
To generate time-evolving atmosphere turbulence in real time, a phase-generating method for our liquid-crystal (LC) atmosphere turbulence simulator (ATS) is derived based on the Fourier series (FS) method. A real matrix expression for generating turbulence phases is given and calculated with a graphic processing unit (GPU), the GeForce 8800 Ultra. A liquid crystal on silicon (LCOS) with 256x256 pixels is used as the turbulence simulator. The total time to generate a turbulence phase is about 7.8 ms for calculation and readout with the GPU. A parallel processing method of calculating and sending a picture to the LCOS is used to improve the simulating speed of our LC ATS. Therefore, the real-time turbulence phase-generation frequency of our LC ATS is up to 128 Hz. To our knowledge, it is the highest speed used to generate a turbulence phase in real time.
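A Python sketch of FFT-based generation of a single Kolmogorov phase screen, in the spirit of the Fourier-series method; the normalization follows one common convention, subharmonic corrections are omitted, and the grid size and Fried parameter are illustrative.

```python
# Sketch of an FFT-based Kolmogorov turbulence phase screen: filter
# complex white noise by the square root of the phase power spectrum.
import numpy as np

N, delta, r0 = 256, 0.01, 0.1        # grid points, spacing (m), Fried parameter (m)

fx = np.fft.fftfreq(N, d=delta)
FX, FY = np.meshgrid(fx, fx)
f = np.sqrt(FX**2 + FY**2)
f[0, 0] = np.inf                      # suppress the undefined piston term

# Kolmogorov phase PSD: 0.023 r0^(-5/3) f^(-11/3)
psd = 0.023 * r0**(-5.0 / 3.0) * f**(-11.0 / 3.0)

rng = np.random.default_rng(7)
noise = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
cn = noise * np.sqrt(psd) / (N * delta)      # spectral-domain amplitudes
phase = np.real(np.fft.ifft2(cn)) * N**2     # phase screen in radians
print(phase.std())
```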
NASA Technical Reports Server (NTRS)
Nieten, Joseph; Burke, Roger
1993-01-01
Consideration is given to the System Diagnostic Builder (SDB), an automated knowledge acquisition tool using state-of-the-art AI technologies. The SDB employs an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert. Thus, data are captured from the subject system, classified, and used to drive the rule generation process. These rule bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The knowledge bases captured from the Shuttle Mission Simulator can be used as black box simulations by the Intelligent Computer Aided Training devices. The SDB can also be used to construct knowledge bases for the process control industry, such as chemical production or oil and gas production.
NASA Astrophysics Data System (ADS)
Nieten, Joseph L.; Burke, Roger
1993-03-01
The system diagnostic builder (SDB) is an automated knowledge acquisition tool using state-of-the-art artificial intelligence (AI) technologies. The SDB uses an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert (SME). Thus, data is captured from the subject system, classified by an expert, and used to drive the rule generation process. These rule-bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The rule-bases can be used in any knowledge based system which monitors or controls a physical system or simulation. The SDB has demonstrated the utility of using inductive machine learning technology to generate reliable knowledge bases. In fact, we have discovered that the knowledge captured by the SDB can be used in any number of applications. For example, the knowledge bases captured from the SMS can be used as black box simulations by intelligent computer aided training devices. We can also use the SDB to construct knowledge bases for the process control industry, such as chemical production, or oil and gas production. These knowledge bases can be used in automated advisory systems to ensure safety, productivity, and consistency.
Inference from clustering with application to gene-expression microarrays.
Dougherty, Edward R; Barrera, Junior; Brun, Marcel; Kim, Seungchan; Cesar, Roberto M; Chen, Yidong; Bittner, Michael; Trent, Jeffrey M
2002-01-01
There are many algorithms to cluster sample data points based on nearness or a similarity measure. Often the implication is that points in different clusters come from different underlying classes, whereas those in the same cluster come from the same class. Stochastically, the underlying classes represent different random processes. The inference is that clusters represent a partition of the sample points according to which process they belong. This paper discusses a model-based clustering toolbox that evaluates cluster accuracy. Each random process is modeled as its mean plus independent noise, sample points are generated, the points are clustered, and the clustering error is the number of points clustered incorrectly according to the generating random processes. Various clustering algorithms are evaluated based on process variance and the key issue of the rate at which algorithmic performance improves with increasing numbers of experimental replications. The model means can be selected by hand to test the separability of expected types of biological expression patterns. Alternatively, the model can be seeded by real data to test the expected precision of that output or the extent of improvement in precision that replication could provide. In the latter case, a clustering algorithm is used to form clusters, and the model is seeded with the means and variances of these clusters. Other algorithms are then tested relative to the seeding algorithm. Results are averaged over various seeds. Output includes error tables and graphs, confusion matrices, principal-component plots, and validation measures. Five algorithms are studied in detail: K-means, fuzzy C-means, self-organizing maps, hierarchical Euclidean-distance-based and correlation-based clustering. The toolbox is applied to gene-expression clustering based on cDNA microarrays using real data. Expression profile graphics are generated and error analysis is displayed within the context of these profile graphics. A large amount of generated output is available over the web.
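The core loop of such a toolbox (model each class as its mean plus noise, generate points, cluster, count points clustered incorrectly with respect to the generating processes) can be sketched in a few lines. This is a minimal illustration, not the toolbox itself; the class means, noise level, and sample sizes are invented, and the Hungarian matching stands in for whatever label alignment the toolbox uses.

    import numpy as np
    from scipy.optimize import linear_sum_assignment
    from sklearn.cluster import KMeans

    def clustering_error(means, sigma, n_per_class=50, seed=0):
        # Generate each class as mean + independent Gaussian noise.
        rng = np.random.default_rng(seed)
        k, d = means.shape
        X = np.vstack([m + sigma * rng.standard_normal((n_per_class, d))
                       for m in means])
        truth = np.repeat(np.arange(k), n_per_class)
        labels = KMeans(n_clusters=k, n_init=10,
                        random_state=seed).fit_predict(X)
        # Confusion matrix, then optimal cluster-to-class matching.
        conf = np.zeros((k, k), dtype=int)
        for t, l in zip(truth, labels):
            conf[t, l] += 1
        rows, cols = linear_sum_assignment(-conf)
        return 1.0 - conf[rows, cols].sum() / len(truth)

    # e.g. two hypothetical expression patterns in 5 dimensions:
    # err = clustering_error(np.array([[0.0] * 5, [3.0] * 5]), sigma=1.0)

Averaging this error over seeds and repeating at increasing n_per_class reproduces the kind of replication-versus-accuracy curves the paper studies.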
NASA Astrophysics Data System (ADS)
Luo, Keqin
1999-11-01
The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste generators in industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which costs the industry tremendously for waste treatment and disposal and hinders the further development of the industry. It has therefore become an urgent need for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while production competitiveness is maintained. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. This becomes the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows us to perform a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. The unique contribution of this research is that it is the first time the electroplating industry can (i) use systematically available WM strategies, (ii) know quantitatively and accurately what is going on in each tank, and (iii) identify all WM opportunities through process improvement. This work has formed a solid foundation for the further development of powerful WM technologies for comprehensive WM in the following decade.
Software for pre-processing Illumina next-generation sequencing short read sequences
2014-01-01
Background When compared to Sanger sequencing technology, next-generation sequencing (NGS) technologies are hindered by shorter sequence read lengths, higher base-call error rates, non-uniform coverage, and platform-specific sequencing artifacts. These characteristics lower the quality of downstream analyses, e.g. de novo and reference-based assembly, by introducing sequencing artifacts and errors that may contribute to incorrect interpretation of data. Although many tools have been developed for quality control and pre-processing of NGS data, none of them provide flexible and comprehensive trimming options in conjunction with parallel processing to expedite pre-processing of large NGS datasets. Methods We developed ngsShoRT (next-generation sequencing Short Reads Trimmer), a flexible and comprehensive open-source software package written in Perl that provides a set of algorithms commonly used for pre-processing NGS short read sequences. We compared the features and performance of ngsShoRT with existing tools: CutAdapt, NGS QC Toolkit and Trimmomatic. We also compared the effects of using pre-processed short read sequences generated by different algorithms on de novo and reference-based assembly for three different genomes: Caenorhabditis elegans, Saccharomyces cerevisiae S288c, and Escherichia coli O157:H7. Results Several combinations of ngsShoRT algorithms were tested on publicly available Illumina GA II, HiSeq 2000, and MiSeq eukaryotic and bacterial genomic short read sequences, with a focus on removing sequencing artifacts and low-quality reads and/or bases. Our results show that across three organisms and three sequencing platforms, trimming improved the mean quality scores of trimmed sequences. Using trimmed sequences for de novo and reference-based assembly improved assembly quality as well as assembler performance. In general, ngsShoRT outperformed comparable trimming tools in terms of trimming speed and improvement of de novo and reference-based assembly as measured by assembly contiguity and correctness. Conclusions Trimming of short read sequences can improve the quality of de novo and reference-based assembly and assembler performance. The parallel processing capability of ngsShoRT reduces trimming time and improves memory efficiency when dealing with large datasets. We recommend combining sequencing artifact removal with quality-score-based read filtering and base trimming as the most consistent method for improving sequence quality and downstream assemblies. ngsShoRT source code, user guide and tutorial are available at http://research.bioinformatics.udel.edu/genomics/ngsShoRT/. ngsShoRT can be incorporated as a pre-processing step in genome and transcriptome assembly projects. PMID:24955109
Application of genetic algorithm in integrated setup planning and operation sequencing
NASA Astrophysics Data System (ADS)
Kafashi, Sajad; Shakeri, Mohsen
2011-01-01
Process planning is an essential component for linking design and the manufacturing process. Setup planning and operation sequencing are two main tasks in process planning. Many studies have solved these two problems separately. Considering the fact that the two functions are complementary, it is necessary to integrate them more tightly so that the performance of a manufacturing system can be improved economically and competitively. This paper presents a generative system and a genetic algorithm (GA) approach to process planning for a given part. The proposed approach and optimization methodology analyse the tool approach direction (TAD), tolerance relations between features, and feature precedence relations to generate all possible setups and operations using a workshop resource database. Based on these technological constraints, the GA approach, which adopts a feature-based representation, optimizes the setup plan and the sequence of operations using cost indices. A case study shows that the developed system can generate satisfactory results, optimizing setup planning and operation sequencing simultaneously under feasible conditions.
Option generation in decision making: ideation beyond memory retrieval
Del Missier, Fabio; Visentini, Mimì; Mäntylä, Timo
2015-01-01
According to prescriptive decision theories, the generation of options for choice is a central aspect of decision making. A too narrow representation of the problem may indeed limit the opportunity to evaluate promising options. However, despite the theoretical and applied significance of this topic, the cognitive processes underlying option generation are still unclear. In particular, while a cued recall account of option generation emphasizes the role of memory and executive control, other theoretical proposals stress the importance of ideation processes based on various search and thinking processes. Unfortunately, relevant behavioral evidence on the cognitive processes underlying option generation is scattered and inconclusive. In order to reach a better understanding, we carried out an individual-differences study employing a wide array of cognitive predictors, including measures of episodic memory, semantic memory, cognitive control, and ideation fluency. The criterion tasks consisted of three different poorly-structured decision-making scenarios, and the participants were asked to generate options to solve these problems. The main criterion variable of the study was the number of valid options generated, but also the diversity and the quality of generated options were examined. The results showed that option generation fluency and diversity in the context of ill-structured decision making are supported by ideation ability even after taking into account the effects of individual differences in several other aspects of cognitive functioning. Thus, ideation processes, possibly supported by search and thinking processes, seem to contribute to option generation beyond basic associative memory retrieval. The findings of the study also indicate that generating more options may have multifaceted consequences for choice, increasing the quality of the best option generated but decreasing the mean quality of the options in the generated set. PMID:25657628
Option generation in decision making: ideation beyond memory retrieval.
Del Missier, Fabio; Visentini, Mimì; Mäntylä, Timo
2014-01-01
According to prescriptive decision theories, the generation of options for choice is a central aspect of decision making. A too narrow representation of the problem may indeed limit the opportunity to evaluate promising options. However, despite the theoretical and applied significance of this topic, the cognitive processes underlying option generation are still unclear. In particular, while a cued recall account of option generation emphasizes the role of memory and executive control, other theoretical proposals stress the importance of ideation processes based on various search and thinking processes. Unfortunately, relevant behavioral evidence on the cognitive processes underlying option generation is scattered and inconclusive. In order to reach a better understanding, we carried out an individual-differences study employing a wide array of cognitive predictors, including measures of episodic memory, semantic memory, cognitive control, and ideation fluency. The criterion tasks consisted of three different poorly-structured decision-making scenarios, and the participants were asked to generate options to solve these problems. The main criterion variable of the study was the number of valid options generated, but also the diversity and the quality of generated options were examined. The results showed that option generation fluency and diversity in the context of ill-structured decision making are supported by ideation ability even after taking into account the effects of individual differences in several other aspects of cognitive functioning. Thus, ideation processes, possibly supported by search and thinking processes, seem to contribute to option generation beyond basic associative memory retrieval. The findings of the study also indicate that generating more options may have multifaceted consequences for choice, increasing the quality of the best option generated but decreasing the mean quality of the options in the generated set.
Status review of PMR polyimides
NASA Technical Reports Server (NTRS)
Serafini, T. T.
1978-01-01
The current status of first and second generation PMR polyimides is reviewed. Synthesis, processing, and applications are considered, using prepreg materials based on processable, high temperature resistant polyimides.
NASA Astrophysics Data System (ADS)
Borne, Adrien; Katsura, Tomotaka; Félix, Corinne; Doppagne, Benjamin; Segonds, Patricia; Bencheikh, Kamel; Levenson, Juan Ariel; Boulanger, Benoit
2016-01-01
Several third-harmonic generation processes were performed in a single step-index germanium-doped silica optical fiber under intermodal phase-matching conditions. The nanosecond fundamental beam was tuned between 1400 and 1600 nm. The transverse distributions of the energy were successfully modeled in the form of Ince-Gauss modes, pointing out some ellipticity of the fiber core. From these experiments and theoretical calculations, we discuss the implementation of frequency-degenerate triple-photon generation, which shares the same phase-matching condition as third-harmonic generation, being its reverse process.
Optical testing of aspheres based on photochromic computer-generated holograms
NASA Astrophysics Data System (ADS)
Pariani, Giorgio; Bianco, Andrea; Bertarelli, Chiara; Spanó, Paolo; Molinari, Emilio
2010-07-01
Aspherical optics are widely used in modern optical telescopes and instrumentation because of their ability to reduce aberrations with a simple optical system. Testing their optical quality through null interferometry is not trivial, as reference optics are not available. Computer-Generated Holograms (CGHs) are efficient devices for generating a well-defined optical wavefront. We developed rewritable Computer-Generated Holograms, based on photochromic layers, for the interferometric testing of aspheres. These photochromic holograms are cost-effective, and the production method does not require any post-exposure processing.
Analysis of quality raw data of second generation sequencers with Quality Assessment Software.
Ramos, Rommel Tj; Carneiro, Adriana R; Baumbach, Jan; Azevedo, Vasco; Schneider, Maria Pc; Silva, Artur
2011-04-18
Second-generation sequencing technologies have advantages over Sanger sequencing; however, they have introduced new challenges for the genome construction process, especially because of the small size of the reads, despite the high degree of coverage. Independent of the program chosen for the construction process, DNA sequences are superimposed, based on identity, to extend the reads, generating contigs; mismatches indicate a lack of homology and are not included. This process improves our confidence in the sequences that are generated. We developed Quality Assessment Software, with which one can review graphs showing the distribution of quality values from the sequencing reads. This software allows us to adopt more stringent quality standards for sequence data, based on quality-graph analysis and estimated coverage after applying the quality filter, providing acceptable sequence coverage for genome construction from short reads. Quality filtering is a fundamental step in the process of constructing genomes, as it reduces the frequency of incorrect alignments caused by measurement errors, which can occur during the construction process due to the size of the reads, provoking misassemblies. Application of quality filters to sequence data, using the Quality Assessment software along with graphing analyses, provided greater precision in the definition of cutoff parameters, which increased the accuracy of genome construction.
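The filtering step being described can be pictured with a minimal Python sketch that keeps only reads whose mean Phred quality passes a cutoff. The cutoff and Phred offset below are illustrative assumptions; the software derives its cutoffs from quality-graph analysis and estimated coverage rather than a fixed constant.

    def mean_quality_filter(fastq_path, min_mean_q=20, phred_offset=33):
        # Yield FASTQ records whose mean base quality passes the cutoff.
        with open(fastq_path) as fh:
            while True:
                header = fh.readline().rstrip()
                if not header:
                    break
                seq = fh.readline().rstrip()
                plus = fh.readline().rstrip()
                qual = fh.readline().rstrip()
                scores = [ord(c) - phred_offset for c in qual]
                if scores and sum(scores) / len(scores) >= min_mean_q:
                    yield header, seq, plus, qual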
NASA Astrophysics Data System (ADS)
Koon, Phillip L.; Greene, Scott
2002-07-01
Our aerospace customers are demanding that we drastically reduce the cost of operating and supporting our products. Our space customer in particular is looking for the next generation of reusable launch vehicle systems to support more aircraft-like operation. Achieving this goal requires more than an evolution in materials, processes and systems; what is required is a paradigm shift in the design of the launch vehicles and the processing systems that support them. This paper describes the Automated Informed Maintenance (AIM) system we are developing for NASA's Space Launch Initiative (SLI) Second Generation Reusable Launch Vehicle (RLV). Our system includes an Integrated Health Management (IHM) system for the launch vehicles and ground support systems, which features model-based diagnostics and prognostics. Health Management data is used by our AIM decision support and process aids to automatically plan maintenance, generate work orders, and schedule maintenance activities along with the resources required to execute these processes. Our system will automate the ground processing for a spaceport handling multiple RLVs executing multiple missions. To accomplish this task we are applying the latest web-based distributed computing technologies and application development techniques.
The role of ion-exchange membrane in energy conversion
NASA Astrophysics Data System (ADS)
Khoiruddin, Aryanti, Putu T. P.; Hakim, Ahmad N.; Wenten, I. Gede
2017-05-01
Ion-exchange membranes (IEMs) may play an important role in the future of electrical energy generation, which is expected to be renewable and clean. The fuel cell (FC) is one of the promising technologies for solving future energy issues owing to interesting features such as high electrical efficiency, low emissions, low noise level, and modularity. IEM-based processes, such as the microbial fuel cell (MFC) and reverse electrodialysis (RED), may be combined with water or wastewater treatment into an integrated system. By using the integrated system, water and energy can be produced simultaneously. IEM-based processes can be used for direct electricity generation or for long-term energy storage, for example by harnessing surplus electricity from an existing renewable energy system to be converted into hydrogen gas via electrolysis or stored as chemical energy via a redox flow battery (RFB). In this paper, recent developments and applications of IEM-based processes in energy conversion are reviewed. In addition, perspectives and challenges of IEM-based processes in energy conversion are pointed out.
Multi-Sensor Data Fusion Project
2000-02-28
... seismic network by detecting T phases generated by underground events (generally earthquakes) and associating these phases to seismic events. The ... between underwater explosions (H), underground sources, mostly earthquake-generated (T), and noise detections (N). The phases classified as H are the only ... processing for infrasound sensors is most similar to seismic array processing, with the exception that the detections are based on a more sophisticated
Hydrogen generator, via catalytic partial oxidation of methane for fuel cells
NASA Astrophysics Data System (ADS)
Recupero, Vincenzo; Pino, Lidia; Di Leonardo, Raffaele; Lagana', Massimo; Maggio, Gaetano
It is well known that the most widely acknowledged process for the generation of hydrogen for fuel cells is based upon the steam reforming of methane or natural gas. A valid alternative could be a process based on the partial oxidation of methane, since this process is mildly exothermic and therefore not energy intensive. Consequently, great interest is expected in the conversion of methane into syngas if an autothermal, low-energy, compact, and reliable process can be developed. This paper covers the activities performed by the CNR Institute of Transformation and Storage of Energy (CNR-TAE) on theoretical and experimental studies for a compact hydrogen generator, via catalytic selective partial oxidation of methane, integrated with second-generation fuel cells (EC-JOU2 contract). In particular, the project focuses on methane partial oxidation via heterogeneous selective catalysts, in order to: demonstrate the basic catalytic selective partial oxidation of methane (CSPOM) technology in a subscale prototype, equivalent to a nominal output of 5 kWe; develop the CSPOM technology for its application in electric energy production by means of fuel cells; and assess, by a balance-of-plant analysis and a techno-economic evaluation, the potential benefits of CSPOM for different categories of fuel cells.
Vector Quantization Algorithm Based on Associative Memories
NASA Astrophysics Data System (ADS)
Guzmán, Enrique; Pogrebnyak, Oleksiy; Yáñez, Cornelio; Manrique, Pablo
This paper presents a vector quantization algorithm for image compression based on extended associative memories (EAM). The proposed algorithm is divided into two stages. First, an associative network is generated by applying the learning phase of the extended associative memories between a codebook generated by the LBG algorithm and a training set. This associative network, named the EAM-codebook, represents a new codebook which is used in the next stage; it establishes a relation between the training set and the LBG codebook. Second, the vector quantization process is performed by means of the recall stage of the EAM, using the EAM-codebook as the associative memory. This process generates the set of class indices to which each input vector belongs. With respect to the LBG algorithm, the main advantages offered by the proposed algorithm are high processing speed and low demand on resources (system memory); results on image compression and quality are presented.
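For orientation, the conventional VQ encoding step that the EAM recall replaces can be sketched in NumPy as a nearest-codeword search; the associative-memory mechanics themselves are not reproduced here, and the codebook training shown in the comment is the standard LBG/k-means baseline rather than the paper's method.

    import numpy as np

    def vq_encode(blocks, codebook):
        # Assign each input vector (image block) to its nearest codeword;
        # the EAM recall stage implements the same input/output contract.
        d = ((blocks[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
        return d.argmin(axis=1)          # class index per input vector

    # A baseline codebook could come from LBG/k-means training, e.g.:
    # from sklearn.cluster import KMeans
    # codebook = KMeans(n_clusters=256).fit(training_blocks).cluster_centers_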
NASA Astrophysics Data System (ADS)
Dileep Kumar, V.; Barnwal, Tripti A.; Mukherjee, Jaya; Gantayet, L. M.
2010-02-01
For the effective evaporation of refractory metals, an electron beam is found to be the most suitable vapour-generation source. Using electron beams, high-throughput laser-based purification processes are carried out. However, due to the highly concentrated electron beam, the vapour gets ionised, and these ions lead to dilution of the pure product of the laser-based separation process. To estimate the concentration of these ions and the extraction potential required to remove them from the vapour stream, experiments were conducted using aluminium as the evaporant. The aluminium ingots were placed in a water-cooled copper crucible. Inserts were used to hold the evaporant, in order to attain a higher number density in the vapour processing zone and also to confine the liquid metal. Parametric studies with beam power, number density, and extraction potential were conducted. In this paper we discuss the trend of thermal ion generation and the electrostatic field required for extraction.
A high-fidelity weather time series generator using the Markov Chain process on a piecewise level
NASA Astrophysics Data System (ADS)
Hersvik, K.; Endrerud, O.-E. V.
2017-12-01
A method is developed for generating a set of unique weather time series based on an existing weather series. The method allows statistically valid weather variations to take place within repeated simulations of offshore operations. The numerous generated time series need to share the same statistical qualities as the original time series. Statistical qualities here refer mainly to the distribution of weather windows available for work, including the durations and frequencies of such windows, and seasonal characteristics. The method is based on the Markov chain process. The core new development lies in how the Markov process is used, specifically by joining small pieces of random-length time series together rather than joining individual weather states, each from a single time step, which is the common solution found in the literature. This new Markov model shows favorable characteristics with respect to the requirements set forth and in all aspects of the validation performed.
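A minimal sketch of the piecewise joining idea, assuming a one-dimensional historical state series (e.g., significant wave height): repeatedly append a random-length slice of the history that starts in approximately the state the generated series currently ends in. The segment-length bounds and matching tolerance below are invented for illustration and are not the paper's values.

    import numpy as np

    def generate_series(history, length, seed=None, min_len=6, max_len=48):
        # history: 1-D numpy array of observed weather states.
        rng = np.random.default_rng(seed)
        out = [float(history[rng.integers(len(history))])]
        while len(out) < length:
            target = out[-1]
            # candidate start positions whose state matches the current one
            candidates = np.where(np.isclose(history, target, atol=0.25))[0]
            candidates = candidates[candidates < len(history) - 1]
            if candidates.size == 0:          # fall back to nearest state
                candidates = np.arange(len(history) - 1)
            start = int(rng.choice(candidates))
            seg_len = int(rng.integers(min_len, max_len + 1))
            out.extend(history[start + 1 : start + 1 + seg_len])
        return np.array(out[:length])

Validation along the paper's lines would then compare weather-window duration and frequency distributions, per season, between the generated and original series.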
Automated extraction of knowledge for model-based diagnostics
NASA Technical Reports Server (NTRS)
Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.
1990-01-01
The concept of accessing computer-aided design (CAD) databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques, as well as an internal database of component descriptions, to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in the formats required by various model-based reasoning tools.
NASA Astrophysics Data System (ADS)
Nejad, Hossein Tehrani Nik; Sugimura, Nobuhiro; Iwamura, Koji; Tanimizu, Yoshitaka
Process planning and scheduling are important manufacturing planning activities which deal with resource utilization and the time span of manufacturing operations. The process plans and schedules generated in the planning phase often have to be modified in the execution phase due to disturbances in the manufacturing system. This paper deals with a multi-agent architecture of an integrated and dynamic system for process planning and scheduling of multiple jobs. A negotiation protocol is discussed for generating the process plans and schedules of the manufacturing resources and the individual jobs, dynamically and incrementally, based on alternative manufacturing processes. The alternative manufacturing processes are represented by the process plan networks discussed in a previous paper, and suitable process plans and schedules are searched for and generated to cope with both the dynamic status and the disturbances of the manufacturing systems. We combine heuristic search algorithms over the process plan networks with the negotiation protocols, in order to generate suitable process plans and schedules in a dynamic manufacturing environment. Simulation software has been developed to carry out case studies aimed at verifying the performance of the proposed multi-agent architecture.
NASA Astrophysics Data System (ADS)
Liu, Chen; Gao, Bin; Huang, Peng; Kang, Jinfeng
2017-03-01
In this work, first-principles calculations are employed to study the microstructure characteristics of the anatase TiO2 resistive switching material associated with the generation of oxygen vacancy (V_O) based nanofilaments during the switching process. The calculations indicate that both the Magnéli phase Ti4O7 and the V_O-defect phase of anatase TiO2 may be formed with the generation of oxygen vacancies during the forming and SET processes. Based on the calculations, a new physical insight is proposed to clarify the microstructure evolution characteristics of the anatase TiO2 resistive switching material and the correlation with resistive switching behaviors. During the forming or SET process, the anatase TiO2 is first excited to a transition state with the generation of oxygen vacancies, then fully relaxes to a stable V_O-defect state. This V_O-defect state may either recover to the original state with the recombination of the oxygen vacancies, which causes the reversible resistive switching behavior, or further transform to a much more stable state, the Magnéli phase Ti4O7, through a phase transition process with the generation of many more oxygen vacancies. The phase transition from the V_O-defective anatase phase to the Magnéli phase Ti4O7 causes the failure of resistive switching, owing to the significantly reduced possibility of the reversible phase transition from the Magnéli phase back to the anatase phase compared with the possibility of recombination in the V_O-defective anatase.
Robot-based additive manufacturing for flexible die-modelling in incremental sheet forming
NASA Astrophysics Data System (ADS)
Rieger, Michael; Störkle, Denis Daniel; Thyssen, Lars; Kuhlenkötter, Bernd
2017-10-01
The paper describes the application concept of additively manufactured dies to support the robot-based incremental sheet metal forming process (`Roboforming') for the production of sheet metal components in small batch sizes. Compared to the dieless kinematic-based generation of a shape by means of two cooperating industrial robots, the supporting robot models a die on the back of the metal sheet by using the robot-based fused layer manufacturing (FLM) process. This tool chain is software-defined and preserves the high geometrical form flexibility of Roboforming while flexibly generating support structures adapted to the final part's geometry. Test series serve to confirm the feasibility of the concept by investigating the process challenges of adhesion to the sheet surface and general stability, as well as the influence on geometric accuracy compared to the well-known forming strategies.
Materials processing in space: Future technology trends
NASA Technical Reports Server (NTRS)
Barter, N. J.
1980-01-01
NASA's materials processing in space (MPS) program involves both ground- and space-based research and looks to frequent and cost-effective access to the space environment for necessary progress. The first-generation payloads for research are under active design and development. They will be hosted by the Space Shuttle/Spacelab on Earth-orbital flights in the early 1980's. These missions will focus on the acquisition of materials behavior research data, the potential enhancement of Earth-based technology, and the implementation of space-based processing for specialized, high-value materials. Some materials to be studied in these payloads may provide future breakthroughs for stronger alloys, ultrapure glasses, superior electronic components, and new or better chemicals. A 25 kW power system is expected to be operational to support sustained, systematic space processing activity beyond Shuttle capability for second-generation payload systems for Spacelab and free-flyer missions to study solidification and crystal growth and to process metals/alloys, glasses/ceramics, and chemicals and biologicals.
Mobile Ultrasound Plane Wave Beamforming on iPhone or iPad using Metal- based GPU Processing
NASA Astrophysics Data System (ADS)
Hewener, Holger J.; Tretbar, Steffen H.
Mobile and cost-effective ultrasound devices are being used in point-of-care scenarios and the trauma room. To reduce the costs of such devices, we have already presented the possibilities of consumer devices like the Apple iPad for full signal processing of raw data for ultrasound image generation. Using technologies like plane wave imaging to generate a full image with only one excitation/reception event, the acquisition times and power consumption of ultrasound imaging can be reduced for low-power mobile devices based on consumer electronics, realizing the transition from FPGA- or ASIC-based beamforming to more flexible software beamforming. The massively parallel beamforming processing can be done with the Apple framework "Metal" for advanced graphics and general-purpose GPU processing on the iOS platform. We were able to integrate the beamforming reconstruction into our mobile ultrasound processing application with imaging rates up to 70 Hz on iPad Air 2 hardware.
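The underlying reconstruction is classical delay-and-sum beamforming; the sketch below shows it in NumPy for a zero-angle plane-wave transmit. Array geometry, sampling rate, and sound speed are generic assumptions, and the paper's version runs as massively parallel Metal GPU kernels rather than Python loops.

    import numpy as np

    def das_plane_wave(rf, elem_x, grid_x, grid_z, fs, c=1540.0):
        # rf: (n_samples, n_elements) received RF data for one shot;
        # elem_x: element x-positions (m); grid_*: image grid coordinates (m).
        img = np.zeros((len(grid_z), len(grid_x)))
        for iz, z in enumerate(grid_z):
            for ix, x in enumerate(grid_x):
                # transmit delay (plane wave reaches depth z at t = z/c)
                # plus per-element receive delay back from (x, z)
                t = (z + np.sqrt(z ** 2 + (x - elem_x) ** 2)) / c
                idx = np.round(t * fs).astype(int)
                valid = idx < rf.shape[0]
                img[iz, ix] = rf[idx[valid], np.flatnonzero(valid)].sum()
        return img

Each pixel is independent, which is exactly why the computation maps so well onto a per-pixel GPU compute kernel.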
Generation of multicolor vacuum ultraviolet pulses through four-wave sum-frequency mixing in argon
NASA Astrophysics Data System (ADS)
Shi, Liping; Li, Wenxue; Zhou, Hui; Wang, Di; Ding, Liang'en; Zeng, Heping
2013-11-01
We demonstrate efficient generation of multicolor vacuum ultraviolet pulses with excellent mode quality through χ(3)-based four-wave sum-frequency mixing and third-order harmonic generation of 400- and 267-nm femtosecond laser pulses in argon gas. The χ(3)-based nonlinear optical processes were optimized with appropriate control of gas pressure and group velocity delay between the driving pulses. Furthermore, the pulse breakup effects were observed for tightly focused ultraviolet pulses.
Semi-autonomous remote sensing time series generation tool
NASA Astrophysics Data System (ADS)
Babu, Dinesh Kumar; Kaufmann, Christof; Schmidt, Marco; Dhams, Thorsten; Conrad, Christopher
2017-10-01
High spatial and temporal resolution data are vital for crop monitoring and phenology change detection. Due to limitations of satellite architecture and frequent cloud cover, the availability of daily high-spatial-resolution data is still far from reality. Remote sensing time series generation of high spatial and temporal resolution data by data fusion seems to be a practical alternative. However, it is not an easy process, since it involves multiple steps and requires multiple tools. In this paper, a Geographic Information System (GIS)-based tool framework is presented for semi-autonomous time series generation. This tool eliminates the difficulties by automating all the steps and enables users to generate synthetic time series data with ease. Firstly, all the steps required for the time series generation process are identified and grouped into blocks based on their functionalities. Then two main frameworks are created: one to perform all the pre-processing steps on various satellite data, and the other to perform data fusion to generate the time series. The two frameworks can be used individually to perform specific tasks or combined to perform both processes in one go. The tool can handle most of the known geo-data formats currently available, which makes it a generic tool for time series generation from various remote sensing satellite data. It is developed as a common platform with a clean interface that provides many functions to enable the further development of more remote sensing applications. A detailed description of the capabilities and advantages of the frameworks is given in this paper.
NASA Astrophysics Data System (ADS)
Zangori, Laura; Forbes, Cory T.; Schwarz, Christina V.
2015-10-01
Opportunities to generate model-based explanations are crucial for elementary students, yet are rarely foregrounded in elementary science learning environments despite evidence that early learners can reason from models when provided with scaffolding. We used a quasi-experimental research design to investigate the comparative impact of a scaffold test condition consisting of embedded physical scaffolds within a curricular modeling task on third-grade (age 8-9) students' formulation of model-based explanations for the water cycle. This condition was contrasted to the control condition where third-grade students used a curricular modeling task with no embedded physical scaffolds. Students from each condition (n_scaffolded = 60; n_unscaffolded = 56) generated models of the water cycle before and after completion of a 10-week water unit. Results from quantitative analyses suggest that students in the scaffolded condition represented and linked more subsurface water process sequences with surface water process sequences than did students in the unscaffolded condition. However, results of qualitative analyses indicate that students in the scaffolded condition were less likely to build upon these process sequences to generate model-based explanations and experienced difficulties understanding their models as abstracted representations rather than recreations of real-world phenomena. We conclude that embedded curricular scaffolds may support students to consider non-observable components of the water cycle but, alone, may be insufficient for generation of model-based explanations about subsurface water movement.
ERIC Educational Resources Information Center
Demetriadis, S. N.; Papadopoulos, P. M.; Stamelos, I. G.; Fischer, F.
2008-01-01
This study investigates the hypothesis that students' learning and problem-solving performance in ill-structured domains can be improved, if elaborative question prompts are used to activate students' context-generating cognitive processes, during case study. Two groups of students used a web-based learning environment to criss-cross and study…
DOE Office of Scientific and Technical Information (OSTI.GOV)
none,
2008-03-01
This roadmap to “Next Generation Hydrocarbon Biorefineries” outlines a number of novel process pathways for biofuels production based on sound scientific and engineering proofs of concept demonstrated in laboratories around the world. This report was based on the workshop of the same name held June 25-26, 2007 in Washington, DC.
Moreno-Tapia, Sandra Veronica; Vera-Salas, Luis Alberto; Osornio-Rios, Roque Alfredo; Dominguez-Gonzalez, Aurelio; Stiharu, Ion; de Jesus Romero-Troncoso, Rene
2010-01-01
Computer numerically controlled (CNC) machines have evolved to adapt to increasing technological and industrial requirements. To cover these needs, new generation machines have to perform monitoring strategies by incorporating multiple sensors. Since in most applications the online processing of the variables is essential, the use of smart sensors is necessary. The contribution of this work is the development of a wireless network platform of reconfigurable smart sensors for CNC machine applications complying with the measurement requirements of new generation CNC machines. Four different smart sensors are put under test in the network and their corresponding signal processing techniques are implemented in a Field Programmable Gate Array (FPGA)-based sensor node. PMID:22163602
Moreno-Tapia, Sandra Veronica; Vera-Salas, Luis Alberto; Osornio-Rios, Roque Alfredo; Dominguez-Gonzalez, Aurelio; Stiharu, Ion; Romero-Troncoso, Rene de Jesus
2010-01-01
Computer numerically controlled (CNC) machines have evolved to adapt to increasing technological and industrial requirements. To cover these needs, new generation machines have to perform monitoring strategies by incorporating multiple sensors. Since in most applications the online processing of the variables is essential, the use of smart sensors is necessary. The contribution of this work is the development of a wireless network platform of reconfigurable smart sensors for CNC machine applications complying with the measurement requirements of new generation CNC machines. Four different smart sensors are put under test in the network and their corresponding signal processing techniques are implemented in a Field Programmable Gate Array (FPGA)-based sensor node.
Keane, M; Siert, A; Stone, S; Chen, B; Slaven, J; Cumpston, A; Antonini, J
2012-09-01
Eight welding process/shielding gas combinations were assessed for the generation of hexavalent chromium (Cr6+) in stainless steel welding fumes. The processes examined were gas metal arc welding (GMAW) (axial spray, short circuit, and pulsed spray modes), flux cored arc welding (FCAW), and shielded metal arc welding (SMAW). The Cr6+ fractions were measured in the fumes; fume generation rates, Cr6+ generation rates, and Cr6+ generation rates per unit mass of welding wire were determined. A limited controlled comparison study was done in a welding shop including SMAW, FCAW, and three GMAW methods. The processes studied were compared for costs, including relative labor costs. Results indicate the Cr6+ fraction in the fume varied widely, from a low of 2800 to a high of 34,000 ppm. Generation rates of Cr6+ ranged from 69 to 7800 μg/min, and Cr6+ generation rates per unit of wire ranged from 1 to 270 μg/g. The results of the field study were similar to the findings in the laboratory. The Cr6+ concentration (ppm) in the fume did not necessarily correlate with the Cr6+ generation rate. Physical properties were similar for the GMAW processes, with mass median aerodynamic diameters ranging from 250 to 336 nm, while the FCAW and SMAW fumes were larger (360 and 670 nm, respectively). The pulsed axial spray method was the best choice of the processes studied, based on minimal fume generation, minimal Cr6+ generation, and cost per weld. This method is usable in any position, has a high metal deposition rate, and is relatively simple to learn and use.
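The per-unit-wire normalization is simple arithmetic: divide the Cr6+ generation rate by the wire feed mass rate. The feed rate in the sketch below is a made-up illustration, not a value reported by the study.

    def cr6_per_gram_wire(cr6_rate_ug_min, wire_feed_g_min):
        # ug of Cr6+ generated per gram of wire consumed
        return cr6_rate_ug_min / wire_feed_g_min

    # e.g. 7800 ug/min at a hypothetical 29 g/min feed -> ~269 ug/g,
    # consistent with the upper end of the reported 1-270 ug/g range.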
Generation and context memory.
Mulligan, Neil W; Lozito, Jeffrey P; Rosner, Zachary A
2006-07-01
Generation enhances memory for occurrence but may not enhance other aspects of memory. The present study further delineates the negative generation effect in context memory reported in N. W. Mulligan (2004). First, the negative generation effect occurred for perceptual attributes of the target item (its color and font) but not for extratarget aspects of context (location and background color). Second, nonvisual generation tasks with either semantic or nonsemantic generation rules (antonym and rhyme generation, respectively) produced the same pattern of results. In contrast, a visual (or data-driven) generation task (letter transposition) did not disrupt context memory for color. Third, generating nonwords produced no effect on item memory but persisted in producing a negative effect on context memory for target attributes, implying that (a) the negative generation effect in context memory is not mediated by semantic encoding, and (b) the negative effect on context memory can be dissociated from the positive effect on item memory. The results are interpreted in terms of the processing account of generation. The original, perceptual-conceptual version of this account is too narrow, but a modified processing account, based on a more generic visual versus nonvisual processing distinction, accommodates the results. Copyright 2006 APA, all rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosikhin, Ahmad, E-mail: a.rosikhin86@yahoo.co.id; Winata, Toto, E-mail: toto@fi.itb.ac.id
2016-04-19
Internal transmission profiles in the charge-carrier generation layer of a graphene/Si-based solar cell have been explored theoretically. The photovoltaic device was constructed from a graphene/Si heterojunction forming a multilayer stack, with Si as the generation layer. The graphene/Si sheet was layered on an ITO/glass wafer and then coated with Al, forming an Ohmic contact with the Si. Photons propagate from the glass substrate to the metal electrode, and it is assumed that there is no transmission in the Al layer. The wavelength range used in this calculation was 200-1000 nm. It is found that the transmission intensity in the generation layer shows non-linear behavior and is partitioned into a few regions related to the excitation process. Based on this information, it may be possible to optimize photon absorption and create more excitation processes by inserting an appropriate material that enhances the optical properties in certain wavelength ranges, because exciton generation is strongly influenced by photon absorption.
Solution-Processed Carbon Nanotube True Random Number Generator.
Gaviria Rojas, William A; McMorrow, Julian J; Geier, Michael L; Tang, Qianying; Kim, Chris H; Marks, Tobin J; Hersam, Mark C
2017-08-09
With the growing adoption of interconnected electronic devices in consumer and industrial applications, there is an increasing demand for robust security protocols when transmitting and receiving sensitive data. Toward this end, hardware true random number generators (TRNGs), commonly used to create encryption keys, offer significant advantages over software pseudorandom number generators. However, the vast network of devices and sensors envisioned for the "Internet of Things" will require small, low-cost, and mechanically flexible TRNGs with low computational complexity. These rigorous constraints position solution-processed semiconducting single-walled carbon nanotubes (SWCNTs) as leading candidates for next-generation security devices. Here, we demonstrate the first TRNG using static random access memory (SRAM) cells based on solution-processed SWCNTs that digitize thermal noise to generate random bits. This bit generation strategy can be readily implemented in hardware with minimal transistor and computational overhead, resulting in an output stream that passes standardized statistical tests for randomness. By using solution-processed semiconducting SWCNTs in a low-power, complementary architecture to achieve TRNG, we demonstrate a promising approach for improving the security of printable and flexible electronics.
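A toy model of the bit-generation principle: at latch release, an SRAM cell resolves according to the sign of device mismatch plus thermal noise, so with mismatch small relative to noise the output bit is unbiased. All parameters below are invented for illustration, and the crude monobit check stands in for the standardized statistical suites (e.g., NIST SP 800-22) used in practice.

    import numpy as np

    def sram_trng_bits(n_bits, mismatch_sigma=0.0, noise_sigma=1.0, seed=None):
        # Each bit = sign of (static device mismatch + fresh thermal noise).
        rng = np.random.default_rng(seed)
        mismatch = mismatch_sigma * rng.standard_normal(n_bits)
        noise = noise_sigma * rng.standard_normal(n_bits)
        return ((mismatch + noise) > 0).astype(np.uint8)

    bits = sram_trng_bits(100_000, seed=1)
    print(abs(bits.mean() - 0.5))   # monobit sanity check: should be near 0

Raising mismatch_sigma in this toy model shows why device-to-device variation biases individual cells, which is why real designs select or condition cells whose noise dominates.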
2008-07-01
... generation of process partitioning, thread pipelining becomes possible. In this paper we briefly summarize the requirements and trends for FADEC-based ... FADEC environment, presenting a hypothetical realization of an example application. Finally, we discuss the application of Time-Triggered ... based control applications of the future. Subject terms: gas turbine, FADEC, multi-core processing technology, distributed control.
Lee, Robert H; Bott, Marjorie J; Forbes, Sarah; Redford, Linda; Swagerty, Daniel L; Taunton, Roma Lee
2003-01-01
Understanding how quality improvement affects costs is important. Unfortunately, low-cost, reliable ways of measuring direct costs are scarce. This article builds on the principles of process improvement to develop a costing strategy that meets both criteria. Process-based costing has 4 steps: developing a flowchart, estimating resource use, valuing resources, and calculating direct costs. To illustrate the technique, this article uses it to cost the care planning process in 3 long-term care facilities. We conclude that process-based costing is easy to implement; generates reliable, valid data; and allows nursing managers to assess the costs of new or modified processes.
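Steps 2 through 4 of the strategy reduce to multiplying the resource quantities estimated from the flowchart by their unit values and summing. A minimal sketch, with invented care-planning resources and prices:

    def direct_cost(resource_use, unit_costs):
        # resource_use: {resource name: quantity from the flowchart};
        # unit_costs: {resource name: cost per unit}.
        return sum(qty * unit_costs[name] for name, qty in resource_use.items())

    # hypothetical care-planning example: 1.5 RN hours and 20 photocopies
    # cost = direct_cost({"rn_hours": 1.5, "copies": 20},
    #                    {"rn_hours": 32.0, "copies": 0.05})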
From empirical data to time-inhomogeneous continuous Markov processes.
Lencastre, Pedro; Raischel, Frank; Rogers, Tim; Lind, Pedro G
2016-03-01
We present an approach for testing for the existence of continuous generators of discrete stochastic transition matrices. Typically, existing methods to ascertain the existence of continuous Markov processes are based on the assumption that only time-homogeneous generators exist. Here a systematic extension to time inhomogeneity is presented, based on new mathematical propositions incorporating necessary and sufficient conditions, which are then implemented computationally and applied to numerical data. A discussion concerning the bridging between rigorous mathematical results on the existence of generators and their computational implementation is presented. Our detection algorithm proves effective in more than 60% of tested matrices, typically 80% to 90%, and for those an estimate of the (non-homogeneous) generator matrix follows. We also solve the embedding problem analytically for the particular case of three-dimensional circulant matrices. Finally, a discussion of possible applications of our framework to problems in different fields is briefly addressed.
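The classical time-homogeneous case that this work generalizes can be tested in a few lines: take the matrix logarithm of the transition matrix and check the generator conditions (zero row sums, non-negative off-diagonal entries). A sketch using SciPy, with an illustrative numerical tolerance; the paper's time-inhomogeneous criteria go beyond this check.

    import numpy as np
    from scipy.linalg import logm

    def homogeneous_generator(P, tol=1e-9):
        # Candidate generator Q = log(P); P embeds in a homogeneous
        # continuous-time chain iff Q is a valid rate matrix.
        Q = np.real(logm(P))
        rows_ok = np.allclose(Q.sum(axis=1), 0.0, atol=1e-6)
        off_diag = Q - np.diag(np.diag(Q))
        return Q if rows_ok and (off_diag >= -tol).all() else None

    # e.g. a lazy two-state chain is embeddable:
    # P = np.array([[0.9, 0.1], [0.2, 0.8]])
    # Q = homogeneous_generator(P)   # a valid rate matrix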
GENIE(++): A Multi-Block Structured Grid System
NASA Technical Reports Server (NTRS)
Williams, Tonya; Nadenthiran, Naren; Thornburg, Hugh; Soni, Bharat K.
1996-01-01
The computer code GENIE++ is a continuously evolving grid system containing a multitude of proven geometry/grid techniques. The generation process in GENIE++ is based on an earlier version. The process uses several techniques, either separately or in combination, to quickly and economically generate sculptured geometry descriptions and grids for arbitrary geometries. The computational mesh is formed by using an appropriate algebraic method. Grid clustering is accomplished with either exponential or hyperbolic tangent routines which allow the user to specify a desired point distribution. Grid smoothing can be accomplished by using an elliptic solver with proper forcing functions. B-spline and Non-Uniform Rational B-Spline (NURBS) algorithms are used for surface definition and redistribution. The built-in sculptured geometry definition with the desired distribution of points, automatic Bezier curve/surface generation for interior boundaries/surfaces, and surface redistribution are based on NURBS. Weighted Lagrange/Hermite transfinite interpolation methods, interactive geometry/grid manipulation modules, and on-line graphical visualization of the generation process are salient features of this system, which result in significant time savings for a given geometry/grid application.
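As an example of the hyperbolic-tangent clustering such grid routines provide, the sketch below computes a one-sided stretched point distribution on [0, 1]; the particular functional form and the beta value are generic illustrations, not GENIE++'s exact routine.

    import numpy as np

    def tanh_cluster(n, beta=2.5):
        # One-sided tanh stretching: larger beta packs points more
        # tightly toward x = 0 (e.g., for boundary-layer resolution).
        xi = np.linspace(0.0, 1.0, n)
        return 1.0 + np.tanh(beta * (xi - 1.0)) / np.tanh(beta)

    # tanh_cluster(11) -> small spacings near 0, growing toward 1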
Auto-Generated Semantic Processing Services
NASA Technical Reports Server (NTRS)
Davis, Rodney; Hupf, Greg
2009-01-01
Auto-Generated Semantic Processing (AGSP) Services is a suite of software tools for automated generation of other computer programs, denoted cross-platform semantic adapters, that support interoperability of computer-based communication systems that utilize a variety of both new and legacy communication software running in a variety of operating- system/computer-hardware combinations. AGSP has numerous potential uses in military, space-exploration, and other government applications as well as in commercial telecommunications. The cross-platform semantic adapters take advantage of common features of computer- based communication systems to enforce semantics, messaging protocols, and standards of processing of streams of binary data to ensure integrity of data and consistency of meaning among interoperating systems. The auto-generation aspect of AGSP Services reduces development time and effort by emphasizing specification and minimizing implementation: In effect, the design, building, and debugging of software for effecting conversions among complex communication protocols, custom device mappings, and unique data-manipulation algorithms is replaced with metadata specifications that map to an abstract platform-independent communications model. AGSP Services is modular and has been shown to be easily integrable into new and legacy NASA flight and ground communication systems.
Forman, Michael A; Young, Derek
2012-09-18
Examples of methods for generating data based on a communications channel are described. In one such example, a processing unit may generate a first vector representation based in part on at least two characteristics of a communications channel. A constellation having at least two dimensions may be addressed with the first vector representation to identify a first symbol associated with the first vector representation. The constellation represents a plurality of regions, each region associated with a respective symbol. The symbol may be used to generate data, which may be stored in an electronic storage medium and used as a cryptographic key, or as a spreading code or hopping sequence in a modulation technique.
Bird's-eye view on noise-based logic.
Kish, Laszlo B; Granqvist, Claes G; Horvath, Tamas; Klappenecker, Andreas; Wen, He; Bezrukov, Sergey M
2014-01-01
Noise-based logic is a practically deterministic logic scheme inspired by the randomness of neural spikes and uses a system of uncorrelated stochastic processes and their superposition to represent the logic state. We briefly discuss various questions such as (i) What does practical determinism mean? (ii) Is noise-based logic a Turing machine? (iii) Is there hope to beat (the dreams of) quantum computation by a classical physical noise-based processor, and what are the minimum hardware requirements for that? Finally, (iv) we address the problem of random number generators and show that the common belief that quantum number generators are superior to classical (thermal) noise-based generators is nothing but a myth.
Bird's-eye view on noise-based logic
NASA Astrophysics Data System (ADS)
Kish, Laszlo B.; Granqvist, Claes G.; Horvath, Tamas; Klappenecker, Andreas; Wen, He; Bezrukov, Sergey M.
2014-09-01
Noise-based logic is a practically deterministic logic scheme inspired by the randomness of neural spikes and uses a system of uncorrelated stochastic processes and their superposition to represent the logic state. We briefly discuss various questions such as (i) What does practical determinism mean? (ii) Is noise-based logic a Turing machine? (iii) Is there hope to beat (the dreams of) quantum computation by a classical physical noise-based processor, and what are the minimum hardware requirements for that? Finally, (iv) we address the problem of random number generators and show that the common belief that quantum number generators are superior to classical (thermal) noise-based generators is nothing but a myth.
NASA Technical Reports Server (NTRS)
Baker, C. R.
1975-01-01
Liquid hydrogen is being considered as a substitute for conventional hydrocarbon-based fuels for future generations of commercial jet aircraft. Its acceptance will depend, in part, upon the technology and cost of liquefaction. The process and economic requirements for providing a sufficient quantity of liquid hydrogen to service a major airport are described. The design is supported by thermodynamic studies which determine the effect of process arrangement and operating parameters on the process efficiency and work of liquefaction.
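The thermodynamic floor for such process studies is the ideal (reversible) work of liquefaction for a stream entering at ambient temperature. This standard relation is added here for orientation and is not taken from the paper:

    w_{\mathrm{ideal}} = T_0\,(s_1 - s_2) - (h_1 - h_2)

where state 1 is the gas at ambient temperature T_0, state 2 is the product liquid, and s and h are specific entropy and enthalpy. For hydrogen this ideal figure is roughly 4 kWh/kg, while practical liquefaction cycles require several times more; the gap between the two is what process-arrangement and operating-parameter studies of this kind try to narrow.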
Flexible Learning Itineraries Based on Conceptual Maps
ERIC Educational Resources Information Center
Agudelo, Olga Lucía; Salinas, Jesús
2015-01-01
The use of learning itineraries based on conceptual maps is studied in order to propose a more flexible instructional design that strengthens the learning process focused on the student, generating non-linear processes, characterising its elements, setting up relationships between them and shaping a general model with specifications for each…
Performance Testing of GPU-Based Approximate Matching Algorithm on Network Traffic
2015-03-01
Defense Department’s use. vi THIS PAGE INTENTIONALLY LEFT BLANK vii TABLE OF CONTENTS I. INTRODUCTION...22 D. GENERATING DIGESTS ............................................................................23 1. Reference...the-shelf GPU Graphical Processing Unit GPGPU General -Purpose Graphic Processing Unit HBSS Host-Based Security System HIPS Host Intrusion
A distributed data base management system. [for Deep Space Network
NASA Technical Reports Server (NTRS)
Bryan, A. I.
1975-01-01
Major system design features of a distributed data management system for the NASA Deep Space Network (DSN), designed for continuous two-way deep space communications, are described. The reasons for selecting a distributed data base utilizing third-generation minicomputers as the optimum approach for the DSN are threefold: (1) with a distributed master data base, valid data is available in real-time to support DSN management activities at each location; (2) data base integrity is the responsibility of local management; and (3) the data acquisition/distribution and processing power of a third-generation computer enables the computer to function successfully as a data handler or as an on-line process controller. The concept of the distributed data base is discussed along with the software, data base integrity, and hardware used. The data analysis/update constraint is examined.
Automatic generation of pictorial transcripts of video programs
NASA Astrophysics Data System (ADS)
Shahraray, Behzad; Gibbon, David C.
1995-03-01
An automatic authoring system for the generation of pictorial transcripts of video programs which are accompanied by closed caption information is presented. A number of key frames, each of which represents the visual information in a segment of the video (i.e., a scene), are selected automatically by performing a content-based sampling of the video program. The textual information is recovered from the closed caption signal and is initially segmented based on its implied temporal relationship with the video segments. The text segmentation boundaries are then adjusted, based on lexical analysis and/or caption control information, to account for synchronization errors due to possible delays in the detection of scene boundaries or the transmission of the caption information. The closed caption text is further refined through linguistic processing for conversion to lowercase with correct capitalization. The key frames and the related text generate a compact multimedia presentation of the contents of the video program which lends itself to efficient storage and transmission. This compact representation can be viewed on a computer screen, or used to generate the input to a commercial text processing package to generate a printed version of the program.
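The content-based sampling step described above can be illustrated with a simple frame-differencing sketch in Python; modeling frames as grayscale arrays and the particular scene-change threshold are assumptions for illustration.

    # Sketch of content-based sampling: pick a key frame whenever the
    # difference from the last key frame exceeds a threshold, i.e., at an
    # assumed scene boundary.
    import numpy as np

    def select_key_frames(frames, threshold=0.15):
        """Return indices of frames that start a new visually distinct segment."""
        keys = [0]                                    # first frame always kept
        for i in range(1, len(frames)):
            diff = np.mean(np.abs(frames[i] - frames[keys[-1]]))
            if diff > threshold:                      # large change => new scene
                keys.append(i)
        return keys

    # Example with a synthetic "video": two flat scenes with small noise.
    rng = np.random.default_rng(0)
    scene_a = [0.2 + 0.01 * rng.standard_normal((48, 64)) for _ in range(5)]
    scene_b = [0.8 + 0.01 * rng.standard_normal((48, 64)) for _ in range(5)]
    print(select_key_frames(scene_a + scene_b))       # -> [0, 5]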
An On-Demand Optical Quantum Random Number Generator with In-Future Action and Ultra-Fast Response
Stipčević, Mario; Ursin, Rupert
2015-01-01
Random numbers are essential for our modern information-based society, e.g., in cryptography. Unlike frequently used pseudo-random generators, physical random number generators do not depend on complex algorithms but rather on a physical process to provide true randomness. Quantum random number generators (QRNGs) rely on a process which, even in principle, can be described only by a probabilistic theory. Here we present a conceptually simple implementation, which offers 100% efficiency of producing a random bit upon request and simultaneously exhibits ultra low latency. A careful technical and statistical analysis demonstrates its robustness against imperfections of the actual implemented technology and enables quick estimation of the randomness of very long sequences. Generated random numbers pass standard statistical tests without any post-processing. The setup described, as well as the theory presented here, demonstrate the maturity and overall understanding of the technology. PMID:26057576
Secure ADS-B authentication system and method
NASA Technical Reports Server (NTRS)
Viggiano, Marc J (Inventor); Valovage, Edward M (Inventor); Samuelson, Kenneth B (Inventor); Hall, Dana L (Inventor)
2010-01-01
A secure system for authenticating the identity of ADS-B systems, including: an authenticator, including a unique id generator and a transmitter transmitting the unique id to one or more ADS-B transmitters; one or more ADS-B transmitters, including a receiver receiving the unique id, one or more secure processing stages merging the unique id with the ADS-B transmitter's identification, data and secret key and generating a secure code identification, and a transmitter transmitting a response containing the secure code and the ADS-B transmitter's data to the authenticator; the authenticator including means for independently determining each ADS-B transmitter's secret key, a receiver receiving each ADS-B transmitter's response, one or more secure processing stages merging the unique id, ADS-B transmitter's identification and data and generating a secure code, and comparison processing comparing the authenticator-generated secure code and the ADS-B transmitter-generated secure code and providing an authentication signal based on the comparison result.
Feller processes: the next generation in modeling. Brownian motion, Lévy processes and beyond.
Böttcher, Björn
2010-12-03
We present a simple construction method for Feller processes and a framework for the generation of sample paths of Feller processes. The construction is based on state space dependent mixing of Lévy processes. Brownian motion is one of the most frequently used continuous time Markov processes in applications. In recent years also Lévy processes, of which Brownian motion is a special case, have become increasingly popular. Lévy processes are spatially homogeneous, but empirical data often suggest the use of spatially inhomogeneous processes. Thus it seems necessary to go to the next level of generalization: Feller processes. These include Lévy processes, and in particular Brownian motion, as special cases but allow spatial inhomogeneities. Many properties of Feller processes are known, but proving their very existence is, in general, very technical. Moreover, an applicable framework for the generation of sample paths of a Feller process was missing. We explain, with practitioners in mind, how to overcome both of these obstacles. In particular, our simulation technique makes it possible to apply Monte Carlo methods to Feller processes.
Feller Processes: The Next Generation in Modeling. Brownian Motion, Lévy Processes and Beyond
Böttcher, Björn
2010-01-01
We present a simple construction method for Feller processes and a framework for the generation of sample paths of Feller processes. The construction is based on state space dependent mixing of Lévy processes. Brownian motion is one of the most frequently used continuous time Markov processes in applications. In recent years also Lévy processes, of which Brownian motion is a special case, have become increasingly popular. Lévy processes are spatially homogeneous, but empirical data often suggest the use of spatially inhomogeneous processes. Thus it seems necessary to go to the next level of generalization: Feller processes. These include Lévy processes, and in particular Brownian motion, as special cases but allow spatial inhomogeneities. Many properties of Feller processes are known, but proving their very existence is, in general, very technical. Moreover, an applicable framework for the generation of sample paths of a Feller process was missing. We explain, with practitioners in mind, how to overcome both of these obstacles. In particular, our simulation technique makes it possible to apply Monte Carlo methods to Feller processes. PMID:21151931
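The construction described, state-space-dependent mixing of Lévy processes, suggests a simple Euler-type path generator. The Python sketch below is a minimal illustration under assumed state-dependent diffusion and jump parameters; it is not the authors' framework.

    # Minimal Euler-type sketch: at each step the increment is drawn from a
    # Levy increment whose parameters depend on the current state. The
    # particular mixture (state-dependent volatility plus a state-dependent
    # compound Poisson jump part) is an illustrative assumption.
    import numpy as np

    def simulate_feller_path(x0=0.0, T=1.0, n=1000, seed=0):
        rng = np.random.default_rng(seed)
        dt = T / n
        x = np.empty(n + 1)
        x[0] = x0
        for k in range(n):
            sigma = 0.5 + 0.4 * np.tanh(x[k])        # state-dependent diffusion
            lam = 2.0 / (1.0 + x[k] ** 2)            # state-dependent jump rate
            jumps = rng.poisson(lam * dt)            # compound Poisson part
            jump_part = rng.normal(0, 0.3, jumps).sum() if jumps else 0.0
            x[k + 1] = x[k] + sigma * np.sqrt(dt) * rng.standard_normal() + jump_part
        return x

    # Monte Carlo use, as suggested in the abstract: average a functional.
    paths = [simulate_feller_path(seed=s)[-1] for s in range(200)]
    print(np.mean(paths))                             # estimate of E[X_T]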
Selecting Processes to Minimize Hexavalent Chromium from Stainless Steel Welding
KEANE, M.; SIERT, A.; STONE, S.; CHEN, B.; SLAVEN, J.; CUMPSTON, A.; ANTONINI, J.
2015-01-01
Eight welding process/shielding gas combinations were assessed for generation of hexavalent chromium (Cr6+) in stainless steel welding fumes. The processes examined were gas metal arc welding (GMAW) (axial spray, short circuit, and pulsed spray modes), flux cored arc welding (FCAW), and shielded metal arc welding (SMAW). The Cr6+ fractions were measured in the fumes; fume generation rates, Cr6+ generation rates, and Cr6+ generation rates per unit mass of welding wire were determined. A limited controlled comparison study was done in a welding shop, including SMAW, FCAW, and three GMAW methods. The processes studied were compared for costs, including relative labor costs. Results indicate the Cr6+ in the fume varied widely, from a low of 2800 to a high of 34,000 ppm. Generation rates of Cr6+ ranged from 69 to 7800 μg/min, and Cr6+ generation rates per unit of wire ranged from 1 to 270 μg/g. The results of the field study were similar to the laboratory findings. The Cr6+ concentration (ppm) in the fume did not necessarily correlate with the Cr6+ generation rate. Physical properties were similar for the GMAW processes, with mass median aerodynamic diameters ranging from 250 to 336 nm, while the FCAW and SMAW fumes were larger (360 and 670 nm, respectively). Conclusion: The pulsed axial spray method was the best choice of the processes studied, based on minimal fume generation, minimal Cr6+ generation, and cost per weld. This method is usable in any position, has a high metal deposition rate, and is relatively simple to learn and use. PMID:26690276
Feynman-Kac formula for stochastic hybrid systems.
Bressloff, Paul C
2017-01-01
We derive a Feynman-Kac formula for functionals of a stochastic hybrid system evolving according to a piecewise deterministic Markov process. We first derive a stochastic Liouville equation for the moment generator of the stochastic functional, given a particular realization of the underlying discrete Markov process; the latter generates transitions between different dynamical equations for the continuous process. We then analyze the stochastic Liouville equation using methods recently developed for diffusion processes in randomly switching environments. In particular, we obtain dynamical equations for the moment generating function, averaged with respect to realizations of the discrete Markov process. The resulting Feynman-Kac formula takes the form of a differential Chapman-Kolmogorov equation. We illustrate the theory by calculating the occupation time for a one-dimensional velocity jump process on the infinite or semi-infinite real line. Finally, we present an alternative derivation of the Feynman-Kac formula based on a recent path-integral formulation of stochastic hybrid systems.
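The worked example in the abstract, the occupation time of a one-dimensional velocity jump process, lends itself to a short Monte Carlo sketch. The Python fragment below simulates the piecewise deterministic flow with Markov velocity switching and estimates the moment generating function of the occupation time; all parameter values are illustrative assumptions.

    # Monte Carlo sketch of a 1-D velocity jump process: velocity flips
    # between +v and -v at rate r; the functional is the time spent in
    # x > 0, whose MGF is then estimated over sample paths.
    import numpy as np

    def occupation_time(T=10.0, v=1.0, r=1.0, dt=1e-3, seed=0):
        rng = np.random.default_rng(seed)
        x, vel, occ = 0.0, v, 0.0
        for _ in range(int(T / dt)):
            if rng.random() < r * dt:        # discrete state switches at rate r
                vel = -vel
            x += vel * dt                    # piecewise deterministic flow
            occ += dt if x > 0 else 0.0      # functional: time spent in x > 0
        return occ

    samples = np.array([occupation_time(seed=s) for s in range(100)])
    s_val = 0.1
    print(np.mean(np.exp(-s_val * samples)))  # MC estimate of E[exp(-s * occ)]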
Emerging Approach of Natural Language Processing in Opinion Mining: A Review
NASA Astrophysics Data System (ADS)
Kim, Tai-Hoon
Natural language processing (NLP) is a subfield of artificial intelligence and computational linguistics. It studies the problems of automated generation and understanding of natural human languages. This paper outlines a framework that uses computer and natural language techniques for learners at various levels to learn foreign languages in a computer-based learning environment. We propose some ideas for using the computer as a practical tool for learning foreign languages, where most of the courseware is generated automatically. We then describe how to build computer-based learning tools, discuss their effectiveness, and conclude with some possibilities for using on-line resources.
Compact solar autoclave based on steam generation using broadband light-harvesting nanoparticles.
Neumann, Oara; Feronti, Curtis; Neumann, Albert D; Dong, Anjie; Schell, Kevin; Lu, Benjamin; Kim, Eric; Quinn, Mary; Thompson, Shea; Grady, Nathaniel; Nordlander, Peter; Oden, Maria; Halas, Naomi J
2013-07-16
The lack of readily available sterilization processes for medicine and dentistry practices in the developing world is a major risk factor for the propagation of disease. Modern medical facilities in the developed world often use autoclave systems to sterilize medical instruments and equipment and process waste that could contain harmful contagions. Here, we show the use of broadband light-absorbing nanoparticles as solar photothermal heaters, which generate high-temperature steam for a standalone, efficient solar autoclave useful for sanitation of instruments or materials in resource-limited, remote locations. Sterilization was verified using a standard Geobacillus stearothermophilus-based biological indicator.
Reconfigurable environmentally adaptive computing
NASA Technical Reports Server (NTRS)
Coxe, Robin L. (Inventor); Galica, Gary E. (Inventor)
2008-01-01
Described are methods and apparatus, including computer program products, for reconfigurable environmentally adaptive computing technology. An environmental signal representative of an external environmental condition is received. A processing configuration is automatically selected, based on the environmental signal, from a plurality of processing configurations. A reconfigurable processing element is reconfigured to operate according to the selected processing configuration. In some examples, the environmental condition is detected and the environmental signal is generated based on the detected condition.
NASA Astrophysics Data System (ADS)
Kersevan, Borut Paul; Richter-Waş, Elzbieta
2013-03-01
The AcerMC Monte Carlo generator is dedicated to the generation of Standard Model background processes which were recognised as critical for the searches at the LHC, and whose generation was either unavailable or not straightforward so far. The program itself provides a library of the massive matrix elements (coded by MADGRAPH) and native phase space modules for generation of a set of selected processes. The hard process event can be completed by the initial and final state radiation, hadronisation and decays through the existing interface with either the PYTHIA, HERWIG or ARIADNE event generators and (optionally) TAUOLA and PHOTOS. Interfaces to all these packages are provided in the distribution version. The phase-space generation is based on the multi-channel self-optimising approach using the modified Kajantie-Byckling formalism for phase space construction, and further smoothing of the phase space was obtained by using a modified ac-VEGAS algorithm. An additional improvement in the recent versions is the inclusion of a consistent prescription for matching the matrix element calculations with parton showering for a select list of processes.
Catalogue identifier: ADQQ_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADQQ_v2_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 3853309
No. of bytes in distributed program, including test data, etc.: 68045728
Distribution format: tar.gz
Programming language: FORTRAN 77 with popular extensions (g77, gfortran)
Computer: All running Linux.
Operating system: Linux.
Classification: 11.2, 11.6.
External routines: CERNLIB (http://cernlib.web.cern.ch/cernlib/), LHAPDF (http://lhapdf.hepforge.org/)
Catalogue identifier of previous version: ADQQ_v1_0
Journal reference of previous version: Comput. Phys. Comm. 149 (2003) 142
Does the new version supersede the previous version?: Yes
Nature of problem: Despite a large repertoire of processes implemented for generation in event generators like PYTHIA [1] or HERWIG [2], a number of background processes crucial for studying the expected physics of the LHC experiments is missing. For some of these processes the matrix element expressions are rather lengthy, and/or to achieve a reasonable generation efficiency it is necessary to tailor the phase space selection procedure to the dynamics of the process. That is why it is not practical to imagine that any of the above general purpose generators will contain every, or even only observable, process which will occur in LHC collisions. A more practical solution can be found in a library of dedicated matrix-element-based generators, with standardised interfaces like that proposed in [3] to the more universal one which is used to complete the event generation.
Solution method: The AcerMC Event Generator provides a library of the matrix-element-based generators for several processes. The initial- and final-state showers, beam remnants and underlying events, fragmentation and remaining decays are supposed to be performed by the other universal generator to which this one is interfaced; we call it a supervising generator. The interfaces to PYTHIA 6.4, ARIADNE 4.1 and HERWIG 6.5, as such generators, are provided.
Provided is also an interface to the TAUOLA [4] and PHOTOS [5] packages for τ-lepton decays (including spin correlations treatment) and QED radiation in particle decays. At present, the following matrix-element-based processes have been implemented: gg,qq¯→tt¯bb¯; qq¯→W(→ℓν)bb¯; qq¯→W(→ℓν)tt¯; gg,qq¯→Z/γ∗(→ℓℓ)bb¯; gg,qq¯→Z/γ∗(→ℓℓ,νν,bb¯)tt¯; complete EW gg,qq¯→(Z/W/γ∗→)tt¯bb¯; gg,qq¯→tt¯tt¯; gg,qq¯→(tt¯→)ff¯bff¯b¯; gg,qq¯→(WWbb→)ff¯ff¯bb¯. Both interfaces allow the use of the LHAPDF/LHAGLUE library of parton density functions. Provided is also a set of control processes: qq¯→W→ℓν; qq¯→Z/γ∗→ℓℓ; gg,qq¯→tt¯; and gg→(tt¯→)WbWb¯.
Reasons for new version: Implementation of several new processes and methods.
Summary of revisions: Each version added new processes or functionalities; a detailed list is given in the section "Changes since AcerMC 1.0".
Restrictions: The package is optimised for 14 TeV pp collisions simulated in the LHC environment and also works at the achieved LHC energies of 7 TeV and 8 TeV. The consistency between results of the complete generation using the PYTHIA 6.4 or HERWIG 6.5 interfaces is technically limited by the different approaches taken in these two generators for evaluating the αQCD and αQED couplings and by their different fragmentation/hadronisation models. For consistency checks, the AcerMC library contains natively coded definitions of αQCD and αQED; using these native definitions leads to the same total cross-sections with both the PYTHIA 6.4 and HERWIG 6.5 interfaces.
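The multi-channel self-optimising sampling mentioned above can be illustrated on a toy integrand. The Python sketch below implements a Kleiss-Pittau-style channel-weight update on a two-peak one-dimensional function; the integrand and the two Beta-distribution channels are stand-ins for matrix-element peaks and phase-space channels, not AcerMC's actual modules.

    # Multi-channel importance sampling with self-optimising channel
    # weights: sample a channel with probability alpha_i, weight by the
    # mixture density, then reweight channels by variance contribution.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    f = lambda x: 1.0 / ((x - 0.2) ** 2 + 1e-3) + 1.0 / ((x - 0.8) ** 2 + 1e-3)
    channels = [stats.beta(2, 8), stats.beta(8, 2)]   # peaked near 0.2 and 0.8
    alpha = np.array([0.5, 0.5])

    for it in range(5):                               # self-optimising iterations
        n = 20_000
        idx = rng.choice(2, size=n, p=alpha)          # pick a channel per point
        x = np.concatenate([channels[i].rvs(size=(idx == i).sum(), random_state=rng)
                            for i in range(2)])
        gx = sum(a * c.pdf(x) for a, c in zip(alpha, channels))
        w = f(x) / gx                                 # importance weights
        # Kleiss-Pittau update: reweight channels by variance contribution.
        W = np.array([np.mean(w ** 2 * c.pdf(x) / gx) for c in channels])
        alpha = alpha * np.sqrt(W)
        alpha /= alpha.sum()
        print(it, w.mean(), alpha)                    # integral estimate stabilises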
Capodaglio, Andrea G; Bojanowska-Czajka, Anna; Trojanowicz, Marek
2018-04-18
Carbamazepine and diclofenac are two examples of drugs that have proliferated widely across geographical regions and environmental media and are poorly removed by traditional wastewater treatment processes. Advanced oxidation processes (AOPs) have been proposed as alternative methods to remove these compounds in solution. AOPs are based on a wide class of powerful technologies, including UV radiation, ozone, hydrogen peroxide, the Fenton process, catalytic wet peroxide oxidation, heterogeneous photocatalysis, electrochemical oxidation and their combinations, sonolysis, and microwaves, applicable to both water and wastewater. These processes rely on the production of oxidizing radicals (•OH and others) in solution to decompose the pollutants present. Water radiolysis-based processes, an alternative to the former, involve the use of concentrated energy (beams of accelerated electrons or γ-rays) to split water molecules, generating strong oxidants and reductants (radicals) at the same time. In this paper, the degradation of carbamazepine and diclofenac by means of all these processes is discussed and compared. Energy and byproduct generation issues are also addressed.
3D Surface Generation from Aerial Thermal Imagery
NASA Astrophysics Data System (ADS)
Khodaei, B.; Samadzadegan, F.; Dadras Javan, F.; Hasani, H.
2015-12-01
Aerial thermal imagery has recently been applied to quantitative analysis of several scenes. For mapping purposes based on aerial thermal imagery, a high-accuracy photogrammetric process is necessary. However, due to the low geometric resolution and low contrast of thermal imaging sensors, there are some challenges in precise 3D measurement of objects. In this paper the potential of thermal video in 3D surface generation is evaluated. In the pre-processing step, the thermal camera is geometrically calibrated using a calibration grid, based on emissivity differences between the background and the targets. Then, Digital Surface Model (DSM) generation from thermal video imagery is performed in four steps. Initially, frames are extracted from the video; then tie points are generated by the Scale-Invariant Feature Transform (SIFT) algorithm. Bundle adjustment is then applied and the camera position and orientation parameters are determined. Finally, a multi-resolution dense image matching algorithm is used to create a 3D point cloud of the scene. The potential of the proposed method is evaluated on thermal imagery covering an industrial area. The thermal camera has a 640×480 Uncooled Focal Plane Array (UFPA) sensor, equipped with a 25 mm lens, and is mounted on an Unmanned Aerial Vehicle (UAV). The obtained results show that the accuracy of the 3D model generated from thermal images is comparable to that of the DSM generated from visible images; however, the thermal-based DSM is somewhat smoother, with a lower level of texture. Comparing the generated DSM with 9 measured GCPs in the area shows that the Root Mean Square Error (RMSE) is smaller than 5 decimetres in both the X and Y directions, and 1.6 metres in the Z direction.
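The tie-point step of the pipeline above can be sketched with standard tools. The following Python fragment (using OpenCV, which provides SIFT in opencv-python 4.4 and later) detects and matches keypoints between two extracted frames with Lowe's ratio test; the frame file names are assumptions.

    # SIFT tie points between two extracted video frames; the matched
    # coordinate pairs would feed the bundle adjustment step.
    import cv2

    img1 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)  # assumed frames
    img2 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(des1, des2, k=2)
    tie_points = [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt)
                  for m, n in matches if m.distance < 0.75 * n.distance]
    print(len(tie_points), "tie points")  # inputs to bundle adjustment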
Measurement-based quantum communication with resource states generated by entanglement purification
NASA Astrophysics Data System (ADS)
Wallnöfer, J.; Dür, W.
2017-01-01
We investigate measurement-based quantum communication with noisy resource states that are generated by entanglement purification. We consider the transmission of encoded information via noisy quantum channels using a measurement-based implementation of encoding, error correction, and decoding. We show that such an approach offers advantages over direct transmission, gate-based error correction, and measurement-based schemes with direct generation of resource states. We analyze the noise structure of resource states generated by entanglement purification and show that a local error model, i.e., noise acting independently on all qubits of the resource state, is a good approximation in general, and provides an exact description for Greenberger-Horne-Zeilinger states. The latter are resources for a measurement-based implementation of error-correction codes for bit-flip or phase-flip errors. This provides an approach to link the recently found very high thresholds for fault-tolerant measurement-based quantum information processing based on local error models for resource states with error thresholds for gate-based computational models.
An ontology model for nursing narratives with natural language generation technology.
Min, Yul Ha; Park, Hyeoun-Ae; Jeon, Eunjoo; Lee, Joo Yun; Jo, Soo Jung
2013-01-01
The purpose of this study was to develop an ontology model to generate nursing narratives that are as natural as human language from the entity-attribute-value triplets of a detailed clinical model, using natural language generation technology. The model was based on the types of information and their documentation time along the nursing process. The types of information are data characterizing the patient status, inferences made by the nurse from the patient data, and nursing actions selected by the nurse to change the patient status. This information was linked to the nursing process based on the time of documentation. We describe a case study illustrating the application of this model in an acute-care setting. The proposed model provides a strategy for designing an electronic nursing record system.
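A template-based sketch can make the generation step concrete: entity-attribute-value triplets, tagged with the type of information the model distinguishes (patient data, nurse inference, nursing action), are rendered into narrative sentences. The templates and triplets below are hypothetical; a real system would draw them from the ontology.

    # Hypothetical template-based narrative generation from EAV triplets,
    # keyed by the information type (data / inference / action).
    TEMPLATES = {
        "data":      "Observed {entity} {attribute} of {value}.",
        "inference": "Assessed {entity} as {value} based on {attribute}.",
        "action":    "Performed {value} to address {entity} {attribute}.",
    }

    def generate_narrative(triplets):
        return " ".join(TEMPLATES[t].format(entity=e, attribute=a, value=v)
                        for t, (e, a, v) in triplets)

    triplets = [
        ("data",      ("blood pressure", "systolic reading", "150 mmHg")),
        ("inference", ("patient",        "vital signs",      "hypertensive")),
        ("action",    ("blood pressure", "elevation",        "medication administration")),
    ]
    print(generate_narrative(triplets))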
Saito, Kyosuke; Tanabe, Tadao; Oyama, Yutaka
2014-07-14
Terahertz (THz) wave generation via a difference frequency mixing (DFM) process in strained silicon membrane waveguides, created by introducing a straining layer, is theoretically investigated. The Si3N4 straining layer induces anisotropic compressive strain in the silicon core, and the resulting breaking of the crystal symmetry gives rise to a bulk second order nonlinear susceptibility χ(2). We propose waveguide structures for THz wave generation under the DFM process by using the modal birefringence in the waveguide core. Our simulations show that an output power of up to 0.95 mW can be achieved at 9.09 THz. The strained silicon optical device may open a window in the field of silicon-based active THz photonic device applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Myronakis, M; Cai, W; Dhou, S
Purpose: To design a comprehensive open-source, publicly available, graphical user interface (GUI) to facilitate the configuration, generation, processing and use of the 4D Extended Cardiac-Torso (XCAT) phantom. Methods: The XCAT phantom includes over 9000 anatomical objects as well as respiratory, cardiac and tumor motion. It is widely used for research studies in medical imaging and radiotherapy. The phantom generation process involves the configuration of a text script to parameterize the geometry, motion, and composition of the whole body and objects within it, and to generate simulated PET or CT images. To avoid the need for manual editing or script writing, our MATLAB-based GUI uses slider controls, drop-down lists, buttons and graphical text input to parameterize and process the phantom. Results: Our GUI can be used to: a) generate parameter files; b) generate the voxelized phantom; c) combine the phantom with a lesion; d) display the phantom; e) produce average and maximum intensity images from the phantom output files; f) incorporate irregular patient breathing patterns; and g) generate DICOM files containing phantom images. The GUI provides local help information using tool-tip strings on the currently selected phantom, minimizing the need for external documentation. The DICOM generation feature is intended to simplify the process of importing the phantom images into radiotherapy treatment planning systems or other clinical software. Conclusion: The GUI simplifies and automates the use of the XCAT phantom for imaging-based research projects in medical imaging or radiotherapy. This has the potential to accelerate research conducted with the XCAT phantom, and to ease the learning curve for new users. This tool does not include the XCAT phantom software itself. We would like to acknowledge funding from MRA and Varian Medical Systems Inc.
Nakajima, Toshiyuki
2017-12-01
Evolution by natural selection requires the following conditions: (1) a particular selective environment; (2) variation of traits in the population; (3) differential survival/reproduction among the types of organisms; and (4) heritable traits. However, the traditional (standard) model does not clearly explain how and why these conditions are generated or determined. What generates a selective environment? What generates new types? How does a certain type replace, or coexist with, others? In this paper, based on the holistic philosophy of Western and Eastern traditions, I focus on the ecosystem as a higher-level system and generator of conditions that induce the evolution of component populations; I also aim to identify the ecosystem processes that generate those conditions. In particular, I employ what I call the scientific principle of dependent-arising (SDA), which is tailored for scientific use and is based on the Buddhist principle called "pratītya-samutpāda" in Sanskrit. The SDA principle asserts that there exists a higher-level system, or entity, which includes a focal process of a system as a part within it; this determines or generates the conditions required for the focal process to work in a particular way. I conclude that the ecosystem generates (1) selective environments for component species through ecosystem dynamics; (2) new genetic types through lateral gene transfer, hybridization, and symbiogenesis among the component species of the ecosystem; and (3) mechanistic processes of replacement of an old type with a new one. The results of this study indicate that an ecological extension of the theoretical model of adaptive evolution is required for a better understanding of adaptive evolution. Copyright © 2017 Elsevier Ltd. All rights reserved.
PMR polyimides-review and update
NASA Technical Reports Server (NTRS)
Serafini, T. T.; Delvigs, P.; Alston, W. B.
1982-01-01
Fiber reinforced PMR polyimides are finding increased acceptance as engineering materials for high performance structural applications. Prepreg materials based on this novel class of highly processable, high temperature resistant polyimides are commercially available, and the PMR concept is used by other investigators. The current status of first and second generation PMR polyimides is reviewed. Emphasis is given to the chemistry, processing and applications of the first generation material known as PMR-15.
Test Generation Algorithm for Fault Detection of Analog Circuits Based on Extreme Learning Machine
Zhou, Jingyu; Tian, Shulin; Yang, Chenglin; Ren, Xuelong
2014-01-01
This paper proposes a novel test generation algorithm based on extreme learning machine (ELM); the algorithm is cost-effective and low-risk for the analog device under test (DUT). This method uses test patterns derived from the test generation algorithm to stimulate the DUT, and then samples output responses of the DUT for fault classification and detection. The novel ELM-based test generation algorithm proposed in this paper contains three main innovations. Firstly, the algorithm saves time efficiently by classifying the response space with ELM. Secondly, the algorithm efficiently avoids a loss of test precision when the number of impulse-response samples is reduced. Thirdly, a new test signal generation process and a test structure for the test generation algorithm are presented, and both of them are very simple. Finally, the abovementioned improvements and functioning are confirmed in experiments. PMID:25610458
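The ELM at the heart of the algorithm has a compact canonical form: random hidden-layer weights followed by a closed-form least-squares solve for the output weights. The Python sketch below shows that core on synthetic data standing in for sampled DUT responses; it is not the paper's full test generation pipeline.

    # Minimal extreme learning machine: random input weights, tanh hidden
    # layer, pseudo-inverse solve for the output weights.
    import numpy as np

    rng = np.random.default_rng(0)

    def elm_train(X, y, hidden=50):
        W = rng.normal(size=(X.shape[1], hidden))     # random input weights
        b = rng.normal(size=hidden)
        H = np.tanh(X @ W + b)                        # hidden-layer activations
        beta = np.linalg.pinv(H) @ y                  # least-squares solution
        return W, b, beta

    def elm_predict(X, W, b, beta):
        return np.tanh(X @ W + b) @ beta

    # Toy two-class problem standing in for fault / no-fault responses.
    X = rng.normal(size=(200, 10))
    y = (X[:, 0] + X[:, 1] > 0).astype(float)
    W, b, beta = elm_train(X, y)
    acc = np.mean((elm_predict(X, W, b, beta) > 0.5) == (y > 0.5))
    print(f"training accuracy: {acc:.2f}")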
Materials for Better Li-based Storage Systems for a "Green Energy Society"
Jean-Marie Tarascon
2017-12-09
Li-ion batteries are strongly considered for powering the upcoming generations of HEVs and PHEVs, but there are still issues of safety and costs in terms of materials resources and abundances, synthesis, and recycling processes. Notions of materials having a minimum footprint in nature, made via eco-efficient processes, must be integrated into our new research towards the next generation of sustainable and "greener" Li-ion batteries. In this July 13, 2009 talk sponsored by Berkeley Lab's Environmental Energy Technologies Division, Jean-Marie Tarascon, a professor at the University of Picardie (Amiens), discusses eco-efficient synthesis via hydrothermal/solvothermal processes using latent bases as well as structure-directing templates and other bio-related approaches to LiFePO4 nanopowders.
Pathways for Disposal of Commercially-Generated Tritiated Waste
DOE Office of Scientific and Technical Information (OSTI.GOV)
Halverson, Nancy V.
From a waste disposal standpoint, tritium is a major challenge. Because it behaves like hydrogen, tritium exchanges readily with hydrogen in ground water and moves easily through the ground. Land disposal sites must control the tritium activity and mobility of incoming wastes to protect human health and the environment. Consequently, disposal of tritiated low-level wastes is highly regulated and disposal options are limited. The United States has had eight operating commercial facilities licensed for low-level radioactive waste disposal, only four of which are currently receiving waste. Each of these is licensed and regulated by its state. Only two of these sites accept waste from states outside of their specified regional compact. For waste streams that cannot be disposed directly at one of the four active commercial low-level waste disposal facilities, processing facilities offer various forms of tritiated low-level waste processing and treatment, and then transport and dispose of the residuals at a disposal facility. These processing facilities may remove and recycle tritium, reduce waste volume, solidify liquid waste, remove hazardous constituents, or perform a number of additional treatments. Waste brokers also offer many low-level and mixed waste management and transportation services. These services can be especially helpful for small-quantity tritiated-waste generators, such as universities, research institutions, medical facilities, and some industries. The information contained in this report covers general capabilities and requirements for the various disposal/processing facilities and brokerage companies, but is not considered exhaustive. Typically, each facility has extensive waste acceptance criteria and will require a generator to thoroughly characterize their wastes. A contractual agreement between the waste generator and the disposal/processing/broker entity must then be in place before waste is accepted. Costs for tritiated waste transportation, processing and disposal vary based on a number of factors. In many cases, wastes with very low radioactivity are priced primarily based on weight or volume. For higher activities, costs are based on both volume and activity, with the activity-based charges usually being much larger than volume-based charges. Other factors affecting cost include location, waste classification and form, other hazards in the waste, etc. Costs may be based on general guidelines used by an individual disposal or processing site, but final costs are established by specific contract with each generator. For this report, seven hypothetical waste streams intended to represent commercially-generated tritiated waste were defined in order to calculate comparative costs. Ballpark costs for disposition of these hypothetical waste streams were calculated. These costs ranged from thousands to millions of dollars. Due to the complexity of the cost-determining factors mentioned above, the costs calculated in this report should be understood to represent very rough cost estimates for the various hypothetical wastes. Actual costs could be higher, or could be lower due to quantity discounts or other factors.
Advanced Coal-Based Power Generations
NASA Technical Reports Server (NTRS)
Robson, F. L.
1982-01-01
Advanced power-generation systems using coal-derived fuels are evaluated in two-volume report. Report considers fuel cells, combined gas- and steam-turbine cycles, and magnetohydrodynamic (MHD) energy conversion. Presents technological status of each type of system and analyzes performance of each operating on medium-Btu fuel gas, either delivered via pipeline to powerplant or generated by coal-gasification process at plantsite.
GPU-based efficient realistic techniques for bleeding and smoke generation in surgical simulators.
Halic, Tansel; Sankaranarayanan, Ganesh; De, Suvranu
2010-12-01
In actual surgery, smoke and bleeding due to cauterization processes provide important visual cues to the surgeon, which have been proposed as factors in surgical skill assessment. While several virtual reality (VR)-based surgical simulators have incorporated the effects of bleeding and smoke generation, they are not realistic due to the requirement of real-time performance. To be interactive, the visual update must be performed at a rate of at least 30 Hz and haptic (touch) information must be refreshed at 1 kHz. Smoke and bleeding are, therefore, either ignored or simulated using highly simplified techniques, since other computationally intensive processes compete for the available Central Processing Unit (CPU) resources. In this study we developed a novel low-cost method to generate realistic bleeding and smoke in VR-based surgical simulators, which outsources the computations to the graphics processing unit (GPU), thus freeing up the CPU for other time-critical tasks. This method is independent of the complexity of the organ models in the virtual environment. User studies were performed using 20 subjects to determine the visual quality of the simulations compared to real surgical videos. The smoke and bleeding simulations were implemented as part of a laparoscopic adjustable gastric banding (LAGB) simulator. For the bleeding simulation, the original implementation using the shader did not incur noticeable overhead. However, for smoke generation, an input/output (I/O) bottleneck was observed, and two different methods were developed to overcome this limitation. Based on our benchmark results, a buffered approach performed better than a pipelined approach and could support up to 15 video streams in real time. Human subject studies showed that the visual realism of the simulations was as good as in real surgery (median rating of 4 on a 5-point Likert scale). Based on the performance results and the subject study, both the bleeding and smoke simulations were concluded to be efficient, highly realistic and well suited to VR-based surgical simulators. Copyright © 2010 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Nemoto, Mitsutaka; Hayashi, Naoto; Hanaoka, Shouhei; Nomura, Yukihiro; Miki, Soichiro; Yoshikawa, Takeharu; Ohtomo, Kuni
2016-03-01
The purpose of this study is to evaluate the feasibility of a novel feature generation method, based on multiple deep neural networks (DNNs) with boosting, for computer-assisted detection (CADe). It is hard and time-consuming to optimize the hyperparameters for DNNs such as the stacked denoising autoencoder (SdA). The proposed method allows using SdA-based features without the burden of hyperparameter setting. The proposed method was evaluated in an application for detecting cerebral aneurysms on magnetic resonance angiograms (MRA). A baseline CADe process included four components: scaling, candidate area limitation, candidate detection, and candidate classification. The proposed feature generation method was applied to extract the optimal features for candidate classification. The proposed method only required setting the ranges of the hyperparameters for SdA. The optimal feature set was selected from a large quantity of SdA-based features by multiple SdAs, each of which was trained using a different hyperparameter set. The feature selection was performed through the AdaBoost ensemble learning method. Training of the baseline CADe process and the proposed feature generation was performed with 200 MRA cases, and the evaluation was performed with 100 MRA cases. The proposed method successfully provided SdA-based features given only the ranges of some hyperparameters for SdA. The CADe process using both previous voxel features and SdA-based features had the best performance, with 0.838 for the area under the ROC curve and 0.312 for the ANODE score. The results showed that the proposed method was effective in the application for detecting cerebral aneurysms on MRA.
Small-Size High-Current Generators for X-Ray Backlighting
NASA Astrophysics Data System (ADS)
Chaikovsky, S. A.; Artyomov, A. P.; Zharova, N. V.; Zhigalin, A. S.; Lavrinovich, I. V.; Oreshkin, V. I.; Ratakhin, N. A.; Rousskikh, A. G.; Fedunin, A. V.; Fedushchak, V. F.; Erfort, A. A.
2017-12-01
The paper deals with soft X-ray backlighting based on the X-pinch as a powerful tool for physical studies of fast processes. We propose unique small-size pulsed power generators operating as low-inductance capacitor banks. These pulse generators provide an X-pinch-based soft X-ray source (hν = 1-10 keV) of micron size at 2-3 ns pulse duration. The small size and weight of the pulse generators allow them to be transported to any laboratory for conducting X-ray backlighting of test objects with micron spatial resolution and nanosecond exposure time. These generators also allow creating synchronized multi-frame radiographic complexes with frame delay variation over a broad range.
Solar thermochemical splitting of water to generate hydrogen
Rao, C. N. R.; Dey, Sunita
2017-01-01
Solar photochemical means of splitting water (artificial photosynthesis) to generate hydrogen is emerging as a viable process. The solar thermochemical route also promises to be an attractive means of achieving this objective. In this paper we present different types of thermochemical cycles that one can use for the purpose. These include the low-temperature multistep process as well as the high-temperature two-step process. It is noteworthy that the multistep process based on the Mn(II)/Mn(III) oxide system can be carried out at 700 °C or 750 °C. The two-step process has been achieved at 1,300 °C/900 °C by using yttrium-based rare earth manganites. It seems possible to render this high-temperature process as an isothermal process. Thermodynamics and kinetics of H2O splitting are largely controlled by the inherent redox properties of the materials. Interestingly, under the conditions of H2O splitting in the high-temperature process CO2 can also be decomposed to CO, providing a feasible method for generating the industrially important syngas (CO+H2). Although carbonate formation can be addressed as a hurdle during CO2 splitting, the problem can be avoided by a suitable choice of experimental conditions. The choice of the solar reactor holds the key for the commercialization of thermochemical fuel production. PMID:28522461
High-brightness line generators and fiber-coupled sources based on low-smile laser diode arrays
NASA Astrophysics Data System (ADS)
Watson, J.; Schleuning, D.; Lavikko, P.; Alander, T.; Lee, D.; Lovato, P.; Winhold, H.; Griffin, M.; Tolman, S.; Liang, P.; Hasenberg, T.; Reed, M.
2008-02-01
We describe the performance of diode laser bars mounted on conductive and water cooled platforms using low smile processes. Total smile of <1μm is readily achieved on both In and AuSn based platforms. Combined with environmentally robust lensing, these mounts form the basis of multiple, high-brightness products. Free-space-coupled devices utilizing conductively-cooled bars delivering 100W from a 200μm, 0.22NA fiber at 976nm have been developed for pumping fiber lasers, as well as for materials processing. Additionally, line generators for graphics and materials processing applications have been produced. Starting from single bars mounted on water-cooled packages that do not require de-ionized or pH-controlled water, these line generators deliver over 80W of power into a line with an aspect ratio of 600:1, and have a BPP of <2mm-mrad in the direction orthogonal to the line.
Shukla, Nagesh; Keast, John E; Ceglarek, Darek
2014-10-01
The modelling of complex workflows is an important problem-solving technique within healthcare settings. However, currently most of the workflow models use a simplified flow chart of patient flow obtained using on-site observations, group-based debates and brainstorming sessions, together with historic patient data. This paper presents a systematic and semi-automatic methodology for knowledge acquisition with detailed process representation using sequential interviews of people in the key roles involved in the service delivery process. The proposed methodology allows the modelling of roles, interactions, actions, and decisions involved in the service delivery process. This approach is based on protocol generation and analysis techniques such as: (i) initial protocol generation based on qualitative interviews of radiology staff, (ii) extraction of key features of the service delivery process, (iii) discovering the relationships among the key features extracted, and, (iv) a graphical representation of the final structured model of the service delivery process. The methodology is demonstrated through a case study of a magnetic resonance (MR) scanning service-delivery process in the radiology department of a large hospital. A set of guidelines is also presented in this paper to visually analyze the resulting process model for identifying process vulnerabilities. A comparative analysis of different workflow models is also conducted. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Real-time stereo matching using orthogonal reliability-based dynamic programming.
Gong, Minglun; Yang, Yee-Hong
2007-03-01
A novel algorithm is presented in this paper for estimating reliable stereo matches in real time. Based on the dynamic programming-based technique we previously proposed, the new algorithm can generate semi-dense disparity maps using as few as two dynamic programming passes. The iterative best-path tracing process used in traditional dynamic programming is replaced by a local minimum searching process, making the algorithm suitable for parallel execution. Most computations are implemented on programmable graphics hardware, which improves the processing speed and makes real-time estimation possible. The experiments on the four new Middlebury stereo datasets show that, on an ATI Radeon X800 card, the presented algorithm can produce reliable matches for approximately 60-80% of pixels at a rate of 10-20 frames per second. If needed, the algorithm can be configured for generating full-density disparity maps.
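A scanline dynamic-programming pass of the kind the algorithm builds on can be sketched briefly. The following Python fragment aggregates matching costs with a disparity-smoothness penalty and reads out per-pixel minima (omitting the reliability test, the orthogonal second pass, and the GPU implementation); inputs are synthetic rectified arrays.

    # Scanline DP stereo: per-line cost aggregation with a |d - d'|
    # smoothness term, then a per-pixel minimum-cost disparity readout.
    import numpy as np

    def dp_scanline_disparity(left, right, max_d=16, penalty=0.1):
        h, w = left.shape
        disp = np.zeros((h, w), dtype=int)
        for y in range(h):
            # Matching cost C[x, d] = |L(x) - R(x - d)| (invalid -> large).
            C = np.full((w, max_d), 1e9)
            for d in range(max_d):
                C[d:, d] = np.abs(left[y, d:] - right[y, : w - d])
            # Forward pass: accumulate cost along the scanline.
            D = C.copy()
            for x in range(1, w):
                for d in range(max_d):
                    step = D[x - 1] + penalty * np.abs(np.arange(max_d) - d)
                    D[x, d] = C[x, d] + step.min()
            disp[y] = D.argmin(axis=1)       # local-minimum readout
        return disp

    left = np.random.default_rng(0).random((10, 40))
    right = np.roll(left, -3, axis=1)        # synthetic 3-pixel shift
    print(np.median(dp_scanline_disparity(left, right)))  # ~3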
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jagtiani, Ashish V.; Miyazoe, Hiroyuki; Chang, Josephine
2016-01-15
The ability to achieve atomic layer precision is the utmost goal in the implementation of atomic layer etch technology. Carbon-based materials such as carbon nanotubes (CNTs) and graphene are single atomic layers of carbon with unique properties and, as such, represent the ultimate candidates for studying the ability to process with atomic layer precision and for assessing the impact of plasma damage on atomic layer materials. In this work, the authors use these materials to evaluate the atomic layer processing capabilities of electron beam generated plasmas. First, the authors evaluate damage to semiconducting CNTs when exposed to beam-generated plasmas and compare these results against those obtained using typical plasmas used in semiconductor processing. The authors find that the beam-generated plasma resulted in significantly lower current degradation in comparison to typical plasmas. Next, the authors evaluated the use of electron beam generated plasmas to process graphene-based devices by functionalizing graphene with fluorine, nitrogen, or oxygen to facilitate atomic layer deposition (ALD). The authors found that all adsorbed species resulted in successful ALD, with varying impact on the transconductance of the graphene. Furthermore, the authors compare the ability of both the beam-generated plasma and a conventional low-ion-energy inductively coupled plasma (ICP) to remove silicon nitride (SiN) deposited on top of the graphene films. The results indicate that, while both systems can remove SiN, the beam-generated plasma increased the D/G ratio from 0.08 for unprocessed graphene to 0.22-0.26, while the ICP yielded values from 0.52 to 1.78. Generally, while some plasma-induced damage was seen for both plasma sources, a much wider process window as well as far less damage to CNTs and graphene was observed when using electron beam generated plasmas.
Model Based Document and Report Generation for Systems Engineering
NASA Technical Reports Server (NTRS)
Delp, Christopher; Lam, Doris; Fosse, Elyse; Lee, Cin-Young
2013-01-01
As Model Based Systems Engineering (MBSE) practices gain adoption, various approaches have been developed in order to simplify and automate the process of generating documents from models. Essentially, all of these techniques can be unified around the concept of producing different views of the model according to the needs of the intended audience. In this paper, we will describe a technique developed at JPL of applying SysML Viewpoints and Views to generate documents and reports. An architecture of model-based view and document generation will be presented, and the necessary extensions to SysML with associated rationale will be explained. A survey of examples will highlight a variety of views that can be generated, and will provide some insight into how collaboration and integration is enabled. We will also describe the basic architecture for the enterprise applications that support this approach.
Model based document and report generation for systems engineering
NASA Astrophysics Data System (ADS)
Delp, C.; Lam, D.; Fosse, E.; Lee, Cin-Young
As Model Based Systems Engineering (MBSE) practices gain adoption, various approaches have been developed in order to simplify and automate the process of generating documents from models. Essentially, all of these techniques can be unified around the concept of producing different views of the model according to the needs of the intended audience. In this paper, we will describe a technique developed at JPL of applying SysML Viewpoints and Views to generate documents and reports. An architecture of model-based view and document generation will be presented, and the necessary extensions to SysML with associated rationale will be explained. A survey of examples will highlight a variety of views that can be generated, and will provide some insight into how collaboration and integration is enabled. We will also describe the basic architecture for the enterprise applications that support this approach.
Rule Based Category Learning in Patients with Parkinson’s Disease
Price, Amanda; Filoteo, J. Vincent; Maddox, W. Todd
2009-01-01
Measures of explicit rule-based category learning are commonly used in neuropsychological evaluation of individuals with Parkinson’s disease (PD) and the pattern of PD performance on these measures tends to be highly varied. We review the neuropsychological literature to clarify the manner in which PD affects the component processes of rule-based category learning and work to identify and resolve discrepancies within this literature. In particular, we address the manner in which PD and its common treatments affect the processes of rule generation, maintenance, shifting and selection. We then integrate the neuropsychological research with relevant neuroimaging and computational modeling evidence to clarify the neurobiological impact of PD on each process. Current evidence indicates that neurochemical changes associated with PD primarily disrupt rule shifting, and may disturb feedback-mediated learning processes that guide rule selection. Although surgical and pharmacological therapies remediate this deficit, it appears that the same treatments may contribute to impaired rule generation, maintenance and selection processes. These data emphasize the importance of distinguishing between the impact of PD and its common treatments when considering the neuropsychological profile of the disease. PMID:19428385
InSAR Deformation Time Series Processed On-Demand in the Cloud
NASA Astrophysics Data System (ADS)
Horn, W. B.; Weeden, R.; Dimarchi, H.; Arko, S. A.; Hogenson, K.
2017-12-01
During this past year, ASF has developed a cloud-based on-demand processing system known as HyP3 (http://hyp3.asf.alaska.edu/), the Hybrid Pluggable Processing Pipeline, for Synthetic Aperture Radar (SAR) data. The system makes it easy for a user who doesn't have the time or inclination to install and use complex SAR processing software to leverage SAR data in their research or operations. One such processing algorithm is the generation of a deformation time series product, a series of images representing ground displacements over time, which can be computed from a time series of interferometric SAR (InSAR) products. The set of software tools necessary to generate this useful product is difficult to install, configure, and use. Moreover, for a long time series with many images, the processing of just the interferograms can take days. Principally built by three undergraduate students at the ASF DAAC, the deformation time series processing relies on the new Amazon Batch service, which enables processing of jobs with complex interconnected dependencies in a straightforward and efficient manner. In the case of generating a deformation time series product from a stack of single-look complex SAR images, the system uses Batch to serialize the up-front processing, interferogram generation, optional tropospheric correction, and deformation time series generation. The most time-consuming portion is the interferogram generation, because even for a fairly small stack of images many interferograms need to be processed. By using AWS Batch, the interferograms are all generated in parallel; the entire process completes in hours rather than days. Additionally, the individual interferograms are saved in Amazon's cloud storage, so that when new data is acquired in the stack, an updated time series product can be generated with minimal additional processing. This presentation will focus on the development techniques and enabling technologies that were used in developing the time series processing in the ASF HyP3 system. Data and process flow from job submission through to order completion will be shown, highlighting the benefits of the cloud for each step.
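The dependency pattern described, parallel interferogram jobs gating a final time-series job, maps directly onto the AWS Batch API. The Python sketch below (boto3) is illustrative only; the queue, job definition, and parameter names are assumptions, not HyP3's actual configuration.

    # Parallel interferogram jobs, then a time-series job that depends on
    # all of them via AWS Batch job dependencies.
    import boto3

    batch = boto3.client("batch")

    igram_job_ids = []
    for pair in [("S1A_A.zip", "S1A_B.zip"), ("S1A_B.zip", "S1A_C.zip")]:
        resp = batch.submit_job(
            jobName="insar-pair",
            jobQueue="hyp3-queue",                   # assumed queue name
            jobDefinition="insar-processor",         # assumed job definition
            parameters={"reference": pair[0], "secondary": pair[1]},
        )
        igram_job_ids.append(resp["jobId"])

    # Time-series generation starts only after every interferogram finishes.
    batch.submit_job(
        jobName="sbas-time-series",
        jobQueue="hyp3-queue",
        jobDefinition="time-series-processor",
        dependsOn=[{"jobId": jid} for jid in igram_job_ids],
    )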
Nair, Ramkumar B; Kabir, Maryam M; Lennartsson, Patrik R; Taherzadeh, Mohammad J; Horváth, Ilona Sárvári
2018-01-01
Integration of wheat straw into a biorefinery-based energy generation process, by producing ethanol and biogas together with high-protein fungal biomass (suitable for feed application), was the main focus of the present study. An edible ascomycete fungal strain, Neurospora intermedia, was used for the ethanol fermentation and subsequent biomass production from dilute phosphoric acid (0.7 to 1.2% w/v) pretreated wheat straw. At optimum pretreatment conditions, an ethanol yield of 84 to 90% of the theoretical maximum, based on the glucan content of the substrate straw, was observed from fungal fermentation following the enzymatic hydrolysis process. The biogas production from the pretreated straw slurry showed an improved methane yield potential, up to a 162% increase compared to that of the untreated straw. Additional biogas production, using the syrup (a waste stream obtained after the ethanol fermentation), resulted in a combined total energy output of 15.8 MJ/kg wheat straw. Moreover, using thin stillage (a waste stream from the first-generation wheat-based ethanol process) as a co-substrate in the biogas process resulted in an additional increase of about 14 to 27% in the total energy output as compared to using only wheat straw-based substrates.
Liu, Gang; Bao, Jie
2017-11-01
This study provides the first rigorous evaluation of electricity generation based on the experimentally measured higher heating value (HHV) of lignin residue, as well as the chemical oxygen demand (COD) and biological oxygen demand (BOD5) of wastewater. For producing one metric ton of ethanol fuel from five typical lignocellulose substrates, including corn stover, wheat straw, rice straw, sugarcane bagasse and poplar sawdust, 1.26-1.85 tons of dry lignin residue is generated from the biorefining process and 0.19-0.27 tons of biogas is generated from anaerobic digestion of wastewater, equivalent to 4335-5981 kWh and 1946-2795 kWh of electricity from combustion of the generated lignin residue and biogas, respectively. The electricity generation not only sufficiently meets the electricity needs of the process, but also yields a surplus of more than half of the generated electricity for sale to the grid. Copyright © 2017 Elsevier Ltd. All rights reserved.
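The energy bookkeeping follows a simple chain: electricity = mass x HHV x conversion efficiency, converted at 3.6 MJ per kWh. The short Python sketch below illustrates the arithmetic; the HHV and efficiency figures are assumed round numbers for illustration, not the paper's measured values.

    # Back-of-the-envelope electricity from lignin combustion; all input
    # figures below are illustrative assumptions.
    LIGNIN_TONS = 1.5          # dry lignin residue per ton of ethanol (mid-range)
    HHV_MJ_PER_KG = 24.0       # assumed HHV of lignin residue
    EFFICIENCY = 0.35          # assumed thermal-to-electric conversion

    energy_mj = LIGNIN_TONS * 1000 * HHV_MJ_PER_KG * EFFICIENCY
    energy_kwh = energy_mj / 3.6                       # 1 kWh = 3.6 MJ
    print(f"{energy_kwh:.0f} kWh per ton of ethanol")  # ~3500 kWh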
A model of oil-generation in a waterlogged and closed system
NASA Astrophysics Data System (ADS)
Zhigao, He
This paper presents a new model of the synthetic effects on oil generation in a waterlogged and closed system. The model is suggested based on information about oil in high pressure layers (including gas dissolved in oil), marsh gas and its fermentative solution, fermentation processes and mechanisms, gaseous hydrocarbons of carbonate rocks obtained by acid treatment, oil-field water, recent and ancient sediments, and simulation experiments on artificial marsh gas and biological action. The model differs completely from the theory of oil generation by thermal degradation of kerogen; it stresses the synthetic effects of oil generation in special waterlogged and closed geological systems, the importance of pressure in oil-forming processes, and direct oil generation by micro-organisms. The direct generation of oil by micro-organisms is a particular biochemical reaction. Another feature of this model is that the generation, migration and accumulation of petroleum are considered as a whole.
NASA Astrophysics Data System (ADS)
Zhu, X. A.; Tsai, C. T.
2000-09-01
Dislocations in gallium arsenide (GaAs) crystals are generated by excessive thermal stresses induced during the crystal growth process. The presence of dislocations has adverse effects on the performance and reliability of the GaAs-based devices. It is well known that dislocation density can be significantly reduced by doping impurity atoms into a GaAs crystal during its growth process. A viscoplastic constitutive equation that couples the microscopic dislocation density with the macroscopic plastic deformation is employed in a crystallographic finite element model for calculating the dislocation density generated in the GaAs crystal during its growth process. The dislocation density is considered as an internal state variable and the drag stress caused by doping impurity is included in this constitutive equation. A GaAs crystal grown by the vertical Bridgman process is adopted as an example to study the influences of doping impurity and growth orientation on dislocation generation. The calculated results show that doping impurity can significantly reduce the dislocation density generated in the crystal. The level of reduction is also influenced by the growth orientation during the crystal growth process.
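The coupled dislocation-density/plastic-flow description referenced above is commonly written in an Alexander-Haasen form, with the multiplication rate driven by an effective stress reduced by a back stress proportional to the square root of the density. The Python sketch below integrates such an equation with forward Euler under constant stress and temperature; all parameter values are illustrative assumptions, not the paper's GaAs calibration.

    # Alexander-Haasen-type dislocation multiplication under constant
    # stress and temperature; parameter values are assumptions.
    import numpy as np

    k_B = 8.617e-5            # Boltzmann constant, eV/K
    Q = 1.5                   # activation energy, eV (assumed)
    K, A = 3.1e-4, 2.0        # mobility and strain-hardening constants (assumed)
    p = 1.0                   # stress exponent (assumed)

    def dislocation_density(tau=5e6, T=1400.0, N0=1e7, t_end=100.0, dt=1e-3):
        """Integrate dN/dt = K * N * tau_eff^p * exp(-Q / k_B T)."""
        N = N0
        for _ in range(int(t_end / dt)):
            tau_eff = max(tau - A * np.sqrt(N), 0.0)   # back stress from density
            N += K * N * tau_eff ** p * np.exp(-Q / (k_B * T)) * dt
        return N

    print(f"{dislocation_density():.3e} /m^2")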
An engineering approach to automatic programming
NASA Technical Reports Server (NTRS)
Rubin, Stuart H.
1990-01-01
An exploratory study of the automatic generation and optimization of symbolic programs using DECOM - a prototypical requirement specification model implemented in pure LISP was undertaken. It was concluded, on the basis of this study, that symbolic processing languages such as LISP can support a style of programming based upon formal transformation and dependent upon the expression of constraints in an object-oriented environment. Such languages can represent all aspects of the software generation process (including heuristic algorithms for effecting parallel search) as dynamic processes since data and program are represented in a uniform format.
Schiek, Richard [Albuquerque, NM
2006-06-20
A method of generating two-dimensional masks from a three-dimensional model comprises providing a three-dimensional model representing a micro-electro-mechanical structure for manufacture and a description of process mask requirements, reducing the three-dimensional model to a topological description of unique cross sections, and selecting candidate masks from the unique cross sections and the cross-section topology. The method can further comprise reconciling the candidate masks based on the process mask requirements description to produce two-dimensional process masks.
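The core reduction step, slicing the solid and keeping only the distinct cross sections, can be illustrated on a voxelized model. A minimal sketch; the occupancy-grid representation and the toy geometry are assumptions, whereas the patent works from a full 3D CAD model:

```python
import numpy as np

# Sketch of the "unique cross sections" reduction: slice a voxelized 3D model
# along z and keep one representative of each distinct layer.
model = np.zeros((4, 8, 8), dtype=np.uint8)   # (z, y, x) occupancy grid
model[0:2, 2:6, 2:6] = 1                      # lower block
model[2:4, 3:5, 3:5] = 1                      # narrower upper block

layers = model.reshape(model.shape[0], -1)             # one row per z-slice
_, first_idx = np.unique(layers, axis=0, return_index=True)
candidate_masks = model[np.sort(first_idx)]            # unique cross sections, in z order
print(len(candidate_masks), "candidate masks from", model.shape[0], "slices")  # 2 from 4
```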
Neuroscientific Model of Motivational Process
Kim, Sung-il
2013-01-01
Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub-processes: generating motivation, maintaining motivation, and regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub-processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area), in which basic stimulus-action associations are formed, and is classified as an automatic motivation to which relatively little attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to the regulation of motivation. These three sub-processes interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598
Supervised Learning Based Hypothesis Generation from Biomedical Literature.
Sang, Shengtian; Yang, Zhihao; Li, Zongyao; Lin, Hongfei
2015-01-01
Nowadays, the volume of biomedical literature is growing explosively, and much useful knowledge in it remains undiscovered. Researchers can form biomedical hypotheses by mining this literature. In this paper, we propose a supervised learning based approach to generate hypotheses from biomedical literature. This approach splits the traditional processing of hypothesis generation with the classic ABC model into an AB model and a BC model, both constructed with a supervised learning method. Compared with concept co-occurrence and grammar engineering-based approaches such as SemRep, machine learning based models usually achieve better performance in information extraction (IE) from texts. By combining the two models, the approach then reconstructs the ABC model and generates biomedical hypotheses from the literature. The experimental results on the three classic Swanson hypotheses show that our approach outperforms the SemRep system.
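The recombination of the two learned models can be sketched as scoring an A-C pair by its best intermediate B term. Everything below, the scores, the terms, and the product combination rule, is hypothetical illustration, not the paper's trained models:

```python
# Toy sketch of splitting Swanson's ABC model into an AB and a BC scorer and
# recombining them. The term lists and scores are hypothetical stand-ins for
# classifiers trained on biomedical literature features.
def ab_score(a_term, b_term):   # stand-in for the learned AB model
    return {("fish_oil", "blood_viscosity"): 0.9}.get((a_term, b_term), 0.1)

def bc_score(b_term, c_term):   # stand-in for the learned BC model
    return {("blood_viscosity", "raynaud"): 0.8}.get((b_term, c_term), 0.1)

def hypothesis_score(a_term, c_term, b_terms):
    """Rank an A-C hypothesis by its best-supported intermediate B term."""
    return max(ab_score(a_term, b) * bc_score(b, c_term) for b in b_terms)

print(hypothesis_score("fish_oil", "raynaud", ["blood_viscosity", "platelets"]))  # ~0.72
```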
NASA Astrophysics Data System (ADS)
Lee, Jeffrey; McGarvey, Steve
2013-04-01
The introduction of early test wafer (ETW) 450mm Surface Scanning Inspection Systems (SSIS) into Si manufacturing has brought with it numerous technical, commercial, and logistical challenges on the path to rapid recipe development and subsequent qualification of other 450mm wafer processing equipment. This paper explores the feasibility of eliminating the polystyrene latex (PSL) sphere deposition process step and creating SSIS recipes based upon the theoretical optical properties of both the SSIS and the process film stack(s). PSL sphere deposition for SSIS recipe generation and development is generally accepted for the previous technology nodes of 150/200/300mm wafers. PSL is deposited with a commercially available deposition system onto a non-patterned bare Si or non-patterned filmed Si wafer. After deposition of multiple PSL spots at different positions on a wafer, the wafer is inspected on a SSIS and a response curve is generated, based on the light scattering intensity of the NIST-certified PSL deposited on the wafer. When initial 450mm Si wafer manufacturing began, no inspection systems with sub-90nm sensitivities were available for defect and haze level verification. The introduction of a 450mm sub-30nm inspection system into the manufacturing line generated immediate challenges: whereas the 450mm wafers were relatively defect free at 90nm, at 40nm the wafers contained several hundred thousand defects. When PSL was deposited onto wafers with such defect levels, PSL signals weaker than those of the sub-90nm defects were difficult to extract. As the defectivity level of the wafers from the Si suppliers rapidly improves, the challenges of SSIS recipe creation under high defectivity decrease, while at the same time the cost of PSL deposition increases; the current cost is fifteen thousand dollars per wafer for a 450mm PSL deposition service. When viewed from the standpoint of generating hundreds of SSIS recipes for the global member companies of ISMI, it is simply not economically viable to create all recipes from PSL-based light scattering response curves. This paper explores the challenges and end results of PSL-based SSIS recipe generation and compares them against those of SSIS recipes generated strictly from theoretical bidirectional reflectance distribution function (BRDF) light scattering modeling. BRDF modeling allows the creation of SSIS recipes without PSL deposition, which is greatly appealing for a multitude of technical and commercial reasons. The paper also explores the technical challenges of SSIS recipe generation based strictly upon BRDF modeling.
Failure detection system design methodology. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chow, E. Y.
1980-01-01
The design of a failure detection and identification (FDI) system consists of designing a robust residual generation process and a high-performance decision making process; the design of these two processes is examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high-performance decision processes: the FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed, and a numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.
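The parity-space idea behind residual generation can be shown in a few lines: any vector orthogonal to the measurement matrix yields a residual that vanishes for fault-free data regardless of the unknown state. A minimal numeric sketch with an assumed sensor configuration:

```python
import numpy as np
from scipy.linalg import null_space

# Parity-space residual generation: with measurements y = C x + f, any W whose
# rows span the left null space of C gives residuals r = W y that are zero for
# fault-free data whatever the state x. The 3-sensor C below is hypothetical.
C = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])          # three sensors, two states: one redundancy relation
W = null_space(C.T).T               # rows satisfy W @ C = 0

x = np.array([0.3, -1.2])           # unknown true state
y_ok = C @ x                        # fault-free measurements
y_bad = y_ok + np.array([0.0, 0.5, 0.0])  # bias fault on sensor 2

print(np.abs(W @ y_ok).max())       # ~0: consistent with the redundancy relation
print(np.abs(W @ y_bad).max())      # clearly nonzero: fault detected
```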
Controllable 3D architectures of aligned carbon nanotube arrays by multi-step processes
NASA Astrophysics Data System (ADS)
Huang, Shaoming
2003-06-01
An effective way to fabricate large-area three-dimensional (3D) aligned CNT patterns based on pyrolysis of iron(II) phthalocyanine (FePc) by two-step processes is reported. The controllable generation of different lengths and the selective growth of aligned CNT arrays on metal-patterned (e.g., Ag and Au) substrates are the basis for generating such 3D aligned CNT architectures. By controlling experimental conditions, 3D aligned CNT arrays with different lengths/densities and morphologies/structures, as well as multi-layered architectures, can be fabricated on a large scale by multi-step pyrolysis of FePc. These 3D architectures could have interesting properties and be applied in developing novel nanotube-based devices.
Towards automatic planning for manufacturing generative processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
CALTON,TERRI L.
2000-05-24
Generative process planning describes methods process engineers use to modify manufacturing/process plans after designs are complete. A completed design may result from the introduction of a new product based on an old design, an assembly upgrade, or modified product designs used for a family of similar products. An engineer designs an assembly and then creates plans capturing manufacturing processes, including assembly sequences, component joining methods, part costs, labor costs, etc. When new products originate as a result of an upgrade, component geometry may change, and/or additional components and subassemblies may be added to or omitted from the original design. As a result, process engineers are forced to create new plans, and this is further complicated by the fact that they must generate these plans manually for each product upgrade. To generate new assembly plans for product upgrades, engineers must manually re-specify the manufacturing plan selection criteria and re-run the planners. To remedy this problem, special-purpose assembly planning algorithms have been developed to automatically recognize design modifications and automatically apply previously defined manufacturing plan selection criteria and constraints.
Rules based process window OPC
NASA Astrophysics Data System (ADS)
O'Brien, Sean; Soper, Robert; Best, Shane; Mason, Mark
2008-03-01
As a preliminary step towards model-based process window OPC, we have analyzed the impact of correcting post-OPC layouts using rules-based methods. Image processing on the Brion Tachyon was used to identify sites where the OPC model/recipe failed to generate an acceptable solution. A set of rules for the 65nm active and poly layers was generated by classifying these failure sites. The rules were based upon segment runlengths, figure spaces, and adjacent figure widths. Comparing the pre- and post-rules-based operations, 2.1 million sites were corrected for active in a small chip, and 59 million were found for poly. Tachyon analysis of the final reticle layout found weak-margin sites distinct from those repaired by rules-based corrections. For the active layer, more than 75% of the sites corrected by rules would have printed without a defect, indicating that most rules-based cleanups degrade the lithographic pattern. Some sites were missed by the rules-based cleanups due to either bugs in the DRC software or gaps in the rules table. In the end, dramatic changes to the reticle prevented catastrophic lithography errors, but this method is far too blunt; a more subtle model-based procedure is needed, one that changes only those sites with unsatisfactory lithographic margin.
Kwon, Min-Woo; Kim, Seung-Cheol; Kim, Eun-Soo
2016-01-20
A three-directional motion-compensation mask-based novel look-up table method is proposed and implemented on graphics processing units (GPUs) for video-rate generation of digital holographic videos of three-dimensional (3D) scenes. Since the proposed method is designed to be well matched with the software and memory structures of GPUs, the number of compute-unified-device-architecture kernel function calls can be significantly reduced. This results in a great increase of the computational speed of the proposed method, allowing video-rate generation of the computer-generated hologram (CGH) patterns of 3D scenes. Experimental results reveal that the proposed method can generate 39.8 frames of Fresnel CGH patterns with 1920×1080 pixels per second for the test 3D video scenario with 12,088 object points on dual GPU boards of NVIDIA GTX TITANs, and they confirm the feasibility of the proposed method in the practical application fields of electroholographic 3D displays.
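The underlying computation that the look-up tables accelerate is the accumulation of Fresnel zone terms over object points. A naive numpy sketch follows; the resolution, pixel pitch, wavelength and object points are illustrative, and the paper's GPU kernels exist precisely to avoid this brute-force loop:

```python
import numpy as np

# Naive O(points * pixels) point-source Fresnel CGH accumulation.
H, W = 256, 256
pitch, wavelength = 8e-6, 532e-9
y, x = np.mgrid[0:H, 0:W] * pitch
points = [(1.0e-3, 1.0e-3, 0.10, 1.0),   # (x_j, y_j, z_j, amplitude), hypothetical
          (1.2e-3, 0.8e-3, 0.12, 0.7)]

hologram = np.zeros((H, W))
for xj, yj, zj, aj in points:
    r2 = (x - xj) ** 2 + (y - yj) ** 2
    hologram += aj * np.cos(np.pi * r2 / (wavelength * zj))  # Fresnel zone term
```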
Triangle Geometry Processing for Surface Modeling and Cartesian Grid Generation
NASA Technical Reports Server (NTRS)
Aftosmis, Michael J. (Inventor); Melton, John E. (Inventor); Berger, Marsha J. (Inventor)
2002-01-01
Cartesian mesh generation is accomplished for component based geometries, by intersecting components subject to mesh generation to extract wetted surfaces with a geometry engine using adaptive precision arithmetic in a system which automatically breaks ties with respect to geometric degeneracies. During volume mesh generation, intersected surface triangulations are received to enable mesh generation with cell division of an initially coarse grid. The hexahedral cells are resolved, preserving the ability to directionally divide cells which are locally well aligned.
Jakobsson, Lotta; Lindman, Magdalena; Svanberg, Bo; Carlsson, Henrik
2010-01-01
This study analyses the outcome of continuously improved occupant protection over the last two decades for front-seat near-side occupants in side impacts, based on a real-world-driven working process. The effectiveness of four generations of improved side impact protection is calculated based on data from the statistical accident database of Volvo Cars in Sweden. Generation I includes vehicles with a new structural and interior concept (SIPS). Generation II includes vehicles with structural improvements and a new chest airbag (SIPSbag). Generation III includes vehicles with further improved SIPS and SIPSbag as well as the new concept of a head-protecting Inflatable Curtain (IC). Generation IV includes the most recent vehicles with further improvements of all the systems plus advanced sensors and seat belt pretensioner activation. Compared to baseline vehicles, vehicles of generation I reduce MAIS2+ injuries by 54%, generation II by 61% and generation III by 72%. For generation IV, effectiveness figures cannot be calculated because of the lack of MAIS2+ injuries. A continuous improvement in performance is also seen when studying the AIS2+ pelvis, abdomen, chest and head injuries separately. By using the same real-world-driven working process, future improvements and possibly new passive as well as active safety systems will be developed with the aim of further improving protection of near-side occupants in side impacts. PMID:21050597
Intelligent Control of Micro Grid: A Big Data-Based Control Center
NASA Astrophysics Data System (ADS)
Liu, Lu; Wang, Yanping; Liu, Li; Wang, Zhiseng
2018-01-01
In this paper, a structure of a micro grid system with a big data-based control center is introduced. Energy data from distributed generation, storage and load are analyzed through the control center; from the results, new trends are predicted and applied as feedback to optimize the control. Therefore, each step of the micro grid's operation can be adjusted and organized in a form of comprehensive management. A framework of real-time data collection, data processing and data analysis is proposed by employing big data technology. Consequently, integrated distributed generation and an optimized energy storage and transmission process can be implemented in the micro grid system.
pyPcazip: A PCA-based toolkit for compression and analysis of molecular simulation data
NASA Astrophysics Data System (ADS)
Shkurti, Ardita; Goni, Ramon; Andrio, Pau; Breitmoser, Elena; Bethune, Iain; Orozco, Modesto; Laughton, Charles A.
The biomolecular simulation community is currently in need of novel and optimised software tools that can analyse and process, in reasonable timescales, the large amounts of generated molecular simulation data. In light of this, we have developed and present here pyPcazip: a suite of software tools for compression and analysis of molecular dynamics (MD) simulation data. The software is compatible with trajectory file formats generated by most contemporary MD engines such as AMBER, CHARMM, GROMACS and NAMD, and is MPI-parallelised to permit the efficient processing of very large datasets. pyPcazip is Unix-based open-source software (BSD licensed) written in Python.
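The compression principle, though not pyPcazip's actual API, is ordinary PCA on flattened frame coordinates: store the mean, a few eigenvectors and per-frame scores instead of the full coordinates. A sketch on a synthetic trajectory:

```python
import numpy as np

# PCA compression of an MD trajectory (illustration of the idea, not pyPcazip's API).
rng = np.random.default_rng(0)
traj = rng.normal(size=(500, 30, 3))            # 500 frames, 30 atoms (hypothetical)
X = traj.reshape(len(traj), -1)                 # frames x (3 * atoms)

mean = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
k = 10                                          # components kept: the compression knob
scores = (X - mean) @ Vt[:k].T                  # per-frame coefficients to store
X_rec = mean + scores @ Vt[:k]                  # lossy reconstruction

print("stored floats:", scores.size + Vt[:k].size + mean.size, "vs", X.size)
```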
Compact solar autoclave based on steam generation using broadband light-harvesting nanoparticles
Neumann, Oara; Feronti, Curtis; Neumann, Albert D.; Dong, Anjie; Schell, Kevin; Lu, Benjamin; Kim, Eric; Quinn, Mary; Thompson, Shea; Grady, Nathaniel; Nordlander, Peter; Oden, Maria; Halas, Naomi J.
2013-01-01
The lack of readily available sterilization processes for medicine and dentistry practices in the developing world is a major risk factor for the propagation of disease. Modern medical facilities in the developed world often use autoclave systems to sterilize medical instruments and equipment and process waste that could contain harmful contagions. Here, we show the use of broadband light-absorbing nanoparticles as solar photothermal heaters, which generate high-temperature steam for a standalone, efficient solar autoclave useful for sanitation of instruments or materials in resource-limited, remote locations. Sterilization was verified using a standard Geobacillus stearothermophilus-based biological indicator. PMID:23836642
Note: Tesla based pulse generator for electrical breakdown study of liquid dielectrics
NASA Astrophysics Data System (ADS)
Veda Prakash, G.; Kumar, R.; Patel, J.; Saurabh, K.; Shyam, A.
2013-12-01
In the process of studying charge-holding capability and breakdown delay time in liquids on nanosecond (ns) time scales, a Tesla-based pulse generator has been developed. The pulse generator is a combination of a Tesla transformer, a pulse forming line, a fast closing switch, and a test chamber. Use of a Tesla transformer instead of conventional Marx generators makes the pulse generator very compact and cost effective, and it requires little maintenance. The system has been designed and developed to deliver a maximum output voltage of 300 kV with a rise time of the order of tens of nanoseconds. The paper deals with the system design parameters, the breakdown test procedure, and various experimental results. To validate the pulse generator's performance, experimental results have been compared with PSPICE simulations and are in good agreement.
Item Difficulty Modeling of Paragraph Comprehension Items
ERIC Educational Resources Information Center
Gorin, Joanna S.; Embretson, Susan E.
2006-01-01
Recent assessment research joining cognitive psychology and psychometric theory has introduced a new technology, item generation. In algorithmic item generation, items are systematically created based on specific combinations of features that underlie the processing required to correctly solve a problem. Reading comprehension items have been more…
New generation of meteorology cameras
NASA Astrophysics Data System (ADS)
Janout, Petr; Blažek, Martin; Páta, Petr
2017-12-01
A new generation of the WILLIAM (WIde-field aLL-sky Image Analyzing Monitoring system) camera includes new features such as monitoring of rain and storm clouds during daytime observation. Development of the new generation of weather monitoring cameras responds to the demand for monitoring of sudden weather changes. The new WILLIAM cameras can process acquired image data immediately, issue warnings of sudden torrential rain, and send them to the user's cell phone and email. Actual weather conditions are determined from image data, and the results of image processing are complemented by data from sensors of temperature, humidity, and atmospheric pressure. In this paper, we present the architecture and image data processing algorithms of this monitoring camera, together with a spatially-variant model of the imaging system's aberrations based on Zernike polynomials.
Towards an improved ensemble precipitation forecast: A probabilistic post-processing approach
NASA Astrophysics Data System (ADS)
Khajehei, Sepideh; Moradkhani, Hamid
2017-03-01
Recently, ensemble post-processing (EPP) has become a commonly used approach for reducing the uncertainty in forcing data and hence in hydrologic simulation. The procedure was introduced to build ensemble precipitation forecasts based on the statistical relationship between observations and forecasts. More specifically, the approach relies on a transfer function that is developed from a bivariate joint distribution between the observations and the simulations in a historical period; the transfer function is then used to post-process the forecast. In this study, we propose a Bayesian EPP approach based on copula functions (COP-EPP) to improve the reliability of the ensemble precipitation forecast. Evaluation of the copula-based method is carried out by comparing the performance of the generated ensemble precipitation with the outputs from an existing procedure, i.e. the mixed-type meta-Gaussian distribution. Monthly precipitation from the Climate Forecast System Reanalysis (CFSR) and gridded observations from the Parameter-Elevation Relationships on Independent Slopes Model (PRISM) have been employed to generate the post-processed ensemble precipitation. Deterministic and probabilistic verification frameworks are utilized to evaluate the outputs from the proposed technique. The distribution of seasonal precipitation for the ensemble generated by the copula-based technique is compared to the observations and raw forecasts for three sub-basins located in the Western United States. Results show that both techniques are successful in producing reliable and unbiased ensemble forecasts; however, COP-EPP demonstrates considerable improvement in both deterministic and probabilistic verification, in particular in characterizing extreme events in wet seasons.
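A toy version of the copula idea can be written in a few lines: transform forecasts and observations to normal scores, estimate their Gaussian-copula correlation, and draw a conditional ensemble for a new forecast. The data, marginals, and ensemble size below are synthetic assumptions, and the paper's own formulation is Bayesian rather than this plain conditional-sampling sketch:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 20.0, size=2000)                 # synthetic "observations"
fcst = obs * 0.8 + rng.normal(0, 10, size=2000)       # biased, noisy "forecasts"

def to_normal_scores(x, sample):
    """Empirical-CDF transform of x (relative to `sample`) into standard normal."""
    ranks = np.searchsorted(np.sort(sample), x) + 0.5
    return stats.norm.ppf(ranks / (len(sample) + 1))

z_o, z_f = to_normal_scores(obs, obs), to_normal_scores(fcst, fcst)
rho = np.corrcoef(z_f, z_o)[0, 1]                     # Gaussian-copula dependence

z_new = to_normal_scores(np.array([55.0]), fcst)      # a new raw forecast value
z_ens = rho * z_new + np.sqrt(1 - rho**2) * rng.normal(size=50)  # conditional draws
ensemble = np.quantile(obs, stats.norm.cdf(z_ens))    # back to observation space
print(ensemble.mean(), ensemble.std())
```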
Model-based adaptive 3D sonar reconstruction in reverberating environments.
Saucan, Augustin-Alexandru; Sintes, Christophe; Chonavel, Thierry; Caillec, Jean-Marc Le
2015-10-01
In this paper, we propose a novel model-based approach for 3D underwater scene reconstruction, i.e., bathymetry, for side scan sonar arrays in complex and highly reverberating environments like shallow water areas. The presence of multipath echoes and volume reverberation generates false depth estimates. To improve the resulting bathymetry, this paper proposes and develops an adaptive filter, based on several original geometrical models. This multimodel approach makes it possible to track and separate the direction of arrival trajectories of multiple echoes impinging the array. Echo tracking is perceived as a model-based processing stage, incorporating prior information on the temporal evolution of echoes in order to reject cluttered observations generated by interfering echoes. The results of the proposed filter on simulated and real sonar data showcase the clutter-free and regularized bathymetric reconstruction. Model validation is carried out with goodness of fit tests, and demonstrates the importance of model-based processing for bathymetry reconstruction.
Automatic Generation of High Quality DSM Based on IRS-P5 Cartosat-1 Stereo Data
NASA Astrophysics Data System (ADS)
d'Angelo, Pablo; Uttenthaler, Andreas; Carl, Sebastian; Barner, Frithjof; Reinartz, Peter
2010-12-01
IRS-P5 Cartosat-1 high resolution stereo satellite imagery is well suited for the creation of digital surface models (DSM). A system for highly automated and operational DSM and orthoimage generation based on IRS-P5 Cartosat-1 imagery is presented, with an emphasis on automated processing and product quality. The proposed system processes IRS-P5 level-1 stereo scenes using the rational polynomial coefficients (RPC) universal sensor model. The described method uses an RPC correction based on DSM alignment instead of reference images with a lower lateral accuracy, which results in improved geolocation of the DSMs and orthoimages. Following RPC correction, highly detailed DSMs with 5 m grid spacing are derived using semiglobal matching. The proposed method is part of an operational Cartosat-1 processor for the generation of high resolution DSMs. Evaluation of 18 scenes against independent ground truth measurements indicates a mean lateral error (CE90) of 6.7 meters and a mean vertical accuracy (LE90) of 5.1 meters.
Advanced microgrid design and analysis for forward operating bases
NASA Astrophysics Data System (ADS)
Reasoner, Jonathan
This thesis takes a holistic approach to creating an improved electric power generation system for a future forward operating base (FOB) through the design of an isolated microgrid. After an extensive literature search, this thesis found a need for drastic improvement of the FOB power system. A thorough design process analyzed FOB demand, researched demand-side management improvements, evaluated various generation sources and energy storage options, and performed a HOMER(TM) discrete optimization to determine the best microgrid design. Further sensitivity analysis was performed to see how changing parameters would affect the outcome. Lastly, this research also looks at some of the challenges associated with incorporating a design which relies heavily on inverter-based generation sources, and gives possible solutions to help make a renewable energy powered microgrid a reality. While this thesis uses a FOB as the case study, the process and discussion can be adapted to aid in the design of an off-grid small-scale power grid which utilizes high penetration levels of renewable energy.
NASA Astrophysics Data System (ADS)
WANG, Qingrong; ZHU, Changfeng
2017-06-01
Integration of distributed heterogeneous data sources is a key issue in big data applications. In this paper, the strategy of variable precision is introduced into the concept lattice, and a one-to-one mapping between the variable precision concept lattice and the ontology concept lattice is constructed: a local ontology is produced by building the variable precision concept lattice for each subsystem, and a distributed generation algorithm for variable precision concept lattices over ontology-based heterogeneous databases is proposed, drawing on the close relationship between concept lattices and ontology construction. Finally, taking the main concept lattice generated from the existing heterogeneous databases as the standard, a case study has been carried out to test the feasibility and validity of this algorithm, and the differences between the generated main concept lattice and the standard concept lattice are compared. Analysis results show that the algorithm can automate the construction of distributed concept lattices over heterogeneous data sources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chernoguzov, Alexander; Markham, Thomas R.; Haridas, Harshal S.
A method includes generating at least one access vector associated with a specified device in an industrial process control and automation system. The specified device has one of multiple device roles. The at least one access vector is generated based on one or more communication policies defining communications between one or more pairs of device roles in the industrial process control and automation system, where each pair of device roles includes the device role of the specified device. The method also includes providing the at least one access vector to at least one of the specified device and one or more other devices in the industrial process control and automation system in order to control communications to or from the specified device.
NASA Technical Reports Server (NTRS)
Khan, Gufran Sayeed; Gubarev, Mikhail; Speegle, Chet; Ramsey, Brian
2010-01-01
The presentation covers grazing incidence X-ray optics; motivation and challenges; mid-spatial-frequency error generation in cylindrical polishing; design considerations for the polishing lap; simulation studies and experimental results; future scope; and a summary. Topics include the current status of replication optics technology, the cylindrical polishing process using a large polishing lap, non-conformance of the polishing lap to the optics, development of software and a polishing machine, deterministic prediction of polishing, a polishing experiment under optimum conditions, and a polishing experiment based on a known error profile. Future plans include determination of non-uniformity in the polishing lap compliance, development of a polishing sequence based on a known error profile of the specimen, software for generating a mandrel polishing sequence, and the design and development of a flexible polishing lap and a computer-controlled localized polishing process.
Semantic based man-machine interface for real-time communication
NASA Technical Reports Server (NTRS)
Ali, M.; Ai, C.-S.
1988-01-01
A flight expert system (FLES) was developed to assist pilots in monitoring, diagnosing and recovering from in-flight faults. To provide a communications interface between the flight crew and FLES, a natural language interface (NALI) was implemented. Input to NALI is processed by three processors: (1) the semantic parser, (2) the knowledge retriever, and (3) the response generator. First, the semantic parser extracts meaningful words and phrases to generate an internal representation of the query; it can map different input forms related to the same concept onto the same internal representation. Then the knowledge retriever analyzes and stores the context of the query to aid in resolving ellipsis and pronoun references; at the end of this process, a sequence of retrieval functions is created as a first step in generating the proper response. Finally, the response generator produces the natural language response to the query. The architecture of NALI was designed to process both temporal and nontemporal queries. The architecture and implementation of NALI are described.
Kannan, Srimathi; Schulz, Amy; Israel, Barbara; Ayra, Indira; Weir, Sheryl; Dvonch, Timothy J.; Rowe, Zachary; Miller, Patricia; Benjamin, Alison
2008-01-01
Background: Computer tailoring and personalizing recommendations for dietary health-promoting behaviors are in accordance with community-based participatory research (CBPR) principles, which emphasize research that benefits the participants and community involved. Objective: To describe the CBPR process utilized to computer-generate and disseminate personalized nutrition feedback reports (NFRs) for Detroit Healthy Environments Partnership (HEP) study participants. Methods: The CBPR process included discussion and feedback from HEP partners on several draft personalized reports. The nutrition feedback process included defining the feedback objectives; prioritizing the nutrients; customizing the report design; reviewing and revising the NFR template and readability; producing and disseminating the report; and participant follow-up. Lessons Learned: Application of CBPR principles in designing the NFR resulted in a reader-friendly product with useful recommendations to promote heart health. Conclusions: A CBPR process can enhance computer tailoring of personalized NFRs to address racial and socioeconomic disparities in cardiovascular disease (CVD). PMID:19337572
ERIC Educational Resources Information Center
Davidson, Jenica Van Tassell
2016-01-01
The purpose of this narrative study is to understand the educational experiences of academically resilient, first-generation students from low-income backgrounds who demonstrate first-year success in college. Through a framework based on academic resilience, this study aims to provide a strengths-based exploration of the contributing factors that…
Case-based reasoning in design: An apologia
NASA Technical Reports Server (NTRS)
Pulaski, Kirt
1990-01-01
Three positions are presented and defended: the process of generating solutions in problem solving is viewable as a design task; case-based reasoning is a strong method of problem solving; and a synergism exists between case-based reasoning and design problem solving.
Process Based on SysML for New Launchers System and Software Developments
NASA Astrophysics Data System (ADS)
Hiron, Emmanuel; Miramont, Philippe
2010-08-01
The purpose of this paper is to present the Astrium-ST engineering process based on SysML. This process is currently being set up in the frame of common CNES/Astrium-ST R&T studies related to Ariane 5 electrical system and flight software modelling. The tool used to set up this process is Rhapsody release 7.3 from IBM [1]. The process focuses on the system engineering phase dedicated to software, with the objective of generating both system documents (sequential system design and flight control) and software specifications.
Kwon, Jinhyeong; Cho, Hyunmin; Jung, Jinwook; Lee, Habeom; Hong, Sukjoon; Yeo, Junyeob; Han, Seungyong; Ko, Seung Hwan
2018-05-12
To date, solar energy generation devices have been widely studied to meet the need for clean and sustainable energy sources. Among them, the water-splitting photoelectrochemical cell is regarded as a promising way to split water molecules and generate hydrogen using sunlight. While many nanostructured metal oxides have been considered as candidates, most of them have an improper bandgap structure that lowers energy conversion efficiency. Herein, we introduce a novel wet-based, successive photoreduction process that can improve charge transfer efficiency through the surface plasmon effect for a solar-driven water splitting device. The proposed process enables the fabrication of ZnO/CuO/Ag or ZnO/CuO/Au hierarchical nanostructures with enhanced electrical, optical, and photoelectrochemical properties. The fabricated hierarchical nanostructures are demonstrated as a photocathode in the photoelectrochemical cell and characterized using various analytic tools.
Parisi Kern, Andrea; Ferreira Dias, Michele; Piva Kulakowski, Marlova; Paulo Gomes, Luciana
2015-05-01
Reducing construction waste is becoming a key environmental issue in the construction industry. The quantification of waste generation rates in the construction sector is an invaluable management tool in supporting mitigation actions. However, the quantification of waste can be a difficult process because of the specific characteristics and the wide range of materials used in different construction projects. Large variations are observed in the methods used to predict the amount of waste generated because of the range of variables involved in construction processes and the different contexts in which these methods are employed. This paper proposes a statistical model to determine the amount of waste generated in the construction of high-rise buildings by assessing the influence of the design process and production system, often mentioned as the major culprits behind the generation of waste in construction. Multiple regression was used to conduct a case study based on multiple sources of data from eighteen residential buildings. The resulting statistical model relates the dependent variable (the amount of waste generated) to independent variables associated with the design and the production system used. The best regression model obtained from the sample data had an adjusted R² value of 0.694, meaning it explains approximately 69% of the variance in waste generation for similar constructions. Most independent variables showed a low determination coefficient when assessed in isolation, which emphasizes the importance of assessing their joint influence on the response (dependent) variable. Copyright © 2015 Elsevier Ltd. All rights reserved.
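The model-fitting step is ordinary least squares with an adjusted R² to penalize the small sample. A sketch with synthetic stand-ins for the design/production predictors; the paper's actual variables and data are not reproduced here:

```python
import numpy as np

# OLS fit plus adjusted R^2, mirroring the paper's eighteen-building sample size.
rng = np.random.default_rng(2)
n = 18                                            # eighteen buildings, as in the study
X = rng.normal(size=(n, 3))                       # e.g. floor area, stories, duration (hypothetical)
beta_true = np.array([5.0, 2.0, -1.0])
y = 40 + X @ beta_true + rng.normal(0, 3, n)      # waste generated (synthetic)

A = np.column_stack([np.ones(n), X])              # design matrix with intercept
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ coef
r2 = 1 - resid.var() / y.var()
p = X.shape[1]
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)     # penalizes predictors vs sample size
print(f"R^2={r2:.3f}, adjusted R^2={adj_r2:.3f}")
```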
Raster Scan Computer Image Generation (CIG) System Based On Refresh Memory
NASA Astrophysics Data System (ADS)
Dichter, W.; Doris, K.; Conkling, C.
1982-06-01
A full color, Computer Image Generation (CIG) raster visual system has been developed which provides a high level of training sophistication by utilizing advanced semiconductor technology and innovative hardware and firmware techniques. Double buffered refresh memory and efficient algorithms eliminate the problem of conventional raster line ordering by allowing the generated image to be stored in a random fashion. Modular design techniques and simplified architecture provide significant advantages in reduced system cost, standardization of parts, and high reliability. The major system components are a general purpose computer to perform interfacing and data base functions; a geometric processor to define the instantaneous scene image; a display generator to convert the image to a video signal; an illumination control unit which provides final image processing; and a CRT monitor for display of the completed image. Additional optional enhancements include texture generators, increased edge and occultation capability, curved surface shading, and data base extensions.
Processing digital images and calculation of beam emittance (pepper-pot method for the Krion source)
NASA Astrophysics Data System (ADS)
Alexandrov, V. S.; Donets, E. E.; Nyukhalova, E. V.; Kaminsky, A. K.; Sedykh, S. N.; Tuzikov, A. V.; Philippov, A. V.
2016-12-01
Programs, based on Wolfram Mathematica and Origin software, for the pre-processing of photographs of beam images on the mask are described. The angles of rotation around the axis and in the vertical plane are taken into account in the generation of the file with image coordinates. Results of the emittance calculation in test mode by the Pep_emit program, written in Visual Basic and using the generated file, are presented.
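The quantity computed from the processed pepper-pot data is typically the statistical (RMS) emittance. A sketch of that standard estimator on synthetic beam data; the Pep_emit program's own input format and weighting scheme are not reproduced here:

```python
import numpy as np

def rms_emittance(x, xp, w=None):
    """Statistical emittance eps = sqrt(<x^2><x'^2> - <x x'>^2),
    from positions x and divergences xp, optionally intensity-weighted."""
    w = np.ones_like(x) if w is None else w
    dx = x - np.average(x, weights=w)
    dxp = xp - np.average(xp, weights=w)
    x2 = np.average(dx**2, weights=w)
    xp2 = np.average(dxp**2, weights=w)
    xxp = np.average(dx * dxp, weights=w)
    return np.sqrt(x2 * xp2 - xxp**2)

rng = np.random.default_rng(3)
x = rng.normal(0, 1e-3, 10000)                   # positions, m (synthetic)
xp = 0.5 * x + rng.normal(0, 1e-4, 10000)        # divergences, rad, correlated beam
print(rms_emittance(x, xp))                      # ~1e-7 m*rad
```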
Distillation Brine Purification for Resource Recovery Applications
NASA Technical Reports Server (NTRS)
Wheeler, Raymond M.
2014-01-01
Wastewater processing systems for space generate residual brine that contains water and salts that could be recovered as life support consumables. The project assessed the use of ion-exchange resins to selectively remove salts from wastewater treatment brines; the resins were then regenerated for additional use. The intention would be to generate a Na/K- and Cl-rich or purified brine that would then be processed into high-value chemicals, such as acids, bases, and/or bleach.
NASA Technical Reports Server (NTRS)
Trosset, Michael W.
1999-01-01
Comprehensive computational experiments to assess the performance of algorithms for numerical optimization require (among other things) a practical procedure for generating pseudorandom nonlinear objective functions. We propose a procedure that is based on the convenient fiction that objective functions are realizations of stochastic processes. This report details the calculations necessary to implement our procedure for the case of certain stationary Gaussian processes and presents a specific implementation in the statistical programming language S-PLUS.
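The report's construction can be sketched directly: draw one realization of a stationary Gaussian process on a grid and treat it as a test objective. A minimal version with an assumed squared-exponential kernel; the report's own kernels and scales may differ:

```python
import numpy as np

# One realization of a stationary Gaussian process as a pseudorandom objective.
rng = np.random.default_rng(4)
t = np.linspace(0.0, 10.0, 200)
K = np.exp(-0.5 * (t[:, None] - t[None, :])**2 / 1.5**2)    # RBF covariance
L = np.linalg.cholesky(K + 1e-8 * np.eye(len(t)))           # jitter for stability
f = L @ rng.normal(size=len(t))             # a pseudorandom "objective function"
print(t[np.argmin(f)], f.min())             # location and value of its minimum
```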
Multiprocessor graphics computation and display using transputers
NASA Technical Reports Server (NTRS)
Ellis, Graham K.
1988-01-01
A package of two-dimensional graphics routines was developed to run on a transputer-based parallel processing system. These routines were designed to enable applications programmers to easily generate and display results from the transputer network in a graphic format. The graphics procedures were designed for the lowest possible network communication overhead for increased performance. The routines were designed for ease of use and to present an intuitive approach to generating graphics on the transputer parallel processing system.
An efficient hole-filling method based on depth map in 3D view generation
NASA Astrophysics Data System (ADS)
Liang, Haitao; Su, Xiu; Liu, Yilin; Xu, Huaiyuan; Wang, Yi; Chen, Xiaodong
2018-01-01
A new virtual view is synthesized through depth image based rendering (DIBR) using a single color image and its associated depth map in 3D view generation. Holes are unavoidably generated in this 2D-to-3D conversion process. We propose a hole-filling method based on the depth map to address the problem. Firstly, we improve the DIBR process by proposing a one-to-four (OTF) algorithm, using the "z-buffer" algorithm to solve the overlap problem. Then, building on the classical patch-based algorithm of Criminisi et al., we propose a hole-filling algorithm that uses the information of the depth map to handle the image after DIBR. In order to improve the accuracy of the virtual image, inpainting starts from the background side. In the calculation of the priority, in addition to the confidence term and the data term, we add a depth term. In the search for the most similar patch in the source region, we define a depth similarity to improve the accuracy of the search. Experimental results show that the proposed method can effectively improve the quality of the 3D virtual view both subjectively and objectively.
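The modified priority can be sketched as the product of the classical confidence and data terms with a depth term; the exact form of the depth weighting below is our assumption for illustration, not necessarily the authors' formula:

```python
import numpy as np

# Criminisi-style priority with an extra depth term favoring background (far)
# pixels, so inpainting starts from the background side. Evaluated here over a
# small full grid for illustration; in practice only hole-boundary pixels count.
def priority(confidence, data_term, depth, eps=1e-6):
    """Per-pixel priority P = C * D * Z (depth term Z is 1.0 at the farthest pixels)."""
    z = depth / (depth.max() + eps)          # assumes larger depth value = farther
    return confidence * data_term * z

conf = np.random.rand(4, 4)
data = np.random.rand(4, 4)
depth = np.array([[10, 10, 2, 2]] * 4, float)  # left half far, right half near
P = priority(conf, data, depth)
print(np.unravel_index(P.argmax(), P.shape))   # pixel that would be filled first
```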
Recent Progress on the Second Generation CMORPH: A Prototype Operational Processing System
NASA Astrophysics Data System (ADS)
Xie, Pingping; Joyce, Robert; Wu, Shaorong
2016-04-01
As reported at the EGU General Assembly of 2015, a conceptual test system was developed for the second generation CMORPH to produce global analyses of 30-min precipitation on a 0.05deg lat/lon grid over the entire globe, from pole to pole, through integration of information from satellite observations as well as numerical model simulations. The second generation CMORPH is built upon the Kalman filter based CMORPH algorithm of Joyce and Xie (2011). Inputs to the system include both rainfall and snowfall rate retrievals from passive microwave (PMW) measurements aboard all available low earth orbit (LEO) satellites, precipitation estimates derived from infrared (IR) observations of geostationary (GEO) as well as LEO platforms, and precipitation simulations from numerical global models. Sub-systems were developed and refined to derive precipitation estimates from the GEO and LEO IR observations and to compute precipitating cloud motion vectors; the results were reported at the EGU of 2014 and the AGU 2015 Fall Meetings. In this presentation, we report our recent work on the construction of a prototype operational processing system for the second generation CMORPH. The system takes in the PMW retrievals of instantaneous precipitation rates from all available sensors, the full-resolution GEO and LEO IR data, as well as the hourly precipitation fields generated by the NOAA/NCEP Climate Forecast System Reanalysis (CFSR). First, a combined field of PMW-based precipitation retrievals (MWCOMB) is created on a 0.05deg lat/lon grid over the entire globe through inter-calibrating retrievals from various sensors against a common reference; for this experiment, the reference field is the GMI-based retrievals with climatological adjustment against the TMI retrievals using data over the overlapping period. Precipitation estimates are then derived from the GEO and LEO IR data through calibration against the global MWCOMB and the CloudSat CPR based estimates. Meanwhile, precipitating cloud motion vectors are derived through the combination of vectors computed from the GEO IR based precipitation estimates and the CFSR precipitation with a 2DVAR technique. The prototype system is applied to generate integrated global precipitation estimates over the entire globe for a three-month period from June 1 to August 31 of 2015. Preliminary tests are conducted to optimize the performance of the system, with specific efforts made to improve its computational efficiency. The second generation CMORPH test products are compared to the first generation CMORPH and ground observations. Detailed results will be reported at the EGU.
Oliveira, Roberta B; Pereira, Aledir S; Tavares, João Manuel R S
2017-10-01
The number of deaths worldwide due to melanoma has risen in recent times, in part because melanoma is the most aggressive type of skin cancer. Computational systems have been developed to assist dermatologists in early diagnosis of skin cancer, or even to monitor skin lesions. However, there still remains a challenge to improve classifiers for the diagnosis of such skin lesions. The main objective of this article is to evaluate different ensemble classification models based on input feature manipulation to diagnose skin lesions. Input feature manipulation processes are based on feature subset selections from shape properties, colour variation and texture analysis to generate diversity for the ensemble models. Three subset selection models are presented here: (1) a subset selection model based on specific feature groups, (2) a correlation-based subset selection model, and (3) a subset selection model based on feature selection algorithms. Each ensemble classification model is generated using an optimum-path forest classifier and integrated with a majority voting strategy. The proposed models were applied on a set of 1104 dermoscopic images using a cross-validation procedure. The best results were obtained by the first ensemble classification model that generates a feature subset ensemble based on specific feature groups. The skin lesion diagnosis computational system achieved 94.3% accuracy, 91.8% sensitivity and 96.7% specificity. The input feature manipulation process based on specific feature subsets generated the greatest diversity for the ensemble classification model with very promising results. Copyright © 2017 Elsevier B.V. All rights reserved.
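The ensemble construction can be illustrated generically: train one member per feature subset and combine predictions by majority vote. The sketch below substitutes a generic classifier and synthetic features for the paper's optimum-path forest and dermoscopic features:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Input-feature-manipulation ensemble with majority voting. Feature groups are
# hypothetical stand-ins for the paper's shape/colour/texture subsets.
X, y = make_classification(n_samples=300, n_features=12, random_state=0)
subsets = [slice(0, 4), slice(4, 8), slice(8, 12)]   # "shape", "colour", "texture"

members = [LogisticRegression(max_iter=1000).fit(X[:, s], y) for s in subsets]
votes = np.stack([m.predict(X[:, s]) for m, s in zip(members, subsets)])
majority = (votes.sum(axis=0) >= 2).astype(int)      # majority of three binary votes
print("ensemble training accuracy:", (majority == y).mean())
```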
Electrochemical advanced oxidation processes: today and tomorrow. A review.
Sirés, Ignasi; Brillas, Enric; Oturan, Mehmet A; Rodrigo, Manuel A; Panizza, Marco
2014-01-01
In recent years, new advanced oxidation processes based on the electrochemical technology, the so-called electrochemical advanced oxidation processes (EAOPs), have been developed for the prevention and remediation of environmental pollution, especially focusing on water streams. These methods are based on the electrochemical generation of a very powerful oxidizing agent, such as the hydroxyl radical ((•)OH) in solution, which is then able to destroy organics up to their mineralization. EAOPs include heterogeneous processes like anodic oxidation and photoelectrocatalysis methods, in which (•)OH are generated at the anode surface either electrochemically or photochemically, and homogeneous processes like electro-Fenton, photoelectro-Fenton, and sonoelectrolysis, in which (•)OH are produced in the bulk solution. This paper presents a general overview of the application of EAOPs on the removal of aqueous organic pollutants, first reviewing the most recent works and then looking to the future. A global perspective on the fundamentals and experimental setups is offered, and laboratory-scale and pilot-scale experiments are examined and discussed.
Matsuoka, Kenichi; Albrecht, Ken; Yamamoto, Kimihisa; Fujita, Katsuhiko
2017-01-01
Thermally activated delayed fluorescence (TADF) materials emerged as promising light sources in third generation organic light-emitting diodes (OLED). Much effort has been invested for the development of small molecular TADF materials and vacuum process-based efficient TADF-OLEDs. In contrast, a limited number of solution processable high-molecular weight TADF materials toward low cost, large area, and scalable manufacturing of solution processed TADF-OLEDs have been reported so far. In this context, we report benzophenone-core carbazole dendrimers (GnB, n = generation) showing TADF and aggregation-induced emission enhancement (AIEE) properties along with alcohol resistance enabling further solution-based lamination of organic materials. The dendritic structure was found to play an important role for both TADF and AIEE activities in the neat films. By using these multifunctional dendritic emitters as non-doped emissive layers, OLED devices with fully solution processed organic multilayers were successfully fabricated and achieved maximum external quantum efficiency of 5.7%. PMID:28139768
Fast in-situ tool inspection based on inverse fringe projection and compact sensor heads
NASA Astrophysics Data System (ADS)
Matthias, Steffen; Kästner, Markus; Reithmeier, Eduard
2016-11-01
Inspection of machine elements is an important task in production processes in order to ensure the quality of produced parts and to gather feedback for the continuous improvement process. A new measuring system is presented, which is capable of performing the inspection of critical tool geometries, such as gearing elements, inside the forming machine. To meet the constraints on sensor head size and inspection time imposed by the limited space inside the machine and the cycle time of the process, the measuring device employs a combination of endoscopy techniques with the fringe projection principle. Compact gradient index lenses enable a compact design of the sensor head, which is connected to a CMOS camera and a flexible micro-mirror based projector via flexible fiber bundles. Using common fringe projection patterns, the system achieves measuring times of less than five seconds. To further reduce the time required for inspection, the generation of inverse fringe projection patterns has been implemented for the system. Inverse fringe projection speeds up the inspection process by employing object-adapted patterns, which enable the detection of geometry deviations in a single image. Two different approaches to generate object adapted patterns are presented. The first approach uses a reference measurement of a manufactured tool master to generate the inverse pattern. The second approach is based on a virtual master geometry in the form of a CAD file and a ray-tracing model of the measuring system. Virtual modeling of the measuring device and inspection setup allows for geometric tolerancing for free-form surfaces by the tool designer in the CAD-file. A new approach is presented, which uses virtual tolerance specifications and additional simulation steps to enable fast checking of metric tolerances. Following the description of the pattern generation process, the image processing steps required for inspection are demonstrated on captures of gearing geometries.
Automated lattice data generation
NASA Astrophysics Data System (ADS)
Ayyar, Venkitesh; Hackett, Daniel C.; Jay, William I.; Neil, Ethan T.
2018-03-01
The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done "by hand". In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.
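The essence of such a workflow manager is dependency-ordered dispatch. The sketch below is a generic toy scheduler, not Taxi's actual API, showing measurement tasks waiting on the configuration-generation tasks they depend on:

```python
from collections import deque

# Toy dependency-ordered dispatcher (assumes the task graph is acyclic).
# Task names and bodies are hypothetical placeholders for lattice jobs.
tasks = {
    "gen_cfg_0": {"deps": [],            "run": lambda: print("generate cfg 0")},
    "gen_cfg_1": {"deps": ["gen_cfg_0"], "run": lambda: print("generate cfg 1")},
    "measure_0": {"deps": ["gen_cfg_0"], "run": lambda: print("measure cfg 0")},
    "measure_1": {"deps": ["gen_cfg_1"], "run": lambda: print("measure cfg 1")},
}

done, queue = set(), deque(tasks)
while queue:
    name = queue.popleft()
    if all(d in done for d in tasks[name]["deps"]):
        tasks[name]["run"]()
        done.add(name)
    else:
        queue.append(name)    # requeue until its dependencies have finished
```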
Evaluating the Process of Generating a Clinical Trial Protocol
Franciosi, Lui G.; Butterfield, Noam N.; MacLeod, Bernard A.
2002-01-01
The research protocol is the principal document in the conduct of a clinical trial. Its generation requires knowledge about the research problem, the potential experimental confounders, and the relevant Good Clinical Practices for conducting the trial. However, such information is not always available to authors during the writing process. A checklist of over 80 items has been developed to better understand the considerations made by authors in generating a protocol. It is based on the most cited requirements for designing and implementing the randomised controlled trial. Items are categorised according to the trial's research question, experimental design, statistics, ethics, and standard operating procedures. This quality assessment tool evaluates the extent to which a generated protocol deviates from the best-planned clinical trial.
Wide-bandgap III-Nitride based Second Harmonic Generation
2014-10-02
[Fragmentary figure captions only: Fig. 1, three-step fabrication process of a GaN-based lateral polar structure (LPS): (a) growth of a 20 nm AlN buffer layer, followed by etching of the LT-AlN stripes. Fig. 2, AFM images of (a) KOH- and (b) RIE-patterned templates for lateral growth. Fig. 3, growth process of AlGaN-based lateral polar structures: (a) RIE patterning, (b) growth of HT-AlN.]
Automation of route identification and optimisation based on data-mining and chemical intuition.
Lapkin, A A; Heer, P K; Jacob, P-M; Hutchby, M; Cunningham, W; Bull, S D; Davidson, M G
2017-09-21
Data-mining of Reaxys and network analysis of the combined literature and in-house reaction sets were used to generate multiple possible reaction routes to convert a bio-waste feedstock, limonene, into a pharmaceutical API, paracetamol. The network analysis of the data provides a rich knowledge-base for generation of the initial reaction screening and development programme. Based on the literature and the in-house data, an overall flowsheet for the conversion of limonene to paracetamol was proposed. Each individual reaction-separation step in the sequence was simulated as a combination of continuous flow and batch steps. The linear model generation methodology allowed us to identify the reaction steps requiring further chemical optimisation. The generated model can be used for global optimisation and for generation of environmental and other performance indicators, such as cost indicators. A further challenge identified is to automate model generation so as to evolve optimal multi-step chemical routes and optimal process configurations.
Domain Decomposition By the Advancing-Partition Method for Parallel Unstructured Grid Generation
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar Z.; Zagaris, George
2009-01-01
A new method of domain decomposition has been developed for generating unstructured grids in subdomains either sequentially or using multiple computers in parallel. Domain decomposition is a crucial and challenging step for parallel grid generation. Prior methods are generally based on auxiliary, complex, and computationally intensive operations for defining partition interfaces and usually produce grids of lower quality than those generated in single domains. The new technique, referred to as "Advancing Partition," is based on the Advancing-Front method, which partitions a domain as part of the volume mesh generation in a consistent and "natural" way. The benefits of this approach are: 1) the process of domain decomposition is highly automated, 2) partitioning of the domain does not compromise the quality of the generated grids, and 3) the computational overhead for domain decomposition is minimal. The new method has been implemented in NASA's unstructured grid generation code VGRID.
NASA Astrophysics Data System (ADS)
Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.
2016-11-01
Modelling the family of surfaces associated with a pair of rolling centrodes, when the rack-gear tooth profile is known from direct measurement as a coordinate matrix, has as its goal determining the generating quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the geometrical generating error as a component of the total error. Modelling the generation process allows the potential errors of the generating tool to be highlighted, so that its profile can be corrected before the tool is used in the machining process. A method developed in CATIA is proposed, based on a new approach, namely the method of "relative generating trajectories". The analytical foundations are presented, together with applications to known models of rack-gear type tools used on Maag gear-cutting machines.
Visual form predictions facilitate auditory processing at the N1.
Paris, Tim; Kim, Jeesun; Davis, Chris
2017-02-20
Auditory-visual (AV) events often involve a leading visual cue (e.g. auditory-visual speech) that allows the perceiver to generate predictions about the upcoming auditory event. Electrophysiological evidence suggests that when an auditory event is predicted, processing is sped up, i.e., the N1 component of the ERP occurs earlier (N1 facilitation). However, it is not clear (1) whether N1 facilitation is based specifically on predictive processing rather than on multisensory integration and (2) which particular properties of the visual cue it is based on. The current experiment used artificial AV stimuli in which visual cues predicted but did not co-occur with auditory cues. Visual form cues (high and low salience) and the auditory-visual pairing were manipulated so that auditory predictions could be based on form and timing or on timing only. The results showed that N1 facilitation occurred only for combined form and temporal predictions. These results suggest that faster auditory processing (as indicated by N1 facilitation) is based on predictive processing generated by a visual cue that clearly predicts both what and when the auditory stimulus will occur. Copyright © 2016. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Biermann, D.; Gausemeier, J.; Heim, H.-P.; Hess, S.; Petersen, M.; Ries, A.; Wagner, T.
2014-05-01
In this contribution a framework for the computer-aided planning and optimisation of functionally graded components is presented. The framework is divided into three modules: the "Component Description", the "Expert System" for the synthesis of several process chains, and the "Modelling and Process Chain Optimisation". The Component Description module enhances a standard computer-aided design (CAD) model with a voxel-based representation of the graded properties. The Expert System synthesises process steps stored in the knowledge base to generate several alternative process chains. Each process chain is capable of producing components according to the enhanced CAD model and usually consists of a sequence of heating, cooling, and forming processes. The dependencies between the component and the applied manufacturing processes, as well as between the processes themselves, need to be considered. The Expert System utilises an ontology for that purpose. The ontology represents all dependencies in a structured way and connects the information of the knowledge base via relations. The third module performs the evaluation of the generated process chains. To accomplish this, the parameters of each process are optimised with respect to the component specification, and the result of the best parameterisation is used as the representative value. Finally, the process chain which is capable of manufacturing a functionally graded component optimally with regard to the property distributions of the component description is presented by means of a dedicated specification technique.
Cornelius, Judith B; Xiong, Pa H
2015-07-01
This study assessed generational differences in the sexual communication process between 40 African American parent and 40 grandparent caregivers of adolescent children. The study reports findings from a secondary analysis of data from two databases. The HIV Risk Reduction Survey was used to examine the sexual communication process. Grandparents wanted to talk about sex and had open sexual communication, while parents valued sexual abstinence and had limited communication. Based on the findings, healthcare providers and programs need to recognize that differences do exist between parents and grandparents in sexual communication. © 2015, Wiley Periodicals, Inc.
Zhang, Fangzheng; Guo, Qingshui; Pan, Shilong
2017-10-23
Real-time and high-resolution target detection is highly desirable in modern radar applications. Electronic techniques have encountered grave difficulties in the development of such radars, which strictly rely on a large instantaneous bandwidth. In this article, a photonics-based real-time high-range-resolution radar is proposed with optical generation and processing of broadband linear frequency modulation (LFM) signals. A broadband LFM signal is generated in the transmitter by photonic frequency quadrupling, and the received echo is de-chirped to a low-frequency signal by photonic frequency mixing. The system can operate at a high frequency and a large bandwidth while enabling real-time processing by low-speed analog-to-digital conversion and digital signal processing. A conceptual radar is established. Real-time processing of an 8-GHz LFM signal is achieved with a sampling rate of 500 MSa/s. Accurate distance measurement is implemented with a maximum error of 4 mm within a range of ~3.5 meters. Detection of two targets is demonstrated with a range resolution as high as 1.875 cm. We believe the proposed radar architecture is a reliable solution to overcome the limitations of current radars on operation bandwidth and processing speed, and it may hopefully be used in future radars for real-time and high-resolution target detection and imaging.
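The quoted figures are consistent with the textbook de-chirp relations for LFM radar. A quick numerical check in Python (the chirp duration T is an assumed value, not stated in the abstract):

```python
# Range resolution of an LFM radar and the beat frequency after de-chirping.
c = 3e8                     # speed of light, m/s
B = 8e9                     # LFM bandwidth, Hz (after photonic frequency quadrupling)

print(c / (2 * B))          # 0.01875 m = 1.875 cm, matching the reported resolution

# After de-chirping, a target at range R appears as a beat tone f_b = 2*B*R/(c*T).
T = 10e-6                   # assumed chirp duration, s (not given in the abstract)
R = 3.5                     # target range, m
f_b = 2 * B * R / (c * T)
print(f_b / 1e6, "MHz")     # ~18.7 MHz, well below the 250 MHz Nyquist limit of 500 MSa/s
```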
Elaborative Retrieval: Do Semantic Mediators Improve Memory?
ERIC Educational Resources Information Center
Lehman, Melissa; Karpicke, Jeffrey D.
2016-01-01
The elaborative retrieval account of retrieval-based learning proposes that retrieval enhances retention because the retrieval process produces the generation of semantic mediators that link cues to target information. We tested 2 assumptions that form the basis of this account: that semantic mediators are more likely to be generated during…
Design Course for Micropower Generation Devices
ERIC Educational Resources Information Center
Mitsos, Alexander
2009-01-01
A project-based design course is developed for man-portable power generation via microfabricated fuel cell systems. The targeted audience is undergraduate chemical/process engineering students in their final year. The course covers 6 weeks, with three hours of lectures per week. Two alternative projects are developed, one focusing on selection of…
Accelerator Generation and Thermal Separation (AGATS) of Technetium-99m
Grover, Blaine
2018-05-01
Accelerator Generation and Thermal Separation (AGATS) of Technetium-99m is a linear electron accelerator-based technology for producing medical imaging radioisotopes from a separation process that heats, vaporizes and condenses the desired radioisotope.
Language Evolution by Iterated Learning with Bayesian Agents
ERIC Educational Resources Information Center
Griffiths, Thomas L.; Kalish, Michael L.
2007-01-01
Languages are transmitted from person to person and generation to generation via a process of iterated learning: people learn a language from other people who once learned that language themselves. We analyze the consequences of iterated learning for learning algorithms based on the principles of Bayesian inference, assuming that learners compute…
Computer-Based Arithmetic Test Generation
ERIC Educational Resources Information Center
Trocchi, Robert F.
1973-01-01
The computer can be a welcome partner in the instructional process, but only if there is man-machine interaction. Man should not compromise system design because of available hardware; the computer must fit the system design for the result to represent an acceptable solution to instructional technology. The Arithmetic Test Generator system fits…
Linguistically Motivated Features for CCG Realization Ranking
ERIC Educational Resources Information Center
Rajkumar, Rajakrishnan
2012-01-01
Natural Language Generation (NLG) is the process of generating natural language text from an input, which is a communicative goal and a database or knowledge base. Informally, the architecture of a standard NLG system consists of the following modules (Reiter and Dale, 2000): content determination, sentence planning (or microplanning) and surface…
Naturally p-Hydroxybenzoylated Lignins in Palms
Fachuang Lu; Steven D. Karlen; Matt Regner; Hoon Kim; Sally A. Ralph; Run-Cang Sun; Ken-ichi Kuroda; Mary Ann Augustin; Raymond Mawson; Henry Sabarez; Tanoj Singh; Gerardo Jimenez-Monteon; Sarani Zakaria; Stefan Hill; Philip J. Harris; Wout Boerjan; Curtis G. Wilkerson; Shawn D. Mansfield; John Ralph
2015-01-01
The industrial production of palm oil concurrently generates a substantial amount of empty fruit bunch (EFB) fibers that could be used as a feedstock in a lignocellulose-based biorefinery. Lignin byproducts generated by this process may offer opportunities for the isolation of value-added products, such as p-hydroxybenzoate (pBz),...
Robo-Sensei's NLP-Based Error Detection and Feedback Generation
ERIC Educational Resources Information Center
Nagata, Noriko
2009-01-01
This paper presents a new version of Robo-Sensei's NLP (Natural Language Processing) system which updates the version currently available as the software package "ROBO-SENSEI: Personal Japanese Tutor" (Nagata, 2004). Robo-Sensei's NLP system includes a lexicon, a morphological generator, a word segmentor, a morphological parser, a syntactic…
NASA Astrophysics Data System (ADS)
Altuna, F. I.; Antonacci, J.; Arenas, G. F.; Pettarin, V.; Hoppe, C. E.; Williams, R. J. J.
2016-04-01
Green laser irradiation successfully activated self-healing processes in epoxy-acid networks modified with low amounts of gold nanoparticles (NPs). A bio-based polymer matrix, obtained by crosslinking epoxidized soybean oil (ESO) with an aqueous citric acid (CA) solution, was self-healed through molecular rearrangements produced by transesterification reactions of β-hydroxyester groups generated in the polymerization reaction. The temperature increase required to trigger these thermally activated reactions was attained by green light irradiation of the damaged area. The compressive force needed to ensure good contact between the crack faces was achieved by the volume dilatation generated by the same temperature rise. Gold NPs dispersed in the polymer efficiently generated heat in the presence of electromagnetic radiation under plasmon resonance, acting as nanometric heating sources and allowing remote activation of self-healing in the crosslinked polymer.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Junge, D.C.
1978-12-01
Significant quantities of wood residue fuels are presently being used in industrial steam generating facilities. Recent studies indicate that substantial additional quantities of wood residue fuels are available for energy generation in the form of steam and/or electricity. A limited data base on the combustion characteristics of wood residue fuels has resulted in the installation and operation of inefficient combustion systems for these fuels. This investigation of the combustion characteristics of wood residue fuels was undertaken to provide a data base which could be used to optimize the combustion of such fuels. Optimization of the combustion process in industrial boilers serves to improve combustion efficiency and to reduce air pollutant emissions generated in the combustion process. Data are presented on the combustion characteristics of eastern white pine bark mixed with Douglas fir planer shavings.
NASA Astrophysics Data System (ADS)
Karnatak, H.; Pandey, K.; Oberai, K.; Roy, A.; Joshi, D.; Singh, H.; Raju, P. L. N.; Krishna Murthy, Y. V. N.
2014-11-01
National Biodiversity Characterization at Landscape Level, a project jointly sponsored by the Department of Biotechnology and the Department of Space, was implemented to identify and map the potential biodiversity-rich areas in India. This project has generated spatial information at three levels: satellite-based primary information (vegetation type map, spatial locations of roads and villages, fire occurrence); geospatially derived or modelled information (disturbance index, fragmentation, biological richness); and geospatially referenced field sample plots. The study provides information on areas of high disturbance and high biological richness, suggesting future management strategies and informing action plans. The study has generated for the first time a baseline database for India which will be a valuable input to climate change studies in the Indian subcontinent. The spatial data generated during the study are organized as a central data repository in a Geo-RDBMS environment using PostgreSQL and PostGIS. The raster and vector data are published as OGC WMS and WFS standards for the development of a web-based geoinformation system using a Service Oriented Architecture (SOA). The WMS- and WFS-based system allows geo-visualization, online query and map output generation based on user request and response. This is a typical mashup-architecture-based geoinformation system which allows access to remote web services such as ISRO Bhuvan, OpenStreetMap and Google Maps, overlaid on the biodiversity data, for effective study of bio-resources. Spatial queries and analysis on vector data are achieved through SQL queries on PostGIS and WFS-T operations. The most important challenge, however, is to develop a system for online raster-based geospatial analysis and processing over a user-defined Area of Interest (AOI) for large raster data sets: the map data of this study comprise five layers of approximately 20 GB each. An attempt has been made to develop a system using Python, PostGIS and PHP for raster data analysis over the web for biodiversity conservation and prioritization. The developed system takes user input as WKT, OpenLayers-based polygon geometry or an uploaded shapefile as the AOI, and performs raster operations using Python and GDAL/OGR. Intermediate products are stored in temporary files and tables, which generate XML outputs for web representation. Raster operations such as clip-zip-ship, class-wise area statistics, single- to multi-layer operations, diagrammatic representation and other geostatistical analyses are performed. This is an indigenous geospatial data-processing engine developed on an open-system architecture for the spatial analysis of biodiversity data sets in an Internet GIS environment. The performance of this application in a multi-user environment such as the Internet domain is another challenging task, which is addressed by fine-tuning the source code, server hardening, spatial indexing and running the process in load-balanced mode. The developed system is hosted in the Internet domain (http://bis.iirs.gov.in) for user access.
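As a rough sketch of the raster "clip-zip-ship" step described above, the following Python/GDAL fragment clips a class raster to a user-supplied AOI polygon and computes class-wise area statistics. The function and file names are hypothetical; only the GDAL/NumPy calls themselves are standard.

```python
from osgeo import gdal
import numpy as np

def clip_and_summarise(raster_path, aoi_path, out_path):
    """Clip a class raster to an AOI polygon and return per-class areas."""
    ds = gdal.Warp(out_path, raster_path,
                   cutlineDSName=aoi_path,     # vector file holding the AOI polygon
                   cropToCutline=True,
                   dstNodata=0)
    arr = ds.GetRasterBand(1).ReadAsArray()
    gt = ds.GetGeoTransform()
    cell_area = abs(gt[1] * gt[5])             # pixel area in map units squared
    classes, counts = np.unique(arr[arr != 0], return_counts=True)
    return {int(c): float(n) * cell_area for c, n in zip(classes, counts)}

# e.g. stats = clip_and_summarise("biological_richness.tif", "aoi.shp", "clip.tif")
```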
Nanosecond pulsed laser generation of holographic structures on metals
NASA Astrophysics Data System (ADS)
Wlodarczyk, Krystian L.; Ardron, Marcus; Weston, Nick J.; Hand, Duncan P.
2016-03-01
A laser-based process for the generation of phase holographic structures directly on the surface of metals is presented. This process uses 35 ns laser pulses at a wavelength of 355 nm to generate optically smooth surface deformations on a metal. The laser-induced surface deformations (LISDs) are produced either by localized laser melting or by a combination of melting and evaporation. The geometry (shape and dimensions) of the LISDs depends on the laser processing parameters, in particular the pulse energy, as well as on the chemical composition of the metal. In this paper, we explain the mechanism of LISD formation on various metals, such as stainless steel, pure nickel and nickel-chromium Inconel® alloys. In addition, we provide information about the design and fabrication process of the phase holographic structures and demonstrate their use as robust markings for the identification and traceability of high-value metal goods.
On storm movement and its applications
NASA Astrophysics Data System (ADS)
Niemczynowicz, Janusz
Rainfall-runoff models applicable to the design and analysis of sewage systems in urban areas are being further developed in order to better represent the different physical processes occurring on an urban catchment. However, one important part of the modelling procedure, the generation of the rainfall input, is still a weak point. The main problem is the lack of adequate rainfall data representing the temporal and spatial variations of the natural rainfall process. Storm movement is a natural phenomenon which influences urban runoff. However, rainfall movement and its influence on the runoff generation process are not represented in presently available urban runoff simulation models. A physical description of rainfall movement and its parameters is given, based on detailed measurements performed with twelve gauges in Lund, Sweden. The paper discusses the significance of rainfall movement for the runoff generation process and suggests how rainfall movement parameters may be used in runoff modelling.
Routes to the past: neural substrates of direct and generative autobiographical memory retrieval.
Addis, Donna Rose; Knapp, Katie; Roberts, Reece P; Schacter, Daniel L
2012-02-01
Models of autobiographical memory propose two routes to retrieval depending on cue specificity. When available cues are specific and personally-relevant, a memory can be directly accessed. However, when available cues are generic, one must engage a generative retrieval process to produce more specific cues to successfully access a relevant memory. The current study sought to characterize the neural bases of these retrieval processes. During functional magnetic resonance imaging (fMRI), participants were shown personally-relevant cues to elicit direct retrieval, or generic cues (nouns) to elicit generative retrieval. We used spatiotemporal partial least squares to characterize the spatial and temporal characteristics of the networks associated with direct and generative retrieval. Both retrieval tasks engaged regions comprising the autobiographical retrieval network, including hippocampus, and medial prefrontal and parietal cortices. However, some key neural differences emerged. Generative retrieval differentially recruited lateral prefrontal and temporal regions early on during the retrieval process, likely supporting the strategic search operations and initial recovery of generic autobiographical information. However, many regions were activated more strongly during direct versus generative retrieval, even when we time-locked the analysis to the successful recovery of events in both conditions. This result suggests that there may be fundamental differences between memories that are accessed directly and those that are recovered via the iterative search and retrieval process that characterizes generative retrieval. Copyright © 2011 Elsevier Inc. All rights reserved.
Routes to the past: Neural substrates of direct and generative autobiographical memory retrieval
Addis, Donna Rose; Knapp, Katie; Roberts, Reece P.; Schacter, Daniel L.
2011-01-01
Models of autobiographical memory propose two routes to retrieval depending on cue specificity. When available cues are specific and personally-relevant, a memory can be directly accessed. However, when available cues are generic, one must engage a generative retrieval process to produce more specific cues to successfully access a relevant memory. The current study sought to characterize the neural bases of these retrieval processes. During functional magnetic resonance imaging (fMRI), participants were shown personally-relevant cues to elicit direct retrieval, or generic cues (nouns) to elicit generative retrieval. We used spatiotemporal partial least squares to characterize the spatial and temporal characteristics of the networks associated with direct and generative retrieval. Both retrieval tasks engaged regions comprising the autobiographical retrieval network, including hippocampus, and medial prefrontal and parietal cortices. However, some key neural differences emerged. Generative retrieval differentially recruited lateral prefrontal and temporal regions early on during the retrieval process, likely supporting the strategic search operations and initial recovery of generic autobiographical information. However, many regions were activated more strongly during direct versus generative retrieval, even when we time-locked the analysis to the successful recovery of events in both conditions. This result suggests that there may be fundamental differences between memories that are accessed directly and those that are recovered via the iterative search and retrieval process that characterizes generative retrieval. PMID:22001264
NASA Astrophysics Data System (ADS)
Genxu, W.
2017-12-01
There is a lack of knowledge about how to quantify runoff generation and the hydrological processes operating in permafrost-dominated catchments. To understand the mechanisms of runoff generation in permafrost catchments, a typical headwater catchment with continuous permafrost on the Tibetan Plateau was monitored. A new approach is presented in this study to account for runoff processes during the spring thawing and autumn freezing periods, when runoff generation clearly differs from that of non-permafrost catchments. This approach introduces a soil-temperature-based water saturation function and modifies the soil water storage curve with a soil temperature threshold. The results show that surface-soil-thawing-induced saturation-excess runoff and subsurface interflow account for approximately 66-86% and 14-34% of total spring runoff, respectively, and that soil temperature significantly affects the runoff generation pattern, the runoff composition and the runoff coefficient as the active layer enlarges. The suprapermafrost groundwater discharge decreases exponentially as the active layer freezes during the autumn runoff recession, whereas the ratio of groundwater discharge to total runoff and the direct surface runoff coefficient simultaneously increase. The bidirectional freezing of the active layer controls and changes the autumn runoff processes and runoff composition. The new approach could be used to further develop hydrological models of cold regions dominated by permafrost.
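The abstract names a soil-temperature-based saturation function and a temperature threshold but does not publish their exact form; one plausible reading is a simple ramp on the thawed (liquid) fraction of soil water storage. The sketch below is purely illustrative, with invented parameter values.

```python
# Hypothetical ramp form of the soil-temperature-based saturation function.
def thawed_fraction(t_soil, t_threshold=0.0, width=1.5):
    """Fraction of soil water storage available for runoff generation."""
    x = (t_soil - t_threshold) / width         # width (deg C) sets the ramp steepness
    return min(1.0, max(0.0, x))

def active_storage(s_max, t_soil):
    """Soil water storage curve modified by the soil temperature threshold."""
    return s_max * thawed_fraction(t_soil)

for t in (-2.0, -0.5, 0.5, 1.0, 2.0):
    print(t, active_storage(100.0, t))         # storage in mm grows as the soil thaws
```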
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2014-01-01
This report presents an example of the application of multi-criteria decision analysis to the selection of an architecture for a safety-critical distributed computer system. The design problem includes constraints on minimum system availability and integrity, and the decision is based on the optimal balance of power, weight and cost. The analysis process includes the generation of alternative architectures, evaluation of individual decision criteria, and the selection of an alternative based on overall value. In the example presented here, iterative application of the quantitative evaluation process made it possible to deliberately generate an alternative architecture that is superior to all others regardless of the relative importance of cost.
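A minimal sketch of the kind of evaluation described: hard availability and integrity constraints filter the candidate architectures, and a weighted balance of power, weight and cost ranks the survivors. All names, numbers and weights below are hypothetical, not the report's data.

```python
# name, availability, integrity ok?, power (W), weight (kg), cost (k$)
alternatives = [
    ("dual-redundant",  0.99990, True,  120, 14, 310),
    ("triple-modular",  0.99999, True,  180, 20, 450),
    ("single-string",   0.99500, False,  70,  8, 150),
]
A_MIN = 0.9995                                  # minimum system availability

def value(power, weight, cost, w=(0.4, 0.3, 0.3)):
    # lower is better for all three criteria, so score their normalized inverses
    return w[0] * 200 / power + w[1] * 25 / weight + w[2] * 500 / cost

feasible = [a for a in alternatives if a[1] >= A_MIN and a[2]]
best = max(feasible, key=lambda a: value(*a[3:]))
print(best[0])                                  # 'dual-redundant' with these weights
```

Re-running the ranking across a sweep of weight vectors is one way to check whether an alternative dominates regardless of the relative importance of cost.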
Medicinal chemistry inspired fragment-based drug discovery.
Lanter, James; Zhang, Xuqing; Sui, Zhihua
2011-01-01
Lead generation can be a very challenging phase of the drug discovery process. The two principal methods for this stage of research are blind screening and rational design. Among the rational or semirational design approaches, fragment-based drug discovery (FBDD) has emerged as a useful tool for the generation of lead structures. It is particularly powerful as a complement to high-throughput screening approaches when the latter fail to yield viable hits for further development. Engagement of medicinal chemists early in the process can accelerate the progression of FBDD efforts by incorporating drug-friendly properties in the earliest stages of the design process. Medium-chain acyl-CoA synthetase 2b and ketohexokinase are chosen as examples to illustrate the importance of close collaboration among medicinal chemistry, crystallography, and modeling. Copyright © 2011 Elsevier Inc. All rights reserved.
Śliwińska, Anna; Burchart-Korol, Dorota; Smoliński, Adam
2017-01-01
This paper presents a life cycle assessment (LCA) of greenhouse gas emissions generated by a methanol and electricity co-production system based on coal gasification technology. The analysis focuses on polygeneration technologies from which two products are produced, and thus issues related to the allocation procedure for LCA are addressed in this paper. In the LCA, two methods were used: a 'system expansion' method based on two approaches, the 'avoided burdens' approach and 'direct system enlargement', and an 'allocation' method involving proportional partitioning based on physical relationships in the technological process. Cause-effect relationships in the analysed production process were identified, allowing allocation factors to be determined. The 'system expansion' method involved expanding the analysis to include five additional variants of electricity production technologies in Poland (alternative technologies). This method revealed the environmental consequences of implementing the analysed technologies. It was found that an LCA of polygeneration technologies based on the 'system expansion' method generates a more complete picture of environmental consequences than the 'allocation' method. The analysis also shows that the choice of alternative technologies used to generate the LCA results is crucial. Life cycle assessment was performed for the analysed, reference and variant alternative technologies. A comparative analysis was performed between the analysed technology of methanol and electricity co-production from coal gasification and a reference technology of methanol production from the natural gas reforming process. Copyright © 2016 Elsevier B.V. All rights reserved.
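The difference between the two treatments of co-products can be seen in a toy calculation; every number below is hypothetical and serves only to contrast the mechanics of 'allocation' against 'system expansion' with avoided burdens.

```python
total_ghg = 1000.0                       # kg CO2-eq emitted by the co-production plant
methanol_out, elec_out = 600.0, 400.0    # co-product outputs on a common (e.g. energy) basis

# 'Allocation': split the total burden by a factor derived from physical relationships.
f_methanol = methanol_out / (methanol_out + elec_out)
ghg_methanol_alloc = total_ghg * f_methanol                    # 600 kg CO2-eq

# 'System expansion' (avoided burdens): credit the system with the emissions an
# alternative stand-alone power technology would have produced for the same output.
alt_intensity = 0.9                      # kg CO2-eq per unit electricity, varies by alternative
ghg_methanol_expanded = total_ghg - elec_out * alt_intensity   # 640 kg CO2-eq
print(ghg_methanol_alloc, ghg_methanol_expanded)
```

Changing alt_intensity changes the expanded result directly, which is the mechanical reason the choice of alternative technology is crucial for 'system expansion' results.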
Subsurface damage distribution in the lapping process.
Wang, Zhuo; Wu, Yulie; Dai, Yifan; Li, Shengyi
2008-04-01
To systematically investigate the influence of lapping parameters on subsurface damage (SSD) depth and to characterize the damage comprehensively, the maximum depth and distribution of SSD generated in the optical lapping process were measured with the magnetorheological finishing wedge technique. A model of the interaction of adjacent indentations was then applied to interpret the maximum SSD depth. Finally, a lapping procedure based on the influence of lapping parameters on the material removal rate and SSD depth is proposed to improve lapping efficiency.
NASA Technical Reports Server (NTRS)
Baez, Marivell; Vickerman, Mary; Choo, Yung
2000-01-01
SmaggIce (Surface Modeling And Grid Generation for Iced Airfoils) is one of NASA's aircraft icing research codes developed at the Glenn Research Center. It is a software toolkit used in the process of aerodynamic performance prediction for iced airfoils. It includes tools which complement the 2D grid-based Computational Fluid Dynamics (CFD) process: geometry probing and surface preparation for gridding (smoothing and re-discretization of geometry). Future releases will also include support for all aspects of gridding: domain decomposition, perimeter discretization, and grid generation and modification.
Life cycle design and design management strategies in fashion apparel manufacturing
NASA Astrophysics Data System (ADS)
Tutia, R.; Mendes, FD; Ventura, A.
2017-10-01
The generation of solid textile waste in the process of garment development and production is an error that causes serious damage to the environment and must be minimized. The greatest volume of textile residue is generated by the cutting department, in the form of fabric parings and snips that are not used in the production process (MILAN et al, 2007). One way to conceive environmentally conscious new products is to adopt a methodology based on Life Cycle Design (LCD) and Design Management.
Molecular Diagnostics in Pathology: Time for a Next-Generation Pathologist?
Fassan, Matteo
2018-03-01
Context: Comprehensive molecular investigations of mainstream carcinogenic processes have led to the use of effective molecular targeted agents in most cases of solid tumors in clinical settings. Objective: To update readers regarding the evolving role of the pathologist in the therapeutic decision-making process and the introduction of next-generation technologies into pathology practice. Data Sources: Current literature on the topic, primarily sourced from the PubMed (National Center for Biotechnology Information, Bethesda, Maryland) database, was reviewed. Conclusions: Adequate evaluation of cytologic-based and tissue-based predictive diagnostic biomarkers largely depends on both proper pathologic characterization and customized processing of biospecimens. Moreover, increased requests for molecular testing have paralleled the recent, sharp decrease in the amount of tumor material available for analysis; this material currently comprises cytology specimens or, at minimum, small biopsies in most cases of metastatic/advanced disease. Traditional diagnostic pathology has been completely revolutionized by the introduction of next-generation technologies, which provide multigene, targeted mutational profiling, even in the most complex of clinical cases. Combining traditional and molecular knowledge, pathologists integrate the morphological, clinical, and molecular dimensions of a disease, leading to a proper diagnosis and, therefore, the most appropriate tailored therapy.
Ekemen, Zeynep; Ahmad, Zeeshan; Stride, Eleanor; Kaplan, David; Edirisinghe, Mohan
2013-05-13
Conventional fabrication techniques and structures employed in the design of silk fibroin (SF) based porous materials provide only limited control over pore size and require several processing stages. In this study, it is shown that, by utilizing electrohydrodynamic bubbling, not only can new hollow spherical structures of SF be formed in a single step by means of bubbles, but the resulting bubbles can serve as pore generators when dehydrated. The bubble characteristics can be controlled through simple adjustments to the processing parameters. Bubbles with diameters in the range of 240-1000 μm were fabricated in controlled fashion. FT-IR characterization confirmed that the rate of air infused during processing enhanced β-sheet packing in SF at higher flow rates. Dynamic mechanical analysis also demonstrated a correlation between air flow rate and film tensile strength. Results indicate that electrohydrodynamically generated SF and their composite bubbles can be employed as new tools to generate porous structures in a controlled manner with a range of potential applications in biocoatings and tissue engineering scaffolds.
A Neural Dynamic Model Generates Descriptions of Object-Oriented Actions.
Richter, Mathis; Lins, Jonas; Schöner, Gregor
2017-01-01
Describing actions entails that relations between objects are discovered. A pervasively neural account of this process requires that fundamental problems are solved: the neural pointer problem, the binding problem, and the problem of generating discrete processing steps from time-continuous neural processes. We present a prototypical solution to these problems in a neural dynamic model that comprises dynamic neural fields holding representations close to sensorimotor surfaces as well as dynamic neural nodes holding discrete, language-like representations. Making the connection between these two types of representations enables the model to describe actions as well as to perceptually ground movement phrases, all based on real visual input. We demonstrate how the dynamic neural processes autonomously generate the processing steps required to describe or ground object-oriented actions. By solving the fundamental problems of neural pointing, binding, and emergent discrete processing, the model may be a first but critical step toward a systematic neural processing account of higher cognition. Copyright © 2017 The Authors. Topics in Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.
All-Optical Control of Linear and Nonlinear Energy Transfer via the Zeno Effect
NASA Astrophysics Data System (ADS)
Guo, Xiang; Zou, Chang-Ling; Jiang, Liang; Tang, Hong X.
2018-05-01
Microresonator-based nonlinear processes are fundamental to applications including microcomb generation, parametric frequency conversion, and harmonics generation. While nonlinear processes involving either second-order (χ(2)) or third-order (χ(3)) nonlinearity have been extensively studied, the interaction between these two basic nonlinear processes has seldom been reported. In this paper we demonstrate a coherent interplay between second- and third-order nonlinear processes. The parametric (χ(2)) coupling to a lossy ancillary mode shortens the lifetime of the target photonic mode and suppresses its density of states, preventing the photon emissions into the target photonic mode via the Zeno effect. Such an effect is then used to control the stimulated four-wave mixing process and realize a suppression ratio of 34.5.
Graphics processing unit (GPU) real-time infrared scene generation
NASA Astrophysics Data System (ADS)
Christie, Chad L.; Gouthas, Efthimios (Themie); Williams, Owen M.
2007-04-01
VIRSuite, the GPU-based suite of software tools developed at DSTO for real-time infrared scene generation, is described. The tools include the painting of scene objects with radiometrically-associated colours, translucent object generation, polar plot validation and versatile scene generation. Special features include radiometric scaling within the GPU and the presence of zoom anti-aliasing at the core of VIRSuite. Extension of the zoom anti-aliasing construct to cover target embedding and the treatment of translucent objects is described.
Scenario management and automated scenario generation
NASA Astrophysics Data System (ADS)
McKeever, William; Gilmour, Duane; Lehman, Lynn; Stirtzinger, Anthony; Krause, Lee
2006-05-01
The military planning process utilizes simulation to determine the appropriate course of action (COA) that will achieve a campaign end state. However, due to the difficulty of developing and generating simulation-level COAs, only a few COAs are simulated. This may have been appropriate for traditional conflicts, but the evolution of warfare from attrition-based to effects-based strategies, as well as the complexities of 4th-generation warfare and asymmetric adversaries, has placed additional demands on military planners and simulation. To keep pace with this dynamic, changing environment, planners must be able to perform continuous, multiple, "what-if" COA analyses. Scenario management and generation are critical elements to achieving this goal. An effects-based scenario generation research project demonstrated the feasibility of automated scenario generation techniques which support multiple stove-pipe and emerging broad-scope simulations. This paper will discuss a case study in which the scenario generation capability was employed to support COA simulations to identify plan effectiveness. The study demonstrated the effectiveness of using multiple simulation runs to evaluate the effectiveness of alternate COAs in achieving the overall campaign (metrics-based) objectives. The paper will discuss how scenario generation technology can be employed to allow military commanders and mission planning staff to understand the impact of command decisions on the battlespace of tomorrow.
Design of virtual simulation experiment based on key events
NASA Astrophysics Data System (ADS)
Zhong, Zheng; Zhou, Dongbo; Song, Lingxiu
2018-06-01
Considering the complex content of and lack of guidance in virtual simulation experiments, key-event technology from VR narrative theory was introduced into virtual simulation experiments to enhance the fidelity and vividness of the process. Based on VR narrative technology, an event transition structure was designed to meet the needs of the experimental operation process, and an interactive event processing model was used to generate key events in the interactive scene. The experiment "margin value of bees foraging", based on biological morphology, was taken as an example, and its many objects, behaviors and other contents were reorganized. The result shows that this method can enhance the user's experience and ensure that the experimental process is complete and effective.
NASA Technical Reports Server (NTRS)
Cariapa, Vikram
1993-01-01
The trend in the modern global economy towards free market policies has motivated companies to use rapid prototyping technologies not only to reduce product development cycle time but also to maintain their competitive edge. A rapid prototyping technology is one which combines computer-aided design with computer-controlled tracking of a focused high-energy source (e.g., lasers, heat) over modern ceramic powders, metallic powders, plastics or photosensitive liquid resins in order to produce prototypes or models. At present, except for the process of shape melting, most rapid prototyping processes generate products that are only dimensionally similar to those of the desired end product. There is an urgent need, therefore, to enhance the understanding of the characteristics of these processes in order to realize their potential for production. Currently, the commercial market is dominated by four rapid prototyping processes, namely selective laser sintering, stereolithography, fused deposition modelling and laminated object manufacturing. This phase of the research has focused on the selective laser sintering and stereolithography rapid prototyping processes. A theoretical model for these processes is under development. Different rapid prototyping sites supplied test specimens (based on ASTM 638-84, Type I) that have been measured and tested to provide a database on surface finish, dimensional variation and ultimate tensile strength. Further plans call for developing and verifying the theoretical models by carefully designed experiments. This will be a joint effort between NASA and other prototyping centers to generate a larger database, thus encouraging more widespread usage by product designers.
Lessons Learned From Developing Three Generations of Remote Sensing Science Data Processing Systems
NASA Technical Reports Server (NTRS)
Tilmes, Curt; Fleig, Albert J.
2005-01-01
The Biospheric Information Systems Branch at NASA's Goddard Space Flight Center has developed three generations of Science Investigator-led Processing Systems for use with various remote sensing instruments. The first system is used for data from the MODIS instruments flown on NASA's Earth Observing System (EOS) Terra and Aqua spacecraft, launched in 1999 and 2002 respectively. The second generation is for the Ozone Monitoring Instrument flying on the EOS Aura spacecraft launched in 2004. We are now developing a third generation of the system for evaluation science data processing for the Ozone Mapping and Profiler Suite (OMPS) to be flown by the NPOESS Preparatory Project (NPP) in 2006. The initial system was based on large-scale proprietary hardware, operating and database systems. The current OMI system and the OMPS system being developed are based on commodity hardware, the Linux operating system and PostgreSQL, an open source RDBMS. The new system distributes its data archive across multiple server hosts and processes jobs on multiple processor boxes. We have created several instances of this system, including one for operational processing, one for testing and reprocessing and one for applications development and scientific analysis. Prior to receiving the first data from OMI we applied the system to reprocessing information from the Solar Backscatter Ultraviolet (SBUV) and Total Ozone Mapping Spectrometer (TOMS) instruments flown from 1978 until now. The system was able to process 25 years (108,000 orbits) of data and produce 800,000 files (400 GiB) of level 2 and level 3 products in less than a week. We will describe the lessons we have learned and the tradeoffs between system design, hardware, operating systems, operational staffing, user support and operational procedures. During each generational phase, the system has become more generic and reusable. While the system is not currently shrink-wrapped, we believe it is at the point where it could be readily adopted, with substantial cost savings, for other similar tasks.
Miró-Herrans, Aida T.; Al-Meeri, Ali; Mulligan, Connie J.
2014-01-01
Population migration has played an important role in human evolutionary history and in the patterning of human genetic variation. A deeper and empirically based understanding of human migration dynamics is needed in order to interpret genetic and archaeological evidence and to accurately reconstruct the prehistoric processes that comprise human evolutionary history. Current empirical estimates of migration include either short time frames (i.e. within one generation) or partial knowledge about migration, such as proportion of migrants or distance of migration. An analysis that includes the proportion of migrants as well as the distance and direction of migration over multiple generations would better inform prehistoric reconstructions. To evaluate human migration, we use GPS coordinates from the places of residence of the Yemeni individuals sampled in our study, their birthplaces and their parents' and grandparents' birthplaces to calculate the proportion of migrants, as well as the distance and direction of migration events between each generation. We test for differences in these values between the generations and identify factors that influence the probability of migration. Our results show that the proportion and distance of migration between females and males are similar within generations. In contrast, the proportion and distance of migration are significantly lower in the grandparents' generation, most likely reflecting the decreasing effect of technology. Based on our results, we calculate the proportion of migration events (0.102) and the mean and median distances of migration (96 km and 26 km) for the grandparents' generation to represent early times in human evolution. These estimates can serve to set parameter values of demographic models in model-based methods of prehistoric reconstruction, such as approximate Bayesian computation. Our study provides the first empirically based estimates of human migration over multiple generations in a developing country, and these estimates are intended to enable more precise reconstruction of the demographic processes that characterized human evolution. PMID:24759992
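The distance component of such an analysis reduces to great-circle distances between birthplace GPS coordinates; a standard haversine computation in Python (the two coordinate pairs are invented, not the study's data):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS points in kilometres."""
    R = 6371.0                              # mean Earth radius, km
    p1, p2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * R * asin(sqrt(a))

# e.g. distance between an individual's and a parent's (hypothetical) birthplaces
print(haversine_km(15.35, 44.21, 15.55, 48.52))   # a few hundred km
```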
ERIC Educational Resources Information Center
Abele, Stephan
2018-01-01
This article deals with a theory-based investigation of the diagnostic problem-solving process in professional contexts. To begin with, a theory of the diagnostic problem-solving process was developed drawing on findings from different professional contexts. The theory distinguishes between four sub-processes of the diagnostic problem-solving…
Erdoğdu, Utku; Tan, Mehmet; Alhajj, Reda; Polat, Faruk; Rokne, Jon; Demetrick, Douglas
2013-01-01
The availability of enough samples for effective analysis and knowledge discovery has been a challenge in the research community, especially in the area of gene expression data analysis. Thus, the approaches being developed for data analysis have mostly suffered from the lack of enough data to train and test the constructed models. We argue that the process of sample generation could be successfully automated by employing sophisticated machine learning techniques. An automated sample generation framework could successfully complement the actual sample generation from real cases. This argument is validated in this paper by describing a framework that integrates multiple models (perspectives) for sample generation. We illustrate its applicability for producing new gene expression data samples, a highly demanding area that has not received attention. The three perspectives employed in the process are based on models that are not closely related. This independence eliminates the bias of having the produced approach cover only certain characteristics of the domain and lead to samples skewed towards one direction. The first model is based on the Probabilistic Boolean Network (PBN) representation of the gene regulatory network underlying the given gene expression data. The second model integrates a Hierarchical Markov Model (HIMM), and the third model employs a genetic algorithm. Each model learns as much as possible about the characteristics of the domain being analysed and tries to incorporate the learned characteristics in generating new samples. In other words, the models base their analysis on domain knowledge implicitly present in the data itself. The developed framework has been extensively tested by checking how well the new samples complement the original samples. The produced results are very promising in showing the effectiveness, usefulness and applicability of the proposed multi-model framework.
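To make the first perspective concrete, the sketch below runs synchronous updates of a toy Probabilistic Boolean Network: at each step every gene samples one of its predictor functions according to its probability and fires it, and synthetic expression samples are read off the trajectory. The three-gene network and its probabilities are invented for illustration and are unrelated to the paper's models.

```python
import random

# per gene: list of (selection probability, Boolean predictor over the full state);
# the probabilities for each gene must sum to 1
predictors = {
    0: [(1.0, lambda s: s[1] and not s[2])],
    1: [(0.7, lambda s: s[0] or s[2]), (0.3, lambda s: s[2])],
    2: [(1.0, lambda s: not s[0])],
}

def pbn_step(state):
    """One synchronous PBN update of all genes."""
    new = []
    for gene in sorted(predictors):
        r, acc = random.random(), 0.0
        for p, f in predictors[gene]:
            acc += p
            if r <= acc:
                new.append(int(f(state)))
                break
    return tuple(new)

random.seed(0)
state = (1, 0, 1)
samples = []
for _ in range(5):                 # each step yields one synthetic sample
    state = pbn_step(state)
    samples.append(state)
print(samples)
```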
Pelaccia, Thierry; Tardif, Jacques; Triby, Emmanuel; Ammirati, Christine; Bertrand, Catherine; Dory, Valérie; Charlin, Bernard
2014-12-01
The ability to make a diagnosis is a crucial skill in emergency medicine. Little is known about the way emergency physicians reach a diagnosis. This study aims to identify how and when, during the initial patient examination, emergency physicians generate and evaluate diagnostic hypotheses. We carried out a qualitative research project based on semistructured interviews with emergency physicians. The interviews concerned management of an emergency situation during routine medical practice. They were associated with viewing the video recording of emergency situations filmed in an "own-point-of-view" perspective. The emergency physicians generated an average of 5 diagnostic hypotheses. Most of these hypotheses were generated before meeting the patient or within the first 5 minutes of the meeting. The hypotheses were then rank ordered within the context of a verification procedure based on identifying key information. These tasks were usually accomplished without conscious effort. No hypothesis was completely confirmed or refuted until the results of investigations were available. The generation and rank ordering of diagnostic hypotheses is based on the activation of cognitive processes, enabling expert emergency physicians to process environmental information and link it to past experiences. The physicians seemed to strive to avoid the risk of error by remaining aware of the possibility of alternative hypotheses as long as they did not have the results of investigations. Understanding the diagnostic process used by emergency physicians provides interesting ideas for training residents in a specialty in which the prevalence of reasoning errors leading to incorrect diagnoses is high. Copyright © 2014 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
Hybrid Cascading Outage Analysis of Extreme Events with Optimized Corrective Actions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vallem, Mallikarjuna R.; Vyakaranam, Bharat GNVSR; Holzer, Jesse T.
2017-10-19
Power systems are vulnerable to extreme contingencies (such as the outage of a major generating substation) that can cause significant generation and load loss and can lead to further cascading outages of other transmission facilities and generators in the system. Some cascading outages occur within minutes of a major contingency and may not be captured using dynamic simulation of the power system alone. Utilities plan for contingencies based on either dynamic or steady-state analysis separately, which may not accurately capture the impact of one process on the other. We address this gap in cascading outage analysis by developing the Dynamic Contingency Analysis Tool (DCAT), which can analyze hybrid dynamic and steady-state behavior of the power system, including protection system models in dynamic simulations, and simulate corrective actions in post-transient steady-state conditions. One of the important steady-state processes implemented is mimicking operator corrective actions to mitigate aggravated states caused by dynamic cascading. This paper presents an Optimal Power Flow (OPF) based formulation for selecting corrective actions that utility operators can take during a major contingency, thus automating the hybrid dynamic-steady-state cascading outage process. The improved DCAT framework with OPF-based corrective actions is demonstrated on the IEEE 300-bus test system.
Incremental terrain processing for large digital elevation models
NASA Astrophysics Data System (ADS)
Ye, Z.
2012-12-01
Efficient analyses of large digital elevation models (DEM) require generation of additional DEM artifacts such as flow direction, flow accumulation and other DEM derivatives. When the DEMs to analyze have a large number of grid cells (usually > 1,000,000,000), the generation of these DEM derivatives is either impractical (it takes too long) or impossible (software is incapable of processing such a large number of cells). Different strategies and algorithms can be put in place to alleviate this situation. This paper describes an approach where the overall DEM is partitioned into smaller processing units that can be efficiently processed. The processed DEM derivatives for each partition can then either be mosaicked back into a single large entity or managed at the partition level. For dendritic terrain morphologies, the way in which partitions are derived and the order in which they are processed depend on the river and catchment patterns. These patterns are not available until the flow pattern of the whole region is created, which in turn cannot be established upfront due to the size issues. This paper describes a procedure that solves this problem: (1) resample the original large DEM grid so that the total number of cells is reduced to a level for which the drainage pattern can be established; (2) run standard terrain preprocessing operations on the resampled DEM to generate the river and catchment system; (3) define the processing units and their processing order based on the river and catchment system created in step (2); (4) based on the processing order, apply the analysis (i.e., the flow accumulation operation) to each of the processing units on the full-resolution DEM; (5) as each processing unit is processed in the order defined in step (3), compare the resulting drainage pattern with the drainage pattern established at the coarser scale and adjust the drainage boundaries and rivers if necessary.
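Step (3) of the procedure amounts to a topological ordering of the processing units along the drainage network: a unit can be processed once every unit draining into it has been processed. A small Python sketch (the downstream map is a hypothetical output of step (2)):

```python
from collections import defaultdict, deque

# which catchment each catchment drains into (None = basin outlet)
downstream = {"A": "C", "B": "C", "C": "E", "D": "E", "E": None}

def processing_order(downstream):
    """Order units so all upstream contributors precede their receiver."""
    indeg = defaultdict(int)
    for u, d in downstream.items():
        indeg[u] += 0                       # make sure every unit is present
        if d is not None:
            indeg[d] += 1
    queue = deque(u for u in downstream if indeg[u] == 0)   # headwater units
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        d = downstream[u]
        if d is not None:
            indeg[d] -= 1
            if indeg[d] == 0:
                queue.append(d)
    return order

print(processing_order(downstream))         # ['A', 'B', 'D', 'C', 'E']
```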
Distributed processing method for arbitrary view generation in camera sensor network
NASA Astrophysics Data System (ADS)
Tehrani, Mehrdad P.; Fujii, Toshiaki; Tanimoto, Masayuki
2003-05-01
A camera sensor network, a recent advent of technology, is a network in which each sensor node can capture video signals, process them and communicate them with other nodes. The processing task in this network is to generate an arbitrary view, which can be requested by the central node or a user. To avoid unnecessary communication between nodes in the camera sensor network and to speed up the processing time, we have distributed the processing tasks between nodes. In this method, each sensor node processes part of the interpolation algorithm to generate the interpolated image with local communication between nodes. The processing task in the camera sensor network is ray-space interpolation, which is an object-independent method based on MSE minimization using adaptive filtering. Two methods are proposed for distributing the processing tasks and sharing image data locally: Fully Image Shared Decentralized Processing (FIS-DP) and Partially Image Shared Decentralized Processing (PIS-DP). Comparison of the proposed methods with Centralized Processing (CP) shows that FIS-DP has the highest processing speed, followed by PIS-DP, with CP the lowest. The communication rates of CP and PIS-DP are almost the same and better than that of FIS-DP. PIS-DP is therefore recommended because of its better overall performance than CP and FIS-DP.
Using Multiple Intelligences to Bridge the Educational Poverty Gap
ERIC Educational Resources Information Center
Goebel, Kym
2009-01-01
Students living in poverty have needs that are not being addressed in traditional classrooms. Students from "generational poverty" process information differently (Payne 1996): information is processed based on their living conditions and upbringing. Differentiating instruction using Howard Gardner's Multiple Intelligence theory…
Applying activity-based costing to healthcare settings.
Canby, J B
1995-02-01
Activity-based costing (ABC) focuses on processes that drive cost. By tracing healthcare activities back to events that generate cost, a more accurate measurement of financial performance is possible. This article uses ABC principles and techniques to determine costs associated with the x-ray process in a midsized outpatient clinic. The article also provides several tips for initiating an ABC cost system for an entire healthcare organization.
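As a toy illustration of the ABC principle described above (tracing a process's cost back to the activities that drive it), the following sketch prices an x-ray encounter; every activity, rate and driver volume is invented for illustration and none comes from the article.

```python
# Toy activity-based costing: total process cost = sum over activities of
# (cost driver rate) x (driver units consumed). All figures are illustrative.
activities = {  # activity -> (rate in $/driver unit, driver units consumed)
    "patient registration": (12.0, 1),   # per registration
    "film loading":         (3.5, 2),    # per film, 2 films
    "radiographer time":    (1.2, 15),   # per minute, 15 minutes
    "film development":     (4.0, 2),    # per film developed
    "radiologist reading":  (2.5, 10),   # per minute, 10 minutes
}

total = sum(rate * units for rate, units in activities.values())
for name, (rate, units) in activities.items():
    print(f"{name:22s} ${rate * units:7.2f}")
print(f"{'total x-ray process':22s} ${total:7.2f}")
```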
The applicability of a material-treatment laser pulse in non-destructive evaluations.
Hrovatin, R; Petkovsek, R; Diaci, J; Mozina, J
2006-12-22
A practical optodynamic study was performed to determine the usability of different lengths of laser pulses for the generation of ultrasonic transients in a solid material. The aim of the study was to evaluate the possibility of a dual use for a laser pulse (for laser material processing on the one hand, and for ultrasonic wave generation on the other), with both processes being combined on the same production line. The propagation of the laser-generated ultrasonic waves is evaluated by detecting and measuring with a PID-controlled stabilized interferometer. Thus, both systems provided the basic tools, the generation and detection of ultrasonic waves, for an ultrasonic, laser-based, non-destructive material evaluation. The ultrasonic transients generated by 'classical' nanosecond laser pulses were compared with the transients generated by industrial laser pulses with a duration of a few tenths of a microsecond. The experimental results are compared with the results of a time-of-flight analysis that also involved part of a mode-conversion analysis for both regimes in a layered material structure. The differences between the two waveforms were assessed in terms of their visibility, wavelength and resolution. The limit values were calculated and estimated for the laser-pulse parameters when such pulses are intended for use in an ultrasonic, laser-based, non-destructive evaluation. The possibility of using an industrial marking laser for laser ultrasound generation is thus demonstrated.
Secure web-based invocation of large-scale plasma simulation codes
NASA Astrophysics Data System (ADS)
Dimitrov, D. A.; Busby, R.; Exby, J.; Bruhwiler, D. L.; Cary, J. R.
2004-12-01
We present our design and initial implementation of a web-based system for running, both in parallel and serial, Particle-In-Cell (PIC) codes for plasma simulations with automatic post processing and generation of visual diagnostics.
GPU-based Efficient Realistic Techniques for Bleeding and Smoke Generation in Surgical Simulators
Halic, Tansel; Sankaranarayanan, Ganesh; De, Suvranu
2010-01-01
Background In actual surgery, smoke and bleeding due to cautery processes provide important visual cues to the surgeon, which have been proposed as factors in surgical skill assessment. While several virtual reality (VR)-based surgical simulators have incorporated effects of bleeding and smoke generation, they are not realistic due to the requirement of real-time performance. To be interactive, the visual update must be performed at at least 30 Hz and haptic (touch) information must be refreshed at 1 kHz. Simulation of smoke and bleeding is, therefore, either ignored or performed using highly simplified techniques, since other computationally intensive processes compete for the available CPU resources. Methods In this work, we develop a novel low-cost method to generate realistic bleeding and smoke in VR-based surgical simulators which outsources the computations to the graphics processing unit (GPU), thus freeing up the CPU for other time-critical tasks. This method is independent of the complexity of the organ models in the virtual environment. User studies were performed using 20 subjects to determine the visual quality of the simulations compared to real surgical videos. Results The smoke and bleeding simulations were implemented as part of a Laparoscopic Adjustable Gastric Banding (LAGB) simulator. For the bleeding simulation, the original implementation using the shader did not incur noticeable overhead. However, for smoke generation, an I/O (Input/Output) bottleneck was observed, and two different methods were developed to overcome this limitation. Based on our benchmark results, a buffered approach performed better than a pipelined approach and could support up to 15 video streams in real time. Human subject studies showed that the visual realism of the simulations was as good as in real surgery (median rating of 4 on a 5-point Likert scale). Conclusions Based on the performance results and the subject study, both the bleeding and smoke simulations were concluded to be efficient, highly realistic and well suited to VR-based surgical simulators. PMID:20878651
UXDs-Driven Transferring Method from TRIZ Solution to Domain Solution
NASA Astrophysics Data System (ADS)
Ma, Lihui; Cao, Guozhong; Chang, Yunxia; Wei, Zihui; Ma, Kai
The translation process from TRIZ solutions to domain solutions is an analogy-based process. TRIZ solutions, such as the 40 inventive principles and related cases, are medium solutions for domain problems. Unexpected discoveries (UXDs) are the key factors that trigger designers to generate new ideas for domain solutions. The algorithm of UXD resolving based on Means-Ends Analysis (MEA) is studied, and a UXDs-driven transferring method from TRIZ solutions to domain solutions is formed. A case study shows the application of the process.
Addressing and Presenting Quality of Satellite Data via Web-Based Services
NASA Technical Reports Server (NTRS)
Leptoukh, Gregory; Lynnes, C.; Ahmad, S.; Fox, P.; Zednik, S.; West, P.
2011-01-01
With the recent attention to climate change and the proliferation of remote-sensing data utilization, climate models and various environmental monitoring and protection applications have begun to rely increasingly on satellite measurements. Research application users seek good-quality satellite data, with uncertainties and biases provided for each data point. However, different communities address remote sensing quality issues rather inconsistently and differently. We describe our attempt to systematically characterize, capture, and provision quality and uncertainty information as it applies to the NASA MODIS Aerosol Optical Depth data product. In particular, we note the semantic differences in quality/bias/uncertainty at the pixel, granule, product, and record levels. We outline various factors contributing to the uncertainty or error budget. Web-based science analysis and processing tools allow users to access, analyze, and generate visualizations of data while relieving them of directly managing complex data processing operations. These tools provide value by streamlining the data analysis process, but usually shield users from the details of the data processing steps, algorithm assumptions, caveats, etc. Correct interpretation of the final analysis requires user understanding of how the data have been generated and processed and what potential biases, anomalies, or errors may have been introduced. By providing services that leverage data lineage provenance and domain expertise, expert systems can be built to aid the user in understanding data sources, processing, and the suitability for use of products generated by the tools. We describe our experiences developing a semantic, provenance-aware, expert-knowledge advisory system applied to the NASA Giovanni web-based Earth science data analysis tool as part of the ESTO AIST-funded Multi-sensor Data Synergy Advisor project.
2009-01-01
Background The identification of essential genes is important for the understanding of the minimal requirements for cellular life and for practical purposes, such as drug design. However, the experimental techniques for essential genes discovery are labor-intensive and time-consuming. Considering these experimental constraints, a computational approach capable of accurately predicting essential genes would be of great value. We therefore present here a machine learning-based computational approach relying on network topological features, cellular localization and biological process information for prediction of essential genes. Results We constructed a decision tree-based meta-classifier and trained it on datasets with individual and grouped attributes-network topological features, cellular compartments and biological processes-to generate various predictors of essential genes. We showed that the predictors with better performances are those generated by datasets with integrated attributes. Using the predictor with all attributes, i.e., network topological features, cellular compartments and biological processes, we obtained the best predictor of essential genes that was then used to classify yeast genes with unknown essentiality status. Finally, we generated decision trees by training the J48 algorithm on datasets with all network topological features, cellular localization and biological process information to discover cellular rules for essentiality. We found that the number of protein physical interactions, the nuclear localization of proteins and the number of regulating transcription factors are the most important factors determining gene essentiality. Conclusion We were able to demonstrate that network topological features, cellular localization and biological process information are reliable predictors of essential genes. Moreover, by constructing decision trees based on these data, we could discover cellular rules governing essentiality. PMID:19758426
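A minimal sketch of the paper's classification setup follows. The authors trained the J48 (C4.5) algorithm in Weka; scikit-learn's CART decision tree is used here as a stand-in, and the features and labels are synthetic placeholders shaped after the reported top predictors.

```python
# Stand-in for the decision-tree essentiality classifier described above.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.poisson(8, n),        # number of physical protein interactions
    rng.integers(0, 2, n),    # nuclear localization (0/1)
    rng.poisson(3, n),        # number of regulating transcription factors
])
# Synthetic labels loosely encoding the paper's finding that highly
# connected nuclear proteins are more often essential.
y = ((X[:, 0] > 10) & (X[:, 1] == 1)).astype(int)

clf = DecisionTreeClassifier(max_depth=4, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```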
Enhanced Product Generation at NASA Data Centers Through Grid Technology
NASA Technical Reports Server (NTRS)
Barkstrom, Bruce R.; Hinke, Thomas H.; Gavali, Shradha; Seufzer, William J.
2003-01-01
This paper describes how grid technology can support the ability of NASA data centers to provide customized data products. A combination of grid technology and commodity processors are proposed to provide the bandwidth necessary to perform customized processing of data, with customized data subsetting providing the initial example. This customized subsetting engine can be used to support a new type of subsetting, called phenomena-based subsetting, where data is subsetted based on its association with some phenomena, such as mesoscale convective systems or hurricanes. This concept is expanded to allow the phenomena to be detected in one type of data, with the subsetting requirements transmitted to the subsetting engine to subset a different type of data. The subsetting requirements are generated by a data mining system and transmitted to the subsetter in the form of an XML feature index that describes the spatial and temporal extent of the phenomena. For this work, a grid-based mining system called the Grid Miner is used to identify the phenomena and generate the feature index. This paper discusses the value of grid technology in facilitating the development of a high performance customized product processing and the coupling of a grid mining system to support phenomena-based subsetting.
Automated Tetrahedral Mesh Generation for CFD Analysis of Aircraft in Conceptual Design
NASA Technical Reports Server (NTRS)
Ordaz, Irian; Li, Wu; Campbell, Richard L.
2014-01-01
The paper introduces an automation process of generating a tetrahedral mesh for computational fluid dynamics (CFD) analysis of aircraft configurations in early conceptual design. The method was developed for CFD-based sonic boom analysis of supersonic configurations, but can be applied to aerodynamic analysis of aircraft configurations in any flight regime.
Modeling and Simulation of the Economics of Mining in the Bitcoin Market.
Cocco, Luisanna; Marchesi, Michele
2016-01-01
On January 3, 2009, Satoshi Nakamoto gave rise to the "Bitcoin Blockchain", creating the first block of the chain by hashing on his computer's central processing unit (CPU). Since then, the hash calculations to mine Bitcoin have been getting more and more complex, and consequently the mining hardware has evolved to adapt to this increasing difficulty. Three generations of mining hardware have followed the CPU generation: the GPU, FPGA and ASIC generations. This work presents an agent-based artificial market model of the Bitcoin mining process and of Bitcoin transactions. The goal of this work is to model the economics of the mining process, starting from the GPU generation, the first with economic significance. The model reproduces some "stylized facts" found in real price time series and some core aspects of the mining business. In particular, the computational experiments performed can reproduce the unit root property, the fat tail phenomenon and the volatility clustering of Bitcoin price series. In addition, under proper assumptions, they can reproduce the generation of Bitcoins, the hashing capability, the power consumption, and the mining hardware and electrical energy expenditures of the Bitcoin network.
Domain Decomposition By the Advancing-Partition Method
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar Z.
2008-01-01
A new method of domain decomposition has been developed for generating unstructured grids in subdomains either sequentially or using multiple computers in parallel. Domain decomposition is a crucial and challenging step for parallel grid generation. Prior methods are generally based on auxiliary, complex, and computationally intensive operations for defining partition interfaces and usually produce grids of lower quality than those generated in single domains. The new technique, referred to as "Advancing Partition," is based on the Advancing-Front method, which partitions a domain as part of the volume mesh generation in a consistent and "natural" way. The benefits of this approach are: 1) the process of domain decomposition is highly automated, 2) partitioning of the domain does not compromise the quality of the generated grids, and 3) the computational overhead for domain decomposition is minimal. The new method has been implemented in NASA's unstructured grid generation code VGRID.
Developing a semantic web model for medical differential diagnosis recommendation.
Mohammed, Osama; Benlamri, Rachid
2014-10-01
In this paper we describe a novel model for differential diagnosis designed to make recommendations by utilizing semantic web technologies. The model is a response to a number of requirements, ranging from incorporating essential clinical diagnostic semantics to the integration of data mining for the process of identifying candidate diseases that best explain a set of clinical features. We introduce two major components, which we find essential to the construction of an integral differential diagnosis recommendation model: the evidence-based recommender component and the proximity-based recommender component. Both approaches are driven by disease diagnosis ontologies designed specifically to enable the process of generating diagnostic recommendations. These ontologies are the disease symptom ontology and the patient ontology. The evidence-based diagnosis process develops dynamic rules based on standardized clinical pathways. The proximity-based component employs data mining to provide clinicians with diagnosis predictions, as well as generates new diagnosis rules from provided training datasets. This article describes the integration between these two components along with the developed diagnosis ontologies to form a novel medical differential diagnosis recommendation model. This article also provides test cases from the implementation of the overall model, which shows quite promising diagnostic recommendation results.
Physical environment virtualization for human activities recognition
NASA Astrophysics Data System (ADS)
Poshtkar, Azin; Elangovan, Vinayak; Shirkhodaie, Amir; Chan, Alex; Hu, Shuowen
2015-05-01
Human activity recognition research relies heavily on extensive datasets to verify and validate the performance of activity recognition algorithms. However, obtaining real datasets is expensive and highly time-consuming. A physics-based virtual simulation can accelerate the development of context-based human activity recognition algorithms and techniques by generating relevant training and testing videos simulating diverse operational scenarios. In this paper, we discuss in detail the requisite capabilities of a virtual environment to aid as a test bed for evaluating and enhancing activity recognition algorithms. To demonstrate the numerous advantages of virtual environment development, a newly developed virtual environment simulation modeling (VESM) environment is presented here to generate calibrated multisource imagery datasets suitable for the development and testing of recognition algorithms for context-based human activities. The VESM environment serves as a versatile test bed to generate a vast amount of realistic data for the training and testing of sensor processing algorithms. To demonstrate the effectiveness of the VESM environment, we present various simulated scenarios and processed results to infer proper semantic annotations from high-fidelity imagery data for human-vehicle activity recognition under different operational contexts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makarov, Yuri V.; Huang, Zhenyu; Etingov, Pavel V.
2010-09-01
The power system balancing process, which includes the scheduling, real-time dispatch (load following) and regulation processes, is traditionally based on deterministic models. Since conventional generation needs time to be committed and dispatched to a desired megawatt level, the scheduling and load following processes use load and wind power production forecasts to achieve a future balance between conventional generation and energy storage on the one side, and system load, intermittent resources (such as wind and solar generation) and scheduled interchange on the other side. Although in real life the forecasting procedures imply some uncertainty around the load and wind forecasts (caused by forecast errors), only their mean values are actually used in the generation dispatch and commitment procedures. Since the actual load and intermittent generation can deviate from their forecasts, it becomes increasingly unclear (especially with the increasing penetration of renewable resources) whether the system would actually be able to meet the conventional generation requirements within the look-ahead horizon, what additional balancing efforts would be needed as we get closer to real time, and what additional costs would be incurred by those needs. In order to improve the system control performance characteristics, maintain system reliability, and minimize expenses related to the system balancing functions, it becomes necessary to incorporate the predicted uncertainty ranges into the scheduling, load following and, to some extent, the regulation processes. It is also important to address the uncertainty problem comprehensively, by taking all sources of uncertainty (load, intermittent generation, generators' forced outages, etc.) into consideration. All aspects of uncertainty, such as the imbalance size (which is the same as the capacity needed to mitigate the imbalance) and the generation ramping requirement, must be taken into account. These features make this work a significant step toward the objective of incorporating wind, solar, load, and other uncertainties into power system operations. In this report, a new methodology to predict the uncertainty ranges for the required balancing capacity, ramping capability and ramp duration is presented. Uncertainties created by system load forecast errors, wind and solar forecast errors, and generation forced outages are taken into account. The uncertainty ranges are evaluated for different confidence levels of having the actual generation requirements within the corresponding limits. The methodology helps to identify the system balancing reserve requirement based on desired system performance levels, identify system “breaking points”, where the generation system becomes unable to follow the generation requirement curve with the user-specified probability level, and determine the time remaining to these potential events. The approach includes three stages: statistical and actual data acquisition, statistical analysis of retrospective information, and prediction of future grid balancing requirements for specified time horizons and confidence intervals. Assessment of the capacity and ramping requirements is performed using a specially developed probabilistic algorithm based on a histogram analysis incorporating all sources of uncertainty and parameters of a continuous (wind forecast and load forecast errors) and discrete (forced generator outages and failures to start up) nature.
Preliminary simulations using California Independent System Operator (California ISO) real-life data have shown the effectiveness of the proposed approach. A tool developed based on the new methodology described in this report will be integrated with the California ISO systems. Contractual work is currently in place to integrate the tool with the AREVA EMS system.
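The percentile logic behind such predicted uncertainty ranges can be illustrated numerically: sample the independent uncertainty sources, add them, and read off ranges at the desired confidence levels. The distributions and magnitudes below are illustrative assumptions, not values from the report.

```python
# Monte Carlo illustration of the histogram/percentile idea: combine
# independent uncertainty sources into a distribution of the balancing
# capacity requirement and extract confidence-level ranges.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
load_err = rng.normal(0, 150, n)            # MW, load forecast error
wind_err = rng.normal(0, 200, n)            # MW, wind forecast error
outage = rng.binomial(1, 0.02, n) * 400     # MW, one 400 MW unit forced out

imbalance = load_err + wind_err + outage    # MW to be covered by reserves
for p in (90, 95, 99):
    lo, hi = np.percentile(imbalance, [(100 - p) / 2, 100 - (100 - p) / 2])
    print(f"{p}% confidence balancing range: {lo:+.0f} .. {hi:+.0f} MW")
```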
González-Ferrer, Arturo; ten Teije, Annette; Fdez-Olivares, Juan; Milian, Krystyna
2013-02-01
This paper describes a methodology which enables computer-aided support for the planning, visualization and execution of personalized patient treatments in a specific healthcare process, taking into account complex temporal constraints and the allocation of institutional resources. To this end, a translation from a time-annotated computer-interpretable guideline (CIG) model of a clinical protocol into a temporal hierarchical task network (HTN) planning domain is presented. The proposed method uses a knowledge-driven reasoning process to translate knowledge previously described in a CIG into a corresponding HTN Planning and Scheduling domain, taking advantage of HTNs known ability to (i) dynamically cope with temporal and resource constraints, and (ii) automatically generate customized plans. The proposed method, focusing on the representation of temporal knowledge and based on the identification of workflow and temporal patterns in a CIG, makes it possible to automatically generate time-annotated and resource-based care pathways tailored to the needs of any possible patient profile. The proposed translation is illustrated through a case study based on a 70 pages long clinical protocol to manage Hodgkin's disease, developed by the Spanish Society of Pediatric Oncology. We show that an HTN planning domain can be generated from the corresponding specification of the protocol in the Asbru language, providing a running example of this translation. Furthermore, the correctness of the translation is checked and also the management of ten different types of temporal patterns represented in the protocol. By interpreting the automatically generated domain with a state-of-art HTN planner, a time-annotated care pathway is automatically obtained, customized for the patient's and institutional needs. The generated care pathway can then be used by clinicians to plan and manage the patients long-term care. The described methodology makes it possible to automatically generate patient-tailored care pathways, leveraging an incremental knowledge-driven engineering process that starts from the expert knowledge of medical professionals. The presented approach makes the most of the strengths inherent in both CIG languages and HTN planning and scheduling techniques: for the former, knowledge acquisition and representation of the original clinical protocol, and for the latter, knowledge reasoning capabilities and an ability to deal with complex temporal and resource constraints. Moreover, the proposed approach provides immediate access to technologies such as business process management (BPM) tools, which are increasingly being used to support healthcare processes. Copyright © 2012 Elsevier B.V. All rights reserved.
ULSGEN (Uplink Summary Generator)
NASA Technical Reports Server (NTRS)
Wang, Y.-F.; Schrock, M.; Reeve, T.; Nguyen, K.; Smith, B.
2014-01-01
Uplink is an important part of spacecraft operations. Ensuring the accuracy of uplink content is essential to mission success. Before commands are radiated to the spacecraft, the command and sequence must be reviewed and verified by various teams. In most cases, this process requires collecting the command data, reviewing the data during a command conference meeting, and providing physical signatures by designated members of various teams to signify approval of the data. If commands or sequences are disapproved for some reason, the whole process must be restarted. Recording data and decision history is important for traceability reasons. Given that many steps and people are involved in this process, an easily accessible software tool for managing the process is vital to reducing human error which could result in uplinking incorrect data to the spacecraft. An uplink summary generator called ULSGEN was developed to assist this uplink content approval process. ULSGEN generates a web-based summary of uplink file content and provides an online review process. Spacecraft operations personnel view this summary as a final check before actual radiation of the uplink data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polese, Luigi Gentile; Brackney, Larry
An image-based occupancy sensor includes a motion detection module that receives and processes an image signal to generate a motion detection signal, a people detection module that processes the image signal to generate a people detection signal, and a face detection module that processes the image signal to generate a face detection signal. A sensor integration module receives the motion detection, people detection, and face detection signals and generates an occupancy signal from them, with the occupancy signal indicating vacancy or occupancy, and with an occupancy indication specifying that one or more people are detected within the monitored volume.
Experiments with Test Case Generation and Runtime Analysis
NASA Technical Reports Server (NTRS)
Artho, Cyrille; Drusinsky, Doron; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Rosu, Grigore; Visser, Willem; Koga, Dennis (Technical Monitor)
2003-01-01
Software testing is typically an ad hoc process where human testers manually write many test inputs and expected test results, perhaps automating their execution in a regression suite. This process is cumbersome and costly. This paper reports preliminary results on an approach to further automate this process. The approach consists of combining automated test case generation, based on systematically exploring the program's input domain, with runtime analysis, where execution traces are monitored and verified against temporal logic specifications or analyzed using advanced algorithms for detecting concurrency errors such as data races and deadlocks. The approach suggests generating specifications dynamically per input instance rather than statically once and for all. The paper describes experiments with variants of this approach in the context of two examples: a planetary rover controller and a spacecraft fault protection system.
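For the runtime-analysis half of the approach, a monitor checks execution traces against temporal properties. The sketch below handles one such property, "every request is eventually acknowledged", over a finite trace; the event vocabulary is hypothetical and far simpler than the temporal-logic machinery the paper uses.

```python
# Tiny runtime monitor: track outstanding requests across a finite trace.
def monitor(trace):
    pending = set()
    for event, tid in trace:
        if event == "request":
            pending.add(tid)
        elif event == "ack":
            pending.discard(tid)
    return pending  # any ids left unacknowledged violate the property

trace = [("request", 1), ("ack", 1), ("request", 2)]
violations = monitor(trace)
print("violations:", violations or "none")   # -> violations: {2}
```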
The importance of source and cue type in time-based everyday prospective memory.
Oates, Joyce M; Peynircioğlu, Zehra F
2014-01-01
We examined the effects of the source of a prospective memory task (provided or generated) and the type of cue (specific or general) triggering that task in everyday settings. Participants were asked to complete both generated and experimenter-provided tasks and to send a text message when each task was completed. The cue/context for the to-be-completed tasks was either a specific time or a general deadline (time-based cue), and the cue/context for the texting task was the completion of the task itself (activity-based cue). Although generated tasks were completed more often, generated cues/contexts were no more effective than provided ones in triggering the intention. Furthermore, generated tasks were completed more often when the cue/context comprised a specific time, whereas provided tasks were completed more often when the cue/context comprised a general deadline. However, texting was unaffected by the source of the cue/context. Finally, emotion modulated the effects. Results are discussed within a process-driven framework.
Processing of complex N-glycans in IgG Fc-region is affected by core fucosylation
Castilho, Alexandra; Gruber, Clemens; Thader, Andreas; Oostenbrink, Chris; Pechlaner, Maria; Steinkellner, Herta; Altmann, Friedrich
2015-01-01
We investigated N-glycan processing of immunoglobulin G1 using the monoclonal antibody cetuximab (CxMab), which has a glycosite in the Fab domain in addition to the conserved Fc glycosylation, as a reporter. Three GlcNAc (Gn) terminating bi-antennary glycoforms of CxMab differing in core fucosylation (α1,3- and α1,6-linkage) were generated in a plant-based expression platform. These GnGn, GnGnF3, and GnGnF6 CxMab variants were subjected in vivo to further processing toward sialylation and GlcNAc diversification (bisected and branching structures). Mass spectrometry-based glycan analyses revealed efficient processing of Fab glycans toward the envisaged structures. By contrast, Fc glycan processing largely depended on the presence of core fucose. Particularly strong support of glycan processing was observed in the presence of plant-specific core α1,3-fucose. Consistently, molecular modeling suggests changes in the interactions of the Fc carbohydrate chain depending on the presence of core fucose, possibly changing its accessibility. Here, we provide data that reveal molecular mechanisms of glycan processing of IgG antibodies, which may have implications for the generation of glycan-engineered therapeutic antibodies with improved efficacies. PMID:26067753
USDA-ARS?s Scientific Manuscript database
Corn zein was melt-processed with methylenediphenyl 4,4'-diisocyanate (MDI) using triethylamine (TEA) as catalyst. The objective is to construct a melt-processed, compatible blend of zein with MDI that can be used as a building block for generating bio-based thermoplastics. The impact of cross-linki...
Law, B.E.; Spencer, C.W.; Bostick, N.H.
1980-01-01
The onset of overpressuring occurs at c. 3,500 m, near the base of the Upper Cretaceous Lance Formation. The development of overpressuring may involve several processes; however, interpretation of the available information indicates that active generation of large amounts of wet gas is one of the more important processes. The present minimum temperature at the top of overpressuring is at least 88 °C. The preservation of abnormally high pressures is due to presently active generation of gas in a thick interval of discontinuous, very low-permeability shales, siltstones, and sandstones. - from Authors
DEVELOPMENT OF SULFATE RADICAL-BASED CHEMICAL OXIDATION PROCESSES FOR TREATMENT OF PCBS
This study investigates transition metal based activation of peroxymonosulfate for generation of highly reactive sulfate radicals to degrade Polychlorinated Biphenyls (PCBs) in contaminated aqueous and sediment systems. Environmental friendly transition metal iron (Fe (II), Fe (I...
Accelerated rescaling of single Monte Carlo simulation runs with the Graphics Processing Unit (GPU).
Yang, Owen; Choi, Bernard
2013-01-01
To interpret fiber-based and camera-based measurements of remitted light from biological tissues, researchers typically use analytical models, such as the diffusion approximation to light transport theory, or stochastic models, such as Monte Carlo modeling. To achieve rapid (ideally real-time) measurement of tissue optical properties, especially in clinical situations, there is a critical need to accelerate Monte Carlo simulation runs. In this manuscript, we report on our approach using the Graphics Processing Unit (GPU) to accelerate rescaling of single Monte Carlo runs to rapidly calculate diffuse reflectance values for different sets of tissue optical properties. We selected MATLAB to enable non-specialists in C and CUDA-based programming to use the generated open-source code. We developed a software package with four abstraction layers. To calculate a set of diffuse reflectance values from a simulated tissue with homogeneous optical properties, our rescaling GPU-based approach achieves a reduction in computation time of several orders of magnitude as compared to other GPU-based approaches. Specifically, our GPU-based approach generated a diffuse reflectance value in 0.08 ms. The transfer time from CPU to GPU memory currently is a limiting factor with GPU-based calculations. However, for calculation of multiple diffuse reflectance values, our GPU-based approach can still lead to processing that is ~3400 times faster than other GPU-based approaches.
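The essence of one variant of the rescaling idea can be shown in a few lines: run a single baseline simulation, store each detected photon's total path length, and reweight with Beer-Lambert attenuation to obtain diffuse reflectance for any new absorption coefficient. The NumPy sketch below uses synthetic path lengths as a stand-in for stored simulation output; a GPU version would perform the same elementwise arithmetic.

```python
# Single-run Monte Carlo rescaling sketch (absorption reweighting).
import numpy as np

rng = np.random.default_rng(1)
n_photons = 1_000_000
# Stand-in for stored output of one baseline (absorption-free) simulation:
path_lengths_cm = rng.gamma(shape=3.0, scale=0.5, size=n_photons)

def diffuse_reflectance(mu_a_per_cm: float) -> float:
    """Rescale the stored run to a new absorption coefficient."""
    return float(np.mean(np.exp(-mu_a_per_cm * path_lengths_cm)))

for mu_a in (0.1, 0.5, 1.0):
    print(f"mu_a={mu_a:4.1f} /cm -> Rd ~ {diffuse_reflectance(mu_a):.4f}")
```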
Research on Finite Element Model Generating Method of General Gear Based on Parametric Modelling
NASA Astrophysics Data System (ADS)
Lei, Yulong; Yan, Bo; Fu, Yao; Chen, Wei; Hou, Liguo
2017-06-01
Gear meshing in current mainstream finite element software suffers from low efficiency and poor mesh quality. By establishing a universal three-dimensional gear model and exploring the rules of element and node arrangement, this paper proposes a parametric finite element model generation method for general gears. A Visual Basic program is used to perform the finite element meshing, assign material properties, and set boundary/load conditions and other pre-processing work. Dynamic meshing analysis of the gears is carried out with the proposed method and compared with calculated values to verify the correctness of the method. The method greatly reduces the workload of gear finite element pre-processing, improves the quality of the gear mesh, and provides a new approach to FEM pre-processing.
[Development and clinical evaluation of an anesthesia information management system].
Feng, Jing-yi; Chen, Hua; Zhu, Sheng-mei
2010-09-21
To study the design, implementation and clinical evaluation of an anesthesia information management system. To record, process and store peri-operative patient data automatically, bedside monitoring equipment is connected into the system using information-integration technology. After statistical analysis of the patient data by data mining technology, patient status can be evaluated automatically against risk-prediction standards and a decision support system, allowing the anesthetist to perform reasonable and safe clinical processes. With clinical processes recorded electronically, standard record tables can be generated and the clinical workflow optimized. With the system, patient data can be collected, stored, analyzed and archived, anesthesia documents can be generated, and patient status can be evaluated to support clinical decisions. The anesthesia information management system is useful for improving anesthesia quality, decreasing risk for patients and clinicians, and helping to provide clinical evidence.
Generating Test Templates via Automated Theorem Proving
NASA Technical Reports Server (NTRS)
Kancherla, Mani Prasad
1997-01-01
Testing can be used during the software development process to maintain fidelity between evolving specifications, program designs, and code implementations. We use a form of specification-based testing that employs an automated theorem prover to generate test templates. A similar approach was developed using a model checker on state-intensive systems. This method applies to systems with functional rather than state-based behaviors. This approach allows for the use of incomplete specifications to aid in the generation of tests for potential failure cases. We illustrate the technique on the canonical triangle testing problem and discuss its use in the analysis of a spacecraft scheduling system.
The Design and Evaluation of "CAPTools"--A Computer Aided Parallelization Toolkit
NASA Technical Reports Server (NTRS)
Yan, Jerry; Frumkin, Michael; Hribar, Michelle; Jin, Haoqiang; Waheed, Abdul; Johnson, Steve; Cross, Jark; Evans, Emyr; Ierotheou, Constantinos; Leggett, Pete;
1998-01-01
Writing applications for high performance computers is a challenging task. Although writing code by hand still offers the best performance, it is extremely costly and often not very portable. The Computer Aided Parallelization Tools (CAPTools) are a toolkit designed to help automate the mapping of sequential FORTRAN scientific applications onto multiprocessors. CAPTools consists of the following major components: an inter-procedural dependence analysis module that incorporates user knowledge; a 'self-propagating' data partitioning module driven via user guidance; an execution control mask generation and optimization module for the user to fine tune parallel processing of individual partitions; a program transformation/restructuring facility for source code clean up and optimization; a set of browsers through which the user interacts with CAPTools at each stage of the parallelization process; and a code generator supporting multiple programming paradigms on various multiprocessors. Besides describing the rationale behind the architecture of CAPTools, we illustrate the parallelization process via case studies involving structured and unstructured meshes. The programming process and the performance of the generated parallel programs are compared against other programming alternatives based on the NAS Parallel Benchmarks, ARC3D and other scientific applications. Based on these results, a discussion of the feasibility of constructing architecture-independent parallel applications is presented.
Christe, Blaise; Burkhard, Pierre R; Pegna, Alan J; Mayer, Eugene; Hauert, Claude-Alain
2007-01-01
In this study, we developed a digitizing tablet-based instrument for the clinical assessment of human voluntary movements targeting motor processes of planning, programming and execution. The tool was used to investigate an adaptation of Fitts' reciprocal tapping task [10], comprising four conditions, each of them modulated by three indices of difficulty related to the amplitude of movement required. Temporal, spatial and sequential constraints underlying the various conditions allowed the intricate motor processes to be dissociated. Data obtained from a group of elderly healthy subjects (N=50) were in agreement with the literature on motor control, in the temporal and spatial domains. Speed constraints generated gains in the temporal domain and costs in the spatial one, while spatial constraints generated gain in the spatial domain and costs in the temporal one; finally, sequential constraints revealed the integrative nature of the cognitive operations involved in motor production. This versatile instrument proved capable of providing quantitative, accurate and sensitive measures of the various processes sustaining voluntary movement in healthy subjects. Altogether, analyses performed in this study generated a theoretical framework and reference data which could be used in the future for the clinical assessment of patients with various movement disorders, in particular Parkinson's disease.
Photogrammetric Processing of Planetary Linear Pushbroom Images Based on Approximate Orthophotos
NASA Astrophysics Data System (ADS)
Geng, X.; Xu, Q.; Xing, S.; Hou, Y. F.; Lan, C. Z.; Zhang, J. J.
2018-04-01
Efficiently producing planetary mapping products from orbital remote sensing images remains a challenging task. Photogrammetric processing of planetary stereo images suffers from several disadvantages, such as the lack of ground control information and of informative features; among these, image matching is the most difficult job in planetary photogrammetry. This paper designs a photogrammetric processing framework for planetary remote sensing images based on approximate orthophotos. Both tie point extraction for bundle adjustment and dense image matching for generating a digital terrain model (DTM) are performed on approximate orthophotos. Since most planetary remote sensing images are acquired by linear scanner cameras, we mainly deal with linear pushbroom images. In order to improve the computational efficiency of orthophoto generation and coordinate transformation, a fast back-projection algorithm for linear pushbroom images is introduced. Moreover, an iteratively refined DTM-and-orthophoto scheme was adopted in the DTM generation process, which helps reduce the search space of image matching and improve the matching accuracy of conjugate points. With the advantages of approximate orthophotos, the matching results for planetary remote sensing images can be greatly improved. We tested the proposed approach with Mars Express (MEX) High Resolution Stereo Camera (HRSC) and Lunar Reconnaissance Orbiter (LRO) Narrow Angle Camera (NAC) images. The preliminary experimental results demonstrate the feasibility of the proposed approach.
Identification of uncommon objects in containers
Bremer, Peer-Timo; Kim, Hyojin; Thiagarajan, Jayaraman J.
2017-09-12
A system for identifying in an image an object that is commonly found in a collection of images and for identifying a portion of an image that represents an object based on a consensus analysis of segmentations of the image. The system collects images of containers that contain objects for generating a collection of common objects within the containers. To process the images, the system generates a segmentation of each image. The image analysis system may also generate multiple segmentations for each image by introducing variations in the selection of voxels to be merged into a segment. The system then generates clusters of the segments based on similarity among the segments. Each cluster represents a common object found in the containers. Once the clustering is complete, the system may be used to identify common objects in images of new containers based on similarity between segments of images and the clusters.
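The consensus/clustering step can be caricatured as follows: describe each segment from the varied segmentations with a feature vector and cluster the segments, so that each cluster stands for a commonly occurring object. The features, linkage method and threshold below are illustrative stand-ins, not the patent's specifics.

```python
# Cluster segment descriptors so each cluster represents a common object.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Each row: features of one segment (e.g., volume, mean intensity, shape).
segments = np.array([
    [1.0, 0.8, 0.2],
    [1.1, 0.8, 0.2],   # near-duplicate segment from a varied segmentation
    [5.0, 0.1, 0.9],
    [5.2, 0.1, 0.8],
])

labels = fcluster(linkage(segments, method="average"), t=0.5,
                  criterion="distance")
print(labels)  # segments sharing a label represent one common object
```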
Generating structure from experience: A retrieval-based model of language processing.
Johns, Brendan T; Jones, Michael N
2015-09-01
Standard theories of language generally assume that some abstraction of linguistic input is necessary to create higher level representations of linguistic structures (e.g., a grammar). However, the importance of individual experiences with language has recently been emphasized by both usage-based theories (Tomasello, 2003) and grounded and situated theories (e.g., Zwaan & Madden, 2005). Following the usage-based approach, we present a formal exemplar model that stores instances of sentences across a natural language corpus, applying recent advances from models of semantic memory. In this model, an exemplar memory is used to generate expectations about the future structure of sentences, using a mechanism for prediction in language processing (Altmann & Mirković, 2009). The model successfully captures a broad range of behavioral effects, including reduced relative clause processing (Reali & Christiansen, 2007), the role of contextual constraint (Rayner & Well, 1996), and event knowledge activation (Ferretti, Kutas, & McRae, 2007), among others. We further demonstrate how perceptual knowledge could be integrated into this exemplar-based framework, with the goal of grounding language processing in perception. Finally, we illustrate how an exemplar memory system could have been used in the cultural evolution of language. The model provides evidence that an impressive amount of language processing may be bottom-up in nature, built on the storage and retrieval of individual linguistic experiences. (c) 2015 APA, all rights reserved.
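The retrieval-based prediction mechanism can be caricatured in a few lines: store sentences verbatim and let the stored exemplars vote, weighted by overlap, on the word most likely to follow a new fragment. The corpus and similarity measure are toy assumptions, and the paper's semantic-memory vector machinery is omitted.

```python
# Exemplar-retrieval sketch: stored sentences vote on the next word.
from collections import Counter

corpus = [
    "the reporter that attacked the senator admitted the error",
    "the reporter admitted the error",
    "the senator attacked the reporter",
]
exemplars = [s.split() for s in corpus]

def predict_next(fragment: str) -> str:
    words = fragment.split()
    votes = Counter()
    for ex in exemplars:
        for i in range(len(ex) - len(words)):
            overlap = sum(a == b for a, b in zip(ex[i:i + len(words)], words))
            if overlap:  # similarity-weighted vote for the following word
                votes[ex[i + len(words)]] += overlap
    return votes.most_common(1)[0][0] if votes else "<unk>"

print(predict_next("the reporter admitted"))  # -> "the"
```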
Technical Note: Approximate Bayesian parameterization of a complex tropical forest model
NASA Astrophysics Data System (ADS)
Hartig, F.; Dislich, C.; Wiegand, T.; Huth, A.
2013-08-01
Inverse parameter estimation of process-based models is a long-standing problem in ecology and evolution. A key problem of inverse parameter estimation is to define a metric that quantifies how well model predictions fit to the data. Such a metric can be expressed by general cost or objective functions, but statistical inversion approaches are based on a particular metric, the probability of observing the data given the model, known as the likelihood. Deriving likelihoods for dynamic models requires making assumptions about the probability for observations to deviate from mean model predictions. For technical reasons, these assumptions are usually derived without explicit consideration of the processes in the simulation. Only in recent years have new methods become available that allow generating likelihoods directly from stochastic simulations. Previous applications of these approximate Bayesian methods have concentrated on relatively simple models. Here, we report on the application of a simulation-based likelihood approximation for FORMIND, a parameter-rich individual-based model of tropical forest dynamics. We show that approximate Bayesian inference, based on a parametric likelihood approximation placed in a conventional MCMC, performs well in retrieving known parameter values from virtual field data generated by the forest model. We analyze the results of the parameter estimation, examine the sensitivity towards the choice and aggregation of model outputs and observed data (summary statistics), and show results from using this method to fit the FORMIND model to field data from an Ecuadorian tropical forest. Finally, we discuss differences between this approach and Approximate Bayesian Computation (ABC), another commonly used method to generate simulation-based likelihood approximations. Our results demonstrate that simulation-based inference, which offers considerable conceptual advantages over more traditional methods for inverse parameter estimation, can successfully be applied to process-based models of high complexity. The methodology is particularly suited to heterogeneous and complex data structures and can easily be adjusted to other model types, including most stochastic population and individual-based models. Our study therefore provides a blueprint for a fairly general approach to parameter estimation of stochastic process-based models in ecology and evolution.
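A compact sketch of the core idea, a parametric likelihood approximation evaluated inside a conventional Metropolis MCMC, is given below. The "simulator" is a toy stochastic model with a normally distributed summary statistic and the prior is implicitly flat; FORMIND itself is far richer.

```python
# Synthetic-likelihood Metropolis sampler against a toy stochastic simulator.
import numpy as np

rng = np.random.default_rng(0)
observed_summary = 3.0

def simulate_summary(theta, n_rep=20):
    # toy stochastic process-based model: summary ~ N(theta, 1)
    return rng.normal(theta, 1.0, n_rep)

def log_synthetic_likelihood(theta):
    sims = simulate_summary(theta)
    mu, sd = sims.mean(), sims.std(ddof=1)
    return -0.5 * ((observed_summary - mu) / sd) ** 2 - np.log(sd)

theta, logl = 0.0, log_synthetic_likelihood(0.0)
chain = []
for _ in range(5000):
    prop = theta + rng.normal(0, 0.5)
    logl_prop = log_synthetic_likelihood(prop)
    if np.log(rng.uniform()) < logl_prop - logl:   # flat prior assumed
        theta, logl = prop, logl_prop
    chain.append(theta)
print("posterior mean ~", np.mean(chain[1000:]))   # close to 3.0
```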
Recovery Processes of Organic Acids from Fermentation Broths in the Biomass-Based Industry.
Li, Qian-Zhu; Jiang, Xing-Lin; Feng, Xin-Jun; Wang, Ji-Ming; Sun, Chao; Zhang, Hai-Bo; Xian, Mo; Liu, Hui-Zhou
2016-01-01
The new movement towards green chemistry and renewable feedstocks makes microbial production of chemicals more competitive. Among the numerous chemicals, organic acids are more attractive targets for process development efforts in the renewable-based biorefinery industry. However, most of the production costs in microbial processes are higher than that in chemical processes, among which over 60% are generated by separation processes. Therefore, the research of separation and purification processes is important for a promising biorefinery industry. This review highlights the progress of recovery processes in the separation and purification of organic acids, including their advantages and disadvantages, current situation, and future prospects in terms of recovery yields and industrial application.
Scalable approximate policies for Markov decision process models of hospital elective admissions.
Zhu, George; Lizotte, Dan; Hoey, Jesse
2014-05-01
To demonstrate the feasibility of using stochastic simulation methods for the solution of a large-scale Markov decision process model of on-line patient admissions scheduling. The problem of admissions scheduling is modeled as a Markov decision process in which the states represent numbers of patients using each of a number of resources. We investigate current state-of-the-art real-time planning methods to compute solutions to this Markov decision process. Due to the complexity of the model, traditional model-based planners are limited in scalability since they require an explicit enumeration of the model dynamics. To overcome this challenge, we apply sample-based planners along with efficient simulation techniques that, given an initial start state, generate an action on demand while avoiding portions of the model that are irrelevant to the start state. We also propose a novel variant of a popular sample-based planner that is particularly well suited to the elective admissions problem. Results show that the stochastic simulation methods allow the problem size to be scaled by a factor of almost 10 in the action space, and exponentially in the state space. We have demonstrated our approach on a problem with 81 actions, four specialities and four treatment patterns, and shown that we can generate solutions that are near-optimal in about 100 s. Sample-based planners are a viable alternative to state-based planners for large Markov decision process models of elective admissions scheduling. Copyright © 2014 Elsevier B.V. All rights reserved.
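The sample-based planning idea, generating an action on demand from a generative model without enumerating the state space, can be sketched as a simple rollout planner. The bed/admission dynamics, rewards and horizon below are invented placeholders, much simpler than the paper's model.

```python
# Rollout planner: estimate each action's value by simulated trajectories.
import random

ACTIONS = ["admit_now", "defer"]

def step(beds_free: int, action: str):
    """Toy generative model: returns (next_beds_free, reward)."""
    discharges = random.randint(0, 1)
    if action == "admit_now" and beds_free > 0:
        return min(beds_free - 1 + discharges, 10), 1.0   # patient served
    return min(beds_free + discharges, 10), -0.2          # waiting cost

def rollout_value(state, action, depth=10, n=200, gamma=0.95):
    total = 0.0
    for _ in range(n):
        s, val, disc, a = state, 0.0, 1.0, action
        for _ in range(depth):
            s, r = step(s, a)
            val += disc * r
            disc *= gamma
            a = random.choice(ACTIONS)       # random rollout policy
        total += val
    return total / n

state = 5  # beds currently free
best = max(ACTIONS, key=lambda a: rollout_value(state, a))
print("chosen action:", best)
```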
Development of a Neural Network-Based Renewable Energy Forecasting Framework for Process Industries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Soobin; Ryu, Jun-Hyung; Hodge, Bri-Mathias
2016-06-25
This paper presents a neural network-based forecasting framework for photovoltaic power (PV) generation as a decision-supporting tool to employ renewable energies in the process industry. The applicability of the proposed framework is illustrated by comparing its performance against other methodologies such as linear and nonlinear time series modelling approaches. A case study of an actual PV power plant in South Korea is presented.
15 maps merged in one data structure - GIS-based template for Dawn at Ceres
NASA Astrophysics Data System (ADS)
Naß, A.; Dawn Mapping Team
2017-09-01
Deriving regionally and globally valid statements from map quadrangles is already a very time-intensive task. A further challenge is how individual mappers can generate one homogeneous GIS-based project (w.r.t. geometric and visual character) that represents one geologically consistent final map. This contribution presents a template generated for the interpretative mapping project of Ceres to meet the requirement of unifying and merging the individual quadrangles.
Contrast Enhancement of the LOASIS CPA Laser and Effects on Electron Beam Performance of LWFA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toth, Csaba; Gonsalves, Anthony J.; Panasenko, Dmitriy
2009-01-22
A nonlinear optical pulse cleaning technique based on cross-polarized wave (XPW) generation filtering [1] has been implemented to improve laser pulse contrast, and consequently to control pre-ionization in laser-plasma accelerator experiments. Three orders of magnitude improvement in pre-pulse contrast has been achieved, resulting in 4-fold increase in electron charge and improved stability of both the electron beam energy and THz radiation generated as a secondary process in the gas-jet-based LWFA experiments.
NASA Astrophysics Data System (ADS)
Zhang, Miao; Tong, Xiaojun
2017-07-01
This paper proposes a joint image encryption and compression scheme based on a new hyperchaotic system and the curvelet transform. A new five-dimensional hyperchaotic system based on the Rabinovich system is presented. By means of the proposed hyperchaotic system, a new pseudorandom key stream generator is constructed. The algorithm adopts a diffusion and confusion structure to perform encryption, based on the key stream generator and the proposed hyperchaotic system. The key sequence used for image encryption is related to the plaintext. By means of the second-generation curvelet transform, run-length coding, and Huffman coding, the image data are compressed. Compression and encryption are performed jointly in a single process. The security test results indicate that the proposed methods have high security and good compression performance.
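Structurally, the cipher is a chaotic keystream generator feeding a confusion (permutation) stage and a diffusion (XOR) stage. The sketch below substitutes a logistic map for the paper's five-dimensional hyperchaotic Rabinovich system purely to show the data flow; it is not cryptographically secure and none of the constants come from the source.

```python
# Structural sketch of chaotic keystream -> confusion -> diffusion.
import numpy as np

def keystream(seed: float, n: int) -> np.ndarray:
    x, out = seed, np.empty(n)
    for i in range(n):
        x = 3.99 * x * (1 - x)          # stand-in chaotic map, NOT secure
        out[i] = x
    return (out * 256).astype(np.uint8)

def encrypt(img: np.ndarray, seed: float) -> np.ndarray:
    flat = img.flatten()
    ks = keystream(seed, flat.size)
    perm = np.argsort(keystream(seed / 2, flat.size))   # confusion: permute
    diffused = np.bitwise_xor(flat[perm], ks)           # diffusion: XOR
    return diffused.reshape(img.shape)

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
print(encrypt(img, seed=0.3141))
```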
NASA Astrophysics Data System (ADS)
Wang, Zhi-peng; Zhang, Shuai; Liu, Hong-zhao; Qin, Yi
2014-12-01
Based on a phase retrieval algorithm and the QR code, a new optical encryption technology that only needs to record one intensity distribution is proposed. In this encryption process, the QR code is first generated from the information to be encrypted; the generated QR code is then placed in the input plane of a 4-f system and subjected to double random phase encryption. Since only one intensity distribution in the output plane is recorded as the ciphertext, the encryption process is greatly simplified. In the decryption process, the corresponding QR code is retrieved using a phase retrieval algorithm. A priori information about the QR code is used as a support constraint in the input plane, which helps solve the stagnation problem. The original information can be recovered without distortion by scanning the QR code. The encryption process can be implemented either optically or digitally, and the decryption process uses a digital method. In addition, the security of the proposed optical encryption technology is analyzed. Theoretical analysis and computer simulations show that this optical encryption system is invulnerable to various attacks and suitable for harsh transmission conditions.
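The double random phase encryption at the heart of the scheme is easy to state with FFTs. The NumPy sketch below encrypts a stand-in binary array in a simulated 4-f system and verifies that, with both masks known, the complex field decrypts exactly; the paper's actual contribution, recording only the output intensity and recovering the input by phase retrieval with the QR code as support, is not reproduced here.

```python
# Double random phase encoding (DRPE) in a simulated 4-f system.
import numpy as np

rng = np.random.default_rng(7)
qr = rng.integers(0, 2, (64, 64)).astype(float)       # stand-in "QR code"

phase1 = np.exp(2j * np.pi * rng.random(qr.shape))    # input-plane mask
phase2 = np.exp(2j * np.pi * rng.random(qr.shape))    # Fourier-plane mask

cipher_field = np.fft.ifft2(np.fft.fft2(qr * phase1) * phase2)
intensity = np.abs(cipher_field) ** 2                 # what gets recorded

# With both masks known, the complex field decrypts exactly:
recovered = np.fft.ifft2(np.fft.fft2(cipher_field) / phase2) / phase1
print(np.allclose(np.abs(recovered), qr))             # True
```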
Pen-based computers: Computers without keys
NASA Technical Reports Server (NTRS)
Conklin, Cheryl L.
1994-01-01
The National Space Transportation System (NSTS) is comprised of many diverse and highly complex systems incorporating the latest technologies. Data collection associated with ground processing of the various Space Shuttle system elements is extremely challenging due to the many separate processing locations where data is generated. This presents a significant problem when the timely collection, transfer, collation, and storage of data is required. This paper describes how new technology, referred to as Pen-Based computers, is being used to transform the data collection process at Kennedy Space Center (KSC). Pen-Based computers have streamlined procedures, increased data accuracy, and now provide more complete information than previous methods. The end result is the elimination of Shuttle processing delays associated with data deficiencies.
Learning Physics-based Models in Hydrology under the Framework of Generative Adversarial Networks
NASA Astrophysics Data System (ADS)
Karpatne, A.; Kumar, V.
2017-12-01
Generative adversarial networks (GANs), which have been highly successful in a number of applications involving large volumes of labeled and unlabeled data such as computer vision, offer huge potential for modeling the dynamics of physical processes that have been traditionally studied using simulations of physics-based models. While conventional physics-based models use labeled samples of input/output variables for model calibration (estimating the right parametric forms of relationships between variables) or data assimilation (identifying the most likely sequence of system states in dynamical systems), there is a greater opportunity to explore the full power of machine learning (ML) methods (e.g., GANs) for studying physical processes currently suffering from large knowledge gaps, e.g., ground-water flow. However, success in this endeavor requires a principled way of combining the strengths of ML methods with physics-based numerical models that are founded on a wealth of scientific knowledge. This is especially important in scientific domains like hydrology, where the number of data samples is small (relative to Internet-scale applications such as image recognition, where machine learning methods have found great success) and the physical relationships are complex (high-dimensional) and non-stationary. We will present a series of methods for guiding the learning of GANs using physics-based models, e.g., by using the outputs of physics-based models as input data to the generator-learner framework, and by using physics-based models as generators trained using validation data in the adversarial learning framework. These methods are being developed under the broad paradigm of theory-guided data science that we are developing to integrate scientific knowledge with data science methods for accelerating scientific discovery.
Concept Learning through Image Processing.
ERIC Educational Resources Information Center
Cifuentes, Lauren; Yi-Chuan, Jane Hsieh
This study explored computer-based image processing as a study strategy for middle school students' science concept learning. Specifically, the research examined the effects of computer graphics generation on science concept learning and the impact of using computer graphics to show interrelationships among concepts during study time. The 87…
Brown, James A L
2016-05-06
A pedagogic intervention, in the form of an inquiry-based peer-assisted learning project (as a practical student-led bioinformatics module), was assessed for its ability to increase students' engagement, practical bioinformatic skills and process-specific knowledge. Elements assessed were process-specific knowledge following module completion, qualitative student-based module evaluation and the novelty, scientific validity and quality of written student reports. Bioinformatics is often the starting point for laboratory-based research projects; therefore, high importance was placed on allowing students to individually develop and apply processes and methods of scientific research. Students led a bioinformatic inquiry-based project (within a framework of inquiry), discovering, justifying and exploring individually discovered research targets. Detailed assessable reports were produced, displaying data generated and the resources used. Mimicking research settings, undergraduates were divided into small collaborative groups, with distinctive central themes. The module was evaluated by assessing the quality and originality of the students' targets through reports, reflecting students' use and understanding of concepts and tools required to generate their data. Furthermore, the bioinformatic module was assessed semi-quantitatively using pre- and post-module quizzes (a non-assessable activity, not contributing to their grade), which incorporated process- and content-specific questions (indicative of their use of the online tools). Qualitative assessment of the teaching intervention was performed using post-module surveys, exploring student satisfaction and other module-specific elements. Overall, a positive experience was found, as was a post-module increase in correct process-specific answers. In conclusion, an inquiry-based peer-assisted learning module increased students' engagement, practical bioinformatic skills and process-specific knowledge. © 2016 The International Union of Biochemistry and Molecular Biology, 44:304-313, 2016.
Scalable UWB photonic generator based on the combination of doublet pulses.
Moreno, Vanessa; Rius, Manuel; Mora, José; Muriel, Miguel A; Capmany, José
2014-06-30
We propose and experimentally demonstrate a scalable and reconfigurable optical scheme to generate high-order UWB pulses. First, various ultra-wideband doublets are created through a process of phase-to-intensity conversion by means of phase modulation and a dispersive medium. In a second stage, the doublets are combined in an optical processing unit that allows the reconfiguration of high-order UWB pulses. Experimental results in both the time and frequency domains are presented, showing good performance in terms of the fractional bandwidth and spectral efficiency parameters.
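The combination stage can be illustrated numerically. The toy below assumes Gaussian doublets (second derivatives of a Gaussian) as stand-ins for the photonically generated ones, with invented widths, delays, and weights; it shows only how summing delayed, weighted doublets raises the pulse order and reshapes the spectrum.

```python
# Toy numpy sketch (not the optical implementation): a higher-order UWB
# pulse built as a weighted, delayed combination of Gaussian doublets.
import numpy as np

t = np.linspace(-1e-9, 1e-9, 2001)           # seconds
tau = 80e-12                                 # doublet width (assumed)

def doublet(t0):
    x = (t - t0) / tau
    return (1 - x**2) * np.exp(-x**2 / 2)    # second derivative of a Gaussian

# Two delayed, sign-inverted doublets combine into a higher-order pulse:
pulse = doublet(-60e-12) - doublet(60e-12)
spectrum = np.abs(np.fft.rfft(pulse))
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
print(freqs[np.argmax(spectrum)])            # spectral peak of the new pulse
```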
Nonstationary Dynamics Data Analysis with Wavelet-SVD Filtering
NASA Technical Reports Server (NTRS)
Brenner, Marty; Groutage, Dale; Bessette, Denis (Technical Monitor)
2001-01-01
Nonstationary time-frequency analysis is used for identification and classification of aeroelastic and aeroservoelastic dynamics. Time-frequency multiscale wavelet processing generates discrete energy density distributions. The distributions are processed using the singular value decomposition (SVD). Discrete density functions derived from the SVD generate moments that detect the principal features in the data. The SVD standard basis vectors are applied and then compared with a transformed-SVD, or TSVD, which reduces the number of features into more compact energy density concentrations. Finally, from the feature extraction, wavelet-based modal parameter estimation is applied.
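A compact numerical sketch of this pipeline, under assumed parameters (a toy chirp signal and a hand-rolled Morlet filter bank), might look as follows; the TSVD comparison step from the paper is omitted.

```python
# Sketch under stated assumptions: wavelet energy distribution -> SVD ->
# moments of the discrete density formed from the singular values.
import numpy as np

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
sig = np.sin(2 * np.pi * (20 + 10 * t) * t)      # toy nonstationary sweep

scales = np.arange(2, 64)
tfd = np.empty((scales.size, t.size))
for i, s in enumerate(scales):                   # manual Morlet CWT
    k = np.arange(-4 * s, 4 * s + 1) / s
    w = np.exp(1j * 5 * k) * np.exp(-k**2 / 2) / np.sqrt(s)
    tfd[i] = np.abs(np.convolve(sig, w, mode="same")) ** 2

U, sv, Vt = np.linalg.svd(tfd, full_matrices=False)
density = sv / sv.sum()                          # discrete density from SVD
moments = [np.sum(density * np.arange(1, sv.size + 1) ** m) for m in (1, 2)]
print(moments)                                   # feature moments of the data
```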
Reconstruction of dynamical systems from resampled point processes produced by neuron models
NASA Astrophysics Data System (ADS)
Pavlova, Olga N.; Pavlov, Alexey N.
2018-04-01
Characterization of dynamical features of chaotic oscillations from point processes is based on embedding theorems for non-uniformly sampled signals such as sequences of interspike intervals (ISIs). This theoretical background confirms the ability to reconstruct attractors from ISIs generated by chaotically driven neuron models. The quality of such reconstruction depends on the available length of the analyzed dataset. We discuss how data resampling improves the reconstruction for short datasets and show that this effect is observed for different types of spike-generation mechanisms.
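A minimal end-to-end illustration, with all parameters invented: ISIs produced by an integrate-and-fire unit driven by a chaotic Rössler signal, followed by a simple delay embedding of the ISI sequence.

```python
# Minimal sketch (assumed parameters): ISIs from an integrate-and-fire
# model driven by a chaotic signal, then delay embedding of the ISI series.
import numpy as np

def rossler(n, dt=0.01, a=0.2, b=0.2, c=5.7):
    x, y, z = 1.0, 1.0, 1.0
    out = np.empty(n)
    for i in range(n):
        x += dt * (-y - z)
        y += dt * (x + a * y)
        z += dt * (b + z * (x - c))
        out[i] = x
    return out

drive = rossler(200_000)
threshold, s, isis, last = 50.0, 0.0, [], 0
for i, u in enumerate(drive + 10.0):     # shift keeps the input positive
    s += u * 0.01
    if s >= threshold:                   # integrate-and-fire spike
        isis.append((i - last) * 0.01)
        s, last = 0.0, i

isi = np.asarray(isis)
m = 3                                    # embedding dimension, unit lag
emb = np.column_stack([isi[i : isi.size - (m - 1) + i] for i in range(m)])
print(emb.shape)                         # reconstructed attractor points
```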
Removal of amino groups from anilines through diazonium salt-based reactions.
He, Linman; Qiu, Guanyinsheng; Gao, Yueqiu; Wu, Jie
2014-09-28
This minireview describes the applications of in situ generated diazonium salts from anilines in organic synthesis. In situ generation of diazonium salts from anilines represents an efficient and practical pathway, leading to a series of useful structures. In these transformations, the amino group of the aniline formally acts as a leaving group. Two distinct kinds of mechanisms, transition metal (especially palladium)-catalyzed oxidative addition-reductive elimination and a radical process, are involved in the removal of amino groups from anilines, and both catalytic processes are described in this minireview.
A School-Based Mental Health Consultation Curriculum.
ERIC Educational Resources Information Center
Sandoval, Jonathan; Davis, John M.
1984-01-01
Presents one position on consultation that integrates a theoretical model, a process model, and a curriculum for training school-based mental health consultants. Elements of the proposed curriculum include: ethics, relationship building, maintaining rapport, defining problems, gathering data, sharing information, generating and supporting…
Network-Oriented Approach to Distributed Generation Planning
NASA Astrophysics Data System (ADS)
Kochukov, O.; Mutule, A.
2017-06-01
The main objective of the paper is to present an innovative, complex approach to distributed generation planning and to show its advantages over existing methods. The approach is most suitable for DNOs and authorities and has specific calculation targets to support the decision-making process. The method can be used for complex distribution networks with different arrangements and legal bases.
ERIC Educational Resources Information Center
Chang, Ching; Chang, Chih-Kai
2014-01-01
The study is based on the use of a flexible learning framework to help students improve information processes underlying strategy instruction in EFL listening. By exploiting the online videotext self-dictation-generation (video-SDG) learning activity implemented on the YouTube caption manager platform, the learning cycle was emphasized to promote…
ERIC Educational Resources Information Center
Freeland, Peter
2013-01-01
Charles Darwin supposed that evolution involved a process of gradual change, generated randomly, with the selection and retention over many generations of survival-promoting features. Some theists have never accepted this idea. "Intelligent design" is a relatively recent theory, supposedly based on scientific evidence, which attempts to…
Conductivity based on selective etch for GaN devices and applications thereof
Zhang, Yu; Sun, Qian; Han, Jung
2015-12-08
This invention relates to methods of generating nanoporous (NP) gallium nitride (GaN) across large areas (>1 cm.sup.2) with controlled pore diameters, pore density, and porosity. Also disclosed are methods of generating novel optoelectronic devices based on porous GaN. Additionally, a layer transfer scheme to separate and create free-standing crystalline GaN thin layers is disclosed that enables a new device manufacturing paradigm involving substrate recycling. Other disclosed embodiments of this invention relate to fabrication of GaN-based nanocrystals and the use of NP GaN electrodes for electrolysis, water splitting, or photosynthetic process applications.
NASA Astrophysics Data System (ADS)
Plaza, Antonio; Plaza, Javier; Paz, Abel
2010-10-01
Latest generation remote sensing instruments (called hyperspectral imagers) are now able to generate hundreds of images, corresponding to different wavelength channels, for the same area on the surface of the Earth. In previous work, we have reported that the scalability of parallel processing algorithms dealing with these high-dimensional data volumes is affected by the amount of data to be exchanged through the communication network of the system. However, large messages are common in hyperspectral imaging applications since processing algorithms are pixel-based, and each pixel vector to be exchanged through the communication network is made up of hundreds of spectral values. Thus, decreasing the amount of data to be exchanged could improve the scalability and parallel performance. In this paper, we propose a new framework based on intelligent utilization of wavelet-based data compression techniques for improving the scalability of a standard hyperspectral image processing chain on heterogeneous networks of workstations. This type of parallel platform is quickly becoming a standard in hyperspectral image processing due to the distributed nature of collected hyperspectral data as well as its flexibility and low cost. Our experimental results indicate that adaptive lossy compression can lead to improvements in the scalability of the hyperspectral processing chain without sacrificing analysis accuracy, even at sub-pixel precision levels.
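A sketch of the underlying idea, not the authors' codec: a single-level Haar transform applied to one pixel's spectral vector, with small detail coefficients zeroed before the vector is exchanged between workstations. The threshold controls the lossy compression versus accuracy trade-off discussed in the abstract.

```python
# Hedged sketch (assumed band count and threshold): lossy Haar compression
# of a hyperspectral pixel vector prior to network exchange.
import numpy as np

rng = np.random.default_rng(0)
pixel = np.cumsum(rng.normal(size=224))          # toy 224-band signature

avg = (pixel[0::2] + pixel[1::2]) / np.sqrt(2)   # Haar approximation
det = (pixel[0::2] - pixel[1::2]) / np.sqrt(2)   # Haar detail

det[np.abs(det) < 0.5] = 0.0                     # drop small details (lossy)
kept = np.count_nonzero(det)                     # fewer values to transmit

# Receiver side: inverse Haar reconstructs an approximate pixel vector.
rec = np.empty_like(pixel)
rec[0::2] = (avg + det) / np.sqrt(2)
rec[1::2] = (avg - det) / np.sqrt(2)
print(kept, np.max(np.abs(rec - pixel)))         # size saved vs. max error
```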
NASA Astrophysics Data System (ADS)
Prada, Jose Fernando
Keeping a contingency reserve in power systems is necessary to preserve the security of real-time operations. This work studies two different approaches to the optimal allocation of energy and reserves in the day-ahead generation scheduling process. Part I presents a stochastic security-constrained unit commitment model to co-optimize energy and the locational reserves required to respond to a set of uncertain generation contingencies, using a novel state-based formulation. The model is applied in an offer-based electricity market to allocate contingency reserves throughout the power grid, in order to comply with the N-1 security criterion under transmission congestion. The objective is to minimize expected dispatch and reserve costs, together with post-contingency corrective redispatch costs, modeling the probability of generation failure and associated post-contingency states. The characteristics of the scheduling problem are exploited to formulate a computationally efficient method, consistent with established operational practices. We simulated the distribution of locational contingency reserves on the IEEE RTS96 system and compared the results with the conventional deterministic method. We found that assigning locational spinning reserves can guarantee an N-1 secure dispatch accounting for transmission congestion at a reasonable extra cost. The simulations also showed little value in allocating downward reserves but sizable operating savings from co-optimizing locational nonspinning reserves. Overall, the results indicate the computational tractability of the proposed method. Part II presents a distributed generation scheduling model to optimally allocate energy and spinning reserves among competing generators in a day-ahead market. The model is based on coordination between individual generators and a market entity. The proposed method uses forecasting, augmented pricing and locational signals to induce efficient commitment of generators based on firm posted prices. It is price-based but does not rely on multiple iterations, minimizes information exchange and simplifies the market-clearing process. Simulations of the distributed method performed on a six-bus test system showed that, using an appropriate set of prices, it is possible to emulate the results of a conventional centralized solution, without the need to provide make-whole payments to generators. Likewise, they showed that the distributed method can accommodate transactions with different products and complex security constraints.
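A toy version of energy/reserve co-optimization can be written as a small linear program. This is a didactic stand-in, far simpler than the stochastic security-constrained unit commitment in the thesis: two generators, one energy balance, one system-wide reserve requirement, and capacity (headroom) limits.

```python
# Didactic two-generator energy + spinning reserve dispatch (scipy linprog).
from scipy.optimize import linprog

demand, reserve_req = 150.0, 20.0          # MW
cost = [20.0, 30.0, 2.0, 3.0]              # $/MWh for p1, p2, r1, r2
cap = [100.0, 100.0]                       # MW capacity per unit

# Decision variables x = [p1, p2, r1, r2]
A_eq = [[1, 1, 0, 0]]                      # p1 + p2 = demand
b_eq = [demand]
A_ub = [[0, 0, -1, -1],                    # r1 + r2 >= reserve requirement
        [1, 0, 1, 0],                      # p1 + r1 <= cap1 (headroom)
        [0, 1, 0, 1]]                      # p2 + r2 <= cap2
b_ub = [-reserve_req, cap[0], cap[1]]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 4)
print(res.x)   # energy goes to the cheap unit; reserve to spare headroom
```

The real model adds contingency states, network constraints, and commitment variables, but the same co-optimization structure is visible even in this toy.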
Generation of low-temperature air plasma for food processing
NASA Astrophysics Data System (ADS)
Stepanova, Olga; Demidova, Maria; Astafiev, Alexander; Pinchuk, Mikhail; Balkir, Pinar; Turantas, Fulya
2015-11-01
The project is aimed at developing a physical and technical foundation for generating plasma with low gas temperature at atmospheric pressure for food industry needs. Plasma is known to have an antimicrobial effect on numerous types of microorganisms, including those that cause food spoilage. In this work an original experimental setup has been developed for the treatment of different foods. It is based on initiating corona or dielectric-barrier discharge in a chamber filled with ambient air in combination with a certain helium admixture. The experimental setup provides various conditions of discharge generation (including discharge gap geometry, supply voltage, velocity of gas flow, content of helium admixture in air and working pressure) and allows for the measurement of the electrical discharge parameters. Some recommendations on choosing optimal conditions of discharge generation for experiments on plasma food processing are developed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perumalla, Kalyan S.; Alam, Maksudul
A novel parallel algorithm is presented for generating random scale-free networks using the preferential-attachment model. The algorithm, named cuPPA, is custom-designed for the single instruction multiple data (SIMD) style of parallel processing supported by modern processors such as graphical processing units (GPUs). To the best of our knowledge, our algorithm is the first to exploit GPUs, and also the fastest implementation available today, to generate scale-free networks using the preferential attachment model. A detailed performance study is presented to understand the scalability and runtime characteristics of the cuPPA algorithm. In one of the best cases, when executed on an NVidia GeForce 1080 GPU, cuPPA generates a scale-free network of a billion edges in less than 2 seconds.
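For reference, the serial algorithm that cuPPA parallelizes is preferential attachment itself, which fits in a few lines (illustrative parameters; the GPU version partitions this sampling across SIMD threads):

```python
# Sequential preferential attachment (the serial baseline cuPPA speeds up).
import random

def preferential_attachment(n, m=2, seed=42):
    random.seed(seed)
    targets = list(range(m))          # small seed graph
    repeated = []                     # node listed once per incident edge
    edges = []
    for v in range(m, n):
        chosen = set()
        while len(chosen) < m:        # sample targets proportional to degree
            pool = repeated if repeated else targets
            chosen.add(random.choice(pool))
        for u in chosen:
            edges.append((v, u))
            repeated += [v, u]
        targets.append(v)
    return edges

print(len(preferential_attachment(10_000)))   # ~ m * (n - m) edges
```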
Petersen, Abdul M; Haigh, Kate; Görgens, Johann F
2014-01-01
Flow sheet options for integrating ethanol production from spent sulfite liquor (SSL) into the acid-based sulfite pulping process at the Sappi Saiccor mill (Umkomaas, South Africa) were investigated, including options for generation of thermal and electrical energy from onsite bio-wastes, such as bark. Processes were simulated with Aspen Plus® for mass- and energy-balances, followed by an estimation of the economic viability and environmental impacts. Various concentration levels of the total dissolved solids in magnesium oxide-based SSL, which currently fuels a recovery boiler, prior to fermentation was considered, together with return of the fermentation residues (distillation bottoms) to the recovery boiler after ethanol separation. The generation of renewable thermal and electrical energy from onsite bio-wastes were also included in the energy balance of the combined pulping-ethanol process, in order to partially replace coal consumption. The bio-energy supplementations included the combustion of bark for heat and electricity generation and the bio-digestion of the calcium oxide SSL to produce methane as additional energy source. Ethanol production from SSL at the highest substrate concentration was the most economically feasible when coal was used for process energy. However this solution did not provide any savings in greenhouse gas (GHG) emissions for the concentration-fermentation-distillation process. Maximizing the use of renewable energy sources to partially replace coal consumption yielded a satisfactory economic performance, with a minimum ethanol selling price of 0.83 US$/l , and a drastic reduction in the overall greenhouse gas emissions for the entire facility. High substrate concentrations and conventional distillation should be used when considering integrating ethanol production at sulfite pulping mills. Bio-wastes generated onsite should be utilized at their maximum potential for energy generation in order to maximize the GHG emissions reduction.
Zheng, Guanglou; Fang, Gengfa; Shankaran, Rajan; Orgun, Mehmet A; Zhou, Jie; Qiao, Li; Saleem, Kashif
2017-05-01
Generating random binary sequences (BSes) is a fundamental requirement in cryptography. A BS is a sequence of N bits, and each bit has a value of 0 or 1. For securing sensors within wireless body area networks (WBANs), electrocardiogram (ECG)-based BS generation methods have been widely investigated in which interpulse intervals (IPIs) from each heartbeat cycle are processed to produce BSes. Using these IPI-based methods to generate a 128-bit BS in real time normally takes around half a minute. In order to improve the time efficiency of such methods, this paper presents an ECG multiple fiducial-points based binary sequence generation (MFBSG) algorithm. The technique of discrete wavelet transforms is employed to detect arrival time of these fiducial points, such as P, Q, R, S, and T peaks. Time intervals between them, including RR, RQ, RS, RP, and RT intervals, are then calculated based on this arrival time, and are used as ECG features to generate random BSes with low latency. According to our analysis on real ECG data, these ECG feature values exhibit the property of randomness and, thus, can be utilized to generate random BSes. Compared with the schemes that solely rely on IPIs to generate BSes, this MFBSG algorithm uses five feature values from one heart beat cycle, and can be up to five times faster than the solely IPI-based methods. So, it achieves a design goal of low latency. According to our analysis, the complexity of the algorithm is comparable to that of fast Fourier transforms. These randomly generated ECG BSes can be used as security keys for encryption or authentication in a WBAN system.
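A hedged sketch of the interval-to-bits idea (not the authors' exact MFBSG algorithm): take several fiducial intervals per beat, keep a few low-order bits of each, which carry most of the randomness, and concatenate them. Peak times here are synthetic; a real implementation would detect P, Q, R, S, and T with the discrete wavelet transform as described.

```python
# Toy fiducial-interval bit extraction (synthetic peak times, assumed
# offsets in milliseconds; illustrative only).
import numpy as np

rng = np.random.default_rng(7)
beats = 26
r_peaks = np.cumsum(rng.normal(800, 40, beats))          # toy R times (ms)
offsets = {"P": -160.0, "Q": -30.0, "S": 35.0, "T": 250.0}

bits = []
for b in range(1, beats):
    for wave, off in offsets.items():                    # RP, RQ, RS, RT
        interval = (r_peaks[b] + off + rng.normal(0, 4)) - r_peaks[b - 1]
        code = int(round(interval)) & 0b1111             # 4 low-order bits
        bits += [int(x) for x in f"{code:04b}"]

key = "".join(map(str, bits[:128]))                      # 128-bit BS
print(key)
```

Because each beat yields several features instead of one interpulse interval, far fewer beats are needed per 128-bit sequence, which is the latency gain the abstract reports.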
Continuous data assimilation for downscaling large-footprint soil moisture retrievals
NASA Astrophysics Data System (ADS)
Altaf, Muhammad U.; Jana, Raghavendra B.; Hoteit, Ibrahim; McCabe, Matthew F.
2016-10-01
Soil moisture is a key component of the hydrologic cycle, influencing processes leading to runoff generation, infiltration and groundwater recharge, evaporation and transpiration. Generally, the measurement scale for soil moisture is found to be different from the modeling scales for these processes. Reducing this mismatch between observation and model scales is necessary for improved hydrological modeling. An innovative approach to downscaling coarse-resolution soil moisture data by combining continuous data assimilation and physically based modeling is presented. In this approach, we exploit the features of Continuous Data Assimilation (CDA), which was initially designed for general dissipative dynamical systems and later tested numerically on the incompressible Navier-Stokes equations and the Bénard equation. A nudging term, estimated as the misfit between interpolants of the assimilated coarse-grid measurements and the fine-grid model solution, is added to the model equations to constrain the model's large-scale variability by the available measurements. Soil moisture fields generated at a fine resolution by a physically-based vadose zone model (HYDRUS) are subjected to data assimilation conditioned upon coarse-resolution observations. This enables nudging of the model outputs towards values that honor the coarse-resolution dynamics while still being generated at the fine scale. Results show that the approach is feasible for generating fine-scale soil moisture fields across large extents based on coarse-scale observations. This approach is likely to find application in generating fine- and intermediate-resolution soil moisture fields conditioned on the radiometer-based, coarse-resolution products from remote sensing satellites.
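The nudging idea can be shown on a one-dimensional toy (all parameters assumed; the paper uses HYDRUS in place of this diffusion model). The term -mu * (I_h(model) - I_h(obs)) penalizes the misfit between coarse-grid interpolants, steering the fine-scale field toward the coarse observations.

```python
# CDA nudging sketch on 1-D diffusion (assumed parameters, periodic toy).
import numpy as np

nx, dx, dt, D, mu = 200, 1.0, 0.1, 1.0, 0.5
x = np.arange(nx) * dx
truth = np.exp(-((x - 100) / 15.0) ** 2)        # "true" moisture field
model = np.zeros(nx)                            # model starts from rest

coarse_idx = np.arange(0, nx, 20)               # large-footprint retrievals
for step in range(5000):
    obs_interp = np.interp(x, x[coarse_idx], truth[coarse_idx])    # I_h(obs)
    model_interp = np.interp(x, x[coarse_idx], model[coarse_idx])  # I_h(model)
    lap = np.roll(model, 1) - 2 * model + np.roll(model, -1)
    # nudging constrains the large scales by the coarse measurements
    model += dt * (D * lap / dx**2 - mu * (model_interp - obs_interp))

print(np.max(np.abs(model - truth)))   # fine field locks onto coarse truth
```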
NASA Astrophysics Data System (ADS)
Hennig, Hanna; Rödiger, Tino; Laronne, Jonathan B.; Geyer, Stefan; Merz, Ralf
2016-04-01
Flash floods in (semi-)arid regions are fascinating in their suddenness and can be harmful to humans, infrastructure, industry and tourism. Because they are generated within minutes, an early warning system is essential, and a hydrological model is required to quantify flash floods. Current models to predict flash floods are often based on simplified concepts and/or on concepts which were developed for humid regions. To relate such models more closely to local conditions, the processes within catchments where flash floods occur require consideration. In this study we present a monitoring approach to decipher different flash flood generating processes in the ephemeral Wadi Arugot on the western side of the Dead Sea. To understand rainfall input, a dense rain gauge network was installed. Locations of rain gauges were chosen based on land use, slope and soil cover. The spatiotemporal variation of rain intensity will also be available from radar backscatter. Level pressure sensors located at the outlet of major tributaries have been deployed to analyze in which part of the catchment water is generated. To identify the importance of soil moisture preconditions, two cosmic ray sensors have been deployed. At the outlet of the Arugot, water is sampled and level is monitored. To determine water discharge more accurately, water velocity is measured using portable radar velocimetry. A first analysis of flash flood processes will be presented following the FLEX-Topo concept (Savenije, 2010), where each landscape type is represented using an individual hydrological model according to the processes within the three hydrological response units: plateau, desert and outlet. References: Savenije, H. H. G.: HESS Opinions "Topography driven conceptual modelling (FLEX-Topo)", Hydrol. Earth Syst. Sci., 14, 2681-2692, doi:10.5194/hess-14-2681-2010, 2010.
2012-08-01
processed through the powder metallurgy route and develops a polycrystalline microstructure consisting of γ grains with nanoscale γ’ precipitates...on the cooling rate employed. Faster cooling rates, such as those encountered during water quenching the alloy from the high temperature single γ...and the first generation γ’ precipitates. Subsequently on quenching to a lower temperature a second generation of γ’ precipitates are formed that are
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, Shekhar; Koganti, S.B.
2008-07-01
Acetohydroxamic acid (AHA) is a novel complexant for recycle of nuclear-fuel materials. It can be used in ordinary centrifugal extractors, eliminating the need for electro-redox equipment or complex maintenance requirements in a remotely maintained hot cell. In this work, the effect of AHA on Pu(IV) distribution ratios in a 30% TBP system was quantified, modeled, and integrated in the SIMPSEX code. Two sets of batch experiments involving macro Pu concentrations (conducted at IGCAR) and one high-Pu flowsheet (literature) were simulated for AHA-based U-Pu separation. Based on the simulation and validation results, AHA-based next-generation reprocessing flowsheets are proposed for co-processing-based FBR and thermal-fuel reprocessing as well as an evaporator-less macro-level Pu concentration process required for MOX fuel fabrication. Utilization of AHA results in significant simplification in plant design and simpler technology implementations with significant cost savings. (authors)
BIM authoring for an image-based bridge maintenance system of existing cable-supported bridges
NASA Astrophysics Data System (ADS)
Dang, N. S.; Shim, C. S.
2018-04-01
Infrastructure is increasingly becoming the main backbone of metropolitan development. Along with the rise of new facilities, maintenance of existing bridges is indispensable. The term "preventive maintenance" is no longer unfamiliar to engineers; in practice it means the use of a bridge maintenance system (BMS) based on a BIM-oriented model. In this paper, the process of generating a BMS based on a BIM model is introduced in detail. Data management for this BMS is separated into two modules: a site inspection system and an information management system. The noteworthy aspect of this model lies in the closed and automatic real-time process of capturing images, generating the technical damage report, and uploading feedback to the BMS. A pilot BMS for a cable-supported bridge is presented, which showed good performance and potential for further development of preventive maintenance.
Computer systems and methods for the query and visualization multidimensional databases
Stolte, Chris; Tang, Diane L.; Hanrahan, Patrick
2017-04-25
A method of generating a data visualization is performed at a computer having a display, one or more processors, and memory. The memory stores one or more programs for execution by the one or more processors. The process receives user specification of a plurality of characteristics of a data visualization. The data visualization is based on data from a multidimensional database. The characteristics specify at least x-position and y-position of data marks corresponding to tuples of data retrieved from the database. The process generates a data visualization according to the specified plurality of characteristics. The data visualization has an x-axis defined based on data for one or more first fields from the database that specify x-position of the data marks and the data visualization has a y-axis defined based on data for one or more second fields from the database that specify y-position of the data marks.
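An illustrative reading of the claim, with hypothetical field names: the user's specification names the database fields that drive x- and y-position of the marks, and the axes are derived from those same fields.

```python
# Hypothetical sketch of spec-driven mark and axis generation.
rows = [{"region": "N", "sales": 120, "profit": 30},
        {"region": "S", "sales": 90, "profit": 12}]

spec = {"x": "sales", "y": "profit"}            # user-chosen characteristics

marks = [(r[spec["x"]], r[spec["y"]]) for r in rows]
x_axis = {"field": spec["x"],
          "range": (min(m[0] for m in marks), max(m[0] for m in marks))}
y_axis = {"field": spec["y"],
          "range": (min(m[1] for m in marks), max(m[1] for m in marks))}
print(marks, x_axis, y_axis)
```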
Technology advancement of the static feed water electrolysis process
NASA Technical Reports Server (NTRS)
Schubert, F. H.; Wynveen, R. A.
1977-01-01
A program to advance the technology of oxygen- and hydrogen-generating subsystems based on water electrolysis was studied. Major emphasis was placed on static feed water electrolysis, a concept characterized by low power consumption and high intrinsic reliability. The static-feed-based oxygen generation subsystem consists basically of three subassemblies: (1) a combined water electrolysis and product gas dehumidifier module; (2) a product gas pressure controller; and (3) a cyclically filled water feed tank. Development activities were completed at the subsystem as well as at the component level. An extensive test program including single-cell, subsystem and integrated-system testing was completed, with the required test support accessories designed, fabricated, and assembled. Mini-product assurance activities were included throughout all phases of program activities. An extensive number of supporting technology studies were conducted to advance the technology base of the static feed water electrolysis process and to resolve problems.
Study of Nonlinear Propagation of Ultrashort Laser Pulses and Its Application to Harmonic Generation
NASA Astrophysics Data System (ADS)
Weerawarne, Darshana L.
Laser filamentation, one of the exotic nonlinear optical phenomena, is the self-guidance of high-power laser beams due to the dynamic balance between the optical Kerr effect (self-focusing) and other nonlinear effects such as plasma defocusing. It has many applications, including supercontinuum generation (SCG), high-order harmonic generation (HHG), lightning guiding, stand-off sensing, and rain making. The main focus of this work is on studying odd-order harmonic generation (HG) (i.e., 3ω, 5ω, 7ω, etc., where ω is the angular frequency) in centrosymmetric media while a high-power, ultrashort harmonic-driving pulse undergoes nonlinear propagation such as laser filamentation. The investigation of highly controversial nonlinear indices of refraction by measuring low-order HG in air is carried out. Furthermore, time-resolved (i.e., pump-probe) experiments and significant harmonic enhancements are presented, and a novel HG mechanism based on higher-order nonlinearities is proposed to explain the experimental results. C/C++ numerical simulations of the nonlinear Schrödinger equation (NLSE) support the experimental findings. Another project I have performed is laser-based selective sintering. Short-pulse lasers provide a fascinating tool for material processing, especially when conventional oven-based techniques fail to process flexible materials for smart energy/electronics applications. I present experimental and theoretical studies on laser processing of nanoparticle-coated flexible materials, aiming to fabricate flexible electronic devices.
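NLSE simulations of this kind are commonly solved with a split-step Fourier method; a minimal normalized-units sketch (in Python rather than the C/C++ used in the work, with assumed parameters) is:

```python
# Split-step Fourier solver for the 1-D NLSE, i dA/dz = (b2/2) A_tt - g|A|^2 A,
# in normalized units (illustrative, not the dissertation's code).
import numpy as np

nt, T = 1024, 40.0
t = np.linspace(-T / 2, T / 2, nt, endpoint=False)
w = 2 * np.pi * np.fft.fftfreq(nt, d=T / nt)

A = 1.0 / np.cosh(t)                 # initial pulse envelope (soliton-like)
beta2, gamma, dz, steps = -1.0, 1.0, 0.01, 1000

half_disp = np.exp(1j * beta2 * w**2 * dz / 4)     # half dispersion step
for _ in range(steps):
    A = np.fft.ifft(half_disp * np.fft.fft(A))
    A *= np.exp(1j * gamma * np.abs(A) ** 2 * dz)  # Kerr nonlinearity
    A = np.fft.ifft(half_disp * np.fft.fft(A))

print(np.max(np.abs(A)))             # sech pulse stays ~invariant (soliton)
```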
Frank, Steven A.
2010-01-01
We typically observe large-scale outcomes that arise from the interactions of many hidden, small-scale processes. Examples include age of disease onset, rates of amino acid substitutions, and composition of ecological communities. The macroscopic patterns in each problem often vary around a characteristic shape that can be generated by neutral processes. A neutral generative model assumes that each microscopic process follows unbiased or random stochastic fluctuations: random connections of network nodes; amino acid substitutions with no effect on fitness; species that arise or disappear from communities randomly. These neutral generative models often match common patterns of nature. In this paper, I present the theoretical background by which we can understand why these neutral generative models are so successful. I show where the classic patterns come from, such as the Poisson pattern, the normal or Gaussian pattern, and many others. Each classic pattern was often discovered by a simple neutral generative model. The neutral patterns share a special characteristic: they describe the patterns of nature that follow from simple constraints on information. For example, any aggregation of processes that preserves information only about the mean and variance attracts to the Gaussian pattern; any aggregation that preserves information only about the mean attracts to the exponential pattern; any aggregation that preserves information only about the geometric mean attracts to the power law pattern. I present a simple and consistent informational framework of the common patterns of nature based on the method of maximum entropy. This framework shows that each neutral generative model is a special case that helps to discover a particular set of informational constraints; those informational constraints define a much wider domain of non-neutral generative processes that attract to the same neutral pattern. PMID:19538344
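The three attractor statements correspond to standard maximum-entropy results, summarized here for reference (this condensed form is an editorial addition, not a quotation from the paper):

```latex
% Maximize S = -\int p(x)\,\ln p(x)\,dx subject to the stated constraints:
\begin{align*}
\text{mean and variance fixed:} \quad
  & p(x) \propto e^{-\lambda_1 x - \lambda_2 x^2}
  && \Rightarrow\ \text{Gaussian},\\
\text{mean fixed, } x \ge 0: \quad
  & p(x) \propto e^{-\lambda x}
  && \Rightarrow\ \text{exponential},\\
\text{geometric mean fixed:} \quad
  & p(x) \propto e^{-\lambda \ln x} = x^{-\lambda}
  && \Rightarrow\ \text{power law}.
\end{align*}
```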
Hydrogen Generation Via Fuel Reforming
NASA Astrophysics Data System (ADS)
Krebs, John F.
2003-07-01
Reforming is the conversion of a hydrocarbon-based fuel to a gas mixture that contains hydrogen. The H2 that is produced by reforming can then be used to produce electricity via fuel cells. The realization of H2-based power generation via reforming is facilitated by the existence of the liquid fuel and natural gas distribution infrastructures. Coupling these same infrastructures with more portable reforming technology facilitates the realization of fuel cell powered vehicles. The reformer is the first component in a fuel processor. Contaminants in the H2-enriched product stream, such as carbon monoxide (CO) and hydrogen sulfide (H2S), can significantly degrade the performance of current polymer electrolyte membrane fuel cells (PEMFCs). Removal of such contaminants requires extensive processing of the H2-rich product stream prior to utilization by the fuel cell to generate electricity. The remaining components of the fuel processor remove the contaminants in the H2 product stream. For transportation applications the entire fuel processing system must be as small and lightweight as possible to achieve desirable performance requirements. Current efforts at Argonne National Laboratory are focused on catalyst development and reactor engineering of the autothermal processing train for transportation applications.
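For concreteness, the textbook stoichiometry behind reforming, shown here for methane although the abstract covers hydrocarbon fuels generally, is:

```latex
\begin{align*}
\text{steam reforming:}   \quad & \mathrm{CH_4 + H_2O \;\rightarrow\; CO + 3\,H_2}\\
\text{partial oxidation:} \quad & \mathrm{CH_4 + \tfrac{1}{2}\,O_2 \;\rightarrow\; CO + 2\,H_2}\\
\text{water-gas shift:}   \quad & \mathrm{CO + H_2O \;\rightarrow\; CO_2 + H_2}
\end{align*}
```

Autothermal reforming, the focus of the Argonne processing train, pairs the endothermic steam reforming step with the exothermic partial oxidation so the reactor sustains its own heat balance, while the water-gas shift begins the CO cleanup required by PEMFCs.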
Competitiveness and potentials of UCG-CCS on the European energy market
NASA Astrophysics Data System (ADS)
Kempka, T.; Nakaten, N.; Schlüter, R.; Fernandez-Steeger, T.; Azzam, R.
2009-04-01
The worldwide coal reserves can satisfy the world's primary energy demand for several hundred years. However, deep coal deposits with seams of low thickness and structural complexity do not currently allow economic exploitation of many deposits. Here, underground coal gasification (UCG) can offer an economical approach to coal extraction. The intended overall process relies on coal deposit exploitation using directed drillings located at the coal seam base and the subsequent in situ conversion of the coal into a synthesis gas. The resulting synthesis gas is used for electricity generation in a combined cycle plant at the surface. A reduction of the CO2 emissions resulting from the combined process is achieved by subsequent CO2 capture and its injection into the previously gasified coal seams. The scope of the present study was the investigation of UCG-CCS competitiveness on the European energy market and the determination of the impacting factors. For that purpose, a modular model for the calculation of UCG-CCS electricity generation costs was implemented and adapted to the most relevant process parameters. Furthermore, the range of energy supply coverage was estimated based on different German energy generation scenarios.
NASA Astrophysics Data System (ADS)
Nandigam, V.; Crosby, C. J.; Baru, C.; Arrowsmith, R.
2009-12-01
LIDAR is an excellent example of the new generation of powerful remote sensing data now available to Earth science researchers. Capable of producing digital elevation models (DEMs) more than an order of magnitude higher resolution than those currently available, LIDAR data allows earth scientists to study the processes that contribute to landscape evolution at resolutions not previously possible, yet essential for their appropriate representation. Along with these high-resolution datasets comes an increase in the volume and complexity of data that the user must efficiently manage and process in order for it to be scientifically useful. Although there are expensive commercial LIDAR software applications available, processing and analysis of these datasets are typically computationally inefficient on the conventional hardware and software that is currently available to most of the Earth science community. We have designed and implemented an Internet-based system, the OpenTopography Portal, that provides integrated access to high-resolution LIDAR data as well as web-based tools for processing of these datasets. By using remote data storage and high performance compute resources, the OpenTopography Portal attempts to simplify data access and standard LIDAR processing tasks for the Earth Science community. The OpenTopography Portal allows users to access massive amounts of raw point cloud LIDAR data as well as a suite of DEM generation tools to enable users to generate custom digital elevation models to best fit their science applications. The Cyberinfrastructure software tools for processing the data are freely available via the portal and conveniently integrated with the data selection in a single user-friendly interface. The ability to run these tools on powerful Cyberinfrastructure resources instead of their own labs provides a huge advantage in terms of performance and compute power. The system also encourages users to explore data processing methods and the variations in algorithm parameters since all of the processing is done remotely and numerous jobs can be submitted in sequence. The web-based software also eliminates the need for users to deal with the hassles and costs associated with software installation and licensing while providing adequate disk space for storage and personal job archival capability. Although currently limited to data access and DEM generation tasks, the OpenTopography system is modular in design and can be modified to accommodate new processing tools as they become available. We are currently exploring implementation of higher-level DEM analysis tasks in OpenTopography, since such processing is often computationally intensive and thus lends itself to utilization of cyberinfrastructure. Products derived from OpenTopography processing are available in a variety of formats ranging from simple Google Earth visualizations of LIDAR-derived hillshades to various GIS-compatible grid formats. To serve community users less interested in data processing, OpenTopography also hosts 1 km^2 digital elevation model tiles as well as Google Earth image overlays for a synoptic view of the data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Junge, D.C.
1979-09-01
Significant quantities of wood residue fuels are presently being used in industrial steam generating facilities. Recent studies indicate that substantial additional quantities of wood residue fuels are available for energy generation in the form of steam and/or electricity. A limited data base on the combustion characteristics of wood residue fuels has resulted in the installation and operation of inefficient combustion systems for these fuels. This investigation of the combustion characteristics of wood residue fuels was undertaken to provide a data base which could be used to optimize the combustion of such fuels. Optimization of the combustion process in industrial boilers serves to improve combustion efficiency and to reduce air pollutant emissions generated in the combustion process. This report presents data on the combustion characteristics of lodgepole pine wood chips. The data were obtained in a pilot-scale combustion test facility at Oregon State University.
Peng, Chen; Frommlet, Alexandra; Perez, Manuel; Cobas, Carlos; Blechschmidt, Anke; Dominguez, Santiago; Lingel, Andreas
2016-04-14
NMR binding assays are routinely applied in hit finding and validation during early stages of drug discovery, particularly for fragment-based lead generation. To this end, compound libraries are screened by ligand-observed NMR experiments such as STD, T1ρ, and CPMG to identify molecules interacting with a target. The analysis of a high number of complex spectra is performed largely manually and therefore represents a limiting step in hit generation campaigns. Here we report a novel integrated computational procedure that processes and analyzes ligand-observed proton and fluorine NMR binding data in a fully automated fashion. A performance evaluation comparing automated and manual analysis results on (19)F- and (1)H-detected data sets shows that the program delivers robust, high-confidence hit lists in a fraction of the time needed for manual analysis and greatly facilitates visual inspection of the associated NMR spectra. These features enable considerably higher throughput, the assessment of larger libraries, and shorter turn-around times.
Adaptation to sensory-motor reflex perturbations is blind to the source of errors.
Hudson, Todd E; Landy, Michael S
2012-01-06
In the study of visual-motor control, perhaps the most familiar findings involve adaptation to externally imposed movement errors. Theories of visual-motor adaptation based on optimal information processing suppose that the nervous system identifies the sources of errors to effect the most efficient adaptive response. We report two experiments using a novel perturbation based on stimulating a visually induced reflex in the reaching arm. Unlike adaptation to an external force, our method induces a perturbing reflex within the motor system itself, i.e., perturbing forces are self-generated. This novel method allows a test of the theory that error source information is used to generate an optimal adaptive response. If the self-generated source of the visually induced reflex perturbation is identified, the optimal response will be via reflex gain control. If the source is not identified, a compensatory force should be generated to counteract the reflex. Gain control is the optimal response to reflex perturbation, both because energy cost and movement errors are minimized. Energy is conserved because neither reflex-induced nor compensatory forces are generated. Precision is maximized because endpoint variance is proportional to force production. We find evidence against source-identified adaptation in both experiments, suggesting that sensory-motor information processing is not always optimal.
Ultrasound-assisted vapor generation of mercury.
Ribeiro, Anderson S; Vieira, Mariana A; Willie, Scott; Sturgeon, Ralph E
2007-06-01
Cold vapor generation arising from reduction of both Hg(2+) and CH(3)Hg(+) occurs using ultrasonic (US) fields of sufficient density to achieve both localized heating as well as radical-based attack in solutions of formic and acetic acids and tetramethylammonium hydroxide (TMAH). A batch sonoreactor utilizing an ultrasonic probe as an energy source and a flow-through system based on a US bath were optimized for this purpose. Reduction of CH(3)Hg(+) to Hg(0) occurs only at relatively high US field density (>10 W cm(-3) of sample solution) and is thus not observed when a conventional US bath is used for cold vapor generation. Speciation of mercury is thus possible by altering the power density during the measurement process. Thermal reduction of Hg(2+) is efficient in formic acid and TMAH at 70 degrees C and occurs in the absence of the US field. Room-temperature studies with the batch sonoreactor reveal a slow reduction process, producing temporally broad signals having an efficiency of approximately 68% of that arising from use of a conventional SnCl(2) reduction system. Molecular species of mercury are generated at high concentrations of formic and acetic acid. Factors affecting the generation of Hg(0) were optimized and the batch sonoreactor was used for the determination of total mercury in SLRS-4 river water reference material.
A continuous quality improvement team approach to adverse drug reaction reporting.
Flowers, P; Dzierba, S; Baker, O
1992-07-01
Cross-functional teams can generate more new ideas, concepts, and possible solutions than a department-based process alone. Working collaboratively with CQI approaches and appropriate tools can increase team knowledge. CQI produces growth and development at multiple levels, resulting from involvement in the process of incremental improvement.
The Red and White Yeast Lab: An Introduction to Science as a Process.
ERIC Educational Resources Information Center
White, Brian T.
1999-01-01
Describes an experimental system based on an engineered strain of bakers' yeast that is designed to involve students in the process by which scientific knowledge is generated. Students are asked to determine why the yeast grow to form a reproducible pattern of red and white. (WRM)
Investigation of Copper Sorption by Sugar Beet Processing Lime Waste
In the western United States, sugar beet processing for sugar recovery generates a lime-based waste product (~250,000 Mg yr-1) that has little liming value in the region’s calcareous soils. This area has recently experienced an increase in dairy production, with dairi...
Students Matter: Quality Measurements in Online Courses
ERIC Educational Resources Information Center
Unal, Zafer; Unal, Aslihan
2016-01-01
Quality Matters (QM) is a peer review process designed to certify the quality of online courses and online components. It has generated widespread interest and received national recognition for its peer-based approach to quality assurance and continuous improvement in online education. While the entire QM online course design process is…
NASA Technical Reports Server (NTRS)
Margaria, Tiziana (Inventor); Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor); Steffen, Bernard (Inventor)
2010-01-01
Systems, methods and apparatus are provided through which, in some embodiments, automata learning algorithms and techniques are implemented to generate a more complete set of scenarios for requirements-based programming. More specifically, a CSP-based, syntax-oriented model construction, which requires the support of a theorem prover, is complemented by model extrapolation via automata learning. Because requirements are partial by nature, this may support their systematic completion while keeping the focus on the most prominent scenarios. It may also generalize requirement skeletons by extrapolation and may indicate, by way of automatically generated traces, where the requirement specification is too loose and additional information is required.
Embedding Task-Based Neural Models into a Connectome-Based Model of the Cerebral Cortex.
Ulloa, Antonio; Horwitz, Barry
2016-01-01
A number of recent efforts have used large-scale, biologically realistic, neural models to help understand the neural basis for the patterns of activity observed in both resting state and task-related functional neural imaging data. An example of the former is The Virtual Brain (TVB) software platform, which allows one to apply large-scale neural modeling in a whole brain framework. TVB provides a set of structural connectomes of the human cerebral cortex, a collection of neural processing units for each connectome node, and various forward models that can convert simulated neural activity into a variety of functional brain imaging signals. In this paper, we demonstrate how to embed a previously or newly constructed task-based large-scale neural model into the TVB platform. We tested our method on a previously constructed large-scale neural model (LSNM) of visual object processing that consisted of interconnected neural populations that represent primary and secondary visual, inferotemporal, and prefrontal cortex. Some neural elements in the original model were "non-task-specific" (NS) neurons that served as noise generators to "task-specific" neurons that processed shapes during a delayed match-to-sample (DMS) task. We replaced the NS neurons with an anatomical TVB connectome model of the cerebral cortex comprising 998 regions of interest interconnected by white matter fiber tract weights. We embedded our LSNM of visual object processing into corresponding nodes within the TVB connectome. Reciprocal connections between TVB nodes and our task-based modules were included in this framework. We ran visual object processing simulations and showed that the TVB simulator successfully replaced the noise generation originally provided by NS neurons; i.e., the DMS tasks performed with the hybrid LSNM/TVB simulator generated equivalent neural and fMRI activity to that of the original task-based models. Additionally, we found partial agreement between the functional connectivities using the hybrid LSNM/TVB model and the original LSNM. Our framework thus presents a way to embed task-based neural models into the TVB platform, enabling a better comparison between empirical and computational data, which in turn can lead to a better understanding of how interacting neural populations give rise to human cognitive behaviors.
Client-Side Event Processing for Personalized Web Advertisement
NASA Astrophysics Data System (ADS)
Stühmer, Roland; Anicic, Darko; Sen, Sinan; Ma, Jun; Schmidt, Kay-Uwe; Stojanovic, Nenad
The market for Web advertisement is continuously growing and, correspondingly, the number of approaches that can be used for realizing Web advertisement is increasing. However, current approaches fail to generate highly personalized ads for the current Web user visiting a particular Web page. They mainly try to develop a profile based on the content of that Web page or on a long-term user profile, without taking into account the user's current preferences. We argue that by discovering a user's interest from his current Web behavior we can support the process of ad generation, especially the relevance of an ad for the user. In this paper we present the conceptual architecture and implementation of such an approach. The approach is based on the extraction of simple events from the user's interaction with a Web page and their combination in order to discover the user's interests. We use semantic technologies in order to build such an interpretation out of many simple events. We present results from preliminary evaluation studies. The main contribution of the paper is a very efficient, semantic-based client-side architecture for generating and combining Web events. The architecture ensures the agility of the whole advertisement system by performing complex event processing on the client. In general, this work contributes to the realization of new, event-driven applications for the (Semantic) Web.
Understanding force-generating microtubule systems through in vitro reconstitution
Kok, Maurits; Dogterom, Marileen
2016-01-01
Microtubules switch between growing and shrinking states, a feature known as dynamic instability. The biochemical parameters underlying dynamic instability are modulated by a wide variety of microtubule-associated proteins that enable the strict control of microtubule dynamics in cells. The forces generated by controlled growth and shrinkage of microtubules drive a large range of processes, including organelle positioning, mitotic spindle assembly, and chromosome segregation. In the past decade, our understanding of microtubule dynamics and microtubule force generation has progressed significantly. Here, we review the microtubule-intrinsic process of dynamic instability, the effect of external factors on this process, and how the resulting forces act on various biological systems. Recently, reconstitution-based approaches have strongly benefited from extensive biochemical and biophysical characterization of individual components that are involved in regulating or transmitting microtubule-driven forces. We will focus on the current state of reconstituting increasingly complex biological systems and provide new directions for future developments. PMID:27715396
Thermally assisted nanosecond laser generation of ferric nanoparticles
NASA Astrophysics Data System (ADS)
Kurselis, K.; Kozheshkurt, V.; Kiyan, R.; Chichkov, B.; Sajti, L.
2018-03-01
A technique to increase nanosecond-laser-based production of ferric nanoparticles by elevating the temperature of the iron target and controlling its surface exposure to oxygen is reported. High-power near-infrared laser ablation of the iron target heated up to 600 °C enhances the particle generation efficiency by more than tenfold, exceeding 6 μg/J. Temporal and thermal dependencies of the particle generation process indicate correlation of this enhancement with the oxidative processes that take place on the iron surface during the per-spot interpulse delay. Nanoparticles produced using the heat-assisted ablation technique are examined using scanning electron and transmission electron microscopy, confirming the presence of 1-100 nm nanoparticles with an exponential size distribution that contain multiple randomly oriented magnetite nanocrystallites. The described process enables the application of high-power lasers and facilitates precise, uniform, and controllable direct deposition of ferric nanoparticle coatings at industry-relevant rates.
Ultrafast acousto-optic mode conversion in optically birefringent ferroelectrics
NASA Astrophysics Data System (ADS)
Lejman, Mariusz; Vaudel, Gwenaelle; Infante, Ingrid C.; Chaban, Ievgeniia; Pezeril, Thomas; Edely, Mathieu; Nataf, Guillaume F.; Guennou, Mael; Kreisel, Jens; Gusev, Vitalyi E.; Dkhil, Brahim; Ruello, Pascal
2016-08-01
The ability to generate efficient giga-terahertz coherent acoustic phonons with femtosecond laser makes acousto-optics a promising candidate for ultrafast light processing, which faces electronic device limits intrinsic to complementary metal oxide semiconductor technology. Modern acousto-optic devices, including optical mode conversion process between ordinary and extraordinary light waves (and vice versa), remain limited to the megahertz range. Here, using coherent acoustic waves generated at tens of gigahertz frequency by a femtosecond laser pulse, we reveal the mode conversion process and show its efficiency in ferroelectric materials such as BiFeO3 and LiNbO3. Further to the experimental evidence, we provide a complete theoretical support to this all-optical ultrafast mechanism mediated by acousto-optic interaction. By allowing the manipulation of light polarization with gigahertz coherent acoustic phonons, our results provide a novel route for the development of next-generation photonic-based devices and highlight new capabilities in using ferroelectrics in modern photonics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harvey, Dustin Yewell
Echo™ is a MATLAB-based software package designed for robust and scalable analysis of complex data workflows. An alternative to tedious, error-prone conventional processes, Echo is based on three transformative principles for data analysis: self-describing data, name-based indexing, and dynamic resource allocation. The software takes an object-oriented approach to data analysis, intimately connecting measurement data with associated metadata. Echo operations in an analysis workflow automatically track and merge metadata and computation parameters to provide a complete history of the process used to generate final results, while automated figure and report generation tools eliminate the potential to mislabel those results. History reporting and visualization methods provide straightforward auditability of analysis processes. Furthermore, name-based indexing on metadata greatly improves code readability for analyst collaboration and reduces opportunities for errors to occur. Echo efficiently manages large data sets using a framework that seamlessly allocates resources such that only the necessary computations to produce a given result are executed. Echo provides a versatile and extensible framework, allowing advanced users to add their own tools and data classes tailored to their own specific needs. Applying these transformative principles and powerful features, Echo greatly improves analyst efficiency and quality of results in many application areas.
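Echo itself is MATLAB-based and its interfaces are not reproduced here; the following Python toy merely illustrates two of the stated principles, self-describing data with name-based indexing and automatic history tracking, under invented names.

```python
# Not Echo's actual API: a minimal illustration of self-describing data,
# name-based indexing, and provenance (history) tracking.
class Dataset:
    def __init__(self, data, meta):
        self.data = dict(data)        # name -> measurement values
        self.meta = dict(meta)        # units, channels, acquisition info
        self.history = []             # record of operations applied

    def __getitem__(self, name):      # name-based, not positional, indexing
        return self.data[name]

    def apply(self, name, fn, **params):
        out = Dataset(self.data, self.meta)
        out.data[name] = fn(self.data[name], **params)
        out.history = self.history + [(fn.__name__, name, params)]
        return out

ds = Dataset({"accel_ch1": [0.1, 0.4, 0.2]}, {"units": "g", "fs_hz": 1024})
scaled = ds.apply("accel_ch1", lambda x, k: [k * v for v in x], k=9.81)
print(scaled.history)   # complete provenance of how the result was made
```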
Context-based automated defect classification system using multiple morphological masks
Gleason, Shaun S.; Hunt, Martin A.; Sari-Sarraf, Hamed
2002-01-01
Automatic detection of defects during the fabrication of semiconductor wafers is largely automated, but the classification of those defects is still performed manually by technicians. This invention includes novel digital image analysis techniques that generate unique feature vector descriptions of semiconductor defects as well as classifiers that use these descriptions to automatically categorize the defects into one of a set of pre-defined classes. Feature extraction techniques based on multiple-focus images, multiple-defect mask images, and segmented semiconductor wafer images are used to create unique feature-based descriptions of the semiconductor defects. These feature-based defect descriptions are subsequently classified by a defect classifier into categories that depend on defect characteristics and defect contextual information, that is, the semiconductor process layer(s) with which the defect comes in contact. At the heart of the system is a knowledge database that stores and distributes historical semiconductor wafer and defect data to guide the feature extraction and classification processes. In summary, this invention takes as its input a set of images containing semiconductor defect information, and generates as its output a classification for the defect that describes not only the defect itself, but also the location of that defect with respect to the semiconductor process layers.
Echtermeyer, Alexander; Amar, Yehia; Zakrzewski, Jacek; Lapkin, Alexei
2017-01-01
A recently described C(sp3)-H activation reaction to synthesise aziridines was used as a model reaction to demonstrate the methodology of developing a process model using model-based design of experiments (MBDoE) and self-optimisation approaches in flow. The two approaches are compared in terms of experimental efficiency. The self-optimisation approach required the smallest number of experiments to reach the specified objectives of cost and product yield, whereas the MBDoE approach enabled rapid generation of a process model.
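A toy of the self-optimisation loop (hypothetical yield surface; the actual work optimized a flow synthesis against cost and yield objectives): each "experiment" is a function evaluation, and a derivative-free optimizer proposes the next conditions.

```python
# Self-optimisation sketch with an invented yield surface (illustrative).
import numpy as np
from scipy.optimize import minimize

def run_experiment(x):
    temp, res_time = x
    # stand-in for the flow reactor + online analysis, returning -yield
    yield_ = np.exp(-((temp - 120) / 25) ** 2 - ((res_time - 8) / 3) ** 2)
    return -yield_

res = minimize(run_experiment, x0=[100.0, 5.0], method="Nelder-Mead")
print(res.x, -res.fun)   # converges near temp = 120, residence time = 8
```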
Concept maps: A tool for knowledge management and synthesis in web-based conversational learning.
Joshi, Ankur; Singh, Satendra; Jaswal, Shivani; Badyal, Dinesh Kumar; Singh, Tejinder
2016-01-01
Web-based conversational learning provides an opportunity for shared knowledge base creation through collaboration and collective wisdom extraction. The amount of information generated in such forums is usually huge and multidimensional (in alignment with the desirable preconditions for constructivist knowledge creation), and sometimes the nature of the expected new information cannot be anticipated in advance. Thus, concept maps (crafted from constructed data) as "process summary" tools may be a solution to improve critical thinking and learning by making connections between the facts or knowledge shared by the participants during online discussion. This exploratory paper begins with a description of this innovation tried on a web-based interacting platform (email list management software), FAIMER-Listserv, and the qualitative evidence generated through peer feedback. This process description is further supported by a theoretical construct which shows how social constructivism (inclusive of autonomy and complexity) affects conversational learning. The paper rationalizes the use of the concept map as a mid-summary tool for extracting information and further sense-making out of this apparent intricacy.
The distinctiveness heuristic in false recognition and false recall.
McCabe, David P; Smith, Anderson D
2006-07-01
The effects of generative processing on false recognition and recall were examined in four experiments using the Deese-Roediger-McDermott false memory paradigm (Deese, 1959; Roediger & McDermott, 1995). In each experiment, a Generate condition in which subjects generated studied words from audio anagrams was compared to a Control condition in which subjects simply listened to studied words presented normally. Rates of false recognition and false recall were lower for critical lures associated with generated lists than for critical lures associated with control lists, but only in between-subjects designs. False recall and recognition did not differ when generate and control conditions were manipulated within subjects. This pattern of results is consistent with the distinctiveness heuristic (Schacter, Israel, & Racine, 1999), a metamemorial decision-based strategy whereby global changes in decision criteria lead to reductions in false memories. This retrieval-based monitoring mechanism appears to operate in a similar fashion in reducing false recognition and false recall.
2017-01-01
Small push–pull molecules attract much attention as prospective donor materials for organic solar cells (OSCs). By chemical engineering, it is possible to combine a number of attractive properties such as broad absorption, efficient charge separation, and vacuum and solution processability in a single molecule. Here we report the synthesis and early-time photophysics of such a molecule, TPA-2T-DCV-Me, based on the triphenylamine (TPA) donor core and dicyanovinyl (DCV) acceptor end group connected by a thiophene bridge. Using time-resolved photoinduced absorption and photoluminescence, we demonstrate that in blends with [70]PCBM the molecule works both as an electron donor and a hole acceptor, thereby allowing for two independent channels of charge generation. The charge-generation process is followed by recombination of interfacial charge-transfer states, which takes place on the subnanosecond time scale as revealed by time-resolved photoluminescence, and by nongeminate recombination, as follows from the OSC performance. Our findings demonstrate the potential of TPA-DCV-based molecules as donor materials for both solution-processed and vacuum-deposited OSCs. PMID:28413568
Weighted Watson-Crick automata
NASA Astrophysics Data System (ADS)
Tamrin, Mohd Izzuddin Mohd; Turaev, Sherzod; Sembok, Tengku Mohd Tengku
2014-07-01
There are tremendous works in biotechnology especially in area of DNA molecules. The computer society is attempting to develop smaller computing devices through computational models which are based on the operations performed on the DNA molecules. A Watson-Crick automaton, a theoretical model for DNA based computation, has two reading heads, and works on double-stranded sequences of the input related by a complementarity relation similar with the Watson-Crick complementarity of DNA nucleotides. Over the time, several variants of Watson-Crick automata have been introduced and investigated. However, they cannot be used as suitable DNA based computational models for molecular stochastic processes and fuzzy processes that are related to important practical problems such as molecular parsing, gene disease detection, and food authentication. In this paper we define new variants of Watson-Crick automata, called weighted Watson-Crick automata, developing theoretical models for molecular stochastic and fuzzy processes. We define weighted Watson-Crick automata adapting weight restriction mechanisms associated with formal grammars and automata. We also study the generative capacities of weighted Watson-Crick automata, including probabilistic and fuzzy variants. We show that weighted variants of Watson-Crick automata increase their generative power.
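To make the weighted construction concrete, the following minimal sketch (in Python; the states, transition table, and weights are invented for illustration and are not taken from the paper) computes the total weight of accepting runs of a toy weighted Watson-Crick automaton over a double strand related by DNA complementarity:

```python
# Minimal sketch of a weighted Watson-Crick automaton. The two heads read
# the upper and lower strands independently; each transition carries a
# weight, a run's weight is the product of its transition weights, and the
# acceptance weight is the sum over all accepting runs (probabilistic
# semantics when outgoing weights sum to 1).

COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

# (state, upper_symbol_or_'', lower_symbol_or_'') -> [(next_state, weight)]
TRANSITIONS = {
    ("q0", "A", ""): [("q0", 0.5), ("q1", 0.5)],
    ("q1", "", "T"): [("q0", 1.0)],
}

def accept_weight(upper, lower, state="q0", i=0, j=0):
    """Sum of run weights over all runs ending in q0 with both heads done."""
    if i == len(upper) and j == len(lower):
        return 1.0 if state == "q0" else 0.0
    total = 0.0
    for (s, a, b), successors in TRANSITIONS.items():
        if s != state:
            continue
        if a and not upper.startswith(a, i):
            continue
        if b and not lower.startswith(b, j):
            continue
        for nxt, w in successors:
            total += w * accept_weight(upper, lower, nxt, i + len(a), j + len(b))
    return total

if __name__ == "__main__":
    u = "AA"
    lo = "".join(COMPLEMENT[c] for c in u)  # Watson-Crick complementary strand
    print(accept_weight(u, lo))            # 0.25 for this toy table
```

With probabilistic weights on each transition, the returned value can be read as the probability that the automaton accepts the molecule, which is the kind of semantics the stochastic and fuzzy variants formalize.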
An Interactive Preliminary Design System of High Speed Forebody and Inlet Flows
NASA Technical Reports Server (NTRS)
Liou, May-Fun; Benson, Thomas J.; Trefny, Charles J.
2010-01-01
This paper demonstrates a simulation-based aerodynamic design process for a high-speed inlet. A genetic algorithm is integrated into the design process to facilitate single-objective optimization. The objective function is the total pressure recovery, obtained with a PNS solver chosen for its computational efficiency. The system uses existing software for geometry definition, mesh generation, and CFD analysis. The process, which produces increasingly desirable designs over many generations of genetic evolution, is carried out automatically. A generic two-dimensional inlet is created as a showcase to demonstrate the capabilities of this tool. A parameterized study of the geometric shape and size of the showcase is also presented.
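The optimization loop itself is a standard single-objective genetic algorithm; a minimal sketch follows (Python), with the PNS flow solver replaced by a placeholder objective and all GA parameters chosen for illustration rather than taken from the paper:

```python
import random

def pressure_recovery(genes):
    # Placeholder for the CFD evaluation of an inlet geometry; in the real
    # process this would invoke mesh generation and the PNS solver.
    return 1.0 - sum((g - 0.5) ** 2 for g in genes)

def evolve(n_genes=4, pop_size=20, generations=50, mut_rate=0.1):
    pop = [[random.random() for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=pressure_recovery, reverse=True)
        elite = pop[: pop_size // 2]                  # truncation selection
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, n_genes)        # one-point crossover
            child = a[:cut] + b[cut:]
            child = [min(1.0, max(0.0, g + random.gauss(0, 0.05)))
                     if random.random() < mut_rate else g for g in child]
            children.append(child)
        pop = elite + children
    return max(pop, key=pressure_recovery)

print(evolve())
```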
Nonlinear, non-stationary image processing technique for eddy current NDE
NASA Astrophysics Data System (ADS)
Yang, Guang; Dib, Gerges; Kim, Jaejoon; Zhang, Lu; Xin, Junjun; Udpa, Lalita
2012-05-01
Automatic analysis of eddy current (EC) data has facilitated the analysis of the large volumes of data generated in the inspection of steam generator tubes in nuclear power plants. The traditional procedure for analysis of EC data includes data calibration, pre-processing, region of interest (ROI) detection, feature extraction, and classification. Accurate ROI detection has been enhanced by pre-processing, which involves reducing noise and other undesirable components as well as enhancing defect indications in the raw measurement. This paper presents the Hilbert-Huang Transform (HHT) for feature extraction and a support vector machine (SVM) for classification. The performance is shown to be significantly better than that of the existing rule-based classification approach used in industry.
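A minimal sketch of such a pipeline is given below (Python), assuming the PyEMD package for empirical mode decomposition and scikit-learn for the SVM; the synthetic signals and feature choices are illustrative stand-ins, not the paper's EC data or exact feature set:

```python
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD          # pip install EMD-signal (assumed available)
from sklearn.svm import SVC

def hht_features(signal, n_imfs=3):
    """Energy and mean instantaneous frequency of the first few IMFs."""
    imfs = EMD().emd(signal)[:n_imfs]
    feats = []
    for imf in imfs:
        analytic = hilbert(imf)
        amp = np.abs(analytic)
        phase = np.unwrap(np.angle(analytic))
        inst_freq = np.diff(phase) / (2 * np.pi)     # cycles per sample
        feats += [np.sum(amp ** 2), np.mean(inst_freq)]
    feats += [0.0] * (2 * n_imfs - len(feats))       # pad if fewer IMFs found
    return feats

# Synthetic "defect" vs "no defect" signals, purely illustrative.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 512)
X, y = [], []
for label in (0, 1):
    for _ in range(20):
        s = rng.normal(0, 0.1, t.size)
        if label:  # localized burst mimicking a defect indication
            s += np.exp(-((t - 0.5) ** 2) / 0.002) * np.sin(2 * np.pi * 50 * t)
        X.append(hht_features(s))
        y.append(label)

clf = SVC(kernel="rbf").fit(X, y)
print(clf.score(X, y))
```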
NASA Astrophysics Data System (ADS)
Okada, Takashi; Nishimura, Fumihiro; Xu, Zhanglian; Yonezawa, Susumu
2018-06-01
We propose a method of reduction-melting at 1000 °C, using a sodium-based flux, to recover lead from cathode-ray tube funnel glass. To recover the added sodium from the treated glass, we combined a reduction-melting process with a subsequent annealing step at 700 °C, generating water-soluble sodium compounds in the molten glass. Using this combined process, this study compares lead removal behavior and the generation of water-soluble sodium compounds (sodium silicates and carbonates) in order to gain fundamental information to enhance the recovery of both lead and sodium. We find that lead removal increases with increasing melting time, whereas the generation efficiency of water-soluble sodium increases and decreases periodically. In particular, near 90% lead removal, the generation of water-soluble sodium compounds decreased sharply, increasing again with the prolongation of melting time. This is due to the different crystallization and phase separation efficiencies of water-soluble sodium in molten glass, whose structure continuously changes with lead removal. Previous studies used a melting time of 60 min in the processes. However, in this study, we observe that a melting time of 180 min enhances the water-soluble sodium generation efficiency.
Post-treatment of reclaimed waste water based on an electrochemical advanced oxidation process
NASA Technical Reports Server (NTRS)
Verostko, Charles E.; Murphy, Oliver J.; Hitchens, G. D.; Salinas, Carlos E.; Rogers, Tom D.
1992-01-01
The purification of reclaimed water is essential to water reclamation technology life-support systems in lunar/Mars habitats. An electrochemical UV reactor is being developed which generates oxidants, operates at low temperatures, and requires no chemical expendables. The reactor is the basis for an advanced oxidation process in which electrochemically generated ozone and hydrogen peroxide are used in combination with ultraviolet light irradiation to produce hydroxyl radicals. Results from this process are presented which demonstrate concept feasibility for removal of organic impurities and disinfection of water for potable and hygiene reuse. Power, size requirements, Faradaic efficiency, and process reaction kinetics are discussed. At the completion of this development effort the reactor system will be installed in JSC's regenerative water recovery test facility for evaluation to compare this technique with other candidate processes.
Back to the Future: Consistency-Based Trajectory Tracking
NASA Technical Reports Server (NTRS)
Kurien, James; Nayak, P. Pandurang; Norvig, Peter (Technical Monitor)
2000-01-01
Given a model of a physical process and a sequence of commands and observations received over time, the task of an autonomous controller is to determine the likely states of the process and the actions required to move the process to a desired configuration. We introduce a representation and algorithms for incrementally generating approximate belief states for a restricted but relevant class of partially observable Markov decision processes with very large state spaces. The algorithm presented incrementally generates, rather than revises, an approximate belief state at any point by abstracting and summarizing segments of the likely trajectories of the process. This enables applications to efficiently maintain a partial belief state when it remains consistent with observations and revisit past assumptions about the process' evolution when the belief state is ruled out. The system presented has been implemented and results on examples from the domain of spacecraft control are presented.
Amyloid-β Production Via Cleavage of Amyloid-β Protein Precursor is Modulated by Cell Density
Zhang, Can; Browne, Andrew; DiVito, Jason R.; Stevenson, Jesse A.; Romano, Donna; Dong, Yuanlin; Xie, Zhongcong; Tanzi, Rudolph E.
2012-01-01
Mounting evidence suggests that Alzheimer disease (AD) is caused by the accumulation of the small peptide, Aβ, a proteolytic cleavage product of amyloid-β protein precursor (AβPP; or APP). Aβ is generated through a serial cleavage of APP by β- and γ-secretase. Aβ40 and Aβ42 are the two main components of amyloid plaques in AD brains, with Aβ42 being more prone to aggregation. APP can also be processed by α-secretase, which cleaves APP within the Aβ sequence, thereby preventing the generation of Aβ. Little is currently known regarding the effects of cell density on APP processing and Aβ generation. Here we assessed the effects of cell density on APP processing in neuronal and non-neuronal cell lines, as well as mouse primary cortical neurons. We found that decreased cell density significantly increases levels of Aβ40, Aβ42, total Aβ, and the ratio of Aβ42:Aβ40. These results also indicate that cell density is a significant modulator of APP processing. Overall, these findings carry profound implications for both previous and forthcoming studies aiming to assess the effects of various conditions and genetic/chemical factors, e.g. novel drugs on APP processing and Aβ generation in cell-based systems. Moreover, it is interesting to speculate whether cell density changes in vivo may also affect APP processing and Aβ levels in the AD brain. PMID:20847415
Dedicated nuclear facilities for electrolytic hydrogen production
NASA Technical Reports Server (NTRS)
Foh, S. E.; Escher, W. J. D.; Donakowski, T. D.
1979-01-01
An advanced-technology, fully dedicated nuclear-electrolytic hydrogen production facility is presented. This plant will produce hydrogen and oxygen only; no electrical power will be generated for off-plant use. The conceptual design is based on hydrogen production to fill a pipeline at 1000 psi from a 3000 MW nuclear heat source. The baseline facility's nuclear-to-shaftpower and shaftpower-to-electricity subsystems, water treatment subsystem, electricity-to-hydrogen subsystem, hydrogen compression, efficiency, and hydrogen production cost are discussed. The final conceptual design integrates a 3000 MWth high-temperature gas-cooled reactor operating at a 980 °C helium reactor-outlet temperature, direct DC electricity generation via acyclic generators, and high-current-density, high-pressure electrolyzers based on the solid polymer electrolyte approach. All subsystems are close-coupled and optimally interfaced, and pipeline hydrogen is produced at 1000 psi. Hydrogen costs were about half those of the conventional nuclear-electrolysis process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buckley, Andy; Butterworth, Jonathan
We review the physics basis, main features and use of general-purpose Monte Carlo event generators for the simulation of proton-proton collisions at the Large Hadron Collider. Topics included are: the generation of hard-scattering matrix elements for processes of interest, at both leading and next-to-leading QCD perturbative order; their matching to approximate treatments of higher orders based on the showering approximation; the parton and dipole shower formulations; parton distribution functions for event generators; non-perturbative aspects such as soft QCD collisions, the underlying event and diffractive processes; the string and cluster models for hadron formation; the treatment of hadron and tau decays; the inclusion of QED radiation and beyond-Standard-Model processes. We describe the principal features of the Ariadne, Herwig++, Pythia 8 and Sherpa generators, together with the Rivet and Professor validation and tuning tools, and discuss the physics philosophy behind the proper use of these generators and tools. This review is aimed at phenomenologists wishing to understand better how parton-level predictions are translated into hadron-level events as well as experimentalists wanting a deeper insight into the tools available for signal and background simulation at the LHC.
Correlation in photon pairs generated using four-wave mixing in a cold atomic ensemble
NASA Astrophysics Data System (ADS)
Ferdinand, Andrew Richard; Manjavacas, Alejandro; Becerra, Francisco Elohim
2017-04-01
Spontaneous four-wave mixing (FWM) in atomic ensembles can be used to generate narrowband entangled photon pairs at or near atomic resonances. While extensive research has been done to investigate the quantum correlations in the time and polarization of such photon pairs, the study and control of the high-dimensional quantum correlations contained in their spatial degrees of freedom has not been fully explored. In our work we experimentally investigate the generation of correlated light from FWM in a cold ensemble of cesium atoms as a function of the frequencies of the pump fields in the FWM process. In addition, we theoretically study the spatial correlations of the photon pairs generated in the FWM process, specifically the joint distribution of their orbital angular momentum (OAM). We investigate the width of the distribution of the OAM modes, known as the spiral bandwidth, and the purity of the OAM correlations as a function of the properties of the pump fields, the collected photons, and the atomic ensemble. These studies will guide experiments involving high-dimensional entanglement of photons generated from this FWM process and OAM-based quantum communication with atomic ensembles. This work is supported by AFOSR Grant FA9550-14-1-0300.
Facio, Flavia M; Sapp, Julie C; Linn, Amy; Biesecker, Leslie G
2012-10-10
Massively-parallel sequencing (MPS) technologies create challenges for informed consent of research participants given the enormous scale of the data and the wide range of potential results. We propose that the consent process in these studies be based on whether they use MPS to test a hypothesis or to generate hypotheses. To demonstrate the differences in these approaches to informed consent, we describe the consent processes for two MPS studies. The purpose of our hypothesis-testing study is to elucidate the etiology of rare phenotypes using MPS. The purpose of our hypothesis-generating study is to test the feasibility of using MPS to generate clinical hypotheses, and to approach the return of results as an experimental manipulation. Issues to consider in both designs include: volume and nature of the potential results, primary versus secondary results, return of individual results, duty to warn, length of interaction, target population, and privacy and confidentiality. The categorization of MPS studies as hypothesis-testing versus hypothesis-generating can help to clarify the issue of so-called incidental or secondary results for the consent process, and aid the communication of the research goals to study participants.
Modeling and Simulation of the Economics of Mining in the Bitcoin Market
Marchesi, Michele
2016-01-01
On January 3, 2009, Satoshi Nakamoto gave rise to the "Bitcoin Blockchain", creating the first block of the chain by hashing on his computer's central processing unit (CPU). Since then, the hash calculations to mine Bitcoin have been getting more and more complex, and consequently the mining hardware has evolved to adapt to this increasing difficulty. Three generations of mining hardware have followed the CPU generation: GPUs, FPGAs, and ASICs. This work presents an agent-based artificial market model of the Bitcoin mining process and of Bitcoin transactions. The goal of this work is to model the economy of the mining process, starting from the GPU generation, the first with economic significance. The model reproduces some "stylized facts" found in real price time series and some core aspects of the mining business. In particular, the computational experiments performed can reproduce the unit root property, the fat tail phenomenon, and the volatility clustering of Bitcoin price series. In addition, under proper assumptions, they can reproduce the generation of Bitcoins, the hashing capability, the power consumption, and the mining hardware and electrical energy expenditures of the Bitcoin network. PMID:27768691
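A stripped-down sketch of the miner-economics feedback at the heart of such a model is shown below (Python); the entry/exit rule, payback horizon, and all numeric parameters are assumptions for illustration, not the paper's calibration:

```python
# Toy agent-based mining economics loop: miners enter when a rig pays back
# quickly enough and exit when mining runs at a loss, which couples network
# hashrate to price and costs.

BLOCKS_PER_DAY = 144
REWARD_BTC = 25.0

def simulate(days=365, price=250.0, hw_hash=1e9, hw_cost=500.0,
             power_cost_per_hash_day=1e-11):
    network_hash = 1e12
    miners = 1000                      # identical miner agents (assumption)
    for _ in range(days):
        # Daily revenue per unit of hashpower (all miners share the reward).
        rev_per_hash = BLOCKS_PER_DAY * REWARD_BTC * price / network_hash
        profit_per_rig = hw_hash * (rev_per_hash - power_cost_per_hash_day)
        if profit_per_rig * 100 > hw_cost:      # ~100-day payback threshold
            miners += 10
            network_hash += 10 * hw_hash
        elif profit_per_rig < 0 and miners > 0:
            miners -= 10
            network_hash -= 10 * hw_hash
        price *= 1 + 0.0005            # drift stand-in for exogenous price
    return miners, network_hash

print(simulate())
```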
Esplandiu, Maria J; Farniya, Ali Afshar; Bachtold, Adrian
2015-11-24
We report a simple yet highly efficient chemical motor that can be controlled with visible light. The motor made from a noble metal and doped silicon acts as a pump, which is driven through a light-activated catalytic reaction process. We show that the actuation is based on electro-osmosis with the electric field generated by chemical reactions at the metal and silicon surfaces, whereas the contribution of diffusio-osmosis to the actuation is negligible. Surprisingly, the pump can be operated using water as fuel. This is possible because of the large ζ-potential of silicon, which makes the electro-osmotic fluid motion sizable even though the electric field generated by the reaction is weak. The electro-hydrodynamic process is greatly amplified with the addition of reactive species, such as hydrogen peroxide, which generates higher electric fields. Another remarkable finding is the tunability of silicon-based pumps. That is, it is possible to control the speed of the fluid with light. We take advantage of this property to manipulate the spatial distribution of colloidal microparticles in the liquid and to pattern colloidal microparticle structures at specific locations on a wafer surface. Silicon-based pumps hold great promise for controlled mass transport in fluids.
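The scaling behind this argument is the standard Helmholtz-Smoluchowski slip velocity (a textbook relation, not a formula quoted from the paper):

$$ v_{\mathrm{slip}} = -\frac{\varepsilon \zeta}{\eta}\, E_{\parallel}, $$

where $\varepsilon$ is the fluid permittivity, $\zeta$ the zeta potential of the surface, $\eta$ the viscosity, and $E_{\parallel}$ the tangential electric field. A large $|\zeta|$ for silicon yields a sizable $v_{\mathrm{slip}}$ even when the reaction-generated $E_{\parallel}$ is weak, which is why water alone can drive the pump.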
NASA Astrophysics Data System (ADS)
Ahmad, Faizan; Chen, Yiqiang; Hu, Lisha; Wang, Shuangquan; Wang, Jindong; Chen, Zhenyu; Jiang, Xinlong; Shen, Jianfei
2017-11-01
Currently available traditional and videogame-based cognitive assessment techniques are inadequate for several reasons. This paper presents a novel psychosocial game suite, BrainStorm, for non-invasive cross-generational collection of cognitive-capability data, which additionally provides cross-generational social support. A motivation behind the development of the presented game suite is to provide an entertaining and exciting platform for its target users in order to collect gameplay-based cognitive-capability data in a non-invasive manner. An extensive evaluation of the presented game suite demonstrated high acceptability and attraction for its target users. In addition, the data collection process was reported to be transparent and non-invasive.
Design Optimization of Gas Generator Hybrid Propulsion Boosters
NASA Technical Reports Server (NTRS)
Weldon, Vincent; Phillips, Dwight; Fink, Larry
1990-01-01
A methodology used in support of a study for NASA/MSFC to optimize the design of a gas generator hybrid propulsion booster for uprating the National Space Transportation System (NSTS) is presented. The objective was to compare alternative configurations for this booster approach, optimizing each candidate concept on a different basis, in order to develop data for a trade table on which a final decision was based. The methodology is capable of processing a large number of independent and dependent variables, adjusting the overall subsystem characteristics to arrive at a best-compromise integrated design that meets various specific optimization criteria subject to selected constraints. For each system considered, a detailed weight statement was generated along with preliminary cost and reliability estimates.
Breckinridge Project, initial effort
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1982-01-01
The project cogeneration plant supplies electric power, process steam and treated boiler feedwater for use by the project plants. The plant consists of multiple turbine generators and steam generators connected to a common main steam header. The major plant systems which are required to produce steam, electrical power and treated feedwater are discussed individually. The systems are: steam, steam generator, steam generator fuel, condensate and feedwater deaeration, condensate and blowdown collection, cooling water, boiler feedwater treatment, coal handling, ash handling (fly ash and bottom ash), electrical, and control system. The plant description is based on the Phase Zero design basis established for Plant 31 in July of 1980 and the steam/condensate balance as presented on Drawing 31-E-B-1. Updating of steam requirements as more refined process information becomes available has generated some changes in the steam balance. Boiler operation with these updated requirements is reflected on Drawing 31-D-B-1A. The major impact of updating has been that less 600 psig steam generated within the process units requires more extraction steam from the turbine generators to close the 600 psig steam balance. Since the 900 psig steam generation from the boilers was fixed at 1,200,000 lb/hr, the additional extraction steam required to close the 600 psig steam balance decreased the quantity of electrical power available from the turbine generators. In the next phase of engineering work, the production of 600 psig steam will be augmented by increasing convection bank steam generation in the Plant 3 fired heaters by 140,000 to 150,000 lb/hr. This modification will allow full rated power generation from the turbine generators.
Thinking big: Towards ideal strains and processes for large-scale aerobic biofuels production
McMillan, James D.; Beckham, Gregg T.
2016-12-22
In this study, global concerns about anthropogenic climate change, energy security and independence, and environmental consequences of continued fossil fuel exploitation are driving significant public and private sector interest and financing to hasten development and deployment of processes to produce renewable fuels, as well as bio-based chemicals and materials, towards scales commensurate with current fossil fuel-based production. Over the past two decades, anaerobic microbial production of ethanol from first-generation hexose sugars derived primarily from sugarcane and starch has reached significant market share worldwide, with fermentation bioreactor sizes often exceeding the million litre scale. More recently, industrial-scale lignocellulosic ethanol plants are emerging that produce ethanol from pentose and hexose sugars using genetically engineered microbes and bioreactor scales similar to first-generation biorefineries.
Terai, Asuka; Nakagawa, Masanori
2007-08-01
The purpose of this paper is to construct a model that represents the human process of understanding metaphors, focusing specifically on similes of the form "an A like B". Generally speaking, human beings are able to generate and understand many sorts of metaphors. This study constructs the model based on a probabilistic knowledge structure for concepts, computed from a statistical analysis of a large-scale corpus. Consequently, the model is able to cover the many kinds of metaphors that human beings can generate. Moreover, the model implements the dynamic process of metaphor understanding by using a neural network with dynamic interactions. Finally, the validity of the model is confirmed by comparing model simulations with the results of a psychological experiment.
NASA Astrophysics Data System (ADS)
Nishino, Takayuki
The face hobbing process has been widely applied in the automotive industry, but so far few analytical tools have been developed for it, which makes it difficult to optimize gear design. To address this situation, this study aims at developing a computerized tool to predict running performances such as the loaded tooth contact pattern and static transmission error. First, based upon a kinematical analysis of the cutting machine, a mathematical description of tooth surface generation is given. Second, based upon the theory of gearing and differential geometry, conjugate tooth surfaces are studied and contact lines are generated. Third, the load distribution along the contact lines is formulated. Finally, the numerical model is validated by measuring the loaded transmission error and the loaded tooth contact pattern.
Integral processing in beyond-Hartree-Fock calculations
NASA Technical Reports Server (NTRS)
Taylor, P. R.
1986-01-01
The increasing rate at which improvements in processing capacity outstrip improvements in input/output performance of large computers has led to recent attempts to bypass generation of a disk-based integral file. The direct self-consistent field (SCF) method of Almlof and co-workers represents a very successful implementation of this approach. This paper is concerned with the extension of this general approach to configuration interaction (CI) and multiconfiguration-self-consistent field (MCSCF) calculations. After a discussion of the particular types of molecular orbital (MO) integrals for which -- at least for most current generation machines -- disk-based storage seems unavoidable, it is shown how all the necessary integrals can be obtained as matrix elements of Coulomb and exchange operators that can be calculated using a direct approach. Computational implementations of such a scheme are discussed.
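In a standard formulation consistent with this approach (the notation here is ours, not the paper's), the Coulomb and exchange matrices for a generalized density $D$ are built directly from atomic-orbital (AO) two-electron integrals, and MO integrals follow by transformation:

$$ J_{\mu\nu}[D] = \sum_{\lambda\sigma} (\mu\nu\,|\,\lambda\sigma)\, D_{\lambda\sigma}, \qquad K_{\mu\nu}[D] = \sum_{\lambda\sigma} (\mu\lambda\,|\,\nu\sigma)\, D_{\lambda\sigma}. $$

Choosing the generalized density $D^{ij}_{\lambda\sigma} = C_{\lambda i} C_{\sigma j}$ for a pair of occupied orbitals $i, j$ gives, for example, $(ab\,|\,ij) = \mathbf{C}_a^{\mathsf T}\, \mathbf{J}[D^{ij}]\, \mathbf{C}_b$, so every required class of MO integrals is reachable through $J$- and $K$-like contractions that can be evaluated integral-direct, without a disk-based AO integral file.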
Proposal for a self-excited electrically driven surface plasmon polariton generator
NASA Astrophysics Data System (ADS)
Bordo, V. G.
2017-01-01
We propose a generator of surface plasmon polaritons (SPPs) which, unlike spasers or plasmon lasers, does not require stimulated emission in the system. Its principle of operation is based on a positive feedback which an ensemble of classical oscillating dipoles experiences from a reflective surface located in its near field. The generator design includes a nanocavity between two metal surfaces which contains metal nanoparticles in its interior. The whole structure is placed onto a prism surface that allows one to detect the generated SPPs in the Kretschmann configuration. The generation process is driven by a moderate DC voltage applied between the metal covers of the cavity. Both the generation criterion and the steady-state operation of the generator are investigated.
Application of Multimedia Design Principles to Visuals Used in Course-Books: An Evaluation Tool
ERIC Educational Resources Information Center
Kuzu, Abdullah; Akbulut, Yavuz; Sahin, Mehmet Can
2007-01-01
This paper introduces an evaluation tool prepared to examine the quality of visuals in course-books. The tool is based on Mayer's Cognitive Theory of Multimedia Learning (i.e. Generative Theory) and its principles regarding the correct use of illustrations within text. The reason to generate the tool, the development process along with the…
Stone, Vathsala I; Lane, Joseph P
2012-05-16
Government-sponsored science, technology, and innovation (STI) programs support the socioeconomic aspects of public policies, in addition to expanding the knowledge base. For example, beneficial healthcare services and devices are expected to result from investments in research and development (R&D) programs, which assume a causal link to commercial innovation. Such programs are increasingly held accountable for evidence of impact; that is, innovative goods and services resulting from R&D activity. However, the absence of comprehensive models and metrics skews evidence gathering toward bibliometrics about research outputs (published discoveries), with less focus on transfer metrics about development outputs (patented prototypes) and almost none on econometrics related to production outputs (commercial innovations). This disparity is particularly problematic for the expressed intent of such programs, as most measurable socioeconomic benefits result from the last category of outputs. This paper proposes a conceptual framework integrating all three knowledge-generating methods into a logic model, useful for planning, obtaining, and measuring the intended beneficial impacts through the implementation of knowledge in practice. Additionally, the integration of the Context-Input-Process-Product (CIPP) model of evaluation proactively builds relevance into STI policies and programs while sustaining rigor. The resulting logic model framework explicitly traces the progress of knowledge from inputs, following it through the three knowledge-generating processes and their respective knowledge outputs (discovery, invention, innovation), as it generates the intended socio-beneficial impacts. It is a hybrid model for generating technology-based innovations, where best practices in new product development merge with a widely accepted knowledge-translation approach. Given the emphasis on evidence-based practice in the medical and health fields and "bench to bedside" expectations for knowledge transfer, sponsors and grantees alike should find the model useful for planning, implementing, and evaluating innovation processes. High-cost/high-risk industries like healthcare require the market deployment of technology-based innovations to improve domestic society in a global economy. An appropriate balance of relevance and rigor in research, development, and production is crucial to optimize the return on public investment in such programs. The technology-innovation process needs a comprehensive operational model to effectively allocate public funds and thereby deliberately and systematically accomplish socioeconomic benefits. PMID:22591638
Quantum Random Number Generation Using a Quanta Image Sensor
Amri, Emna; Felk, Yacine; Stucki, Damien; Ma, Jiaju; Fossum, Eric R.
2016-01-01
A new quantum random number generation method is proposed. The method is based on the randomness of the photon emission process and the single-photon counting capability of the Quanta Image Sensor (QIS). It has the potential to generate high-quality random numbers with a remarkable data output rate. In this paper, the principles of photon statistics and the theory of entropy are discussed. Sample data were collected with a QIS jot device, and their randomness quality was analyzed. The randomness assessment method and results are discussed. PMID:27367698
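The extraction step can be sketched in a few lines (Python); here Poisson draws stand in for physical jot readouts, and a von Neumann extractor (our choice for illustration, not necessarily the paper's post-processing) removes residual bias from the parity bits:

```python
import numpy as np

# Simulated photon counts per jot exposure (stand-in for QIS readouts);
# the parity of each count gives a raw bit.
rng = np.random.default_rng(42)
counts = rng.poisson(lam=2.0, size=100_000)
raw_bits = counts & 1

def von_neumann(bits):
    """Map bit pairs 01 -> 0 and 10 -> 1; discard 00 and 11 (debiasing)."""
    pairs = bits[: len(bits) // 2 * 2].reshape(-1, 2)
    keep = pairs[:, 0] != pairs[:, 1]
    return pairs[keep, 0]

out = von_neumann(raw_bits)
print(len(out), out[:16])
```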
Effect of Pulse Width on Oxygen-fed Ozonizer
NASA Astrophysics Data System (ADS)
Okada, Sho; Wang, Douyan; Namihira, Takao; Katsuki, Sunao; Akiyama, Hidenori
Though general ozonizers based on silent discharge (barrier discharge) have been used to supply ozone in many industrial settings, problems remain, such as improving the ozone yield. In this work, ozone was generated by pulsed discharge in order to improve the characteristics of ozone generation. Pulse width is known to strongly affect the energy efficiency of exhaust gas processing. In this paper, the effect of pulse duration on ozone generation by pulsed discharge in oxygen is reported.
Sensitivity of Attitude Determination on the Model Assumed for ISAR Radar Mappings
NASA Astrophysics Data System (ADS)
Lemmens, S.; Krag, H.
2013-09-01
Inverse synthetic aperture radars (ISAR) are valuable instruments for assessing the state of a large object in low Earth orbit. The images generated by these radars can reach a quality sufficient for use during launch support or contingency operations, e.g. for confirming the deployment of structures, determining structural integrity, or analysing the dynamic behaviour of an object. However, the direct interpretation of ISAR images can be a demanding task due to the nature of the range-Doppler space in which these images are produced. Recently, a tool has been developed by the European Space Agency's Space Debris Office to generate radar mappings of a target in orbit. Such mappings are a 3D-model-based simulation of how an ideal ISAR image would be generated by a ground-based radar under given processing conditions. These radar mappings can be used to support the data interpretation process: for example, by processing predefined attitude scenarios during an observation sequence and comparing them with actual observations, one can detect non-nominal behaviour. Vice versa, one can also estimate the attitude states of the target by fitting the radar mappings to the observations. It has been demonstrated for the latter use case that a coarse approximation of the target through a 3D model is already sufficient to derive attitude information from the generated mappings. The level of detail required of the 3D model is determined by the process of generating ISAR images, which is based on the theory of scattering bodies. A complex surface can therefore return an intrinsically noisy ISAR image; for example, when many instruments on a satellite are visible to the observer, the ISAR image can suffer from multipath reflections. In this paper, we further analyse the sensitivity of the attitude fitting algorithms to variations in the dimensions and the level of detail of the underlying 3D model. Moreover, we investigate the ability to estimate the orientations of different spacecraft components with respect to each other from the fitting procedure.
Janda, Jaroslav; Nfonsam, Valentine; Calienes, Fernanda; Sligh, James E; Jandova, Jana
2016-05-01
Mitochondria are the major source of reactive oxygen species (ROS) in fibroblasts, which are thought to be crucial regulators of wound healing with a potential to affect the expression of nuclear genes involved in this process. ROS generated by mitochondria are involved in all stages of the tissue repair process, but the regulation of the ROS-generating system in fibroblasts remains poorly understood. The purpose of this study was to better understand the molecular mechanisms by which the regulation of ROS levels generated by mitochondria may influence wound repair. A cybrid model system of mtDNA variations was used to study the functional consequences of altered ROS levels on wound healing responses in the uniform nuclear background of cultured ρ(0) fibroblasts. Mitochondrial ROS in cybrids were modulated by ROS-quenching antioxidants to examine their ability to close the wound. Real-time PCR arrays were used to investigate whether ROS generated by specific mtDNA variants can alter the expression of key nuclear-encoded genes central to the wound healing response and oxidative stress. Our data suggest that levels of mitochondrial ROS affect the expression of some nuclear-encoded genes central to the wound healing response and oxidative stress, and that modulation of mitochondrial ROS by antioxidants positively affects the in vitro process of wound closure. Thus, regulation of the mitochondrial ROS-generating system in fibroblasts could be used as an effective natural redox-based strategy to help treat non-healing wounds.
Reflections on Educational Reform in Cuba.
ERIC Educational Resources Information Center
Lindahl, Ronald A.
1998-01-01
Reviews Cuban educational reforms, highlighting 1959, 1976, and 1980s initiatives. Compares Cuba's progress with John Kotter's eight-step process based on establishing a sense of urgency, creating a guiding coalition, developing and communicating the change vision, empowering broad-based action, generating short-term wins, consolidating gains, and…
Deductive Glue Code Synthesis for Embedded Software Systems Based on Code Patterns
NASA Technical Reports Server (NTRS)
Liu, Jian; Fu, Jicheng; Zhang, Yansheng; Bastani, Farokh; Yen, I-Ling; Tai, Ann; Chau, Savio N.
2006-01-01
Automated code synthesis is a constructive process that can be used to generate programs from specifications. It can, thus, greatly reduce software development cost and time. The use of a formal code synthesis approach for software generation further increases the dependability of the system. Though code synthesis has many potential benefits, the synthesis techniques are still limited. Meanwhile, components are widely used in embedded system development. Applying code synthesis to the component-based software development (CBSD) process can greatly enhance the capability of code synthesis while reducing component composition efforts. In this paper, we discuss the issues and techniques for applying deductive code synthesis techniques to CBSD. For deductive synthesis in CBSD, a rule base is the key for inferring appropriate component compositions. We use code patterns to guide the development of rules. Code patterns have been proposed to capture the typical usages of the components. Several general composition operations have been identified to facilitate systematic composition. We present the technique for rule development and automated generation of new patterns from existing code patterns. A case study of using this method in building a real-time control system is also presented.
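A toy illustration of rule-guided deductive composition is sketched below (Python); the rule format, component names, and backward-chaining strategy are invented for illustration and are not the paper's actual rule base:

```python
# Each rule states that a component produces an output type from input types.
RULES = [
    ("SensorReading", (), "read_sensor"),
    ("FilteredReading", ("SensorReading",), "kalman_filter"),
    ("ControlSignal", ("FilteredReading", "Setpoint"), "pid_controller"),
    ("Setpoint", (), "load_setpoint"),
]

def synthesize(goal, seen=frozenset()):
    """Backward-chain from the goal type to a component call sequence."""
    if goal in seen:                      # guard against cyclic rules
        return None
    for out, ins, component in RULES:
        if out != goal:
            continue
        plan, ok = [], True
        for dep in ins:
            sub = synthesize(dep, seen | {goal})
            if sub is None:
                ok = False
                break
            plan += sub
        if ok:
            return plan + [component]
    return None

print(synthesize("ControlSignal"))
# -> ['read_sensor', 'kalman_filter', 'load_setpoint', 'pid_controller']
```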
Igamberdiev, A U
1999-04-01
Biological organization is based on coherent energy transfer, allowing macromolecules to operate with high efficiency and realize computation. Computation is executed with virtually 100% efficiency via the coherent operation of molecular machines, in which low-energy recognitions trigger energy-driven non-equilibrium dynamic processes. The recognition process is of a quantum mechanical nature, being a non-demolition measurement. It underlies the enzymatic conversion of a substrate into a product (an elementary metabolic phenomenon); switching via separation of the direct and reverse routes in futile cycles provides the generation and complication of metabolic networks (coherence within cycles is maintained by the supramolecular organization of enzymes); the genetic level, corresponding to the appearance of digital information, is based on reflective arrows (catalysts realize their own self-reproduction) and the operation of hypercycles. Every metabolic cycle, via reciprocal regulation of both its halves, can generate rhythms and spatial structures (resulting from temporally organized depositions from the cycles). Via coherent events, which percolate from the elementary submolecular level to organismic entities, self-assembly based on molecular complementarity is realized and the dynamic informational field operating within the metabolic network is generated.
Solid waste management of a chemical-looping combustion plant using Cu-based oxygen carriers.
García-Labiano, Francisco; Gayán, Pilar; Adánez, Juan; De Diego, Luis F; Forero, Carmen R
2007-08-15
Management of the waste generated from a Chemical-Looping Combustion (CLC) plant using copper-based materials is analyzed in two ways: the recovery and recycling of the used material, and the disposal of the waste. A copper recovery process coupled to the CLC plant is proposed to avoid the loss of active material through elutriation from the system. Solid residues obtained from a 10 kWth CLC prototype operated for 100 h with a CuO-Al2O3 oxygen carrier prepared by impregnation were used as raw material in the recovery process. Recovery efficiencies of approximately 80% were obtained in the process, whose final products were an eluate of Cu(NO3)2 and a solid. The eluate was used for the preparation of new oxygen carriers by impregnation, which exhibited high reactivity for reduction and oxidation reactions as well as physical and chemical properties adequate for use in a CLC plant. The proposed recovery process greatly decreases both the amount of natural resources (Cu and Al2O3) employed in a CLC power plant and the waste generated in the process. To determine the stability of the different solid streams during deposition in a landfill, they were characterized with respect to their leaching behavior according to European Union regulations. The solid residue finally obtained in the CLC plant coupled to the recovery process (composed of Al2O3 and CuAl2O4) can be classified as a stable, nonreactive hazardous waste acceptable at landfills for nonhazardous wastes.
NASA Astrophysics Data System (ADS)
Garg, M.; Kim, H. Y.; Goulielmakis, E.
2018-05-01
Optical waveforms of light reproducible with subcycle precision underlie applications of lasers in ultrafast spectroscopies, quantum control of matter and light-based signal processing. Nonlinear upconversion of optical pulses via high-harmonic generation in gas media extends these capabilities to the extreme ultraviolet (EUV). However, the waveform reproducibility of the EUV pulses generated in gases is inherently sensitive to intensity and phase fluctuations of the driving field. We used photoelectron interferometry to study the effects of the intensity and carrier-envelope phase of an intense single-cycle optical pulse on the field waveform of EUV pulses generated in quartz nanofilms, and contrasted the results with those obtained in argon gas. The EUV waveforms generated in quartz were found to be virtually immune to the intensity and phase of the driving field, implying a non-recollisional character of the underlying emission mechanism. Waveform-sensitive photonic applications and precision measurements of fundamental processes in optics will benefit from these findings.
Particle acceleration and magnetic field generation in SNR shocks
NASA Astrophysics Data System (ADS)
Suslov, M.; Diamond, P. H.; Malkov, M. A.
2006-04-01
We discuss the diffusive acceleration mechanism in SNR shocks in terms of its potential to accelerate CRs to 10^18 eV, as observations imply. One possibility, currently discussed in the literature, is to resonantly generate a turbulent magnetic field via accelerated particles in excess of the background field. We analyze some problems of this scenario and suggest a different mechanism, which is based on the generation of Alfven waves at the gyroradius scale at the background field level, with a subsequent transfer to longer scales via interaction with strong acoustic turbulence in the shock precursor. The acoustic turbulence, in turn, may be generated by the Drury instability or by parametric instability of the Alfven (A) waves. The essential idea is an A -> A + S decay instability process, where one of the interacting scatterers (the sound, or S, waves) is driven by the Drury instability. This rapidly generates longer-wavelength Alfven waves, which in turn resonate with high-energy CRs, thus binding them to the shock and enabling their further acceleration.
Economy of Force: Continuous Process Improvement And The Air Service
2017-06-01
a household goods move, viewed from the perspective of a customer when interacting with a service organization, assists in demonstrating this...improvement (CPI) as well. The components of a process that deliver a value-added effect to a consumer of the goods or services generated by the process...CPI is largely about the “voice of the customer ” and all organizations, service or production based, have customers and processes. There are value
Batching alternatives for Phase I retrieval wastes to be processed in WRAP Module 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayancsik, B.A.
1994-10-13
During the next two decades, the transuranic (TRU) waste now stored in the 200 Area burial trenches and storage buildings is to be retrieved, processed in the Waste Receiving and Processing (WRAP) Module 1 facility, and shipped to a final disposal facility. The purpose of this document is to identify the criteria that can be used to batch suspect TRU waste, currently in retrievable storage, for processing through the WRAP Module 1 facility. These criteria are then used to generate a batch plan for Phase 1 Retrieval operations, which will retrieve the waste located in Trench 4C-04 of the 200 West Area burial ground. The reasons for batching wastes for processing in WRAP Module 1 include reducing the exposure of workers and the environment to hazardous material and ionizing radiation; maximizing the efficiency of the retrieval, processing, and disposal processes by reducing costs, time, and space throughout the process; reducing analytical sampling and analysis; and reducing the amount of cleanup and decontamination between process runs. The criteria selected for batching the drums of retrieved waste entering WRAP Module 1 are based on the available records for the wastes sent to storage as well as knowledge of the processes that generated these wastes. The batching criteria identified in this document include the following: waste generator; type of process used to generate or package the waste; physical waste form; content of hazardous/dangerous chemicals in the waste; radiochemical type and quantity of waste; drum weight; and special waste types. These criteria were applied to the waste drums currently stored in Trench 4C-04. At least one batching scheme is shown for each of the criteria listed above.
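As a rough illustration of record-based batching on such criteria, the following sketch (Python) groups drum records by a composite key and splits groups on a weight limit; the field names, example records, and the 400 kg limit are assumptions for illustration, not values from the batch plan:

```python
from collections import defaultdict

def batch_key(drum):
    # Composite key over the categorical batching criteria listed above.
    return (drum["generator"], drum["process"], drum["physical_form"],
            drum["hazardous_content"], drum["radiochemical_type"])

def build_batches(drums, max_weight_kg=400.0):
    """Group drums by batching criteria, splitting each group on weight."""
    groups = defaultdict(list)
    for d in drums:
        groups[batch_key(d)].append(d)
    batches = []
    for key, ds in groups.items():
        current, total = [], 0.0
        for d in sorted(ds, key=lambda d: d["weight_kg"]):
            if total + d["weight_kg"] > max_weight_kg and current:
                batches.append((key, current))
                current, total = [], 0.0
            current.append(d)
            total += d["weight_kg"]
        if current:
            batches.append((key, current))
    return batches

drums = [{"generator": "PFP", "process": "glovebox", "physical_form": "solid",
          "hazardous_content": "none", "radiochemical_type": "Pu-239",
          "weight_kg": w} for w in (120.0, 210.0, 180.0)]
print(build_batches(drums))
```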
Performing Art-Based Research: Innovation in Graduate Art Therapy Education
ERIC Educational Resources Information Center
Moon, Bruce L.; Hoffman, Nadia
2014-01-01
This article presents an innovation in art therapy research and education in which art-based performance is used to generate, embody, and creatively synthesize knowledge. An art therapy graduate student's art-based process of inquiry serves to demonstrate how art and performance may be used to identify the research question, to conduct a process…
Process Development for the Design and Manufacturing of Personalizable Mouth Sticks.
Berger, Veronika M; Pölzer, Stephan; Nussbaum, Gerhard; Ernst, Waltraud; Major, Zoltan
2017-01-01
To increase the independence of people with reduced hand/arm functionality, a process to generate personalizable mouth sticks was developed based on the participatory design principle. In a web tool, anybody can choose the geometry and the materials of their mouth piece, stick and tip. Manufacturing techniques (e.g. 3D printing) and materials used in the process are discussed and evaluated.
Wang, Jing; Li, Heng; Fu, Weizhen; Chen, Yao; Li, Liming; Lyu, Qing; Han, Tingting; Chai, Xinyu
2016-01-01
Retinal prostheses have the potential to restore partial vision. Object recognition in scenes of daily life is one of the essential tasks for implant wearers. Because wearers are still limited by the low-resolution visual percepts provided by retinal prostheses, it is important to investigate and apply image processing methods that convey more useful visual information. We proposed two image processing strategies based on Itti's visual saliency map, region of interest (ROI) extraction, and image segmentation. Itti's saliency model generated a saliency map from the original image, in which salient regions were grouped into an ROI by fuzzy c-means clustering. GrabCut then generated a proto-object from the ROI-labeled image, which was recombined with the background and enhanced in two ways: 8-4 separated pixelization (8-4 SP) and background edge extraction (BEE). Results showed that both 8-4 SP and BEE yielded significantly higher recognition accuracy than direct pixelization (DP). Each saliency-based image processing strategy depended on the performance of image segmentation. Under good and perfect segmentation conditions, BEE and 8-4 SP obtained noticeably higher recognition accuracy than DP; under a bad segmentation condition, only BEE boosted performance. The application of saliency-based image processing strategies was verified to be beneficial to object recognition in daily scenes under simulated prosthetic vision. They are expected to aid the development of the image processing module for future retinal prostheses, and thus provide more benefit for patients.
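A reduced sketch of the proto-object extraction and low-resolution rendering steps is shown below (Python with OpenCV); a fixed central rectangle stands in for the saliency/fuzzy-c-means ROI stage, and the file names and grid size are illustrative assumptions:

```python
import numpy as np
import cv2  # opencv-python (assumed available)

def proto_object(img, roi_rect, iters=5):
    """Refine a rectangular ROI into a foreground mask with GrabCut."""
    mask = np.zeros(img.shape[:2], np.uint8)
    bgd = np.zeros((1, 65), np.float64)
    fgd = np.zeros((1, 65), np.float64)
    cv2.grabCut(img, mask, roi_rect, bgd, fgd, iters, cv2.GC_INIT_WITH_RECT)
    fg = (mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD)
    return fg.astype(np.uint8)

def pixelize(gray, grid=(32, 32)):
    """Render at prosthetic-vision resolution via block averaging."""
    small = cv2.resize(gray, grid, interpolation=cv2.INTER_AREA)
    return cv2.resize(small, gray.shape[::-1], interpolation=cv2.INTER_NEAREST)

img = cv2.imread("scene.jpg")                  # placeholder input image
h, w = img.shape[:2]
fg = proto_object(img, (w // 4, h // 4, w // 2, h // 2))
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
rendered = pixelize(np.where(fg == 1, gray, gray // 4))  # dim the background
cv2.imwrite("prosthetic_view.png", rendered)
```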
Huang, Xiao-Bin; Chen, Ye-Hong; Wang, Zhe
2016-05-24
In this paper, we propose an efficient scheme to rapidly generate three-qubit Greenberger-Horne-Zeilinger (GHZ) states by constructing shortcuts to adiabatic passage (STAP) based on Lewis-Riesenfeld (LR) invariants in spatially separated cavities connected by optical fibers. Numerical simulations illustrate that the scheme is not only fast but also robust against the decoherence caused by atomic spontaneous emission, cavity losses, and fiber photon leakage. This might be useful for realizing fast and noise-resistant quantum information processing in multi-qubit systems.
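For reference, the Lewis-Riesenfeld construction underlying such shortcuts rests on a dynamical invariant $I(t)$ of the Hamiltonian $H(t)$ (standard definitions, not equations quoted from the paper):

$$ \frac{\partial I(t)}{\partial t} + \frac{1}{i\hbar}\,[\,I(t), H(t)\,] = 0, $$

so that a solution of the Schrödinger equation can be expanded as $|\Psi(t)\rangle = \sum_n c_n\, e^{i\alpha_n(t)}\, |\phi_n(t)\rangle$ over the eigenstates $|\phi_n(t)\rangle$ of $I(t)$, with Lewis-Riesenfeld phases

$$ \alpha_n(t) = \frac{1}{\hbar} \int_0^{t} \left\langle \phi_n(t') \right| i\hbar\,\frac{\partial}{\partial t'} - H(t') \left| \phi_n(t') \right\rangle \mathrm{d}t'. $$

Designing $H(t)$ so that the system follows a chosen invariant eigenstate exactly is what removes the adiabaticity requirement and makes the fast evolution possible.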
Web-based Toolkit for Dynamic Generation of Data Processors
NASA Astrophysics Data System (ADS)
Patel, J.; Dascalu, S.; Harris, F. C.; Benedict, K. K.; Gollberg, G.; Sheneman, L.
2011-12-01
All computation-intensive scientific research uses structured datasets, including hydrology and all other types of climate-related research. When it comes to testing their hypotheses, researchers might use the same dataset differently, and modify, transform, or convert it to meet their research needs. Currently, many researchers spend a good amount of time performing data processing and building tools to speed up this process. They might routinely repeat the same process activities for new research projects, spending precious time that otherwise could be dedicated to analyzing and interpreting the data. Numerous tools are available to run tests on prepared datasets, and many of them work with datasets in different formats. However, there is still a significant need for applications that can comprehensively handle data transformation and conversion activities and help prepare the various processed datasets required by the researchers. We propose a web-based application (a software toolkit) that dynamically generates data processors capable of performing data conversions, transformations, and customizations based on user-defined mappings and selections. As a first step, the proposed solution allows the users to define various data structures and, in the next step, lets them select various file formats and data conversions for their datasets of interest. In a simple scenario, the core of the proposed web-based toolkit allows the users to define direct mappings between input and output data structures. The toolkit will also support defining complex mappings involving the use of pre-defined sets of mathematical, statistical, date/time, and text manipulation functions. Furthermore, the users will be allowed to define logical cases for input data filtering and sampling. At the end of the process, the toolkit is designed to generate reusable source code and executable binary files for download and use by the scientists. The application is also designed to store all data structures and mappings defined by a user (an author), and allow the original author to modify them using standard authoring techniques. The users can change or define new mappings to create new data processors for download and use. In essence, when executed, the generated data processor binary file can take an input data file in a given format and output this data, possibly transformed, in a different file format. If they so desire, the users will be able to modify the source code directly in order to define more complex mappings and transformations that are not currently supported by the toolkit. Initially aimed at supporting research in hydrology, the toolkit's functions and features can be either directly used or easily extended to other areas of climate-related research. The proposed web-based data processing toolkit will be able to generate various custom software processors for data conversion and transformation in a matter of seconds or minutes, saving a significant amount of researchers' time and allowing them to focus on core research issues.
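The core idea, compiling user-defined field mappings into a reusable converter, can be sketched compactly (Python); the mapping syntax, function names, and CSV-to-JSON example below are illustrative assumptions, not the toolkit's actual interface:

```python
import csv
import json

# Small library of pre-defined transformation functions.
FUNCS = {"to_float": float, "upper": str.upper, "identity": lambda v: v}

def make_processor(mapping):
    """Compile {output_field: (input_field, func_name)} into a converter."""
    def process(row):
        return {out: FUNCS[fn](row[src]) for out, (src, fn) in mapping.items()}
    return process

def convert(csv_path, json_path, mapping):
    proc = make_processor(mapping)
    with open(csv_path, newline="") as f:
        records = [proc(row) for row in csv.DictReader(f)]
    with open(json_path, "w") as f:
        json.dump(records, f, indent=2)

# Example: rename and convert two fields of a hydrology-style dataset.
mapping = {"station": ("site_id", "upper"),
           "discharge_cms": ("flow", "to_float")}
# convert("input.csv", "output.json", mapping)
```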
Solution-processed organic spin-charge converter.
Ando, Kazuya; Watanabe, Shun; Mooser, Sebastian; Saitoh, Eiji; Sirringhaus, Henning
2013-07-01
Conjugated polymers and small organic molecules are enabling new, flexible, large-area, low-cost optoelectronic devices, such as organic light-emitting diodes, transistors and solar cells. Owing to their exceptionally long spin lifetimes, these carbon-based materials could also have an important impact on spintronics, where carrier spins play a key role in transmitting, processing and storing information. However, to exploit this potential, a method for direct conversion of spin information into an electric signal is indispensable. Here we show that a pure spin current can be produced in a solution-processed conducting polymer by pumping spins through a ferromagnetic resonance in an adjacent magnetic insulator, and that this generates an electric voltage across the polymer film. We demonstrate that the experimental characteristics of the generated voltage are consistent with it being generated through an inverse spin Hall effect in the conducting polymer. In contrast with inorganic materials, the conducting polymer exhibits coexistence of high spin-current to charge-current conversion efficiency and long spin lifetimes. Our discovery opens a route for a new generation of molecular-structure-engineered spintronic devices, which could lead to important advances in plastic spintronics.
NASA Astrophysics Data System (ADS)
Zheng, Jigui; Huang, Yuping; Wu, Hongxing; Zheng, Ping
2016-07-01
Transverse-flux machines offer high efficiency and have been applied in Stirling-engine-driven permanent-magnet synchronous linear generator systems; however, their wider application is restricted by a low power factor and a complex manufacturing process. A novel cylindrical, non-overlapping, transverse-flux, permanent-magnet linear motor (TFPLM) is investigated, and a structure offering a high power factor and reduced process complexity is developed. The impact of the magnetic leakage factor on the power factor is discussed and, using a Finite Element Analysis (FEA) model of the Stirling engine and TFPLM, an optimization method for the electromagnetic design of the TFPLM is proposed based on the magnetic leakage factor. The relation between power factor and structure parameters is investigated, and a structure-parameter optimization method is proposed that takes maximum power factor as its goal. Finally, a test bench is built, starting and generating experiments are performed, and good agreement between simulation and experiment is achieved. The power factor is improved and the process complexity is decreased. This research provides guidance for the design of high-power-factor permanent-magnet linear generators.
Emerging CFD technologies and aerospace vehicle design
NASA Technical Reports Server (NTRS)
Aftosmis, Michael J.
1995-01-01
With the recent focus on the needs of design and applications CFD, research groups have begun to address the traditional bottlenecks of grid generation and surface modeling. Now, a host of emerging technologies promise to shortcut or dramatically simplify the simulation process. This paper discusses the current status of these emerging technologies. It will argue that some tools are already available which can have a positive impact on portions of the design cycle. However, in most cases, these tools need to be integrated into specific engineering systems and process cycles to be used effectively. The rapidly maturing status of unstructured and Cartesian approaches for inviscid simulations suggests the possibility of highly automated Euler-boundary layer simulations with application to loads estimation and even preliminary design. Similarly, technology is available to link block-structured mesh generation algorithms with topology libraries to avoid tedious re-meshing of topologically similar configurations. Work in algorithm-based auto-blocking suggests that domain decomposition and point placement operations in multi-block mesh generation may be properly posed as problems in computational geometry, and following this approach may lead to robust algorithmic processes for automatic mesh generation.
NASA Technical Reports Server (NTRS)
Chien, S.
1994-01-01
This paper describes work on the Multimission VICAR Planner (MVP) system to automatically construct executable image processing procedures for custom image processing requests for the JPL Multimission Image Processing Lab (MIPL). This paper focuses on two issues. First, the large search spaces caused by complex plans required the use of hand-encoded control information. In order to address this in a manner similar to that used by human experts, MVP uses a decomposition-based planner to implement hierarchical/skeletal planning at the higher level and then uses a classical operator-based planner to solve subproblems in contexts defined by the high-level decomposition.
An Ontology-Based Conceptual Model For Accumulating And Reusing Knowledge In A DMAIC Process
NASA Astrophysics Data System (ADS)
Nguyen, ThanhDat; Kifor, Claudiu Vasile
2015-09-01
DMAIC (Define, Measure, Analyze, Improve, and Control) is an important process used to enhance the quality of processes based on knowledge. However, it is difficult to access DMAIC knowledge. Conventional approaches face a problem arising from structuring and reusing DMAIC knowledge, mainly because DMAIC knowledge is not represented and organized systematically. In this article, we overcome this problem with a conceptual model that combines the DMAIC process, knowledge management, and ontology engineering. The main idea of our model is to utilize ontologies to represent the knowledge generated by each DMAIC phase. We build five different knowledge bases for storing all knowledge of the DMAIC phases, with the support of the necessary tools and appropriate techniques from the information technology area. Consequently, these knowledge bases make knowledge available to experts, managers, and web users during or after DMAIC execution, in order to share and reuse existing knowledge.
Scale dependent inference in landscape genetics
Samuel A. Cushman; Erin L. Landguth
2010-01-01
Ecological relationships between patterns and processes are highly scale dependent. This paper reports the first formal exploration of how changing scale of research away from the scale of the processes governing gene flow affects the results of landscape genetic analysis. We used an individual-based, spatially explicit simulation model to generate patterns of genetic...
An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application
ERIC Educational Resources Information Center
Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth
2016-01-01
Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
Generic E-Assessment Process Development Based on Reverse Engineering
ERIC Educational Resources Information Center
Hajjej, Fahima; Hlaoui, Yousra Bendaly; Ben Ayed, Leila Jemni
2017-01-01
The e-assessment, as an important part of any e-learning system, faces the same challenges and problems such as problems related to portability, reusability, adaptability, integration and interoperability. Therefore, we need an approach aiming to generate a general process of the e-assessment. The present study consists of the development of a…
Environmental Pollution: Is There Enough Public Concern to Lead to Action?
ERIC Educational Resources Information Center
Sharma, Navin C.; And Others
1975-01-01
Research indicates that the impetus to solve pollution problems may have to come from processes outside the realm of ordinary problem solving institutions. Mass media exposure and involvement in the political process are ineffective in generating antipollution sentiment. "Grass roots" movements based on informal communication may emerge to combat…
Use of GIS-Based Sampling to Inform Food Security Assessments and Decision Making in Kenya
NASA Astrophysics Data System (ADS)
Wahome, A.; Ndubi, A. O.; Ndungu, L. W.; Mugo, R. M.; Flores Cordova, A. I.
2017-12-01
Kenya relies on agricultural production to support local consumption and other processing value chains. With a changing climate in a rain-fed agricultural production system, cropping zones are shifting, and proper decision making requires updated data. Where up-to-date data are not available, they must be generated and passed to the relevant stakeholders to inform their decision making, and the process of generating these data should be cost effective and fast. The Kenyan State Department of Agriculture (SDA) runs an insurance programme for maize farmers in a number of counties in Kenya. Previously, SDA used a list of farmers to identify the crop fields for this insurance programme. However, listing all farmers in each Unit Area of Insurance (UAI) proved tedious and very costly, hence the need for an alternative, but statistically acceptable, sampling methodology. Building on existing cropland maps, SERVIR, a joint NASA-USAID initiative that brings Earth observations (EO) to improved environmental decision making in developing countries, through its hub in Eastern and Southern Africa, developed a high-resolution map based on 10 m Sentinel satellite images, from which a GIS-based sampling frame for identifying maize fields was developed. Sampling points were randomly generated in each UAI and navigated to using hand-held GPS units to identify maize farmers. With GIS-based identification, SDA covers in 1 day an area that took 1 week with list-based identification of farmers. Similarly, SDA spends approximately 3,000 USD per sub-county to locate maize fields using GIS-based sampling, compared to the 10,000 USD spent previously. This represents a 70% cost reduction.
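A minimal sketch of the sampling step described above (an assumed workflow, not SDA's or SERVIR's actual code; the UAI coordinates are invented): rejection sampling of random points inside a UAI polygon using the shapely library.

    import random
    from shapely.geometry import Point, Polygon

    def random_points_in_polygon(poly: Polygon, n: int, seed: int = 42):
        rng = random.Random(seed)
        minx, miny, maxx, maxy = poly.bounds
        points = []
        while len(points) < n:
            p = Point(rng.uniform(minx, maxx), rng.uniform(miny, maxy))
            if poly.contains(p):  # reject points falling outside the UAI
                points.append(p)
        return points

    # Hypothetical UAI boundary (lon/lat degrees) and 20 sampling points
    uai = Polygon([(37.10, -0.50), (37.25, -0.50), (37.25, -0.38), (37.10, -0.38)])
    for pt in random_points_in_polygon(uai, 20):
        print(f"{pt.y:.5f}, {pt.x:.5f}")  # lat, lon waypoints for the GPS units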
Use of parallel computing in mass processing of laser data
NASA Astrophysics Data System (ADS)
Będkowski, J.; Bratuś, R.; Prochaska, M.; Rzonca, A.
2015-12-01
The first part of the paper includes a description of the rules used to generate the algorithm needed for the purpose of parallel computing and also discusses the origins of the idea of research on the use of graphics processors in large scale processing of laser scanning data. The next part of the paper includes the results of an efficiency assessment performed for an array of different processing options, all of which were substantially accelerated with parallel computing. The processing options were divided into the generation of orthophotos using point clouds, coloring of point clouds, transformations, and the generation of a regular grid, as well as advanced processes such as the detection of planes and edges, point cloud classification, and the analysis of data for the purpose of quality control. Most algorithms had to be formulated from scratch in the context of the requirements of parallel computing. A few of the algorithms were based on existing technology developed by the Dephos Software Company and then adapted to parallel computing in the course of this research study. Processing time was determined for each process employed for a typical quantity of data processed, which helped confirm the high efficiency of the solutions proposed and the applicability of parallel computing to the processing of laser scanning data. The high efficiency of parallel computing yields new opportunities in the creation and organization of processing methods for laser scanning data.
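The data-parallel pattern underlying several of the listed operations can be sketched as follows (a simplified CPU illustration using Python's multiprocessing, not the authors' GPU code; the cell size and the max-z reduction are assumptions):

    import numpy as np
    from multiprocessing import Pool

    CELL = 1.0  # grid cell size in metres (assumed)

    def grid_chunk(points: np.ndarray) -> dict:
        # Reduce one chunk of the point cloud to per-cell z maxima.
        cells = {}
        keys = np.floor(points[:, :2] / CELL).astype(int)
        for (i, j), z in zip(map(tuple, keys), points[:, 2]):
            if z > cells.get((i, j), -np.inf):
                cells[(i, j)] = z
        return cells

    def merge(dicts):
        # Combine the independent partial grids into one regular grid.
        out = {}
        for d in dicts:
            for k, z in d.items():
                if z > out.get(k, -np.inf):
                    out[k] = z
        return out

    if __name__ == "__main__":
        pts = np.random.rand(1_000_000, 3) * [1000, 1000, 50]  # synthetic cloud
        with Pool(4) as pool:
            grid = merge(pool.map(grid_chunk, np.array_split(pts, 16)))
        print(len(grid), "occupied cells")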
Competing power-generating technologies for the 21st century
NASA Astrophysics Data System (ADS)
Troost, G. K.
1994-04-01
Several new and advanced power-generating systems are presently being developed, e.g., fuel cells, advanced heat pumps, high-performance gas turbines. An analysis of these systems is presented and is based on projections of comparative studies and relevant trends. For advanced systems, a trade-off between efficiency gain and projected development cost is crucial. Projections for market conditions in the 21st century and, in particular, environmental issues are made in order to assess market-entry opportunities. Results from various case studies indicate challenging opportunities in process and metallurgical industries; several process-integrated configurations are being studied.
Holographic video at 40 frames per second for 4-million object points.
Tsang, Peter; Cheung, W-K; Poon, T-C; Zhou, C
2011-08-01
We propose a fast method for generating digital Fresnel holograms based on an interpolated wavefront-recording plane (IWRP) approach. Our method can be divided into two stages. First, a small, virtual IWRP is derived in a computation-free manner. Second, the IWRP is expanded into a Fresnel hologram with a pair of fast Fourier transform processes, realized on the graphics processing unit (GPU). We demonstrate state-of-the-art experimental results, generating a 2048 × 2048 Fresnel hologram of around 4 × 10^6 object points at a rate of over 40 frames per second.
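The second stage can be illustrated with a toy FFT-based Fresnel propagation (a sketch under assumed parameters, not the authors' GPU implementation):

    import numpy as np

    # samples, pixel pitch, wavelength, propagation distance (all assumed)
    N, pitch, wl, z = 2048, 8e-6, 532e-9, 0.3

    def fresnel_propagate(u0: np.ndarray) -> np.ndarray:
        fx = np.fft.fftfreq(N, d=pitch)
        FX, FY = np.meshgrid(fx, fx)
        # Fresnel (paraxial) transfer function applied in the frequency domain
        H = np.exp(-1j * np.pi * wl * z * (FX**2 + FY**2))
        return np.fft.ifft2(np.fft.fft2(u0) * H)

    wrp = np.zeros((N, N), dtype=complex)
    wrp[N // 2, N // 2] = 1.0                   # single object point on the WRP
    hologram = np.real(fresnel_propagate(wrp))  # toy amplitude hologram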
The TESS Science Processing Operations Center
NASA Technical Reports Server (NTRS)
Jenkins, Jon; Twicken, Joseph D.; McCauliff, Sean; Campbell, Jennifer; Sanderfer, Dwight; Lung, David; Mansouri-Samani, Masoud; Girouard, Forrest; Tenenbaum, Peter; Klaus, Todd;
2016-01-01
The Transiting Exoplanet Survey Satellite (TESS) will conduct a search for Earth's closest cousins starting in late 2017. TESS will discover approximately 1,000 small planets and measure the masses of at least 50 of these small worlds. The Science Processing Operations Center (SPOC) is being developed based on the Kepler science pipeline and will generate calibrated pixels and light curves on the NAS Pleiades supercomputer. The SPOC will search for periodic transit events and generate validation products for the transit-like features in the light curves. All TESS SPOC data products will be archived to the Mikulski Archive for Space Telescopes.
C code generation from Petri-net-based logic controller specification
NASA Astrophysics Data System (ADS)
Grobelny, Michał; Grobelna, Iwona; Karatkevich, Andrei
2017-08-01
The article focuses on programming of logic controllers. It is important that a programming code of a logic controller is executed flawlessly according to the primary specification. In the presented approach we generate C code for an AVR microcontroller from a rule-based logical model of a control process derived from a control interpreted Petri net. The same logical model is also used for formal verification of the specification by means of the model checking technique. The proposed rule-based logical model and formal rules of transformation ensure that the obtained implementation is consistent with the already verified specification. The approach is validated by practical experiments.
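The generation step might be sketched as follows (an illustration of the idea, not the authors' tool; the control rules and I/O variable names are hypothetical): a Python script that emits a C switch statement from a rule-based logical model.

    # Each rule: (state, guard expression, next state, output action).
    # The emitted C fragment assumes the I/O variables (start, valve, ...) are
    # declared elsewhere in the microcontroller program.
    RULES = [
        ("IDLE", "start", "FILL", "valve = 1;"),
        ("FILL", "full",  "HEAT", "valve = 0; heater = 1;"),
        ("HEAT", "hot",   "IDLE", "heater = 0;"),
    ]

    def emit_c(rules) -> str:
        states = sorted({s for s, *_ in rules})
        lines = ["enum state { " + ", ".join(states) + " };",
                 f"enum state st = {rules[0][0]};",
                 "void step(void) {",
                 "  switch (st) {"]
        for state, guard, nxt, action in rules:
            lines += [f"  case {state}:",
                      f"    if ({guard}) {{ {action} st = {nxt}; }}",
                      "    break;"]
        lines += ["  }", "}"]
        return "\n".join(lines)

    print(emit_c(RULES))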
Lu, Emily; Elizondo-Riojas, Miguel-Angel; Chang, Jeffrey T; Volk, David E
2014-06-10
Next-generation sequencing results from bead-based aptamer libraries have demonstrated that traditional DNA/RNA alignment software is insufficient. This is particularly true for X-aptamers containing specialty bases (W, X, Y, Z, ...) that are identified by special encoding. Thus, we sought an automated program that uses the inherent design scheme of bead-based X-aptamers to create a hypothetical reference library and Markov modeling techniques to provide improved alignments. Aptaligner provides this feature as well as length error and noise level cutoff features, is parallelized to run on multiple central processing units (cores), and sorts sequences from a single chip into projects and subprojects.
A cloud-based X73 ubiquitous mobile healthcare system: design and implementation.
Ji, Zhanlin; Ganchev, Ivan; O'Droma, Máirtín; Zhang, Xin; Zhang, Xueji
2014-01-01
Based on the user-centric paradigm for next generation networks, this paper describes a ubiquitous mobile healthcare (uHealth) system based on the ISO/IEEE 11073 personal health data (PHD) standards (X73) and cloud computing techniques. A number of design issues associated with the system implementation are outlined. The system includes a middleware on the user side, providing a plug-and-play environment for heterogeneous wireless sensors and mobile terminals utilizing different communication protocols and a distributed "big data" processing subsystem in the cloud. The design and implementation of this system are envisaged as an efficient solution for the next generation of uHealth systems.
NASA Astrophysics Data System (ADS)
Hassan, Mahmoud A.
2004-02-01
Digital elevation models (DEMs) are important tools in the planning, design, and maintenance of mobile communication networks. This research paper proposes a method for generating high-accuracy DEMs based on SPOT satellite level-1A stereo pair images, ground control points (GCPs), and the Erdas OrthoBASE Pro image processing software. DEMs with a 0.2911 m mean error were achieved for the hilly and heavily populated city of Amman. The generated DEM was used to design a mobile communication network, resulting in a minimum number of radio base transceiver stations, a maximum number of covered regions, and less than 2% dead zones.
A Statistical Method to Distinguish Functional Brain Networks
Fujita, André; Vidal, Maciel C.; Takahashi, Daniel Y.
2017-01-01
One major problem in neuroscience is the comparison of functional brain networks of different populations, e.g., distinguishing the networks of controls and patients. Traditional algorithms are based on a search for isomorphism between networks, assuming that they are deterministic. However, biological networks present randomness that cannot be well modeled by those algorithms. For instance, functional brain networks of distinct subjects of the same population can be different due to individual characteristics. Moreover, networks of subjects from different populations can be generated through the same stochastic process. Thus, a better hypothesis is that networks are generated by random processes. In this case, subjects from the same group are samples from the same random process, whereas subjects from different groups are generated by distinct processes. Using this idea, we developed a statistical test called ANOGVA to test whether two or more populations of graphs are generated by the same random graph model. Our simulations' results demonstrate that we can precisely control the rate of false positives and that the test is powerful to discriminate random graphs generated by different models and parameters. The method also proved robust for unbalanced data. As an example, we applied ANOGVA to an fMRI dataset composed of controls and patients diagnosed with autism or Asperger. ANOGVA identified the cerebellar functional sub-network as statistically different between controls and autism (p < 0.001). PMID:28261045
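The logic of such a test can be illustrated with a toy permutation test on a simple graph statistic (an illustration only; the published ANOGVA statistic is different and operates on whole-graph properties):

    import numpy as np

    rng = np.random.default_rng(0)

    def edge_density(adj: np.ndarray) -> float:
        n = adj.shape[0]
        return adj.sum() / (n * (n - 1))

    def sample_er(n_nodes: int, p: float) -> np.ndarray:
        # Toy Erdos-Renyi graph standing in for one subject's brain network
        a = (rng.random((n_nodes, n_nodes)) < p).astype(float)
        np.fill_diagonal(a, 0)
        return a

    group_a = [edge_density(sample_er(50, 0.10)) for _ in range(20)]
    group_b = [edge_density(sample_er(50, 0.12)) for _ in range(20)]
    obs = abs(np.mean(group_a) - np.mean(group_b))

    # Permute group labels to build the null distribution of the statistic
    pooled = np.array(group_a + group_b)
    count = 0
    for _ in range(10_000):
        rng.shuffle(pooled)
        count += abs(pooled[:20].mean() - pooled[20:].mean()) >= obs
    print("p =", count / 10_000)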
McDermott, K B; Roediger, H L
1996-03-01
Three experiments examined whether a conceptual implicit memory test (specifically, category instance generation) would exhibit repetition effects similar to those found in free recall. The transfer appropriate processing account of dissociations among memory tests led us to predict that the tests would show parallel effects; this prediction was based upon the theory's assumption that conceptual tests will behave similarly as a function of various independent variables. In Experiment 1, conceptual repetition (i.e., following a target word [e.g., puzzles] with an associate [e.g., jigsaw]) did not enhance priming on the instance generation test relative to the condition of simply presenting the target word once, although this manipulation did affect free recall. In Experiment 2, conceptual repetition was achieved by following a picture with its corresponding word (or vice versa). In this case, there was an effect of conceptual repetition on free recall but no reliable effect on category instance generation or category cued recall. In addition, we obtained a picture superiority effect in free recall but not in category instance generation. In the third experiment, when the same study sequence was used as in Experiment 1, but with instructions that encouraged relational processing, priming on the category instance generation task was enhanced by conceptual repetition. Results demonstrate that conceptual memory tests can be dissociated and present problems for Roediger's (1990) transfer appropriate processing account of dissociations between explicit and implicit tests.
Fabrication High Resolution Metrology Target By Step And Repeat Method
NASA Astrophysics Data System (ADS)
Dusa, Mircea
1983-10-01
Based on the photolithography process generally used to generate high-resolution masks for semiconductor ICs, we found a very useful industrial application of laser technology. First, we have generated high-resolution metrology targets which are used in industrial laser measurement interferometers as diffraction gratings. Second, we have generated these targets using a step-and-repeat machine with a He-Ne laser-interferometer-controlled stage as a pattern generator, by means of suitable computer programming. A high-resolution metrology target consists of two chromium plates, one of which is called the "rule" and the other the "vernier". Fig. 1 shows the configuration of the rule and the vernier. The rule has a succession of 3 μm lines generated as a diffraction grating on a 4 x 4 inch chromium blank. The vernier has several exposed fields (areas) with 3-15 μm lines, placed at very precise positions on the chromium blank surface. The high degree of uniformity, tight CD tolerances, and low defect density required by the targets create specialised problems during processing. Details of the processing, together with experimental results, are presented. Before entering into process details, we point out that the dimensional requirements of the reticle target are quite similar to, or perhaps stricter than, those of LSI master masks. These requirements are presented in Fig. 2.
Elaborative retrieval: Do semantic mediators improve memory?
Lehman, Melissa; Karpicke, Jeffrey D
2016-10-01
The elaborative retrieval account of retrieval-based learning proposes that retrieval enhances retention because the retrieval process produces the generation of semantic mediators that link cues to target information. We tested 2 assumptions that form the basis of this account: that semantic mediators are more likely to be generated during retrieval than during restudy and that the generation of mediators facilitates later recall of targets. Although these assumptions are often discussed in the context of retrieval processes, we noted that there was little prior empirical evidence to support either assumption. We conducted a series of experiments to measure the generation of mediators during retrieval and restudy and to examine the effect of the generation of mediators on later target recall. Across 7 experiments, we found that the generation of mediators was not more likely during retrieval (and may be more likely during restudy), and that the activation of mediators was unrelated to subsequent free recall of targets and was negatively related to cued recall of targets. The results pose challenges for both assumptions of the elaborative retrieval account. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
ARIA: Delivering state-of-the-art InSAR products to end users
NASA Astrophysics Data System (ADS)
Agram, P. S.; Owen, S. E.; Hua, H.; Manipon, G.; Sacco, G. F.; Bue, B. D.; Fielding, E. J.; Yun, S. H.; Simons, M.; Webb, F.; Rosen, P. A.; Lundgren, P.; Liu, Z.
2016-12-01
The Advanced Rapid Imaging and Analysis (ARIA) Center for Natural Hazards aims to bring state-of-the-art geodetic imaging capabilities to an operational level in support of local, national, and international hazard response communities. The ARIA project's first foray into operational generation of InSAR products was with the Calimap Project, in collaboration with ASI-CIDOT, using X-band data from the Cosmo-SkyMed constellation. Over the last year, ARIA's processing infrastructure has been significantly upgraded to exploit the free stream of high-quality C-band SAR data from ESA's Sentinel-1 mission and related algorithmic improvements to the ISCE software. ARIA's data system can now operationally generate geocoded unwrapped phase and coherence products in GIS-friendly formats from Sentinel-1 TOPS mode data in an automated fashion, and this capability is currently being exercised at various study sites across the United States and beyond, including Hawaii, Central California, Iceland, and South America. The ARIA team, building on the experience gained from handling X-band and C-band data, has also built an automated machine-learning-based classifier to label the auto-generated interferograms based on phase unwrapping quality. These high-quality "time-series ready" InSAR products generated using state-of-the-art processing algorithms can be accessed by end users through two different mechanisms: 1) a faceted-search interface that includes browse imagery for quick visualization, and 2) an ElasticSearch-based API to enable bulk automated download, post-processing, and time-series analysis. In this talk, we will present InSAR results from various global events that the ARIA system has responded to. We will also discuss the set of geospatial big-data tools, including GIS libraries and API tools, that end users will need to familiarize themselves with in order to maximize the utilization of the continuous stream of InSAR products from the Sentinel-1 and NISAR missions that the ARIA project will generate.
EDGE3: A web-based solution for management and analysis of Agilent two color microarray experiments
Vollrath, Aaron L; Smith, Adam A; Craven, Mark; Bradfield, Christopher A
2009-01-01
Background The ability to generate transcriptional data on the scale of entire genomes has been a boon both in the improvement of biological understanding and in the amount of data generated. The latter, the amount of data generated, has implications when it comes to effective storage, analysis and sharing of these data. A number of software tools have been developed to store, analyze, and share microarray data. However, a majority of these tools do not offer all of these features, nor do they specifically target the commonly used two-color Agilent DNA microarray platform. Thus, the motivating factor for the development of EDGE3 was to incorporate the storage, analysis and sharing of microarray data in a manner that would provide a means for research groups to collaborate on Agilent-based microarray experiments without a large investment in software-related expenditures or extensive training of end-users. Results EDGE3 has been developed with two major functions in mind. The first function is to provide a workflow process for the generation of microarray data by a research laboratory or a microarray facility. The second is to store, analyze, and share microarray data in a manner that doesn't require complicated software. To satisfy the first function, EDGE3 has been developed as a means to establish a well-defined experimental workflow and information system for microarray generation. To satisfy the second function, the software application utilized as the user interface of EDGE3 is a web browser. Within the web browser, a user is able to access the entire functionality, including, but not limited to, the ability to perform a number of bioinformatics-based analyses, collaborate between research groups through a user-based security model, and access the raw data files and quality control files generated by the software used to extract the signals from an array image. Conclusion Here, we present EDGE3, an open-source, web-based application that allows for the storage, analysis, and controlled sharing of transcription-based microarray data generated on the Agilent DNA platform. In addition, EDGE3 provides a means for managing RNA samples and arrays during the hybridization process. EDGE3 is freely available for download at http://edge.oncology.wisc.edu/. PMID:19732451
Biomimetic design processes in architecture: morphogenetic and evolutionary computational design.
Menges, Achim
2012-03-01
Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies.
NASA Astrophysics Data System (ADS)
Mitra, Joydeep; Torres, Andres; Ma, Yuansheng; Pan, David Z.
2018-01-01
Directed self-assembly (DSA) has emerged as one of the most compelling next-generation patterning techniques for sub-7 nm via or contact layers. A key issue in enabling DSA as a mainstream patterning technique is the generation of grapho-epitaxy-based guiding pattern (GP) shapes to assemble the contact patterns on target with high fidelity and resolution. Current GP generation is mostly empirical and limited to a very small number of via configurations. We propose the first model-based GP synthesis algorithm and methodology for on-target and robust DSA on general via pattern configurations. The final post-optical-proximity-correction printed GPs derived from our original synthesized GPs are resilient to process variations and continue to maintain the same DSA fidelity in terms of placement error and target shape.
Web-4D-QSAR: A web-based application to generate 4D-QSAR descriptors.
Ataide Martins, João Paulo; Rougeth de Oliveira, Marco Antônio; Oliveira de Queiroz, Mário Sérgio
2018-06-05
A web-based application is developed to generate 4D-QSAR descriptors using the LQTA-QSAR methodology, based on molecular dynamics (MD) trajectories and topology information retrieved from the GROMACS package. The LQTAGrid module calculates the intermolecular interaction energies at each grid point, considering probes and all aligned conformations resulting from MD simulations. These interaction energies are the independent variables or descriptors employed in a QSAR analysis. A friendly front-end web interface, built using the Django framework and the Python programming language, integrates all steps of the LQTA-QSAR methodology in a way that is transparent to the user, and in the backend, GROMACS and LQTAGrid are executed to generate 4D-QSAR descriptors to be used later in the process of QSAR model building. © 2018 Wiley Periodicals, Inc.
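The kind of grid descriptor LQTAGrid produces can be sketched as follows (assumed functional forms and invented molecule/probe parameters; the actual package derives its energies from GROMACS force-field parameters):

    import numpy as np

    atoms = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])  # hypothetical coords (angstrom)
    charges = np.array([-0.4, 0.4])                        # hypothetical partial charges
    probe_q, eps, sigma = 1.0, 0.2, 3.0                    # assumed probe parameters

    def grid_energies(grid_pts: np.ndarray) -> np.ndarray:
        # Coulomb plus Lennard-Jones energy between the probe and each atom,
        # evaluated at every grid point.
        e = np.zeros(len(grid_pts))
        for xyz, q in zip(atoms, charges):
            r = np.linalg.norm(grid_pts - xyz, axis=1)
            e += 332.0 * probe_q * q / r                          # Coulomb (kcal/mol)
            e += 4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)  # Lennard-Jones
        return e

    # Grid offset by half a step so no point coincides with an atom
    ax = np.arange(-5.5, 6.0, 1.0)
    gx, gy, gz = np.meshgrid(ax, ax, ax)
    grid = np.column_stack([gx.ravel(), gy.ravel(), gz.ravel()])
    descriptors = grid_energies(grid)  # one descriptor per grid point and probe
    print(descriptors.shape)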
Guzmán, R; Carpintero, G; Gordon, C; Orbe, L
2016-10-15
We demonstrate and compare two different photonic-based signal sources for generating the carrier wave in a wireless communication link operating in the millimeter-wave range. The first signal source uses the optical heterodyne technique to generate a 113 GHz carrier wave frequency, while the second employs a different technique based on a pulsed mode-locked source with 100 GHz repetition rate frequency. The two optical sources were fabricated in a multi-project wafer run from an active/passive generic integration platform process using standardized building blocks, including multimode interference reflectors which allow us to define the structures on chip, without the need for cleaved facet mirrors. We highlight the superior performance of the mode-locked sources over an optical heterodyne technique. Error-free transmission was achieved in this experiment.
NASA Astrophysics Data System (ADS)
Shafiq, Natis
Energy transfer (ET) based sensitization of silicon (Si) using proximal nanocrystal quantum dots (NQDs) has been studied extensively in recent years as a means to develop thin and flexible Si-based solar cells. The driving force for this research activity is a reduction in materials cost. To date, the main method for determining the role of ET in sensitizing Si has been optical spectroscopic studies. The quantitative contribution from two modes of ET (namely, nonradiative and radiative) has been reported using time-resolved photoluminescence (TRPL) spectroscopy coupled with extensive theoretical modelling. Thus, optical techniques have established the potential for utilizing ET-based sensitization of Si as a feasible way to develop novel NQD-Si hybrid solar cells. However, the ultimate measure of the efficiency of ET-based mechanisms is the generation of electron-hole pairs by the impinging photons. It is therefore important to perform electrical measurements, yet only a couple of studies have attempted electrical quantification of ET modes. A few studies have focused on photocurrent measurements, without considering industrially relevant photovoltaic (PV) systems. Therefore, there is a need to develop a systematic approach for the electrical quantification of ET-generated charges and to help engineer new PV architectures optimized for harnessing the full advantages of ET mechanisms. Within this context, the work presented in this dissertation aims to develop an experimental testing protocol that can be applied to different PV structures for quantifying ET contributions from electrical measurements. We fabricated bulk Si solar cells as a test structure and utilized CdSe/ZnS NQDs for ET-based sensitization. The NQD-bulk Si hybrid devices showed ~30% PV enhancement after NQD deposition. We measured the external quantum efficiency (EQE) of these devices to quantify ET-generated charges. Reflectance measurements were also performed to decouple the contributions of intrinsic optical effects (i.e., anti-reflection) from NQD-mediated ET processes. Our analysis indicates that the contribution of ET-generated charges cannot be detected by EQE measurements. Instead, changes in the optical properties (i.e., the anti-reflection property) due to the NQD layer are found to be the primary source of the photocurrent enhancement. Based on this finding, we propose to minimize bulk Si absorption by using an ultrathin (~300 nm) Si PV architecture, which should enable measurements of ET-generated charges. We describe an optimized process flow for fabricating such ultrathin Si devices. The devices fabricated by this method behave like photo-detectors and show enhanced sensitivity under 1 Sun AM1.5G illumination. The geometry and process flow of these devices make it possible to incorporate NQDs for sensitization. Overall, this dissertation provides a protocol for the quantification of ET-generated charges and documents an optimized process flow for the development of ultrathin Si solar cells.
Evidence-Based Practices and Implementation Science in Special Education
ERIC Educational Resources Information Center
Cook, Bryan G.; Odom, Samuel L.
2013-01-01
Establishing a process for identifying evidence-based practices (EBPs) in special education has been a significant advance for the field because it has the potential for generating more effective educational programs and producing more positive outcomes for students with disabilities. However, the potential benefit of EBPs is bounded by the…
Remote Labs and Game-Based Learning for Process Control
ERIC Educational Resources Information Center
Zualkernan, Imran A.; Husseini, Ghaleb A.; Loughlin, Kevin F.; Mohebzada, Jamshaid G.; El Gaml, Moataz
2013-01-01
Social networking platforms and computer games represent a natural informal learning environment for the current generation of learners in higher education. This paper explores the use of game-based learning in the context of an undergraduate chemical engineering remote laboratory. Specifically, students are allowed to manipulate chemical…
Web-Based Learning Design Tool
ERIC Educational Resources Information Center
Bruno, F. B.; Silva, T. L. K.; Silva, R. P.; Teixeira, F. G.
2012-01-01
Purpose: The purpose of this paper is to propose a web-based tool that enables the development and provision of learning designs and its reuse and re-contextualization as generative learning objects, aimed at developing educational materials. Design/methodology/approach: The use of learning objects can facilitate the process of production and…
Decision Making: New Paradigm for Education.
ERIC Educational Resources Information Center
Wales, Charles E.; And Others
1986-01-01
Defines education's new paradigm as schooling based on decision making, the critical thinking skills serving it, and the knowledge base supporting it. Outlines a model decision-making process using a hypothetical breakfast problem; a late riser chooses goals, generates ideas, develops an action plan, and implements and evaluates it. (4 references)…
An experimental study of factors affecting the selective inhibition of sintering process
NASA Astrophysics Data System (ADS)
Asiabanpour, Bahram
Selective Inhibition of Sintering (SIS) is a new rapid prototyping method that builds parts on a layer-by-layer basis. SIS works by joining powder particles through sintering in the part's body, and by inhibiting sintering in selected powder areas. The objective of this research has been to improve the SIS process, which was invented at USC. The process improvement is based on statistical design of experiments. To conduct the required experiments, a working machine and related path-generator software were needed. The machine and its control software were available prior to this research; the path-generator algorithms and software had to be created. This program obtains model geometry data from a CAD file and generates an appropriate path file for the printer nozzle, along with a simulation file for path-file inspection using virtual prototyping. The activities related to the path generator constitute the first part of this research, which has resulted in an efficient path generator. In addition, to reach an acceptable level of accuracy, strength, and surface quality in the fabricated parts, all effective factors in the SIS process should be identified and controlled. Simultaneous analytical and experimental studies were conducted to recognize the effective factors and to control the SIS process. Polystyrene was found to be the most appropriate polymer powder and saturated potassium iodide the most effective inhibitor among the available candidate materials. In addition, statistical tools were applied to improve the desirable properties of the parts fabricated by the SIS process. An investigation of part strength was conducted using Response Surface Methodology (RSM), and a region of acceptable operating conditions for part strength was found. Then, through analysis of the experimental results, the impact of the factors on the final part's surface quality and dimensional accuracy was modeled. After developing a desirability-function model, process operating conditions for maximum desirability were identified. Finally, the desirability model was validated.
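The desirability-function step can be sketched with the standard Derringer-Suich construction (the response names, bounds, and exponents below are assumptions, not the thesis's fitted model):

    import numpy as np

    def d_larger_is_better(y, lo, hi, s=1.0):
        # 0 below lo, 1 above hi, power-law ramp in between
        return np.clip((y - lo) / (hi - lo), 0, 1) ** s

    def overall_desirability(strength, accuracy, roughness):
        # Individual desirabilities combined via a geometric mean;
        # accuracy and roughness are negated because smaller is better.
        d1 = d_larger_is_better(strength,   lo=5.0,   hi=20.0)   # MPa (assumed)
        d2 = d_larger_is_better(-accuracy,  lo=-1.0,  hi=-0.1)   # mm error (assumed)
        d3 = d_larger_is_better(-roughness, lo=-50.0, hi=-5.0)   # um Ra (assumed)
        return (d1 * d2 * d3) ** (1 / 3)

    print(overall_desirability(strength=12.0, accuracy=0.4, roughness=20.0))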
The silent base flow and the sound sources in a laminar jet.
Sinayoko, Samuel; Agarwal, Anurag
2012-03-01
An algorithm to compute the silent base flow sources of sound in a jet is introduced. The algorithm is based on spatiotemporal filtering of the flow field and is applicable to multifrequency sources. It is applied to an axisymmetric laminar jet and the resulting sources are validated successfully. The sources are compared to those obtained from two classical acoustic analogies, based on quiescent and time-averaged base flows. The comparison demonstrates how the silent base flow sources shed light on the sound generation process. It is shown that the dominant source mechanism in the axisymmetric laminar jet is "shear-noise," which is a linear mechanism. The algorithm presented here could be applied to fully turbulent flows to understand the aerodynamic noise-generation mechanism. © 2012 Acoustical Society of America
Aggregation Trade Offs in Family Based Recommendations
NASA Astrophysics Data System (ADS)
Berkovsky, Shlomo; Freyne, Jill; Coombe, Mac
Personalized information access tools are frequently based on collaborative filtering recommendation algorithms. Collaborative filtering recommender systems typically suffer from a data sparsity problem, where systems do not have sufficient user data to generate accurate and reliable predictions. Prior research suggested using group-based user data in the collaborative filtering recommendation process to generate group-based predictions and partially resolve the sparsity problem. Although group recommendations are less accurate than personalized recommendations, they are more accurate than general non-personalized recommendations, which are the natural fall back when personalized recommendations cannot be generated. In this work we present initial results of a study that exploits the browsing logs of real families of users gathered in an eHealth portal. The browsing logs allowed us to experimentally compare the accuracy of two group-based recommendation strategies: aggregated group models and aggregated predictions. Our results showed that aggregating individual models into group models resulted in more accurate predictions than aggregating individual predictions into group predictions.
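The two strategies can be contrasted with a toy predictor (an illustration using a trivial item-average model and invented ratings; the study used collaborative filtering on real browsing logs):

    import statistics

    family = {
        "mum":   {"a": 5, "b": 3},
        "dad":   {"a": 4, "c": 2},
        "child": {"b": 4, "c": 5},
    }

    def predict(model: dict, item: str) -> float:
        scores = model.get(item, [])
        return statistics.mean(scores) if scores else 0.0

    def group_model(users) -> dict:
        # Strategy 1: merge individual data into one group model, then predict.
        merged = {}
        for ratings in users.values():
            for item, score in ratings.items():
                merged.setdefault(item, []).append(score)
        return merged

    def aggregated_prediction(users, item: str) -> float:
        # Strategy 2: predict per user, then aggregate the predictions.
        return statistics.mean(
            predict({i: [s] for i, s in ratings.items()}, item)
            for ratings in users.values())

    print(predict(group_model(family), "b"))   # aggregated group model
    print(aggregated_prediction(family, "b"))  # aggregated predictions

The two strategies give different answers for the same item, which is exactly the kind of divergence the study measures for accuracy.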
Mau, T; Hartmann, V; Burmeister, J; Langguth, P; Häusler, H
2004-01-01
The use of steam in sterilization processes is limited by the implementation of heat-sensitive components inside the machines to be sterilized. Alternative low-temperature sterilization methods need to be found and their suitability evaluated. Vaporized Hydrogen Peroxide (VHP) technology was adapted for a production machine consisting of highly sensitive pressure sensors and thermo-labile air tube systems. This new kind of "cold" surface sterilization, known from the Barrier Isolator Technology, is based on the controlled release of hydrogen peroxide vapour into sealed enclosures. A mobile VHP generator was used to generate the hydrogen peroxide vapour. The unit was combined with the air conduction system of the production machine. Terminal vacuum pumps were installed to distribute the gas within the production machine and for its elimination. In order to control the sterilization process, different physical process monitors were incorporated. The validation of the process was based on biological indicators (Geobacillus stearothermophilus). The Limited Spearman Karber Method (LSKM) was used to statistically evaluate the sterilization process. The results show that it is possible to sterilize surfaces in a complex tube system with the use of gaseous hydrogen peroxide. A total microbial reduction of 6 log units was reached.
A "second generation" of ministry leadership.
Giammalvo, Peter J
2005-01-01
Catholic health care leaders differ from others in the field in that "they are expected to serve as Jesus served, teach as Jesus taught, and lead as Jesus led, in order to heal as Jesus healed." The Catholic health ministry today is led largely by laypeople, what might be called the "first generation" of lay leaders. This first generation was privileged in that it was tutored by and worked alongside women and men religious. Those religious are now mostly gone from the ministry, and that first generation of lay leaders will also be retiring in the not-too-distant future. Leadership will then pass to a "second generation," laypeople who have not worked alongside religious. How is this new generation to learn "to heal as Jesus healed"? Catholic Health East (CHE), Newtown Square, PA, has developed a program explicitly directed at the recruitment and development of second-generation leaders. In its efforts to fill a position, the system first assembles a preferred-candidate profile based on 15 competencies, including seven core competencies. CHE then employs a recruitment process based on behavioral event interviewing. All involved stakeholders participate in the interviews.
Unbiased All-Optical Random-Number Generator
NASA Astrophysics Data System (ADS)
Steinle, Tobias; Greiner, Johannes N.; Wrachtrup, Jörg; Giessen, Harald; Gerhardt, Ilja
2017-10-01
The generation of random bits is of enormous importance in modern information science. Cryptographic security is based on random numbers which require a physical process for their generation. This is commonly performed by hardware random-number generators. These often exhibit a number of problems, namely experimental bias, memory in the system, and other technical subtleties, which reduce the reliability in the entropy estimation. Further, the generated outcome has to be postprocessed to "iron out" such spurious effects. Here, we present a purely optical randomness generator, based on the bistable output of an optical parametric oscillator. Detector noise plays no role and postprocessing is reduced to a minimum. Upon entering the bistable regime, initially the resulting output phase depends on vacuum fluctuations. Later, the phase is rigidly locked and can be well determined versus a pulse train, which is derived from the pump laser. This delivers an ambiguity-free output, which is reliably detected and associated with a binary outcome. The resulting random bit stream resembles a perfect coin toss and passes all relevant randomness measures. The random nature of the generated binary outcome is furthermore confirmed by an analysis of resulting conditional entropies.
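One of the simplest randomness measures such a bit stream must pass is the monobit frequency test; a sketch follows (this does not reproduce the full test batteries applied in the paper):

    import math, random

    def monobit_p_value(bits: str) -> float:
        # NIST SP 800-22 style frequency (monobit) test statistic
        n = len(bits)
        s = sum(1 if b == "1" else -1 for b in bits)
        return math.erfc(abs(s) / math.sqrt(2 * n))

    stream = "".join(random.choice("01") for _ in range(100_000))
    print(monobit_p_value(stream))  # for random input, p is typically well above 0.01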
The Monash University Interactive Simple Climate Model
NASA Astrophysics Data System (ADS)
Dommenget, D.
2013-12-01
The Monash University Interactive Simple Climate Model is a web-based interface that allows students and the general public to explore the physical simulation of the climate system with a real global climate model. It is based on the Globally Resolved Energy Balance (GREB) model, a climate model published by Dommenget and Floeter [2011] in the international peer-reviewed journal Climate Dynamics. The model simulates most of the main physical processes in the climate system in a very simplistic way and therefore allows very fast and simple climate model simulations on a normal PC. Despite its simplicity, the model simulates the climate response to external forcings, such as a doubling of CO2 concentrations, very realistically (similar to state-of-the-art climate models). The Monash Simple Climate Model web interface allows you to study the results of more than 2000 different model experiments in an interactive way, and it offers a number of tutorials on the interactions of physical processes in the climate system along with some puzzles to solve. By switching physical processes off and on, you can deconstruct the climate and learn how the different processes interact to generate the observed climate, and how they interact to generate the IPCC-predicted climate change for an anthropogenic CO2 increase. The presentation will illustrate how this web-based tool works and what the possibilities for teaching students with it are.
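The flavour of the physics behind such models can be conveyed by a zero-dimensional energy-balance model (far simpler than GREB; the constants are standard textbook values, and the emissivity is an assumed effective greenhouse parameter):

    # Absorbed solar radiation balanced against outgoing longwave emission.
    S0, ALBEDO, SIGMA = 1361.0, 0.30, 5.67e-8  # solar constant, albedo, Stefan-Boltzmann
    EPSILON = 0.61                              # effective emissivity (assumed)
    C = 2.1e8                                   # heat capacity of a ~50 m ocean layer (J m-2 K-1)

    def step(T: float, dt: float = 86400.0) -> float:
        absorbed = S0 / 4 * (1 - ALBEDO)
        emitted = EPSILON * SIGMA * T**4
        return T + dt * (absorbed - emitted) / C

    T = 273.0
    for _ in range(365 * 50):  # integrate 50 years with daily steps
        T = step(T)
    print(f"equilibrium temperature: {T - 273.15:.1f} C")  # about 15 C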
Cell-intrinsic mechanisms of temperature compensation in a grasshopper sensory receptor neuron
Roemschied, Frederic A; Eberhard, Monika JB; Schleimer, Jan-Hendrik; Ronacher, Bernhard; Schreiber, Susanne
2014-01-01
Changes in temperature affect biochemical reaction rates and, consequently, neural processing. The nervous systems of poikilothermic animals must have evolved mechanisms enabling them to retain their functionality under varying temperatures. Auditory receptor neurons of grasshoppers respond to sound in a surprisingly temperature-compensated manner: firing rates depend moderately on temperature, with average Q10 values around 1.5. Analysis of conductance-based neuron models reveals that temperature compensation of spike generation can be achieved solely relying on cell-intrinsic processes and despite a strong dependence of ion conductances on temperature. Remarkably, this type of temperature compensation need not come at an additional metabolic cost of spike generation. Firing rate-based information transfer is likely to increase with temperature and we derive predictions for an optimal temperature dependence of the tympanal transduction process fostering temperature compensation. The example of auditory receptor neurons demonstrates how neurons may exploit single-cell mechanisms to cope with multiple constraints in parallel. DOI: http://dx.doi.org/10.7554/eLife.02078.001 PMID:24843016
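For reference, the Q10 coefficient quoted above is defined as the factor by which a rate increases per 10 °C rise; a one-line sketch:

    def q10(rate1: float, rate2: float, t1: float, t2: float) -> float:
        # Q10 = (R2 / R1) ** (10 / (T2 - T1))
        return (rate2 / rate1) ** (10.0 / (t2 - t1))

    # e.g. a firing rate rising from 100 to 150 spikes/s between 21 and 31 degrees C
    print(q10(100.0, 150.0, 21.0, 31.0))  # = 1.5, the compensated regime reported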
Hybrid Grid Techniques for Propulsion Applications
NASA Technical Reports Server (NTRS)
Koomullil, Roy P.; Soni, Bharat K.; Thornburg, Hugh J.
1996-01-01
During the past decade, computational simulation of fluid flow for propulsion activities has progressed significantly, and many notable successes have been reported in the literature. However, the generation of a high-quality mesh for such problems has often been reported as a pacing item. Hence, much effort has been expended to speed this portion of the simulation process. Several approaches have evolved for grid generation; two of the most common are structured multi-block and unstructured procedures. Structured grids tend to be computationally efficient and have the high-aspect-ratio cells necessary for efficiently resolving viscous layers. Structured multi-block grids may or may not exhibit grid line continuity across the block interface. This relaxation of the continuity constraint at the interface is intended to ease the grid generation process, which is still time consuming. Flow solvers supporting non-contiguous interfaces require specialized interpolation procedures which may not ensure conservation at the interface. Unstructured or generalized indexing data structures offer greater flexibility, but require explicit connectivity information and are not easy to generate for three-dimensional configurations. In addition, unstructured-mesh-based schemes tend to be less efficient, and it is difficult to resolve viscous layers. Recently, hybrid or generalized-element solution and grid generation techniques have been developed with the objective of combining the attractive features of both structured and unstructured techniques. In the present work, recently developed procedures for hybrid grid generation and flow simulation are critically evaluated and compared to existing structured and unstructured procedures in terms of accuracy and computational requirements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humme, J.T.; Tanaka, M.T.; Yokota, M.H.
1979-07-01
The purpose of this study was to determine the feasibility of geothermal resource utilization at the Puna Sugar Company cane sugar processing plant, located in Keaau, Hawaii. A proposed well site area was selected based on data from surface exploratory surveys. The liquid-dominated well flow enters a binary thermal arrangement, which results in an acceptable quality of steam for process use. Hydrogen sulfide in the well gases is incinerated, leaving sulfur dioxide in the waste gases. The sulfur dioxide in turn is recovered and used in the cane juice processing at the sugar factory. The clean geothermal steam from the binary system can be used directly for process requirements. It replaces steam generated by the firing of the waste fibrous product from cane sugar processing. The waste product, called bagasse, has a number of alternative uses, but an evaluation clearly indicated it should continue to be employed for steam generation. This steam, no longer required for process demands, can be directed to increased electric power generation. Revenues gained by the sale of this power to the utility, in addition to other savings developed through the utilization of geothermal energy, can offset the costs associated with hydrothermal utilization.
The standard-based open workflow system in GeoBrain (Invited)
NASA Astrophysics Data System (ADS)
Di, L.; Yu, G.; Zhao, P.; Deng, M.
2013-12-01
GeoBrain is an Earth science Web-service system developed and operated by the Center for Spatial Information Science and Systems, George Mason University. In GeoBrain, a standard-based open workflow system has been implemented to accommodate the automated processing of geospatial data through a set of complex geo-processing functions for advanced product generation. GeoBrain models complex geoprocessing at two levels, the conceptual and the concrete. At the conceptual level, workflows exist in the form of data and service types defined by ontologies. Workflows at the conceptual level are called geo-processing models and are cataloged in GeoBrain as virtual product types. A conceptual workflow is instantiated into a concrete, executable workflow when a user requests a product that matches a virtual product type. Both conceptual and concrete workflows are encoded in the Business Process Execution Language (BPEL). A BPEL workflow engine, called BPELPower, has been implemented to execute the workflows for product generation. A provenance capturing service has been implemented to generate complete, ISO 19115-compliant product provenance metadata before and after the workflow execution. Generating provenance metadata before the workflow execution allows users to examine the usability of the final product before the lengthy and expensive execution takes place. The three modes of workflow execution defined in ISO 19119, transparent, translucent, and opaque, are all available in GeoBrain. A geoprocessing modeling portal has been developed to allow domain experts to develop geoprocessing models at the type level with the support of both data and service/processing ontologies. The geoprocessing models capture the knowledge of the domain experts and become the operational offerings for the corresponding products after a proper peer review of the models. Automated workflow composition based on ontologies and artificial-intelligence techniques has also been demonstrated successfully. The GeoBrain workflow system has been used in multiple Earth science applications, including the monitoring of global agricultural drought, the assessment of flood damage, the derivation of national crop condition and progress information, and the detection of nuclear proliferation facilities and events.
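As a rough paraphrase of the two-level workflow modeling described above, the following Python sketch shows a conceptual model being instantiated against a catalog (all class, field, and URN names are hypothetical; GeoBrain itself encodes both workflow levels in BPEL):

    # Hypothetical sketch of conceptual-to-concrete workflow instantiation.
    from dataclasses import dataclass

    @dataclass
    class GeoProcessingModel:          # conceptual workflow = virtual product type
        product_type: str
        required_data_types: list      # defined by data ontologies
        required_service_types: list   # defined by service/processing ontologies

    def instantiate(model, catalog):
        """Bind abstract data/service types to concrete resources, yielding an
        executable workflow (in GeoBrain, a concrete BPEL document for BPELPower)."""
        return {"inputs": [catalog[t] for t in model.required_data_types],
                "steps": [catalog[t] for t in model.required_service_types]}

    drought = GeoProcessingModel("agricultural_drought_index",
                                 ["NDVI_timeseries"], ["anomaly_detection"])
    catalog = {"NDVI_timeseries": "urn:example:dataset:ndvi",
               "anomaly_detection": "urn:example:service:wps-anomaly"}
    print(instantiate(drought, catalog))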
Information-based models for finance and insurance
NASA Astrophysics Data System (ADS)
Hoyle, Edward
2010-10-01
In financial markets, the information that traders have about an asset is reflected in its price. The arrival of new information then leads to price changes. The `information-based framework' of Brody, Hughston and Macrina (BHM) isolates the emergence of information, and examines its role as a driver of price dynamics. This approach has led to the development of new models that capture a broad range of price behaviour. This thesis extends the work of BHM by introducing a wider class of processes for the generation of the market filtration. In the BHM framework, each asset is associated with a collection of random cash flows. The asset price is the sum of the discounted expectations of the cash flows. Expectations are taken with respect to (i) an appropriate measure, and (ii) the filtration generated by a set of so-called information processes that carry noisy or imperfect market information about the cash flows. To model the flow of information, we introduce a class of processes termed Lévy random bridges (LRBs), generalising the Brownian and gamma information processes of BHM. Conditioned on its terminal value, an LRB is identical in law to a Lévy bridge. We consider in detail the case where the asset generates a single cash flow X_T at a fixed date T. The flow of information about X_T is modelled by an LRB with random terminal value X_T. An explicit expression for the price process is found by working out the discounted conditional expectation of X_T with respect to the natural filtration of the LRB. New models are constructed using information processes related to the Poisson process, the Cauchy process, the stable-1/2 subordinator, the variance-gamma process, and the normal inverse-Gaussian process. These are applied to the valuation of credit-risky bonds, vanilla and exotic options, and non-life insurance liabilities.
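A worked instance of the pricing recipe described above, in the simplest Brownian case of the BHM framework that the thesis generalizes to Lévy random bridges: a single cash flow X_T paid at T is observed through the information process

    \xi_t = \sigma t X_T + \beta_{tT},

where \beta_{tT} is a Brownian bridge vanishing at times 0 and T, and \sigma sets the rate at which true information about X_T emerges from the bridge noise. The price is then the discounted conditional expectation

    S_t = P_{tT}\, \mathbb{E}\big[ X_T \,\big|\, \mathcal{F}_t^{\xi} \big],

with P_{tT} the discount factor and \mathcal{F}_t^{\xi} the filtration generated by \xi.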
ERIC Educational Resources Information Center
Bauer, Patricia J.; Larkina, Marina
2017-01-01
In accumulating knowledge, direct modes of learning are complemented by productive processes, including self-generation based on integration of separate episodes. Effects of the number of potentially relevant episodes on integration were examined in 4- to 8-year-olds (N = 121; racially/ethnically heterogeneous sample, English speakers, from large…
Reliable Early Classification on Multivariate Time Series with Numerical and Categorical Attributes
2015-05-22
… design a procedure of feature extraction in REACT named MEG (Mining Equivalence classes with shapelet Generators) based on the concept of … Equivalence Classes Mining [12, 15]. MEG can efficiently and effectively generate the discriminative features. In addition, several strategies are proposed … technique of parallel computing [4] to propose a process of parallel MEG for substantially reducing the computational overhead of discovering shapelet …
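Although the abstract above survives only in fragments, the shapelet features it refers to are conventionally defined as the minimum distance between a candidate subsequence (the shapelet) and a time series; a minimal sketch of that standard definition (not the REACT/MEG implementation):

    import numpy as np

    def shapelet_distance(series, shapelet):
        """Minimum Euclidean distance between the shapelet and any
        equal-length subsequence of the series (the shapelet feature value)."""
        m = len(shapelet)
        return min(np.linalg.norm(series[i:i + m] - shapelet)
                   for i in range(len(series) - m + 1))

    ts = np.array([0.0, 0.1, 0.9, 1.0, 0.8, 0.1, 0.0])
    print(shapelet_distance(ts, np.array([0.9, 1.0, 0.9])))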
A fast process development flow by applying design technology co-optimization
NASA Astrophysics Data System (ADS)
Chen, Yi-Chieh; Yeh, Shin-Shing; Ou, Tsong-Hua; Lin, Hung-Yu; Mai, Yung-Ching; Lin, Lawrence; Lai, Jun-Cheng; Lai, Ya Chieh; Xu, Wei; Hurat, Philippe
2017-03-01
Beyond the 40 nm technology node, pattern weak points and hotspot types increase dramatically. The typical patterns used for lithography verification incur a huge turn-around time (TAT) in handling the design complexity. Therefore, in order to speed up process development and increase pattern variety, accurate design guidelines and realistic design combinations are required. This paper presents a flow for creating a cell-based layout, a lite realistic design, to identify early the problematic patterns that will negatively affect yield. A new random layout generation method, the Design Technology Co-Optimization Pattern Generator (DTCO-PG), is reported in this paper to create cell-based designs. DTCO-PG also characterizes randomness and fuzziness, so that it can build up a machine learning scheme whose model can be trained on previous results and then generate patterns never seen in a lite design. This methodology not only increases pattern diversity but also uncovers potential hotspots early. This paper also demonstrates an integrated flow from DTCO pattern generation to layout modification. Optical proximity correction (OPC) and lithographic simulation are then applied to the DTCO-PG design database to detect hotspots, and hotspots or weak points can then be fixed automatically through the procedure or handled manually. This flow gives process evolution a faster development cycle time, more complex pattern designs, a higher probability of finding potential hotspots at an early stage, and a more holistic yield-ramping operation.
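A toy illustration of random cell-based layout generation in the spirit of DTCO-PG (cell names and probabilities are invented; the actual generator characterizes randomness and fuzziness and couples to a trained model):

    import random

    CELL_LIBRARY = ["INV", "NAND2", "NOR2", "DFF"]   # hypothetical cell set

    def generate_row(n_sites, fill_prob=0.2, seed=None):
        """Place random library cells along one placement row; the random mix
        of cells and fill sites stands in for the generator's 'fuzziness'."""
        rng = random.Random(seed)
        return ["FILL" if rng.random() < fill_prob else rng.choice(CELL_LIBRARY)
                for _ in range(n_sites)]

    layout = [generate_row(16, seed=s) for s in range(4)]   # 4 rows x 16 sites
    for row in layout:
        print(" ".join(f"{c:5s}" for c in row))

Layouts generated this way would then be pushed through OPC and lithographic simulation to flag hotspots, as the abstract describes.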
NASA Astrophysics Data System (ADS)
Laban, Shaban; El-Desouky, Aly
2013-04-01
The monitoring of real-time systems is a challenging and complicated process, so there is a continuous need to improve it through new intelligent techniques and algorithms for detecting exceptions and anomalous behaviours and for generating the necessary alerts during workflow monitoring of such systems. Interval-based (or period-based) theories have been discussed, analysed, and used by many researchers in artificial intelligence (AI), philosophy, and linguistics. As explained by Allen, there are 13 relations between any two intervals, and there have been many studies of interval-based temporal reasoning and logics over the past decades. Interval-based theories can be used for monitoring real-time interval-based data processing. However, increasing the number of processed intervals makes the implementation of such theories a complex and time-consuming process, as the number of relationships between the intervals grows rapidly. To overcome this problem, this paper presents a Rule-based Interval State Machine Algorithm (RISMA) for processing, monitoring, and analysing the behaviour of interval-based data received from real-time sensors. The proposed intelligent algorithm uses the Interval State Machine (ISM) approach to model any number of interval-based data into well-defined states, as well as to infer over them. An interval-based state transition model and methodology are presented to identify the relationships between the different states of the proposed algorithm. By using such a model, the unlimited number of relationships between similarly large numbers of intervals can be reduced to only 18 direct relationships using the proposed well-defined states. For testing the proposed algorithm, the necessary inference rules and code have been designed and applied to continuous data received in near real-time from the stations of the International Monitoring System (IMS) by the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). The CLIPS expert system shell has been used as the main rule engine for implementing the algorithm rules. The Python programming language and the "PyCLIPS" module are used to build the necessary code for the algorithm implementation. More than 1.7 million intervals constituting the Concise List of Frames (CLF) from 20 different seismic stations have been used to evaluate the proposed algorithm and to assess station behaviour and performance. Initial results showed that the proposed algorithm can help in better understanding the operation and performance of those stations. Important information, such as alerts and some station performance parameters, can be derived from the proposed algorithm. For IMS interval-based data, at any period of time it is possible to analyse station behaviour, determine missing data, generate the necessary alerts, and measure some station performance attributes. The details of the proposed algorithm, methodology, implementation, experimental results, advantages, and limitations of this research are presented. Finally, future directions and recommendations are discussed.
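For reference, the 13 Allen relations mentioned above can be computed directly from interval endpoints; a minimal Python sketch of the standard algebra (not the RISMA implementation, whose rules run in CLIPS via PyCLIPS):

    def allen_relation(a, b):
        """Classify two intervals a = (a1, a2) and b = (b1, b2), with a1 < a2
        and b1 < b2, into one of Allen's 13 relations."""
        a1, a2 = a
        b1, b2 = b
        if a2 < b1:  return "before"
        if a2 == b1: return "meets"
        if a1 == b1 and a2 == b2: return "equals"
        if a1 == b1: return "starts" if a2 < b2 else "started-by"
        if a2 == b2: return "finishes" if a1 > b1 else "finished-by"
        if b1 < a1 and a2 < b2: return "during"
        if a1 < b1 and b2 < a2: return "contains"
        if a1 < b1 < a2 < b2:   return "overlaps"
        if b1 < a1 < b2 < a2:   return "overlapped-by"
        return "after" if a1 > b2 else "met-by"

    print(allen_relation((0, 5), (5, 9)))   # meets
    print(allen_relation((2, 4), (0, 9)))   # during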
A regenerative process for carbon dioxide removal and hydrogen production in IGCC
NASA Astrophysics Data System (ADS)
Hassanzadeh Khayyat, Armin
Advanced power generation technologies, such as Integrated Gasification Combined Cycle (IGCC) processes, are among the leading contenders for power generation conversion because of their significantly higher efficiencies and potential environmental advantages compared to conventional coal combustion processes. Although the increase in efficiency of IGCC processes will reduce the emissions of carbon dioxide per unit of power generated, further reduction in CO2 emissions is crucial due to the enforcement of greenhouse gas (GHG) regulations. To avoid efficiency losses in IGCC processes, it is desirable to remove CO2 in the temperature range of 300 to 500°C, which makes regenerable MgO-based sorbents ideal for such operations. In this temperature range, CO2 removal shifts the water-gas shift (WGS) reaction towards a significant reduction in carbon monoxide (CO) and an enhancement in hydrogen production. However, regenerable, reactive, and attrition-resistant sorbents are required for such an application. In this work, a highly reactive and attrition-resistant regenerable MgO-based sorbent is prepared through dolomite modification, which can simultaneously remove carbon dioxide and enhance hydrogen production in a single reactor. The results of experimental tests conducted in a High-Pressure Thermogravimetric Analyzer (HP-TGA) and high-pressure packed-bed units indicate that in the temperature range of 300 to 500°C at 20 atm, more than 95 molar percent of the CO2 can be removed from the simulated coal gas, and the hydrogen concentration can be increased to above 70 percent. However, a declining trend is observed in the capacity of the sorbent exposed to long-term durability analysis, which appears to level off after about 20 cycles. Based on the physical and chemical analysis of the sorbent, a two-zone expanding grain model was applied and gave an excellent fit to the carbonation reaction rate data at various operating conditions. The modeling results indicate that more than 90 percent purification of hydrogen is achievable, either by increasing the activity of the sorbent towards the water-gas shift reaction or by mixing the sorbent bed with a commercial water-gas shift catalyst. A preliminary economic evaluation of the MgO-based process indicates that it can be economically viable compared to the commercially available WGS/Selexol(TM) processes.
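The chemistry behind the sorption-enhanced scheme described above couples two standard reactions (stated here for clarity; the conditions are those quoted in the abstract):

    CO + H2O  <=>  CO2 + H2        (water-gas shift)
    MgO + CO2 <=>  MgCO3           (carbonation; reversed on sorbent regeneration)

Continuously removing CO2 by carbonation pulls the shift equilibrium to the right (Le Chatelier), which is why CO is depleted and hydrogen is enriched within a single reactor.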
Geometry modeling and multi-block grid generation for turbomachinery configurations
NASA Technical Reports Server (NTRS)
Shih, Ming H.; Soni, Bharat K.
1992-01-01
An interactive 3D grid generation code, Turbomachinery Interactive Grid genERation (TIGER), was developed for general turbomachinery configurations. TIGER features the automatic generation of multi-block structured grids around multiple blade rows for either internal, external, or internal-external turbomachinery flow fields. Utilization of Bézier curves achieves a smooth grid and better orthogonality. TIGER generates the algebraic grid automatically based on geometric information provided by its built-in pseudo-AI algorithm. However, due to the large variation of turbomachinery configurations, this initial grid may not always be as good as desired. TIGER therefore provides graphical user interactions during the process that allow the user to design, modify, and manipulate the grid, including the capability of elliptic surface grid generation.
Mouawad, O; Amrani, F; Kibler, B; Picot-Clémente, J; Strutynski, C; Fatome, J; Désévédavy, F; Gadret, G; Jules, J-C; Heintz, O; Lesniewska, E; Smektala, F
2014-10-06
We analyze optical and structural aging in As₂S₃ microstructured optical fibers (MOFs) that may have an impact on mid-infrared supercontinuum generation. A strong alteration of optical transparency at the fundamental OH absorption peak is measured for high-purity As₂S₃ MOFs stored in atmospheric conditions. The surface evolution and the associated deviation of the chemical composition confirm that the optical and chemical properties of MOFs degrade upon exposure to ambient conditions because of counteractive surface processes. This phenomenon substantially reduces the optical quality of the MOFs and therefore restrains the spectral expansion of the generated supercontinuum. This aging process is confirmed by the good match between previous experimental results and the reported numerical simulations based on the generalized nonlinear Schrödinger equation.
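For reference, the generalized nonlinear Schrödinger equation used in such supercontinuum simulations has the standard form (conventional notation; the paper's specific fiber parameters are not reproduced here):

    \frac{\partial A}{\partial z} = -\frac{\alpha}{2} A
        + \sum_{k \ge 2} \frac{i^{k+1}}{k!} \beta_k \frac{\partial^k A}{\partial T^k}
        + i\gamma \left( 1 + \frac{i}{\omega_0} \frac{\partial}{\partial T} \right)
          \left( A(z,T) \int_0^{\infty} R(T')\, |A(z,T-T')|^2 \, dT' \right)

where A is the pulse envelope, \alpha the loss (the term through which the OH-related absorption discussed above suppresses spectral broadening), \beta_k the dispersion coefficients, \gamma the nonlinear parameter, and R(T) the Raman-inclusive nonlinear response.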
He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z
2013-12-04
Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, constructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computations difficult. Effective use and analysis of these data for NGS-based resequencing studies remains a difficult task for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides abundant scalable functions for routine resequencing analysis in different modules to facilitate customization of the analysis pipeline. ReSeqTools is designed to use compressed data files as input or output to save storage space and facilitates faster and more computationally efficient large-scale resequencing studies in a user-friendly manner. It offers abundant practical functions and generates useful statistics during the analysis pipeline, which significantly simplifies resequencing analysis. Its integrated algorithms and abundant sub-functions provide a solid foundation for special demands in resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.
Design keys for paper-based concentration gradient generators.
Schaumburg, Federico; Urteaga, Raúl; Kler, Pablo A; Berli, Claudio L A
2018-08-03
The generation of concentration gradients is an essential operation for several analytical processes implemented on microfluidic paper-based analytical devices. The dynamic gradient formation is based on the transverse dispersion of chemical species across co-flowing streams. In paper channels, this transverse flux of molecules is dominated by mechanical dispersion, which is substantially different than molecular diffusion, which is the mechanism acting in conventional microchannels. Therefore, the design of gradient generators on paper requires strategies different from those used in traditional microfluidics. This work considers the foundations of transverse dispersion in porous substrates to investigate the optimal design of microfluidic paper-based concentration gradient generators (μPGGs) by computer simulations. A set of novel and versatile μPGGs were designed in the format of numerical prototypes, and virtual experiments were run to explore the ranges of operation and the overall performance of such devices. Then physical prototypes were fabricated and experimentally tested in our lab. Finally, some basic rules for the design of optimized μPGGs are proposed. Apart from improving the efficiency of mixers, diluters and μPGGs, the results of this investigation are relevant to attain highly controlled concentration fields on paper-based devices. Copyright © 2018 Elsevier B.V. All rights reserved.
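The distinction drawn above can be stated compactly: in a porous paper substrate, the transverse dispersion coefficient is dominated by a mechanical term that grows with flow velocity rather than by molecular diffusion alone. A standard porous-media scaling (consistent with the text, but not the paper's exact model) is

    D_T \approx D_m + \alpha_T u,

where D_m is the molecular diffusivity, u the mean pore velocity, and \alpha_T the transverse dispersivity of the substrate; because the \alpha_T u term typically dominates in paper, design rules for paper-based gradient generators differ from those of diffusion-limited microchannels.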
An Improved Effective Cost Review Process for Value Engineering
Joo, D. S.; Park, J. I.
2014-01-01
Second-look value engineering (VE) is an approach that aims to lower the costs of products for which target costs are not being met during the production stage. Participants in second-look VE typically come up with a variety of ideas for cost cutting, but the outcomes often depend on their levels of experience, and not many good alternatives are available during the production stage. Nonetheless, good ideas have been consistently generated by VE experts. This paper investigates past second-look VE cases and the thinking processes of VE experts and proposes a cost review process as a systematic means of investigating cost-cutting ideas. This cost review process includes the use of an idea checklist and a specification review process. In addition to presenting the process, this paper reports on its feasibility, based on its introduction into a VE training course as part of a pilot study. The results indicate that the cost review process is effective in generating ideas for later analysis. PMID:25580459
An improved effective cost review process for value engineering.
Joo, D S; Park, J I
2014-01-01
Second-look value engineering (VE) is an approach that aims to lower the costs of products for which target costs are not being met during the production stage. Participants in second-look VE typically come up with a variety of ideas for cost cutting, but the outcomes often depend on their levels of experience, and not many good alternatives are available during the production stage. Nonetheless, good ideas have been consistently generated by VE experts. This paper investigates past second-look VE cases and the thinking processes of VE experts and proposes a cost review process as a systematic means of investigating cost-cutting ideas. This cost review process includes the use of an idea checklist and a specification review process. In addition to presenting the process, this paper reports on its feasibility, based on its introduction into a VE training course as part of a pilot study. The results indicate that the cost review process is effective in generating ideas for later analysis.
Materials, Processes, and Environmental Engineering Network
NASA Technical Reports Server (NTRS)
White, Margo M.
1993-01-01
Attention is given to the Materials, Processes, and Environmental Engineering Network (MPEEN), which was developed as a central holding facility for materials testing information generated by the Materials and Processes Laboratory of NASA-Marshall. It contains information from other NASA centers and outside agencies, and also includes the NASA Environmental Information System (NEIS) and Failure Analysis Information System (FAIS) data. The data base is NEIS, which is accessible through MPEEN. Environmental concerns are addressed regarding materials identified by the NASA Operational Environment Team (NOET) to be hazardous to the environment. The data base also contains the usage and performance characteristics of these materials.
NASA Astrophysics Data System (ADS)
Xia, Bing
Ultrafast optical signal processing, which shares the same fundamental principles as electrical signal processing, can realize numerous important functionalities required in both academic research and industry. Due to the extremely fast processing speed, all-optical signal processing and pulse shaping have been widely used in ultrafast telecommunication networks, photonically-assisted RF/microwave waveform generation, microscopy, biophotonics, and studies of transient and nonlinear properties of atoms and molecules. In this thesis, we investigate two types of optical spectrally-periodic (SP) filters that can be fabricated on planar lightwave circuits (PLC) to perform pulse repetition rate multiplication (PRRM) and arbitrary optical waveform generation (AOWG). First, we present a direct temporal domain approach for PRRM using SP filters. We show that the repetition rate of an input pulse train can be multiplied by a factor N using an optical filter with a free spectral range that does not need to be constrained to an integer multiple of N. Furthermore, the amplitude of each individual output pulse can be manipulated separately to form an arbitrary envelope at the output by optimizing the impulse response of the filter. Next, we use lattice-form Mach-Zehnder interferometers (LF-MZI) to implement the temporal domain approach for PRRM. The simulation results show that PRRM with uniform profiles, binary-code profiles, and triangular profiles can be achieved. Three silica-based LF-MZIs are designed and fabricated, incorporating multi-mode interference (MMI) couplers and phase shifters. The experimental results show that 40 GHz pulse trains with a uniform envelope pattern, a binary code pattern "1011", and a binary code pattern "1101" are generated from a 10 GHz input pulse train. Finally, we investigate 2D ring resonator arrays (RRA) for ultrafast optical signal processing. We design 2D RRAs to generate a pair of pulse trains with different binary-code patterns simultaneously from a single pulse train at a low repetition rate. We also design 2D RRAs for AOWG using the modified direct temporal domain approach. To demonstrate the approach, we provide numerical examples to illustrate the generation of two very different waveforms (square and triangular) from the same hyperbolic secant input pulse train. This powerful technique based on SP filters can be very useful for ultrafast optical signal processing and pulse shaping.
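A toy numerical illustration of the temporal-domain PRRM idea described above (parameters invented; the actual devices are LF-MZIs and ring-resonator arrays): a filter whose impulse response consists of N weighted, delayed taps per input period turns a 10 GHz train into a 40 GHz train with a programmable envelope such as "1011".

    import numpy as np

    fs = 2.56e12                 # simulation sample rate (illustrative)
    T_in = int(fs / 10e9)        # input period (10 GHz) in samples
    N = 4                        # multiplication factor -> 40 GHz output

    pulse = np.exp(-(np.arange(-40, 41) / 8.0) ** 2)   # short Gaussian pulse
    x = np.zeros(8 * T_in)
    x[::T_in] = 1.0
    x = np.convolve(x, pulse, mode="same")             # 10 GHz pulse train

    weights = [1.0, 0.0, 1.0, 1.0]                     # target envelope "1011"
    h = np.zeros(T_in)
    for k, w in enumerate(weights):
        h[k * (T_in // N)] = w                         # taps spaced by the output period

    y = np.convolve(x, h, mode="same")                 # 40 GHz train, "1011" envelope
    print(y.max(), int((y > 0.5 * y.max()).sum()))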
NASA Tech Briefs, October 2010
NASA Technical Reports Server (NTRS)
2010-01-01
Topics covered include: Hybrid Architecture Active Wavefront Sensing and Control; Carbon-Nanotube-Based Chemical Gas Sensor; Aerogel-Positronium Technology for the Detection of Small Quantities of Organic and/or Toxic Materials; Graphene-Based Reversible Nano-Switch/Sensor Schottky Diode; Inductive Non-Contact Position Sensor; High-Temperature Surface-Acoustic-Wave Transducer; Grid-Sphere Electrodes for Contact with Ionospheric Plasma; Enabling IP Header Compression in COTS Routers via Frame Relay on a Simplex Link; Ka-Band SiGe Receiver Front-End MMIC for Transponder Applications; Robust Optimization Design Algorithm for High-Frequency TWTs; Optimal and Local Connectivity Between Neuron and Synapse Array in the Quantum Dot/Silicon Brain; Method and Circuit for In-Situ Health Monitoring of Solar Cells in Space; BGen: A UML Behavior Network Generator Tool; Platform for Post-Processing Waveform-Based NDE; Electrochemical Hydrogen Peroxide Generator; Fabrication of Single, Vertically Aligned Carbon Nanotubes in 3D Nanoscale Architectures; Process to Create High-Fidelity Lunar Dust Simulants; Lithium-Ion Electrolytes Containing Phosphorous-Based, Flame-Retardant Additives; InGaP Heterojunction Barrier Solar Cells; Straight-Pore Microfilter with Efficient Regeneration; Determining Shear Stress Distribution in a Laminate; Self-Adjusting Liquid Injectors for Combustors; Handling Qualities Prediction of an F-16XL-Based Reduced Sonic Boom Aircraft; Tele-Robotic ATHLETE Controller for Kinematics - TRACK; Three-Wheel Brush-Wheel Sampler; Heterodyne Interferometer Angle Metrology; Aligning Astronomical Telescopes via Identification of Stars; Generation of Optical Combs in a WGM Resonator from a Bichromatic Pump; Large-Format AlGaN PIN Photodiode Arrays for UV Images; Fiber-Coupled Planar Light-Wave Circuit for Seed Laser Control in High Spectral Resolution Lidar Systems; On Calculating the Zero-Gravity Surface Figure of a Mirror; Optical Modification of Casimir Forces for Improved Function of Micro- and Nano-Scale Devices; Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems; Core and Off-Core Processes in Systems Engineering; Digital Reconstruction Supporting Investigation of Mishaps; and Template Matching Approach to Signal Prediction.
Model-based quality assessment and base-calling for second-generation sequencing data.
Bravo, Héctor Corrada; Irizarry, Rafael A
2010-09-01
Second-generation sequencing (sec-gen) technology can sequence millions of short fragments of DNA in parallel, making it capable of assembling complex genomes for a small fraction of the price and time of previous technologies. In fact, a recently formed international consortium, the 1000 Genomes Project, plans to fully sequence the genomes of approximately 1200 people. The prospect of comparative analysis at the sequence level of a large number of samples across multiple populations may be achieved within the next five years. These data present unprecedented challenges in statistical analysis. For instance, analysis operates on millions of short nucleotide sequences, or reads (strings of A, C, G, or T between 30 and 100 characters long), which are the result of complex processing of noisy continuous fluorescence intensity measurements known as base-calling. The complexity of the base-calling discretization process results in reads of widely varying quality within and across sequence samples. This variation in processing quality results in infrequent but systematic errors that we have found to mislead downstream analysis of the discretized sequence read data. For instance, a central goal of the 1000 Genomes Project is to quantify across-sample variation at the single nucleotide level. At this resolution, small error rates in sequencing prove significant, especially for rare variants. Sec-gen sequencing is a relatively new technology for which potential biases and sources of obscuring variation are not yet fully understood. Therefore, modeling and quantifying the uncertainty inherent in the generation of sequence reads is of utmost importance. In this article, we present a simple model to capture uncertainty arising in the base-calling procedure of the Illumina/Solexa GA platform. Model parameters have a straightforward interpretation in terms of the chemistry of base-calling, allowing for informative and easily interpretable metrics that capture the variability in sequencing quality. Our model provides these informative estimates readily usable in quality assessment tools while significantly improving base-calling performance. © 2009, The International Biometric Society.
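As a schematic of what the base-calling discretization involves, consider this deliberately simplified toy caller (not the authors' model of the Illumina/Solexa chemistry): given four-channel fluorescence intensities per cycle, call the brightest channel and attach an uncertainty score.

    import numpy as np

    BASES = np.array(["A", "C", "G", "T"])

    def call_bases(intensities):
        """Toy base-caller: intensities is (cycles, 4). Calls argmax per cycle
        and scores uncertainty via a softmax-style posterior over channels."""
        z = intensities - intensities.max(axis=1, keepdims=True)
        post = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
        calls = BASES[intensities.argmax(axis=1)]
        quality = post.max(axis=1)      # near 1.0 = confident, near 0.25 = noise
        return calls, quality

    raw = np.array([[9.1, 0.2, 0.3, 0.1],    # clear 'A'
                    [2.0, 1.9, 1.8, 2.1]])   # ambiguous cycle -> low quality
    calls, q = call_bases(raw)
    print(list(calls), np.round(q, 2))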
Digital Signal Processing and Generation for a DC Current Transformer for Particle Accelerators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zorzetti, Silvia
2013-01-01
The thesis topic, digital signal processing and generation for a DC current transformer, focuses on the most fundamental beam diagnostic in the field of particle accelerators: the measurement of the beam intensity, or beam current. The technology of the DC current transformer (DCCT) is well known and used in many areas, including particle accelerator beam instrumentation, as a non-invasive (shunt-free) method to monitor the DC current in a conducting wire or, in our case, the current of charged particles travelling inside an evacuated metal pipe. So far, custom and commercial DCCTs are entirely based on analog technologies and signal processing, which makes them inflexible, sensitive to component aging, and difficult to maintain and calibrate.
NASA Astrophysics Data System (ADS)
Kubalska, J. L.; Preuss, R.
2013-12-01
Digital Surface Models (DSM) are increasingly used in GIS databases as a standalone product. They are also necessary for creating other products such as 3D city models, true-ortho imagery, and object-oriented classification. This article presents the results of DSM generation for the classification of vegetation in urban areas. The source data allowed producing DSMs using both an image matching method and ALS data. The creation of the DSM from digital images, obtained with a Vexcel UltraCam-D digital camera, was carried out in Match-T by INPHO. This program optimizes the configuration of the image matching process, which ensures high accuracy and minimizes gap areas. The accuracy of this process was analysed by comparing the DSM generated in Match-T with the DSM generated from ALS data. Given the intended purpose of the generated DSM, it was decided to create the model in a GRID structure with a cell size of 1 m. At this resolution a differential model between the two DSMs was also built, which allowed the relative accuracy of the compared models to be determined. The analysis indicates that DSM generation with the multi-image matching method is competitive with surface model creation from ALS data. Thus, when digital images with high overlap are available, the additional registration of ALS data seems to be unnecessary.
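The differential-model comparison described above amounts to a cell-wise subtraction of two co-registered 1 m GRIDs; a minimal numpy sketch (the tiny arrays stand in for the Match-T and ALS surfaces):

    import numpy as np

    # Hypothetical co-registered 1 m DSM grids (heights in metres).
    dsm_matching = np.array([[102.1, 102.3], [101.8, 102.0]])  # image matching
    dsm_als      = np.array([[102.0, 102.4], [101.9, 102.2]])  # ALS data

    diff = dsm_matching - dsm_als               # differential model
    rmse = float(np.sqrt(np.mean(diff ** 2)))   # relative accuracy measure
    print(diff)
    print(f"RMSE = {rmse:.2f} m")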
Context-Sensitive Spelling Correction of Consumer-Generated Content on Health Care
Chen, Rudan; Zhao, Xianyang; Xu, Wei; Cheng, Wenqing; Lin, Simon
2015-01-01
Background Consumer-generated content, such as postings on social media websites, can serve as an ideal source of information for studying health care from a consumer’s perspective. However, consumer-generated content on health care topics often contains spelling errors, which, if not corrected, will be obstacles for downstream computer-based text analysis. Objective In this study, we proposed a framework with a spelling correction system designed for consumer-generated content and a novel ontology-based evaluation system which was used to efficiently assess the correction quality. Additionally, we emphasized the importance of context sensitivity in the correction process, and demonstrated why correction methods designed for electronic medical records (EMRs) failed to perform well with consumer-generated content. Methods First, we developed our spelling correction system based on Google Spell Checker. The system processed postings acquired from MedHelp, a biomedical bulletin board system (BBS), and saved misspelled words (eg, sertaline) and corresponding corrected words (eg, sertraline) into two separate sets. Second, to reduce the number of words needing manual examination in the evaluation process, we respectively matched the words in the two sets with terms in two biomedical ontologies: RxNorm and Systematized Nomenclature of Medicine -- Clinical Terms (SNOMED CT). The ratio of words which could be matched and appropriately corrected was used to evaluate the correction system’s overall performance. Third, we categorized the misspelled words according to the types of spelling errors. Finally, we calculated the ratio of abbreviations in the postings, which remarkably differed between EMRs and consumer-generated content and could largely influence the overall performance of spelling checkers. Results An uncorrected word and the corresponding corrected word was called a spelling pair, and the two words in the spelling pair were its members. In our study, there were 271 spelling pairs detected, among which 58 (21.4%) pairs had one or two members matched in the selected ontologies. The ratio of appropriate correction in the 271 overall spelling errors was 85.2% (231/271). The ratio of that in the 58 spelling pairs was 86% (50/58), close to the overall ratio. We also found that linguistic errors took up 31.4% (85/271) of all errors detected, and only 0.98% (210/21,358) of words in the postings were abbreviations, which was much lower than the ratio in the EMRs (33.6%). Conclusions We conclude that our system can accurately correct spelling errors in consumer-generated content. Context sensitivity is indispensable in the correction process. Additionally, it can be confirmed that consumer-generated content differs from EMRs in that consumers seldom use abbreviations. Also, the evaluation method, taking advantage of biomedical ontology, can effectively estimate the accuracy of the correction system and reduce manual examination time. PMID:26232246
Context-Sensitive Spelling Correction of Consumer-Generated Content on Health Care.
Zhou, Xiaofang; Zheng, An; Yin, Jiaheng; Chen, Rudan; Zhao, Xianyang; Xu, Wei; Cheng, Wenqing; Xia, Tian; Lin, Simon
2015-07-31
Consumer-generated content, such as postings on social media websites, can serve as an ideal source of information for studying health care from a consumer's perspective. However, consumer-generated content on health care topics often contains spelling errors, which, if not corrected, will be obstacles for downstream computer-based text analysis. In this study, we proposed a framework with a spelling correction system designed for consumer-generated content and a novel ontology-based evaluation system which was used to efficiently assess the correction quality. Additionally, we emphasized the importance of context sensitivity in the correction process, and demonstrated why correction methods designed for electronic medical records (EMRs) failed to perform well with consumer-generated content. First, we developed our spelling correction system based on Google Spell Checker. The system processed postings acquired from MedHelp, a biomedical bulletin board system (BBS), and saved misspelled words (eg, sertaline) and corresponding corrected words (eg, sertraline) into two separate sets. Second, to reduce the number of words needing manual examination in the evaluation process, we respectively matched the words in the two sets with terms in two biomedical ontologies: RxNorm and Systematized Nomenclature of Medicine -- Clinical Terms (SNOMED CT). The ratio of words which could be matched and appropriately corrected was used to evaluate the correction system's overall performance. Third, we categorized the misspelled words according to the types of spelling errors. Finally, we calculated the ratio of abbreviations in the postings, which remarkably differed between EMRs and consumer-generated content and could largely influence the overall performance of spelling checkers. An uncorrected word and the corresponding corrected word was called a spelling pair, and the two words in the spelling pair were its members. In our study, there were 271 spelling pairs detected, among which 58 (21.4%) pairs had one or two members matched in the selected ontologies. The ratio of appropriate correction in the 271 overall spelling errors was 85.2% (231/271). The ratio of that in the 58 spelling pairs was 86% (50/58), close to the overall ratio. We also found that linguistic errors took up 31.4% (85/271) of all errors detected, and only 0.98% (210/21,358) of words in the postings were abbreviations, which was much lower than the ratio in the EMRs (33.6%). We conclude that our system can accurately correct spelling errors in consumer-generated content. Context sensitivity is indispensable in the correction process. Additionally, it can be confirmed that consumer-generated content differs from EMRs in that consumers seldom use abbreviations. Also, the evaluation method, taking advantage of biomedical ontology, can effectively estimate the accuracy of the correction system and reduce manual examination time.
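The ontology-based evaluation step described above reduces to matching the members of each spelling pair against ontology term sets and reporting ratios; a minimal sketch with invented word lists (the study itself used RxNorm and SNOMED CT):

    # Toy version of the ontology-based evaluation of spelling pairs.
    ontology_terms = {"sertraline", "ibuprofen", "amoxicillin"}  # stand-in set

    spelling_pairs = [("sertaline", "sertraline"),  # corrected, ontology-matched
                      ("ibuprofin", "ibuprofen"),   # corrected, ontology-matched
                      ("hte", "the")]               # corrected, outside ontologies

    matched = [p for p in spelling_pairs
               if p[0] in ontology_terms or p[1] in ontology_terms]
    ratio = len(matched) / len(spelling_pairs)
    print(f"{len(matched)}/{len(spelling_pairs)} pairs matched "
          f"({100 * ratio:.1f}%)")   # analogous to the study's 58/271 = 21.4%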
Refrigeration generation using expander-generator units
NASA Astrophysics Data System (ADS)
Klimenko, A. V.; Agababov, V. S.; Koryagin, A. V.; Baidakova, Yu. O.
2016-05-01
The problems of using an expander-generator unit (EGU) to generate refrigeration along with electricity are considered. It is shown that, given the temperature levels of the refrigeration flows produced by an EGU, one can supply refrigeration to different consumers: ventilation and air conditioning plants as well as industrial refrigerators and freezers. An analysis of the influence of process parameters on the cooling power of the EGU, which depends on the parameters of the gas expansion process in the expander and the temperatures of the cooled medium, was carried out. A schematic diagram of a refrigeration generation plant based on an EGU is presented. The features and advantages of using an EGU to generate refrigeration, compared with vapor-compression and absorption thermotransformers, are shown, namely: no fuel-derived energy is needed to operate the EGU; the heat delivered to the gas from the flow being cooled is used beneficially in equipment operating on gas; and energy is produced along with refrigeration, which makes it possible to create EGU-based trigeneration plants without additional power equipment. It is shown that the temperature levels of the refrigeration flows obtainable by using EGUs at existing technological decompression stations of transported gas allow the refrigeration supply of various consumers. The refrigeration capacity of an expander-generator unit depends not only on the parameters of the gas expansion process in the expander (flow rate, and the temperatures and pressures at the inlet and outlet) but also on the temperature needed by a consumer and the initial temperature of the refrigerant flow being cooled. It is concluded that expander-generator units can be used to create trigeneration plants both at major power plants and at small energy facilities.
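The physical basis of the cooling effect discussed above is the temperature drop of gas expanding through the turbo-expander. For an ideal-gas isentropic expansion (a textbook relation, stated for orientation; the illustrative numbers below are not from the paper):

    T_out = T_in (P_out / P_in)^{(\gamma - 1)/\gamma}

so natural gas with \gamma \approx 1.3, expanded from 1.2 MPa to 0.3 MPa starting at 283 K, would leave the expander near 283 x 0.25^{0.23} \approx 206 K before any heat exchange, which is the low-temperature resource the EGU offers to refrigeration consumers.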
Analysis of Generator Oscillation Characteristics Based on Multiple Synchronized Phasor Measurements
NASA Astrophysics Data System (ADS)
Hashiguchi, Takuhei; Yoshimoto, Masamichi; Mitani, Yasunori; Saeki, Osamu; Tsuji, Kiichiro
In recent years, there has been considerable interest in on-line measurement, such as observation of power system dynamics and evaluation of machine parameters. On-line methods are particularly attractive since the machine's service need not be interrupted, and parameter estimation is performed by processing measurements obtained during normal operation of the machine. The authors placed phasor measurement units (PMUs) connected to 100-V outlets at several universities in the 60-Hz power system and examined oscillation characteristics of the power system. Each PMU is synchronized based on the global positioning system (GPS), and the measured data are transmitted via the Internet. This paper describes an application of PMUs to generator oscillation analysis. The purpose of this paper is to present methods for processing the phase difference and for estimating the damping coefficient and natural angular frequency from the phase difference at steady state.
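A minimal sketch of the estimation step described above (illustrative only; the paper's exact method is not reproduced): fit a damped sinusoid to the measured phase-difference signal to recover the damping coefficient and the natural angular frequency.

    import numpy as np
    from scipy.optimize import curve_fit

    def damped_osc(t, amp, sigma, omega, phi, offset):
        """Phase-difference model: a decaying inter-area oscillation."""
        return amp * np.exp(-sigma * t) * np.cos(omega * t + phi) + offset

    t = np.linspace(0, 20, 2000)
    true = damped_osc(t, 0.5, 0.08, 2 * np.pi * 0.4, 0.3, 10.0)   # 0.4 Hz mode
    meas = true + 0.02 * np.random.default_rng(0).normal(size=t.size)

    p0 = [0.4, 0.1, 2 * np.pi * 0.35, 0.0, 10.0]                  # rough initial guess
    (amp, sigma, omega, phi, offset), _ = curve_fit(damped_osc, t, meas, p0=p0)
    print(f"damping coefficient ~ {sigma:.3f} 1/s, "
          f"natural angular frequency ~ {omega:.3f} rad/s")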
Design optimization of gas generator hybrid propulsion boosters
NASA Technical Reports Server (NTRS)
Weldon, Vincent; Phillips, Dwight U.; Fink, Lawrence E.
1990-01-01
A methodology used in support of a contract study for NASA/MSFC to optimize the design of a gas generator hybrid propulsion booster for uprating the National Space Transportation System (NSTS) is presented. The objective was to compare alternative configurations for this booster approach, optimizing each candidate concept on different bases, in order to develop data for a trade table on which a final decision was based. The methodology is capable of processing a large number of independent and dependent variables, adjusting the overall subsystem characteristics to arrive at a best-compromise integrated design that meets various specified optimization criteria subject to selected constraints. For each system considered, a detailed weight statement was generated, along with preliminary cost and reliability estimates.
NASA Astrophysics Data System (ADS)
Rasztovits, S.; Dorninger, P.
2013-07-01
Terrestrial Laser Scanning (TLS) is an established method for reconstructing the geometrical surface of given objects. Current systems allow for fast and efficient determination of 3D models with high accuracy and richness in detail. Alternatively, 3D reconstruction services use images to reconstruct the surface of an object. While the instrumental expenses for laser scanning systems are high, emerging free web-services as well as open-source software packages enable the generation of 3D models using digital consumer cameras. In addition, processing TLS data still requires an experienced user, while recent web-services operate completely automatically. An indisputable advantage of image-based 3D modeling is its implicit capability for model texturing. However, the achievable accuracy and resolution of the 3D models are lower than those of laser scanning data. Within this contribution, we investigate the results of automated web-services for image-based 3D model generation with respect to a TLS reference model. For this, a copper sculpture was acquired using a laser scanner and image series from different digital cameras. Two different web-services, namely Arc3D and AutoDesk 123D Catch, were used to process the image data. The geometric accuracy was compared for the entire model and for some highly structured details. The results are presented and interpreted based on difference models. Finally, an economic comparison of model generation is given, considering interactive and processing time costs.
Li, Yunxiang; Ouyang, Shuxin; Xu, Hua; Wang, Xin; Bi, Yingpu; Zhang, Yuanfang; Ye, Jinhua
2016-10-03
Efficient generation of active oxygen-related radicals plays an essential role in boosting advanced oxidation processes. To promote photocatalytic oxidation of gaseous pollutants over g-C₃N₄, a solid-gas interfacial Fenton reaction is coupled into an alkalinized g-C₃N₄-based photocatalyst to effectively convert photocatalytically generated H₂O₂ into oxygen-related radicals. This system includes light energy as the power source, the alkalinized g-C₃N₄-based photocatalyst as an in situ and robust H₂O₂ generator, and surface-decorated Fe³⁺ as a trigger of H₂O₂ conversion, which attains highly efficient and universal activity for the photodegradation of volatile organic compounds (VOCs). Taking the photooxidation of isopropanol as a model reaction, this system achieves a photoactivity 2-3 orders of magnitude higher than that of pristine g-C₃N₄, which corresponds to a high apparent quantum yield of 49% at around 420 nm. In-situ electron spin resonance (ESR) spectroscopy and sacrificial-reagent-incorporated photocatalytic characterizations indicate that the notable photoactivity promotion can be ascribed to the collaboration between photocarriers (electrons and holes) and the Fenton process to produce abundant and reactive oxygen-related radicals. The strategy of coupling a solid-gas interfacial Fenton process into semiconductor-based photocatalysis provides a facile and promising solution to the remediation of air pollution via solar energy.
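The radical-producing steps behind the coupled system described above are the classical Fenton pair (standard reactions, stated for clarity; the paper's photocatalytic H₂O₂ generation feeds them):

    Fe³⁺ + H₂O₂ → Fe²⁺ + HO₂• + H⁺
    Fe²⁺ + H₂O₂ → Fe³⁺ + OH⁻ + •OH

with the hydroxyl radical •OH as the principal oxidant attacking VOCs; photogenerated electrons can also reduce Fe³⁺ back to Fe²⁺, helping to sustain the cycle.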
An Indoor Positioning-Based Mobile Payment System Using Bluetooth Low Energy Technology
Winata, Doni
2018-01-01
The development of information technology has paved the way for faster and more convenient payment process flows and new methodology for the design and implementation of next generation payment systems. The growth of smartphone usage nowadays has fostered a new and popular mobile payment environment. Most of the current generation smartphones support Bluetooth Low Energy (BLE) technology to communicate with nearby BLE-enabled devices. It is plausible to construct an Over-the-Air BLE-based mobile payment system as one of the payment methods for people living in modern societies. In this paper, a secure indoor positioning-based mobile payment authentication protocol with BLE technology and the corresponding mobile payment system design are proposed. The proposed protocol consists of three phases: initialization phase, session key construction phase, and authentication phase. When a customer moves toward the POS counter area, the proposed mobile payment system will automatically detect the position of the customer to confirm whether the customer is ready for the checkout process. Once the system has identified the customer is standing within the payment-enabled area, the payment system will invoke authentication process between POS and the customer’s smartphone through BLE communication channel to generate a secure session key and establish an authenticated communication session to perform the payment transaction accordingly. A prototype is implemented to assess the performance of the proposed design for mobile payment system. In addition, security analysis is conducted to evaluate the security strength of the proposed protocol. PMID:29587399
An Indoor Positioning-Based Mobile Payment System Using Bluetooth Low Energy Technology.
Yohan, Alexander; Lo, Nai-Wei; Winata, Doni
2018-03-25
The development of information technology has paved the way for faster and more convenient payment process flows and new methodology for the design and implementation of next generation payment systems. The growth of smartphone usage nowadays has fostered a new and popular mobile payment environment. Most of the current generation smartphones support Bluetooth Low Energy (BLE) technology to communicate with nearby BLE-enabled devices. It is plausible to construct an Over-the-Air BLE-based mobile payment system as one of the payment methods for people living in modern societies. In this paper, a secure indoor positioning-based mobile payment authentication protocol with BLE technology and the corresponding mobile payment system design are proposed. The proposed protocol consists of three phases: initialization phase, session key construction phase, and authentication phase. When a customer moves toward the POS counter area, the proposed mobile payment system will automatically detect the position of the customer to confirm whether the customer is ready for the checkout process. Once the system has identified the customer is standing within the payment-enabled area, the payment system will invoke authentication process between POS and the customer's smartphone through BLE communication channel to generate a secure session key and establish an authenticated communication session to perform the payment transaction accordingly. A prototype is implemented to assess the performance of the proposed design for mobile payment system. In addition, security analysis is conducted to evaluate the security strength of the proposed protocol.
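As a rough illustration of what a "session key construction phase" over BLE can look like, here is a generic ECDH-plus-HKDF sketch using Python's cryptography package (this is not the authors' protocol or message format; it only shows the idea of deriving a shared session key):

    # Generic sketch of deriving a payment session key over an untrusted link.
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF
    from cryptography.hazmat.primitives import hashes

    pos_priv = X25519PrivateKey.generate()      # POS terminal key pair
    phone_priv = X25519PrivateKey.generate()    # customer smartphone key pair

    shared_pos = pos_priv.exchange(phone_priv.public_key())
    shared_phone = phone_priv.exchange(pos_priv.public_key())
    assert shared_pos == shared_phone           # same ECDH secret on both sides

    session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                       info=b"ble-payment-session").derive(shared_pos)
    print(session_key.hex())

A real deployment would additionally authenticate both public keys (e.g., via certificates or a server-assisted handshake), which is the role of the paper's authentication phase.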
Before the N400: effects of lexical-semantic violations in visual cortex.
Dikker, Suzanne; Pylkkanen, Liina
2011-07-01
There exists an increasing body of research demonstrating that language processing is aided by context-based predictions. Recent findings suggest that the brain generates estimates about the likely physical appearance of upcoming words based on syntactic predictions: words that do not physically look like the expected syntactic category show increased amplitudes in the visual M100 component, the first salient MEG response to visual stimulation. This research asks whether violations of predictions based on lexical-semantic information might similarly generate early visual effects. In a picture-noun matching task, we found early visual effects for words that did not accurately describe the preceding pictures. These results demonstrate that, just like syntactic predictions, lexical-semantic predictions can affect early visual processing around ∼100ms, suggesting that the M100 response is not exclusively tuned to recognizing visual features relevant to syntactic category analysis. Rather, the brain might generate predictions about upcoming visual input whenever it can. However, visual effects of lexical-semantic violations only occurred when a single lexical item could be predicted. We argue that this may be due to the fact that in natural language processing, there is typically no straightforward mapping between lexical-semantic fields (e.g., flowers) and visual or auditory forms (e.g., tulip, rose, magnolia). For syntactic categories, in contrast, certain form features do reliably correlate with category membership. This difference may, in part, explain why certain syntactic effects typically occur much earlier than lexical-semantic effects. Copyright © 2011 Elsevier Inc. All rights reserved.
Multi-Topic Tracking Model for dynamic social network
NASA Astrophysics Data System (ADS)
Li, Yuhua; Liu, Changzheng; Zhao, Ming; Li, Ruixuan; Xiao, Hailing; Wang, Kai; Zhang, Jun
2016-07-01
The topic tracking problem has attracted much attention in the last decades. However, existing approaches rarely consider network structures and textual topics together. In this paper, we propose a novel statistical model based on dynamic Bayesian networks, namely the Multi-Topic Tracking Model for Dynamic Social Networks (MTTD). It takes the influence phenomenon, the selection phenomenon, the document generative process, and the evolution of textual topics into account. Specifically, in our MTTD model, a Gibbs random field is defined to model the influence of the historical status of users in the network and the interdependency between them, accounting for the influence phenomenon. To address the selection phenomenon, a stochastic block model is used to model the link generation process based on users' interest in topics. Probabilistic Latent Semantic Analysis (PLSA) is used to describe the document generative process according to users' interests. Finally, dependence on the historical topic status is also considered to ensure the continuity of each topic in the topic evolution model. An Expectation Maximization (EM) algorithm is utilized to estimate the parameters of the proposed MTTD model. Empirical experiments on real datasets show that the MTTD model performs better than Popular Event Tracking (PET) and the Dynamic Topic Model (DTM) in generalization performance, topic interpretability, topic content evolution, and topic popularity evolution.
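Of the components listed above, the PLSA document-generation piece has a compact EM form; a minimal numpy sketch with toy counts (standard PLSA updates only, not the full MTTD inference over the Gibbs random field and block model):

    import numpy as np

    rng = np.random.default_rng(0)
    n_docs, n_words, n_topics = 4, 6, 2
    counts = rng.integers(0, 5, size=(n_docs, n_words)).astype(float)  # n(d, w)

    p_z_d = rng.dirichlet(np.ones(n_topics), size=n_docs)    # P(z | d)
    p_w_z = rng.dirichlet(np.ones(n_words), size=n_topics)   # P(w | z)

    for _ in range(50):
        # E-step: responsibilities P(z | d, w) for every (d, w) pair.
        joint = p_z_d[:, :, None] * p_w_z[None, :, :]        # shape (d, z, w)
        resp = joint / joint.sum(axis=1, keepdims=True)
        # M-step: re-estimate P(w | z) and P(z | d) from expected counts.
        nzw = (counts[:, None, :] * resp).sum(axis=0)        # shape (z, w)
        p_w_z = nzw / nzw.sum(axis=1, keepdims=True)
        ndz = (counts[:, None, :] * resp).sum(axis=2)        # shape (d, z)
        p_z_d = ndz / ndz.sum(axis=1, keepdims=True)

    print(np.round(p_z_d, 2))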
Applying Semantics in Dataset Summarization for Solar Data Ingest Pipelines
NASA Astrophysics Data System (ADS)
Michaelis, J.; McGuinness, D. L.; Zednik, S.; West, P.; Fox, P. A.
2012-12-01
One goal in studying phenomena of the solar corona (e.g., flares, coronal mass ejections) is to create and refine predictive models of space weather - which have broad implications for terrestrial activity (e.g., communication grid reliability). The High Altitude Observatory (HAO) [1] presently maintains an infrastructure for generating time-series visualizations of the solar corona. Through raw data gathered at the Mauna Loa Solar Observatory (MLSO) in Hawaii, HAO performs follow-up processing and quality control steps to derive visualization sets consumable by scientists. Individual visualizations will acquire several properties during their derivation, including: (i) the source instrument at MLSO used to obtain the raw data, (ii) the time the data was gathered, (iii) processing steps applied by HAO to generate the visualization, and (iv) quality metrics applied over both the raw and processed data. In parallel to MLSO's standard data gathering, time stamped observation logs are maintained by MLSO staff, which covers content of potential relevance to data gathered (such as local weather and instrument conditions). In this setting, while a significant amount of solar data is gathered, only small sections will typically be of interest to consuming parties. Additionally, direct presentation of solar data collections could overwhelm consumers (particularly those with limited background in the data structuring). This work explores how multidimensional analysis based navigation can be used to generate summary views of data collections, based on two operations: (i) grouping visualization entries based on similarity metrics (e.g., data gathered between 23:15-23:30 6-21-2012), or (ii) filtering entries (e.g., data with a quality score of UGLY, on a scale of GOOD, BAD, or UGLY). Here, semantic encodings of solar visualization collections (based on the Resource Description Framework (RDF) Datacube vocabulary [2]) are being utilized, based on the flexibility of the RDF model for supporting the following use cases: (i) Temporal alignment of time-stamped MLSO observations with raw data gathered at MLSO. (ii) Linking of multiple visualization entries to common (and structurally complex) workflow structures - designed to capture the visualization generation process. To provide real-world use cases for the described approach, a semantic summarization system is being developed for data gathered from HAO's Coronal Multi-channel Polarimeter (CoMP) and Chromospheric Helium-I Imaging Photometer (CHIP) pipelines. Web Links: [1] http://mlso.hao.ucar.edu/ [2] http://www.w3.org/TR/vocab-data-cube/
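The grouping and filtering operations described above map naturally onto queries over RDF Data Cube observations; a minimal Python sketch using rdflib (the graph contents and the quality predicate are invented for illustration and are not the HAO encoding):

    from rdflib import Graph, Literal, Namespace, RDF

    QB = Namespace("http://purl.org/linked-data/cube#")
    EX = Namespace("http://example.org/solar#")    # hypothetical namespace

    g = Graph()
    for ident, quality in [("obs1", "GOOD"), ("obs2", "UGLY"), ("obs3", "GOOD")]:
        obs = EX[ident]
        g.add((obs, RDF.type, QB.Observation))
        g.add((obs, EX.qualityFlag, Literal(quality)))  # hypothetical predicate

    # Filter operation: keep only observations with quality flag UGLY.
    q = """SELECT ?obs WHERE {
             ?obs a qb:Observation ; ex:qualityFlag "UGLY" . }"""
    for row in g.query(q, initNs={"qb": QB, "ex": EX}):
        print(row.obs)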
Expert consensus on best evaluative practices in community-based rehabilitation.
Grandisson, Marie; Thibeault, Rachel; Hébert, Michèle; Cameron, Debra
2016-01-01
The objective of this study was to generate expert consensus on best evaluative practices for community-based rehabilitation (CBR). This consensus includes key features of the evaluation process and methods, and discussion of whether a shared framework should be used to report findings and, if so, which framework should play this role. A Delphi study with two predefined rounds was conducted. Experts in CBR from a wide range of geographical areas and disciplinary backgrounds were recruited to complete the questionnaires. Both quantitative and qualitative analyses were performed to generate the recommendations for best practices in CBR evaluation. A panel of 42 experts reached consensus on 13 recommendations for best evaluative practices in CBR. In regard to the critical qualities of sound CBR evaluation processes, panellists emphasized that these processes should be inclusive, participatory, empowering and respectful of local cultures and languages. The group agreed that evaluators should consider the use of mixed methods and participatory tools, and should combine indicators from a universal list of CBR indicators with locally generated ones. The group also agreed that a common framework should guide CBR evaluations, and that this framework should be a flexible combination between the CBR Matrix and the CBR Principles. An expert panel reached consensus on key features of best evaluative practices in CBR. Knowledge transfer initiatives are now required to develop guidelines, tools and training opportunities to facilitate CBR program evaluations. CBR evaluation processes should strive to be inclusive, participatory, empowering and respectful of local cultures and languages. CBR evaluators should strongly consider using mixed methods, participatory tools, a combination of indicators generated with the local community and with others from a bank of CBR indicators. CBR evaluations should be situated within a shared, but flexible, framework. This shared framework could combine the CBR Matrix and the CBR Principles.
Plasma processes for producing silanes and derivatives thereof
Laine, Richard M; Massey, Dean Richard; Peterson, Peter Young
2014-03-25
The invention is generally related to a process for generating one or more molecules having the formula Si_xH_y, Si_xD_y, or Si_xH_yD_z, and mixtures thereof, where x, y, and z are integers ≥ 1, H is hydrogen, and D is deuterium (e.g., silane), comprising the steps of: providing a silicon-containing material, wherein the silicon-containing material includes at least 20 weight percent silicon atoms based on the total weight of the silicon-containing material; generating a plasma capable of vaporizing a silicon atom, sputtering a silicon atom, or both, using a plasma generating device; and contacting the plasma with the silicon-containing material in a chamber having an atmosphere that includes at least about 0.5 mole percent hydrogen atoms and/or deuterium atoms based on the total moles of atoms in the atmosphere, so that a molecule having the formula Si_xH_y (e.g., silane) is generated. The process preferably includes a step of removing one or more impurities from the Si_xH_y (e.g., the silane) to form a clean Si_xH_y, Si_xD_y, or Si_xH_yD_z (e.g., silane). The process may also include a step of reacting the Si_xH_y, Si_xD_y, or Si_xH_yD_z (e.g., the silane) to produce a high-purity silicon-containing material such as electronic-grade metallic silicon, photovoltaic-grade metallic silicon, or both.
Methodologies for Development of Patient Specific Bone Models from Human Body CT Scans
NASA Astrophysics Data System (ADS)
Chougule, Vikas Narayan; Mulay, Arati Vinayak; Ahuja, Bharatkumar Bhagatraj
2016-06-01
This work deals with the development of an algorithm for physical replication of patient-specific human bone and the construction of corresponding implant/insert RP models, using a reverse engineering approach applied to non-invasive medical images for surgical purposes. In the medical field, volumetric data, i.e. voxel and triangular-facet based models, are primarily used for bio-modelling and visualization, and require large amounts of memory. Recent advances in Computer Aided Design (CAD) technology, on the other hand, provide additional facilities for the design, prototyping and manufacturing of objects with freeform surfaces based on boundary representation techniques. This work presents a process for the physical replication of 3D rapid prototyping (RP) models of human bone using several CAD modelling techniques, all starting from 3D point cloud data obtained from non-invasive CT/MRI scans in DICOM 3.0 format. The point cloud data are used to construct a 3D CAD model by fitting B-spline curves through the points and then fitting surfaces between these curve networks using swept-blend techniques. Alternatively, a triangular mesh can be generated directly from the 3D point cloud data, without developing any surface model in commercial CAD software, using a Delaunay tetrahedralization approach; the STL file generated from the point cloud serves as the basic input for the RP process. CT scan data of a metacarpus (human bone) is used as the case study for generation of the 3D RP model. A 3D physical model of the bone was produced on a rapid prototyping machine, and its virtual reality model is presented for visualization. The CAD models generated by the different techniques are compared for accuracy and reliability, and the results are assessed for clinical reliability in the replication of human bone.
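As a concrete illustration of the direct point-cloud-to-STL route described above, the following minimal Python sketch tetrahedralizes a segmented point cloud with SciPy's Delaunay routine, extracts the boundary triangles, and writes an ASCII STL. The file names are hypothetical and the code is a simplified stand-in for the pipeline described, not the authors' implementation.

    # Minimal sketch: point cloud -> Delaunay tetrahedralization -> ASCII STL.
    from collections import Counter
    import numpy as np
    from scipy.spatial import Delaunay

    points = np.loadtxt("bone_point_cloud.xyz")   # N x 3 segmented CT points

    tetra = Delaunay(points)                      # 3-D Delaunay tetrahedralization

    # Boundary triangles are faces that belong to exactly one tetrahedron.
    face_count = Counter()
    for simplex in tetra.simplices:
        for idx in ((0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)):
            face_count[tuple(sorted(simplex[list(idx)]))] += 1
    boundary = [f for f, n in face_count.items() if n == 1]

    # Write an ASCII STL; normals are left at zero since most RP slicers recompute them.
    with open("bone_model.stl", "w") as stl:
        stl.write("solid bone\n")
        for i, j, k in boundary:
            stl.write("  facet normal 0 0 0\n    outer loop\n")
            for v in (points[i], points[j], points[k]):
                stl.write("      vertex {:.6f} {:.6f} {:.6f}\n".format(*v))
            stl.write("    endloop\n  endfacet\n")
        stl.write("endsolid bone\n")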
Form-To-Expectation Matching Effects on First-Pass Eye Movement Measures During Reading
Farmer, Thomas A.; Yan, Shaorong; Bicknell, Klinton; Tanenhaus, Michael K.
2015-01-01
Recent EEG/MEG studies suggest that when contextual information is highly predictive of some property of a linguistic signal, expectations generated from context can be translated into surprisingly low-level estimates of the physical form-based properties likely to occur in subsequent portions of the unfolding signal. Whether form-based expectations are generated and assessed during natural reading, however, remains unclear. We monitored eye movements while participants read phonologically typical and atypical nouns in noun-predictive contexts (Experiment 1), demonstrating that when a noun is strongly expected, fixation durations on first-pass eye movement measures, including first fixation duration, gaze duration, and go-past times, are shorter for nouns with category typical form-based features. In Experiments 2 and 3, typical and atypical nouns were placed in sentential contexts normed to create expectations of variable strength for a noun. Context and typicality interacted significantly at gaze duration. These results suggest that during reading, form-based expectations that are translated from higher-level category-based expectancies can facilitate the processing of a word in context, and that their effect on lexical processing is graded based on the strength of category expectancy. PMID:25915072
Device and method to enhance availability of cluster-based processing systems
NASA Technical Reports Server (NTRS)
Lupia, David J. (Inventor); Ramos, Jeremy (Inventor); Samson, Jr., John R. (Inventor)
2010-01-01
An electronic computing device including at least one processing unit that issues a specific fault signal upon experiencing an associated fault, a control unit that generates a specific recovery signal upon receiving the fault signal from the at least one processing unit, and at least one input memory unit. The recovery signal initiates specific recovery processes in the at least one processing unit. During the recovery period, the input memory unit buffers the data signals destined for the processing unit that experienced the fault.
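The fault/recovery handshake and input buffering can be pictured with the following toy Python sketch; all class, signal, and frame names are invented for illustration and do not reflect the patented hardware design.

    # Toy sketch: control unit maps a fault signal to a recovery signal while
    # an input buffer holds data for the faulted processing unit.
    from collections import deque

    class ControlUnit:
        def recovery_signal(self, fault_signal):
            # Map each specific fault signal to its specific recovery signal.
            return fault_signal.replace("FAULT", "RECOVER")

    ctrl = ControlUnit()
    input_buffer = deque()        # stands in for the input memory unit

    fault = "FAULT:PU0"           # processing unit PU0 reports a fault
    recover = ctrl.recovery_signal(fault)
    recovering = True             # PU0 runs its recovery process

    for frame in ["frame-1", "frame-2", "frame-3"]:
        if recovering:
            input_buffer.append(frame)   # hold PU0's inputs during recovery
    recovering = False                   # recovery completes; buffer drains

    print(recover, "->", list(input_buffer))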
A real-time spectroscopic sensor for monitoring laser welding processes.
Sibillano, Teresa; Ancona, Antonio; Berardi, Vincenzo; Lugarà, Pietro Mario
2009-01-01
In this paper we report on the development of a sensor for real-time monitoring of laser welding processes based on spectroscopic techniques. The system acquires the optical spectra emitted by the laser-generated plasma plume and uses them in an on-line algorithm that both calculates the plasma electron temperature and analyses the correlations between selected spectral lines. The sensor has been patented and is currently available on the market.
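Electron-temperature estimates from plume emission spectra are commonly obtained with the two-line Boltzmann ratio method; the Python sketch below shows that standard formula. The line data are placeholders, not the calibrated line pair used by the patented sensor.

    # Two-line Boltzmann method: T_e from the intensity ratio of two emission
    # lines of the same species, T = (E2 - E1) / (k * ln[(I1 l1 g2 A2)/(I2 l2 g1 A1)]).
    import math

    K_B = 8.617333e-5  # Boltzmann constant in eV/K

    def electron_temperature(I1, I2, line1, line2):
        """Estimate T_e (K); each line is (wavelength_nm, g, A_per_s, E_upper_eV)."""
        lam1, g1, A1, E1 = line1
        lam2, g2, A2, E2 = line2
        ratio = (I1 * lam1 * g2 * A2) / (I2 * lam2 * g1 * A1)
        return (E2 - E1) / (K_B * math.log(ratio))

    # Hypothetical Fe I lines and measured intensities:
    line1 = (538.34, 9, 5.6e7, 6.86)   # wavelength, degeneracy, A, upper-level energy
    line2 = (537.15, 7, 1.1e7, 5.64)
    print("T_e ~ %.0f K" % electron_temperature(1200.0, 800.0, line1, line2))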
Amyloid-β production via cleavage of amyloid-β protein precursor is modulated by cell density.
Zhang, Can; Browne, Andrew; Divito, Jason R; Stevenson, Jesse A; Romano, Donna; Dong, Yuanlin; Xie, Zhongcong; Tanzi, Rudolph E
2010-01-01
Mounting evidence suggests that Alzheimer's disease (AD) is caused by the accumulation of the small peptide amyloid-β (Aβ), a proteolytic cleavage product of amyloid-β protein precursor (AβPP). Aβ is generated through serial cleavage of AβPP by β- and γ-secretase. Aβ40 and Aβ42 are the two main components of amyloid plaques in AD brains, with Aβ42 being more prone to aggregation. AβPP can also be processed by α-secretase, which cleaves AβPP within the Aβ sequence, thereby preventing the generation of Aβ. Little is currently known regarding the effects of cell density on AβPP processing and Aβ generation. Here we assessed the effects of cell density on AβPP processing in neuronal and non-neuronal cell lines, as well as in mouse primary cortical neurons. We found that decreased cell density significantly increases the levels of Aβ40, Aβ42 and total Aβ, as well as the Aβ42:Aβ40 ratio, indicating that cell density is a significant modulator of AβPP processing. These findings carry profound implications for both previous and forthcoming studies aiming to assess the effects of various conditions and genetic/chemical factors (e.g., novel drugs) on AβPP processing and Aβ generation in cell-based systems. Moreover, it is interesting to speculate whether cell density changes in vivo may also affect AβPP processing and Aβ levels in the AD brain.
Sequential microfluidic droplet processing for rapid DNA extraction.
Pan, Xiaoyan; Zeng, Shaojiang; Zhang, Qingquan; Lin, Bingcheng; Qin, Jianhua
2011-11-01
This work describes a novel droplet-based microfluidic device which enables sequential droplet processing for rapid DNA extraction. The microdevice consists of a droplet generation unit, two reagent addition units and three droplet splitting units. The loading, washing and elution steps required for DNA extraction were carried out by sequential microfluidic droplet processing, and the movement of the superparamagnetic beads used as extraction supports was controlled with a magnetic field. The microdevice could generate about 100 droplets per minute, and each droplet took about 1 minute to complete the whole extraction process. The extraction efficiency was measured to be 46% for λ-DNA, and the extracted DNA could be used in subsequent genetic analysis such as PCR, demonstrating the potential of the device for fast DNA extraction.
Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; He, Yongqun
2015-01-01
It is time-consuming to build an ontology with many terms and axioms, so it is desirable to automate the process of ontology development. Ontology Design Patterns (ODPs) provide a reusable solution to recurrent modeling problems in ontology engineering. Because ontology terms often follow specific ODPs, the Ontology for Biomedical Investigations (OBI) developers proposed a Quick Term Templates (QTTs) process targeted at generating new ontology classes that follow the same pattern, using term templates in a spreadsheet format. Inspired by ODPs and QTTs, the Ontorat web application was developed to automatically generate new ontology terms, term annotations, and logical axioms based on specific ODPs. The inputs of an Ontorat execution include axiom expression settings, an input data file, ID generation settings, and (optionally) a target ontology. The axiom expression settings can be saved as a predesigned Ontorat settings text file for reuse, and the input data file is generated from a template file created for a specific ODP (text or Excel format). Ontorat is an efficient tool for ontology expansion, and several use cases illustrate this. For example, Ontorat was applied to automatically generate over 1,000 Japan RIKEN cell line cell terms, with both logical axioms and rich annotation axioms, in the Cell Line Ontology (CLO). Approximately 800 licensed animal vaccines were represented and annotated in the Vaccine Ontology (VO) by Ontorat. The OBI team used Ontorat to add assay and device terms required by the ENCODE project, and Ontorat was also used to add missing annotations to all existing Biobank-specific terms in the Biobank Ontology. A collection of ODPs and templates with examples is provided on the Ontorat website and can be reused to facilitate ontology development. With ever-increasing ontology development and applications, Ontorat provides a timely platform for generating and annotating large numbers of ontology terms by following design patterns. http://ontorat.hegroup.org/.
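To make the template idea concrete, here is a minimal Python sketch of spreadsheet-row-to-axiom expansion in the spirit of QTT/Ontorat. The Manchester-style template, the "EX:" prefix, the derives_from property, and the row data are all invented for illustration and do not reproduce Ontorat's actual settings format.

    # Each spreadsheet row is expanded into a class declaration with an
    # annotation axiom (label) and a logical axiom, following one pattern.
    import csv
    import io

    TEMPLATE = """Class: EX:{id}
        Annotations: rdfs:label "{label}"
        SubClassOf: EX:CellLineCell and (EX:derives_from some EX:{source})"""

    rows = csv.DictReader(io.StringIO(
        "id,label,source\n"
        "0000001,RIKEN cell line cell A,organism_A\n"
        "0000002,RIKEN cell line cell B,organism_B\n"))

    for row in rows:
        print(TEMPLATE.format(**row))
        print()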
Trombert-Paviot, B; Rodrigues, J M; Rogers, J E; Baud, R; van der Haring, E; Rassinoux, A M; Abrial, V; Clavel, L; Idir, H
2000-09-01
The Generalised Architecture for Languages, Encyclopaedias and Nomenclatures in medicine (GALEN) project has developed a new generation of terminology tools based on a language-independent model that describes the semantics of the domain and allows computer processing and multiple reuses, as well as natural language understanding applications, to facilitate the sharing and maintenance of consistent medical knowledge. During the European Union 4th Framework Programme project GALEN-IN-USE, and later within two contracts with the national health authorities, we applied the modelling and the tools to the development of a new multipurpose coding system for surgical procedures, named CCAM, in a minority-language country, France. On one hand, we contributed to a language-independent knowledge repository and multilingual semantic dictionaries for multicultural Europe. On the other hand, we supported the traditional, very labour-intensive process of creating a new coding system in medicine with artificial intelligence tools using a medically oriented recursive ontology and natural language processing. We used an integrated software suite named CLAW (classification workbench) to process French professional medical language rubrics, produced by the domain experts of the national colleges of surgeons, into intermediate dissections and then into the GRAIL reference ontology model representation. From this language-independent concept model representation we generate, with the LNAT natural language generator, controlled French natural language to support the finalization of the linguistic labels (first generation) in relation to the meanings of the conceptual system structure. The CLAW classification manager also proves very powerful for retrieving the initial domain expert rubrics with different categories of concepts (second generation) within a semantically structured representation (third generation), a bridge to the detailed terminology of the electronic patient record.
Wherry, E John; Golovina, Tatiana N; Morrison, Susan E; Sinnathamby, Gomathinayagam; McElhaugh, Michael J; Shockey, David C; Eisenlohr, Laurence C
2006-02-15
The proteasome is primarily responsible for the generation of MHC class I-restricted CTL epitopes. However, some epitopes, such as NP(147-155) of the influenza nucleoprotein (NP), are presented efficiently in the presence of proteasome inhibitors. The pathways used to generate such apparently "proteasome-independent" epitopes remain poorly defined. We have examined the generation of NP(147-155) and a second, proteasome-dependent NP epitope, NP(50-57), using cells adapted to growth in the presence of proteasome inhibitors and also through protease overexpression. We observed that: 1) antigen processing and presentation proceeds in proteasome-inhibitor-adapted cells but may become more dependent, at least in part, on nonproteasomal protease(s); 2) tripeptidyl peptidase II does not substitute for the proteasome in the generation of NP(147-155); 3) overexpression of leucine aminopeptidase, thimet oligopeptidase, puromycin-sensitive aminopeptidase, and bleomycin hydrolase has little impact on the processing and presentation of NP(50-57) or NP(147-155); and 4) proteasome-inhibitor treatment altered the specificity of substrate cleavage by the proteasome in cell-free digests, favoring NP(147-155) epitope preservation. Based on these results, we propose a central role for the proteasome in epitope generation even in the presence of proteasome inhibitors, although such inhibitors will likely alter cleavage patterns and may increase the dependence of the processing pathway on postproteasomal enzymes.
A Learning Framework for Winner-Take-All Networks with Stochastic Synapses.
Mostafa, Hesham; Cauwenberghs, Gert
2018-06-01
Many recent generative models make use of neural networks to transform the probability distribution of a simple low-dimensional noise process into the complex distribution of the data. This raises the question of whether biological networks operate along similar principles to implement a probabilistic model of the environment through transformations of intrinsic noise processes. The intrinsic neural and synaptic noise processes in biological networks, however, are quite different from the noise processes used in current abstract generative networks. This, together with the discrete nature of spikes and local circuit interactions among the neurons, raises several difficulties when using recent generative modeling frameworks to train biologically motivated models. In this letter, we show that a biologically motivated model based on multilayer winner-take-all circuits and stochastic synapses admits an approximate analytical description. This allows us to use the proposed networks in a variational learning setting where stochastic backpropagation is used to optimize a lower bound on the data log likelihood, thereby learning a generative model of the data. We illustrate the generality of the proposed networks and learning technique by using them in a structured output prediction task and a semisupervised learning task. Our results extend the domain of application of modern stochastic network architectures to networks where synaptic transmission failure is the principal noise mechanism.
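As a toy illustration of the principal noise mechanism discussed above, the following Python sketch implements a single winner-take-all circuit whose synapses fail independently at random, so repeated presentations of the same input yield a distribution over winners. The sizes, failure probability, and weights are arbitrary; this is a sketch of the mechanism, not the authors' learning framework.

    # Winner-take-all circuit with stochastic (Bernoulli-failing) synapses.
    import numpy as np

    rng = np.random.default_rng(0)

    def wta(x, W, p_fail=0.3):
        """One WTA circuit: each synapse fails i.i.d., the max-driven unit spikes."""
        mask = rng.random(W.shape) > p_fail   # surviving synapses this trial
        drive = (W * mask) @ x                # noisy feedforward drive
        out = np.zeros(W.shape[0])
        out[np.argmax(drive)] = 1.0           # one-hot spike of the winner
        return out

    W = rng.normal(size=(5, 10))              # 5 competing units, 10 inputs
    x = rng.random(10)

    # Synaptic failures make the winner a random variable; estimate its law
    # by repeated presentation of the same input:
    counts = sum(wta(x, W) for _ in range(1000))
    print(counts / 1000)                      # empirical winner distribution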
Trombert-Paviot, B; Rodrigues, J M; Rogers, J E; Baud, R; van der Haring, E; Rassinoux, A M; Abrial, V; Clavel, L; Idir, H
1999-01-01
GALEN has developed a new generation of terminology tools based on a language-independent concept reference model that uses a compositional formalism allowing computer processing and multiple reuses. During the 4th Framework Programme project GALEN-IN-USE we applied the modelling and the tools to the development of a new multipurpose coding system for surgical procedures (CCAM) in France. On one hand, we contributed to a language-independent knowledge repository for multicultural Europe. On the other hand, we supported the traditional, very labour-intensive process of creating a new coding system in medicine with artificial intelligence tools using a medically oriented recursive ontology and natural language processing. We used an integrated software suite named CLAW to process French professional medical language rubrics, produced by the national colleges of surgeons, into intermediate dissections and then into the GRAIL reference ontology model representation. From this language-independent concept model representation we generate, on one hand, controlled French natural language to support the finalization of the linguistic labels in relation to the meanings of the conceptual system structure. On the other hand, the third-generation classification manager proves very powerful for retrieving the initial professional rubrics with different categories of concepts within a semantic network.
Ashley, Laura; Armitage, Gerry; Taylor, Julie
2017-03-01
Failure Modes and Effects Analysis (FMEA) is a prospective quality assurance methodology, increasingly used in healthcare, which identifies potential vulnerabilities in complex, high-risk processes and generates remedial actions. We aimed, for the first time, to apply FMEA in a social care context, to evaluate the process for recognising and referring children exposed to domestic abuse within one Midlands city safeguarding area in England. A multidisciplinary, multi-agency team of 10 front-line professionals undertook the FMEA, using a modified methodology, over seven group meetings. The FMEA included mapping out the process under evaluation to identify its component steps, identifying failure modes (potential errors) and possible causes for each step, and generating corrective actions. In this article, we report the output from the FMEA, including illustrative examples of the failure modes and corrective actions generated. We also present an analysis of feedback from the FMEA team and provide recommendations for the future use of FMEA in appraising social care processes and practice. Although challenging, the FMEA was unequivocally valuable for team members and generated a significant number of corrective actions for the local safeguarding board to consider in its response to children exposed to domestic abuse.
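The bookkeeping behind an FMEA (steps mapped to failure modes, causes, and corrective actions) is simple to sketch in code. The Python example below uses the conventional severity x occurrence x detection risk-priority product found in many FMEA variants; the modified methodology used in this study may score risk differently, and the step, scores, and actions shown are invented for illustration.

    # Generic FMEA record-keeping sketch with conventional RPN scoring.
    from dataclasses import dataclass, field

    @dataclass
    class FailureMode:
        description: str
        causes: list
        severity: int     # 1 (minor) .. 10 (catastrophic)
        occurrence: int   # 1 (rare)  .. 10 (frequent)
        detection: int    # 1 (easily caught) .. 10 (undetectable)
        actions: list = field(default_factory=list)

        @property
        def rpn(self):    # conventional risk priority number
            return self.severity * self.occurrence * self.detection

    step = "Refer child exposed to domestic abuse to the safeguarding team"
    modes = [
        FailureMode("Referral form incomplete",
                    ["time pressure", "unclear guidance"],
                    severity=7, occurrence=6, detection=4,
                    actions=["redesign form", "brief training session"]),
    ]
    for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
        print(f"{step}: {m.description} (RPN={m.rpn}) -> {m.actions}")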
Aligning observed and modelled behaviour based on workflow decomposition
NASA Astrophysics Data System (ADS)
Wang, Lu; Du, YuYue; Liu, Wei
2017-09-01
When business processes are mostly supported by information systems, both the availability of event logs generated by these systems and the demand for appropriate process models increase. Business processes can be discovered, monitored and enhanced by extracting process-related information from such logs. However, some events cannot be correctly identified because of the explosion in the volume of event logs. Therefore, a new process mining technique based on a workflow decomposition method is proposed in this paper. Petri nets (PNs) are used to describe business processes, and conformance checking of event logs against process models is investigated. A decomposition approach is proposed to divide large process models and event logs into several separate parts that can be analysed independently, while an alignment approach based on the state equation method of PN theory enhances the performance of conformance checking. Both approaches are implemented in the process mining framework ProM. The correctness and effectiveness of the proposed methods are illustrated through experiments.
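The state equation referred to above is the standard Petri-net marking equation M' = M0 + N·σ, where N is the incidence matrix and σ the firing-count vector; it gives a cheap necessary condition for reachability that can prune the search for optimal alignments. The Python sketch below evaluates it on a toy two-transition workflow net (the net itself is invented, not taken from the paper).

    # Petri-net state equation on a toy workflow net.
    import numpy as np

    # Places p0, p1, p2; transitions t0: p0 -> p1, t1: p1 -> p2.
    # Rows of the incidence matrix N are places, columns are transitions.
    N = np.array([[-1,  0],   # p0
                  [ 1, -1],   # p1
                  [ 0,  1]])  # p2

    M0 = np.array([1, 0, 0])  # initial marking: one token in p0
    sigma = np.array([1, 1])  # firing counts: fire t0 once, then t1 once

    M = M0 + N @ sigma        # state equation M' = M0 + N·σ
    print(M)                  # [0 0 1]: the token reaches the final place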
Alberto, Marta E; Butera, Valeria; Russo, Nino
2011-08-01
The platination of DNA bases by second- and third-generation Pt(II) anticancer drugs has been investigated using density functional theory (DFT) combined with the conductor-like dielectric continuum model (CPCM), in order to describe the binding mechanisms and to obtain detailed data on the reaction energy profiles. Although there is no doubt that a Pt-N7 bond forms during the initial attack, the energy profiles for the formation of the monofunctional adducts were not previously known. Herein, a direct comparison between the rates of formation of the monofunctional adducts of second- and third-generation anticancer drugs with the guanine (G) and adenine (A) DNA bases is made in order to highlight possible common or divergent behavior. Guanine is confirmed to be preferred over adenine as the target of the platination process for all the investigated compounds and for both hydrolyzed forms considered in our investigation. The preference for the G purine base is dominated by electronic factors and promoted by a more favorable hydrogen-bond pattern, confirming the important role played by H-bonds in exerting both structural and kinetic control over the purine platination process.
Mitochondrial respiratory chain complexes as sources and targets of thiol-based redox-regulation.
Dröse, Stefan; Brandt, Ulrich; Wittig, Ilka
2014-08-01
The respiratory chain of the inner mitochondrial membrane is a unique assembly of protein complexes that transfers the electrons of reducing equivalents extracted from foodstuffs to molecular oxygen, generating a proton-motive force as the primary energy source for cellular ATP synthesis. Recent evidence indicates that redox reactions are also involved in regulating mitochondrial function via redox modification of specific cysteine thiol groups in subunits of the respiratory chain complexes. Conversely, the generation of reactive oxygen species (ROS) by respiratory chain complexes may have an impact on the mitochondrial redox balance through reversible and irreversible thiol modification of specific target proteins involved in redox signaling, but also in pathophysiological processes. Recent evidence indicates that thiol-based redox regulation of respiratory chain activity, and especially S-nitrosylation of complex I, could be a strategy to prevent elevated ROS production, oxidative damage and tissue necrosis during ischemia-reperfusion injury. This review focuses on thiol-based redox processes involving the respiratory chain as both a source and a target, and includes a general overview of mitochondria as highly compartmentalized redox organelles and of methods to investigate the redox state of mitochondrial proteins. This article is part of a Special Issue entitled: Thiol-Based Redox Processes.