NASA Technical Reports Server (NTRS)
White, D. R.
1976-01-01
A high-vacuum complex composed of an atmospheric decontamination system, sample-processing chambers, storage chambers, and a transfer system was built to process and examine lunar material while maintaining quarantine status. Problems identified, equipment modifications, and procedure changes made for Apollo 11 and 12 sample processing are presented. The sample processing experiences indicate that only a few operating personnel are required to process the sample efficiently, safely, and rapidly in the high-vacuum complex. The high-vacuum complex was designed to handle the many contingencies, both quarantine and scientific, associated with handling an unknown entity such as the lunar sample. Lunar sample handling necessitated a complex system that could not respond rapidly to changing scientific requirements as the characteristics of the lunar sample were better defined. Although the complex successfully handled the processing of Apollo 11 and 12 lunar samples, the scientific requirement for vacuum samples was deleted after the Apollo 12 mission just as the vacuum system was reaching its full potential.
Controls for Burning Solid Wastes
ERIC Educational Resources Information Center
Toro, Richard F.; Weinstein, Norman J.
1975-01-01
Modern thermal solid waste processing systems are becoming more complex, incorporating features that require instrumentation and control systems to a degree greater than that previously required just for proper combustion control. With the advent of complex, sophisticated, thermal processing systems, TV monitoring and computer control should…
Data reduction complex analog-to-digital data processing requirements for onsite test facilities
NASA Technical Reports Server (NTRS)
Debbrecht, J. D.
1976-01-01
The analog to digital processing requirements of onsite test facilities are described. The source and medium of all input data to the Data Reduction Complex (DRC) and the destination and medium of all output products of the analog-to-digital processing are identified. Additionally, preliminary input and output data formats are presented along with the planned use of the output products.
NASA Technical Reports Server (NTRS)
1979-01-01
This document addresses requirements for post-test data reduction in support of the Orbital Flight Tests (OFT) mission evaluation team, specifically those which are planned to be implemented in the ODRC (Orbiter Data Reduction Complex). Only those requirements which have been previously baselined by the Data Systems and Analysis Directorate configuration control board are included. This document serves as the control document between Institutional Data Systems Division and the Integration Division for OFT mission evaluation data processing requirements, and shall be the basis for detailed design of ODRC data processing systems.
Automated Derivation of Complex System Constraints from User Requirements
NASA Technical Reports Server (NTRS)
Foshee, Mark; Murey, Kim; Marsh, Angela
2010-01-01
The Payload Operations Integration Center (POIC) located at the Marshall Space Flight Center has the responsibility of integrating US payload science requirements for the International Space Station (ISS). All payload operations must request ISS system resources so that the resource usage will be included in the ISS on-board execution timelines. The scheduling of resources and building of the timeline is performed using the Consolidated Planning System (CPS). The ISS resources are quite complex due to the large number of components that must be accounted for. The planners at the POIC simplify the process for Payload Developers (PDs) by providing the PDs with an application that has the basic functionality PDs need, as well as a list of simplified resources, in the User Requirements Collection (URC) application. The planners maintain a mapping of the URC resources to the CPS resources. Manually converting PDs' science requirements from the simplified representation to the more complex CPS representation is time-consuming and tedious. The goal is to provide a software solution that allows the planners to build a mapping of the complex CPS constraints to the basic URC constraints and automatically convert the PDs' requirements into system requirements during export to CPS.
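To make the intended automation concrete, the sketch below shows the kind of table-driven expansion such a tool might perform: each simplified URC resource maps to a list of detailed CPS constraints. The resource names, constraint strings, and function are illustrative assumptions, not the actual POIC/URC/CPS schemas.

```python
# Hypothetical URC-to-CPS mapping table; entries are invented for illustration.
URC_TO_CPS = {
    "power_1kW": ["EPS_channel_A_load<=1.0kW", "EPS_thermal_margin_check"],
    "crew_time": ["crew_availability_window", "crew_task_exclusivity"],
}

def export_to_cps(urc_requirements):
    """Expand each simplified URC resource into its detailed CPS constraints."""
    cps_constraints = []
    for resource in urc_requirements:
        cps_constraints.extend(URC_TO_CPS.get(resource, []))
    return cps_constraints

print(export_to_cps(["power_1kW", "crew_time"]))
```

Once the planners populate such a mapping, every subsequent export reuses it, which is the time saving the abstract describes.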
In vivo characterization of the Drosophila mRNA 3′ end processing core cleavage complex
Michalski, Daniel; Steiniger, Mindy
2015-01-01
A core cleavage complex (CCC) consisting of CPSF73, CPSF100, and Symplekin is required for cotranscriptional 3′ end processing of all metazoan pre-mRNAs, yet little is known about the in vivo molecular interactions within this complex. The CCC is a component of two distinct complexes, the cleavage/polyadenylation complex and the complex that processes nonpolyadenylated histone pre-mRNAs. RNAi-depletion of CCC factors in Drosophila culture cells causes reduction of CCC processing activity on histone mRNAs, resulting in read-through transcription. In contrast, RNAi-depletion of factors only required for histone mRNA processing allows use of downstream cryptic polyadenylation signals to produce polyadenylated histone mRNAs. We used Dmel-2 tissue culture cells stably expressing tagged CCC components to determine that amino acids 272–1080 of Symplekin and the C-terminal approximately 200 amino acids of both CPSF73 and CPSF100 are required for efficient CCC formation in vivo. Additional experiments reveal that the C-terminal 241 amino acids of CPSF100 are sufficient for histone mRNA processing, indicating that the first 524 amino acids of CPSF100 are dispensable for both CCC formation and histone mRNA 3′ end processing. CCCs containing Symplekin deletions lacking the first 271 amino acids resulted in dramatically increased use of downstream polyadenylation sites for histone mRNA 3′ end processing, similar to RNAi-depletion of the histone-specific 3′ end processing factors FLASH, SLBP, and U7 snRNA. We propose a model in which CCC formation is mediated by the CPSF73, CPSF100, and Symplekin C-termini, and the N-terminal region of Symplekin facilitates cotranscriptional 3′ end processing of histone mRNAs. PMID:26081560
Goal neglect and knowledge chunking in the construction of novel behaviour
Bhandari, Apoorva; Duncan, John
2014-01-01
Task complexity is a critical factor in cognitive efficiency and fluid intelligence. To examine functional limits in task complexity, we examine the phenomenon of goal neglect, where participants with low fluid intelligence fail to follow task rules that they otherwise understand. Though neglect is known to increase with task complexity, here we show that – in contrast to previous accounts – the critical factor is not the total complexity of all task rules. Instead, when the space of task requirements can be divided into separate sub-parts, neglect is controlled by the complexity of each component part. The data also show that neglect develops and stabilizes over the first few performance trials, i.e., as instructions are first used to generate behaviour. In all complex behaviour, a critical process is the combination of task events with retrieved task requirements to create focused attentional episodes dealing with each decision in turn. In large part, we suggest, fluid intelligence may reflect this process of converting complex requirements into effective attentional episodes. PMID:24141034
CNC Machining Of The Complex Copper Electrodes
NASA Astrophysics Data System (ADS)
Popan, Ioan Alexandru; Balc, Nicolae; Popan, Alina
2015-07-01
This paper presents the machining process of complex copper electrodes. Machining complex shapes in copper is difficult because the material is soft and sticky. This research presents the main steps for processing these copper electrodes with high dimensional accuracy and good surface quality. Special tooling solutions are required for this machining process, and optimal process parameters have been found for the accurate CNC equipment using smart CAD/CAM software.
Chandler, Jacqueline; Rycroft-Malone, Jo; Hawkes, Claire; Noyes, Jane
2016-02-01
To examine the application of core concepts from Complexity Theory to explain the findings from a process evaluation undertaken in a trial evaluating implementation strategies for recommendations about reducing surgical fasting times. The proliferation of evidence-based guidance requires a greater focus on its implementation. Theory is required to explain the complex processes across the multiple healthcare organizational levels. This social healthcare context involves the interaction between professionals, patients and the organizational systems in care delivery. Complexity Theory may provide an explanatory framework to explain the complexities inherent in implementation in social healthcare contexts. A secondary thematic analysis of qualitative process evaluation data informed by Complexity Theory. Seminal texts applying Complexity Theory to the social context were annotated, key concepts extracted and core Complexity Theory concepts identified. These core concepts were applied as a theoretical lens to provide an explanation of themes from a process evaluation of a trial evaluating the implementation of strategies to reduce surgical fasting times. Sampled substantive texts provided a representative spread of theoretical development and application of Complexity Theory from the late 1990s to 2013 in social science, healthcare, management and philosophy. Five core Complexity Theory concepts were extracted: 'self-organization', 'interaction', 'emergence', 'system history' and 'temporality'. Application of these concepts suggests that routine surgical fasting practice is habituated in the social healthcare system and therefore cannot easily be reversed. Reducing fasting times requires an incentivised new approach to emerge within the surgical system's priority of completing the operating list. The application of Complexity Theory provides a useful explanation for resistance to changing fasting practice. Its utility in implementation research warrants further attention and evaluation. © 2015 John Wiley & Sons Ltd.
SYSTEMATIC PROCEDURE FOR DESIGNING PROCESSES WITH MULTIPLE ENVIRONMENTAL OBJECTIVES
Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems, due to the complex nature of the problems, the need for complex assessments, and complicated ...
Minimized state complexity of quantum-encoded cryptic processes
NASA Astrophysics Data System (ADS)
Riechers, Paul M.; Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.
2016-05-01
The predictive information required for proper trajectory sampling of a stochastic process can be more efficiently transmitted via a quantum channel than a classical one. This recent discovery allows quantum information processing to drastically reduce the memory necessary to simulate complex classical stochastic processes. It also points to a new perspective on the intrinsic complexity that nature must employ in generating the processes we observe. The quantum advantage increases with codeword length: the length of process sequences used in constructing the quantum communication scheme. In analogy with the classical complexity measure, statistical complexity, we use this reduced communication cost as an entropic measure of state complexity in the quantum representation. Previously difficult to compute, the quantum advantage is expressed here in closed form using spectral decomposition. This allows for efficient numerical computation of the quantum-reduced state complexity at all encoding lengths, including infinite. Additionally, it makes clear how finite-codeword reduction in state complexity is controlled by the classical process's cryptic order, and it allows asymptotic analysis of infinite-cryptic-order processes.
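For orientation, the two complexity measures contrasted above are conventionally defined as follows in the computational mechanics literature (a sketch using standard notation, not necessarily this paper's exact conventions): the classical statistical complexity is the Shannon entropy over the causal states, and the quantum state complexity is the von Neumann entropy of the stationary ensemble of quantum memory states.

```latex
C_\mu = -\sum_{\sigma \in \mathcal{S}} \Pr(\sigma)\,\log_2 \Pr(\sigma),
\qquad
C_q = S(\rho) = -\operatorname{Tr}\left(\rho \log_2 \rho\right),
\quad
\rho = \sum_{\sigma \in \mathcal{S}} \Pr(\sigma)\,\lvert \eta_\sigma \rangle\langle \eta_\sigma \rvert ,
```

where $\mathcal{S}$ is the set of causal states and $\lvert \eta_\sigma \rangle$ their (generally non-orthogonal) quantum encodings; the overlap between encodings is what allows $C_q \le C_\mu$.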
Experimentally modeling stochastic processes with less memory by the use of a quantum processor
Palsson, Matthew S.; Gu, Mile; Ho, Joseph; Wiseman, Howard M.; Pryde, Geoff J.
2017-01-01
Computer simulation of observable phenomena is an indispensable tool for engineering new technology, understanding the natural world, and studying human society. However, the most interesting systems are often so complex that simulating their future behavior demands storing immense amounts of information regarding how they have behaved in the past. For increasingly complex systems, simulation becomes increasingly difficult and is ultimately constrained by resources such as computer memory. Recent theoretical work shows that quantum theory can reduce this memory requirement beyond ultimate classical limits, as measured by a process’ statistical complexity, C. We experimentally demonstrate this quantum advantage in simulating stochastic processes. Our quantum implementation observes a memory requirement of Cq = 0.05 ± 0.01, far below the ultimate classical limit of C = 1. Scaling up this technique would substantially reduce the memory required in simulations of more complex systems. PMID:28168218
QMU as an approach to strengthening the predictive capabilities of complex models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, Genetha Anne.; Boggs, Paul T.; Grace, Matthew D.
2010-09-01
Complex systems are made up of multiple interdependent parts, and the behavior of the entire system cannot always be directly inferred from the behavior of the individual parts. They are nonlinear, and system responses are not necessarily additive. Examples of complex systems include energy, cyber and telecommunication infrastructures, human and animal social structures, and biological structures such as cells. To meet the goals of infrastructure development, maintenance, and protection for cyber-related complex systems, novel modeling and simulation technology is needed. Sandia has shown success using M&S in the nuclear weapons (NW) program. However, complex systems represent a significant challenge and relative departure from the classical M&S exercises, and many of the scientific and mathematical M&S processes must be re-envisioned. Specifically, in the NW program, requirements and acceptable margins for performance, resilience, and security are well-defined and given quantitatively from the start. The Quantification of Margins and Uncertainties (QMU) process helps to assess whether or not these safety, reliability and performance requirements have been met after a system has been developed. In this sense, QMU is used as a sort of check that requirements have been met once the development process is completed. In contrast, performance requirements and margins may not have been defined a priori for many complex systems (e.g., the Internet, electrical distribution grids), particularly not in quantitative terms. This project addresses this fundamental difference by investigating the use of QMU at the start of the design process for complex systems. Three major tasks were completed. First, the characteristics of the cyber infrastructure problem were collected and considered in the context of QMU-based tools. Second, UQ methodologies for the quantification of model discrepancies were considered in the context of statistical models of cyber activity. Third, Bayesian methods for optimal testing in the QMU framework were developed. The completion of this project represents an increased understanding of how to apply and use the QMU process as a means for improving model predictions of the behavior of complex systems.
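For readers unfamiliar with QMU, the process is often summarized by a confidence ratio comparing a performance margin to its quantified uncertainty (a standard schematic form, not a formula specific to this project):

```latex
\mathrm{CR} = \frac{M}{U},
\qquad
M = \left| \text{performance threshold} - \text{best-estimate performance} \right| ,
```

where $U$ aggregates the uncertainty in the best estimate; $\mathrm{CR}$ comfortably greater than 1 indicates the requirement is met with margin exceeding uncertainty. The project's point is that for complex systems such as cyber infrastructure, the thresholds defining $M$ may not exist a priori and must be constructed during design.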
Using nocturnal cold air drainage flow to monitor ecosystem processes in complex terrain
Thomas G. Pypker; Michael H. Unsworth; Alan C. Mix; William Rugh; Troy Ocheltree; Karrin Alstad; Barbara J. Bond
2007-01-01
This paper presents initial investigations of a new approach to monitor ecosystem processes in complex terrain on large scales. Metabolic processes in mountainous ecosystems are poorly represented in current ecosystem monitoring campaigns because the methods used for monitoring metabolism at the ecosystem scale (e.g., eddy covariance) require flat study sites. Our goal...
Structure and Randomness of Continuous-Time, Discrete-Event Processes
NASA Astrophysics Data System (ADS)
Marzen, Sarah E.; Crutchfield, James P.
2017-10-01
Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models—memoryful, state-dependent versions of renewal processes. Calculating these quantities requires introducing novel mathematical objects (ε-machines of hidden semi-Markov processes) and applying new information-theoretic methods to stochastic processes.
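As a minimal numerical illustration of the first quantity, the entropy rate of an ordinary discrete-time Markov chain follows directly from its transition matrix and stationary distribution. This toy sketch (a plain two-state Markov chain, far simpler than the unifilar hidden semi-Markov models treated in the paper) is an assumption-laden stand-in, not the authors' method:

```python
import numpy as np

# Row-stochastic transition matrix of a two-state Markov chain.
T = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Stationary distribution: left eigenvector of T for eigenvalue 1.
evals, evecs = np.linalg.eig(T.T)
pi = np.real(evecs[:, np.argmax(np.isclose(evals, 1.0))])
pi /= pi.sum()

# Entropy rate h = -sum_i pi_i sum_j T_ij log2 T_ij (bits per symbol).
h = -sum(pi[i] * T[i, j] * np.log2(T[i, j])
         for i in range(2) for j in range(2) if T[i, j] > 0)
print(f"entropy rate: {h:.4f} bits/symbol")
```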
Not so Complex: Iteration in the Complex Plane
ERIC Educational Resources Information Center
O'Dell, Robin S.
2014-01-01
The simple process of iteration can produce complex and beautiful figures. In this article, Robin O'Dell presents a set of tasks requiring students to use the geometric interpretation of complex number multiplication to construct linear iteration rules. When the outputs are plotted in the complex plane, the graphs trace pleasing designs…
Measuring the Level of Complexity of Scientific Inquiries: The LCSI Index
ERIC Educational Resources Information Center
Eilam, Efrat
2015-01-01
The study developed and applied an index for measuring the level of complexity of full authentic scientific inquiry. Complexity is a fundamental attribute of real life scientific research. The level of complexity is an overall reflection of complex cognitive and metacognitive processes which are required for navigating the authentic inquiry…
Bertone, Armando; Mottron, Laurent; Jelenic, Patricia; Faubert, Jocelyn
2005-10-01
Visuo-perceptual processing in autism is characterized by intact or enhanced performance on static spatial tasks and inferior performance on dynamic tasks, suggesting a deficit of dorsal visual stream processing in autism. However, previous findings by Bertone et al. indicate that neuro-integrative mechanisms used to detect complex motion, rather than motion perception per se, may be impaired in autism. We present here the first demonstration of concurrent enhanced and decreased performance in autism on the same visuo-spatial static task, wherein the only factor dichotomizing performance was the neural complexity required to discriminate grating orientation. The ability of persons with autism was found to be superior for identifying the orientation of simple, luminance-defined (or first-order) gratings but inferior for complex, texture-defined (or second-order) gratings. Using a flicker contrast sensitivity task, we demonstrated that this finding is probably not due to abnormal information processing at a sub-cortical level (magnocellular and parvocellular functioning). Together, these findings are interpreted as a clear indication of altered low-level perceptual information processing in autism, and confirm that the deficits and assets observed in autistic visual perception are contingent on the complexity of the neural network required to process a given type of visual stimulus. We suggest that atypical neural connectivity, resulting in enhanced lateral inhibition, may account for both enhanced and decreased low-level information processing in autism.
Integrated Information Increases with Fitness in the Evolution of Animats
Edlund, Jeffrey A.; Chaumont, Nicolas; Hintze, Arend; Koch, Christof; Tononi, Giulio; Adami, Christoph
2011-01-01
One of the hallmarks of biological organisms is their ability to integrate disparate information sources to optimize their behavior in complex environments. How this capability can be quantified and related to the functional complexity of an organism remains a challenging problem, in particular since organismal functional complexity is not well-defined. We present here several candidate measures that quantify information and integration, and study their dependence on fitness as an artificial agent (“animat”) evolves over thousands of generations to solve a navigation task in a simple, simulated environment. We compare the ability of these measures to predict high fitness with more conventional information-theoretic processing measures. As the animat adapts by increasing its “fit” to the world, information integration and processing increase commensurately along the evolutionary line of descent. We suggest that the correlation of fitness with information integration and with processing measures implies that high fitness requires both information processing as well as integration, but that information integration may be a better measure when the task requires memory. A correlation of measures of information integration (but also information processing) and fitness strongly suggests that these measures reflect the functional complexity of the animat, and that such measures can be used to quantify functional complexity even in the absence of fitness data. PMID:22028639
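One conventional building block for the information-processing measures compared in this work is the mutual information between two system variables, for example sensor and motor states (the standard definition is given here for orientation; the paper's specific candidate measures, including integrated information, refine and extend it):

```latex
I(X;Y) = \sum_{x,y} p(x,y)\,\log_2 \frac{p(x,y)}{p(x)\,p(y)} .
```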
Lu, X; Welsh, T M; Peterlin, B M
1993-01-01
The human immunodeficiency virus type 1 long terminal repeat sets up two different transcription complexes, which have been called processive and nonprocessive complexes. By mutating and substituting cis-acting sequences, we mapped elements of the human immunodeficiency virus long terminal repeat that are responsible for creating each transcription complex. Whereas processive complexes are efficiently assembled by upstream promoter elements in the absence of the TATA box, nonprocessive complexes absolutely require the TATA box. Moreover, the TATA box alone can set up these nonprocessive complexes, and nonprocessive but not processive complexes are trans activated by Tat. Finally, a strong DNA-binding site between the TATA box and trans-activation-responsive region interferes with either the assembly or movement of these nonprocessive complexes and diminishes the effects of Tat. Thus, Tat affects a critical step in the formation of elongation-competent transcription complexes. PMID:8445708
The Evolutionarily Conserved Protein LAS1 Is Required for Pre-rRNA Processing at Both Ends of ITS2
Schillewaert, Stéphanie; Wacheul, Ludivine; Lhomme, Frédéric
2012-01-01
Ribosome synthesis entails the formation of mature rRNAs from long precursor molecules, following a complex pre-rRNA processing pathway. Why the generation of mature rRNA ends is so complicated is unclear. Nor is it understood how pre-rRNA processing is coordinated at distant sites on pre-rRNA molecules. Here we characterized, in budding yeast and human cells, the evolutionarily conserved protein Las1. We found that, in both species, Las1 is required to process ITS2, which separates the 5.8S and 25S/28S rRNAs. In yeast, Las1 is required for pre-rRNA processing at both ends of ITS2. It is required for Rrp6-dependent formation of the 5.8S rRNA 3′ end and for Rat1-dependent formation of the 25S rRNA 5′ end. We further show that the Rat1-Rai1 5′-3′ exoribonuclease (exoRNase) complex functionally connects processing at both ends of the 5.8S rRNA. We suggest that pre-rRNA processing is coordinated at both ends of 5.8S rRNA and both ends of ITS2, which are brought together by pre-rRNA folding, by an RNA processing complex. Consistently, we note the conspicuous presence of ∼7- or 8-nucleotide extensions on both ends of 5.8S rRNA precursors and at the 5′ end of pre-25S RNAs suggestive of a protected spacer fragment of similar length. PMID:22083961
Munir, Samina K; Kay, Stephen
2005-08-01
A multi-site study, conducted in two English and two Danish intensive care units, investigates the complexity of work processes in intensive care, and the implications of this complexity for information management with regard to clinical information systems. Data were collected via observations, shadowing of clinical staff, interviews and questionnaires. The construction of role activity diagrams enabled the capture of critical care work processes. Upon analysing these diagrams, it was found that intensive care work processes consist of 'simplified complexity'; these processes change with the introduction of information systems for the everyday use and management of all clinical information. The prevailing notion of complexity surrounding critical care clinical work processes was refuted and found to be misleading; in reality, it is not the work processes that cause the complexity but the way in which clinical information is used and managed. This study emphasises that the potential for clinical information systems that consider integrating all clinical information requirements is not only immense but also very plausible.
Improving the Aircraft Design Process Using Web-Based Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.; Follen, Gregory J. (Technical Monitor)
2000-01-01
Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.
Syntactic Recursion Facilitates and Working Memory Predicts Recursive Theory of Mind
Arslan, Burcu; Hohenberger, Annette; Verbrugge, Rineke
2017-01-01
In this study, we focus on the possible roles of second-order syntactic recursion and working memory in terms of simple and complex span tasks in the development of second-order false belief reasoning. We tested 89 Turkish children in two age groups, one younger (4;6–6;5 years) and one older (6;7–8;10 years). Although second-order syntactic recursion is significantly correlated with the second-order false belief task, results of ordinal logistic regressions revealed that the main predictor of second-order false belief reasoning is complex working memory span. Unlike simple working memory and second-order syntactic recursion tasks, the complex working memory task required processing information serially with additional reasoning demands that require complex working memory strategies. Based on our results, we propose that children’s second-order theory of mind develops when they have efficient reasoning rules to process embedded beliefs serially, thus overcoming a possible serial processing bottleneck. PMID:28072823
Integrating a Genetic Algorithm Into a Knowledge-Based System for Ordering Complex Design Processes
NASA Technical Reports Server (NTRS)
Rogers, James L.; McCulley, Collin M.; Bloebaum, Christina L.
1996-01-01
The design cycle associated with large engineering systems requires an initial decomposition of the complex system into design processes which are coupled through the transference of output data. Some of these design processes may be grouped into iterative subcycles. In analyzing or optimizing such a coupled system, it is essential to be able to determine the best ordering of the processes within these subcycles to reduce design cycle time and cost. Many decomposition approaches assume the capability is available to determine what design processes and couplings exist and what order of execution will be imposed during the design cycle. Unfortunately, this is often a complex problem and beyond the capabilities of a human design manager. A new feature, a genetic algorithm, has been added to DeMAID (Design Manager's Aid for Intelligent Decomposition) to allow the design manager to rapidly examine many different combinations of ordering processes in an iterative subcycle and to optimize the ordering based on cost, time, and iteration requirements. Two sample test cases are presented to show the effects of optimizing the ordering with a genetic algorithm.
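A minimal evolutionary sketch of the ordering idea appears below: a candidate ordering is a permutation of the processes, and fitness counts feedback couplings (output consumed earlier in the sequence than it is produced, forcing iteration). The couplings, encoding, and operators are illustrative assumptions, not DeMAID's actual genetic algorithm.

```python
import random

# (producer, consumer) data couplings among 4 design processes -- invented example.
COUPLINGS = [(0, 1), (1, 2), (2, 0), (3, 1), (2, 3)]
N = 4

def feedbacks(order):
    """Count couplings whose consumer runs before its producer (iteration cost)."""
    pos = {p: i for i, p in enumerate(order)}
    return sum(1 for prod, cons in COUPLINGS if pos[cons] < pos[prod])

def evolve(generations=200, pop_size=20):
    pop = [random.sample(range(N), N) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=feedbacks)               # rank by fitness (fewer feedbacks)
        survivors = pop[: pop_size // 2]
        children = []
        for parent in survivors:              # swap-mutation offspring
            child = parent[:]
            i, j = random.sample(range(N), 2)
            child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    return min(pop, key=feedbacks)

best = evolve()
print(best, feedbacks(best))
```

A real design-cycle fitness would weight cost, time, and iteration requirements rather than a bare feedback count, as the abstract notes.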
ERIC Educational Resources Information Center
Plant, Jennifer L.; Corden, Mark; Mourad, Michelle; O'Brien, Bridget C.; van Schaik, Sandrijn M.
2013-01-01
Self-directed learning requires self-assessment of learning needs and performance, a complex process that requires collecting and interpreting data from various sources. Learners' approaches to self-assessment likely vary depending on the learner and the context. The aim of this study was to gain insight into how learners process external…
Local wavelet transform: a cost-efficient custom processor for space image compression
NASA Astrophysics Data System (ADS)
Masschelein, Bart; Bormans, Jan G.; Lafruit, Gauthier
2002-11-01
Thanks to its intrinsic scalability features, the wavelet transform has become increasingly popular as a decorrelator in image compression applications. Throughput, memory requirements and complexity are important parameters when developing hardware image compression modules. An implementation of the classical, global wavelet transform requires large memory sizes and implies a large latency between the availability of the input image and the production of minimal data entities for entropy coding. Image tiling methods, as proposed by JPEG2000, reduce the memory sizes and the latency, but inevitably introduce image artefacts. The Local Wavelet Transform (LWT), presented in this paper, is a low-complexity wavelet transform architecture using block-based processing that results in the same transformed images as those obtained by the global wavelet transform. The architecture minimizes the processing latency with a limited amount of memory. Moreover, as the LWT is an instruction-based custom processor, it can be programmed for specific tasks, such as push-broom processing of infinite-length satellite images. The features of the LWT make it appropriate for use in space image compression, where high throughput, low memory sizes, low complexity, low power and push-broom processing are important requirements.
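The key claim (block-based processing that exactly reproduces the global transform) can be illustrated with the 2-tap Haar filter, for which even-sized blocks need no overlap; longer filters would require carrying state between blocks, which is what the LWT architecture manages internally. This one-dimensional sketch is a simplified stand-in, not the LWT itself:

```python
import numpy as np

def haar_level(x):
    """Single-level Haar transform: approximation and detail coefficients."""
    s = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return s, d

signal = np.arange(16, dtype=float)
blocks = [signal[i:i + 4] for i in range(0, 16, 4)]   # stream in blocks of 4

s = np.concatenate([haar_level(b)[0] for b in blocks])
d = np.concatenate([haar_level(b)[1] for b in blocks])

# Block-by-block output matches the global transform exactly.
assert np.allclose(s, haar_level(signal)[0])
assert np.allclose(d, haar_level(signal)[1])
```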
Using Concepts from Complexity Science to Accelerate Curricular Revision
ERIC Educational Resources Information Center
Goldman, Ellen F.; Mintz, Matthew L.
2017-01-01
Curricular revision can be an arduous and challenging process. The literature favors a rational planned process for doing so, but offers little advice regarding how to proceed when the time required for such an approach is not available. This article describes our use of four concepts from complexity science to revise a medical school curriculum…
Challenges in Developing Models Describing Complex Soil Systems
NASA Astrophysics Data System (ADS)
Simunek, J.; Jacques, D.
2014-12-01
Quantitative mechanistic models that consider basic physical, mechanical, chemical, and biological processes have the potential to be powerful tools for integrating our understanding of complex soil systems, and the soil science community has often called for models that would include a large number of these diverse processes. However, once attempts have been made to develop such models, the response from the community has not always been overwhelming, especially after it discovered that these models are consequently highly complex: they require a large number of parameters, not all of which can be easily (or at all) measured or identified, and which are often associated with large uncertainties; they also require from their users deep knowledge of all or most of the implemented physical, mechanical, chemical and biological processes. The real, or perceived, complexity of these models then discourages users from using them even for relatively simple applications, for which they would be perfectly adequate. Due to the nonlinear nature and chemical/biological complexity of soil systems, it is also virtually impossible to verify these types of models analytically, raising doubts about their applicability. Code inter-comparison, which is then likely the most suitable method to assess code capabilities and model performance, requires the existence of multiple models of similar or overlapping capabilities, which may not always exist. It is thus a challenge not only to develop models describing complex soil systems, but also to persuade the soil science community to use them. As a result, complex quantitative mechanistic models remain an underutilized tool in soil science research. We will demonstrate some of the challenges discussed above using our own efforts in developing quantitative mechanistic models (such as HP1/2) for complex soil systems.
Automated Derivation of Complex System Constraints from User Requirements
NASA Technical Reports Server (NTRS)
Muery, Kim; Foshee, Mark; Marsh, Angela
2006-01-01
International Space Station (ISS) payload developers submit their payload science requirements for the development of on-board execution timelines. The ISS systems required to execute the payload science operations must be represented as constraints for the execution timeline. Payload developers use a software application, User Requirements Collection (URC), to submit their requirements by selecting a simplified representation of ISS system constraints. To fully represent the complex ISS systems, the constraints require a level of detail that is beyond the insight of the payload developer. To provide the complex representation of the ISS system constraints, HOSC operations personnel, specifically the Payload Activity Requirements Coordinators (PARC), manually translate the payload developers' simplified constraints into the detailed ISS system constraints used for scheduling the payload activities in the Consolidated Planning System (CPS). This paper describes the implementation of a software application, User Requirements Integration (URI), developed to automate this manual ISS constraint translation process.
The neural correlates of morphological complexity processing: Detecting structure in pseudowords.
Schuster, Swetlana; Scharinger, Mathias; Brooks, Colin; Lahiri, Aditi; Hartwigsen, Gesa
2018-06-01
Morphological complexity is a highly debated issue in visual word recognition. Previous neuroimaging studies have shown that speakers are sensitive to degrees of morphological complexity. Two-step derived complex words (bridging, through bridge(N) > bridge(V) > bridging) led to more enhanced activation in the left inferior frontal gyrus than their 1-step derived counterparts (running, through run(V) > running). However, it remains unclear whether sensitivity to degrees of morphological complexity extends to pseudowords. If this were the case, it would indicate that abstract knowledge of morphological structure is independent of lexicality. We addressed this question by investigating the processing of two sets of pseudowords in German. Both sets contained morphologically viable two-step derived pseudowords differing in the number of derivational steps required to access an existing lexical representation and therefore the degree of structural analysis expected during processing. Using a 2 × 2 factorial design, we found lexicality effects to be distinct from processing signatures relating to structural analysis in pseudowords. Semantically-driven processes such as lexical search showed a more frontal distribution while combinatorial processes related to structural analysis engaged more parietal parts of the network. Specifically, more complex pseudowords showed increased activation in parietal regions (right superior parietal lobe and left precuneus) relative to pseudowords that required less structural analysis to arrive at an existing lexical representation. As the two sets were matched on cohort size and surface form, these results highlight the role of internal levels of morphological structure even in forms that do not possess a lexical representation. © 2018 Wiley Periodicals, Inc.
Enhanced pure-tone pitch discrimination among persons with autism but not Asperger syndrome.
Bonnel, Anna; McAdams, Stephen; Smith, Bennett; Berthiaume, Claude; Bertone, Armando; Ciocca, Valter; Burack, Jacob A; Mottron, Laurent
2010-07-01
Persons with Autism spectrum disorders (ASD) display atypical perceptual processing in visual and auditory tasks. In vision, Bertone, Mottron, Jelenic, and Faubert (2005) found that enhanced and diminished visual processing is linked to the level of neural complexity required to process stimuli, as proposed in the neural complexity hypothesis. Based on these findings, Samson, Mottron, Jemel, Belin, and Ciocca (2006) proposed to extend the neural complexity hypothesis to the auditory modality. They hypothesized that persons with ASD should display enhanced performance for simple tones that are processed in primary auditory cortical regions, but diminished performance for complex tones that require additional processing in associative auditory regions, in comparison to typically developing individuals. To assess this hypothesis, we designed four auditory discrimination experiments targeting pitch, non-vocal and vocal timbre, and loudness. Stimuli consisted of spectro-temporally simple and complex tones. The participants were adolescents and young adults with autism, Asperger syndrome, and typical developmental histories, all with IQs in the normal range. Consistent with the neural complexity hypothesis and enhanced perceptual functioning model of ASD (Mottron, Dawson, Soulières, Hubert, & Burack, 2006), the participants with autism, but not with Asperger syndrome, displayed enhanced pitch discrimination for simple tones. However, no discrimination-thresholds differences were found between the participants with ASD and the typically developing persons across spectrally and temporally complex conditions. These findings indicate that enhanced pure-tone pitch discrimination may be a cognitive correlate of speech-delay among persons with ASD. However, auditory discrimination among this group does not appear to be directly contingent on the spectro-temporal complexity of the stimuli. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
Integrated copper-containing wastewater treatment using xanthate process.
Chang, Yi-Kuo; Chang, Juu-En; Lin, Tzong-Tzeng; Hsu, Yu-Ming
2002-09-02
Although the xanthate process has been shown to be an effective method for heavy metal removal from contaminated water, the treatment produces a heavy-metal-contaminated residual sludge, and the metal-xanthate sludge must be handled in accordance with the Taiwan EPA's waste disposal requirements. This work employed potassium ethyl xanthate (KEX) to remove copper ions from wastewater. The toxicity characteristic leaching procedure (TCLP) and semi-dynamic leaching test (SDLT) were used to determine the leaching potential and stability characteristics of the residual copper xanthate (Cu-EX) complexes. Results from metal removal experiments showed that KEX was suitable for treating copper-containing wastewater over a wide copper concentration range (50, 100, 500, and 1000 mg/l) to the level that meets the Taiwan EPA's effluent regulations (3 mg/l). The TCLP results of the residual Cu-EX complexes could meet the current regulations, and thus the Cu-EX complexes could be treated as a non-hazardous material. Besides, the SDLT results indicated that the complexes exhibited excellent performance for stabilizing metals under acidic conditions, even though slight chemical changes of the complexes occurred during extraction. The xanthate process, mixing KEX with copper-bearing solution to form Cu-EX precipitates, offered a comprehensive strategy for solving both copper-containing wastewater problems and the subsequent sludge disposal requirements.
The Influence of Cultural Factors on Trust in Automation
ERIC Educational Resources Information Center
Chien, Shih-Yi James
2016-01-01
Human interaction with automation is a complex process that requires both skilled operators and complex system designs to effectively enhance overall performance. Although automation has successfully managed complex systems throughout the world for over half a century, inappropriate reliance on automation can still occur, such as the recent…
Modeling Complex Workflow in Molecular Diagnostics
Gomah, Mohamed E.; Turley, James P.; Lu, Huimin; Jones, Dan
2010-01-01
One of the hurdles to achieving personalized medicine has been implementing the laboratory processes for performing and reporting complex molecular tests. The rapidly changing test rosters and complex analysis platforms in molecular diagnostics have meant that many clinical laboratories still use labor-intensive manual processing and testing without the level of automation seen in high-volume chemistry and hematology testing. We provide here a discussion of design requirements and the results of implementation of a suite of lab management tools that incorporate the many elements required for use of molecular diagnostics in personalized medicine, particularly in cancer. These applications provide the functionality required for sample accessioning and tracking, material generation, and testing that are particular to the evolving needs of individualized molecular diagnostics. On implementation, the applications described here resulted in improvements in the turn-around time for reporting of more complex molecular test sets, and significant changes in the workflow. Therefore, careful mapping of workflow can permit design of software applications that simplify even the complex demands of specialized molecular testing. By incorporating design features for order review, software tools can permit a more personalized approach to sample handling and test selection without compromising efficiency. PMID:20007844
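A sample-tracking core of the kind described reduces naturally to a state machine over the accessioning-to-reporting lifecycle. The states and transitions below are invented for illustration and are not the authors' application:

```python
# Allowed lifecycle transitions for a molecular diagnostics sample (illustrative).
TRANSITIONS = {
    "accessioned":        {"material_generated"},
    "material_generated": {"testing"},
    "testing":            {"review"},
    "review":             {"reported", "testing"},  # order review may trigger re-test
}

def advance(state, next_state):
    """Move a sample to next_state, rejecting transitions the workflow forbids."""
    if next_state not in TRANSITIONS.get(state, set()):
        raise ValueError(f"illegal transition: {state} -> {next_state}")
    return next_state

state = "accessioned"
for step in ("material_generated", "testing", "review", "reported"):
    state = advance(state, step)
print(state)
```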
Mahoney, John R; Aghamohammadi, Cina; Crutchfield, James P
2016-02-15
A stochastic process' statistical complexity stands out as a fundamental property: the minimum information required to synchronize one process generator to another. How much information is required, though, when synchronizing over a quantum channel? Recent work demonstrated that representing causal similarity as quantum state-indistinguishability provides a quantum advantage. We generalize this to synchronization and offer a sequence of constructions that exploit extended causal structures, finding substantial increase of the quantum advantage. We demonstrate that maximum compression is determined by the process' cryptic order--a classical, topological property closely allied to Markov order, itself a measure of historical dependence. We introduce an efficient algorithm that computes the quantum advantage and close by noting that the advantage comes at a cost: one trades off prediction for generation complexity.
Overview of DYMCAS, the Y-12 Material Control And Accountability System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alspaugh, D. H.
2001-07-01
This paper gives an overview of DYMCAS, the material control and accountability information system for the Y-12 National Security Complex. A common misconception, even within the DOE community, understates the nature and complexity of material control and accountability (MC and A) systems, likening them to parcel delivery systems tracking packages at various locations or banking systems that account for money, down to the penny. A major point set forth in this paper is that MC and A systems such as DYMCAS can be and often are very complex. Given accountability reporting requirements and the critical and sensitive nature of the task, no MC and A system can be simple. The complexity of site-level accountability systems, however, varies dramatically depending on the amounts, kinds, and forms of nuclear materials and the kinds of processing performed at the site. Some accountability systems are tailored to unique and highly complex site-level materials and material processing and, consequently, are highly complex systems. Sites with less complexity require less complex accountability systems, and where processes and practices are the same or similar, sites on the mid-to-low end of the complexity scale can effectively utilize a standard accountability system. In addition to being complex, a unique feature of DYMCAS is its integration with the site production control and manufacturing system. This paper will review the advantages of such integration, as well as related challenges, and make the point that the effectiveness of complex MC and A systems can be significantly enhanced through appropriate systems integration.
Geometric and Algebraic Approaches in the Concept of Complex Numbers
ERIC Educational Resources Information Center
Panaoura, A.; Elia, I.; Gagatsis, A.; Giatilis, G.-P.
2006-01-01
This study explores pupils' performance and processes in tasks involving equations and inequalities of complex numbers requiring conversions from a geometric representation to an algebraic representation and conversions in the reverse direction, and also in complex numbers problem solving. Data were collected from 95 pupils of the final grade from…
The Bright Side of Being Blue: Depression as an Adaptation for Analyzing Complex Problems
ERIC Educational Resources Information Center
Andrews, Paul W.; Thomson, J. Anderson, Jr.
2009-01-01
Depression is the primary emotional condition for which help is sought. Depressed people often report persistent rumination, which involves analysis, and complex social problems in their lives. Analysis is often a useful approach for solving complex problems, but it requires slow, sustained processing, so disruption would interfere with problem…
Mitchell, Rachel L C; Vidaki, Kleio; Lavidor, Michal
2016-10-01
For complex linguistic strings such as idioms, making a decision as to the correct meaning may require complex top-down cognitive control, such as the suppression of incorrect alternative meanings. In the study presented here, we used transcranial direct current stimulation to test the hypothesis that a domain-general dorsolateral prefrontal cognitive control network is involved in constraining the complex processing involved. Specifically, we sought to test prominent theoretical stances on the division of labour across dorsolateral prefrontal cortex in the left and right hemispheres of the brain, including the role of salience and fine vs. coarse semantic coding. Thirty-two healthy young adult participants were randomly allocated to one of two stimulation montage groups (LH anodal/RH cathodal or RH anodal/LH cathodal). Participants were tested twice, completing a semantic decision task after receiving either active or sham stimulation. The semantic decision task required participants to judge the relatedness of an idiom and a target word. The target word was figuratively related, literally related, or unrelated to the idiom. Control non-literal, non-idiomatic sentences were also included that had only a literal meaning. The results showed that left-hemisphere dorsolateral prefrontal cortex is highly involved in processing figurative language, whereas both left and right dorsolateral prefrontal cortex contributed to literal language processing. In comparison, semantic processing of the non-idiomatic control sentences did not require domain-general cognitive control as it relates to suppression of the rejected alternative meaning. The results are discussed in terms of the interplay between the need for domain-general cognitive control in understanding the meaning of complex sentences, hemispheric differences in semantic processing, and salience detection. Copyright © 2016 Elsevier Ltd. All rights reserved.
Expert systems for superalloy studies
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Kaukler, William F.
1990-01-01
There are many areas in science and engineering which require knowledge of an extremely complex foundation of experimental results in order to design methodologies for developing new materials or products. Superalloys fit well into this discussion in the sense that they are complex combinations of elements which exhibit certain characteristics. Obviously the use of superalloys in high-performance, high-temperature systems such as the Space Shuttle Main Engine is of interest to NASA. The superalloy manufacturing process is complex, and the implementation of an expert system within the design process requires some thought as to how and where it should be implemented. A major motivation is to develop a methodology to assist metallurgists in the design of superalloy materials using current expert systems technology. Hydrogen embrittlement is disastrous to rocket engines, and the heuristics can be very complex. Attacking this problem as one module in the overall design process represents a significant step forward. In order to describe the objectives of the first phase implementation, the expert system was designated the Hydrogen Environment Embrittlement Expert System (HEEES).
Saturn S-2 production operations techniques: Production welding. Volume 1: Bulkhead welding
NASA Technical Reports Server (NTRS)
Abel, O. G.
1970-01-01
The complex Saturn S-2 welding processes and procedures required considerable development and refinement to establish a production capability that could consistently produce aluminum alloy welds within specified requirements. The special processes and techniques are defined that were established for the welding of gore-to-gore and manhole- or closeout-to-gore.
NASA Technical Reports Server (NTRS)
2001-01-01
REI Systems, Inc. developed a software solution that uses the Internet to eliminate the paperwork typically required to document and manage complex business processes. The data management solution, called Electronic Handbooks (EHBs), is presently used for all SBIR program processes at NASA. The EHB-based system is ideal for programs and projects whose users are geographically distributed and are involved in complex management processes and procedures. EHBs provide flexible access control and increased communications while maintaining security for systems of all sizes. Through Internet Protocol-based access, user authentication and user-based access restrictions, role-based access control, and encryption/decryption, EHBs provide the level of security required for confidential data transfer. EHBs contain electronic forms and menus, which can be used in real time to execute the described processes. EHBs use standard word processors that generate ASCII HTML code to set up electronic forms that are viewed within a web browser. EHBs require no end-user software distribution, significantly reducing operating costs. Each interactive handbook simulates a hard-copy version containing chapters with descriptions of participants' roles in the online process.
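The role-based access control the abstract mentions can be pictured as a role-to-permission table consulted on every request; the roles and permissions below are invented for illustration and are not EHB's actual scheme:

```python
# Hypothetical role/permission table of the kind an EHB deployment might define.
ROLE_PERMISSIONS = {
    "proposer":        {"read_own", "submit_form"},
    "program_manager": {"read_own", "read_all", "approve"},
}

def authorize(role, action):
    """Return True if the given role is permitted to perform the action."""
    return action in ROLE_PERMISSIONS.get(role, set())

assert authorize("program_manager", "approve")
assert not authorize("proposer", "read_all")
```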
A Framework to Determine New System Requirements Under Design Parameter and Demand Uncertainties
2015-04-30
relegates quantitative complexities of decision-making to the method and designates trade-space exploration to the practitioner. We demonstrate the approach...play a critical role in determining new system requirements. Scope and Method of Approach: The early stages of the design process have substantial
Automated synthesis of image processing procedures using AI planning techniques
NASA Technical Reports Server (NTRS)
Chien, Steve; Mortensen, Helen
1994-01-01
This paper describes the Multimission VICAR (Video Image Communication and Retrieval) Planner (MVP) (Chien 1994) system, which uses artificial intelligence planning techniques (Iwasaki & Friedland, 1985; Penberthy & Weld, 1992; Stefik, 1981) to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing subprograms) in response to image processing requests made to the JPL Multimission Image Processing Laboratory (MIPL). The MVP system allows the user to specify the image processing requirements in terms of the various types of correction required. Given this information, MVP derives the unspecified required processing steps and determines appropriate image processing programs and parameters to achieve the specified image processing goals. This information is output as an executable image processing program which can then be executed to fill the processing request.
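The planning idea can be sketched as forward search over operators with preconditions and effects, chaining processing steps until the request's goals hold. The operators and goal facts below are invented stand-ins, not MVP's actual VICAR models:

```python
from collections import deque

# name -> (preconditions, effects); toy image-processing operators.
OPERATORS = {
    "radiometric_correct": ({"raw"}, {"radiometric_ok"}),
    "geometric_correct":   ({"radiometric_ok"}, {"geometric_ok"}),
    "mosaic":              ({"geometric_ok"}, {"mosaicked"}),
}

def plan(initial, goal):
    """Breadth-first forward search for an operator sequence achieving the goal."""
    frontier = deque([(frozenset(initial), [])])
    seen = {frozenset(initial)}
    while frontier:
        state, steps = frontier.popleft()
        if goal <= state:
            return steps
        for name, (pre, eff) in OPERATORS.items():
            if pre <= state:
                nxt = frozenset(state | eff)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, steps + [name]))
    return None

print(plan({"raw"}, {"mosaicked"}))
```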
Thinking graphically: Connecting vision and cognition during graph comprehension.
Ratwani, Raj M; Trafton, J Gregory; Boehm-Davis, Deborah A
2008-03-01
Task analytic theories of graph comprehension account for the perceptual and conceptual processes required to extract specific information from graphs. Comparatively, the processes underlying information integration have received less attention. We propose a new framework for information integration that highlights visual integration and cognitive integration. During visual integration, pattern recognition processes are used to form visual clusters of information; these visual clusters are then used to reason about the graph during cognitive integration. In 3 experiments, the processes required to extract specific information and to integrate information were examined by collecting verbal protocol and eye movement data. Results supported the task analytic theories for specific information extraction and the processes of visual and cognitive integration for integrative questions. Further, the integrative processes scaled up as graph complexity increased, highlighting the importance of these processes for integration in more complex graphs. Finally, based on this framework, design principles to improve both visual and cognitive integration are described. PsycINFO Database Record (c) 2008 APA, all rights reserved
Primordial Evolution in the Finitary Process Soup
NASA Astrophysics Data System (ADS)
Görnerup, Olof; Crutchfield, James P.
A general and basic model of primordial evolution—a soup of reacting finitary and discrete processes—is employed to identify and analyze fundamental mechanisms that generate and maintain complex structures in prebiotic systems. The processes—ɛ-machines as defined in computational mechanics—and their interaction networks both provide well defined notions of structure. This enables us to quantitatively demonstrate hierarchical self-organization in the soup in terms of complexity. We found that replicating processes evolve the strategy of successively building higher levels of organization by autocatalysis. Moreover, this is facilitated by local components that have low structural complexity, but high generality. In effect, the finitary process soup spontaneously evolves a selection pressure that favors such components. In light of the finitary process soup's generality, these results suggest a fundamental law of hierarchical systems: global complexity requires local simplicity.
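For readers unfamiliar with ε-machines, here is a minimal illustrative sketch (not the authors' simulation code): it encodes a small ε-machine as labeled probabilistic transitions and computes its statistical complexity as the Shannon entropy of the stationary state distribution, which is the quantitative notion of structure the abstract refers to. The two-state machine below is the standard "even process" example.

```python
import numpy as np

# transitions[state][symbol] = (next_state, probability)
transitions = {
    "A": {"0": ("A", 0.5), "1": ("B", 0.5)},
    "B": {"1": ("A", 1.0)},
}

states = sorted(transitions)
T = np.zeros((len(states), len(states)))      # state-to-state transition matrix
for i, s in enumerate(states):
    for sym, (t, p) in transitions[s].items():
        T[i, states.index(t)] += p

# stationary distribution: left eigenvector of T for eigenvalue 1
w, v = np.linalg.eig(T.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

complexity = -sum(p * np.log2(p) for p in pi if p > 0)
print(f"statistical complexity: {complexity:.3f} bits")   # ~0.918 bits
```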
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-30
... Relating to the Complex Order Auction Process October 24, 2012. I. Introduction On August 30, 2012, the... Complex Order RFR Auction,'' to: (i) Include the side of the market in the request for response (``RFR'') message sent to Trading Permit Holders at the start of a Complex Order Auction (``COA''); and (ii) require...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-30
... Organizations; C2 Options Exchange, Incorporated; Order Approving Proposed Rule Change Relating to the Complex...,\\2\\ a proposed rule change to modify C2 Rule 6.13(c), ``Process for Complex Order RFR Auction,'' to... at the start of a Complex Order Auction (``COA''); and (ii) require responses to an RFR message...
de Lusignan, Simon; Cashman, Josephine; Poh, Norman; Michalakidis, Georgios; Mason, Aaron; Desombre, Terry; Krause, Paul
2012-01-01
Medical research increasingly requires the linkage of data from different sources. Conducting a requirements analysis for a new application is an established part of software engineering, but rarely reported in the biomedical literature; and no generic approaches have been published as to how to link heterogeneous health data. Literature review, followed by a consensus process, was used to define how requirements for research using multiple data sources might be modeled. We have developed a requirements analysis process, i-ScheDULEs. The first components of the modeling process are indexing and creating a rich picture of the research study. Secondly, we developed a series of reference models of progressive complexity: data flow diagrams (DFD) to define data requirements; unified modeling language (UML) use case diagrams to capture study-specific and governance requirements; and finally, business process models, using business process modeling notation (BPMN). These requirements and their associated models should become part of research study protocols.
Lignocellulose hydrolysis by multienzyme complexes
USDA-ARS?s Scientific Manuscript database
Lignocellulosic biomass is the most abundant renewable resource on the planet. Converting this material into a usable fuel is a multi-step process, the rate-limiting step being enzymatic hydrolysis of organic polymers into monomeric sugars. While the substrate can be complex and require a multitud...
Transformations of software design and code may lead to reduced errors
NASA Technical Reports Server (NTRS)
Connelly, E. M.
1983-01-01
The capability of programmers and non-programmers to specify problem solutions by developing example solutions, and of the programmers to do so by writing computer programs, was investigated; each method of specification was carried out at various levels of problem complexity. The level of difficulty of each problem was reflected by the number of steps the user needed to develop a solution. Machine processing of the user inputs permitted inferences to be drawn about the algorithms required to solve a particular problem. The interactive feedback of processing results led users to a more precise definition of the desired solution. Two participant groups (programmers and bookkeepers/accountants) worked with three levels of problem complexity and three levels of processor complexity. The experimental task required specification of a logic for solving a Navy task force problem.
NASA Astrophysics Data System (ADS)
Puzyrkov, Dmitry; Polyakov, Sergey; Podryga, Viktoriia; Markizov, Sergey
2018-02-01
At the present stage of computer technology development it is possible to study the properties and processes of complex systems at the molecular and even atomic levels, for example by means of molecular dynamics methods. The most interesting problems concern complex processes under real physical conditions. Solving such problems requires the use of high-performance computing systems of various types, for example GRID systems and HPC clusters. Given how time-consuming these computational tasks are, software is needed for automatic and unified monitoring of such computations. A complex computational task may be performed across different HPC systems, which requires output data synchronization between the storage chosen by the scientist and the HPC system used for the computations. The design of the computational domain is also a substantial problem, requiring complex software tools and algorithms for proper atomistic data generation on HPC systems. The paper describes the prototype of a cloud service intended for the design of large atomistic systems for further detailed molecular dynamics calculations and for the management of those calculations, and presents the part of its concept aimed at initial data generation on HPC systems.
Unstructured Cartesian/prismatic grid generation for complex geometries
NASA Technical Reports Server (NTRS)
Karman, Steve L., Jr.
1995-01-01
The generation of a hybrid grid system for discretizing complex three dimensional (3D) geometries is described. The primary grid system is an unstructured Cartesian grid automatically generated using recursive cell subdivision. This grid system is sufficient for computing Euler solutions about extremely complex 3D geometries. A secondary grid system, using triangular-prismatic elements, may be added for resolving the boundary layer region of viscous flows near surfaces of solid bodies. This paper describes the grid generation processes used to generate each grid type. Several example grids are shown, demonstrating the ability of the method to discretize complex geometries, with very little pre-processing required by the user.
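The recursive cell subdivision mentioned above can be sketched in two dimensions as a quadtree. This is a toy analogue under stated assumptions: the paper's grids are 3D and geometry-driven, whereas here a circle stands in for a body surface and crosses_circle is an invented refinement criterion.

```python
import math

def crosses_circle(cell, cx=0.5, cy=0.5, r=0.3):
    """True if the circle boundary passes through the square cell (x, y, size)."""
    x, y, s = cell
    dx = max(x - cx, 0.0, cx - (x + s))        # centre-to-cell distance in x
    dy = max(y - cy, 0.0, cy - (y + s))
    nearest = math.hypot(dx, dy)
    farthest = max(math.hypot(px - cx, py - cy)
                   for px in (x, x + s) for py in (y, y + s))
    return nearest <= r <= farthest

def subdivide(cell, depth, max_depth, needs_refinement):
    """Recursively split a square cell into four children near the geometry."""
    if depth == max_depth or not needs_refinement(cell):
        return [cell]                           # keep as a leaf cell
    x, y, s = cell
    h = s / 2
    leaves = []
    for child in [(x, y, h), (x + h, y, h), (x, y + h, h), (x + h, y + h, h)]:
        leaves += subdivide(child, depth + 1, max_depth, needs_refinement)
    return leaves

leaves = subdivide((0.0, 0.0, 1.0), 0, 6, crosses_circle)
print(len(leaves), "leaf cells")
```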
Investigation of model-based physical design restrictions (Invited Paper)
NASA Astrophysics Data System (ADS)
Lucas, Kevin; Baron, Stanislas; Belledent, Jerome; Boone, Robert; Borjon, Amandine; Couderc, Christophe; Patterson, Kyle; Riviere-Cazaux, Lionel; Rody, Yves; Sundermann, Frank; Toublan, Olivier; Trouiller, Yorick; Urbani, Jean-Christophe; Wimmer, Karl
2005-05-01
As lithography and other patterning processes become more complex and more non-linear with each generation, the task of defining physical design rules necessarily increases in complexity as well. The goal of the physical design rules is to define the boundary between the physical layout structures that will yield well and those that will not. This is essentially a rule-based pre-silicon guarantee of layout correctness. However, the rapid increase in design rule requirement complexity has created logistical problems for both the design and process functions. Therefore, similar to the semiconductor industry's transition from rule-based to model-based optical proximity correction (OPC) due to increased patterning complexity, opportunities for improving physical design restrictions by implementing model-based physical design methods are evident. In this paper we analyze the possible need and applications for model-based physical design restrictions (MBPDR). We first analyze the traditional design rule evolution, development and usage methodologies of semiconductor manufacturers. Next we discuss examples of specific design rule challenges requiring new solution methods in the patterning regime of low-K1 lithography and highly complex RET. We then evaluate possible working strategies for MBPDR in the process development and product design flows, including examples of recent model-based pre-silicon verification techniques. Finally we summarize with a proposed flow and key considerations for MBPDR implementation.
New levels of language processing complexity and organization revealed by granger causation.
Gow, David W; Caplan, David N
2012-01-01
Granger causation analysis of high spatiotemporal resolution reconstructions of brain activation offers a new window on the dynamic interactions between brain areas that support language processing. Premised on the observation that causes both precede and uniquely predict their effects, this approach provides an intuitive, model-free means of identifying directed causal interactions in the brain. It requires the analysis of all non-redundant potentially interacting signals, and has shown that even "early" processes such as speech perception involve interactions of many areas in a strikingly large network, extending well beyond traditional left-hemisphere perisylvian cortex, that play out over hundreds of milliseconds. In this paper we describe this technique and review several general findings that reframe the way we think about language processing and brain function in general. These include the extent and complexity of language processing networks, the central role of interactive processing dynamics, the role of processing hubs where the input from many distinct brain regions is integrated, and the degree to which task requirements and stimulus properties influence processing dynamics and inform our understanding of "language-specific" localized processes.
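As a generic illustration of the pairwise Granger test underlying this kind of analysis (the paper applies it to many simultaneous source-localized brain signals, not a two-series toy), a sketch using statsmodels might look like the following; the synthetic lag-2 coupling is an assumption made purely for the example.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 500
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.6 * x[t - 2] + 0.2 * rng.standard_normal()   # y lags x by 2 samples

# column 0 is the candidate effect, column 1 the candidate cause
data = np.column_stack([y, x])
results = grangercausalitytests(data, maxlag=3)
# small F-test p-values at lags >= 2 indicate that x Granger-causes y
```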
Moribe, Kunikazu; Tozuka, Yuichi; Yamamoto, Keiji
2008-02-14
Supercritical fluid techniques have been exploited in extraction, separation and crystallization processes. In the field of pharmaceutics, supercritical carbon dioxide (scCO2) has been used for micronization, polymorphic control, and the preparation of solid dispersions and complexes. Particle design of active pharmaceutical ingredients is important for making solid dosage forms with suitable physicochemical properties. Control of characteristic particle properties, such as size, shape, crystal structure and morphology, is required to optimize the formulation. For solubility enhancement of poorly water-soluble drugs, preparation of a solid dispersion or complexation with suitable drugs or excipients is a promising approach. This review focuses on polymorphic control and the complexation behavior of active pharmaceutical ingredients under scCO2 processing.
Temporal texture of associative encoding modulates recall processes.
Tibon, Roni; Levy, Daniel A
2014-02-01
Binding aspects of an experience that are distributed over time is an important element of episodic memory. In the current study, we examined how the temporal complexity of an experience may govern the processes required for its retrieval. We recorded event-related potentials during episodic cued recall following pair associate learning of concurrently and sequentially presented object-picture pairs. Cued recall success effects over anterior and posterior areas were apparent in several time windows. In anterior locations, these recall success effects were similar for concurrently and sequentially encoded pairs. However, in posterior sites clustered over parietal scalp the effect was larger for the retrieval of sequentially encoded pairs. We suggest that anterior aspects of the mid-latency recall success effects may reflect working-with-memory operations or direct access recall processes, while more posterior aspects reflect recollective processes which are required for retrieval of episodes of greater temporal complexity.
Model-Driven Useware Engineering
NASA Astrophysics Data System (ADS)
Meixner, Gerrit; Seissler, Marc; Breiner, Kai
User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.
Healthcare software assurance.
Cooper, Jason G; Pauley, Keith A
2006-01-01
Software assurance is a rigorous, lifecycle phase-independent set of activities which ensure completeness, safety, and reliability of software processes and products. This is accomplished by guaranteeing conformance to all requirements, standards, procedures, and regulations. These assurance processes are even more important when coupled with healthcare software systems, embedded software in medical instrumentation, and other healthcare-oriented life-critical systems. The current Food and Drug Administration (FDA) regulatory requirements and guidance documentation do not address certain aspects of complete software assurance activities. In addition, the FDA's software oversight processes require enhancement to include increasingly complex healthcare systems such as Hospital Information Systems (HIS). The importance of complete software assurance is introduced, current regulatory requirements and guidance are discussed, and the necessity for enhancements to the current processes is highlighted.
ERIC Educational Resources Information Center
Cepeda, Nicholas J.; Blackwell, Katharine A.; Munakata, Yuko
2013-01-01
The rate at which people process information appears to influence many aspects of cognition across the lifespan. However, many commonly accepted measures of "processing speed" may require goal maintenance, manipulation of information in working memory, and decision-making, blurring the distinction between processing speed and executive…
Analyses of the chemical composition of complex DBP mixtures, produced by different drinking water treatment processes, are essential to generate toxicity data required for assessing their risks to humans. For mixture risk assessments, whole mixture toxicology studies generally a...
Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems
NASA Technical Reports Server (NTRS)
Berg, Melanie; LaBel, Kenneth
2016-01-01
If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.
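As background for readers, the basic building block TMR relies on is a 2-of-3 majority voter; the sketch below is a generic illustration of that idea, not the verification method proposed in the paper.

```python
def tmr_vote(a: int, b: int, c: int) -> int:
    """Bitwise 2-of-3 majority of three redundant module outputs."""
    return (a & b) | (b & c) | (a & c)

# a single corrupted replica is masked by the other two
assert tmr_vote(0b1010, 0b1010, 0b0010) == 0b1010
```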
JPL Counterfeit Parts Avoidance
NASA Technical Reports Server (NTRS)
Risse, Lori
2012-01-01
SPACE ARCHITECTURE / ENGINEERING: Space provides an extreme test bed for both technologies/concepts and procedures/processes. Design and construction (engineering) always go together, especially with complex systems. Requirements (objectives) are crucial. More important than the answers are the questions/requirements/tools-techniques/processes. Different environments force architects and engineers to think out of the box; for instance, there might be no gravity forces. Architectural complex problems have common roots, in Space and on Earth. Let us bring Space down to Earth so we can keep sending Mankind to the stars from a better world. Have fun being architects and engineers! This time is amazing and historical. We are changing the way we inhabit the solar system!
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, Shekhar; Koganti, S.B.
2008-07-01
Acetohydroxamic acid (AHA) is a novel complexant for recycle of nuclear-fuel materials. It can be used in ordinary centrifugal extractors, eliminating the need for electro-redox equipment or complex maintenance requirements in a remotely maintained hot cell. In this work, the effect of AHA on Pu(IV) distribution ratios in a 30% TBP system was quantified, modeled, and integrated in the SIMPSEX code. Two sets of batch experiments involving macro Pu concentrations (conducted at IGCAR) and one high-Pu flowsheet (literature) were simulated for AHA-based U-Pu separation. Based on the simulation and validation results, AHA-based next-generation reprocessing flowsheets are proposed for co-processing based FBR and thermal-fuel reprocessing as well as for the evaporator-less macro-level Pu concentration process required for MOX fuel fabrication. Utilization of AHA results in significant simplification in plant design and simpler technology implementations with significant cost savings. (authors)
NASA Technical Reports Server (NTRS)
Schilling, D. L.
1974-01-01
Digital multiplication of two waveforms using delta modulation (DM) is discussed. It is shown that while conventional multiplication of two N-bit words requires N^2 complexity, multiplication using DM requires complexity which increases linearly with N. Bounds on the signal-to-quantization noise ratio (SNR) resulting from this multiplication are determined and compared with the SNR obtained using standard multiplication techniques. The phase locked loop (PLL) system, consisting of a phase detector, voltage controlled oscillator, and a linear loop filter, is discussed in terms of its design and system advantages. Areas requiring further research are identified.
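To make the delta-modulation bit-stream idea concrete, here is a toy linear DM encoder/decoder sketch; the DM multiplier analyzed in the paper is not reproduced here, and the step size is an arbitrary choice for the example.

```python
import math

def dm_encode(samples, step=0.1):
    """Track the input with +/- step; emit 1 when stepping up, 0 when down."""
    bits, estimate = [], 0.0
    for s in samples:
        bit = 1 if s >= estimate else 0
        estimate += step if bit else -step
        bits.append(bit)
    return bits

def dm_decode(bits, step=0.1):
    """Rebuild the staircase approximation from the bit stream."""
    out, estimate = [], 0.0
    for bit in bits:
        estimate += step if bit else -step
        out.append(estimate)
    return out

signal = [math.sin(2 * math.pi * t / 50) for t in range(200)]
recovered = dm_decode(dm_encode(signal))   # staircase tracking the sine
```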
DigiMemo: Facilitating the Note Taking Process
ERIC Educational Resources Information Center
Kurt, Serhat
2009-01-01
Everyone takes notes daily for various reasons. Note taking is very popular in school settings and generally recognized as an effective learning strategy. Further, note taking is a complex process because it requires understanding, selection of information and writing. Some new technological tools may facilitate the note taking process. Among such…
Near-Term Fetuses Process Temporal Features of Speech
ERIC Educational Resources Information Center
Granier-Deferre, Carolyn; Ribeiro, Aurelie; Jacquet, Anne-Yvonne; Bassereau, Sophie
2011-01-01
The perception of speech and music requires processing of variations in spectra and amplitude over different time intervals. Near-term fetuses can discriminate acoustic features, such as frequencies and spectra, but whether they can process complex auditory streams, such as speech sequences and more specifically their temporal variations, fast or…
Unified Approximations: A New Approach for Monoprotic Weak Acid-Base Equilibria
ERIC Educational Resources Information Center
Pardue, Harry; Odeh, Ihab N.; Tesfai, Teweldemedhin M.
2004-01-01
The unified approximations reduce the conceptual complexity by combining solutions for a relatively large number of different situations into just two similar sets of processes. Processes used to solve problems by either the unified or classical approximations require similar degrees of understanding of the underlying chemical processes.
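As a generic worked example of the kind of monoprotic calculation such approximations target (not the authors' unified formulas), one can compare the exact quadratic solution for [H+] with the simplest approximation sqrt(Ka*Ca); the acetic acid numbers below are standard textbook values.

```python
import math

Ka, Ca = 1.8e-5, 0.10               # acetic acid, 0.10 M solution
# Ka = h^2 / (Ca - h)  ->  h^2 + Ka*h - Ka*Ca = 0
h = (-Ka + math.sqrt(Ka * Ka + 4 * Ka * Ca)) / 2
print(f"pH (quadratic)    = {-math.log10(h):.2f}")                     # 2.88
print(f"pH (sqrt(Ka*Ca))  = {-math.log10(math.sqrt(Ka * Ca)):.2f}")    # 2.87
```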
NASA Astrophysics Data System (ADS)
Aksenova, Olesya; Pachkina, Anna
2017-11-01
The article addresses the need to transform the educational process to meet the requirements of the modern mining industry: cooperative development of new educational programs and implementation of the educational process taking modern manufacturing capability into account. The paper argues for introducing into the training of mining professionals the study of three-dimensional models of the surface technological complex, ore reserves and underground workings, as well as the creation of these models in different graphic editors and work with the information-analysis model obtained on the basis of these three-dimensional models. The paper covers the technological process of unmanned coal mining at the Polysaevskaya mine, controlled by information-analysis models built from three-dimensional models of individual objects and of the technological process as a whole, which at the same time requires staff able to use three-dimensional positioning programs in the global frame of reference for miners and equipment.
ATP synthase promotes germ cell differentiation independent of oxidative phosphorylation
Teixeira, Felipe K.; Sanchez, Carlos G.; Hurd, Thomas R.; Seifert, Jessica R. K.; Czech, Benjamin; Preall, Jonathan B.; Hannon, Gregory J.; Lehmann, Ruth
2015-01-01
The differentiation of stem cells is a tightly regulated process essential for animal development and tissue homeostasis. Through this process, attainment of new identity and function is achieved by marked changes in cellular properties. Intrinsic cellular mechanisms governing stem cell differentiation remain largely unknown, in part because systematic forward genetic approaches to the problem have not been widely used [1, 2]. Analysing genes required for germline stem cell differentiation in the Drosophila ovary, we find that the mitochondrial ATP synthase plays a critical role in this process. Unexpectedly, the ATP synthesizing function of this complex was not necessary for differentiation, as knockdown of other members of the oxidative phosphorylation system did not disrupt the process. Instead, the ATP synthase acted to promote the maturation of mitochondrial cristae during differentiation through dimerization and specific upregulation of the ATP synthase complex. Taken together, our results suggest that ATP synthase-dependent crista maturation is a key developmental process required for differentiation independent of oxidative phosphorylation. PMID:25915123
NASA Astrophysics Data System (ADS)
Nawani, Jigna; Rixius, Julia; Neuhaus, Birgit J.
2016-08-01
Empirical analysis of secondary biology classrooms revealed that, on average, 68% of teaching time in Germany revolved around processing tasks. Quality of instruction can thus be assessed by analyzing the quality of tasks used in classroom discourse. This quasi-experimental study analyzed how teachers used tasks in 38 videotaped biology lessons pertaining to the topic 'blood and circulatory system'. Two fundamental characteristics were used to analyze tasks: (1) the required cognitive level of processing (e.g. low-level information processing: repetition, summarizing, defining, classifying; high-level information processing: interpreting and analyzing data, formulating hypotheses, etc.) and (2) the complexity of task content (e.g. whether tasks require use of factual, linking, or concept-level content). Additionally, students' cognitive knowledge structure about the topic 'blood and circulatory system' was measured using student-drawn concept maps (N = 970 students). Finally, linear multilevel models were created with high-level cognitive processing tasks and higher content complexity tasks as class-level predictors and students' prior knowledge, students' interest in biology, and students' interest in biology activities as control covariates. Results showed a positive influence of high-level cognitive processing tasks (β = 0.07; p < .01) on students' cognitive knowledge structure. However, there was no observed effect of higher content complexity tasks on students' cognitive knowledge structure. The presented findings encourage the use of high-level cognitive processing tasks in biology instruction.
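A sketch of the kind of two-level model described above, with classes as the grouping level, fitted on synthetic data via statsmodels; all variable names and effect sizes are illustrative assumptions, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_classes, per_class = 10, 20
class_id = np.repeat(np.arange(n_classes), per_class)
# class-level share of high-level cognitive processing tasks (invented)
high_tasks = np.repeat(rng.uniform(0, 1, n_classes), per_class)
prior = rng.uniform(0, 1, n_classes * per_class)
knowledge = (0.07 * high_tasks + 0.5 * prior
             + rng.normal(0, 0.1, n_classes * per_class))

df = pd.DataFrame({"knowledge": knowledge, "high_level_tasks": high_tasks,
                   "prior_knowledge": prior, "class_id": class_id})

# random intercept per class, fixed effects for the two predictors
model = smf.mixedlm("knowledge ~ high_level_tasks + prior_knowledge",
                    df, groups=df["class_id"])
print(model.fit().summary())
```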
HDAC3 and the Molecular Brake Pad Hypothesis
McQuown, Susan C.; Wood, Marcelo A.
2011-01-01
Successful transcription of specific genes required for long-term memory processes involves the orchestrated effort of not only transcription factors, but also very specific enzymatic protein complexes that modify chromatin structure. Chromatin modification has been identified as a pivotal molecular mechanism underlying certain forms of synaptic plasticity and memory. The best-studied form of chromatin modification in the learning and memory field is histone acetylation, which is regulated by histone acetyltransferases and histone deacetylases (HDACs). HDAC inhibitors have been shown to strongly enhance long-term memory processes, and recent work has aimed to identify contributions of individual HDACs. In this review, we focus on HDAC3 and discuss its recently defined role as a negative regulator of long-term memory formation. HDAC3 is part of a corepressor complex and has direct interactions with class II HDACs that may be important for its molecular and behavioral consequences. And last, we propose the “molecular brake pad” hypothesis of HDAC function. The HDACs and associated corepressor complexes may function in neurons, in part, as “molecular brake pads.” HDACs are localized to promoters of active genes and act as a persistent clamp that requires strong activity-dependent signaling to temporarily release these complexes (or brake pads) to activate gene expression required for long-term memory formation. Thus, HDAC inhibition removes the “molecular brake pads” constraining the processes necessary for long-term memory and results in strong, persistent memory formation. PMID:21521655
Developing Organizational Adaptability for Complex Environment
ERIC Educational Resources Information Center
Boylan, Steven A.; Turner, Kenneth A.
2017-01-01
Developing organizations capable of adapting requires leaders to set conditions. Setting conditions normally requires purposeful activities by the leadership to foster and develop leader and individual adaptability, supported by processes and activities that enable adaptive behaviors through the totality of the organization (Goldstein, Hazy, &…
Moving Students to Deeper Learning in Leadership
ERIC Educational Resources Information Center
Stover, Sheri; Seemiller, Corey
2017-01-01
The world is a volatile, uncertain, complex, and ambiguous (VUCA) environment (Carvan, 2015) that calls for leaders who can effectively navigate the complexity of leadership today. Students of leadership studies must not only learn leadership information content, but also be able to effectively implement the content and process, requiring deep…
ERIC Educational Resources Information Center
Collins, Steve; Ting, Hermia
2014-01-01
The profession of teaching is unique because of the extent to which a teacher becomes involved in the lives of their "clients". The level of care required to support students well can be intense, confusing, and overwhelming. Relationships co-evolve within an ever-changing process and care is considered an essential aspect of complex relationships…
Tension as an Enabling Characteristic of Innovating in Schools
ERIC Educational Resources Information Center
Perillo, Suzanne
2007-01-01
Purpose: The purpose of this paper is to argue that school innovation is a complex process requiring a detailed accounting of the relational activity characterising everyday innovating activity. It is further proposed that complex accounts of innovation practice that describe social factors only are insufficient. Design/methodology/approach: Using…
The Ecology of Role Play: Intentionality and Cultural Evolution
ERIC Educational Resources Information Center
Papadopoulou, Marianna
2012-01-01
This study examines the evolutionary function of children's pretence. The everyday, cultural environment that children engage with is of a highly complex structure. Human adaptation, thus, becomes, by analogy, an equally complex process that requires the development of life skills. Whilst in role play children engage in "mimesis" and…
When Time Makes a Difference: Addressing Ergodicity and Complexity in Education
ERIC Educational Resources Information Center
Koopmans, Matthijs
2015-01-01
The detection of complexity in behavioral outcomes often requires an estimation of their variability over a prolonged time spectrum to assess processes of stability and transformation. Conventional scholarship typically relies on time-independent measures, "snapshots", to analyze those outcomes, assuming that group means and their…
Engineering Complex Embedded Systems with State Analysis and the Mission Data System
NASA Technical Reports Server (NTRS)
Ingham, Michel D.; Rasmussen, Robert D.; Bennett, Matthew B.; Moncada, Alex C.
2004-01-01
It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering methodology called State Analysis, which provides a process for capturing system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using State Analysis and how these requirements inform the design of the system software, using representative spacecraft examples.
Practical Unitary Simulator for Non-Markovian Complex Processes
NASA Astrophysics Data System (ADS)
Binder, Felix C.; Thompson, Jayne; Gu, Mile
2018-06-01
Stochastic processes are as ubiquitous throughout the quantitative sciences as they are notorious for being difficult to simulate and predict. In this Letter, we propose a unitary quantum simulator for discrete-time stochastic processes which requires less internal memory than any classical analogue throughout the simulation. The simulator's internal memory requirements equal those of the best previous quantum models. However, in contrast to previous models, it only requires a (small) finite-dimensional Hilbert space. Moreover, since the simulator operates unitarily throughout, it avoids any unnecessary information loss. We provide a stepwise construction for simulators for a large class of stochastic processes hence directly opening the possibility for experimental implementations with current platforms for quantum computation. The results are illustrated for an example process.
Hess, Nancy J.; Pasa-Tolic, Ljiljana; Bailey, Vanessa L.; ...
2017-04-12
Understanding the role played by microorganisms within soil systems is challenged by the unique intersection of physics, chemistry, mineralogy and biology in fostering habitat for soil microbial communities. To address these challenges will require observations across multiple spatial and temporal scales to capture the dynamics and emergent behavior from complex and interdependent processes. The heterogeneity and complexity of the rhizosphere require advanced techniques that press the simultaneous frontiers of spatial resolution, analyte sensitivity and specificity, reproducibility, large dynamic range, and high throughput. Fortunately many exciting technical advancements are now available to inform and guide the development of new hypotheses. The aim of this Special issue is to provide a holistic view of the rhizosphere in the perspective of modern molecular biology methodologies that enable a highly focused, detailed view of the processes in the rhizosphere, including numerous, strong and complex interactions between plant roots, soil constituents and microorganisms. We discuss the current rhizosphere research challenges and knowledge gaps, as well as perspectives and approaches using newly available state-of-the-art toolboxes. These new approaches and methodologies allow the study of rhizosphere processes and properties, and the rhizosphere as a central component of ecosystems and biogeochemical cycles.
NASA Technical Reports Server (NTRS)
Sills, Joel W., Jr.; Griffin, Thomas J. (Technical Monitor)
2001-01-01
The Hubble Space Telescope (HST) Disturbance Verification Test (DVT) was conducted to characterize the responses of the Observatory's new set of rigid solar arrays (SA3) to thermally induced 'creak' or stiction releases. The data acquired in the DVT were used in verification of the HST Pointing Control System on-orbit performance, post-Servicing Mission 3B (SM3B). The test simulated the on-orbit environment on a deployed SA3 flight wing. Instrumentation for this test required pretest simulations in order to select the correct sensitivities. Vacuum-compatible, highly accurate accelerometers and force gages were used for this test. The complexity of the test, as well as a short planning schedule, required a data acquisition system that was easy to configure, highly flexible, and extremely robust. A PC Windows-oriented data acquisition system meets these requirements, allowing the test engineers to minimize the time required to plan and perform complex environmental tests. The SA3 DVT provided a direct, practical, and complex demonstration of the versatility that PC-based data acquisition systems provide. Two PC-based data acquisition systems were assembled to acquire, process, distribute, and provide real-time processing for several types of transducers used in the SA3 DVT. A high-sample-rate digital tape recorder was used to archive the sensor signals. The two systems provided multi-channel hardware and software architecture and were selected based on the test requirements. How these systems acquire and process multiple data rates from different transducer types is discussed, along with the system hardware and software architecture.
An intelligent approach to welding robot selection
NASA Astrophysics Data System (ADS)
Milano, J.; Mauk, S. D.; Flitter, L.; Morris, R.
1993-10-01
In a shipyard where multiple stationary and mobile workcells are employed in the fabrication of components of complex sub-assemblies, efficient operation requires an intelligent method of scheduling jobs and selecting workcells based on optimum throughput and cost. Achieving this global solution requires the successful organization of resource availability, process requirements, and process constraints. The Off-line Planner (OLP) of the Programmable Automated Weld System (PAWS) is capable of advanced modeling of weld processes and environments as well as the generation of complete weld procedures. These capabilities involve the integration of advanced Computer Aided Design (CAD), path planning, and obstacle detection and avoidance techniques, as well as the synthesis of complex design and process information. These existing capabilities provide the basis of the functionality required for the successful implementation of an intelligent weld robot selector and material flow planner. Current efforts are focused on robot selection via the dynamic routing of components to the appropriate workcells. It is proposed that this problem is a variant of the "Traveling Salesman Problem" (TSP), which has been proven to belong to a larger set of optimization problems termed nondeterministic polynomial complete (NP-complete). In this paper, a heuristic approach utilizing recurrent neural networks is explored as a rapid means of producing a near-optimal, if not optimal, weld robot selection.
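The paper explores a recurrent (Hopfield-style) network for this routing problem; as a far simpler stand-in that conveys the optimization being solved, the sketch below exhaustively scores job-to-workcell assignments using invented cost figures. Exhaustive search is only workable at toy scale, which is precisely why the NP-completeness noted above motivates heuristics.

```python
import itertools

jobs = ["panel_A", "panel_B", "stiffener"]          # hypothetical jobs
workcells = ["cell_1", "cell_2"]                    # hypothetical workcells
cost = {  # (job, cell) -> estimated processing cost, all values invented
    ("panel_A", "cell_1"): 3.0, ("panel_A", "cell_2"): 4.5,
    ("panel_B", "cell_1"): 2.5, ("panel_B", "cell_2"): 2.0,
    ("stiffener", "cell_1"): 1.0, ("stiffener", "cell_2"): 1.5,
}

best = min(itertools.product(workcells, repeat=len(jobs)),
           key=lambda assign: sum(cost[(j, c)] for j, c in zip(jobs, assign)))
print(dict(zip(jobs, best)))    # cheapest routing of each job to a cell
```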
Manzano, David; Marquardt, Sebastian; Jones, Alexandra M. E.; Bäurle, Isabel; Liu, Fuquan; Dean, Caroline
2009-01-01
The role of RNA metabolism in chromatin silencing is now widely recognized. We have studied the Arabidopsis RNA-binding protein FCA that down-regulates an endogenous floral repressor gene through a chromatin mechanism involving histone demethylase activity. This mechanism needs FCA to interact with an RNA 3′ processing/polyadenylation factor (FY/Pfs2p), but the subsequent events leading to chromatin changes are unknown. Here, we show that this FCA–FY interaction is required for general chromatin silencing roles where hairpin transgenes induce DNA methylation of an endogenous gene. We also show 2 conserved RNA processing factors, AtCPSF100 and AtCPSF160, but not FCA, are stably associated with FY in vivo and form a range of different-sized complexes. A hypomorphic fy allele producing a shorter protein, able to provide some FY functions but unable to interact with FCA, reduces abundance of some of the larger MW complexes. Suppressor mutants, which specifically disrupt the FY motif through which FCA interacts, also lacked these larger complexes. Our data support a model whereby FCA, perhaps after recognition of a specific RNA feature, transiently interacts with FY, an integral component of the canonical RNA 3′ processing machinery, changing the interactions of the different RNA processing components. These altered interactions would appear to be a necessary step in this RNA-mediated chromatin silencing. PMID:19439664
Calibration of 3D ALE finite element model from experiments on friction stir welding of lap joints
NASA Astrophysics Data System (ADS)
Fourment, Lionel; Gastebois, Sabrina; Dubourg, Laurent
2016-10-01
In order to support the design of such a complex process like Friction Stir Welding (FSW) for the aeronautic industry, numerical simulation software requires (1) developing an efficient and accurate Finite Element (F.E.) formulation that allows predicting welding defects, (2) properly modeling the thermo-mechanical complexity of the FSW process and (3) calibrating the F.E. model from accurate measurements from FSW experiments. This work uses a parallel ALE formulation developed in the Forge® F.E. code to model the different possible defects (flashes and worm holes), while pin and shoulder threads are modeled by a new friction law at the tool / material interface. FSW experiments require using a complex tool with scroll on shoulder, which is instrumented for providing sensitive thermal data close to the joint. Calibration of unknown material thermal coefficients, constitutive equations parameters and friction model from measured forces, torques and temperatures is carried out using two F.E. models, Eulerian and ALE, to reach a satisfactory agreement assessed by the proper sensitivity of the simulation to process parameters.
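The calibration step in point (3) is, in outline, an inverse problem. The following generic least-squares sketch fits two stand-in coefficients so that a surrogate simulate() function matches measured force, torque and temperature; the surrogate and all numbers are placeholder assumptions, since the Forge F.E. model itself cannot be reproduced here.

```python
import numpy as np
from scipy.optimize import least_squares

measured = np.array([8.2, 9.1, 350.0])        # force, torque, temperature

def simulate(params):
    """Placeholder surrogate standing in for a full FSW simulation."""
    mu, k = params                             # friction coeff, conductivity
    return np.array([10.0 * mu, 11.0 * mu, 300.0 + 60.0 * k])

def residuals(params):
    return simulate(params) - measured

fit = least_squares(residuals, x0=[0.5, 0.5], bounds=([0, 0], [2, 2]))
print("calibrated parameters:", fit.x)
```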
Applying Early Systems Engineering: Injecting Knowledge into the Capability Development Process
2012-10-01
involves early use of systems engineering and technical analyses to supplement the existing operational analysis techniques currently used in... complexity, and costs of systems now being developed require tight coupling between operational requirements stated in the CDD, system requirements... Keywords: Capability Development, Competitive Prototyping, Knowledge Points, Early Systems Engineering
Biomedically relevant chemical and physical properties of coal combustion products.
Fisher, G L
1983-01-01
The evaluation of the potential public and occupational health hazards of developing and existing combustion processes requires a detailed understanding of the physical and chemical properties of effluents available for human and environmental exposures. These processes produce complex mixtures of gases and aerosols which may interact synergistically or antagonistically with biological systems. Because of the physicochemical complexity of the effluents, the biomedically relevant properties of these materials must be carefully assessed. Subsequent to release from combustion sources, environmental interactions further complicate assessment of the toxicity of combustion products. This report provides an overview of the biomedically relevant physical and chemical properties of coal fly ash. Coal fly ash is presented as a model complex mixture for health and safety evaluation of combustion processes. PMID:6337824
Complex Problem Solving in Teams: The Impact of Collective Orientation on Team Process Demands.
Hagemann, Vera; Kluge, Annette
2017-01-01
Complex problem solving is challenging and a high-level cognitive process for individuals. When analyzing complex problem solving in teams, an additional, new dimension has to be considered, as teamwork processes increase the requirements already put on individual team members. After introducing an idealized teamwork process model that complex problem solving teams pass through, integrating the relevant teamwork skills for interdependently working teams into the model, and combining it with the four kinds of team processes (transition, action, interpersonal, and learning processes), the paper demonstrates the importance of fulfilling team process demands for successful complex problem solving within teams. To this end, results from a controlled team study within complex situations are presented. The study focused on factors that influence action processes, such as coordination, and on emergent states such as collective orientation, cohesion, and trust, which dynamically enable effective teamwork in complex situations. Before conducting the experiments, participants were divided by median split into two-person teams with either high (n = 58) or low (n = 58) collective orientation values. The study was conducted with the microworld C3Fire, which simulates dynamic decision making and acting in complex situations within a teamwork context. The microworld includes interdependent tasks such as extinguishing forest fires or protecting houses. Two firefighting scenarios were developed, each taking a maximum of 15 min; all teams worked on both scenarios. Coordination within the team and the resulting team performance were calculated based on a log-file analysis. The results show no relationship between trust and action processes or team performance, and likewise none for cohesion. Only the collective orientation of team members positively influences team performance in complex environments, mediated by action processes such as coordination within the team. The results are discussed in relation to previous empirical findings and to learning processes within the team, with a focus on feedback strategies.
Scheduling Software for Complex Scenarios
NASA Technical Reports Server (NTRS)
2006-01-01
Preparing a vehicle and its payload for a single launch is a complex process that involves thousands of operations. Because the equipment and facilities required to carry out these operations are extremely expensive and limited in number, optimal assignment and efficient use are critically important. Overlapping missions that compete for the same resources, ground rules, safety requirements, and the unique needs of processing vehicles and payloads destined for space impose numerous constraints that, when combined, require advanced scheduling. Traditional scheduling systems use simple algorithms and criteria when selecting activities and assigning resources and times to each activity. Schedules generated by these simple decision rules are, however, frequently far from optimal. To resolve mission-critical scheduling issues and predict possible problem areas, NASA historically relied upon expert human schedulers who used their judgment and experience to determine where things should happen, whether they will happen on time, and whether the requested resources are truly necessary.
Managing complexity in simulations of land surface and near-surface processes
Coon, Ethan T.; Moulton, J. David; Painter, Scott L.
2016-01-12
Increasing computing power and the growing role of simulation in Earth systems science have led to an increase in the number and complexity of processes in modern simulators. We present a multiphysics framework that specifies interfaces for coupled processes and automates weak and strong coupling strategies to manage this complexity. Process management is enabled by viewing the system of equations as a tree, where individual equations are associated with leaf nodes and coupling strategies with internal nodes. A dynamically generated dependency graph connects a variable to its dependencies, streamlining and automating model evaluation, easing model development, and ensuring models are modular and flexible. Additionally, the dependency graph is used to ensure that data requirements are consistent between all processes in a given simulation. Here we discuss the design and implementation of these concepts within the Arcos framework, and demonstrate their use for verification testing and hypothesis evaluation in numerical experiments.
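A toy version of the dependency-graph idea (not the actual Arcos API): each variable declares its dependencies and an evaluation rule, and evaluation recurses through the graph, which is what makes models modular and data requirements checkable. The variable names are invented for the example.

```python
def evaluate(var, rules, cache):
    """Evaluate a variable by recursing through its declared dependencies."""
    if var not in cache:
        deps, fn = rules[var]
        cache[var] = fn(*(evaluate(d, rules, cache) for d in deps))
    return cache[var]

rules = {
    "porosity":      ((), lambda: 0.4),
    "saturation":    ((), lambda: 0.9),
    "water_content": (("porosity", "saturation"), lambda phi, s: phi * s),
}
print(evaluate("water_content", rules, {}))   # -> 0.36
```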
Reducing the complexity of the software design process with object-oriented design
NASA Technical Reports Server (NTRS)
Schuler, M. P.
1991-01-01
Designing software is a complex process. How object-oriented design (OOD), coupled with formalized documentation and tailored object diagraming techniques, can reduce the complexity of the software design process is described and illustrated. The described OOD methodology uses a hierarchical decomposition approach in which parent objects are decomposed into layers of lower level child objects. A method of tracking the assignment of requirements to design components is also included. Increases in the reusability, portability, and maintainability of the resulting products are also discussed. This method was built on a combination of existing technology, teaching experience, consulting experience, and feedback from design method users. The discussed concepts are applicable to hierarchical OOD processes in general. Emphasis is placed on improving the design process by documenting the details of the procedures involved and incorporating improvements into those procedures as they are developed.
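A minimal, invented illustration of the parent/child decomposition and requirement-tracking ideas described above; the class names and requirement tags are hypothetical, not taken from the paper.

```python
class Sensor:                      # child object; tag traces REQ-12 (hypothetical)
    requirement = "REQ-12"
    def read(self) -> float:
        return 42.0

class Telemetry:                   # child object; tag traces REQ-31 (hypothetical)
    requirement = "REQ-31"
    def send(self, value: float) -> None:
        print(f"downlinking {value}")

class Instrument:                  # parent object decomposed into child layers
    def __init__(self):
        self.sensor, self.telemetry = Sensor(), Telemetry()
    def sample(self) -> None:
        self.telemetry.send(self.sensor.read())

Instrument().sample()
```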
What's Wrong with Cookbooks? A Reply to Ault
ERIC Educational Resources Information Center
Monteyne, Kereen; Cracolice, Mark S.
2004-01-01
The work done in a chemistry laboratory is compared to cooking, as both processes use books for reference. It is felt that cooking and chemistry are complex processes and are creative endeavors that require skills beyond those developed by merely following the directions.
Effective Software Engineering Leadership for Development Programs
ERIC Educational Resources Information Center
Cagle West, Marsha
2010-01-01
Software is a critical component of systems ranging from simple consumer appliances to complex health, nuclear, and flight control systems. The development of quality, reliable, and effective software solutions requires the incorporation of effective software engineering processes and leadership. Processes, approaches, and methodologies for…
Design for waste-management system
NASA Technical Reports Server (NTRS)
Guarneri, C. A.; Reed, A.; Renman, R.
1973-01-01
Study was made and system defined for water-recovery and solid-waste processing for low-rise apartment complexes. System can be modified to conform with unique requirements of community, including hydrology, geology, and climate. Reclamation is accomplished by treatment process that features reverse-osmosis membranes.
Molding cork sheets to complex shapes
NASA Technical Reports Server (NTRS)
Sharpe, M. H.; Simpson, W. G.; Walker, H. M.
1977-01-01
Partially cured cork sheet is easily formed to complex shapes and then final-cured. Temperature and pressure levels required for process depend upon resin system used and final density and strength desired. Sheet can be bonded to surface during final cure, or can be first-formed in mold and bonded to surface in separate step.
Students' Explanations in Complex Learning of Disciplinary Programming
ERIC Educational Resources Information Center
Vieira, Camilo
2016-01-01
Computational Science and Engineering (CSE) has been denominated as the third pillar of science and as a set of important skills to solve the problems of a global society. Along with the theoretical and the experimental approaches, computation offers a third alternative to solve complex problems that require processing large amounts of data, or…
ERIC Educational Resources Information Center
Locher, Paul J.; Simmons, Roger W.
Two experiments were conducted to investigate the perceptual processes involved in haptic exploration of randomly generated shapes. Experiment one required subjects to detect symmetrical or asymmetrical characteristics of individually presented plastic shapes, also varying in complexity. Scanning time for both symmetrical and asymmetrical shapes…
Dunham, Kylee; Grand, James B.
2016-01-01
We examined the effects of complexity and priors on the accuracy of models used to estimate ecological and observational processes, and to make predictions regarding population size and structure. State-space models are useful for estimating complex, unobservable population processes and making predictions about future populations based on limited data. To better understand the utility of state space models in evaluating population dynamics, we used them in a Bayesian framework and compared the accuracy of models with differing complexity, with and without informative priors using sequential importance sampling/resampling (SISR). Count data were simulated for 25 years using known parameters and observation process for each model. We used kernel smoothing to reduce the effect of particle depletion, which is common when estimating both states and parameters with SISR. Models using informative priors estimated parameter values and population size with greater accuracy than their non-informative counterparts. While the estimates of population size and trend did not suffer greatly in models using non-informative priors, the algorithm was unable to accurately estimate demographic parameters. This model framework provides reasonable estimates of population size when little to no information is available; however, when information on some vital rates is available, SISR can be used to obtain more precise estimates of population size and process. Incorporating model complexity such as that required by structured populations with stage-specific vital rates affects precision and accuracy when estimating latent population variables and predicting population dynamics. These results are important to consider when designing monitoring programs and conservation efforts requiring management of specific population segments.
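A bare-bones SISR (particle filter) sketch for a scalar population model, illustrating the state-space machinery discussed above; the growth rate and noise levels are invented, and the parameter estimation and kernel smoothing steps used in the paper are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(2)
T, N = 25, 1000                           # years of counts, particles
true_growth = 1.05                        # assumed known for this sketch

# simulate counts with lognormal process noise and Poisson observation noise
pop = [100.0]
for _ in range(T - 1):
    pop.append(pop[-1] * true_growth * rng.lognormal(0, 0.05))
counts = rng.poisson(pop)

particles = rng.uniform(50, 150, N)       # initial population guesses
for y in counts:
    particles *= true_growth * rng.lognormal(0, 0.05, N)     # propagate
    logw = y * np.log(particles) - particles   # Poisson log-likelihood (up to const)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    particles = particles[rng.choice(N, N, p=w)]             # resample

print("final population estimate:", particles.mean())
```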
Chinnam, Meenalakshmi; Povinelli, Benjamin J.; Fisher, Daniel T.; Golding, Michelle; Appenheimer, Michelle M.; Nemeth, Michael J.; Evans, Sharon; Goodrich, David W.
2014-01-01
Co-transcriptionally assembled ribonucleoprotein (RNP) complexes are critical for RNA processing and nuclear export. RNPs have been hypothesized to contribute to the regulation of coordinated gene expression, and defects in RNP biogenesis contribute to genome instability and disease. Despite the large number of RNPs and the importance of the molecular processes they mediate, the requirements for individual RNP complexes in mammalian development and tissue homeostasis are not well characterized. THO is an evolutionarily conserved, nuclear RNP complex that physically links nascent transcripts with the nuclear export apparatus. THO is essential for early mouse embryonic development, limiting characterization of the requirements for THO in adult tissues. To address this shortcoming, a mouse strain has been generated allowing inducible deletion of the Thoc1 gene which encodes an essential protein subunit of THO. Bone marrow reconstitution was used to generate mice in which Thoc1 deletion could be induced specifically in the hematopoietic system. We find that granulocyte macrophage progenitors have a cell autonomous requirement for Thoc1 to maintain cell growth and viability. Lymphoid lineages are not detectably affected by Thoc1 loss under the homeostatic conditions tested. Myeloid lineages may be more sensitive to Thoc1 loss due to their relatively high rate of proliferation and turnover. PMID:24830368
Nonterrestrial material processing and manufacturing of large space systems
NASA Technical Reports Server (NTRS)
Von Tiesenhausen, G.
1979-01-01
Nonterrestrial processing of materials and manufacturing of large space system components from preprocessed lunar materials at a manufacturing site in space is described. Lunar materials mined and preprocessed at the lunar resource complex will be flown to the space manufacturing facility (SMF), where, together with supplementary terrestrial materials, they will undergo final processing and fabrication into space communication systems, solar cell blankets, radio frequency generators, and electrical equipment. Satellite Power System (SPS) material requirements and lunar material availability and utilization are detailed, and the SMF processing, refining, and fabricating facilities, material flow, and manpower requirements are described.
Gyanani, Vijay; Siddalingappa, Basavaraj; Betageri, Guru V
2015-01-01
Insoluble drugs are often formulated with various excipients to enhance their dissolution. Cyclodextrins (CDs) are widely used excipients for improving the dissolution profile of poorly soluble drugs. The drug-CD complexation process is complex and often requires multiple steps to produce a solid dosage form. Hence, this study explored commonly used granulation processes for simultaneous complexation and granulation. The poorly soluble drugs ibuprofen and glyburide were selected as experimental drugs. Co-evaporation of the drug:CD mixture from a solvent followed by wet granulation with water was taken as the standard process for comparison. Spray granulation and fluid bed processing (FBP) using a drug:CD solution in ethanol were evaluated as alternative processes. The dissolution data for glyburide tablets indicated that tablets produced by spray granulation, FBP and co-evaporation-granulation have almost identical dissolution profiles in water and 0.1% SLS (>70% in water and >60% in SLS versus 30 and 34%, respectively, for plain tablets, in 120 min). Similarly, ibuprofen:CD tablets produced by co-evaporation-granulation and FBP displayed similar dissolution profiles in 0.01 M HCl (pH 2.0) and buffer pH 5.5 (>90 and 100% versus 44 and 80%, respectively, for plain tablets, 120 min). The results of this study demonstrated that spray granulation is a simple and cost-effective process for incorporating drug:CD complexes of low-dose poorly soluble drugs into solid dosage forms, whereas FBP is suitable for poorly soluble drugs with moderate doses.
Optimization and Determination of Fe-Oxinate Complex by Using High Performance Liquid Chromatography
NASA Astrophysics Data System (ADS)
Oktavia, B.; Nasra, E.; Sary, R. C.
2018-04-01
The need for iron drives industrial processes that require iron as a raw material, so control of industrial iron waste is very important. One method of iron analysis is indirect analysis of iron(III) ions by complexation with 8-hydroxyquinoline (oxine). In this research, qualitative and quantitative tests of iron(III) ions complexed with oxine were performed. The analysis used HPLC at a wavelength of 470 nm with an ODS C18 column. Three methods of analysis were compared: (1) Fe-oxinate complexes were prepared in an ethanol solvent, so no further separation was needed; (2) Fe-oxinate complexes were prepared in chloroform, so a solvent extraction was required before the complex was injected into the column; and (3) the complex was formed in the column, where the eluent contains the oxine and the metal ions are then injected. The resulting chromatograms show that the third method provides a better chromatogram for iron analysis.
Epithelial junction formation requires confinement of Cdc42 activity by a novel SH3BP1 complex
Elbediwy, Ahmed; Zihni, Ceniz; Terry, Stephen J.; Clark, Peter
2012-01-01
Epithelial cell–cell adhesion and morphogenesis require dynamic control of actin-driven membrane remodeling. The Rho guanosine triphosphatase (GTPase) Cdc42 regulates sequential molecular processes during cell–cell junction formation; hence, mechanisms must exist that inactivate Cdc42 in a temporally and spatially controlled manner. In this paper, we identify SH3BP1, a GTPase-activating protein for Cdc42 and Rac, as a regulator of junction assembly and epithelial morphogenesis using a functional small interfering ribonucleic acid screen. Depletion of SH3BP1 resulted in loss of spatial control of Cdc42 activity, stalled membrane remodeling, and enhanced growth of filopodia. SH3BP1 formed a complex with JACOP/paracingulin, a junctional adaptor, and CD2AP, a scaffolding protein; both were required for normal Cdc42 signaling and junction formation. The filamentous actin–capping protein CapZ also associated with the SH3BP1 complex and was required for control of actin remodeling. Epithelial junction formation and morphogenesis thus require a dual activity complex, containing SH3BP1 and CapZ, that is recruited to sites of active membrane remodeling to guide Cdc42 signaling and cytoskeletal dynamics. PMID:22891260
Age-related differences in reaction time task performance in young children.
Kiselev, Sergey; Espy, Kimberly Andrews; Sheffield, Tiffany
2009-02-01
Performance of reaction time (RT) tasks was investigated in young children and adults to test the hypothesis that age-related differences in processing speed supersede a "global" mechanism and are a function of specific differences in task demands and processing requirements. The sample consisted of 54 4-year-olds, 53 5-year-olds, 59 6-year-olds, and 35 adults from Russia. Using the regression approach pioneered by Brinley and the transformation method proposed by Madden and colleagues and Ridderinkhof and van der Molen, age-related differences in processing speed differed among RT tasks with varying demands. In particular, RTs differed between children and adults on tasks that required response suppression, discrimination of color or spatial orientation, reversal of contingencies of previously learned stimulus-response rules, and greater stimulus-response complexity. Relative costs of these RT task differences were larger than predicted by the global difference hypothesis except for response suppression. Among young children, age-related differences larger than predicted by the global difference hypothesis were evident when tasks required color or spatial orientation discrimination and stimulus-response rule complexity, but not for response suppression or reversal of stimulus-response contingencies. Process-specific, age-related differences in processing speed that support heterochronicity of brain development during childhood were revealed.
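The Brinley-style analysis mentioned above regresses one group's mean RTs across tasks on another group's; the fitted slope captures global slowing, while task-specific departures from the line indicate process-specific effects. A minimal sketch, with RT means invented purely for illustration:

```python
import numpy as np

# Hypothetical mean RTs (ms) on four tasks: adults vs. one child group.
adult_rt = np.array([350.0, 420.0, 510.0, 600.0])
child_rt = np.array([700.0, 880.0, 1150.0, 1500.0])

# Brinley plot: regress child RTs on adult RTs.
slope, intercept = np.polyfit(adult_rt, child_rt, 1)
residuals = child_rt - (slope * adult_rt + intercept)

print(f"global slowing slope: {slope:.2f}")
# Large residuals flag tasks whose cost exceeds the global prediction.
print("task-specific residuals (ms):", np.round(residuals, 1))
```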
Evaluation in the Design of Complex Systems
ERIC Educational Resources Information Center
Ho, Li-An; Schwen, Thomas M.
2006-01-01
We identify literature arguing that the process of creating knowledge-based systems is often imbalanced. In most knowledge-based systems, development is often technology-driven instead of requirement-driven. Therefore, we argue designers must recognize that evaluation is a critical link in the application of requirement-driven development models…
Structural and Functional Analyses of the Proteins Involved in the Iron-Sulfur Cluster Biosynthesis
NASA Astrophysics Data System (ADS)
Wada, Kei
The iron-sulfur (Fe-S) clusters are ubiquitous prosthetic groups that are required to maintain such fundamental life processes as the respiratory chain, photosynthesis and the regulation of gene expression. Assembly of intracellular Fe-S clusters requires sophisticated biosynthetic systems called the ISC and SUF machineries. To shed light on the molecular mechanism of Fe-S cluster assembly mediated by the SUF machinery, several structures of the SUF components and their sub-complexes were determined. The structural findings, together with biochemical characterization of the core complex (the SufB-SufC-SufD complex), have led me to propose a working model for cluster biosynthesis in the SUF machinery.
On Convergence of Development Costs and Cost Models for Complex Spaceflight Instrument Electronics
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Patel, Umeshkumar D.; Kasa, Robert L.; Hestnes, Phyllis; Brown, Tammy; Vootukuru, Madhavi
2008-01-01
Development costs of a few recent spaceflight instrument electrical and electronics subsystems have diverged from their respective heritage cost model predictions. The cost models used are Grass Roots, Price-H and Parametric Model. These cost models originated in the military and industry around 1970 and were successfully adopted and patched by NASA on a mission-by-mission basis for years. However, the complexity of new instruments has recently changed rapidly, by orders of magnitude. This is most obvious in the complexity of a representative spaceflight instrument electronics data system. It is now required to perform intermediate processing of digitized data apart from conventional processing of science phenomenon signals from multiple detectors. This involves on-board instrument formatting of computational operands from raw data (for example, images), multi-million operations per second on large volumes of data in reconfigurable hardware (in addition to processing on a general-purpose embedded or standalone instrument flight computer), as well as making decisions for on-board system adaptation and resource reconfiguration. The instrument data system is now tasked to perform more functions, such as forming packets and instrument-level data compression of more than one data stream, which are traditionally performed by the spacecraft command and data handling system. It is furthermore required that the electronics box for new complex instruments be developed for single-digit-watt power consumption and small size, be light-weight, and deliver super-computing capabilities. The conflict between the actual development cost of newer complex instruments and the heritage cost model predictions for their electronics components seems irreconcilable. This conflict and an approach to its resolution are addressed in this paper by determining complexity parameters and a complexity index, and by describing their use in an enhanced cost model.
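The abstract stops short of giving the enhanced model's form, but the idea of a complexity index scaling a heritage estimate can be sketched. Everything below is hypothetical: the parameter names, weights, and power-law scaling are illustrative stand-ins, not the paper's actual model.

```python
def complexity_index(new, heritage, weights):
    """Weighted mean of new-to-heritage ratios of complexity parameters.

    Parameter names and weights are illustrative, not the paper's.
    """
    total = sum(weights.values())
    return sum(w * new[k] / heritage[k] for k, w in weights.items()) / total

heritage = {"logic_gates": 1e5, "data_rate_mbps": 10.0, "ops_per_sec": 1e6}
new_inst = {"logic_gates": 5e6, "data_rate_mbps": 400.0, "ops_per_sec": 1e9}
weights  = {"logic_gates": 0.4, "data_rate_mbps": 0.3, "ops_per_sec": 0.3}

ci = complexity_index(new_inst, heritage, weights)   # ~332x more complex
heritage_estimate = 4.0  # $M, hypothetical heritage-model output
# Parametric cost models commonly use power-law scaling; the exponent
# here is purely illustrative.
adjusted = heritage_estimate * ci ** 0.25
print(f"complexity index: {ci:.0f}; adjusted estimate: ${adjusted:.1f}M")
```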
Reframing as a Best Practice: The Priority of Process in Highly Adaptive Decision Making
ERIC Educational Resources Information Center
Peters, Gary B.
2008-01-01
The development and practice of a well-defined process in which decisions are fully contemplated is needed in education today. The complexity of societal issues requires new depths of understanding, appreciation, and communication. Framing refers to the way a situation is described or viewed; reframing is the process of expanding and enriching the…
Brain Dynamics Sustaining Rapid Rule Extraction from Speech
ERIC Educational Resources Information Center
de Diego-Balaguer, Ruth; Fuentemilla, Lluis; Rodriguez-Fornells, Antoni
2011-01-01
Language acquisition is a complex process that requires the synergic involvement of different cognitive functions, which include extracting and storing the words of the language and their embedded rules for progressive acquisition of grammatical information. As has been shown in other fields that study learning processes, synchronization…
Utashiro, Nao; Williams, Claire R; Parrish, Jay Z; Emoto, Kazuo
2018-06-05
Animal responses to their environment rely on activation of sensory neurons by external stimuli. In many sensory systems, however, neurons display basal activity prior to the external stimuli. This prior activity is thought to modulate neural functions, yet its impact on animal behavior remains elusive. Here, we reveal a potential role for prior activity in olfactory receptor neurons (ORNs) in shaping larval olfactory behavior. We show that prior activity in larval ORNs is mediated by the olfactory receptor complex (OR complex). Mutations of Orco, an odorant co-receptor required for OR complex function, cause reduced attractive behavior in response to optogenetic activation of ORNs. Calcium imaging reveals that Orco mutant ORNs fully respond to optogenetic stimulation but exhibit altered temporal patterns of neural responses. These findings together suggest a critical role for prior activity in information processing upon ORN activation in Drosophila larvae, which in turn contributes to olfactory behavior control.
Nonparametric estimation of stochastic differential equations with sparse Gaussian processes.
García, Constantino A; Otero, Abraham; Félix, Paulo; Presedo, Jesús; Márquez, David G
2017-08-01
The application of stochastic differential equations (SDEs) to the analysis of temporal data has attracted increasing attention, due to their ability to describe complex dynamics with physically interpretable equations. In this paper, we introduce a nonparametric method for estimating the drift and diffusion terms of SDEs from a densely observed discrete time series. The use of Gaussian processes as priors permits working directly in a function-space view and thus the inference takes place directly in this space. To cope with the computational complexity that the use of Gaussian processes entails, a sparse Gaussian process approximation is provided. This approximation permits the efficient computation of predictions for the drift and diffusion terms by using a distribution over a small subset of pseudosamples. The proposed method has been validated using both simulated data and real data from economy and paleoclimatology. The application of the method to real data demonstrates its ability to capture the behavior of complex systems.
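As a concrete illustration of the drift-estimation idea, the sketch below fits an exact Gaussian process to Euler increments of a simulated Ornstein-Uhlenbeck path using scikit-learn. The exact GP carries the cubic cost that the paper's sparse pseudosample approximation is designed to avoid, so this is only workable for short series; the kernel choice and all constants are assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Simulate an Ornstein-Uhlenbeck path: dX = -theta * X dt + sigma dW.
theta, sigma, dt, n = 1.0, 0.5, 0.01, 500
x = np.empty(n)
x[0] = 1.0
for t in range(n - 1):
    x[t + 1] = x[t] - theta * x[t] * dt + sigma * np.sqrt(dt) * rng.normal()

# Euler view: increments / dt are noisy pointwise evaluations of the drift.
X = x[:-1, None]
y = np.diff(x) / dt

gp = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(1.0), normalize_y=True)
gp.fit(X, y)

grid = np.linspace(x.min(), x.max(), 5)[:, None]
print("estimated drift:", np.round(gp.predict(grid), 2))
print("true drift:     ", np.round(-theta * grid.ravel(), 2))
```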
NASA Astrophysics Data System (ADS)
Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.
2016-02-01
A stochastic process's statistical complexity stands out as a fundamental property: the minimum information required to synchronize one process generator to another. How much information is required, though, when synchronizing over a quantum channel? Recent work demonstrated that representing causal similarity as quantum state-indistinguishability provides a quantum advantage. We generalize this to synchronization and offer a sequence of constructions that exploit extended causal structures, finding a substantial increase of the quantum advantage. We demonstrate that maximum compression is determined by the process's cryptic order, a classical, topological property closely allied to Markov order, itself a measure of historical dependence. We introduce an efficient algorithm that computes the quantum advantage, and close by noting that the advantage comes at a cost: one trades off prediction for generation complexity.
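For reference, the statistical complexity invoked here is, in standard computational-mechanics notation (a textbook definition, not quoted from this abstract), the Shannon entropy of the causal-state distribution:

```latex
% Statistical complexity: Shannon entropy over the causal states \sigma
% of the process's \epsilon-machine (standard definition; notation assumed).
C_\mu \;=\; H[\mathcal{S}] \;=\; -\sum_{\sigma \in \mathcal{S}} \Pr(\sigma) \log_2 \Pr(\sigma)
```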
FPGA-based coprocessor for matrix algorithms implementation
NASA Astrophysics Data System (ADS)
Amira, Abbes; Bensaali, Faycal
2003-03-01
Matrix algorithms are important in many types of applications, including image and signal processing. These areas require enormous computing power. A close examination of the algorithms used in these and related applications reveals that many of the fundamental actions involve matrix operations such as matrix multiplication, which has complexity O(N^3) on a sequential computer and O(N^3/p) on a parallel system with p processors. This paper presents an investigation into the design and implementation of different matrix algorithms, such as matrix operations, matrix transforms and matrix decompositions, using an FPGA-based environment. Solutions for the problem of processing large matrices have been proposed. The proposed system architectures are scalable and modular, and require less area and time complexity with reduced latency when compared with existing structures.
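To make the cited complexity concrete, here is the textbook triple-loop multiply whose N^3 multiply-accumulate operations are what FPGA implementations parallelize; a plain Python/NumPy sketch for illustration only.

```python
import numpy as np

def matmul_naive(A, B):
    """Textbook triple-loop multiply: N^3 multiply-accumulates, the
    O(N^3) sequential cost; the i-iterations are independent, so
    distributing them over p processors gives the O(N^3/p) figure."""
    N = A.shape[0]
    C = np.zeros((N, N))
    for i in range(N):
        for j in range(N):
            for k in range(N):
                C[i, j] += A[i, k] * B[k, j]
    return C

rng = np.random.default_rng(3)
A, B = rng.random((16, 16)), rng.random((16, 16))
assert np.allclose(matmul_naive(A, B), A @ B)  # matches optimized BLAS result
```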
Kühbacher, Andreas; Emmenlauer, Mario; Rämo, Pauli; Kafai, Natasha; Dehio, Christoph
2015-01-01
Listeria monocytogenes enters nonphagocytic cells by a receptor-mediated mechanism that is dependent on a clathrin-based molecular machinery and actin rearrangements. Bacterial intra- and intercellular movements are also actin dependent and rely on the actin nucleating Arp2/3 complex, which is activated by host-derived nucleation-promoting factors downstream of the cell receptor Met during entry and by the bacterial nucleation-promoting factor ActA during comet tail formation. By genome-wide small interfering RNA (siRNA) screening for host factors involved in bacterial infection, we identified diverse cellular signaling networks and protein complexes that support or limit these processes. In addition, we could refine previously described molecular pathways involved in Listeria invasion. In particular, our results show that the requirements for actin nucleators during Listeria entry and actin comet tail formation are different. Knockdown of several actin nucleators, including SPIRE2, reduced bacterial invasion while not affecting the generation of comet tails. Most interestingly, we observed that in contrast to our expectations, not all of the seven subunits of the Arp2/3 complex are required for Listeria entry into cells or actin tail formation and that the subunit requirements for each of these processes differ, highlighting a previously unsuspected versatility in Arp2/3 complex composition and function. PMID:25991686
The Fleeting Nature of Sex Differences in Spatial Ability.
ERIC Educational Resources Information Center
Alderton, David L.
Gender differences were examined on three computer-administered spatial processing tasks: (1) the Intercept task, requiring processing dynamic or moving figures; (2) the mental rotation test, employing rotated asymmetric polygons; and (3) the integrating details test, in which subjects performed a complex visual synthesis. Participants were about…
A Design Rationale Capture Using REMAP/MM
1994-06-01
company-wide down-sizing, the power company has determined that an automated service order processing system is the most economical solution. This new...service order processing system for a large power company can easily be...led. A system of this complexity would typically require three to five years
Biology Diagrams: Tools To Think With.
ERIC Educational Resources Information Center
Kindfield, Ann C. H.
Subcellular processes like meiosis are frequently problematic for learners because they are complex and, except to the extent that they can be observed under a light microscope, occur outside of our direct experience. More detailed characterization of what underlies various degrees of student understanding of a process is required to more fully…
The Gully in the "Brain Glitch" Theory
ERIC Educational Resources Information Center
Willis, Judy
2007-01-01
Learning to read is a complex process that requires multiple areas of the brain to operate together through intricate networks of neurons. The author of this article, a neurologist and middle school teacher, takes exception to interpretations of neuroimaging research that treat reading as an isolated, independent cognitive process. She…
NASA Astrophysics Data System (ADS)
Ibrahim, Bashirah; Ding, Lin; Heckler, Andrew F.; White, Daniel R.; Badeau, Ryan
2017-12-01
We examine students' mathematical performance on quantitative "synthesis problems" with varying mathematical complexity. Synthesis problems are tasks comprising multiple concepts typically taught in different chapters. Mathematical performance refers to the formulation, combination, and simplification of equations. Generally speaking, formulation and combination of equations require conceptual reasoning; simplification of equations requires manipulation of equations as computational tools. Mathematical complexity is operationally defined by the number and the type of equations to be manipulated concurrently due to the number of unknowns in each equation. We use two types of synthesis problems, namely, sequential and simultaneous tasks. Sequential synthesis tasks require a chronological application of pertinent concepts, and simultaneous synthesis tasks require a concurrent application of the pertinent concepts. A total of 179 physics major students from a second year mechanics course participated in the study. Data were collected from written tasks and individual interviews. Results show that mathematical complexity negatively influences the students' mathematical performance on both types of synthesis problems. However, for the sequential synthesis tasks, it interferes only with the students' simplification of equations. For the simultaneous synthesis tasks, mathematical complexity additionally impedes the students' formulation and combination of equations. Several reasons may explain this difference, including the students' different approaches to the two types of synthesis problems, cognitive load, and the variation of mathematical complexity within each synthesis type.
Thapa, Narendra; Sun, Yue; Schramp, Mark; Choi, Suyoung; Ling, Kun; Anderson, Richard A
2011-01-01
Polarized delivery of signaling and adhesion molecules to the leading edge is required for directional migration of cells. Here, we describe a role for the PIP2-synthesizing enzyme PIPKIγi2 in the regulation of exocyst complex control of cell polarity and polarized integrin trafficking during migration. Loss of PIPKIγi2 impaired directional migration, formation of cell polarity, and integrin trafficking to the leading edge. Upon initiation of directional migration, PIPKIγi2, via PIP2 generation, controls the integration of the exocyst complex into an integrin-containing trafficking compartment(s); this requires the talin-binding ability of PIPKIγi2, and talin, for integrin recruitment to the leading edge. A PIP2 requirement is further emphasized by inhibition of PIPKIγi2-regulated directional migration by an Exo70 mutant deficient in PIP2 binding. These results reveal how phosphoinositide generation orchestrates polarized trafficking of integrin in coordination with talin, which links integrins to the actin cytoskeleton, processes that are required for directional migration. PMID:22264730
Rapid production of hollow SS316 profiles by extrusion based additive manufacturing
NASA Astrophysics Data System (ADS)
Rane, Kedarnath; Cataldo, Salvatore; Parenti, Paolo; Sbaglia, Luca; Mussi, Valerio; Annoni, Massimiliano; Giberti, Hermes; Strano, Matteo
2018-05-01
Complex-shaped stainless steel tubes are often required for special-purpose biomedical equipment. Nevertheless, traditional manufacturing technologies, such as extrusion, lack the ability to compete in a market of customized complex components because of the associated expense of tooling and extrusion presses. To rapidly manufacture small numbers of such components at low cost and high precision, a new extrusion-based additive manufacturing (EAM) process is proposed in this paper; as an example, short stainless steel 316L tubes with complex shapes and sections were prepared by EAM. Several sample parts were produced using this process; the dimensional stability, surface roughness and chemical composition of sintered samples were investigated to prove process competence. The results indicate that a feedstock with a 316L particle content of 92.5 wt.% can be prepared with sigma-blade mixing, and its rheological behavior is fit for EAM. The green samples have sufficient strength to be handled for subsequent treatments. The sintered samples shrank considerably to the designed dimensions and have a homogeneous microstructure that imparts mechanical strength. Achieving the dimensional accuracy and chemical composition required for biomedical equipment still needs iteration; a kinematic correction and a modification of the debinding cycle were proposed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hess, Nancy J.; Pasa-Tolic, Ljiljana; Bailey, Vanessa L.
Understanding the role played by microorganisms within soil systems is challenged by the unique intersection of physics, chemistry, mineralogy and biology in fostering habitat for soil microbial communities. Addressing these challenges will require observations across multiple spatial and temporal scales to capture the dynamics and emergent behavior of complex and interdependent processes. The heterogeneity and complexity of the rhizosphere require advanced techniques that press the simultaneous frontiers of spatial resolution, analyte sensitivity and specificity, reproducibility, large dynamic range, and high throughput. Fortunately, many exciting technical advancements are now available to inform and guide the development of new hypotheses. The aim of this Special Issue is to provide a holistic view of the rhizosphere from the perspective of modern molecular biology methodologies that enable a highly focused, detailed view of the processes in the rhizosphere, including the numerous, strong and complex interactions between plant roots, soil constituents and microorganisms. We discuss the current rhizosphere research challenges and knowledge gaps, as well as perspectives and approaches using newly available state-of-the-art toolboxes. These new approaches and methodologies allow the study of rhizosphere processes and properties, and of the rhizosphere as a central component of ecosystems and biogeochemical cycles.
Integrating technology into complex intervention trial processes: a case study.
Drew, Cheney J G; Poile, Vincent; Trubey, Rob; Watson, Gareth; Kelson, Mark; Townson, Julia; Rosser, Anne; Hood, Kerenza; Quinn, Lori; Busse, Monica
2016-11-17
Trials of complex interventions are associated with high costs and burdens in terms of paperwork, management, data collection, validation, and intervention fidelity assessment occurring across multiple sites. Traditional data collection methods rely on paper-based forms, where processing can be time-consuming and error rates high. Electronic source data collection can potentially address many of these inefficiencies, but has not routinely been used in complex intervention trials. Here we present the use of an on-line system for managing all aspects of data handling and for the monitoring of trial processes in a multicentre trial of a complex intervention. We custom built a web-accessible software application for the delivery of ENGAGE-HD, a multicentre trial of a complex physical therapy intervention. The software incorporated functionality for participant randomisation, data collection and assessment of intervention fidelity. It was accessible to multiple users with differing levels of access depending on required usage or to maintain blinding. Each site was supplied with a 4G-enabled iPad for accessing the system. The impact of this system was quantified through review of data quality and collation of feedback from site coordinators and assessors through structured process interviews. The custom-built system was an efficient tool for collecting data and managing trial processes. Although the set-up time required was significant, using the system resulted in an overall data completion rate of 98.5% with a data query rate of 0.1%, the majority of which were resolved in under a week. Feedback from research staff indicated that the system was highly acceptable for use in a research environment. This was a reflection of the portability and accessibility of the system when using the iPad and its usefulness in aiding accurate data collection, intervention fidelity and general administration. A combination of commercially available hardware and a bespoke online database designed to support data collection, intervention fidelity and trial progress provides a viable option for streamlining trial processes in a multicentre complex intervention trial. There is scope to further extend the system to cater for larger trials and add further functionality such as automatic reporting facilities and participant management support. ISRCTN65378754, registered on 13 March 2014.
Microbial certification of the MER spacecraft
NASA Technical Reports Server (NTRS)
Schubert, W. W.; Arakelian, T.; Barengoltz, J. B.; Chough, N. G.; Chung, S. Y.; Law, J.; Kirschner, L.; Koukol, R. C.; Newlin, L. E.; Morales, F.
2003-01-01
We conclude in this paper that a combination of Dry Heat Microbial Reduction and control measures during complex mechanical assembly processes can result in a total spore bioburden that meets requirements.
40 CFR 725.1075 - Burkholderia cepacia complex.
Code of Federal Regulations, 2010 CFR
2010-07-01
... CONTROL ACT REPORTING REQUIREMENTS AND REVIEW PROCESSES FOR MICROORGANISMS Significant New Uses for... significant new use is any use other than research and development in the degradation of chemicals via...
40 CFR 725.1075 - Burkholderia cepacia complex.
Code of Federal Regulations, 2013 CFR
2013-07-01
... CONTROL ACT REPORTING REQUIREMENTS AND REVIEW PROCESSES FOR MICROORGANISMS Significant New Uses for... significant new use is any use other than research and development in the degradation of chemicals via...
40 CFR 725.1075 - Burkholderia cepacia complex.
Code of Federal Regulations, 2012 CFR
2012-07-01
... CONTROL ACT REPORTING REQUIREMENTS AND REVIEW PROCESSES FOR MICROORGANISMS Significant New Uses for... significant new use is any use other than research and development in the degradation of chemicals via...
40 CFR 725.1075 - Burkholderia cepacia complex.
Code of Federal Regulations, 2011 CFR
2011-07-01
... CONTROL ACT REPORTING REQUIREMENTS AND REVIEW PROCESSES FOR MICROORGANISMS Significant New Uses for... significant new use is any use other than research and development in the degradation of chemicals via...
40 CFR 725.1075 - Burkholderia cepacia complex.
Code of Federal Regulations, 2014 CFR
2014-07-01
... CONTROL ACT REPORTING REQUIREMENTS AND REVIEW PROCESSES FOR MICROORGANISMS Significant New Uses for... significant new use is any use other than research and development in the degradation of chemicals via...
Space shuttle engineering and operations support. Avionics system engineering
NASA Technical Reports Server (NTRS)
Broome, P. A.; Neubaur, R. J.; Welsh, R. T.
1976-01-01
The shuttle avionics integration laboratory (SAIL) requirements for supporting the Spacelab/orbiter avionics verification process are defined. The principal topics are a Spacelab avionics hardware assessment, test operations center/electronic systems test laboratory (TOC/ESL) data processing requirements definition, SAIL (Building 16) payload accommodations study, and projected funding and test scheduling. Because of the complex nature of the Spacelab/orbiter computer systems, the PCM data link, and the high rate digital data system hardware/software relationships, early avionics interface verification is required. The SAIL is a prime candidate test location to accomplish this early avionics verification.
Digital imaging technology assessment: Digital document storage project
NASA Technical Reports Server (NTRS)
1989-01-01
An ongoing technical assessment and requirements definition project is examining the potential role of digital imaging technology at NASA's STI facility. The focus is on the basic components of imaging technology in today's marketplace as well as the components anticipated in the near future. Presented is a requirement specification for a prototype project, an initial examination of current image processing at the STI facility, and an initial summary of image processing projects at other sites. Operational imaging systems incorporate scanners, optical storage, high resolution monitors, processing nodes, magnetic storage, jukeboxes, specialized boards, optical character recognition gear, pixel addressable printers, communications, and complex software processes.
Community Care for People with Complex Care Needs: Bridging the Gap between Health and Social Care
Ho, Julia W.; Hans, Parminder Kaur; Nelson, Michelle LA
2017-01-01
Introduction: A growing number of people are living with complex care needs characterized by multimorbidity, mental health challenges and social deprivation. What is required is the integration of health and social care, beyond traditional health care services, to address social determinants. This study investigates key care components to support complex patients and their families in the community. Methods: Expert panel focus groups were conducted with 24 care providers working in health and social care sectors across Toronto, Ontario, Canada. Patient vignettes illustrating significant health and social care needs were presented to participants. The vignettes prompted discussions on (i) how best to meet complex care needs in the community and (ii) the barriers to delivering care to this population. Results: Categories to support the care needs of complex patients and their families included (i) relationships as the foundation for care, (ii) desired processes and structures of care, and (iii) barriers and workarounds for desired care. Discussion and Conclusions: Meeting the needs of this population requires time to develop authentic relationships, broadening the membership of the care team, communicating across sectors, co-locating health and social care, and addressing the barriers that prevent providers from engaging in these required practices. PMID:28970760
A Chemical Engineer's Perspective on Health and Disease
Androulakis, Ioannis P.
2014-01-01
Chemical process systems engineering considers complex supply chains which are coupled networks of dynamically interacting systems. The quest to optimize the supply chain while meeting robustness and flexibility constraints in the face of ever changing environments necessitated the development of theoretical and computational tools for the analysis, synthesis and design of such complex engineered architectures. However, it was realized early on that optimality is a complex characteristic required to achieve proper balance between multiple, often competing, objectives. As we begin to unravel life's intricate complexities, we realize that living systems share similar structural and dynamic characteristics; hence much can be learned about biological complexity from engineered systems. In this article, we draw analogies between concepts in process systems engineering and conceptual models of health and disease; establish connections between these concepts and physiologic modeling; and describe how these mirror onto the physiological counterparts of engineered systems. PMID:25506103
Graphical Language for Data Processing
NASA Technical Reports Server (NTRS)
Alphonso, Keith
2011-01-01
A graphical language for processing data allows processing elements to be connected with virtual wires that represent data flows between processing modules. The processing of complex data, such as lidar data, requires many different algorithms to be applied. The purpose of this innovation is to automate the processing of complex data, such as lidar data, without the need for complex scripting and programming languages. The system consists of a set of user-interface components that allow the user to drag and drop various algorithmic and processing components onto a process graph. By working graphically, the user can completely visualize the process flow and create complex diagrams. This innovation supports the nesting of graphs, such that a graph can be included in another graph as a single step for processing. In addition to the user interface components, the system includes a set of .NET classes that represent the graph internally. These classes provide the internal system representation of the graphical user interface. The system includes a graph execution component that reads the internal representation of the graph (as described above) and executes that graph. The execution of the graph follows the interpreted model of execution in that each node is traversed and executed from the original internal representation. In addition, there are components that allow external code elements, such as algorithms, to be easily integrated into the system, thus making the system infinitely expandable.
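A minimal sketch of the execution model just described, in Python rather than the .NET implementation the abstract mentions: nodes are wired by naming their upstream inputs (the "virtual wires"), and the executor interprets the internal representation by resolving each node after its dependencies. All names and the toy lidar-style pipeline are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A processing step; 'inputs' names upstream nodes (virtual wires)."""
    name: str
    func: callable
    inputs: list = field(default_factory=list)

def execute(graph, outputs=None):
    """Interpret the graph: resolve each node after its inputs, via
    memoized recursion, mirroring the interpreted execution model above."""
    results = {}
    def run(name):
        if name not in results:
            node = graph[name]
            args = [run(dep) for dep in node.inputs]
            results[name] = node.func(*args)
        return results[name]
    for name in outputs or graph:
        run(name)
    return results

# Hypothetical lidar-style pipeline: load -> outlier filter -> grid statistic.
graph = {
    "load":   Node("load",   lambda: [1.0, 2.0, 50.0, 3.0]),
    "filter": Node("filter", lambda pts: [p for p in pts if p < 10], ["load"]),
    "grid":   Node("grid",   lambda pts: sum(pts) / len(pts), ["filter"]),
}
print(execute(graph)["grid"])  # 2.0
```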
Artificial intelligence techniques for scheduling Space Shuttle missions
NASA Technical Reports Server (NTRS)
Henke, Andrea L.; Stottler, Richard H.
1994-01-01
Planning and scheduling of NASA Space Shuttle missions is a complex, labor-intensive process requiring the expertise of experienced mission planners. We have developed a planning and scheduling system using combinations of artificial intelligence knowledge representations and planning techniques to capture mission planning knowledge and automate the multi-mission planning process. Our integrated object oriented and rule-based approach reduces planning time by orders of magnitude and provides planners with the flexibility to easily modify planning knowledge and constraints without requiring programming expertise.
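The combination of object-oriented representation and rule-based constraints can be pictured as follows; this is an illustrative toy, not the actual system: activities are objects, and scheduling rules are pluggable predicates a greedy planner checks before placing each activity, so planners can change the rules without touching the scheduler.

```python
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    duration: int       # hours
    crew_needed: int

def crew_available(act, start, schedule, crew_limit=3):
    """Rule: concurrent activities may not exceed the crew limit."""
    used = sum(a.crew_needed for a, s in schedule
               if s < start + act.duration and start < s + a.duration)
    return used + act.crew_needed <= crew_limit

rules = [crew_available]  # further rule predicates would be appended here

def greedy_schedule(activities, horizon=48):
    schedule = []
    for act in activities:
        # Place each activity at the earliest start satisfying every rule;
        # activities with no feasible start are silently skipped in this toy.
        for start in range(horizon):
            if all(rule(act, start, schedule) for rule in rules):
                schedule.append((act, start))
                break
    return schedule

acts = [Activity("deploy", 4, 2), Activity("experiment", 6, 2), Activity("eva", 3, 3)]
for act, start in greedy_schedule(acts):
    print(f"{act.name}: t+{start}h")
```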
NASA Astrophysics Data System (ADS)
Spoelstra, Paul; Djakow, Eugen; Homberg, Werner
2017-10-01
The production of complex organic shapes in sheet metal is gaining importance in the food industry due to increasing functional and hygienic demands. Hence it is necessary to produce parts with complex geometries that promote cleanability and general sanitation, leading to improved food safety. In this context, and especially when stainless steel has to be formed into highly complex geometries while maintaining desired surface properties, it is inevitable that alternative manufacturing processes will need to be used which meet these requirements. Rubber pad forming offers high potential when it comes to shaping complex parts with excellent surface quality, with virtually no tool marks and scratches. Especially in cases where only small series are to be produced, rubber pad forming offers both technological and economic advantages. Due to the flexible punch, variations in metal thickness can be accommodated with the same forming tool. The investment to set up rubber pad forming is low in comparison with conventional sheet metal forming processes. The process facilitates production of shallow sheet metal parts with complex contours and bends. Different bending sequences in a multiple-tool set-up can also be conducted. This contribution thus gives a brief overview of rubber pad technology. It presents the prototype rubber pad forming machine, which can be used to form complex part geometries made from stainless steel (1.4301). Based on an analysis of already existing systems and new machines for rubber pad forming processes, together with their process properties, influencing variables and areas of application, some relevant parts for the food industry are presented.
Ordinal optimization and its application to complex deterministic problems
NASA Astrophysics Data System (ADS)
Yang, Mike Shang-Yu
1998-10-01
We present in this thesis a new perspective for approaching a general class of optimization problems characterized by large deterministic complexities. Many problems of real-world concern today lack analyzable structures and almost always involve a high level of difficulty and complexity in the evaluation process. Advances in computer technology allow us to build computer models to simulate the evaluation process through numerical means, but the burden of high complexity remains, taxing the simulation with an exorbitant computing cost for each evaluation. Such a resource requirement makes local fine-tuning of a known design difficult under most circumstances, let alone global optimization. The Kolmogorov equivalence of complexity and randomness in computation theory is introduced to resolve this difficulty by converting the complex deterministic model to a stochastic pseudo-model composed of a simple deterministic component and a white-noise-like stochastic term. The resulting randomness is then dealt with by a noise-robust approach called Ordinal Optimization. Ordinal Optimization utilizes Goal Softening and Ordinal Comparison to achieve an efficient and quantifiable selection of designs in the initial search process. The approach is substantiated by a case study of the turbine blade manufacturing process. The problem involves the optimization of the manufacturing process of the integrally bladed rotor in the turbine engines of U.S. Air Force fighter jets. The intertwining interactions among the material, thermomechanical, and geometrical changes make the current FEM approach prohibitively uneconomical in the optimization process. The generalized OO approach to complex deterministic problems is applied here with great success. Empirical results indicate a saving of nearly 95% in the computing cost.
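Goal softening and ordinal comparison are easy to illustrate numerically: evaluate every design once with a cheap, noisy model, keep the observed top s, and rely on order being far more robust to noise than value. In this hypothetical check (all sizes and noise levels invented), the selected set reliably intersects the true top g despite large evaluation noise.

```python
import numpy as np

rng = np.random.default_rng(1)

n_designs = 1000
true_perf = rng.uniform(0, 100, n_designs)  # unknown in practice
noise_sd = 15.0                             # crude-model evaluation error

# One cheap, noisy evaluation per design (ordinal comparison).
observed = true_perf + rng.normal(0, noise_sd, n_designs)

g, s = 50, 50                               # "good enough" and selected set sizes
true_top = set(np.argsort(true_perf)[-g:])  # true top-g, for checking only
selected = set(np.argsort(observed)[-s:])   # observed top-s (goal softening)

overlap = len(true_top & selected)
print(f"selected set contains {overlap} of the true top {g} designs")
```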
Dhungel, Nripesh; Hopper, Anita K.
2012-01-01
Pre-tRNA splicing is an essential process in all eukaryotes. In yeast and vertebrates, the enzyme catalyzing intron removal from pre-tRNA is a heterotetrameric complex (splicing endonuclease [SEN] complex). Although the SEN complex is conserved, the subcellular location where pre-tRNA splicing occurs is not. In yeast, the SEN complex is located at the cytoplasmic surface of mitochondria, whereas in vertebrates, pre-tRNA splicing is nuclear. We engineered yeast to mimic the vertebrate cell biology and demonstrate that all three steps of pre-tRNA splicing, as well as tRNA nuclear export and aminoacylation, occur efficiently when the SEN complex is nuclear. However, nuclear pre-tRNA splicing fails to complement growth defects of cells with defective mitochondrial-located splicing, suggesting that the yeast SEN complex surprisingly serves a novel and essential function in the cytoplasm that is unrelated to tRNA splicing. The novel function requires all four SEN complex subunits and the catalytic core. A subset of pre-rRNAs accumulates when the SEN complex is restricted to the nucleus, indicating that the SEN complex moonlights in rRNA processing. Thus, findings suggest that selection for the subcellular distribution of the SEN complex may reside not in its canonical, but rather in a novel, activity. PMID:22391451
ERIC Educational Resources Information Center
Rodicio, Hector Garcia; Sanchez, Emilio; Acuna, Santiago R.
2013-01-01
Acquiring complex conceptual knowledge requires learners to self-regulate their learning by planning, monitoring, and adjusting the process but they find it difficult to do so. In one experiment, we examined whether learners need broad systems of support for self-regulation or whether they are also able to learn with more economical support…
On-board multispectral classification study
NASA Technical Reports Server (NTRS)
Ewalt, D.
1979-01-01
The factors relating to onboard multispectral classification were investigated. The functions implemented in ground-based processing systems for current Earth observation sensors were reviewed. The Multispectral Scanner, Thematic Mapper, Return Beam Vidicon, and Heat Capacity Mapper were studied. The concept of classification was reviewed and extended from the ground-based image processing functions to an onboard system capable of multispectral classification. Eight different onboard configurations, each with varying amounts of ground-spacecraft interaction, were evaluated. Each configuration was evaluated in terms of turnaround time, onboard processing and storage requirements, geometric and classification accuracy, onboard complexity, and ancillary data required from the ground.
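As a flavor of what onboard multispectral classification computes per pixel, here is a minimal nearest-class-mean (minimum-distance) rule over band vectors, a classic low-complexity classifier for multispectral imagery; the band values and classes below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical 4-band class means (e.g., water, vegetation, soil).
class_means = np.array([
    [20.0, 15.0, 10.0,  5.0],   # water
    [30.0, 25.0, 60.0, 80.0],   # vegetation
    [50.0, 55.0, 58.0, 60.0],   # soil
])

def classify(pixels, means):
    """Nearest-class-mean rule: label = argmin_k ||pixel - mean_k||."""
    d = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
    return d.argmin(axis=1)

# Simulated scene: 100 pixels drawn around the class means plus sensor noise.
scene = class_means[rng.integers(0, 3, 100)] + rng.normal(0, 3, (100, 4))
labels = classify(scene, class_means)
print(np.bincount(labels))  # pixel count assigned to each class
```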
NASA Astrophysics Data System (ADS)
Taylor, John R.; Stolz, Christopher J.
1993-08-01
Laser system performance and reliability depends on the related performance and reliability of the optical components which define the cavity and transport subsystems. High-average-power and long transport lengths impose specific requirements on component performance. The complexity of the manufacturing process for optical components requires a high degree of process control and verification. Qualification has proven effective in ensuring confidence in the procurement process for these optical components. Issues related to component reliability have been studied and provide useful information to better understand the long term performance and reliability of the laser system.
BAG3 regulates formation of the SNARE complex and insulin secretion
Iorio, V; Festa, M; Rosati, A; Hahne, M; Tiberti, C; Capunzo, M; De Laurenzi, V; Turco, M C
2015-01-01
Insulin release in response to glucose stimulation requires exocytosis of insulin-containing granules. Glucose stimulation of beta cells leads to focal adhesion kinase (FAK) phosphorylation, which acts on the Rho family proteins (Rho, Rac and Cdc42) that direct F-actin remodeling. This process requires docking and fusion of secretory vesicles to the release sites at the plasma membrane and is a complex mechanism that is mediated by SNAREs. This transiently disrupts the F-actin barrier and allows the redistribution of the insulin-containing granules to more peripheral regions of the β cell, hence facilitating insulin secretion. In this manuscript, we show for the first time that BAG3 plays an important role in this process. We show that BAG3 downregulation results in increased insulin secretion in response to glucose stimulation and in disruption of the F-actin network. Moreover, we show that BAG3 binds to SNAP-25 and syntaxin-1, two components of the t-SNARE complex preventing the interaction between SNAP-25 and syntaxin-1. Upon glucose stimulation BAG3 is phosphorylated by FAK and dissociates from SNAP-25 allowing the formation of the SNARE complex, destabilization of the F-actin network and insulin release. PMID:25766323
Activation of HIV-1 pre-mRNA 3' processing in vitro requires both an upstream element and TAR.
Gilmartin, G M; Fleming, E S; Oetjen, J
1992-01-01
The architecture of the human immunodeficiency virus type 1 (HIV-1) genome presents an intriguing dilemma for the 3' processing of viral transcripts: to disregard a canonical 'core' poly(A) site processing signal present at the 5' end of the transcript and yet to utilize efficiently an identical signal that resides at the 3' end of the message. The choice of processing sites in HIV-1 appears to be influenced by two factors: (i) proximity to the cap site, and (ii) sequences upstream of the core poly(A) site. We now demonstrate that an in vivo-defined upstream element that resides within the U3 region, 76 nucleotides upstream of the AAUAAA hexamer, acts specifically to enhance 3' processing at the HIV-1 core poly(A) site in vitro. We furthermore show that efficient in vitro 3' processing requires the RNA stem-loop structure of TAR, which serves to juxtapose spatially the upstream element and the core poly(A) site. An analysis of the stability of 3' processing complexes formed at the HIV-1 poly(A) site in vitro suggests that the upstream element may function by increasing processing complex stability at the core poly(A) site. PMID:1425577
Autophagy in C. elegans development.
Palmisano, Nicholas J; Meléndez, Alicia
2018-04-27
Autophagy involves the sequestration of cytoplasmic contents in a double-membrane structure referred to as the autophagosome and the degradation of its contents upon delivery to lysosomes. Autophagy activity has a role in multiple biological processes during the development of the nematode Caenorhabditis elegans. Basal levels of autophagy are required to remove aggregate-prone proteins, paternal mitochondria, and spermatid-specific membranous organelles. During larval development, autophagy is required for the remodeling that occurs during dauer development, and autophagy can selectively degrade components of the miRNA-induced silencing complex and modulate miRNA-mediated silencing. Basal levels of autophagy are important in synapse formation and in the germ line, where they promote the proliferation of stem cells. Autophagy activity is also required for the efficient removal of apoptotic cell corpses by promoting phagosome maturation. Finally, autophagy is also involved in lipid homeostasis and in the aging process. In this review, we first describe the molecular complexes involved in the process of autophagy, its regulation, and mechanisms for cargo recognition. In the second section, we discuss the developmental contexts where autophagy has been shown to be important. Studies in C. elegans provide valuable insights into the physiological relevance of this process during metazoan development.
Hot melt extrusion of ion-exchange resin for taste masking.
Tan, David Cheng Thiam; Ong, Jeremy Jianming; Gokhale, Rajeev; Heng, Paul Wan Sia
2018-05-30
Taste masking is important for some unpleasant-tasting bioactives in oral dosage forms. Among the many methods available for taste masking, use of ion-exchange resin (IER) holds promise. IER combined with hot melt extrusion (HME) may offer additional advantages over solvent methods. IER provides taste masking by complexing with the drug ions and preventing drug dissolution in the mouth. The drug-IER complexation approaches described in the literature are based mainly on batch processing or column elution. These methods of drug-IER complexation have obvious limitations, such as high solvent volume requirements, multiple processing steps and extended processing times. Thus, the objective of this study was to develop a single-step, solvent-free, continuous HME process for drug-IER complexation. The screening study evaluated the drug-to-IER ratio, types of IER and drug complexation methods. In the screening study, a potassium salt of a weakly acidic carboxylate-based cationic IER was found suitable for the HME method. Thereafter, an optimization study was conducted by varying HME process parameters such as screw speed, extrusion temperature and drug-to-IER ratio. It was observed that extrusion temperature and drug-to-IER ratio are critical to drug-IER complexation through HME. In summary, this study has established the feasibility of a continuous drug-IER complexation method using HME for taste masking.
Positive Disintegration as a Process of Symmetry Breaking.
Laycraft, Krystyna
2017-04-01
This article presents an analysis of positive disintegration as a process of symmetry breaking. Symmetry breaking plays a major role in self-organized pattern formation and correlates directly with increasing complexity and functional specialization. According to Dabrowski, the creator of the Theory of Positive Disintegration, the change from lower to higher levels of human development requires a major restructuring of an individual's psychological makeup. Each level of human development is a relatively stable and coherent configuration of emotional-cognitive patterns called developmental dynamisms. Their main function is to restructure the mental structure by breaking the symmetry of a low level and bringing differentiation, and then integration, at higher levels. Positive disintegration is then the process of transition from a lower level of high symmetry and low complexity to higher levels of low symmetry and high complexity of mental structure.
48 CFR 915.404-4-71-4 - Considerations affecting fee amounts.
Code of Federal Regulations, 2012 CFR
2012-10-01
...—Manufacturing plants involving operations requiring a high degree of design layout or process control; nuclear reactors; atomic particle accelerators; complex laboratories or industrial units especially designed for...
48 CFR 915.404-4-71-4 - Considerations affecting fee amounts.
Code of Federal Regulations, 2014 CFR
2014-10-01
...—Manufacturing plants involving operations requiring a high degree of design layout or process control; nuclear reactors; atomic particle accelerators; complex laboratories or industrial units especially designed for...
48 CFR 915.404-4-71-4 - Considerations affecting fee amounts.
Code of Federal Regulations, 2010 CFR
2010-10-01
...—Manufacturing plants involving operations requiring a high degree of design layout or process control; nuclear reactors; atomic particle accelerators; complex laboratories or industrial units especially designed for...
48 CFR 915.404-4-71-4 - Considerations affecting fee amounts.
Code of Federal Regulations, 2011 CFR
2011-10-01
...—Manufacturing plants involving operations requiring a high degree of design layout or process control; nuclear reactors; atomic particle accelerators; complex laboratories or industrial units especially designed for...
48 CFR 915.404-4-71-4 - Considerations affecting fee amounts.
Code of Federal Regulations, 2013 CFR
2013-10-01
...—Manufacturing plants involving operations requiring a high degree of design layout or process control; nuclear reactors; atomic particle accelerators; complex laboratories or industrial units especially designed for...
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.
1995-01-01
Computing architectures are being assembled that extend concurrent engineering practices by providing more efficient execution and collaboration on distributed, heterogeneous computing networks. Built on the successes of initial architectures, requirements for a next-generation design computing infrastructure can be developed. These requirements concentrate on those needed by a designer in decision-making processes from product conception to recycling and can be categorized in two areas: design process and design information management. A designer both designs and executes design processes throughout design time to achieve better product and process capabilities while expending fewer resources. In order to accomplish this, information, or more appropriately design knowledge, needs to be adequately managed during product and process decomposition as well as recomposition. A foundation has been laid that captures these requirements in a design architecture called DREAMS (Developing Robust Engineering Analysis Models and Specifications). In addition, a computing infrastructure, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment), is being developed that satisfies design requirements defined in DREAMS and incorporates enabling computational technologies.
Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N A; Bin Zaheer, Kashif
2015-01-01
Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of a VBS system is associated with a concrete set of valuable requirements, and valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of VBS systems; however, their focus on valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for VBS systems. The existing approaches are time-consuming, complex, and inconsistent, which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide low-level implementation details for SIQ initiation or stakeholder metrics for quantification. Hence, in view of the existing SIQ problems, this research contributes a new SIQ framework called 'StakeMeter'. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria, and an application procedure, unlike the other methods. The proposed framework addresses the issues of stakeholder quantification and prioritization, high time consumption, complexity, and difficult process initiation. The framework helps in the selection of highly critical stakeholders for VBS systems with less judgmental error.
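As an illustration of metric-based stakeholder quantification in the spirit of StakeMeter, here is a minimal sketch; the attribute names, weights, and ratings are hypothetical, not the paper's actual metrics.

```python
# Hypothetical sketch of attribute-based stakeholder quantification:
# each stakeholder is scored on weighted attributes and ranked, so the
# most critical stakeholders can be selected for requirements elicitation.

ATTRIBUTE_WEIGHTS = {          # assumed weights, not StakeMeter's actual values
    "interest": 0.25,
    "influence": 0.30,
    "domain_knowledge": 0.25,
    "availability": 0.20,
}

def stake_score(ratings: dict) -> float:
    """Weighted sum of attribute ratings (each rated 0-10)."""
    return sum(ATTRIBUTE_WEIGHTS[a] * ratings[a] for a in ATTRIBUTE_WEIGHTS)

stakeholders = {
    "product_owner": {"interest": 9, "influence": 9, "domain_knowledge": 8, "availability": 6},
    "end_user":      {"interest": 8, "influence": 4, "domain_knowledge": 6, "availability": 7},
    "regulator":     {"interest": 3, "influence": 8, "domain_knowledge": 5, "availability": 2},
}

ranked = sorted(stakeholders, key=lambda s: stake_score(stakeholders[s]), reverse=True)
for name in ranked:
    print(f"{name}: {stake_score(stakeholders[name]):.2f}")
```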
Cex1p facilitates Rna1p-mediated dissociation of the Los1p-tRNA-Gsp1p-GTP export complex.
McGuire, Andrew T; Mangroo, Dev
2012-02-01
Nuclear tRNA export plays an essential role in key cellular processes such as regulation of protein synthesis, cell cycle progression, response to nutrient availability and DNA damage and development. Like other nuclear export processes, assembly of the nuclear tRNA export complex in the nucleus is dependent on Ran-GTP/Gsp1p-GTP, and dissociation of the export receptor-tRNA-Ran-GTP/Gsp1p-GTP complex in the cytoplasm requires RanBP1/Yrb1p and RanGAP/Rna1p to activate the GTPase activity of Ran-GTP/Gsp1p-GTP. The Saccharomyces cerevisiae Cex1p and Human Scyl1 have also been proposed to participate in unloading of the tRNA export receptors at the cytoplasmic face of the nuclear pore complex (NPC). Here, we provide evidence suggesting that Cex1p is required for activation of the GTPase activity of Gsp1p and dissociation of the receptor-tRNA-Gsp1p export complex in S. cerevisiae. The data suggest that Cex1p recruits Rna1p from the cytoplasm to the NPC and facilitates Rna1p activation of the GTPase activity of Gsp1p by enabling Rna1p to gain access to Gsp1p-GTP bound to the export receptor tRNA complex. It is possible that this tRNA unloading mechanism is conserved in evolutionarily diverse organisms and that other Gsp1p-GTP-dependent export processes use a pathway-specific component to recruit Rna1p to the NPC. © 2011 John Wiley & Sons A/S.
Love, Tracy; Haist, Frank; Nicol, Janet; Swinney, David
2009-01-01
Using functional magnetic resonance imaging (fMRI), this study directly examined an issue that bridges the potential language processing and multi-modal views of the role of Broca's area: the effects of task demands in language comprehension studies. We presented syntactically simple and complex sentences for auditory comprehension under three different (differentially complex) task-demand conditions: passive listening, probe verification, and theme judgment. Contrary to many language imaging findings, we found that both simple and complex syntactic structures activated left inferior frontal cortex (L-IFC). Critically, we found that activation in these frontal regions increased together with increased task demands. Specifically, tasks that required greater manipulation and comparison of linguistic material recruited L-IFC more strongly, independent of syntactic structure complexity. We argue that much of the presumed syntactic effects previously found in sentence imaging studies of L-IFC may, among other things, reflect the tasks employed in these studies and that L-IFC is a region underlying mnemonic and other integrative functions, on which much language processing may rely. PMID:16881268
Foundations for Streaming Model Transformations by Complex Event Processing.
Dávid, István; Ráth, István; Varró, Dániel
2018-01-01
Streaming model transformations represent a novel class of transformations to manipulate models whose elements are continuously produced or modified in high volume and with a rapid rate of change. Executing streaming transformations requires efficient techniques to recognize activated transformation rules over a live model and a potentially infinite stream of events. In this paper, we propose foundations of streaming model transformations by innovatively integrating incremental model query, complex event processing (CEP) and reactive (event-driven) transformation techniques. Complex event processing allows the identification of relevant patterns and sequences of events over an event stream. Our approach enables event streams to include model change events, which are automatically and continuously populated by incremental model queries. Furthermore, a reactive rule engine carries out transformations on identified complex event patterns. We provide an integrated domain-specific language with precise semantics for capturing complex event patterns and streaming transformations, together with an execution engine, all of which is now part of the Viatra reactive transformation framework. We demonstrate the feasibility of our approach with two case studies: one in an advanced model engineering workflow, and one in the context of on-the-fly gesture recognition.
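A toy sketch of the CEP idea follows, matching a compound pattern over a stream of model-change events; this is illustrative only, as Viatra's actual pattern language and engine are Java-based and far richer.

```python
# Toy complex-event-processing sketch: detect a "move" pattern, i.e. a
# DELETE of an element followed by a CREATE of the same element within
# a time window, over a stream of model-change events.
from collections import namedtuple

Event = namedtuple("Event", "kind element time")  # kind: CREATE / DELETE

def detect_moves(stream, window=5.0):
    pending_deletes = {}            # element -> time of its DELETE event
    for ev in stream:
        if ev.kind == "DELETE":
            pending_deletes[ev.element] = ev.time
        elif ev.kind == "CREATE" and ev.element in pending_deletes:
            if ev.time - pending_deletes.pop(ev.element) <= window:
                yield ("MOVE", ev.element, ev.time)   # derived complex event

events = [Event("DELETE", "node1", 0.0), Event("CREATE", "node2", 1.0),
          Event("CREATE", "node1", 2.5)]
print(list(detect_moves(events)))   # [('MOVE', 'node1', 2.5)]
```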
Paperless clinical trials: Myth or reality?
Gupta, Sandeep K.
2015-01-01
There is an urgent need to expedite the time-to-market for new drugs and to make the approval process simpler. But clinical trials are a complex process, and the increased complexity leads to decreased efficiency. Hence, pharmaceutical organizations want to move toward a more technology-driven clinical trial process for recording, analyzing, reporting, archiving, and so on. In recent times, progress has certainly been made in developing paperless systems that improve data capture and management. The adoption of paperless processes may require major changes to existing procedures. But this is in the best interests of these organizations to remain competitive, because a paperless clinical trial would lead to a consistent and streamlined framework. Moreover, all major regulatory authorities also advocate adoption of paperless trials. But challenges still remain in implementing a paperless clinical trial process. PMID:26288464
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wynne, Adam S.
2011-05-05
In many application domains in science and engineering, data produced by sensors, instruments and networks is naturally processed by software applications structured as a pipeline. Pipelines comprise a sequence of software components that progressively process discrete units of data to produce a desired outcome. For example, in a Web crawler that is extracting semantics from text on Web sites, the first stage in the pipeline might be to remove all HTML tags to leave only the raw text of the document. The second step may parse the raw text to break it down into its constituent grammatical parts, such as nouns, verbs and so on. Subsequent steps may look for names of people or places, interesting events or times so documents can be sequenced on a time line. Each of these steps can be written as a specialized program that works in isolation from other steps in the pipeline. In many applications, simple linear software pipelines are sufficient. However, more complex applications require topologies that contain forks and joins, creating pipelines comprising branches where parallel execution is desirable. It is also increasingly common for pipelines to process very large files or high volume data streams, which impose end-to-end performance constraints. Additionally, processes in a pipeline may have specific execution requirements and hence need to be distributed as services across a heterogeneous computing and data management infrastructure. From a software engineering perspective, these more complex pipelines become problematic to implement. While simple linear pipelines can be built using minimal infrastructure such as scripting languages, complex topologies and large, high volume data processing require suitable abstractions, run-time infrastructures and development tools to construct pipelines with the desired qualities-of-service and flexibility to evolve to handle new requirements. The above summarizes the reasons we created the MeDICi Integration Framework (MIF), which is designed for creating high-performance, scalable and modifiable software pipelines. MIF exploits a low friction, robust, open source middleware platform and extends it with component and service-based programmatic interfaces that make implementing complex pipelines simple. The MIF run-time automatically handles queues between pipeline elements in order to handle request bursts, and automatically executes multiple instances of pipeline elements to increase pipeline throughput. Distributed pipeline elements are supported using a range of configurable communications protocols, and the MIF interfaces provide efficient mechanisms for moving data directly between two distributed pipeline elements.
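A minimal sketch of the queued-pipeline idea described above, with decoupled stages and queues absorbing request bursts; this is an illustration, not the actual MeDICi/MIF API.

```python
# Two-stage text pipeline: strip HTML tags, then tokenize. Each stage runs
# in its own thread and communicates through a queue, so bursts of input
# accumulate in queues rather than blocking upstream stages.
import queue, threading, re

def stage(worker, inbox, outbox):
    def run():
        while True:
            item = inbox.get()
            if item is None:              # poison pill shuts the stage down
                outbox.put(None)
                break
            outbox.put(worker(item))
    threading.Thread(target=run, daemon=True).start()

strip_tags = lambda doc: re.sub(r"<[^>]+>", "", doc)        # stage 1
tokenize   = lambda text: text.split()                      # stage 2

q_in, q_mid, q_out = queue.Queue(), queue.Queue(), queue.Queue()
stage(strip_tags, q_in, q_mid)
stage(tokenize, q_mid, q_out)

q_in.put("<html><body>Paris hosted the event</body></html>")
q_in.put(None)
for item in iter(q_out.get, None):
    print(item)        # ['Paris', 'hosted', 'the', 'event']
```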
Relating geomorphic change and grazing to avian communities in riparian forests
Scott, M.L.; Skagen, S.K.; Merligliano, M.F.
2003-01-01
Avian conservation in riparian or bottomland forests requires an understanding of the physical and biotic factors that sustain the structural complexity of riparian vegetation. Riparian forests of western North America are dependent upon flow-related geomorphic processes necessary for establishment of new cottonwood and willow patches. In June 1995, we examined how fluvial geomorphic processes and long-term grazing influence the structural complexity of riparian vegetation and the abundance and diversity of breeding birds along the upper Missouri River in central Montana, a large, flow-regulated, and geomorphically constrained reach. Use by breeding birds was linked to fluvial geomorphic processes that influence the structure of these patches. Species richness and bird diversity increased with increasing structural complexity of vegetation (F(1,32) = 75.49, p < 0.0001; F(1,32) = 79.76, p < 0.0001, respectively). Bird species composition was significantly correlated with vegetation strata diversity (rs(33) = 0.98, p < 0.0001). Bird abundance in canopy and tall-shrub foraging guilds increased significantly with increasing tree cover and tall-shrub cover (F(1,22) = 34.68, p < 0.0001; F(1,20) = 22.22, p < 0.0001, respectively). Seventeen bird species, including five species of concern (e.g., Red-eyed Vireo [Vireo olivaceus]), were significantly associated (p < 0.10) with structurally complex forest patches, whereas only six bird species were significantly associated with structurally simple forest patches. We related the structural complexity of 34 riparian vegetation patches to geomorphic change, woody vegetation establishment, and grazing history over a 35-year post-dam period (1953–1988). The structural complexity of habitat patches was positively related to recent sediment accretion (t(33) = 3.31, p = 0.002) and vegetation establishment (t(20.7) = −3.63, p = 0.002) and negatively related to grazing activity (t(19.6) = 3.75, p = 0.001). Avian conservation along rivers like the upper Missouri requires maintenance of the geomorphic processes responsible for tree establishment and management of land-use activities in riparian forests.
Heffernan, Kayla Joanne; Chang, Shanton; Maclean, Skye Tamara; Callegari, Emma Teresa; Garland, Suzanne Marie; Reavley, Nicola Jane; Varigos, George Andrew; Wark, John Dennis
2016-02-09
The now ubiquitous catchphrase, "There's an app for that," rings true owing to the growing number of mobile phone apps. In excess of 97,000 eHealth apps are available in major app stores. Yet the effectiveness of these apps varies greatly. While a minority of apps are developed grounded in theory and in conjunction with health care experts, the vast majority are not. This is concerning given the Hippocratic notion of "do no harm." There is currently no unified formal theory for developing interactive eHealth apps, and development is especially difficult when complex messaging is required, such as in health promotion and prevention. This paper aims to provide insight into the creation of interactive eHealth apps for complex messaging, by leveraging the Safe-D case study, which involved complex messaging required to guide safe but sufficient UV exposure for vitamin D synthesis in users. We aim to create recommendations for developing interactive eHealth apps for complex messages based on the lessons learned during Safe-D app development. For this case study we developed an Apple and Android app, both named Safe-D, to safely improve vitamin D status in young women through encouraging safe ultraviolet radiation exposure. The app was developed through participatory action research involving medical and human computer interaction researchers, subject matter expert clinicians, external developers, and target users. The recommendations for development were created from analysis of the development process. By working with clinicians and implementing disparate design examples from the literature, we developed the Safe-D app. From this development process, recommendations for developing interactive eHealth apps for complex messaging were created: (1) involve a multidisciplinary team in the development process, (2) manage complex messages to engage users, and (3) design for interactivity (tailor recommendations, remove barriers to use, design for simplicity). This research has provided principles for developing interactive eHealth apps for complex messaging as guidelines by aggregating existing design concepts and expanding these concepts and new learnings from our development process. A set of guidelines to develop interactive eHealth apps generally, and specifically those for complex messaging, was previously missing from the literature; this research has contributed these principles. Safe-D delivers complex messaging simply, to aid education, and explicitly, considering user safety.
Unstructured mesh algorithms for aerodynamic calculations
NASA Technical Reports Server (NTRS)
Mavriplis, D. J.
1992-01-01
The use of unstructured mesh techniques for solving complex aerodynamic flows is discussed. The principal advantages of unstructured mesh strategies, as they relate to complex geometries, adaptive meshing capabilities, and parallel processing, are emphasized. The various aspects required for the efficient and accurate solution of aerodynamic flows are addressed. These include mesh generation, mesh adaptivity, solution algorithms, convergence acceleration, and turbulence modeling. Computations of viscous turbulent two-dimensional flows and inviscid three-dimensional flows about complex configurations are demonstrated. Remaining obstacles and directions for future research are also outlined.
Information processing using a single dynamical node as complex system
Appeltant, L.; Soriano, M.C.; Van der Sande, G.; Danckaert, J.; Massar, S.; Dambre, J.; Schrauwen, B.; Mirasso, C.R.; Fischer, I.
2011-01-01
Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can efficiently perform computation. Here we introduce a novel architecture that reduces the usually required large number of elements to a single nonlinear node with delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance in a speech recognition benchmark. Complementary numerical studies also show excellent performance for a time series prediction benchmark. These results prove that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way to feasible and resource-efficient technological implementations of reservoir computing. PMID:21915110
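To make the delay-based scheme concrete, here is a minimal numerical sketch with assumed parameter values (not the paper's electronic implementation): a single tanh node, driven through a fixed random input mask, emulates N virtual nodes along one delay loop, and a least-squares linear readout is trained on the collected states.

```python
# Sketch of reservoir computing with a single nonlinear node and delayed
# feedback; virtual nodes are emulated by time multiplexing.
import numpy as np

rng = np.random.default_rng(0)
N, alpha, beta = 50, 0.5, 0.5           # virtual nodes, feedback/input scaling
mask = rng.choice([-1.0, 1.0], size=N)  # fixed random input mask

def reservoir_states(u):
    """Drive the delay node with input sequence u; return the state matrix."""
    x = np.zeros(N)                     # node states along one delay loop
    states = []
    for u_t in u:
        for i in range(N):              # sequential update of virtual nodes
            x[i] = np.tanh(alpha * x[i] + beta * mask[i] * u_t)
        states.append(x.copy())
    return np.array(states)

# Toy task: reproduce the input delayed by one step.
u = rng.uniform(-1, 1, 500)
X, y = reservoir_states(u)[1:], u[:-1]
w = np.linalg.lstsq(X, y, rcond=None)[0]   # linear readout (least squares)
print("train MSE:", np.mean((X @ w - y) ** 2))
```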
Belger, A; Banich, M T
1998-07-01
Because interaction of the cerebral hemispheres has been found to aid task performance under demanding conditions, the present study examined how this effect is moderated by computational complexity, the degree of lateralization for a task, and individual differences in asymmetric hemispheric activation (AHA). Computational complexity was manipulated across tasks either by increasing the number of inputs to be processed or by increasing the number of steps to a decision. Comparison of within- and across-hemisphere trials indicated that the size of the between-hemisphere advantage increased as a function of task complexity, except for a highly lateralized rhyme decision task that can only be performed by the left hemisphere. Measures of individual differences in AHA revealed that when task demands and an individual's AHA both load on the same hemisphere, the ability to divide the processing between the hemispheres is limited. Thus, interhemispheric division of processing improves performance at higher levels of computational complexity only when the required operations can be divided between the hemispheres.
Packet communications in satellites with multiple-beam antennas and signal processing
NASA Technical Reports Server (NTRS)
Davies, R.; Chethik, F.; Penick, M.
1980-01-01
A communication satellite with a multiple-beam antenna and onboard signal processing is considered for use in a 'message-switched' data relay system. The signal processor may incorporate demodulation, routing, storage, and remodulation of the data. A system user model is established and key functional elements for the signal processing are identified. With the throughput and delay requirements as the controlled variables, the hardware complexity, operational discipline, occupied bandwidth, and overall user end-to-end cost are estimated for (1) random-access packet switching and (2) reservation-access packet switching. Other aspects of this network (e.g., the adaptability to channel-switched traffic requirements) are examined. For the given requirements and constraints, the reservation system appears to be the most attractive protocol.
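For orientation on the random-access option, the classical ALOHA throughput results are given below; these are textbook background, not results of this study.

```latex
% Throughput S of random-access channels with Poisson offered load G:
S_{\mathrm{pure}} = G\,e^{-2G} \quad (\max S \approx 0.184),
\qquad
S_{\mathrm{slotted}} = G\,e^{-G} \quad (\max S \approx 0.368).
% The modest peak utilization of random access is one reason a
% reservation-access protocol can be preferable at high throughput.
```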
"Assessment Drives Learning": Do Assessments Promote High-Level Cognitive Processing?
ERIC Educational Resources Information Center
Bezuidenhout, M. J.; Alt, H.
2011-01-01
Students tend to learn in the way they know, or think, they will be assessed. Therefore, to ensure deep, meaningful learning, assessments must be geared to promote cognitive processing that requires complex, contextualised thinking to construct meaning and create knowledge. Bloom's taxonomy of cognitive levels is used worldwide to assist in…
Neural Evidence of Allophonic Perception in Children at Risk for Dyslexia
ERIC Educational Resources Information Center
Noordenbos, M. W.; Segers, E.; Serniclaes, W.; Mitterer, H.; Verhoeven, L.
2012-01-01
Learning to read is a complex process that develops normally in the majority of children and requires the mapping of graphemes to their corresponding phonemes. Problems with the mapping process nevertheless occur in about 5% of the population and are typically attributed to poor phonological representations, which are--in turn--attributed to…
This paper reviews the controls on aeolian processes and their consequences at plant-interspace, patch-landscape, and regional-global scales. Based on this review, we define the requirements for a cross-scale model of wind erosion in structurally complex arid and semiarid ecosyst...
ERIC Educational Resources Information Center
Nelson-Erichsen, Jean; Erichsen, Heino R.
Many potential adoptive parents seeking international adoption find the process to be extremely complex. This guide details the international adoption process, including organizing a home study and fulfilling state requirements as well as selecting a country from which to adopt, working through emigration and immigration agencies, and traveling…
Intact Spectral but Abnormal Temporal Processing of Auditory Stimuli in Autism
ERIC Educational Resources Information Center
Groen, Wouter B.; van Orsouw, Linda; ter Huurne, Niels; Swinkels, Sophie; van der Gaag, Rutger-Jan; Buitelaar, Jan K.; Zwiers, Marcel P.
2009-01-01
The perceptual pattern in autism has been related to either a specific localized processing deficit or a pathway-independent, complexity-specific anomaly. We examined auditory perception in autism using an auditory disembedding task that required spectral and temporal integration. 23 children with high-functioning-autism and 23 matched controls…
Changes in Information Processing with Aging: Implications for Teaching Motor Skills.
ERIC Educational Resources Information Center
Anshel, Mark H.
Although there are marked individual differences in the effect of aging on learning and performing motor skills, there is agreement that humans process information less efficiently with advanced age. Significant decrements have been found specifically with motor tasks that are characterized as externally-paced, rapid, complex, and requiring rapid…
Integrally cored ceramic investment casting mold fabricated by ceramic stereolithography
NASA Astrophysics Data System (ADS)
Bae, Chang-Jun
Superalloy airfoils are produced by investment casting (IC), which uses ceramic cores and wax patterns with ceramic shell molds. Hollow cored superalloy airfoils in a gas turbine engine are an example of complex IC parts. The complex internal hollow cavities of the airfoil are designed to conduct cooling air through one or more passageways. These complex internal passageways have been fabricated by a lost wax process requiring several processing steps: core preparation, injection molding for the wax pattern, and a dipping process for the ceramic shell molds. These steps generate problems such as high cost and decreased accuracy of the ceramic mold. For example, costly tooling and production delay are required to produce mold dies for complex cores and wax patterns used in injection molding, a major obstacle for prototypes and smaller production runs. Rather than using separate cores, patterns, and shell molds, it would be advantageous to directly produce a mold that has the casting cavity and the ceramic core in one process. Ceramic stereolithography (CerSLA) can be used to directly fabricate the integrally cored ceramic casting mold (ICCM). CerSLA builds ceramic green objects from CAD files as many thin liquid layers of powder in monomer, which are solidified by polymerization with a UV laser, thereby "writing" the design for each slice. This dissertation addresses the integrally cored ceramic casting mold (ICCM), the ceramic core with a ceramic mold shell in a single patternless construction, fabricated by ceramic stereolithography (CerSLA). CerSLA is considered as an alternative method to replace lost wax processes, for small production runs or designs too complex for conventional cores and patterns. The main topic is the development of methods to successfully fabricate an ICCM by CerSLA from refractory silica, as well as related issues. The related issues are the segregation of coarse fused silica powders in a layer, the degree of segregation parameter to prevent segregation, and sintering and cristobalite transformation in fused silica compacts.
Processing of hemojuvelin requires retrograde trafficking to the Golgi in HepG2 cells.
Maxson, Julia E; Enns, Caroline A; Zhang, An-Sheng
2009-02-19
Hemojuvelin (HJV) was recently identified as a critical regulator of iron homeostasis. It is either associated with cell membranes through a glycosylphosphatidylinositol anchor or released as a soluble form. Membrane-anchored HJV acts as a coreceptor for bone morphogenetic proteins and activates the transcription of hepcidin, a hormone that regulates iron efflux from cells. Soluble HJV antagonizes bone morphogenetic protein signaling and suppresses hepcidin expression. In this study, we examined the trafficking and processing of HJV. Cellular HJV reached the plasma membrane without obtaining complex oligosaccharides, indicating that HJV avoided Golgi processing. Secreted HJV, in contrast, has complex oligosaccharides and can be derived from HJV with high-mannose oligosaccharides at the plasma membrane. Our results support a model in which retrograde trafficking of HJV before cleavage is the predominant processing pathway. Release of HJV requires it to bind to the transmembrane receptor neogenin. Neogenin does not, however, play a role in HJV trafficking to the cell surface, suggesting that it could be involved either in retrograde trafficking of HJV or in cleavage leading to HJV release.
TRBP recruits the Dicer complex to Ago2 for microRNA processing and gene silencing.
Chendrimada, Thimmaiah P; Gregory, Richard I; Kumaraswamy, Easwari; Norman, Jessica; Cooch, Neil; Nishikura, Kazuko; Shiekhattar, Ramin
2005-08-04
MicroRNAs (miRNAs) are generated by a two-step processing pathway to yield RNA molecules of approximately 22 nucleotides that negatively regulate target gene expression at the post-transcriptional level. Primary miRNAs are processed to precursor miRNAs (pre-miRNAs) by the Microprocessor complex. These pre-miRNAs are cleaved by the RNase III Dicer to generate mature miRNAs that direct the RNA-induced silencing complex (RISC) to messenger RNAs with complementary sequence. Here we show that TRBP (the human immunodeficiency virus transactivating response RNA-binding protein), which contains three double-stranded, RNA-binding domains, is an integral component of a Dicer-containing complex. Biochemical analysis of TRBP-containing complexes revealed the association of Dicer-TRBP with Argonaute 2 (Ago2), the catalytic engine of RISC. The physical association of Dicer-TRBP and Ago2 was confirmed after the isolation of the ternary complex using Flag-tagged Ago2 cell lines. In vitro reconstitution assays demonstrated that TRBP is required for the recruitment of Ago2 to the small interfering RNA (siRNA) bound by Dicer. Knockdown of TRBP results in destabilization of Dicer and a consequent loss of miRNA biogenesis. Finally, depletion of the Dicer-TRBP complex via exogenously introduced siRNAs diminished RISC-mediated reporter gene silencing. These results support a role of the Dicer-TRBP complex not only in miRNA processing but also as a platform for RISC assembly.
MPI Runtime Error Detection with MUST: Advances in Deadlock Detection
Hilbrich, Tobias; Protze, Joachim; Schulz, Martin; ...
2013-01-01
The widely used Message Passing Interface (MPI) is complex and rich. As a result, application developers require automated tools to avoid and to detect MPI programming errors. We present the Marmot Umpire Scalable Tool (MUST) that detects such errors with significantly increased scalability. We present improvements to our graph-based deadlock detection approach for MPI, which cover future MPI extensions. Our enhancements also check complex MPI constructs that no previous graph-based detection approach handled correctly. Finally, we present optimizations for the processing of MPI operations that reduce runtime deadlock detection overheads. Existing approaches often require O(p) analysis time per MPI operation, for p processes. We empirically observe that our improvements lead to sub-linear or better analysis time per operation for a wide range of real world applications.
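The graph-based deadlock detection mentioned above rests on cycle detection in a wait-for graph; a minimal sketch follows, covering only simple single-request graphs, whereas MUST's actual analysis handles MPI's much richer completion semantics.

```python
# Sketch of graph-based deadlock detection: build a wait-for graph from
# blocked operations and report a cycle via depth-first search.
def find_cycle(wait_for):
    """wait_for: dict mapping each process to the set of processes it waits on."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {p: WHITE for p in wait_for}
    stack = []

    def dfs(p):
        color[p] = GRAY
        stack.append(p)
        for q in wait_for.get(p, ()):
            if color.get(q, WHITE) == GRAY:            # back edge -> deadlock
                return stack[stack.index(q):] + [q]
            if color.get(q, WHITE) == WHITE:
                cycle = dfs(q)
                if cycle:
                    return cycle
        stack.pop()
        color[p] = BLACK
        return None

    for p in wait_for:
        if color[p] == WHITE:
            cycle = dfs(p)
            if cycle:
                return cycle
    return None

# Rank 0 blocks waiting for 1, 1 waits for 2, 2 waits for 0: a deadlock.
print(find_cycle({0: {1}, 1: {2}, 2: {0}}))   # [0, 1, 2, 0]
```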
Neural basis of processing threatening voices in a crowded auditory world
Mothes-Lasch, Martin; Becker, Michael P. I.; Miltner, Wolfgang H. R.
2016-01-01
In real world situations, we typically listen to voice prosody against a background crowded with auditory stimuli. Voices and background can both contain behaviorally relevant features and both can be selectively in the focus of attention. Adequate responses to threat-related voices under such conditions require that the brain unmixes reciprocally masked features depending on variable cognitive resources. It is unknown which brain systems instantiate the extraction of behaviorally relevant prosodic features under varying combinations of prosody valence, auditory background complexity and attentional focus. Here, we used event-related functional magnetic resonance imaging to investigate the effects of high background sound complexity and attentional focus on brain activation to angry and neutral prosody in humans. Results show that prosody effects in mid superior temporal cortex were gated by background complexity but not attention, while prosody effects in the amygdala and anterior superior temporal cortex were gated by attention but not background complexity, suggesting distinct emotional prosody processing limitations in different regions. Crucially, if attention was focused on the highly complex background, the differential processing of emotional prosody was prevented in all brain regions, suggesting that in a distracting, complex auditory world even threatening voices may go unnoticed. PMID:26884543
Novel physical constraints on implementation of computational processes
NASA Astrophysics Data System (ADS)
Wolpert, David; Kolchinsky, Artemy
Non-equilibrium statistical physics permits us to analyze computational processes, i.e., ways to drive a physical system such that its coarse-grained dynamics implements some desired map. It is now known how to implement any such desired computation without dissipating work, and what the minimal (dissipationless) work is that such a computation will require (the so-called generalized Landauer bound). We consider how these analyses change if we impose realistic constraints on the computational process. First, we analyze how many degrees of freedom of the system must be controlled, in addition to the ones specifying the information-bearing degrees of freedom, in order to avoid dissipating work during a given computation, when local detailed balance holds. We analyze this issue for deterministic computations, deriving a state-space vs. speed trade-off, and use our results to motivate a measure of the complexity of a computation. Second, we consider computations that are implemented with logic circuits, in which only a small number of degrees of freedom are coupled at a time. We show that the way a computation is implemented using circuits affects its minimal work requirements, and relate these minimal work requirements to information-theoretic measures of complexity.
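For reference, the Landauer bound referred to above can be written as follows; this is one standard statement, not necessarily the exact generalized form used in the abstract.

```latex
% Erasing one bit at temperature T costs at least (Landauer, 1961):
W \;\ge\; k_B T \ln 2 .
% A common generalization: for a map sending an input distribution p to an
% output distribution p', the minimal expected work is set by the drop in
% Shannon entropy S,
\langle W \rangle_{\min} \;=\; k_B T \,\bigl[ S(p) - S(p') \bigr],
\qquad S(p) = -\sum_x p(x)\,\ln p(x).
```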
An analysis of the processing requirements of a complex perceptual-motor task
NASA Technical Reports Server (NTRS)
Kramer, A. F.; Wickens, C. D.; Donchin, E.
1983-01-01
Current concerns in the assessment of mental workload are discussed, and the event-related brain potential (ERP) is introduced as a promising mental-workload index. Subjects participated in a series of studies in which they were required to perform a target acquisition task while also covertly counting either auditory or visual probes. The effects of several task-difficulty manipulations on the P300 component of the ERP elicited by the counted stimulus probes were investigated. With sufficiently practiced subjects the amplitude of the P300 was found to decrease with increases in task difficulty. The second experiment also provided evidence that the P300 is selectively sensitive to task-relevant attributes. A third experiment demonstrated a convergence in the amplitude of the P300s elicited in the simple and difficult versions of the tracking task. The amplitude of the P300 was also found to covary with the measures of tracking performance. The results of the series of three experiments illustrate the sensitivity of the P300 to the processing requirements of a complex target acquisition task. The findings are discussed in terms of the multidimensional nature of processing resources.
Space Shuttle processing - A case study in artificial intelligence
NASA Technical Reports Server (NTRS)
Mollikarimi, Cindy; Gargan, Robert; Zweben, Monte
1991-01-01
A scheduling system incorporating AI is described and applied to the automated processing of the Space Shuttle. The unique problem of addressing the temporal, resource, and orbiter-configuration requirements of shuttle processing is described, with comparisons to traditional project management for manufacturing processes. The present scheduling system is developed to handle the late inputs and complex programs that characterize shuttle processing by incorporating fixed preemptive scheduling, constraint-based simulated annealing, and the characteristics of an 'anytime' algorithm. The Space Shuttle processing environment is modeled with 500 activities broken down into 4000 subtasks and with 1600 temporal constraints, 8000 resource constraints, and 3900 state requirements. The algorithm is shown to scale to very large problems and maintain anytime characteristics, suggesting that an automated scheduling process is achievable and potentially cost-effective.
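The constraint-based simulated annealing named above can be illustrated generically. The following sketch uses a toy single-resource model with assumed parameters, not the Shuttle system's constraint model; it also exhibits the 'anytime' property, since the current schedule is usable whenever the loop is stopped.

```python
# Generic simulated-annealing scheduler: minimize pairwise overlaps of
# fixed-duration tasks competing for one resource.
import math, random

random.seed(1)
N_TASKS, HORIZON, DURATION = 8, 40, 5

def cost(starts):
    """Count pairwise overlaps of fixed-duration tasks on one resource."""
    return sum(1 for i in range(N_TASKS) for j in range(i + 1, N_TASKS)
               if abs(starts[i] - starts[j]) < DURATION)

starts = [random.randrange(HORIZON) for _ in range(N_TASKS)]
T = 5.0
while T > 0.01:
    cand = starts[:]
    cand[random.randrange(N_TASKS)] = random.randrange(HORIZON)  # local move
    delta = cost(cand) - cost(starts)
    if delta <= 0 or random.random() < math.exp(-delta / T):
        starts = cand            # accept improving and some worsening moves
    T *= 0.995                   # geometric cooling schedule
print("conflicts:", cost(starts), "schedule:", sorted(starts))
```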
RNA search engines empower the bacterial intranet.
Dendooven, Tom; Luisi, Ben F
2017-08-15
RNA acts not only as an information bearer in the biogenesis of proteins from genes, but also as a regulator that participates in the control of gene expression. In bacteria, small RNA molecules (sRNAs) play controlling roles in numerous processes and help to orchestrate complex regulatory networks. Such processes include cell growth and development, response to stress and metabolic change, transcription termination, cell-to-cell communication, and the launching of programmes for host invasion. All these processes require recognition of target messenger RNAs by the sRNAs. This review summarizes recent results that have provided insights into how bacterial sRNAs are recruited into effector ribonucleoprotein complexes that can seek out and act upon target transcripts. The results hint at how sRNAs and their protein partners act as pattern-matching search engines that efficaciously regulate gene expression, by performing with specificity and speed while avoiding off-target effects. The requirements for efficient searches of RNA patterns appear to be common to all domains of life. © 2017 The Author(s).
Volume and Value of Big Healthcare Data.
Dinov, Ivo D
Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. The Moore's and Kryder's laws of exponential increase of computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions like: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions.
Workflow computing. Improving management and efficiency of pathology diagnostic services.
Buffone, G J; Moreau, D; Beck, J R
1996-04-01
Traditionally, information technology in health care has helped practitioners to collect, store, and present information and also to add a degree of automation to simple tasks (instrument interfaces supporting result entry, for example). Thus, commercially available information systems do little to support the need to model, execute, monitor, coordinate, and revise the various complex clinical processes required to support health-care delivery. Workflow computing, which is already implemented and improving the efficiency of operations in several nonmedical industries, can address the need to manage complex clinical processes. Workflow computing not only provides a means to define and manage the events, roles, and information integral to health-care delivery but also supports the explicit implementation of policy or rules appropriate to the process. This article explains how workflow computing may be applied to health care and the inherent advantages of the technology, and it defines workflow system requirements for use in health-care delivery, with special reference to diagnostic pathology.
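A minimal sketch of the workflow-computing idea applied to a pathology process follows; the steps, roles, and policy rule are hypothetical, chosen only to show a declarative process definition being executed and monitored.

```python
# Declarative workflow definition: each step names the role that performs
# it and a rule (policy) that must hold before the step may run.
WORKFLOW = [
    ("accession_specimen", "technician",            lambda s: True),
    ("gross_examination",  "pathologist_assistant", lambda s: s["accessioned"]),
    ("prepare_slides",     "histotechnologist",     lambda s: s["grossed"]),
    ("diagnosis",          "pathologist",           lambda s: s["slides_ready"]),
]

STEP_EFFECT = {"accession_specimen": "accessioned", "gross_examination": "grossed",
               "prepare_slides": "slides_ready", "diagnosis": "diagnosed"}

def execute(workflow, state):
    for step, role, rule in workflow:
        if not rule(state):                      # explicit policy enforcement
            raise RuntimeError(f"policy violation before {step}")
        print(f"{role} performs {step}")         # monitoring hook
        state[STEP_EFFECT[step]] = True
    return state

execute(WORKFLOW, {"accessioned": False, "grossed": False,
                   "slides_ready": False, "diagnosed": False})
```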
Noyes, Jane; Brenner, Maria; Fox, Patricia; Guerin, Ashleigh
2014-05-01
To report a novel review to develop a health systems model of successful transition of children with complex healthcare needs from hospital to home. Children with complex healthcare needs commonly experience an expensive, ineffectual and prolonged nurse-led discharge process. Children gain no benefit from prolonged hospitalization and are exposed to significant harm. Research to enable intervention development and process evaluation across the entire health system is lacking. Novel mixed-method integrative review informed by health systems theory. DATA SOURCES: CINAHL, PsychInfo, EMBASE, PubMed, citation searching, personal contact. REVIEW METHODS: Informed by consultation with experts. English language studies, opinion/discussion papers reporting research, best practice and experiences of children, parents and healthcare professionals, and purposively selected policies/guidelines from 2002-December 2012 were abstracted using Framework synthesis, followed by iterative theory development. Seven critical factors derived from thirty-four sources across five health system levels explained successful discharge (new programme theory). All seven factors are required in an integrated care pathway, with a dynamic communication loop to facilitate effective discharge (new programme logic). Current health system responses were frequently static and critical success factors were commonly absent, thereby explaining ineffectual discharge. The novel evidence-based model, which reconceptualizes 'discharge' as a highly complex longitudinal health system intervention, makes a significant contribution to global knowledge to drive practice development. Research is required to develop process and outcome measures at different time points in the discharge process and future trials are needed to determine the effectiveness of integrated health system discharge models. © 2013 John Wiley & Sons Ltd.
An analytical approach to customer requirement information processing
NASA Astrophysics Data System (ADS)
Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong
2013-11-01
'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and the Kano model, have been applied in many fields by many enterprises over the past several decades, limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or the Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.
Payload crew training complex simulation engineer's handbook
NASA Technical Reports Server (NTRS)
Shipman, D. L.
1984-01-01
The Simulation Engineer's Handbook is a guide for new engineers assigned to Experiment Simulation and a reference for engineers previously assigned. The experiment simulation process, development of experiment simulator requirements, development of experiment simulator hardware and software, and the verification of experiment simulators are discussed. The training required for experiment simulation is extensive and is only referenced in the handbook.
Biochemical Reconstitution of the WAVE Regulatory Complex
Chen, Baoyu; Padrick, Shae B.; Henry, Lisa; Rosen, Michael K.
2014-01-01
The WAVE regulatory complex (WRC) is a 400-kDa heteropentameric protein assembly that plays a central role in controlling actin cytoskeletal dynamics in many cellular processes. The WRC acts by integrating diverse cellular cues and stimulating the actin nucleating activity of the Arp2/3 complex at membranes. Biochemical and biophysical studies of the underlying mechanisms of these processes require large amounts of purified WRC. Recent success in recombinant expression, reconstitution, purification and crystallization of the WRC has greatly advanced our understanding of the inhibition, activation and membrane recruitment mechanisms of this complex. But many important questions remain to be answered. Here we summarize and update the methods developed in our laboratory, which allow reliable and flexible production of tens of milligrams of recombinant WRC of crystallographic quality, sufficient for many biochemical and structural studies. PMID:24630101
NASA Technical Reports Server (NTRS)
Wakim, Nagi T.; Srivastava, Sadanand; Bousaidi, Mehdi; Goh, Gin-Hua
1995-01-01
Agent-based technologies respond to several challenges posed by the additional information processing requirements of today's computing environments. In particular, (1) users desire interaction with computing devices in a mode similar to that used between people, (2) the efficiency and successful completion of information processing tasks often require a high level of expertise in complex and multiple domains, and (3) information processing tasks often require handling of large volumes of data and, therefore, continuous and endless processing activities. The concept of an agent is an attempt to address these new challenges by introducing information processing environments in which (1) users can communicate with a system in a natural way, (2) an agent is a specialist and a self-learner and, therefore, qualifies to be trusted to perform tasks independently of the human user, and (3) an agent is an entity that is continuously active, performing tasks that are either delegated to it or self-imposed. The work described in this paper focuses on the development of an interface agent for users of a complex information processing environment (IPE). This activity is part of an on-going effort to build a model for developing agent-based information systems. Such systems will be highly applicable to environments that require a high degree of automation, such as flight control operations, and/or processing of large volumes of data in complex domains, such as the EOSDIS environment and other multidisciplinary, scientific data systems. The concept of an agent as an information processing entity is fully described, with emphasis on characteristics of special interest to the User-System Interface Agent (USIA). Issues such as agent 'existence' and 'qualification' are discussed in this paper. Based on a definition of an agent and its main characteristics, we propose an architecture for the development of interface agents for users of an IPE that is agent-oriented and whose resources are likely to be distributed and heterogeneous in nature. The architecture of USIA is outlined in two main components: (1) the user interface, which is concerned with issues such as user dialog and interaction, user modeling, and adaptation to the user profile, and (2) the system interface, which deals with identification of IPE capabilities, task understanding and feasibility assessment, and task delegation and coordination of assistant agents.
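A structural sketch of the two-part architecture outlined above; class and method names are illustrative, not taken from the USIA design.

```python
# Skeleton of an interface agent split into a user-interface side (dialog,
# user modeling) and a system-interface side (capability checking, task
# delegation), mirroring the two components described in the abstract.
class UserInterfaceSide:
    def __init__(self):
        self.user_profile = {}                 # adapts to the user over time
    def interpret(self, utterance: str) -> dict:
        """Turn a natural-language request into a task description."""
        return {"task": utterance.lower().strip(), "user": self.user_profile}

class SystemInterfaceSide:
    def feasible(self, task: dict) -> bool:
        """Check the task against known IPE capabilities (toy rule here)."""
        return "process" in task["task"]
    def delegate(self, task: dict) -> str:
        return f"delegated '{task['task']}' to an assistant agent"

class InterfaceAgent:
    def __init__(self):
        self.ui, self.si = UserInterfaceSide(), SystemInterfaceSide()
    def handle(self, utterance: str) -> str:
        task = self.ui.interpret(utterance)
        return (self.si.delegate(task) if self.si.feasible(task)
                else "task not feasible with current IPE capabilities")

print(InterfaceAgent().handle("Process the new EOSDIS granules"))
```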
Scroccaro, G
2000-10-01
The hospital formulary is not only a list of drugs, but also a policy that involves several complex processes and activities. Formularies are developed based on cost-effectiveness evaluation, but maintaining cost-effectiveness requires educational strategies. The clinical pharmacist plays an active role in formulary processes. These processes are discussed in this paper. There is a need for studies on the cost-effectiveness of clinical pharmacy services in support of the hospital formulary.
Answering Questions about Complex Events
2008-12-19
in their environment. To reason about events requires a means of describing, simulating, and analyzing their underlying dynamic processes. For our... that are relevant to our goal of connecting inference and reasoning about processes to answering questions about events. We start with a... different event and process descriptions, ontologies, and models. 2.1.1 Logical AI. In AI, formal approaches to model the ability to reason about
Development of sensor augmented robotic weld systems for aerospace propulsion system fabrication
NASA Technical Reports Server (NTRS)
Jones, C. S.; Gangl, K. J.
1986-01-01
In order to meet stringent performance goals for power and reusability, the Space Shuttle Main Engine was designed with many complex, difficult welded joints that provide maximum strength and minimum weight. In all, the SSME requires 370 meters of welded joints. Automation of some welds has improved welding productivity significantly over manual welding. Application has previously been limited by accessibility constraints, requirements for complex process control, low production volumes, high part variability, and stringent quality requirements. Development of robots for welding in this application requires that a unique set of constraints be addressed. This paper shows how robotic welding can enhance production of aerospace components by addressing their specific requirements. A development program at the Marshall Space Flight Center combining industrial robots with state-of-the-art sensor systems and computer simulation is providing technology for the automation of welds in Space Shuttle Main Engine production.
Skrajna, Aleksandra; Yang, Xiao-cui; Dadlez, Michał; Marzluff, William F; Dominski, Zbigniew
2018-01-01
3′ end cleavage of metazoan replication-dependent histone pre-mRNAs requires the multi-subunit holo-U7 snRNP and the stem–loop binding protein (SLBP). The exact composition of the U7 snRNP and details of SLBP function in processing remain unclear. To identify components of the U7 snRNP in an unbiased manner, we developed a novel approach for purifying processing complexes from Drosophila and mouse nuclear extracts. In this method, catalytically active processing complexes are assembled in vitro on a cleavage-resistant histone pre-mRNA containing biotin and a photo-sensitive linker, and eluted from streptavidin beads by UV irradiation for direct analysis by mass spectrometry. In the purified processing complexes, Drosophila and mouse U7 snRNP have a remarkably similar composition, always being associated with CPSF73, CPSF100, symplekin and CstF64. Many other proteins previously implicated in the U7-dependent processing are not present. Drosophila U7 snRNP bound to histone pre-mRNA in the absence of SLBP contains the same subset of polyadenylation factors but is catalytically inactive and addition of recombinant SLBP is sufficient to trigger cleavage. This result suggests that Drosophila SLBP promotes a structural rearrangement of the processing complex, resulting in juxtaposition of the CPSF73 endonuclease with the cleavage site in the pre-mRNA substrate. PMID:29529248
The topological requirements for robust perfect adaptation in networks of any size.
Araujo, Robyn P; Liotta, Lance A
2018-05-01
Robustness, and the ability to function and thrive amid changing and unfavorable environments, is a fundamental requirement for living systems. Until now it has been an open question how large and complex biological networks can exhibit robust behaviors, such as perfect adaptation to a variable stimulus, since complexity is generally associated with fragility. Here we report that all networks that exhibit robust perfect adaptation (RPA) to a persistent change in stimulus are decomposable into well-defined modules, of which there exist two distinct classes. These two modular classes represent a topological basis for all RPA-capable networks, and generate the full set of topological realizations of the internal model principle for RPA in complex, self-organizing, evolvable bionetworks. This unexpected result supports the notion that evolutionary processes are empowered by simple and scalable modular design principles that promote robust performance no matter how large or complex the underlying networks become.
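The internal model principle mentioned above is commonly illustrated with the integral-feedback motif; the following textbook sketch (not the paper's general construction) shows why integral action yields perfect adaptation.

```latex
% Integral-feedback motif: output y, persistent stimulus u, and an
% integrating controller species z (a, k > 0 are rate constants):
\dot{y} = u - a\,y - k\,z, \qquad \dot{z} = y - y_{\mathrm{set}} .
% At any steady state, \dot{z} = 0 forces y^{*} = y_{\mathrm{set}},
% independently of u: the output adapts perfectly to the stimulus.
```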
Middleton, John; Vaks, Jeffrey E
2007-04-01
Errors of calibrator-assigned values lead to errors in the testing of patient samples. The ability to estimate the uncertainties of calibrator-assigned values and other variables minimizes errors in testing processes. International Organization for Standardization guidelines provide simple equations for the estimation of calibrator uncertainty with simple value-assignment processes, but other methods are needed to estimate uncertainty in complex processes. We estimated the assigned-value uncertainty with a Monte Carlo computer simulation of a complex value-assignment process, based on a formalized description of the process, with measurement parameters estimated experimentally. This method was applied to study the uncertainty of a multilevel calibrator value assignment for a prealbumin immunoassay. The simulation results showed that the component of the uncertainty added by the process of value transfer from the reference material CRM470 to the calibrator is smaller than that of the reference material itself (<0.8% vs 3.7%). Varying the process parameters in the simulation model allowed for optimizing the process, while keeping the added uncertainty small. The patient result uncertainty caused by the calibrator uncertainty was also found to be small. This method of estimating uncertainty is a powerful tool that allows for estimation of calibrator uncertainty for optimization of various value-assignment processes, with a reduced number of measurements and lower reagent costs, while satisfying the requirements on uncertainty. The new method expands and augments existing methods to allow estimation of uncertainty in complex processes.
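A minimal sketch of the Monte Carlo approach described above, with illustrative numbers: the 3.7% mirrors the CRM470 figure quoted in the abstract, while the transfer imprecision and replicate count are assumptions.

```python
# Monte Carlo propagation of uncertainty through a value-transfer step:
# a reference value with known relative uncertainty is transferred via a
# measurement step with assumed imprecision, and the combined uncertainty
# of the assigned value is read off the simulated distribution.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000
ref_value, ref_cv = 100.0, 0.037       # reference material, 3.7% rel. unc.
transfer_cv, n_replicates = 0.02, 4    # assumed per-measurement CV, replicates

ref = rng.normal(ref_value, ref_value * ref_cv, N)
# value transfer: mean of several replicate measurements of the reference
transfer = rng.normal(1.0, transfer_cv / np.sqrt(n_replicates), N)
assigned = ref * transfer

total_cv = assigned.std() / assigned.mean()
added_cv = np.sqrt(max(total_cv**2 - ref_cv**2, 0.0))
print(f"total CV: {total_cv:.3%}, added by value transfer: {added_cv:.3%}")
```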
Collins, Caitlin
2014-01-01
Cell–cell contact formation is a dynamic process requiring the coordination of cadherin-based cell–cell adhesion and integrin-based cell migration. A genome-wide RNA interference screen for proteins required specifically for cadherin-dependent cell–cell adhesion identified an Elmo–Dock complex. This was unexpected as Elmo–Dock complexes act downstream of integrin signaling as Rac guanine-nucleotide exchange factors. In this paper, we show that Elmo2 recruits Dock1 to initial cell–cell contacts in Madin–Darby canine kidney cells. At cell–cell contacts, both Elmo2 and Dock1 are essential for the rapid recruitment and spreading of E-cadherin, actin reorganization, localized Rac and Rho GTPase activities, and the development of strong cell–cell adhesion. Upon completion of cell–cell adhesion, Elmo2 and Dock1 no longer localize to cell–cell contacts and are not required subsequently for the maintenance of cell–cell adhesion. These studies show that Elmo–Dock complexes are involved in both integrin- and cadherin-based adhesions, which may help to coordinate the transition of cells from migration to strong cell–cell adhesion. PMID:25452388
Noise Suppression Methods for Robust Speech Processing
1980-05-01
[Table residue: distinctive-feature recognition scores (Sustention, Sibilation, Graveness, Compactness, Total) across four processing conditions.] … important issues in assessing the algorithm complexity. Specifically, the frequency domain approach will require considerably more memory and be more complex.
Fish, Rob D; Ioris, Antonio A R; Watson, Nigel M
2010-11-01
This paper examines governance requirements for integrating water and agricultural management (IWAM). The institutional arrangements for the agriculture and water sectors are complex and multi-dimensional, and integration cannot therefore be achieved through a simplistic 'additive' policy process. Effective integration requires the development of a new collaborative approach to governance that is designed to cope with scale dependencies and interactions, uncertainty and contested knowledge, and interdependency among diverse and unequal interests. When combined with interdisciplinary research, collaborative governance provides a viable normative model because of its emphasis on reciprocity, relationships, learning and creativity. Ultimately, such an approach could lead to the sorts of system adaptations and transformations that are required for IWAM. Copyright © 2009 Elsevier B.V. All rights reserved.
Harbison, K; Kelly, J; Burnell, L; Silva, J
1995-01-01
The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.
NASA Astrophysics Data System (ADS)
Budiman, M. A.; Rachmawati, D.; Jessica
2018-03-01
This study aims to combine the Trithemius algorithm and the double transposition cipher for file security, implemented as an Android-based application. The parameters examined are the real running time and the complexity value. The files used are in PDF format. The overall result shows that the complexity of the two algorithms combined in a super-encryption scheme is Θ(n²). However, the encryption process using the Trithemius algorithm is much faster than using the double transposition cipher, and the processing time is linearly proportional to the length of the plaintext and password.
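For orientation, a minimal super encryption of the kind the abstract describes might look like the following: a Trithemius pass (a Caesar shift that advances by one position per character) followed by a keyed columnar transposition applied twice. This is an illustrative sketch of the general technique, not the authors' Android implementation; the key and message are invented.

```python
# Minimal super-encryption sketch: Trithemius shift, then a keyed
# columnar transposition applied twice (illustrative, not the paper's code).
import string

ALPHA = string.ascii_uppercase

def trithemius(text):
    """Shift the i-th letter by i positions (progressive Caesar)."""
    out = []
    for i, ch in enumerate(text.upper()):
        out.append(ALPHA[(ALPHA.index(ch) + i) % 26] if ch in ALPHA else ch)
    return "".join(out)

def columnar(text, key):
    """Write text row-wise under the key, read columns in key order."""
    cols = len(key)
    rows = -(-len(text) // cols)            # ceiling division
    padded = text.ljust(rows * cols, "X")
    order = sorted(range(cols), key=lambda i: key[i])
    return "".join(padded[r * cols + c] for c in order for r in range(rows))

def encrypt(plaintext, key):
    stage1 = trithemius(plaintext)
    return columnar(columnar(stage1, key), key)   # double transposition

print(encrypt("ATTACKATDAWN", "ZEBRA"))
```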
Processing lunar soils for oxygen and other materials
NASA Technical Reports Server (NTRS)
Knudsen, Christian W.; Gibson, Michael A.
1992-01-01
Two types of lunar materials are excellent candidates for lunar oxygen production: ilmenite and silicates such as anorthite. Both are lunar surface minable, occurring in soils, breccias, and basalts. Because silicates are considerably more abundant than ilmenite, they may be preferred as source materials. Depending on the processing method chosen for oxygen production and the feedstock material, various useful metals and bulk materials can be produced as byproducts. Available processing techniques include hydrogen reduction of ilmenite and electrochemical and chemical reductions of silicates. Processes in these categories are generally in preliminary development stages and need significant research and development support to carry them to practical deployment, particularly as a lunar-based operation. The goal of beginning lunar processing operations by 2010 requires that planning and research and development emphasize the simplest processing schemes. However, more complex schemes that now appear to present difficult technical challenges may offer more valuable metal byproducts later. While they require more time and effort to perfect, the more complex or difficult schemes may provide important processing and product improvements with which to extend and elaborate the initial lunar processing facilities. A balanced R&D program should take this into account. The following topics are discussed: (1) ilmenite--semi-continuous process; (2) ilmenite--continuous fluid-bed reduction; (3) utilization of spent ilmenite to produce bulk materials; (4) silicates--electrochemical reduction; and (5) silicates--chemical reduction.
X-ray-enhanced cancer cell migration requires the linker of nucleoskeleton and cytoskeleton complex.
Imaizumi, Hiromasa; Sato, Katsutoshi; Nishihara, Asuka; Minami, Kazumasa; Koizumi, Masahiko; Matsuura, Nariaki; Hieda, Miki
2018-04-01
The linker of nucleoskeleton and cytoskeleton (LINC) complex is a multifunctional protein complex that is involved in various processes at the nuclear envelope, including nuclear migration, mechanotransduction, chromatin tethering and DNA damage response. We recently showed that a nuclear envelope protein, Sad1 and UNC84 domain protein 1 (SUN1), a component of the LINC complex, has a critical function in cell migration. Although ionizing radiation activates cell migration and invasion in vivo and in vitro, the underlying molecular mechanism remains unknown. Here, we examined the involvement of the LINC complex in radiation-enhanced cell migration and invasion. A sublethal dose of X-ray radiation promoted human breast cancer MDA-MB-231 cell migration and invasion, whereas carbon ion beam radiation suppressed these processes in a dose-dependent manner. Depletion of SUN1 and SUN2 significantly suppressed X-ray-enhanced cell migration and invasion. Moreover, depletion or overexpression of each SUN1 splicing variant revealed that SUN1_888, containing 888 amino acids of SUN1, but not SUN1_916, was required for X-ray-enhanced migration and invasion. In addition, the results suggested that X-ray irradiation affected the expression levels of the SUN1 splicing variants and of SUN protein binding partners, the nesprins. Taken together, our findings support a role for the LINC complex in photon-enhanced cell migration and invasion. © 2018 The Authors. Cancer Science published by John Wiley & Sons Australia, Ltd on behalf of Japanese Cancer Association.
Medicine and the call for a moral epistemology, part II: constructing a synthesis of values.
Tauber, Alfred I
2008-01-01
The demands and needs of an individual patient require diverse value judgments to interpret and apply clinical data. Indeed, objective assessment takes on particular meaning in the context of the social and existential status of the patient, and thereby a complex calculus of values determines therapeutic goals. I have previously formulated how this moral thread of care becomes woven into the epistemological project as a "moral epistemology." Having argued its ethical justification elsewhere, I offer another perspective here: clinical choices employ diverse values directed at an array of goals, some of which are derived from a universal clinical science and others from the particular physiological, psychological, and social needs of the patient. Integrating these diverse elements that determine clinical care requires a complex synthesis of facts and judgments from several domains. This constructivist process relies on clinical facts, as well as on personal judgments and subjective assessments in an ongoing negotiation between patient and doctor. A philosophy of medicine must account for the conceptual basis of this process by identifying and addressing the judgments that govern the complex synthesis of these various elements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chotiyarnwong, Pojchong; Medical Molecular Biology Unit, Faculty of Medicine, Siriraj Hospital, Mahidol University; Stewart-Jones, Guillaume B.
Crystals of an MHC class I molecule bound to naturally occurring peptide variants from the dengue virus NS3 protein contained high levels of solvent and required optimization of cryoprotectant and dehydration protocols for each complex to yield well ordered diffraction, a process facilitated by the use of a free-mounting system. T-cell recognition of the antigenic peptides presented by MHC class I molecules normally triggers protective immune responses, but can result in immune enhancement of disease. Cross-reactive T-cell responses may underlie immunopathology in dengue haemorrhagic fever. To analyze these effects at the molecular level, the functional MHC class I molecule HLA-A*1101 was crystallized bound to six naturally occurring peptide variants from the dengue virus NS3 protein. The crystals contained high levels of solvent and required optimization of the cryoprotectant and dehydration protocols for each complex to yield well ordered diffraction, a process that was facilitated by the use of a free-mounting system.
Webster, Alexandre; Li, Sisi; Hur, Junho K.; Wachsmuth, Malte; Bois, Justin S.; Perkins, Edward M.; Patel, Dinshaw J.; Aravin, Alexei A.
2015-01-01
In Drosophila, two Piwi proteins, Aubergine (Aub) and Argonaute-3 (Ago3) localize to perinuclear ‘nuage’ granules and use guide piRNAs to target and destroy transposable element transcripts. We find that Aub and Ago3 are recruited to nuage by two different mechanisms. Aub requires a piRNA guide for nuage recruitment, indicating that its localization depends on recognition of RNA targets. Ago3 is recruited to nuage independently of a piRNA cargo and relies on interaction with Krimper, a stable component of nuage that is able to aggregate in the absence of other nuage proteins. We show that Krimper interacts directly with Aub and Ago3 to coordinate the assembly of the ping-pong piRNA processing (4P) complex. Symmetrical dimethylated arginines are required for Aub to interact with Krimper, but are dispensable for Ago3 to bind Krimper. Our study reveals a multi-step process responsible for the assembly and function of nuage complexes in piRNA-guided transposon repression. PMID:26295961
Model-based MPC enables curvilinear ILT using either VSB or multi-beam mask writers
NASA Astrophysics Data System (ADS)
Pang, Linyong; Takatsukasa, Yutetsu; Hara, Daisuke; Pomerantsev, Michael; Su, Bo; Fujimura, Aki
2017-07-01
Inverse Lithography Technology (ILT) is becoming the choice for Optical Proximity Correction (OPC) of advanced technology nodes in IC design and production. Multi-beam mask writers promise significant mask writing time reduction for complex ILT-style masks. Before multi-beam mask writers become the mainstream working tools in mask production, VSB writers will continue to be the tool of choice to write both curvilinear ILT and Manhattanized ILT masks. To enable VSB mask writers for complex ILT-style masks, model-based mask process correction (MB-MPC) is required to do the following: (1) make reasonable corrections for complex edges on features that exhibit relatively large deviations from both curvilinear ILT and Manhattanized ILT designs; (2) control and manage both edge placement errors (EPE) and shot count; and (3) ease the migration to future multi-beam mask writers and serve as an effective backup solution during the transition. In this paper, a solution meeting all of those requirements, MB-MPC with GPU acceleration, will be presented. One model calibration per process allows accurate correction regardless of the target mask writer.
Constructing the wonders of the bacterial world: biosynthesis of complex enzymes.
Sargent, Frank
2007-03-01
The prokaryotic cytoplasmic membrane not only maintains cell integrity and forms a barrier between the cell and its outside environment, but is also the location for essential biochemical processes. Microbial model systems provide excellent bases for the study of fundamental problems in membrane biology including signal transduction, chemotaxis, solute transport and, as will be the topic of this review, energy metabolism. Bacterial respiration requires a diverse array of complex, multi-subunit, cofactor-containing redox enzymes, many of which are embedded within, or located on the extracellular side of, the membrane. The biosynthesis of these enzymes therefore requires carefully controlled expression, assembly, targeting and transport processes. Here, focusing on the molybdenum-containing respiratory enzymes central to anaerobic respiration in Escherichia coli, recent descriptions of a chaperone-mediated 'proofreading' system involved in coordinating assembly and export of complex extracellular enzymes will be discussed. The paradigm proofreading chaperones are members of a large group of proteins known as the TorD family, and recent research in this area highlights common principles that underpin biosynthesis of both exported and non-exported respiratory enzymes.
Putting the process of care into practice.
Houck, S; Baum, N
1997-01-01
"Putting the process of care into practice" provides an interactive, visual model of outpatient resources and processes. It illustrates an episode of care from a fee-for-service as well as managed care perspective. The Care Process Matrix can be used for planning and staffing, as well as retrospectively to assess appropriate resource use within a practice. It identifies effective strategies for reducing the cost per episode of care and optimizing quality while moving from managing costs to managing the care process. Because of an overbuilt health care system, including an oversupply of physicians, success in the future will require redesigning the process of care and a coherent customer service strategy. The growing complexities of practice will require physicians to focus on several key competencies while outsourcing other functions such as billing and contracting.
Maclin, Edward L; Mathewson, Kyle E; Low, Kathy A; Boot, Walter R; Kramer, Arthur F; Fabiani, Monica; Gratton, Gabriele
2011-09-01
Changes in attention allocation with complex task learning reflect processing automatization and more efficient control. We studied these changes using ERP and EEG spectral analyses in subjects playing Space Fortress, a complex video game comprising standard cognitive task components. We hypothesized that training would free up attentional resources for a secondary auditory oddball task. Both P3 and delta EEG showed a processing trade-off between game and oddball tasks, but only some game events showed reduced attention requirements with practice. Training magnified a transient increase in alpha power following both primary and secondary task events. This contrasted with alpha suppression observed when the oddball task was performed alone, suggesting that alpha may be related to attention switching. Hence, P3 and EEG spectral data are differentially sensitive to changes in attentional processing occurring with complex task training. Copyright © 2011 Society for Psychophysiological Research.
Coding principles of the canonical cortical microcircuit in the avian brain
Calabrese, Ana; Woolley, Sarah M. N.
2015-01-01
Mammalian neocortex is characterized by a layered architecture and a common or “canonical” microcircuit governing information flow among layers. This microcircuit is thought to underlie the computations required for complex behavior. Despite the absence of a six-layered cortex, birds are capable of complex cognition and behavior. In addition, the avian auditory pallium is composed of adjacent information-processing regions with genetically identified neuron types and projections among regions comparable with those found in the neocortex. Here, we show that the avian auditory pallium exhibits the same information-processing principles that define the canonical cortical microcircuit, long thought to have evolved only in mammals. These results suggest that the canonical cortical microcircuit evolved in a common ancestor of mammals and birds and provide a physiological explanation for the evolution of neural processes that give rise to complex behavior in the absence of cortical lamination. PMID:25691736
Risk Modeling of Interdependent Complex Systems of Systems: Theory and Practice.
Haimes, Yacov Y
2018-01-01
The emergence of the complexity characterizing our systems of systems (SoS) requires a reevaluation of the way we model, assess, manage, communicate, and analyze the risk thereto. Current models for risk analysis of emergent complex SoS are insufficient because too often they rely on the same risk functions and models used for single systems. These models commonly fail to incorporate the complexity derived from the networks of interdependencies and interconnectedness (I-I) characterizing SoS. There is a need to reevaluate currently practiced risk analysis to respond to this reality by examining, and thus comprehending, what makes emergent SoS complex. The key to evaluating the risk to SoS lies in understanding the genesis of characterizing I-I of systems manifested through shared states and other essential entities within and among the systems that constitute SoS. The term "essential entities" includes shared decisions, resources, functions, policies, decisionmakers, stakeholders, organizational setups, and others. This undertaking can be accomplished by building on state-space theory, which is fundamental to systems engineering and process control. This article presents a theoretical and analytical framework for modeling the risk to SoS with two case studies performed with the MITRE Corporation and demonstrates the pivotal contributions made by shared states and other essential entities to modeling and analysis of the risk to complex SoS. A third case study highlights the multifarious representations of SoS, which require harmonizing the risk analysis process currently applied to single systems when applied to complex SoS. © 2017 Society for Risk Analysis.
Murk, Kai; Blanco Suarez, Elena M; Cockbill, Louisa M R; Banks, Paul; Hanley, Jonathan G
2013-09-01
Astrocytes exhibit a complex, branched morphology, allowing them to functionally interact with numerous blood vessels, neighboring glial processes and neuronal elements, including synapses. They also respond to central nervous system (CNS) injury by a process known as astrogliosis, which involves morphological changes, including cell body hypertrophy and thickening of major processes. Following severe injury, astrocytes exhibit drastically reduced morphological complexity and collectively form a glial scar. The mechanistic details behind these morphological changes are unknown. Here, we investigate the regulation of the actin-nucleating Arp2/3 complex in controlling dynamic changes in astrocyte morphology. In contrast to other cell types, Arp2/3 inhibition drives the rapid expansion of astrocyte cell bodies and major processes. This intervention results in a reduced morphological complexity of astrocytes in both dissociated culture and in brain slices. We show that this expansion requires functional myosin II downstream of ROCK and RhoA. Knockdown of the Arp2/3 subunit Arp3 or the Arp2/3 activator N-WASP by siRNA also results in cell body expansion and reduced morphological complexity, whereas depleting WAVE2 specifically reduces the branching complexity of astrocyte processes. By contrast, knockdown of the Arp2/3 inhibitor PICK1 increases astrocyte branching complexity. Furthermore, astrocyte expansion induced by ischemic conditions is delayed by PICK1 knockdown or N-WASP overexpression. Our findings identify a new morphological outcome for Arp2/3 activation in restricting rather than promoting outwards movement of the plasma membrane in astrocytes. The Arp2/3 regulators PICK1, and N-WASP and WAVE2 function antagonistically to control the complexity of astrocyte branched morphology, and this mechanism underlies the morphological changes seen in astrocytes during their response to pathological insult.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khalifa, Hesham
Advanced ceramic materials exhibit properties that enable safety and fuel cycle efficiency improvements in advanced nuclear reactors. In order to fully exploit these desirable properties, new processing techniques are required to produce the complex geometries inherent to nuclear fuel assemblies and support structures. Through this project, the state of complex SiC-SiC composite fabrication for nuclear components has advanced significantly. New methods to produce complex SiC-SiC composite structures have been demonstrated in the form factors needed for in-core structural components in advanced high temperature nuclear reactors. Advanced characterization techniques have been employed to demonstrate that these complex SiC-SiC composite structures provide the strength, toughness and hermeticity required for service in harsh reactor conditions. The complex structures produced in this project represent a significant step forward in leveraging the excellent high temperature strength, resistance to neutron induced damage, and low neutron cross section of silicon carbide in nuclear applications.
Exploiting the User: Adapting Personas for Use in Security Visualization Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoll, Jennifer C.; McColgin, David W.; Gregory, Michelle L.
It has long been noted that visual representations of complex information can facilitate rapid understanding of data [citation], even with respect to ComSec applications [citation]. Recognizing that visualizations can increase usability in ComSec applications, [Zurko, Sasse] have argued that there is a need to create more usable security visualizations (VisSec). However, usability of applications generally falls into the domain of Human-Computer Interaction (HCI), which typically relies on heavy-weight user-centered design (UCD) processes. For example, the UCD process can involve many prototype iterations, or an ethnographic field study that can take months to complete. The problem is that VisSec projects generally do not have the resources to perform ethnographic field studies or to employ complex UCD methods; they often run on tight deadlines and budgets that cannot afford standard UCD methods. In order to help resolve the conflict of needing more usable designs in ComSec without the resources to employ complex UCD methods, in this paper we offer a stripped-down, lighter-weight version of a UCD process which can help with capturing user requirements. The approach we use is personas, a user-requirements capture method arising out of the Participatory Design philosophy [Grudin02].
ERIC Educational Resources Information Center
Rosado Feger, Ana L.
2009-01-01
Supply Chain Management, the coordination of upstream and downstream flows of product, services, finances, and information from a source to a customer, has risen in prominence over the past fifteen years. The delivery of a product to the consumer is a complex process requiring action from several independent entities. An individual firm consists…
Sources of Difficulty in the Processing of Written Language. Report Series 4.3.
ERIC Educational Resources Information Center
Chafe, Wallace
Ease of language processing varies with the nature of the language involved. Ordinary spoken language is the easiest kind to produce and understand, while writing is a relatively new development. On thoughtful inspection, the readability of writing has shown itself to be a complex topic requiring insights from many academic disciplines and…
ERIC Educational Resources Information Center
Singh-Pillay, Asheena; Alant, Busisiwe
2015-01-01
This paper accounts for the enacted realities of curriculum reform in South Africa, in particular the mediation of curriculum change. Curriculum implementation is viewed as a complex networked process of transforming or mediating policy into classroom practice. The fact that curriculum implementation is seen as problematic requires attention for…
ERIC Educational Resources Information Center
Garner, Johny T.
2015-01-01
Organizational communication processes are complex, but all too often, researchers oversimplify the study of these processes by relying on a single method. Particularly when scholars and practitioners partner together to solve organizational problems, meaningful results require methodological flexibility and diversity. As an exemplar of the fit…
Assessment Competence through In Situ Practice for Preservice Educators
ERIC Educational Resources Information Center
Hurley, Kimberly S.
2018-01-01
Effective assessment is the cornerstone of the teaching and learning process and a benchmark of teaching competency. P-12 assessment in physical activity can be complex and dynamic, often requiring a set of skills developed over time through trial and error. Novice teachers have limited time to hone an assessment process that can showcase their…
ERIC Educational Resources Information Center
Barajas-Saavedra, Arturo; Álvarez-Rodriguez, Francisco J.; Mendoza-González, Ricardo; Oviedo-De-Luna, Ana C.
2015-01-01
Development of digital resources is difficult due to their particular complexity and their reliance on pedagogical aspects. Another problem is the lack of well-defined development processes, documented experiences, and standard methodologies to guide and organize game development. Added to this, there is no documented technique to ensure correct…
The complexity and cost of vaccine manufacturing - An overview.
Plotkin, Stanley; Robinson, James M; Cunningham, Gerard; Iqbal, Robyn; Larsen, Shannon
2017-07-24
As companies, countries, and governments consider investments in vaccine production for routine immunization and outbreak response, understanding the complexity and cost drivers associated with vaccine production will help to inform business decisions. Leading multinational corporations have a good understanding of the complex manufacturing processes, high technological and R&D barriers to entry, and the costs associated with vaccine production. However, decision makers in developing countries, donors and investors may not be aware of the factors that continue to limit the number of new manufacturers and have caused attrition and consolidation among existing manufacturers. This paper describes the processes and cost drivers in acquiring and maintaining licensure of childhood vaccines. In addition, when export is the goal, we describe the requirements to supply those vaccines at affordable prices to low-resource markets, including the process of World Health Organization (WHO) prequalification and supporting policy recommendation. By providing a generalized and consolidated view of these requirements we seek to build awareness in the global community of the benefits and costs associated with vaccine manufacturing and the challenges associated with maintaining consistent supply. We show that while vaccine manufacture may prima facie seem an economic growth opportunity, the complexity and high fixed costs of vaccine manufacturing limit potential profit. Further, for most lower and middle income countries a large majority of the equipment, personnel and consumables will need to be imported for years, further limiting benefits to the local economy. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
WHAMM links actin assembly via the Arp2/3 complex to autophagy.
Kast, David J; Dominguez, Roberto
2015-01-01
Macroautophagy (hereafter autophagy) is the process by which cytosolic material destined for degradation is enclosed inside a double-membrane cisterna known as the autophagosome and processed for secretion and/or recycling. This process requires a large collection of proteins that converge on certain sites of the ER membrane to generate the autophagosome membrane. Recently, it was shown that actin accumulates around autophagosome precursors and could play a role in this process, but the mechanism and role of actin polymerization in autophagy were unknown. Here, we discuss our recent finding that the nucleation-promoting factor (NPF) WHAMM recruits and activates the Arp2/3 complex for actin assembly at sites of autophagosome formation on the ER. Using high-resolution, live-cell imaging, we showed that WHAMM forms dynamic puncta on the ER that comigrate with several autophagy markers, and propels the spiral movement of these puncta by an Arp2/3 complex-dependent actin comet tail mechanism. In starved cells, WHAMM accumulates at the interface between neighboring autophagosomes, whose number and size increases with WHAMM expression. Conversely, knocking down WHAMM, inhibiting the Arp2/3 complex or interfering with actin polymerization reduces the size and number of autophagosomes. These findings establish a link between Arp2/3 complex-mediated actin assembly and autophagy.
Romes, Erin M.; Sobhany, Mack; Stanley, Robin E.
2016-01-01
The synthesis of eukaryotic ribosomes is a complex, energetically demanding process requiring the aid of numerous non-ribosomal factors, such as the PeBoW complex. The mammalian PeBoW complex, composed of Pes1, Bop1, and WDR12, is essential for the processing of the 32S preribosomal RNA. Previous work in Saccharomyces cerevisiae has shown that release of the homologous proteins in this complex (Nop7, Erb1, and Ytm1, respectively) from preribosomal particles requires Rea1 (midasin or MDN1 in humans), a large dynein-like protein. Midasin contains a C-terminal metal ion-dependent adhesion site (MIDAS) domain that interacts with the N-terminal ubiquitin-like (UBL) domain of Ytm1/WDR12 as well as the UBL domain of Rsa4/Nle1 in a later step in the ribosome maturation pathway. Here we present the crystal structure of the UBL domain of the WDR12 homologue from S. cerevisiae at 1.7 Å resolution and demonstrate that human midasin binds to WDR12 as well as Nle1 through their respective UBL domains. Midasin contains a well conserved extension region upstream of the MIDAS domain required for binding WDR12 and Nle1, and the interaction is dependent upon metal ion coordination because removal of the metal or mutation of residues that coordinate the metal ion diminishes the interaction. Mammalian WDR12 displays prominent nucleolar localization that is dependent upon active ribosomal RNA transcription. Based upon these results, we propose that release of the PeBoW complex and subsequent release of Nle1 by midasin is a well conserved step in the ribosome maturation pathway in both yeast and mammalian cells. PMID:26601951
NASA Astrophysics Data System (ADS)
Homberg, Werner; Hornjak, Daniel
2011-05-01
Friction spinning is a new and promising incremental forming technology with high potential for manufacturing complex functionally graded workpieces and for extending the forming limits of conventional metal spinning processes. The process is based on the integration of thermo-mechanical friction subprocesses into incremental forming. By choosing appropriate process parameters, e.g. the axial feed rate or the relative motion, the contact conditions between tool and workpiece can be influenced in a defined way and a required temperature profile can thus be obtained. Friction spinning extends the forming limits of conventional metal spinning, allowing the production of multifunctional components with locally varying properties and the manufacture of complex hollow parts from tubes, profiles, or sheet metal. In this way it meets demands for efficiency and for functionally graded lightweight components. For example, the wall thickness can be increased locally in joining zones, yielding higher joint quality at reduced expense. Such products have so far been difficult or impossible to produce by conventional processes. To benefit from the advantages and potential of this new process, new tooling systems and concepts are indispensable; they must fulfill the special thermal and tribological requirements of this thermo-mechanical process and allow simultaneous, defined forming and friction operations. An important goal of the corresponding research at the Chair of Forming and Machining Technology at the University of Paderborn is the development of tool systems that allow such complex parts to be manufactured by simple uniaxial or sequential biaxial linear tool paths. In the paper, promising tool systems and geometries as well as results of theoretical and experimental research (e.g. regarding the influence and interaction of process parameters on workpiece quality) are discussed. Furthermore, possibilities for manufacturing geometries (demonstrator workpieces) that are difficult or impossible to produce with conventional processes are presented.
Special Advanced Studies for Pollution Prevention. Delivery Order 0058: The Monitor - Spring 2000
2001-04-01
[Slide residue from the original report: process-selection criteria (process complexity, strippability, maturity, process type and chemistry, licensing requirements, vendor information) and a phased pollution-prevention solution process (identify and evaluate potential alternatives; select best alternative and develop project; prioritize projects by commodity; solution selection, planning, implementation, and evaluation phases).]
Licensing of future mobile satellite systems
NASA Technical Reports Server (NTRS)
Lepkowski, Ronald J.
1990-01-01
The regulatory process for licensing mobile satellite systems is complex and can require many years to complete. This process involves frequency allocations, national licensing, and frequency coordination. The regulatory process that resulted in the establishment of the radiodetermination satellite service (RDSS) between 1983 and 1987 is described. In contrast, each of these steps in the licensing of the mobile satellite service (MSS) is taking a significantly longer period of time to complete.
Cortical Specializations Underlying Fast Computations
Volgushev, Maxim
2016-01-01
The time course of behaviorally relevant environmental events sets temporal constraints on neuronal processing. How does the mammalian brain make use of the increasingly complex networks of the neocortex, while making decisions and executing behavioral reactions within a reasonable time? The key parameter determining the speed of computations in neuronal networks is the time interval that neuronal ensembles need to process changes at their input and communicate the results of this processing to downstream neurons. Theoretical analysis identified basic requirements for fast processing: use of neuronal populations for encoding, background activity, and fast onset dynamics of action potentials in neurons. Experimental evidence shows that populations of neocortical neurons fulfil these requirements. Indeed, they can change firing rate in response to input perturbations very quickly, within 1 to 3 ms, and encode high-frequency components of the input by phase-locking their spiking to frequencies up to 300 to 1000 Hz. This implies that the time unit of computations by cortical ensembles is only a few milliseconds (1 to 3 ms), which is considerably faster than the membrane time constant of individual neurons. The ability of cortical neuronal ensembles to communicate on a millisecond time scale allows for complex, multiple-step processing and precise coordination of neuronal activity in parallel processing streams, while keeping the speed of behavioral reactions within environmentally set temporal constraints. PMID:25689988
Modern Techniques in Acoustical Signal and Image Processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Candy, J V
2002-04-04
Acoustical signal processing problems can lead to some complex and intricate techniques to extract the desired information from noisy, sometimes inadequate, measurements. The challenge is to formulate a meaningful strategy that is aimed at performing the processing required even in the face of uncertainties. This strategy can be as simple as a transformation of the measured data to another domain for analysis or as complex as embedding a full-scale propagation model into the processor. The aims of both approaches are the same: to extract the desired information and reject the extraneous, that is, to develop a signal processing scheme to achieve this goal. In this paper, we briefly discuss this underlying philosophy from a "bottom-up" approach, enabling the problem to dictate the solution rather than vice versa.
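The transformation of measured data to another domain mentioned above is, in its simplest form, a Fourier transform of a noisy time series. The sketch below uses synthetic data with assumed parameters to show a weak tone emerging as a spectral peak once the data are viewed in the frequency domain.

```python
# Sketch of the simplest domain-transformation strategy the abstract
# mentions: move a noisy time series into the frequency domain, where a
# buried tone stands out as a peak (synthetic data, assumed parameters).
import numpy as np

FS = 1000.0                          # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1.0 / FS)
rng = np.random.default_rng(1)
signal = 0.5 * np.sin(2 * np.pi * 60.0 * t)        # weak 60 Hz tone
measured = signal + rng.normal(0.0, 2.0, t.size)   # heavy measurement noise

spectrum = np.abs(np.fft.rfft(measured))
freqs = np.fft.rfftfreq(t.size, 1.0 / FS)
print(f"peak at {freqs[spectrum[1:].argmax() + 1]:.1f} Hz")  # skip DC bin
```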
A PBOM configuration and management method based on templates
NASA Astrophysics Data System (ADS)
Guo, Kai; Qiao, Lihong; Qie, Yifan
2018-03-01
The design of the Process Bill of Materials (PBOM) occupies a pivotal position in product development. The requirements of PBOM configuration design and management for complex products are analysed in this paper; they include the reuse of configuration procedures and the urgent need to manage huge quantities of product-family PBOM data. Based on this analysis, the function framework of PBOM configuration and management has been established. Configuration templates and modules are defined in the framework to support the customization and reuse of the configuration process. The configuration process of a detection sensor PBOM is shown as an illustrative case at the end. Rapid and agile PBOM configuration and management can be achieved using the template-based method, which is of vital significance for improving development efficiency for complex products.
Remote Earth Sciences data collection using ACTS
NASA Technical Reports Server (NTRS)
Evans, Robert H.
1992-01-01
Given the focus on global change and the attendant scope of such research, we anticipate significant growth of requirements for investigator interaction, processing system capabilities, and availability of data sets. The increased complexity of global processes requires interdisciplinary teams to address them; the investigators will need to interact on a regular basis; however, it is unlikely that a single institution will house sufficient investigators with the required breadth of skills. The complexity of the computations may also require resources beyond those located within a single institution; this lack of sufficient computational resources leads to a distributed system located at geographically dispersed institutions. Finally, the combination of long-term data sets like the Pathfinder datasets and the data to be gathered by new generations of satellites such as SeaWiFS and MODIS-N yields extraordinarily large amounts of data. All of these factors combine to increase demands on the communications facilities available; the demands are generating requirements for highly flexible, high-capacity networks. We have been examining the applicability of the Advanced Communications Technology Satellite (ACTS) to address the scientific, computational, and, primarily, communications questions resulting from global change research. As part of this effort three scenarios for oceanographic use of ACTS have been developed; a full discussion of this is contained in Appendix B.
The JPL functional requirements tool
NASA Technical Reports Server (NTRS)
Giffin, Geoff; Skinner, Judith; Stoller, Richard
1987-01-01
Planetary spacecraft are complex vehicles which are built according to many thousands of requirements. Problems encountered in documenting and maintaining these requirements led to the current attempt to reduce or eliminate these problems with a computer-automated database, the Functional Requirements Tool. The tool, developed at JPL and in use on several JPL projects, is described. The organization and functionality of the Tool, together with an explanation of the database inputs, their relationships, and their use, are presented. Methods of interfacing with external documents, representation of tables and figures, and methods of approval and change processing are discussed. The options available for disseminating information from the Tool are identified. The implementation of the Requirements Tool is outlined, and its operation is summarized. The conclusions drawn from this work are that the Requirements Tool represents a useful addition to the system engineer's tool kit, that it is not currently available elsewhere, and that a clear development path exists to expand the capabilities of the Tool to serve larger and more complex projects.
NASA Astrophysics Data System (ADS)
Hantry, Francois; Papazoglou, Mike; van den Heuvel, Willem-Jan; Haque, Rafique; Whelan, Eoin; Carroll, Noel; Karastoyanova, Dimka; Leymann, Frank; Nikolaou, Christos; Lammersdorf, Winfried; Hacid, Mohand-Said
Business process management is one of the core drivers of business innovation and is based on strategic technology and capable of creating and successfully executing end-to-end business processes. The trend will be to move from relatively stable, organization-specific applications to more dynamic, high-value ones where business process interactions and trends are examined closely to understand more accurately an application's requirements. Such collaborative, complex end-to-end service interactions give rise to the concept of Service Networks (SNs).
Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N. A.; Zaheer, Kashif Bin
2015-01-01
Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of a VBS system is associated with a concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for VBS systems. The existing approaches are time-consuming, complex, and inconsistent, which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide low-level implementation details for SIQ initiation or stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes a new SIQ framework called 'StakeMeter'. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and an application procedure, as compared to the other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for VBS systems with less judgmental error. PMID:25799490
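The abstract does not spell out StakeMeter's attributes or metrics, so the sketch below is purely hypothetical: a generic weighted-scoring scheme of the general shape such stakeholder-quantification frameworks take, with invented attribute names and weights.

```python
# Hypothetical weighted-scoring sketch of stakeholder quantification.
# The attribute names and weights are invented for illustration; they
# are NOT the StakeMeter framework's actual metrics.
WEIGHTS = {"influence": 0.4, "interest": 0.3, "domain_knowledge": 0.3}

stakeholders = {
    "product_owner": {"influence": 9, "interest": 8, "domain_knowledge": 7},
    "end_user":      {"influence": 4, "interest": 9, "domain_knowledge": 5},
    "regulator":     {"influence": 7, "interest": 3, "domain_knowledge": 8},
}

def score(ratings):
    """Weighted sum of attribute ratings for one stakeholder."""
    return sum(WEIGHTS[attr] * value for attr, value in ratings.items())

# Rank stakeholders from most to least critical by total score.
for name, ratings in sorted(stakeholders.items(),
                            key=lambda kv: score(kv[1]), reverse=True):
    print(f"{name:14s} {score(ratings):.1f}")
```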
CHEMICAL PROCESSES AND MODELING IN ECOSYSTEMS
Trends in regulatory strategies require EPA to better understand chemical behavior in natural and impacted ecosystems and in biological systems in order to carry out the increasingly complex array of exposure and risk assessments needed to develop scientifically defensible regulations (GP...
Analytic network process model for sustainable lean and green manufacturing performance indicator
NASA Astrophysics Data System (ADS)
Aminuddin, Adam Shariff Adli; Nawawi, Mohd Kamal Mohd; Mohamed, Nik Mohd Zuki Nik
2014-09-01
Sustainable manufacturing is regarded as the most complex manufacturing paradigm to date, as it holds the widest scope of requirements. In addition, its three major pillars, economy, environment and society, though distinct, overlap in some of their elements. Even though the concept of sustainability is not new, the development of the performance indicator still needs a lot of improvement due to its multifaceted nature, which requires an integrated approach. This paper proposes the best combination of criteria for forming a robust sustainable manufacturing performance indicator via the Analytic Network Process (ANP). The integrated lean, green and sustainable ANP model can be used to comprehend the complex decision system of the sustainability assessment. The finding shows that green manufacturing is more sustainable than lean manufacturing. It also illustrates that procurement practice is the most important criterion in the sustainable manufacturing performance indicator.
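At the core of ANP, as in AHP, is the derivation of priority weights from pairwise-comparison matrices, commonly via the principal eigenvector. The sketch below uses an invented 3x3 comparison among three criteria; the judgments are placeholders, not the paper's survey data.

```python
# Principal-eigenvector priority weights from a pairwise-comparison
# matrix, the computation at the core of AHP/ANP. The judgments below
# are invented placeholders, not the paper's data.
import numpy as np

criteria = ["lean", "green", "procurement"]
# A[i, j] = how strongly criterion i is preferred over j (Saaty 1-9 scale).
A = np.array([[1.0, 1/2, 1/3],
              [2.0, 1.0, 1/2],
              [3.0, 2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, eigvals.real.argmax()].real
weights = principal / principal.sum()   # normalize to a priority vector

for name, w in zip(criteria, weights):
    print(f"{name:12s} {w:.3f}")
```

In a full ANP model these local weights would populate a supermatrix that captures the interdependence among clusters of criteria.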
Implementation of Complexity Analyzing Based on Additional Effect
NASA Astrophysics Data System (ADS)
Zhang, Peng; Li, Na; Liang, Yanhong; Liu, Fang
According to Complexity Theory, there is complexity in a system when a functional requirement is not satisfied. Several studies have addressed Complexity Theory on the basis of Axiomatic Design. However, they focus on reducing complexity, and none addresses a method for analyzing the complexity in a system. Therefore, this paper puts forth a method of analyzing complexity, which seeks to make up for this deficiency in the research. To discuss the method of analyzing complexity based on additional effect, this paper puts forth two concepts: ideal effect and additional effect. The method of analyzing complexity based on additional effect combines Complexity Theory with the Theory of Inventive Problem Solving (TRIZ). It is helpful for designers to analyze complexity by using additional effect. A case study shows the application of the process.
Simplified microprocessor design for VLSI control applications
NASA Technical Reports Server (NTRS)
Cameron, K.
1991-01-01
A design technique for microprocessors combining the simplicity of reduced instruction set computers (RISC's) with the richer instruction sets of complex instruction set computers (CISC's) is presented. They utilize the pipelined instruction decode and datapaths common to RISC's. Instruction invariant data processing sequences which transparently support complex addressing modes permit the formulation of simple control circuitry. Compact implementations are possible since neither complicated controllers nor large register sets are required.
Global Analysis of Yeast Endosomal Transport Identifies the Vps55/68 Sorting Complex
Schluter, Cayetana; Lam, Karen K.Y.; Brumm, Jochen; Wu, Bella W.; Saunders, Matthew; Stevens, Tom H.
2008-01-01
Endosomal transport is critical for cellular processes ranging from receptor down-regulation and retroviral budding to the immune response. A full understanding of endosome sorting requires a comprehensive picture of the multiprotein complexes that orchestrate vesicle formation and fusion. Here, we use unsupervised, large-scale phenotypic analysis and a novel computational approach for the global identification of endosomal transport factors. This technique effectively identifies components of known and novel protein assemblies. We report the characterization of a previously undescribed endosome sorting complex that contains two well-conserved proteins with four predicted membrane-spanning domains. Vps55p and Vps68p form a complex that acts with or downstream of ESCRT function to regulate endosomal trafficking. Loss of Vps68p disrupts recycling to the TGN as well as onward trafficking to the vacuole without preventing the formation of lumenal vesicles within the MVB. Our results suggest the Vps55/68 complex mediates a novel, conserved step in the endosomal maturation process. PMID:18216282
Analysis of splicing complexes on native gels.
Ares, Manuel
2013-10-01
Splicing requires a complex set of ATP-dependent macromolecular associations that lead to the rearrangement of just a few covalent bonds in the pre-mRNA substrate. Seeing only the covalent bonds breaking and forming is seeing only a very small part of the process. Analysis of native splicing complexes into which the radiolabeled substrate has been assembled, but not necessarily completely reacted, has provided a good understanding of the process. This protocol describes a gel method for detecting and analyzing yeast splicing complexes formed in vitro on a radiolabeled pre-mRNA substrate. Complexes formed during the splicing reaction are quenched by dilution and addition of an excess of RNA, which is thought to strip nonspecifically bound proteins from the labeled substrate RNA. After loading on a low-percentage, low-cross-linking ratio composite agarose-acrylamide gel (in 10% glycerol), labeled bands are detected. These can be extracted and shown to contain small nuclear RNAs (snRNAs) and partly reacted pre-mRNA.
Parametric Cost Analysis: A Design Function
NASA Technical Reports Server (NTRS)
Dean, Edwin B.
1989-01-01
Parametric cost analysis uses equations to map measurable system attributes into cost. The measures of the system attributes are called metrics. The equations are called cost estimating relationships (CER's), and are obtained by the analysis of cost and technical metric data of products analogous to those to be estimated. Examples of system metrics include mass, power, failure_rate, mean_time_to_repair, energy_consumed, payload_to_orbit, pointing_accuracy, manufacturing_complexity, number_of_fasteners, and percent_of_electronics_weight. The basic assumption is that a measurable relationship exists between system attributes and the cost of the system. If a function exists, the attributes are cost drivers. Candidates for metrics include system requirement metrics and engineering process metrics. Requirements are constraints on the engineering process. From optimization theory we know that any active constraint generates cost by not permitting full optimization of the objective. Thus, requirements are cost drivers. Engineering processes reflect a projection of the requirements onto the corporate culture, engineering technology, and system technology. Engineering processes are an indirect measure of the requirements and, hence, are cost drivers.
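CERs are commonly postulated as power laws, cost = a * metric^b, and fitted by least squares in log-log space. The sketch below fits such a CER to a small invented mass/cost dataset; the numbers are placeholders, not data from any real program.

```python
# Fitting a power-law cost estimating relationship (CER),
# cost = a * mass^b, by least squares in log-log space.
# The mass/cost pairs are invented placeholders.
import numpy as np

mass = np.array([120.0, 250.0, 430.0, 800.0, 1500.0])   # kg
cost = np.array([14.0, 26.0, 41.0, 69.0, 118.0])        # $M

# log(cost) = b * log(mass) + log(a) is linear, so fit a degree-1 polynomial.
b, log_a = np.polyfit(np.log(mass), np.log(cost), 1)
a = np.exp(log_a)

print(f"CER: cost = {a:.3f} * mass^{b:.3f}")
print(f"estimate for a 600 kg system: ${a * 600.0 ** b:.1f}M")
```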
Epidemic processes in complex networks
NASA Astrophysics Data System (ADS)
Pastor-Satorras, Romualdo; Castellano, Claudio; Van Mieghem, Piet; Vespignani, Alessandro
2015-07-01
In recent years the research community has accumulated overwhelming evidence for the emergence of complex and heterogeneous connectivity patterns in a wide range of biological and sociotechnical systems. The complex properties of real-world networks have a profound impact on the behavior of equilibrium and nonequilibrium phenomena occurring in various systems, and the study of epidemic spreading is central to our understanding of the unfolding of dynamical processes in complex networks. The theoretical analysis of epidemic spreading in heterogeneous networks requires the development of novel analytical frameworks, and it has produced results of conceptual and practical relevance. A coherent and comprehensive review of the vast research activity concerning epidemic processes is presented, detailing the successful theoretical approaches as well as making their limits and assumptions clear. Physicists, mathematicians, epidemiologists, computer, and social scientists share a common interest in studying epidemic spreading and rely on similar models for the description of the diffusion of pathogens, knowledge, and innovation. For this reason, while focusing on the main results and the paradigmatic models in infectious disease modeling, the major results concerning generalized social contagion processes are also presented. Finally, the research activity at the forefront in the study of epidemic spreading in coevolving, coupled, and time-varying networks is reported.
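A minimal instance of the models the review surveys is a discrete-time SIS (susceptible-infected-susceptible) process on a heterogeneous network. The sketch below assumes illustrative infection and recovery rates and uses a Barabasi-Albert graph as the contact network.

```python
# Discrete-time SIS epidemic on a scale-free network, a minimal instance
# of the models the review surveys (assumed rates BETA and MU).
import random
import networkx as nx

random.seed(0)
G = nx.barabasi_albert_graph(2000, 3, seed=0)   # heterogeneous contacts
BETA, MU, STEPS = 0.08, 0.2, 200                # infection / recovery rates

infected = set(random.sample(list(G.nodes), 20))  # initial seed infections
for _ in range(STEPS):
    # Each infected node tries to infect each susceptible neighbor.
    new_infections = {v for u in infected for v in G[u]
                      if v not in infected and random.random() < BETA}
    # Each infected node recovers (back to susceptible) with prob. MU.
    recoveries = {u for u in infected if random.random() < MU}
    infected = (infected | new_infections) - recoveries

print(f"endemic infected fraction: {len(infected) / G.number_of_nodes():.3f}")
```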
NASA Astrophysics Data System (ADS)
Swallow, B.; Rigby, M. L.; Rougier, J.; Manning, A.; Thomson, D.; Webster, H. N.; Lunt, M. F.; O'Doherty, S.
2016-12-01
In order to understand the underlying processes governing environmental and physical phenomena, a complex mathematical model is usually required. However, there is an inherent uncertainty related to the parameterisation of unresolved processes in these simulators. Here, we focus on the specific problem of accounting for uncertainty in parameter values in an atmospheric chemical transport model. Systematic errors introduced by failing to account for these uncertainties have the potential to have a large effect on resulting estimates of unknown quantities of interest. One approach that is being increasingly used to address this issue is known as emulation, in which a large number of forward runs of the simulator are carried out in order to approximate the response of the output to changes in parameters. However, due to the complexity of some models, it is often infeasible to carry out the large number of training runs usually required for full statistical emulators of the environmental processes. We therefore present a simplified model reduction method for approximating uncertainties in complex environmental simulators without the need for very large numbers of training runs. We illustrate the method through an application to the Met Office's atmospheric transport model NAME. We show how our parameter estimation framework can be incorporated into a hierarchical Bayesian inversion, and demonstrate the impact on estimates of UK methane emissions, using atmospheric mole fraction data. We conclude that accounting for uncertainties in the parameterisation of complex atmospheric models is vital if systematic errors are to be minimized and all relevant uncertainties accounted for. We also note that investigations of this nature can prove extremely useful in highlighting deficiencies in the simulator that might otherwise be missed.
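The emulation idea, approximating an expensive simulator's parameter-to-output response from a modest set of forward runs, can be sketched with a toy stand-in. Everything below is illustrative: the "simulator" is a placeholder function rather than NAME, and the surrogate is ordinary least squares on quadratic features rather than a full statistical emulator.

```python
# Emulation sketch: approximate an expensive simulator's parameter-to-
# output response from a few forward runs. The "simulator" is a toy
# stand-in (not NAME); the surrogate is least squares on quadratic terms.
import numpy as np

rng = np.random.default_rng(2)

def simulator(theta):
    """Stand-in for an expensive forward model with two parameters."""
    return np.sin(theta[0]) + 0.5 * theta[1] ** 2 + 0.1 * theta[0] * theta[1]

train = rng.uniform(-2.0, 2.0, (40, 2))            # 40 training runs
y = np.array([simulator(t) for t in train])

def features(t):
    """Quadratic feature expansion of the parameter vectors."""
    x1, x2 = t[:, 0], t[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2,
                            x1 ** 2, x2 ** 2, x1 * x2])

coef, *_ = np.linalg.lstsq(features(train), y, rcond=None)

# Evaluate the cheap surrogate against the "expensive" truth.
test = rng.uniform(-2.0, 2.0, (500, 2))
pred = features(test) @ coef
truth = np.array([simulator(t) for t in test])
print(f"emulator RMSE: {np.sqrt(np.mean((pred - truth) ** 2)):.4f}")
```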
Structural model of control system for hydraulic stepper motor complex
NASA Astrophysics Data System (ADS)
Obukhov, A. D.; Dedov, D. L.; Kolodin, A. N.
2018-03-01
The article considers the problem of developing a structural model of the control system for a hydraulic stepper drive complex. A comparative analysis of stepper drives and an assessment of the applicability of hydraulic stepper motors (HSM) for solving problems requiring accurate displacement in space with subsequent positioning of the object are carried out. The presented structural model of the automated control system of the multi-spindle complex of hydraulic stepper drives reflects the main components of the system, as well as the process of its control, based on the transfer of control signals to the solenoid valves by the controller. The models and methods described in the article can be used to formalize the control process in technical systems based on the application of hydraulic stepper drives, and allow switching from mechanical control to automated control.
The roles of cohesins in mitosis, meiosis, and human health and disease
Brooker, Amanda S.; Berkowitz, Karen M.
2015-01-01
Mitosis and meiosis are essential processes that occur during development. Throughout these processes, cohesion is required to keep the sister chromatids together until their separation at anaphase. Cohesion is created by multi-protein subunit complexes called cohesins. Although the subunits differ slightly in mitosis and meiosis, the canonical cohesin complex is composed of four subunits that are quite diverse. The cohesin complexes are also important for DNA repair, gene expression, development, and genome integrity. Here we provide an overview of the roles of cohesins during these different events, as well as their roles in human health and disease, including the cohesinopathies. Although the exact roles and mechanisms of these proteins are still being elucidated, this review will serve as a guide for the current knowledge of cohesins. PMID:24906316
NASA Astrophysics Data System (ADS)
Jalba, C. K.; Muminovic, A.; Epple, S.; Barz, C.; Nasui, V.
2017-05-01
With increasing automation, many work processes become more and more complex. Most technical products can no longer be developed and manufactured by a single department; they are often the product of different divisions and require cooperation between different specialist areas. For example, in the Western world a simple coffee maker is no longer much in demand: if the buyer can choose between a simple coffee maker and a coffee machine with very complex functions, the choice will probably fall on the more complex variant. Technical progress also applies to other technical products, such as grippers and manipulators. In this paper, it is shown how grasping processes can be redefined and developed with interdisciplinary technical approaches. Both conventional and the latest developments in mechanical engineering, production technology, mechatronics and sensor technology are considered.
Superradiance Transition and Nonphotochemical Quenching in Photosynthetic Complexes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berman, Gennady Petrovich; Nesterov, Alexander; Lopez, Gustavo
2015-04-23
Photosynthetic organisms have evolved protective strategies that allow them to survive intense fluctuations in sunlight, notably nonphotochemical quenching (NPQ). This process allows light-harvesting complexes to transfer excess sunlight energy to non-damaging quenching channels. This report compares the NPQ process with the superradiance transition (ST). We demonstrate that the maximum of the NPQ efficiency is caused by the ST to the sink associated with the charge-transfer state (CTS). However, experimental verification is required in order to determine whether or not the NPQ regime is associated with the ST for real photosynthetic complexes. Indeed, it can happen that, in the photosynthetic apparatus, the NPQ regime occurs in the “non-optimal” region of parameters, and it could be independent of the ST.
Patterns of physiological activity accompanying performance on a perceptual-motor task.
DOT National Transportation Integrated Search
1969-04-01
Air traffic controllers are required to spend considerable periods of time observing radar displays. Yet, information regarding physiological measures which best reflect the attentional process in complex vigilance tasks is generally lacking. As an i...
Explicit Pharmacokinetic Modeling: Tools for Documentation, Verification, and Portability
Quantitative estimates of tissue dosimetry of environmental chemicals due to multiple exposure pathways require the use of complex mathematical models, such as physiologically-based pharmacokinetic (PBPK) models. The process of translating the abstract mathematics of a PBPK mode...
The evolution of cerebellum structure correlates with nest complexity.
Hall, Zachary J; Street, Sally E; Healy, Susan D
2013-01-01
Across the brains of different bird species, the cerebellum varies greatly in the amount of surface folding (foliation). The degree of cerebellar foliation is thought to correlate positively with the processing capacity of the cerebellum, supporting complex motor abilities, particularly manipulative skills. Here, we tested this hypothesis by investigating the relationship between cerebellar foliation and species-typical nest structure in birds. Increasing complexity of nest structure is a measure of a bird's ability to manipulate nesting material into the required shape. Consistent with our hypothesis, avian cerebellar foliation increases as the complexity of the nest built increases, setting the scene for the exploration of nest building at the neural level.
Evaluation in context: ATC automation in the field
NASA Technical Reports Server (NTRS)
Harwood, Kelly; Sanford, Beverly
1994-01-01
The process for incorporating advanced technologies into complex aviation systems is as important as the final product itself. This paper describes a process that is currently being applied to the development and assessment of an advanced ATC automation system, the Center-TRACON Automation System (CTAS). The key element of the process is field exposure early in the system development cycle. The process deviates from current established practices of system development -- where field testing is an implementation endpoint -- and has been deemed necessary by the FAA for streamlining development and bringing system functions to a level of stability and usefulness. Methods and approaches for field assessment are borrowed from human factors engineering, cognitive engineering, and usability engineering and are tailored to the constraints of an operational ATC environment. To date, the focus has been on the qualitative assessment of the match between Traffic Management Advisor (TMA) capabilities and the context for their use. Capturing the users' experience with the automation tool and understanding tool use in the context of the operational environment is important, not only for developing a tool that is an effective problem-solving instrument but also for defining meaningful operational requirements. Such requirements form the basis for certifying the safety and efficiency of the system. CTAS is the first U.S. advanced ATC automation system of its scope and complexity to undergo this field development and assessment process. With the rapid advances in aviation technologies and our limited understanding of their impact on system performance, it is time we opened our eyes to new possibilities for developing, validating, and ultimately certifying complex aviation systems.
A Rare Terminal Dinitrogen Complex of Chromium
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mock, Michael T.; Chen, Shentan; Rousseau, Roger J.
The production of ammonia from N2 and H2 is currently carried out by the Haber-Bosch process, an energy-intensive process that requires high pressures and high temperatures and accounts for the production of millions of tons of ammonia per year. The development of a catalytic, energy-efficient process for N2 reduction is of great interest and remains a formidable challenge. In this communication, we report the preparation, characterization and computational electronic structure analysis of a rare 'Chatt-type' ((P-P)2M(N2)2, P-P = diphosphine ligand) complex of chromium, cis-[Cr(N2)2(PPh2NBn2)2], and its reactivity with CO. This complex is supported by the diphosphine ligand PPh2NBn2, containing non-coordinating pendant amine bases that serve as proton relays. Future studies of this complex are aimed at answering fundamental questions regarding the role of proton relays in the second coordination sphere in their ability to facilitate proton movement from an external acid to metal-bound dinitrogen ligands in the challenging multi-proton/electron reduction of N2 to ammonia.
On Complex Water Conflicts: Role of Enabling Conditions for Pragmatic Resolution
NASA Astrophysics Data System (ADS)
Islam, S.; Choudhury, E.
2016-12-01
Many of our current and emerging water problems are interconnected and cross boundaries, domains, scales, and sectors. These boundary-crossing water problems are neither static nor linear; they are often interconnected nonlinearly with other problems and feedbacks. The solution space for these complex problems - involving interdependent variables, processes, actors, and institutions - cannot be pre-stated. We need to recognize the disconnect among values, interests, and tools, as well as among problems, policies, and politics. Scientific and technological solutions are desired for efficiency and reliability, but they need to be politically feasible and actionable. Governing and managing complex water problems require difficult tradeoffs in exploring and sharing benefits and burdens through carefully crafted negotiation processes. The crafting of such a negotiation process, we argue, constitutes a pragmatic approach to negotiation - one that is based on the identification of enabling conditions, as opposed to mechanistic causal explanations, and rooted in contextual conditions to specify and ensure the principles of equity and sustainability. We use two case studies to demonstrate the efficacy of the proposed principled pragmatic approach to addressing complex water problems.
Dynamic pathway modeling of signal transduction networks: a domain-oriented approach.
Conzelmann, Holger; Gilles, Ernst-Dieter
2008-01-01
Mathematical models of biological processes are becoming more and more important in biology. The aim is a holistic understanding of how processes such as cellular communication, cell division, regulation, homeostasis, or adaptation work, how they are regulated, and how they react to perturbations. The great complexity of most of these processes necessitates the generation of mathematical models in order to address these questions. In this chapter we provide an introduction to the basic principles of dynamic modeling and highlight both the problems and the opportunities of dynamic modeling in biology. The main focus is on the modeling of signal transduction pathways, which requires the application of a special modeling approach. A common pattern, especially in eukaryotic signaling systems, is the formation of multi-protein signaling complexes. Even for a small number of interacting proteins, the number of distinguishable molecular species can be extremely high. This combinatorial complexity is due to the great number of distinct binding domains of many receptors and scaffold proteins involved in signal transduction. However, these problems can be overcome using a new domain-oriented modeling approach, which makes it possible to handle complex and branched signaling pathways.
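The combinatorial argument is easy to make concrete. Assuming a scaffold with n independent binding domains, each either empty or occupied, a fully mechanistic model must track 2^n distinguishable species, while a domain-oriented description tracks occupancy per domain; the toy calculation below (not from the chapter itself) shows the gap.

    # Worked example of the combinatorial complexity described above:
    # a scaffold with n independent binding domains, each empty or occupied,
    # has 2**n distinguishable species, while a domain-oriented description
    # tracks only per-domain occupancy (2*n quantities here).
    for n in (4, 8, 12, 16):
        micro_states = 2 ** n          # full mechanistic model
        domain_states = 2 * n          # domain-oriented model
        print(f"{n:2d} domains: {micro_states:6d} species "
              f"vs {domain_states:2d} domain variables")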
NASA Astrophysics Data System (ADS)
Polosin, A. N.; Chistyakova, T. B.
2018-05-01
In this article, the authors describe the mathematical modeling of polymer processing in extruders of various types used in extrusion and calendering production of film materials. The method consists of the synthesis of a static model, for calculating the throughput, energy consumption and extrudate quality indices of the extruder, with a dynamic model for evaluating the polymer residence time in the extruder, on which the quality indices depend. The models are adjusted according to the extruder type (single-screw, reciprocating, twin-screw), its screw and head configuration, its operating temperature conditions, and the type of polymer processed. The models enable creating extruder screw configurations and determining control action values that yield extrudate of the required quality while satisfying throughput and energy consumption requirements. Model adequacy has been verified using data from the processing of polyolefins and polyvinylchloride in different extruders. A program complex based on these mathematical models has been developed to control extruders of various types and to ensure resource and energy savings in multi-assortment production of polymeric films. Using the program complex in the control system for the extrusion stage of polymeric film production improves film quality, reduces spoilage, and shortens the time required to change a production line over to a different throughput and film type.
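As a rough illustration of what such a static throughput model contains, the sketch below uses the textbook drag-flow minus pressure-flow approximation for the metering section of a single-screw extruder with a Newtonian melt. It is only a stand-in for the authors' adjustable model, and all parameter values are invented for the example.

    # Minimal static throughput estimate for a single-screw extruder using
    # the textbook drag-flow minus pressure-flow approximation (Newtonian
    # melt). Illustrative only; not the authors' model, and the parameter
    # values below are invented.
    import math

    def screw_throughput(D, h, phi, N, dP, L, eta):
        """Volumetric output [m^3/s] of the metering section.
        D: barrel diameter [m], h: channel depth [m], phi: helix angle [rad],
        N: screw speed [rev/s], dP: head pressure rise [Pa],
        L: metering length [m], eta: melt viscosity [Pa*s]."""
        drag = 0.5 * math.pi**2 * D**2 * N * h * math.sin(phi) * math.cos(phi)
        pressure = math.pi * D * h**3 * math.sin(phi)**2 * dP / (12.0 * eta * L)
        return drag - pressure

    Q = screw_throughput(D=0.09, h=0.003, phi=math.radians(17.7),
                         N=1.5, dP=2.0e7, L=0.9, eta=1500.0)
    print(f"estimated throughput: {Q * 3.6e6:.1f} L/h")  # m^3/s -> L/h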
[Clinical decision making and critical thinking in the nursing diagnostic process].
Müller-Staub, Maria
2006-10-01
The daily routine requires complex thinking processes of nurses, but clinical decision making and critical thinking are underestimated in nursing. A great demand for educational measures in clinical judgement related to the diagnostic process was found among nurses. The German literature hardly describes nursing diagnoses as clinical judgements about human responses to health problems and life processes. Critical thinking is described as an intellectual, disciplined process of active conceptualisation, application and synthesis of information. It is gained through observation, experience, reflection and communication, and guides thinking and action. Critical thinking influences these aspects of clinical decision making: a) diagnostic judgement, b) therapeutic reasoning and c) ethical decision making. Human responses are complex processes, and in their course human behavior is interpreted in the focus of health. Therefore, more attention should be given to the nursing diagnostic process. This article presents the theoretical framework of the paper "Clinical decision making: Fostering critical thinking in the nursing diagnostic process through case studies".
Producing a functional eukaryotic messenger RNA (mRNA) requires the coordinated activity of several large protein complexes to initiate transcription, elongate nascent transcripts, splice together exons, and cleave and polyadenylate the 3’ end. Kinetic competition between these various processes has been proposed to regulate mRNA maturation, but this model could lead to
Writer's Block, Merit, and the Market: Working in the University of Excellence.
ERIC Educational Resources Information Center
Crosby, Christina
2003-01-01
Argues that scholarly writing entails entering into a complex network of relationships and engages the writer in a process that may have a multitude of ends. Discusses how professional writing is related to the logic of market in that writers must produce an exchangeable commodity, but the process is governed by the requirements of the profession…
Cleaning and Cleanliness Measurement of Additive Manufactured Parts
NASA Technical Reports Server (NTRS)
Mitchell, Mark A.; Edwards, Kevin; Fox, Eric; Boothe, Richard
2017-01-01
Additive manufacturing processes allow for the manufacture of complex three-dimensional components that otherwise could not be manufactured. Post-treatment processes require the removal of any remnant bulk powder that may become entrapped within small cavities and channels within a component. This project focuses on several gross cleaning methods and the verification metrics associated with additively manufactured parts for oxygen propulsion usage.
ERIC Educational Resources Information Center
Suto, W. M. Irenka; Greatorex, Jackie
2008-01-01
The process of examination marking is complex, requiring examiners to engage in a variety of cognitive operations. While consideration has been given to marking practices in a few specific contexts, those of General Certificate of Secondary Education (GCSE) examiners have yet to receive serious attention. This study's aims, therefore, were: first,…
Imam, Neena; Barhen, Jacob
2009-01-01
For real-time acoustic source localization applications, one of the primary challenges is the considerable growth in computational complexity associated with the emergence of ever larger, active or passive, distributed sensor networks. These sensors rely heavily on battery-operated system components to achieve highly functional automation in signal and information processing. In order to keep communication requirements minimal, it is desirable to perform as much processing on the receiver platforms as possible. However, the complexity of the calculations needed to achieve accurate source localization increases dramatically with the size of sensor arrays, resulting in substantial growth of computational requirements that cannot be readily met with standard hardware. One option to meet this challenge builds upon the emergence of digital optical-core devices. The objective of this work was to explore the implementation of key building-block algorithms used in underwater source localization on the optical-core digital processing platform recently introduced by Lenslet Inc. This demonstration of considerably faster signal processing capability should be of substantial significance to the design and innovation of future generations of distributed sensor networks.
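The abstract does not name the building-block algorithms, but a representative one in source localization is time-difference-of-arrival (TDOA) estimation by cross-correlation, sketched generically below in numpy; the sample rate, noise level, and delay are invented, and this is not the optical-core implementation.

    # Illustrative building block for acoustic source localization:
    # estimating the time difference of arrival between two sensors by
    # cross-correlation (generic sketch, not the Lenslet implementation).
    import numpy as np

    fs = 8000                      # sample rate [Hz], assumed
    true_delay = 25                # samples
    rng = np.random.default_rng(1)
    src = rng.standard_normal(4096)
    s1 = src + 0.05 * rng.standard_normal(4096)
    s2 = np.roll(src, true_delay) + 0.05 * rng.standard_normal(4096)

    xcorr = np.correlate(s2, s1, mode="full")
    lag = np.argmax(xcorr) - (len(s1) - 1)   # lag of the correlation peak
    print(f"estimated delay: {lag} samples ({lag / fs * 1e3:.2f} ms)")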
Kovshoff, Hanna; Williams, Sarah; Vrijens, May; Danckaerts, Marina; Thompson, Margaret; Yardley, Lucy; Hodgkins, Paul; Sonuga-Barke, Edmund J S
2012-02-01
Clinical decision making is influenced by a range of factors and constitutes an inherently complex task. Here we present results from the Decisions Regarding ADHD Management (DRAMa) study, in which we undertook a thematic analysis of clinicians' experiences of and attitudes to the assessment, diagnosis and treatment of ADHD. Fifty prescribing child psychiatrists and paediatricians from Belgium and the UK took part in semi-structured interviews about their decisions regarding the assessment, diagnosis and treatment of ADHD. Interviews were transcribed and processed using thematic analysis and the principles of grounded theory. Clinicians described the assessment and diagnostic process as inherently complicated, requiring time and experience to piece together accounts of the child provided by multiple sources and gathered through varying information-gathering techniques. Treatment decisions were viewed as a process shared between families, children, and the clinician. Published guidelines were viewed as vague, and few clinicians spoke about the use of symptom thresholds or specific impairment criteria. Furthermore, systematic or operationalised criteria to assess treatment outcomes were rarely used. Decision making in ADHD is regarded as a complicated, time-consuming process which requires extensive use of clinical impression and involves a partnership with parents. Clinicians want to separate biological from environmental causal factors to understand the level of impairment and the subsequent need for a diagnosis of ADHD. Clinical guidelines would benefit from revisions that take into account the real-world complexities of clinical decision making for ADHD.
Canonical Initiation Factor Requirements of the Myc Family of Internal Ribosome Entry Segments
Spriggs, Keith A.; Cobbold, Laura C.; Jopling, Catherine L.; Cooper, Rebecca E.; Wilson, Lindsay A.; Stoneley, Mark; Coldwell, Mark J.; Poncet, Didier; Shen, Ya-Ching; Morley, Simon J.; Bushell, Martin; Willis, Anne E.
2009-01-01
Initiation of protein synthesis in eukaryotes requires recruitment of the ribosome to the mRNA and its translocation to the start codon. There are at least two distinct mechanisms by which this process can be achieved; the ribosome can be recruited either to the cap structure at the 5′ end of the message or to an internal ribosome entry segment (IRES), a complex RNA structural element located in the 5′ untranslated region (5′-UTR) of the mRNA. However, it is not well understood how cellular IRESs function to recruit the ribosome or how the 40S ribosomal subunits translocate from the initial recruitment site on the mRNA to the AUG initiation codon. We have investigated the canonical factors that are required by the IRESs found in the 5′-UTRs of c-, L-, and N-myc, using specific inhibitors and a tissue culture-based assay system, and have shown that they differ considerably in their requirements. The L-myc IRES requires the eIF4F complex and the association of PABP and eIF3 with eIF4G for activity. The minimum requirements of the N- and c-myc IRESs are the C-terminal domain of eIF4G to which eIF4A is bound and eIF3, although interestingly this protein does not appear to be recruited to the IRES RNA via eIF4G. Finally, our data show that all three IRESs require a ternary complex, although in contrast to c- and L-myc IRESs, the N-myc IRES has a lesser requirement for a ternary complex. PMID:19124605
ERIC Educational Resources Information Center
Oberauer, Klaus; Bialkova, Svetlana
2009-01-01
Processing information in working memory requires selective access to a subset of working-memory contents by a focus of attention. Complex cognition often requires joint access to 2 items in working memory. How does the focus select 2 items? Two experiments with an arithmetic task and 1 with a spatial task investigate time demands for successive…
Development of a numerical methodology for flowforming process simulation of complex geometry tubes
NASA Astrophysics Data System (ADS)
Varela, Sonia; Santos, Maite; Arroyo, Amaia; Pérez, Iñaki; Puigjaner, Joan Francesc; Puigjaner, Blanca
2017-10-01
Nowadays, the incremental flowforming process is widely explored because the usage of complex tubular products is increasing, driven by the light-weighting trend and the use of expensive materials. The enhanced mechanical properties of finished parts, combined with the efficiency of the process in terms of raw material and energy consumption, are the key factors for its competitiveness and sustainability, which is consistent with EU industry policy. For this promising technology, additional steps are required to extend the existing flowforming limits in the production of tubular products. The objective of the present research is to expand the current state of the art regarding limitations on tube thickness and diameter, exploring the feasibility of flowforming complex geometries such as tubes with thicknesses of up to 60 mm. In this study, the backward flowforming process of a 7075 aluminum tubular preform is analysed as a demonstration case to define the optimum process parameters, machine requirements and tooling geometry. Numerical simulation studies on the flowforming of thin-walled tubular components have been considered to increase knowledge of the technology. The calculation of the rotational movement of the preform mesh, the high thickness-to-length ratio, and the thermomechanical conditions significantly increase the computation time of the numerical simulation model. This means that efficient and reliable tools able to predict the forming loads and the quality of flowformed thick tubes are not available. This paper aims to overcome this situation by developing a simulation methodology based on an FEM simulation code that includes new strategies. Material characterization has also been performed through tensile tests in order to design the process. Finally, to check the reliability of the model, flowforming tests have been performed in an industrial environment.
NASA Astrophysics Data System (ADS)
Marsh, C.; Pomeroy, J. W.; Wheater, H. S.
2017-12-01
Accurate management of water resources is necessary for social, economic, and environmental sustainability worldwide. In locations with seasonal snowcovers, accurate prediction of these water resources is further complicated by frozen soils, solid-phase precipitation, blowing snow transport, and snowcover-vegetation-atmosphere interactions. Complex process interactions and feedbacks are a key feature of hydrological systems and may result in emergent phenomena, i.e., the arising of novel and unexpected properties within a complex system. One example is the feedback associated with blowing snow redistribution, which can lead to drifts that cause locally increased soil moisture, thus increasing plant growth, which in turn impacts snow redistribution, creating larger drifts. Attempting to simulate these emergent behaviours is a significant challenge, however, and there is concern that process conceptualizations within current models are too incomplete to represent the needed interactions. An improved understanding of the role of emergence in hydrological systems often requires high-resolution distributed numerical hydrological models that incorporate the relevant process dynamics. The Canadian Hydrological Model (CHM) provides a novel tool for examining cold-region hydrological systems. Key features include efficient terrain representation, allowing simulations at various spatial scales with reduced computational overhead, and a modular process representation allowing for an alternative-hypothesis framework. Using both physics-based and conceptual process representations, sourced from long-term process studies and the current cold-regions literature, allows for comparison of process representations and, importantly, of their ability to produce emergent behaviours. Examining the system in this holistic, process-based manner can yield important insights and aid in the development of improved process representations.
Requirements Development Issues for Advanced Life Support Systems: Solid Waste Management
NASA Technical Reports Server (NTRS)
Levri, Julie A.; Fisher, John W.; Alazraki, Michael P.; Hogan, John A.
2002-01-01
Long duration missions pose substantial new challenges for solid waste management in Advanced Life Support (ALS) systems. These possibly include storing large volumes of waste material in a safe manner, rendering wastes stable or sterilized for extended periods of time, and/or processing wastes for recovery of vital resources. This is further complicated because future missions remain ill-defined with respect to waste stream quantity, composition and generation schedule. Without definitive knowledge of this information, development of requirements is hampered. Additionally, even if waste streams were well characterized, other operational and processing needs require clarification (e.g. resource recovery requirements, planetary protection constraints). Therefore, the development of solid waste management (SWM) subsystem requirements for long duration space missions is an inherently uncertain, complex and iterative process. The intent of this paper is to address some of the difficulties in writing requirements for missions that are not completely defined. This paper discusses an approach and motivation for ALS SWM requirements development, the characteristics of effective requirements, and the presence of those characteristics in requirements that are developed for uncertain missions. Associated drivers for life support system technological capability are also presented. A general means of requirements forecasting is discussed, including successive modification of requirements and the need to consider requirements integration among subsystems.
Representation control increases task efficiency in complex graphical representations.
Moritz, Julia; Meyerhoff, Hauke S; Meyer-Dernbecher, Claudia; Schwan, Stephan
2018-01-01
In complex graphical representations, the relevant information for a specific task is often distributed across multiple spatial locations. In such situations, understanding the representation requires internal transformation processes in order to extract the relevant information. However, digital technology enables observers to alter the spatial arrangement of depicted information and therefore to offload the transformation processes. The objective of this study was to investigate the use of such representation control (i.e. the users' option to decide how information should be displayed) to accomplish an information extraction task, in terms of solution time and accuracy. In the representation control condition, the participants were allowed to reorganize the graphical representation and reduce information density. In the control condition, no interactive features were offered. We observed that participants in the representation control condition solved tasks that required reorganization of the maps faster and more accurately than participants without representation control. The present findings demonstrate how processes of cognitive offloading, spatial contiguity, and information coherence interact in knowledge media intended for broad and diverse groups of recipients.
NASA Astrophysics Data System (ADS)
Xing, Xi; Rey-de-Castro, Roberto; Rabitz, Herschel
2014-12-01
Optimally shaped femtosecond laser pulses can often be effectively identified in adaptive feedback quantum control experiments, but elucidating the underlying control mechanism can be a difficult task requiring significant additional analysis. We introduce landscape Hessian analysis (LHA) as a practical experimental tool to aid in elucidating control mechanism insights. This technique is applied to the dissociative ionization of CH2BrI using shaped fs laser pulses to optimize the absolute yields of ionic fragments, as well as their ratios, for the competing processes of breaking the C-Br and C-I bonds. The experimental results suggest that these nominally complex problems can be reduced to a low-dimensional control space with insights into the control mechanisms. While the optimal yield for some fragments is dominated by a non-resonant intensity-driven process, the optimal generation of other fragments may be explained by a non-resonant process coupled to few-level resonant dynamics. Theoretical analysis and modeling are consistent with the experimental observations.
Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment
NASA Technical Reports Server (NTRS)
Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.
2016-01-01
Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
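To make the Monte Carlo framing concrete, the fragment below sketches a static PRA loop in its simplest form: sample which initiating events occur, apply a mitigation probability, and tally outcomes over many trials. The event names and probabilities are invented for illustration and are not the IMM event list.

    # Minimal static-PRA Monte Carlo sketch in the spirit described above.
    # Event names and probabilities are invented; this is not the IMM model.
    import random

    EVENTS = {                      # per-mission occurrence probabilities
        "dental_injury": 0.02,
        "kidney_stone": 0.005,
        "skin_infection": 0.08,
    }
    MITIGATION_SUCCESS = 0.9        # chance on-board treatment resolves an event

    def simulate_mission(rng):
        unresolved = 0
        for event, p in EVENTS.items():
            if rng.random() < p and rng.random() > MITIGATION_SUCCESS:
                unresolved += 1
        return unresolved

    rng = random.Random(42)
    trials = 100_000
    bad = sum(simulate_mission(rng) > 0 for _ in range(trials))
    print(f"P(at least one unresolved event) ~ {bad / trials:.4f}")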
Harnessing QbD, Programming Languages, and Automation for Reproducible Biology.
Sadowski, Michael I; Grant, Chris; Fell, Tim S
2016-03-01
Building robust manufacturing processes from biological components is a task that is highly complex and requires sophisticated tools to describe processes, inputs, and measurements and to administer the management of knowledge, data, and materials. We argue that for bioengineering to fully access biological potential, it will require the application of statistically designed experiments to derive detailed empirical models of underlying systems. This requires the execution of large-scale structured experimentation, for which laboratory automation is necessary. This in turn requires the development of expressive, high-level languages that allow reusability of protocols, characterization of their reliability, and a change in focus from implementation details to functional properties. We review recent developments in these areas and identify what we believe is an exciting trend that promises to revolutionize biotechnology.
The RNA Polymerase-Associated Factor 1 Complex Is Required for Plant Touch Responses
Jensen, Gregory S.; Fal, Kateryna; Hamant, Olivier
2017-01-01
Thigmomorphogenesis is a stereotypical developmental alteration in the plant body plan that can be induced by repeatedly touching plant organs. To unravel how plants sense and record multiple touch stimuli, we performed a novel forward genetic screen based on the development of a shorter stem in response to repetitive touch. The touch insensitive (ths1) mutant identified in this screen is defective in some aspects of shoot and root thigmomorphogenesis. The ths1 mutant is an intermediate loss-of-function allele of VERNALIZATION INDEPENDENCE 3 (VIP3), a previously characterized gene whose product is part of the RNA polymerase II-associated factor 1 (Paf1) complex. The Paf1 complex is found in yeast, plants and animals, and has been implicated in histone modification and RNA processing. Several components of the Paf1 complex are required for reduced stem height in response to touch and for normal root slanting and coiling responses. Global levels of histone H3K36 trimethylation are reduced in VIP3 mutants. In addition, THS1/VIP3 is required for wild-type histone H3K36 trimethylation at the TOUCH3 (TCH3) and TOUCH4 (TCH4) loci and for rapid touch-induced upregulation of TCH3 and TCH4 transcripts. Thus, an evolutionarily conserved chromatin-modifying complex is required for both short- and long-term responses to mechanical stimulation, providing insight into how plants record mechanical signals for thigmomorphogenesis. PMID:28204553
Mapping of MPEG-4 decoding on a flexible architecture platform
NASA Astrophysics Data System (ADS)
van der Tol, Erik B.; Jaspers, Egbert G.
2001-12-01
In the field of consumer electronics, the advent of new features such as Internet access, games, video conferencing, and mobile communication has triggered the convergence of television and computer technologies. This requires a generic media-processing platform that enables the simultaneous execution of very diverse tasks, such as high-throughput stream-oriented data processing and highly data-dependent irregular processing with complex control flows. As a representative application, this paper presents the mapping of a Main Visual profile MPEG-4 decoder for High-Definition (HD) video onto a flexible architecture platform. A stepwise approach is taken, going from the decoder application toward an implementation proposal. First, the application is decomposed into separate tasks with self-contained functionality, clear interfaces, and distinct characteristics. Next, a hardware-software partitioning is derived by analyzing the characteristics of each task, such as the amount of inherent parallelism, the throughput requirements, the complexity of control processing, and the reuse potential across different applications and systems. Finally, a feasible implementation is proposed that includes, amongst others, a very-long-instruction-word (VLIW) media processor, one or more RISC processors, and some dedicated processors. The mapping study of the MPEG-4 decoder proves the flexibility and extensibility of the media-processing platform. This platform enables effective HW/SW co-design, yielding a high performance density.
Establishing and Maintaining an Extensive Library of Patient-Derived Xenograft Models.
Mattar, Marissa; McCarthy, Craig R; Kulick, Amanda R; Qeriqi, Besnik; Guzman, Sean; de Stanchina, Elisa
2018-01-01
Patient-derived xenograft (PDX) models have recently emerged as a highly desirable platform in oncology and are expected to substantially broaden the way in vivo studies are designed and executed and to reshape drug discovery programs. However, acquisition of patient-derived samples, and propagation, annotation and distribution of PDXs are complex processes that require a high degree of coordination among clinic, surgery and laboratory personnel, and are fraught with challenges that are administrative, procedural and technical. Here, we examine in detail the major aspects of this complex process and relate our experience in establishing a PDX Core Laboratory within a large academic institution.
Tree physiology research in a changing world.
Kaufmann, Merrill R.; Linder, Sune
1996-01-01
Changes in issues and advances in methodology have contributed to substantial progress in tree physiology research during the last several decades. Current research focuses on process interactions in complex systems and the integration of processes across multiple spatial and temporal scales. An increasingly important challenge for future research is assuring sustainability of production systems and forested ecosystems in the face of increased demands for natural resources and human disturbance of forests. Meeting this challenge requires significant shifts in research approach, including the study of limitations of productivity that may accompany achievement of system sustainability, and a focus on the biological capabilities of complex land bases altered by human activity.
Yao, Wei; Beckwith, Sean L.; Zheng, Tina; Young, Thomas; Dinh, Van T.; Ranjan, Anand; Morrison, Ashby J.
2015-01-01
ATP-dependent chromatin remodeling, which repositions and restructures nucleosomes, is essential to all DNA-templated processes. The INO80 chromatin remodeling complex is an evolutionarily conserved complex involved in diverse cellular processes, including transcription, DNA repair, and replication. The functional diversity of the INO80 complex can, in part, be attributed to specialized activities of distinct subunits that compose the complex. Furthermore, structural analyses have identified biochemically discrete subunit modules that assemble along the Ino80 ATPase scaffold. Of particular interest is the Saccharomyces cerevisiae Arp5-Ies6 module located proximal to the Ino80 ATPase and the Rvb1-Rvb2 helicase module needed for INO80-mediated in vitro activity. In this study we demonstrate that the previously uncharacterized Ies2 subunit is required for Arp5-Ies6 association with the catalytic components of the INO80 complex. In addition, Arp5-Ies6 module assembly with the INO80 complex is dependent on distinct conserved domains within Arp5, Ies6, and Ino80, including the spacer region within the Ino80 ATPase domain. Arp5-Ies6 interacts with chromatin via assembly with the INO80 complex, as IES2 and INO80 deletion results in loss of Arp5-Ies6 chromatin association. Interestingly, ectopic addition of the wild-type Arp5-Ies6 module stimulates INO80-mediated ATP hydrolysis and nucleosome sliding in vitro. However, the addition of mutant Arp5 lacking unique insertion domains facilitates ATP hydrolysis in the absence of nucleosome sliding. Collectively, these results define the requirements of Arp5-Ies6 assembly, which are needed to couple ATP hydrolysis to productive nucleosome movement. PMID:26306040
DOT National Transportation Integrated Search
2007-01-01
Surface transportation planning in the United States has become a complex system of intergovernmental planning and environmental compliance requirements over the past several decades. As a result, the process from planning stage to project implem...
The Future Role of Information Technology in Erosion Modelling
USDA-ARS?s Scientific Manuscript database
Natural resources management and decision-making is a complex process requiring cooperation and communication among federal, state, and local stakeholders balancing biophysical and socio-economic concerns. Predicting soil erosion is common practice in natural resource management for assessing the e...
36 CFR 1250.26 - How quickly will NARA respond to my FOIA request?
Code of Federal Regulations, 2011 CFR
2011-07-01
... requesters of any complexity in processing their request, which may lengthen the time required to reach a final decision on the release of the records. (b) In most cases, NARA will make a decision on the...
Management Information Systems.
ERIC Educational Resources Information Center
Finlayson, Jean, Ed.
1989-01-01
This collection of papers addresses key questions facing college managers and others choosing, introducing, and living with big, complex computer-based systems. "What Use the User Requirement?" (Tony Coles) stresses the importance of an information strategy driven by corporate objectives, not technology. "Process of Selecting a…
Staib, Andrew; Sullivan, Clair; Jones, Matt; Griffin, Bronwyn; Bell, Anthony; Scott, Ian
2017-06-01
Patients who require emergency admission to hospital require complex care that can be fragmented, occurring in the ED, across the ED-inpatient interface (EDii) and, subsequently, in their destination inpatient ward. Our hospital had poor process efficiency, with slow transit times for patients requiring emergency care. ED clinicians alone were able to improve the processes and length of stay for patients discharged directly from the ED. However, improving the efficiency of care for patients requiring emergency admission to inpatient wards required collaboration with reluctant inpatient clinicians. The inpatient teams were uninterested in improving time-based measures of care in isolation, but they were motivated by improving patient outcomes. We developed a dashboard showing process measures such as the 4-h rule compliance rate coupled with clinically important outcome measures such as inpatient mortality. The EDii dashboard helped unite the ED and inpatient teams in clinical redesign to improve both the efficiency of care and patient outcomes.
Precision manufacturing for clinical-quality regenerative medicines.
Williams, David J; Thomas, Robert J; Hourd, Paul C; Chandra, Amit; Ratcliffe, Elizabeth; Liu, Yang; Rayment, Erin A; Archer, J Richard
2012-08-28
Innovations in engineering applied to healthcare make a significant difference to people's lives. Market growth is guaranteed by demographics. Regulation and the requirements of good manufacturing practice (extreme levels of repeatability and reliability) demand high-precision process and measurement solutions. Emerging technologies using living biological materials add complexity. This paper presents some results of work demonstrating the precision automated manufacture of living materials, particularly the expansion of populations of human stem cells for therapeutic use as regenerative medicines. The paper also describes quality engineering techniques for precision process design and improvement, and identifies the requirements for the evolution of manufacturing technology and measurement systems for such therapies.
Can spectro-temporal complexity explain the autistic pattern of performance on auditory tasks?
Samson, Fabienne; Mottron, Laurent; Jemel, Boutheina; Belin, Pascal; Ciocca, Valter
2006-01-01
To test the hypothesis that the level of neural complexity explains the relative level of performance and brain activity in autistic individuals, the available behavioural, ERP and imaging findings related to the perception of increasingly complex auditory material under various processing tasks in autism were reviewed. Tasks involving simple material (pure tones) and/or low-level operations (detection, labelling, chord disembedding, detection of pitch changes) show a superior level of performance and shorter ERP latencies. In contrast, tasks involving spectrally and temporally dynamic material and/or complex operations (evaluation, attention) are poorly performed by autistics, or generate inferior ERP activity or brain activation. The neural complexity required to perform auditory tasks may therefore explain the pattern of performance and activation of autistic individuals during auditory tasks.
Automatic differential analysis of NMR experiments in complex samples.
Margueritte, Laure; Markov, Petar; Chiron, Lionel; Starck, Jean-Philippe; Vonthron-Sénécheau, Catherine; Bourjot, Mélanie; Delsuc, Marc-André
2018-06-01
Liquid-state nuclear magnetic resonance (NMR) is a powerful tool for the analysis of complex mixtures of unknown molecules. This capacity has been used in many analytical approaches - metabolomics, identification of active compounds in natural extracts, and characterization of species - and such studies require the acquisition of many diverse NMR measurements on series of samples. Although acquisition can easily be performed automatically, the number of NMR experiments involved in these studies increases very rapidly, and this data avalanche requires resorting to automatic processing and analysis. We present here a program that allows the autonomous, unsupervised processing of a large corpus of 1D, 2D, and diffusion-ordered spectroscopy experiments from a series of samples acquired in different conditions. The program provides all the signal processing steps, as well as peak-picking and bucketing of 1D and 2D spectra; the program and its components are fully available. In an experiment mimicking the search for a bioactive species in a natural extract, we use it for the automatic detection of small amounts of artemisinin added to a series of plant extracts and for the generation of the spectral fingerprint of this molecule. This program, called Plasmodesma, is a novel tool that should be useful for deciphering complex mixtures, particularly in the discovery of biologically active natural products from plant extracts, but also in drug discovery or metabolomics studies.
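One step such pipelines automate is bucketing: integrating a spectrum into fixed-width chemical-shift buckets so spectra from different samples can be compared. The sketch below shows the idea for a synthetic 1D spectrum; it illustrates the concept only and is not the Plasmodesma code.

    # Generic 1D bucketing sketch (concept only, not the Plasmodesma code).
    import numpy as np

    def bucket_1d(ppm, intensity, width=0.04):
        """Sum intensity into buckets of `width` ppm; returns bucket
        centers and integrals."""
        edges = np.arange(ppm.min(), ppm.max() + width, width)
        idx = np.digitize(ppm, edges)
        centers = 0.5 * (edges[:-1] + edges[1:])
        integrals = np.array([intensity[idx == i + 1].sum()
                              for i in range(len(centers))])
        return centers, integrals

    ppm = np.linspace(10.0, 0.0, 32768)                # synthetic ppm axis
    spec = np.exp(-0.5 * ((ppm - 2.5) / 0.01) ** 2)    # one synthetic peak
    centers, buckets = bucket_1d(ppm, spec)
    print(centers[np.argmax(buckets)])   # bucket holding the peak (~2.5 ppm)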
Pixel-based OPC optimization based on conjugate gradients.
Ma, Xu; Arce, Gonzalo R
2011-01-31
Optical proximity correction (OPC) methods are resolution enhancement techniques (RET) used extensively in the semiconductor industry to improve the resolution and pattern fidelity of optical lithography. In pixel-based OPC (PBOPC), the mask is divided into small pixels, each of which is modified during the optimization process. Two critical issues in PBOPC are the required computational complexity of the optimization process, and the manufacturability of the optimized mask. Most current OPC optimization methods apply the steepest descent (SD) algorithm to improve image fidelity augmented by regularization penalties to reduce the complexity of the mask. Although simple to implement, the SD algorithm converges slowly. The existing regularization penalties, however, fall short in meeting the mask rule check (MRC) requirements often used in semiconductor manufacturing. This paper focuses on developing OPC optimization algorithms based on the conjugate gradient (CG) method which exhibits much faster convergence than the SD algorithm. The imaging formation process is represented by the Fourier series expansion model which approximates the partially coherent system as a sum of coherent systems. In order to obtain more desirable manufacturability properties of the mask pattern, a MRC penalty is proposed to enlarge the linear size of the sub-resolution assistant features (SRAFs), as well as the distances between the SRAFs and the main body of the mask. Finally, a projection method is developed to further reduce the complexity of the optimized mask pattern.
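The convergence argument is easy to demonstrate on a toy version of the problem. The sketch below compares steepest descent and conjugate gradient on a quadratic image-fidelity cost ||Am - t||^2, with a simple 1D blur standing in for the partially coherent imaging model; none of this is the paper's actual lithography code.

    # SD vs. CG on a quadratic cost ||A m - t||^2, a toy stand-in for
    # pixel-based mask optimization (the blur below is not a lithography model).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 64
    A = np.zeros((n, n))
    for i in range(n):          # simple blur: each pixel mixes its neighbors
        for j in range(max(0, i - 2), min(n, i + 3)):
            A[i, j] = 1.0 / (1 + abs(i - j))
    t = rng.random(n)           # target pattern
    H, g0 = A.T @ A, A.T @ t    # normal equations: H m = g0

    def steepest_descent(iters):
        m = np.zeros(n)
        for _ in range(iters):
            r = g0 - H @ m                     # residual = -gradient
            m += (r @ r) / (r @ H @ r) * r     # exact line search
        return np.linalg.norm(H @ m - g0)

    def conjugate_gradient(iters):
        m = np.zeros(n); r = g0.copy(); p = r.copy()
        for _ in range(iters):
            Hp = H @ p
            a = (r @ r) / (p @ Hp)
            m += a * p
            r_new = r - a * Hp
            p = r_new + (r_new @ r_new) / (r @ r) * p
            r = r_new
        return np.linalg.norm(H @ m - g0)

    # After the same iteration budget, the CG residual is far smaller.
    print(steepest_descent(30), conjugate_gradient(30))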
Adaptive learning in complex reproducing kernel Hilbert spaces employing Wirtinger's subgradients.
Bouboulis, Pantelis; Slavakis, Konstantinos; Theodoridis, Sergios
2012-03-01
This paper presents a wide framework for non-linear online supervised learning tasks in the context of complex-valued signal processing. The (complex) input data are mapped into a complex reproducing kernel Hilbert space (RKHS), where the learning phase takes place. Both pure complex kernels and real kernels (via the complexification trick) can be employed. Moreover, any convex, continuous and not necessarily differentiable function can be used to measure the loss between the output of the specific system and the desired response. The only requirement is that the subgradient of the adopted loss function be available in an analytic form. In order to derive the subgradients analytically, the principles of the (recently developed) Wirtinger's calculus in complex RKHS are exploited. Furthermore, both linear and widely linear (in RKHS) estimation filters are considered. To cope with the problem of increasing memory requirements, which is present in almost all online schemes in RKHS, a sparsification scheme based on projection onto closed balls has been adopted. We demonstrate the effectiveness of the proposed framework in a non-linear channel identification task, a non-linear channel equalization problem and a quadrature phase shift keying equalization scheme, using both circular and non-circular synthetic signal sources.
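A minimal sketch of online learning in an RKHS with complex data is given below, using the complexification trick the paper mentions: each complex sample is mapped to a real vector of its real and imaginary parts before a real Gaussian kernel is applied, with simple kernel-LMS updates. The toy channel, step size, and kernel width are assumptions, and this is not the authors' full Wirtinger-subgradient framework (in particular it omits the sparsification step, as noted in the comments).

    # Kernel LMS on complexified inputs: a sketch of online RKHS learning,
    # not the paper's Wirtinger-subgradient framework.
    import numpy as np

    def gauss(x, y, sigma=1.0):
        return np.exp(-np.sum((x - y) ** 2) / (2 * sigma**2))

    def complexify(z):            # C^d -> R^(2d)
        return np.concatenate([z.real, z.imag])

    rng = np.random.default_rng(0)
    centers, coeffs, mu, errs = [], [], 0.5, []

    # Online identification of a toy nonlinear complex channel.
    for _ in range(500):
        z = complex(rng.standard_normal(), rng.standard_normal())
        d = (0.5 + 0.5j) * z + 0.2 * z * z             # "unknown" system
        x = complexify(np.array([z]))
        y = sum(c * gauss(x, xc) for c, xc in zip(coeffs, centers))
        e = d - y                                      # complex error
        centers.append(x)                              # naive dictionary growth;
        coeffs.append(mu * e)                          # real schemes sparsify here
        errs.append(abs(e))

    print(np.mean(errs[:50]), np.mean(errs[-50:]))     # error should shrink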
Enrolling adolescents in HIV vaccine trials: reflections on legal complexities from South Africa.
Slack, Catherine; Strode, Ann; Fleischer, Theodore; Gray, Glenda; Ranchod, Chitra
2007-05-13
South Africa is likely to be the first country in the world to host an adolescent HIV vaccine trial. Adolescents may be enrolled in late 2007. In the development and review of adolescent HIV vaccine trial protocols there are many complexities to consider, and much work to be done if these important trials are to become a reality. This article sets out essential requirements for the lawful conduct of adolescent research in South Africa, including compliance with consent requirements, child protection laws, and processes for the ethical and regulatory approval of research. The article outlines likely complexities for researchers and research ethics committees, including determining that trial interventions meet current risk standards for child research. Explicit recommendations are made for role-players in other jurisdictions who may also be planning such trials. The article concludes with concrete steps for implementing these important trials in South Africa and other jurisdictions, including planning for consent processes; delineating privacy rights; compiling the information necessary for ethics committees to assess risks to child participants; training trial site staff to recognize when disclosures trigger a mandatory reporting response; networking among relevant ethics committees; and lobbying the National Regulatory Authority for guidance.
Mission activities planning for a Hermes mission by means of AI-technology
NASA Technical Reports Server (NTRS)
Pape, U.; Hajen, G.; Schielow, N.; Mitschdoerfer, P.; Allard, F.
1993-01-01
Mission Activities Planning is a complex task to be performed by mission control centers. AI technology can offer attractive solutions to the planning problem. This paper presents the use of a new AI-based Mission Planning System for crew activity planning. Based on a HERMES servicing mission to the COLUMBUS Man Tended Free Flyer (MTFF) with complex time and resource constraints, approximately 2000 activities with 50 different resources have been generated, processed, and planned with parametric variation of operationally sensitive parameters. The architecture, as well as the performance of the mission planning system, is discussed. An outlook to future planning scenarios, the requirements, and how a system like MARS can fulfill those requirements is given.
NASA Technical Reports Server (NTRS)
Fatig, Michael
1993-01-01
Flight operations, and the preparation for them, have become increasingly complex as mission complexities increase. Further, the mission model dictates that a significant increase in flight operations activities is upon us. Finally, there is a need for process improvement and economy in the operations arena. It is therefore time that we recognize flight operations as a complex process requiring a defined, structured, life-cycle approach vitally linked to the space segment, ground segment, and science operations processes. With this recognition, an FOT Tool Kit was developed, consisting of six major components designed to provide tools that guide flight operations activities throughout the mission life cycle. The major components of the FOT Tool Kit and the concepts behind the flight operations life-cycle process, as developed at NASA's GSFC for GSFC-based missions, are addressed. The Tool Kit is intended to improve the productivity, quality, cost, and schedule performance of flight operations tasks through the use of documented, structured methodologies; knowledge of past lessons learned and upcoming new technology; and the reuse and sharing of key products and special application programs, made possible through the development of standardized key products and special program directories.
Hybrid acousto-optic and digital equalization for microwave digital radio channels
NASA Astrophysics Data System (ADS)
Anderson, C. S.; Vanderlugt, A.
1990-11-01
Digital radio transmission systems use complex modulation schemes that require powerful signal-processing techniques to correct channel distortions and to minimize bit error rates (BERs). This paper proposes combining the computational power of acousto-optic processing and the accuracy of digital processing to produce a hybrid channel equalizer that exceeds the performance of digital equalization alone. Analysis shows that a hybrid equalizer for 256-level quadrature amplitude modulation (QAM) performs better than a digital equalizer for 64-level QAM.
An expert systems application to space base data processing
NASA Technical Reports Server (NTRS)
Babb, Stephen M.
1988-01-01
The advent of space vehicles with their increased data requirements is reflected in the complexity of future telemetry systems. Space-based operations, with their immense operating costs, will shift the burden of data processing and routine analysis from the space station to the Orbital Transfer Vehicle (OTV). A research and development project is described which addresses the real-time onboard data processing tasks associated with a space-based vehicle, specifically focusing on an implementation of an expert system.
Information processing for aerospace structural health monitoring
NASA Astrophysics Data System (ADS)
Lichtenwalner, Peter F.; White, Edward V.; Baumann, Erwin W.
1998-06-01
Structural health monitoring (SHM) technology provides a means to significantly reduce the life-cycle cost of aerospace vehicles by eliminating unnecessary inspections, minimizing inspection complexity, and providing accurate diagnostics and prognostics to support vehicle life extension. In order to accomplish this, a comprehensive SHM system will need to acquire data from a wide variety of diverse sensors, including strain gages, accelerometers, acoustic emission sensors, crack growth gages, corrosion sensors, and piezoelectric transducers. Significant amounts of computer processing will then be required to convert this raw sensor data into meaningful information that indicates both the diagnostics of current structural integrity and the prognostics necessary for planning and managing the future health of the structure in a cost-effective manner. This paper provides a description of the key types of information processing technologies required in an effective SHM system. These include artificial intelligence techniques such as neural networks, expert systems, and fuzzy logic for nonlinear modeling, pattern recognition, and complex decision making; signal processing techniques such as Fourier and wavelet transforms for spectral analysis and feature extraction; statistical algorithms for optimal detection, estimation, prediction, and fusion; and a wide variety of other algorithms for data analysis and visualization. The intent of this paper is to provide an overview of the role of information processing for SHM, discuss various technologies which can contribute to accomplishing this role, and present some example applications of information processing for SHM implemented at the Boeing Company.
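One concrete instance of the signal-processing layer described above is extracting simple spectral features from a vibration record with an FFT, as sketched below; the sample rate, synthetic signal, and feature names are illustrative assumptions, not taken from a Boeing system.

    # FFT-based feature extraction from a synthetic vibration record;
    # sample rate, signal, and feature names are illustrative only.
    import numpy as np

    fs = 2000.0                               # accelerometer sample rate [Hz]
    t = np.arange(0, 1.0, 1 / fs)
    rng = np.random.default_rng(0)
    signal = np.sin(2 * np.pi * 180 * t) + 0.2 * rng.standard_normal(t.size)

    spectrum = np.abs(np.fft.rfft(signal * np.hanning(t.size)))
    freqs = np.fft.rfftfreq(t.size, 1 / fs)

    features = {
        "peak_freq_hz": float(freqs[np.argmax(spectrum)]),   # dominant mode
        "rms": float(np.sqrt(np.mean(signal ** 2))),         # overall level
        "band_energy_100_300": float(
            np.sum(spectrum[(freqs >= 100) & (freqs <= 300)] ** 2)),
    }
    print(features)   # inputs to a downstream classifier or trend monitor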
A hybrid optimization approach in non-isothermal glass molding
NASA Astrophysics Data System (ADS)
Vu, Anh-Tuan; Kreilkamp, Holger; Krishnamoorthi, Bharathwaj Janaki; Dambon, Olaf; Klocke, Fritz
2016-10-01
Intensively growing demands for complex yet low-cost precision glass optics from today's photonics market motivate the development of an efficient and economically viable manufacturing technology for complex-shaped optics. Compared with state-of-the-art replication-based methods, non-isothermal glass molding turns out to be a promising innovative technology for cost-efficient manufacturing because of increased mold lifetime, lower energy consumption, and high throughput from a fast process chain. However, the selection of parameters for the molding process usually requires a huge effort to satisfy the precise requirements of the molded optics and to avoid negative effects on the expensive tool molds. Therefore, to reduce experimental work at the outset, a coupled CFD/FEM numerical model was developed to study the molding process. This research focuses on the development of a hybrid optimization approach for non-isothermal glass molding. To this end, an optimal configuration with two optimization stages for multiple quality characteristics of the glass optics is addressed. A hybrid Back-Propagation Neural Network (BPNN)-Genetic Algorithm (GA) is first carried out to identify the optimal process parameters and establish the stability of the process. The second stage continues with the optimization of the glass preform using those optimal parameters to guarantee the accuracy of the molded optics. Experiments are performed to evaluate the effectiveness and feasibility of the model for process development in non-isothermal glass molding.
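To make the BPNN-GA pattern concrete, here is a minimal sketch of a genetic algorithm searching process parameters scored by a surrogate model; a synthetic quadratic stands in for the trained back-propagation network, and the parameter names, bounds, and GA settings are all invented, not the authors' values.

```python
import numpy as np

rng = np.random.default_rng(2)

def surrogate_quality(params):
    # Stand-in for a trained BPNN that predicts molded-optic quality from
    # process parameters; synthetic optimum at (T, p) = (620, 8).
    T, p = params
    return -((T - 620.0) / 50.0) ** 2 - ((p - 8.0) / 2.0) ** 2

bounds = np.array([[550.0, 700.0], [2.0, 15.0]])  # mold temperature, press force
pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(40, 2))

for generation in range(60):
    scores = np.array([surrogate_quality(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-20:]]             # truncation selection
    children = parents[rng.integers(0, 20, 40)].copy()  # clone parents
    mates = parents[rng.integers(0, 20, 40)]
    # Blend crossover plus Gaussian mutation, clipped to the parameter bounds.
    children = 0.5 * (children + mates) + rng.normal(0, [2.0, 0.2], children.shape)
    pop = np.clip(children, bounds[:, 0], bounds[:, 1])

best = pop[np.argmax([surrogate_quality(ind) for ind in pop])]
print("estimated optimal (T, p):", best)
```

In the two-stage scheme described above, the parameters found this way would then be fixed while a second search optimizes the preform geometry.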
Biosimilarity Versus Manufacturing Change: Two Distinct Concepts.
Declerck, Paul; Farouk-Rezk, Mourad; Rudd, Pauline M
2016-02-01
As products of living cells, biologics are far more complicated than small-molecular-weight drugs, not only with respect to size and structural complexity but also in their sensitivity to manufacturing processes and post-translational changes. Most of the information on the manufacturing process of biotherapeutics is proprietary and hence not fully accessible to the public. This information gap represents a key challenge for biosimilar developers and plays a key role in explaining the differences between the regulatory pathways required to demonstrate biosimilarity and those required to ensure that a change in manufacturing process has no implications for safety and efficacy. Manufacturing process changes are frequently needed for a variety of reasons, including response to regulatory requirements, scaling up production, a change in facility, a change in raw materials, improving control of quality (consistency), or optimising production efficiency. The scope of the change is usually a key indicator of the scale of analysis required to evaluate quality. In most cases where the scope of the process change is limited, quality and analytical studies alone should be sufficient, while comparative clinical studies can be required in the case of major changes (e.g., cell line changes). Biosimilarity exercises have been addressed differently by regulators on the understanding that biosimilar developers start with fundamental differences, namely a new cell line and a knowledge gap regarding the innovator's processes, including culture media, purification processes, and potentially different formulations, and are thus required to ensure that differences from innovators do not result in differences in efficacy and safety.
A practitioner's guide to service development.
Lees, Liz
2010-11-01
Service development and service improvement are complex concepts, but this should not prevent practitioners from engaging in, or initiating, them. There is no set blueprint for service development, so this article examines the process, describes the skills required, lists some change management tools, and offers a guide to the stages involved. The article aims to demystify service development for those considering embarking on the process for the first time.
NASA Technical Reports Server (NTRS)
Mavris, Dimitri; Osburg, Jan
2005-01-01
An important enabler of the new national Vision for Space Exploration is the ability to rapidly and efficiently develop optimized concepts for the manifold future space missions that this effort calls for. The design of such complex systems requires a tight integration of all the engineering disciplines involved, in an environment that fosters interaction and collaboration. The research performed under this grant explored areas where the space systems design process can be enhanced: by integrating risk models into the early stages of the design process, and by including rapid-turnaround variable-fidelity tools for key disciplines. Enabling early assessment of mission risk will allow designers to perform trades between risk and design performance during the initial design space exploration. Entry into planetary atmospheres will require an increased emphasis of the critical disciplines of aero- and thermodynamics. This necessitates the pulling forward of EDL disciplinary expertise into the early stage of the design process. Radiation can have a large potential impact on overall mission designs, in particular for the planned nuclear-powered robotic missions under Project Prometheus and for long-duration manned missions to the Moon, Mars and beyond under Project Constellation. This requires that radiation and associated risk and hazards be assessed and mitigated at the earliest stages of the design process. Hence, RPS is another discipline needed to enhance the engineering competencies of conceptual design teams. Researchers collaborated closely with NASA experts in those disciplines, and in overall space systems design, at Langley Research Center and at the Jet Propulsion Laboratory. This report documents the results of this initial effort.
Software Process Assurance for Complex Electronics
NASA Technical Reports Server (NTRS)
Plastow, Richard A.
2007-01-01
Complex Electronics (CE) now perform tasks that were previously handled in software, such as communication protocols. Many methods used to develop software bear a close resemblance to CE development. Field Programmable Gate Arrays (FPGAs) can have over a million logic gates, while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs, such as incorrect design, faulty logic, and unexpected interactions within the logic, is great. With CE devices obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized with slight modifications in the development of these devices. Software Process Assurance for Complex Electronics (SPACE) is a research project that used standardized software assurance and engineering practices to provide an assurance framework for development activities. Tools such as checklists, best practices, and techniques were used to detect missing requirements and bugs earlier in the development cycle, creating a development process for CE that was more easily maintained, consistent, and configurable based on the device used.
Developments and advances concerning the hyperpolarisation technique SABRE.
Mewis, Ryan E
2015-10-01
To overcome the inherent sensitivity issue in NMR and MRI, hyperpolarisation techniques are used. Signal Amplification By Reversible Exchange (SABRE) is a hyperpolarisation technique that utilises parahydrogen, a molecule that possesses a nuclear singlet state, as the source of polarisation. A metal complex is required to break the singlet order of parahydrogen and, by doing so, facilitates polarisation transfer to analyte molecules ligated to the same complex through the J-coupled network that exists. The increased signal intensities that the analyte molecules possess as a result of this process have led to investigations whereby their potential as MRI contrast agents has been probed and to understand the fundamental processes underpinning the polarisation transfer mechanism. As well as discussing literature relevant to both of these areas, the chemical structure of the complex, the physical constraints of the polarisation transfer process and the successes of implementing SABRE at low and high magnetic fields are discussed.
Design applications for supercomputers
NASA Technical Reports Server (NTRS)
Studerus, C. J.
1987-01-01
The complexity of codes for solutions of real aerodynamic problems has progressed from simple two-dimensional models to three-dimensional inviscid and viscous models. As the algorithms used in the codes increased in accuracy, speed, and robustness, the codes were steadily incorporated into standard design processes. The highly sophisticated codes, which provide solutions to truly complex flows, require computers with large memory and high computational speed. The advent of high-speed supercomputers, which makes solutions of these complex flows more practical, permits the introduction of the codes into the design system at an earlier stage. Results are presented for several codes that either have already been introduced into the design process or are rapidly becoming so. The codes fall into the areas of turbomachinery aerodynamics and hypersonic propulsion. In the former category, results are presented for three-dimensional inviscid and viscous flows through nozzle and unducted fan bladerows. In the latter category, results are presented for two-dimensional inviscid and viscous flows over hypersonic vehicle forebodies and engine inlets.
An improved method for polarimetric image restoration in interferometry
NASA Astrophysics Data System (ADS)
Pratley, Luke; Johnston-Hollitt, Melanie
2016-11-01
Interferometric radio astronomy data require the effects of limited coverage in the Fourier plane to be accounted for via a deconvolution process. For the last 40 years this process, known as `cleaning', has been performed almost exclusively on all Stokes parameters individually, as if they were independent scalar images. However, here we demonstrate that for the case of the linear polarization P, this approach fails to properly account for the complex vector nature of the emission, resulting in a process that depends on the axes under which the deconvolution is performed. We present an improved method, `Generalized Complex CLEAN', which properly accounts for the complex vector nature of polarized emission and is invariant under rotations of the deconvolution axes. We use two Australia Telescope Compact Array data sets to test standard and complex CLEAN versions of the Högbom and SDI (Steer-Dewdney-Ito) CLEAN algorithms. We show that in general the complex CLEAN version of each algorithm produces more accurate clean components with fewer spurious detections and lower computational cost, due to reduced iterations, than the current methods. In particular, we find that the complex SDI CLEAN produces the best results for diffuse polarized sources as compared with standard CLEAN algorithms and other complex CLEAN algorithms. Given the move to wide-field, high-resolution polarimetric imaging with future telescopes such as the Square Kilometre Array, we suggest that Generalized Complex CLEAN should be adopted as the deconvolution method for all future polarimetric surveys and, in particular, that the complex version of an SDI CLEAN should be used.
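The core idea is easy to sketch: treat P = Q + iU as a single complex image and CLEAN on |P|, so each component carries both polarized amplitude and angle. The toy Högbom-style loop below (not the authors' code) uses a synthetic Gaussian beam and a single invented point source.

```python
import numpy as np
from scipy.signal import fftconvolve

# Toy complex CLEAN: peak-finding on |P| is invariant under (Q, U) rotations.
n = 64
yy, xx = np.indices((n, n))
beam = np.exp(-((yy - n // 2) ** 2 + (xx - n // 2) ** 2) / (2 * 2.0 ** 2))

true_sky = np.zeros((n, n), dtype=complex)
true_sky[20, 30] = 3.0 + 1.5j                 # one polarized point source
dirty = fftconvolve(true_sky, beam, mode="same")

model = np.zeros_like(dirty)
residual = dirty.copy()
gain = 0.1
for _ in range(500):
    y, x = np.unravel_index(np.argmax(np.abs(residual)), residual.shape)
    component = gain * residual[y, x]          # complex: amplitude AND angle
    model[y, x] += component
    shifted_beam = np.roll(np.roll(beam, y - n // 2, axis=0), x - n // 2, axis=1)
    residual -= component * shifted_beam
    if np.abs(residual).max() < 0.05:
        break

print("recovered P at source:", model[20, 30])  # approx 3.0 + 1.5j
```

Cleaning Q and U separately would instead pick peaks of each scalar image independently, which is exactly the axis-dependence the paper criticizes.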
Work-Facilitating Information Visualization Techniques for Complex Wastewater Systems
NASA Astrophysics Data System (ADS)
Ebert, Achim; Einsfeld, Katja
The design and operation of urban drainage systems and wastewater treatment plants (WWTP) have become increasingly complex. This complexity is due to increased requirements concerning process technology and technical, environmental, economic, and occupational safety aspects. The plant operator has access not only to some timeworn files and measured parameters but also to numerous on-line and off-line parameters that characterize the current state of the plant in detail. Moreover, expert databases and specific support pages of plant manufacturers are accessible through the World Wide Web. Thus, the operator is overwhelmed with predominantly unstructured data.
Tissue fusion during early mammalian development requires coordination of multiple cell types, the extracellular matrix, and complex signaling pathways. Fusion events during processes including heart development, neural tube closure, and palatal fusion are dependent on signaling ...
ISSUES AND CHALLENGES IN MODELING CHILDREN'S LONGITUDINAL EXPOSURES: AN OZONE STUDY
Modeling children's exposures is a complicated, data-intensive process. Modeling longitudinal exposures, which are important for regulatory decision making, especially for most air toxics, adds another level of complexity and data requirements. Because it is difficult to model in...
PROCEEDINGS OF THE CROSS DISCIPLINE ECOSYSTEM MODELING AND ANALYSIS WORKSHOP
The complexity of environmental problems we face now and in the future is ever increasing. Process linkages among air, land, surface and subsurface water require interdisciplinary modeling approaches. The dynamics of land use change spurred by population and economic growth, ...
You Cannot Hit What You Do Not Shoot
2015-12-30
Psychology from the University of Texas at El Paso. Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC) 2015... requires complex skill that involves a combination of fine and gross motor skills coupled with mental processes before, during, and after the shot (Chung
Reliability-Based Model to Analyze the Performance and Cost of a Transit Fare Collection System.
DOT National Transportation Integrated Search
1985-06-01
The collection of transit system fares has become more sophisticated in recent years, with more flexible structures requiring more sophisticated fare collection equipment to process tickets and admit passengers. However, this new and complex equipmen...
78 FR 22922 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-17
...-consuming applications annually, 4 applications of medium difficulty, and 10 of the least difficult... the exemptive order application process, including preparation and revision of an application and... the costs required to prepare other more complex and novel applications. See also Political...
Notebook Computers Increase Communication.
ERIC Educational Resources Information Center
Carey, Doris M.; Sale, Paul
1994-01-01
Project FIT (Full Inclusion through Technology) provides notebook computers for children with severe disabilities. The computers offer many input and output options. Assessing the students' equipment needs is a complex process, requiring determination of communication goals and baseline abilities, and consideration of equipment features such as…
Electromagnetic Counter-Counter Measure (ECCM) Techniques of the Digital Microwave Radio.
1982-05-01
Frequency hopping requires special synthesizers and filter banks. Large bandwidth expansion in a microwave radio relay application can best be achieved with... processing gain; performance as a function of jammer modulation type; pulse jammer performance; emission bandwidth and spectral shaping... Spectral efficiency, implementation complexity, and suitability for ECCM techniques will be considered. A summary of the requirements and characteristics of
Microwave intersatellite links for communications satellites
NASA Technical Reports Server (NTRS)
Welti, G. R.
1982-01-01
Applications and interface requirements for intersatellite links (ISLs) between commercial communications satellites are reviewed, ranging from ISLs between widely separated satellites to ISLs between clustered satellites. On-board processing architectures for ISLs employing a variety of modulation schemes are described. These schemes include FM remodulation and QPSK regeneration in combination with switching and buffering. The various architectures are compared in terms of complexity, required performance, antenna size, mass, and power.
Ultra-Wideband Impulse Radio for Tactical Ad-Hoc Military Communications
2010-09-02
"Synchronization, Channel Estimation, and Detection for DS-CDMA Impulse-Radio Systems," IEEE Transactions on Wireless Communications, vol. 4, no. 6, pp... desired user. Complex matrix operations required by other techniques found in the CDMA literature are not required in our suppression process... domain while a frequency-domain procedure for synchronization is studied in [52]. In the CDMA literature, near-far resistant synchronization is studied
NASA Technical Reports Server (NTRS)
Stewart, Helen; Spence, Matt Chew; Holm, Jeanne; Koga, Dennis (Technical Monitor)
2001-01-01
This white paper explores how to increase the success and operation of critical, complex, national systems by effectively capturing knowledge management requirements within the federal acquisition process. Although we focus on aerospace flight systems, the principles outlined within may have a general applicability to other critical federal systems as well. Fundamental design deficiencies in federal, mission-critical systems have contributed to recent, highly visible system failures, such as the V-22 Osprey and the Delta rocket family. These failures indicate that the current mechanisms for knowledge management and risk management are inadequate to meet the challenges imposed by the rising complexity of critical systems. Failures of aerospace system operations and vehicles may have been prevented or lessened through utilization of better knowledge management and information management techniques.
Lesbats, Paul; Botbol, Yair; Chevereau, Guillaume; Vaillant, Cédric; Calmels, Christina; Arneodo, Alain; Andreola, Marie-Line; Lavigne, Marc; Parissi, Vincent
2011-01-01
Establishment of stable HIV-1 infection requires the efficient integration of the retroviral genome into the host DNA. The molecular mechanism underlying the control of this process by the chromatin structure has not yet been elucidated. We show here that stably associated nucleosomes strongly inhibit two-viral-end integration in vitro by decreasing the accessibility of DNA to integrase. Remodeling of the chromatinized template by the SWI/SNF complex, whose major component INI1 interacts with IN, restores and redirects full-site integration into the stable nucleosome region. These effects are not observed after remodeling by other human remodeling factors such as SNF2H or BRG1 lacking the integrase-binding protein INI1. This suggests that the restoration process depends on the direct interaction between IN and the whole SWI/SNF complex, supporting a functional coupling between the remodeling and integration complexes. Furthermore, in silico comparison between more than 40,000 non-redundant cellular integration sites selected from the literature and nucleosome occupancy predictions also supports the idea that HIV-1 integration is promoted in genomic regions of weaker intrinsic nucleosome density in the infected cell. Our data indicate that some chromatin structures can be refractory to integration and that coupling between nucleosome remodeling and HIV-1 integration is required to overcome this natural barrier. PMID:21347347
From path models to commands during additive printing of large-scale architectural designs
NASA Astrophysics Data System (ADS)
Chepchurov, M. S.; Zhukov, E. M.; Yakovlev, E. A.; Matveykin, V. G.
2018-05-01
The article considers the problem of automating the formation of large complex parts, products, and structures, especially for unique or small-batch objects produced by additive technology [1]. Research into the optimal design of a robotic complex, its modes of operation, and the structure of its control system helped define the technical requirements for the manufacturing process and the installation design of the robotic complex. Research on virtual models of the robotic complexes allowed the main directions of design improvement to be defined, along with the main purpose of testing the manufactured prototype: checking the positioning accuracy of the working part.
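To illustrate the path-model-to-command step the title refers to (this is an invented converter, not the article's controller), the sketch below turns a list of XY waypoints into simple motion commands for a printing head; the command names, feed rate, and extrusion factor are hypothetical.

```python
import math

# Hypothetical converter: path model (XY waypoints) -> motion command strings.
def path_to_commands(waypoints, feed_mm_s=40.0, extrude_per_mm=0.05):
    x0, y0 = waypoints[0]
    commands = [f"MOVE X{x0:.2f} Y{y0:.2f}"]   # travel move to the path start
    for (xa, ya), (xb, yb) in zip(waypoints, waypoints[1:]):
        dist = math.hypot(xb - xa, yb - ya)    # segment length drives extrusion
        commands.append(
            f"PRINT X{xb:.2f} Y{yb:.2f} E{dist * extrude_per_mm:.3f} F{feed_mm_s:.0f}"
        )
    return commands

for line in path_to_commands([(0, 0), (100, 0), (100, 50)]):
    print(line)
```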
Structure-Based Characterization of Multiprotein Complexes
Wiederstein, Markus; Gruber, Markus; Frank, Karl; Melo, Francisco; Sippl, Manfred J.
2014-01-01
Multiprotein complexes govern virtually all cellular processes. Their 3D structures provide important clues to their biological roles, especially through structural correlations among protein molecules and complexes. The detection of such correlations generally requires comprehensive searches in databases of known protein structures by means of appropriate structure-matching techniques. Here, we present a high-speed structure search engine capable of instantly matching large protein oligomers against the complete and up-to-date database of biologically functional assemblies of protein molecules. We use this tool to reveal unseen structural correlations on the level of protein quaternary structure and demonstrate its general usefulness for efficiently exploring complex structural relationships among known protein assemblies. PMID:24954616
A new route for the synthesis of titanium silicalite-1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vasile, Aurelia, E-mail: aurelia_vasile@yahoo.com; Busuioc-Tomoiaga, Alina Maria; Catalysis Research Department, ChemPerformance SRL, Iasi 700337
2012-01-15
Graphical abstract: Well-prepared TS-1 was synthesized by an innovative procedure using inexpensive reagents such as fumed silica and TPABr as structure-directing agent. This is the first time that highly crystalline TS-1 has been obtained in basic medium, using sodium hydroxide as the HO⁻ ion source required for the crystallization process. Hydrolysis of the titanium source was prevented by titanium complexation with acetylacetone before structuring the gel. Highlights: TS-1 was obtained using cheap reagents such as fumed silica and tetrapropylammonium bromide. For the first time, NaOH was used as the source of OH⁻ ions required for the crystallization process. The hydrolysis of Ti alkoxides was controlled by Ti complexation with 2,4-pentanedione. -- Abstract: A new and efficient route using inexpensive reagents such as fumed silica and tetrapropylammonium bromide is proposed for the synthesis of titanium silicalite-1. Highly crystalline titanium silicalite-1 was obtained in alkaline medium, using sodium hydroxide as the HO⁻ ion source required for the crystallization process. Hydrolysis of the titanium source with formation of insoluble oxide species was prevented by titanium complexation with acetylacetone before structuring the gel. The final solids were fully characterized by powder X-ray diffraction, scanning electron microscopy, Fourier transform infrared, ultraviolet-visible diffuse reflectance, Raman, and atomic absorption spectroscopies, as well as nitrogen sorption analysis. It was found that a molar ratio Ti:Si of about 0.04 in the initial reaction mixture is the upper limit up to which well-formed titanium silicalite-1 with channels free of crystalline or amorphous material can be obtained. Above this value, solids with MFI-type structure containing both Ti isomorphously substituted in the network and extra-lattice anatase nanoparticles inside the channels are formed.
Geology is the Key to Explain Igneous Activity in the Mediterranean Area
NASA Astrophysics Data System (ADS)
Lustrino, M.
2014-12-01
Igneous activity in tectonically complex areas can be interpreted in many different ways, producing completely different petrogenetic models. Processes such as oceanic and continental subduction, lithospheric delamination, changes in subduction polarity, slab break-off and mantle plumes have all been advocated as causes for changes in plate boundaries and magma production, including rate and temporal distribution, in the circum-Mediterranean area. This region thus provides a natural laboratory to investigate a range of geodynamic and magmatic processes. Although many petrologic and tectonic models have been proposed, a number of highly controversial questions still remain. No consensus has yet been reached about the capacity of plate-tectonic processes to explain the origin and style of the magmatism. Similarly, there is still no consensus on the ability of geochemical and petrological arguments to reveal the geodynamic evolution of the area. The wide range of chemical and mineralogical magma compositions produced within and around the Mediterranean, from carbonatites to strongly silica-undersaturated silico-carbonatites and melilitites to strongly silica-oversaturated rhyolites, complicates models and usually requires a large number of unconstrained assumptions. Can the calcalkaline-sodic alkaline transition be related to any common petrogenetic point? Is igneous activity plate-tectonic- (top-down) or deep-mantle-controlled (bottom-up)? Do the rare carbonatites and carbonate-rich igneous rocks derive from the deep mantle or a normal, CO2-bearing upper mantle? Do ultrapotassic compositions require continental subduction? Understanding chemically complex magmas emplaced in tectonically complex areas requires open minds and the avoidance of dogma and assumptions. Studying the geology and shallow dynamics, not speculating about the deep lower mantle, is the key to understanding the igneous activity.
NASA Astrophysics Data System (ADS)
Komarova, Natalia L.; Urwin, Erin; Wodarz, Dominik
2012-12-01
Complex traits can require the accumulation of multiple mutations that are individually deleterious. Their evolution requires a fitness valley to be crossed, which can take relatively long time spans. A new evolutionary mechanism is described that accelerates the emergence of complex phenotypes, based on a ``division of labor'' game and the occurrence of cheaters. If each intermediate mutation leads to a product that can be shared with others, the complex type can arise relatively quickly as an emergent property among cooperating individuals, without any given individual having to accumulate all mutations. Moreover, the emergence of cheaters that destroy cooperative interactions can lead to the emergence of individuals that have accumulated all necessary mutations on a time scale that is significantly faster than observed in the absence of cooperation and cheating. Application of this mechanism to somatic and microbial evolution is discussed, including evolutionary processes in tumors, biofilms, and viral infections.
Advanced process control framework initiative
NASA Astrophysics Data System (ADS)
Hill, Tom; Nettles, Steve
1997-01-01
The semiconductor industry, one of the world's most fiercely competitive industries, is driven by increasingly complex process technologies and global competition to improve cycle time, quality, and process flexibility. Due to the complexity of these problems, current process control techniques are generally nonautomated, time-consuming, reactive, nonadaptive, and focused on individual fabrication tools and processes. As the semiconductor industry moves into higher-density processes, radical new approaches are required. To address the need for advanced factory-level process control in this environment, Honeywell, Advanced Micro Devices (AMD), and SEMATECH formed the Advanced Process Control Framework Initiative (APCFI) joint research project. The project defines and demonstrates an Advanced Process Control (APC) approach based on SEMATECH's Computer Integrated Manufacturing (CIM) Framework. Its scope includes the coordination of manufacturing execution systems, process control tools, and wafer fabrication equipment to provide the necessary process control capabilities. Moreover, it takes advantage of the CIM Framework to integrate and coordinate applications from other suppliers that provide services necessary for the overall system to function. This presentation discusses the key concept of model-based process control that differentiates the APC Framework. This major improvement over current methods enables new systematic process control by linking knowledge of key process settings to desired product characteristics that reside in models created with commercial model development tools. The unique framework-based approach facilitates integration of commercial tools and reuse of their data by tying them together in an object-based structure. The presentation also explores the perspective of each organization's involvement in the APCFI project. Each has complementary goals and expertise to contribute: Honeywell represents the supplier viewpoint, AMD represents the user with real customer requirements, and SEMATECH provides a consensus-building organization that widely disseminates technology to suppliers and users in the semiconductor industry that face similar equipment and factory control systems challenges.
Competition of simple and complex adoption on interdependent networks
NASA Astrophysics Data System (ADS)
Czaplicka, Agnieszka; Toral, Raul; San Miguel, Maxi
2016-12-01
We consider the competition of two mechanisms for adoption processes: a so-called complex threshold dynamics and a simple susceptible-infected-susceptible (SIS) model. Separately, these mechanisms lead, respectively, to first-order and continuous transitions between nonadoption and adoption phases. We consider two interconnected layers. While all nodes on the first layer follow the complex adoption process, all nodes on the second layer follow the simple adoption process. Coupling between the two adoption processes occurs as a result of the inclusion of some additional interconnections between layers. We find that the transition points and also the nature of the transitions are modified in the coupled dynamics. In the complex adoption layer, the critical threshold required for extension of adoption increases with interlayer connectivity whereas in the case of an isolated single network it would decrease with average connectivity. In addition, the transition can become continuous depending on the detailed interlayer and intralayer connectivities. In the SIS layer, any interlayer connectivity leads to the extension of the adopter phase. Besides, a new transition appears as a sudden drop of the fraction of adopters in the SIS layer. The main numerical findings are described by a mean-field type analytical approach appropriately developed for the threshold-SIS coupled system.
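A minimal simulation sketch of this two-layer setup follows; the network sizes, link densities, and rates are invented, and the random directed adjacency matrices are a crude stand-in for the paper's interconnected-layer topologies.

```python
import numpy as np

# Layer A: fractional-threshold ("complex") adoption. Layer B: SIS.
# Sparse interlayer links C couple the two processes.
rng = np.random.default_rng(4)
n = 200
A = (rng.random((n, n)) < 6 / n).astype(int)   # intralayer adjacency, layer A
B = (rng.random((n, n)) < 6 / n).astype(int)   # intralayer adjacency, layer B
C = (rng.random((n, n)) < 2 / n).astype(int)   # interlayer links A<->B

sA = (rng.random(n) < 0.05).astype(int)        # initial adopters, layer A
sB = (rng.random(n) < 0.05).astype(int)        # initial adopters, layer B
theta, beta, mu = 0.3, 0.08, 0.05              # threshold, infection, recovery

for _ in range(300):
    # Threshold rule: adopt when the adopting fraction of all neighbors >= theta.
    frac = (A @ sA + C @ sB) / np.maximum(A.sum(1) + C.sum(1), 1)
    sA = np.maximum(sA, (frac >= theta).astype(int))
    # SIS rule: per-contact infection from both layers, spontaneous recovery.
    pressure = B @ sB + C.T @ sA
    infect = (rng.random(n) < 1 - (1 - beta) ** pressure).astype(int)
    recover = (rng.random(n) < mu).astype(int)
    sB = sB * (1 - recover) + (1 - sB) * infect

print("adopter fraction A:", sA.mean(), " adopter fraction B:", sB.mean())
```

Sweeping theta or the interlayer density in a script like this is one way to see the shifted transition points the abstract describes.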
Murk, Kai; Blanco Suarez, Elena M.; Cockbill, Louisa M. R.; Banks, Paul; Hanley, Jonathan G.
2013-01-01
Summary Astrocytes exhibit a complex, branched morphology, allowing them to functionally interact with numerous blood vessels, neighboring glial processes and neuronal elements, including synapses. They also respond to central nervous system (CNS) injury by a process known as astrogliosis, which involves morphological changes, including cell body hypertrophy and thickening of major processes. Following severe injury, astrocytes exhibit drastically reduced morphological complexity and collectively form a glial scar. The mechanistic details behind these morphological changes are unknown. Here, we investigate the regulation of the actin-nucleating Arp2/3 complex in controlling dynamic changes in astrocyte morphology. In contrast to other cell types, Arp2/3 inhibition drives the rapid expansion of astrocyte cell bodies and major processes. This intervention results in a reduced morphological complexity of astrocytes in both dissociated culture and in brain slices. We show that this expansion requires functional myosin II downstream of ROCK and RhoA. Knockdown of the Arp2/3 subunit Arp3 or the Arp2/3 activator N-WASP by siRNA also results in cell body expansion and reduced morphological complexity, whereas depleting WAVE2 specifically reduces the branching complexity of astrocyte processes. By contrast, knockdown of the Arp2/3 inhibitor PICK1 increases astrocyte branching complexity. Furthermore, astrocyte expansion induced by ischemic conditions is delayed by PICK1 knockdown or N-WASP overexpression. Our findings identify a new morphological outcome for Arp2/3 activation in restricting rather than promoting outwards movement of the plasma membrane in astrocytes. The Arp2/3 regulators PICK1, and N-WASP and WAVE2 function antagonistically to control the complexity of astrocyte branched morphology, and this mechanism underlies the morphological changes seen in astrocytes during their response to pathological insult. PMID:23843614
Optimized design of embedded DSP system hardware supporting complex algorithms
NASA Astrophysics Data System (ADS)
Li, Yanhua; Wang, Xiangjun; Zhou, Xinling
2003-09-01
The paper presents an optimized design method for a flexible and economical embedded DSP system that can implement complex processing algorithms such as biometric recognition and real-time image processing. It consists of a floating-point DSP, 512 Kbytes of data RAM, 1 Mbyte of FLASH program memory, a CPLD for flexible logic control of the input channel, and an RS-485 transceiver for local network communication. Because the design employs a DSP with a high performance-price ratio, the TMS320C6712, and a large FLASH, this system permits loading and performing complex algorithms with little algorithm optimization and code reduction. The CPLD provides flexible logic control for the whole DSP board, especially the input channel, and allows convenient interfacing between different sensors and the DSP system. The transceiver circuit can transfer data between the DSP and a host computer. The paper also introduces some key technologies that make the whole system work efficiently. Because of the characteristics referred to above, the hardware is a perfect platform for multi-channel data collection, image processing, and other signal processing with high performance and adaptability. The application section of this paper presents how this hardware is adapted for a biometric identification system with high identification precision. The results reveal that this hardware is easy to interface with a CMOS imager and is capable of carrying out complex biometric identification algorithms, which require real-time processing.
Haffeld, Just
2013-11-01
Increasing complexity is following in the wake of rampant globalization. Thus, the discussion about Sustainable Development Goals (SDGs) requires new thinking that departs from a critique of current policy tools in exploration of a complexity-friendly approach. This article argues that potential SDGs should: treat stakeholders, like states, business and civil society actors, as agents on different aggregate levels of networks; incorporate good governance processes that facilitate early involvement of relevant resources, as well as equitable participation, consultative processes, and regular policy and programme implementation reviews; anchor adoption and enforcement of such rules to democratic processes in accountable organizations; and include comprehensive systems evaluations, including procedural indicators. A global framework convention for health could be a suitable instrument for handling some of the challenges related to the governance of a complex environment. It could structure and legitimize government involvement, engage stakeholders, arrange deliberation and decision-making processes with due participation and regular policy review, and define minimum standards for health services. A monitoring scheme could ensure that agents in networks comply according to whole-systems targets, locally defined outcome indicators, and process indicators, thus resolving the paradox of government control vs. local policy space. A convention could thus exploit the energy created in the encounter between civil society, international organizations and national authorities.
Neufeld, Nathan J; Hoyer, Erik H; Cabahug, Philippines; González-Fernández, Marlís; Mehta, Megha; Walker, N Colbey; Powers, Richard L; Mayer, R Samuel
2013-01-01
Lean Six Sigma (LSS) process analysis can be used to increase completeness of discharge summary reports used as a critical communication tool when a patient transitions between levels of care. The authors used the LSS methodology as an intervention to improve systems process. Over the course of the project, 8 required elements were analyzed in the discharge paperwork. The authors analyzed the discharge paperwork of patients (42 patients preintervention and 143 patients postintervention) of a comprehensive integrated inpatient rehabilitation program (CIIRP). Prior to this LSS project, 61.8% of required discharge elements were present. The intervention improved the completeness to 94.2% of the required elements. The percentage of charts that were 100% complete increased from 11.9% to 67.8%. LSS is a well-established process improvement methodology that can be used to make significant improvements in complex health care workflow issues. Specifically, the completeness of discharge documentation required for transition of care to CIIRP can be improved.
Considerations In The Design And Specifications Of An Automatic Inspection System
NASA Astrophysics Data System (ADS)
Lee, David T.
1980-05-01
Considerable activity has been centered around the automation of manufacturing quality control and inspection functions. Several reasons can be cited for this development. The continuous pressure of direct and indirect labor cost increases is only one of the obvious motivations. With the drive for electronics miniaturization come more and more complex processes where control parameters are critical and the yield is highly susceptible to inadequate process monitoring and inspection. With multi-step, multi-layer processes for substrate fabrication, process defects that are not detected and corrected at certain critical points may render the entire subassembly useless. As a process becomes more complex, the time required to test the product increases significantly in the total build cycle. The urgency to reduce test time brings more pressure to improve in-process control and inspection. The advances and improvements of components, assemblies, and systems such as microprocessors, microcomputers, programmable controllers, and other intelligent devices have made the automation of quality control much more cost effective and justifiable.
One-step manufacturing of innovative flat-knitted 3D net-shape preforms for composite applications
NASA Astrophysics Data System (ADS)
Bollengier, Quentin; Wieczorek, Florian; Hellmann, Sven; Trümper, Wolfgang; Cherif, Chokri
2017-10-01
Mostly due to cost-intensive manual processing operations, the production of complex-shaped fibre-reinforced plastic composites (FRPC) is currently very expensive and therefore restricted either to sectors with high added value or to small-batch applications (e.g., in the aerospace or automotive industry). Previous works suggest that the successful integration of conventional textile manufacturing processes into the FRPC process chain is the key to cost-efficient manufacturing of complex three-dimensional (3D) FRPC components with stress-oriented fibre arrangement. Therefore, this work focuses on the development of the multilayer weft knitting technology for the one-step manufacturing of complex 3D net-shaped preforms for high-performance FRPC applications. In order to highlight the advantages of net-shaped multilayer weft-knitted fabrics for the production of complex FRPC parts, seamless preforms such as 3D skin-stringer structures and tubular fabrics with load-oriented fibre arrangement are realised. In this paper, the development of the textile bindings and the technical modifications performed on flat knitting machines are presented. The results show that the multilayer weft knitting technology perfectly meets the requirements for fully automated and reproducible manufacturing of complex 3D textile preforms with stress-oriented fibre arrangement.
NASA Technical Reports Server (NTRS)
Vickers, John H.; Pelham, Larry I.
1993-01-01
Automated fiber placement is a manufacturing process used for producing complex composite structures and represents a notable advance in the state of the art of automated composite manufacturing. The fiber placement capability was established at the Marshall Space Flight Center's (MSFC) Productivity Enhancement Complex in 1992, in collaboration with Thiokol Corporation, to provide materials and processes research and development and to fabricate components for many of the Center's programs. The Fiber Placement System (FPX) was developed as a distinct solution to problems inherent to other automated composite manufacturing systems. This equipment provides unique capabilities to build composite parts in complex 3-D shapes with concave and other asymmetrical configurations. Components with complex geometries and localized reinforcements usually require labor-intensive efforts, resulting in expensive, less reproducible components; the fiber placement system has the features necessary to overcome these conditions. The mechanical systems of the equipment have the motion characteristics of a filament winder and the fiber lay-up attributes of a tape laying machine, with the additional capabilities of differential tow payout speeds, compaction, and cut-restart to selectively place the correct number of fibers where the design dictates. This capability will produce a repeatable process resulting in lower cost and improved quality and reliability.
Overview of Aro Program on Network Science for Human Decision Making
NASA Astrophysics Data System (ADS)
West, Bruce J.
This program brings together researchers from disparate disciplines to work on a complex research problem that defies confinement within any single discipline. Consequently, not only are new and rewarding solutions sought and obtained for a problem of importance to society and the Army, that is, the human dimension of complex networks, but, in addition, collaborations are established that would not otherwise have formed given the traditional disciplinary compartmentalization of research. This program develops the basic research foundation of a science of networks supporting the linkage between the physical and human (cognitive and social) domains as they relate to human decision making. The strategy is to extend the recent methods of non-equilibrium statistical physics to non-stationary, renewal stochastic processes that appear to be characteristic of the interactions among nodes in complex networks. We also pursue understanding of the phenomenon of synchronization, whose mathematical formulation has recently provided insight into how complex networks reach accommodation and cooperation. The theoretical analyses of complex networks, although mathematically rigorous, often elude analytic solutions and require computer simulation and computation to analyze the underlying dynamic process.
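The synchronization phenomenon mentioned above is commonly formalized with the Kuramoto model of coupled phase oscillators; the sketch below (a standard textbook construction, not the program's research code, with invented size, coupling, and rates) shows how an order parameter near 1 signals network-wide accommodation.

```python
import numpy as np

# Kuramoto phase oscillators on a random undirected network.
rng = np.random.default_rng(5)
n, coupling, dt = 100, 1.5, 0.05
adj = (rng.random((n, n)) < 0.1).astype(float)
adj = np.maximum(adj, adj.T)                 # symmetrize: undirected links
omega = rng.normal(0.0, 0.5, n)              # heterogeneous natural frequencies
theta = rng.uniform(0, 2 * np.pi, n)         # random initial phases

for _ in range(2000):
    # Each oscillator is pulled toward the phases of its network neighbors.
    diff = np.sin(theta[None, :] - theta[:, None])
    theta += dt * (omega + coupling * (adj * diff).sum(1) / np.maximum(adj.sum(1), 1))

order = abs(np.exp(1j * theta).mean())       # r = 1 means full synchrony
print("order parameter r:", round(order, 3))
```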
Dubrau, Danilo; Tortorici, M Alejandra; Rey, Félix A; Tautz, Norbert
2017-02-01
The viruses of the family Flaviviridae possess a positive-strand RNA genome and express a single polyprotein which is processed into functional proteins. Initially, the nonstructural (NS) proteins, which are not part of the virions, form complexes capable of genome replication. Later on, the NS proteins also play a critical role in virion formation. The molecular basis to understand how the same proteins form different complexes required in both processes is so far unknown. For pestiviruses, uncleaved NS2-3 is essential for virion morphogenesis while NS3 is required for RNA replication but is not functional in viral assembly. Recently, we identified two gain of function mutations, located in the C-terminal region of NS2 and in the serine protease domain of NS3 (NS3 residue 132), which allow NS2 and NS3 to substitute for uncleaved NS2-3 in particle assembly. We report here the crystal structure of pestivirus NS3-4A showing that the NS3 residue 132 maps to a surface patch interacting with the C-terminal region of NS4A (NS4A-kink region) suggesting a critical role of this contact in virion morphogenesis. We show that destabilization of this interaction, either by alanine exchanges at this NS3/4A-kink interface, led to a gain of function of the NS3/4A complex in particle formation. In contrast, RNA replication and thus replicase assembly requires a stable association between NS3 and the NS4A-kink region. Thus, we propose that two variants of NS3/4A complexes exist in pestivirus infected cells each representing a basic building block required for either RNA replication or virion morphogenesis. This could be further corroborated by trans-complementation studies with a replication-defective NS3/4A double mutant that was still functional in viral assembly. Our observations illustrate the presence of alternative overlapping surfaces providing different contacts between the same proteins, allowing the switch from RNA replication to virion formation.
LABORATORY PROCESS CONTROLLER USING NATURAL LANGUAGE COMMANDS FROM A PERSONAL COMPUTER
NASA Technical Reports Server (NTRS)
Will, H.
1994-01-01
The complex environment of the typical research laboratory requires flexible process control. This program provides natural language process control from an IBM PC or compatible machine. Sometimes process control schedules require changes frequently, even several times per day. These changes may include adding, deleting, and rearranging steps in a process. This program sets up a process control system that can either run without an operator, or be run by workers with limited programming skills. The software system includes three programs. Two of the programs, written in FORTRAN77, record data and control research processes. The third program, written in Pascal, generates the FORTRAN subroutines used by the other two programs to identify the user commands with the user-written device drivers. The software system also includes an input data set which allows the user to define the user commands which are to be executed by the computer. To set the system up the operator writes device driver routines for all of the controlled devices. Once set up, this system requires only an input file containing natural language command lines which tell the system what to do and when to do it. The operator can make up custom commands for operating and taking data from external research equipment at any time of the day or night without the operator in attendance. This process control system requires a personal computer operating under MS-DOS with suitable hardware interfaces to all controlled devices. The program requires a FORTRAN77 compiler and user-written device drivers. This program was developed in 1989 and has a memory requirement of about 62 Kbytes.
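The dispatch pattern that abstract describes, user-defined command words mapped onto device-driver routines, is easy to illustrate in modern terms. The sketch below is a hypothetical Python analogue, not the FORTRAN77/Pascal system itself; the device names and command vocabulary are invented.

```python
# Toy natural-language command dispatcher: a table maps command phrases to
# device-driver callables, and a schedule of plain command lines drives them.

def set_heater(temp): print(f"heater -> {temp} C")      # stand-in driver
def read_gauge(): print("gauge -> 1.2e-6 torr")         # stand-in driver

DISPATCH = {
    "SET HEATER": lambda args: set_heater(float(args[0])),
    "READ GAUGE": lambda args: read_gauge(),
}

def run_schedule(lines):
    for raw in lines:
        line = raw.strip().upper()
        if not line or line.startswith("#"):
            continue                                    # skip blanks/comments
        for command, handler in DISPATCH.items():
            if line.startswith(command):
                handler(line[len(command):].split())    # rest of line = args
                break
        else:
            print(f"unrecognized command: {raw!r}")

run_schedule(["# nightly bakeout", "set heater 120", "read gauge"])
```

As in the original system, adding a new operation means writing one driver routine and one table entry; the schedule file itself stays plain text that non-programmers can edit.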
NASA Technical Reports Server (NTRS)
Shaw, Asa L., Jr.; Kivett, William R.; Taylor, James Y.
1990-01-01
A guide is presented to assist requestors in formulating and submitting the required Complete Package for Information Resources (IR) acquisitions. Advance discussions with cognizant procurement personnel are strongly recommended for complex IR requirements or for those requestors new to the acquisition process. Open Market means the requirement either is not available on GSA Schedule Contract or exceeds the $300,000 threshold and/or the quantity Maximum Order Limitation of the GSA Schedule Contract. Only open market contract acquisitions (i.e., in excess of the $25,000 small purchase threshold), are addressed.
Ran1 functions to control the Cdc10/Sct1 complex through Puc1.
Caligiuri, M; Connolly, T; Beach, D
1997-01-01
We have undertaken a biochemical analysis of the regulation of the G1/S-phase transition and commitment to the cell cycle in the fission yeast Schizosaccharomyces pombe. The execution of Start requires the activity of the Cdc2 protein kinase and the Sct1/Cdc10 transcription complex. Progression through G1 also requires the Ran1 protein kinase whose inactivation leads to activation of the meiotic pathway under conditions normally inhibitory to this process. We have found that in addition to Cdc2, Sct1/Cdc10 complex formation requires Ran1. We demonstrate that the Puc1 cyclin associates with Ran1 and Cdc10 in vivo and that the Ran1 protein kinase functions to control the association between Puc1 and Cdc10. In addition, we present evidence that the phosphorylation state of Cdc10 is altered upon inactivation of Ran1. These results provide biochemical evidence that demonstrate one mechanism by which the Ran1 protein kinase serves to control cell fate through Cdc10 and Puc1. PMID:9201720
Links, Amanda E.; Draper, David; Lee, Elizabeth; Guzman, Jessica; Valivullah, Zaheer; Maduro, Valerie; Lebedev, Vlad; Didenko, Maxim; Tomlin, Garrick; Brudno, Michael; Girdea, Marta; Dumitriu, Sergiu; Haendel, Melissa A.; Mungall, Christopher J.; Smedley, Damian; Hochheiser, Harry; Arnold, Andrew M.; Coessens, Bert; Verhoeven, Steven; Bone, William; Adams, David; Boerkoel, Cornelius F.; Gahl, William A.; Sincan, Murat
2016-01-01
The National Institutes of Health Undiagnosed Diseases Program (NIH UDP) applies translational research systematically to diagnose patients with undiagnosed diseases. The challenge is to implement an information system enabling scalable translational research. The authors hypothesized that similar complex problems are resolvable through process management and the distributed cognition of communities. The team, therefore, built the NIH UDP integrated collaboration system (UDPICS) to form virtual collaborative multidisciplinary research networks or communities. UDPICS supports these communities through integrated process management, ontology-based phenotyping, biospecimen management, cloud-based genomic analysis, and an electronic laboratory notebook. UDPICS provided a mechanism for efficient, transparent, and scalable translational research and thereby addressed many of the complex and diverse research and logistical problems of the NIH UDP. Full definition of the strengths and deficiencies of UDPICS will require formal qualitative and quantitative usability and process improvement measurement. PMID:27785453
Direct brain recordings reveal hippocampal rhythm underpinnings of language processing.
Piai, Vitória; Anderson, Kristopher L; Lin, Jack J; Dewar, Callum; Parvizi, Josef; Dronkers, Nina F; Knight, Robert T
2016-10-04
Language is classically thought to be supported by perisylvian cortical regions. Here we provide intracranial evidence linking the hippocampal complex to linguistic processing. We used direct recordings from the hippocampal structures to investigate whether theta oscillations, pivotal in memory function, track the amount of contextual linguistic information provided in sentences. Twelve participants heard sentences that were either constrained ("She locked the door with the") or unconstrained ("She walked in here with the") before presentation of the final word ("key"), shown as a picture that participants had to name. Hippocampal theta power increased for constrained relative to unconstrained contexts during sentence processing, preceding picture presentation. Our study implicates hippocampal theta oscillations in a language task using natural language associations that do not require memorization. These findings reveal that the hippocampal complex contributes to language in an active fashion, relating incoming words to stored semantic knowledge, a necessary process in the generation of sentence meaning.
Using AI and Semantic Web Technologies to attack Process Complexity in Open Systems
NASA Astrophysics Data System (ADS)
Thompson, Simon; Giles, Nick; Li, Yang; Gharib, Hamid; Nguyen, Thuc Duong
Recently, many vendors and groups have advocated using BPEL and WS-BPEL as a workflow language to encapsulate business logic. While encapsulating workflow and process logic in one place is a sensible architectural decision, the implementation of complex workflows suffers from the same problems that made managing and maintaining hierarchical procedural programs difficult. BPEL lacks constructs for logical modularity, such as the requirements construct from the STL [12], or the ability to adapt constructs like pure abstract classes for the same purpose. We describe a system that uses semantic web and agent concepts to implement an abstraction layer for BPEL based on the notion of goals and service typing. AI planning was used to enable process engineers to create and validate systems that used services and goals as first-class concepts and compiled processes at run time for execution.
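The goal/typed-service idea can be illustrated with a trivial forward-chaining planner: services are described by the facts they require and produce, and a plan is any sequence that derives the goal fact. This is a generic sketch under invented service names, not the authors' BPEL compiler.

```python
# Each service: (facts it requires, facts it produces). Names are invented.
SERVICES = {
    "lookup_customer": ({"customer_id"}, {"customer_record"}),
    "check_credit":    ({"customer_record"}, {"credit_score"}),
    "approve_order":   ({"credit_score", "order"}, {"approved_order"}),
}

def plan(available, goal):
    """Forward-chain over services until the goal fact is derivable."""
    plan_steps, facts = [], set(available)
    while goal not in facts:
        progress = False
        for name, (needs, gives) in SERVICES.items():
            if needs <= facts and not gives <= facts:  # applicable and useful
                plan_steps.append(name)
                facts |= gives
                progress = True
        if not progress:
            raise ValueError(f"goal {goal!r} unreachable from {available}")
    return plan_steps

print(plan({"customer_id", "order"}, "approved_order"))
# -> ['lookup_customer', 'check_credit', 'approve_order']
```

A validated plan like this could then be lowered to concrete workflow invocations, which is the run-time compilation step the abstract alludes to.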
1992-01-01
T cell stimulation by the human immunodeficiency virus 1 gp160-derived peptide p18 presented by H-2Dd class I major histocompatibility complex molecules in a cell-free system was found to require proteolytic cleavage. This extracellular processing was mediated by peptidases present in fetal calf serum. In vitro processing of p18 resulted in a distinct reverse-phase high-performance liquid chromatography profile, from which a biologically active product was isolated and sequenced. This peptide processing can be specifically blocked by the angiotensin-1 converting enzyme (ACE) inhibitor captopril, and can occur by exposing p18 to purified ACE. The ability of naturally occurring extracellular proteases to convert inactive peptides to T cell antigens has important implications for understanding cytotoxic T lymphocyte responses in vivo, and for rational peptide vaccine design. PMID:1316930
Arginine production in the neonate
USDA-ARS?s Scientific Manuscript database
Endogenous arginine synthesis in adults is a complex multiorgan process, in which citrulline is synthesized in the gut, enters the general circulation, and is converted into arginine in the kidney, by what is known as the intestinal-renal axis. In neonates, the enzymes required to convert citrulline...
Understand and Advocate for Communities First
ERIC Educational Resources Information Center
Khalifa, Muhammad; Arnold, Noelle Witherspoon; Newcomb, Whitney
2015-01-01
Culturally responsive parent-school relationships require educators to consider the cultural practices and understandings of families as a necessary condition of greater academic achievement. The establishment of healthy parent-school relationships is a complex and dynamic process. A school-community overlap, with a priority given to community…
Hydrological modeling of upper Indus Basin and assessment of deltaic ecology
USDA-ARS?s Scientific Manuscript database
Managing water resources is mostly required at the watershed scale, where the complex hydrological processes and interactions linking the land surface, climatic factors, and human activities can be studied. A Geographical Information System-based watershed model, the Soil and Water Assessment Tool (SWAT), is applied f...
Hydrological modeling in forested systems
H.E. Golden; G.R. Evenson; S. Tian; Devendra Amatya; Ge Sun
2015-01-01
Characterizing and quantifying interactions among components of the forest hydrological cycle is complex and usually requires a combination of field monitoring and modelling approaches (Weiler and McDonnell, 2004; National Research Council, 2008). Models are important tools for testing hypotheses, understanding hydrological processes and synthesizing experimental data...
Using DSP technology to simplify deep space ranging
NASA Technical Reports Server (NTRS)
Bryant, S.
2000-01-01
Commercially available Digital Signal Processing (DSP) technology has enabled a new spacecraft ranging design. The new design reduces overall size, parts count, and complexity. The design implementation will also meet the Jet Propulsion Laboratory (JPL) requirements for both near-Earth and deep space ranging.
Conversion from Tree to Graph Representation of Requirements
NASA Technical Reports Server (NTRS)
Mayank, Vimal; Everett, David Frank; Shmunis, Natalya; Austin, Mark
2009-01-01
A procedure and software to implement the procedure have been devised to enable conversion from a tree representation to a graph representation of the requirements governing the development and design of an engineering system. The need for this procedure and software and for other requirements-management tools arises as follows: In systems-engineering circles, it is well known that requirements-management capability improves the likelihood of success in the team-based development of complex systems involving multiple technological disciplines. It is especially desirable to be able to visualize (in order to identify and manage) requirements early in the system-design process, when errors can be corrected most easily and inexpensively.
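A hedged sketch of the conversion idea follows: a tree export duplicates any requirement referenced from several places, and merging nodes that share an identifier recovers the underlying graph. The field names and IDs are invented for illustration, not the tool's actual schema.

```python
# Tree-to-graph sketch: requirements exported as a tree duplicate any item
# referenced from several places; merging nodes that share an identifier
# recovers the underlying graph. Field names and IDs are invented.
tree = {
    "id": "SYS-1", "children": [
        {"id": "REQ-10", "children": [{"id": "REQ-42", "children": []}]},
        {"id": "REQ-20", "children": [{"id": "REQ-42", "children": []}]},  # duplicate
    ],
}

nodes, edges = set(), set()

def visit(node):
    """Walk the tree; identical ids collapse to one graph node."""
    nodes.add(node["id"])
    for child in node["children"]:
        edges.add((node["id"], child["id"]))
        visit(child)

visit(tree)
print(sorted(nodes))   # REQ-42 appears once, not twice
print(sorted(edges))   # both parents now point at the same node
```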
NASA Technical Reports Server (NTRS)
1979-01-01
Information to identify viable coal gasification and utilization technologies is presented. Analysis capabilities required to support design and implementation of coal based synthetic fuels complexes are identified. The potential market in the Southeast United States for coal based synthetic fuels is investigated. A requirements analysis to identify the types of modeling and analysis capabilities required to conduct and monitor coal gasification project designs is discussed. Models and methodologies to satisfy these requirements are identified and evaluated, and recommendations are developed. Requirements for development of technology and data needed to improve gasification feasibility and economies are examined.
NASA Technical Reports Server (NTRS)
Rummel, J. A.
1982-01-01
The Mission Science Requirements Document (MSRD) for the First Dedicated Life Sciences Mission (LS-1) represents the culmination of thousands of hours of experiment selection and science requirement definition activities. NASA life sciences has never before attempted to integrate, both scientifically and operationally, a single mission dedicated to life sciences research, and the complexity of the planning required for such an endeavor should be apparent. This set of requirements completes the first phase of a continual process which will attempt to optimize (within available programmatic and mission resources) the science accomplished on this mission.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Due to the increase in the use of Coordinate Measuring Machines (CMMs) to measure fine details and complex geometries in manufacturing, many programs have been written to compile and analyze the data. These programs typically require extensive setup to determine the expected results, in order not only to track the pass/fail status of a dimension but also to use statistical process control (SPC). These extra steps and setup times have been addressed through the CMM Data Analysis Tool, which requires only the output of the CMM to provide both pass/fail analysis on all parts run with the same inspection program and graphs that help visualize where each part measures within the allowed tolerances. This provides feedback not only to the customer for approval of a part during development, but also to machining process engineers, who can identify when any dimension is drifting toward an out-of-tolerance condition during production. The program can handle hundreds of parts with complex dimensions and provides an analysis within minutes.
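A minimal sketch of the tool's two outputs, pass/fail and SPC trend, is shown below. It assumes a made-up record layout (feature, nominal, tolerance, measured); the real tool parses native CMM reports, whose format is not specified here.

```python
# Hedged sketch of pass/fail plus SPC analysis over repeated CMM runs.
import statistics

runs = [  # same inspection feature measured on five successive parts
    {"feature": "bore_dia", "nominal": 10.000, "tol": 0.020, "measured": m}
    for m in (10.004, 10.007, 10.011, 10.013, 10.016)
]

for r in runs:
    dev = r["measured"] - r["nominal"]
    status = "PASS" if abs(dev) <= r["tol"] else "FAIL"
    print(f"{r['feature']}: {r['measured']:.3f} {status} ({dev:+.3f})")

values = [r["measured"] for r in runs]
mean, sd = statistics.mean(values), statistics.stdev(values)
print(f"mean={mean:.4f} sd={sd:.4f} "
      f"UCL={mean + 3 * sd:.4f} LCL={mean - 3 * sd:.4f}")
# Every part passes, but the rising mean warns of drift toward the upper
# tolerance well before the first hard FAIL -- the early warning described.
```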
Arenavirus Budding: A Common Pathway with Mechanistic Differences
Wolff, Svenja; Ebihara, Hideki; Groseth, Allison
2013-01-01
The Arenaviridae is a diverse and growing family of viruses that includes several agents responsible for important human diseases. Despite the importance of this family for public health, particularly in Africa and South America, much of its biology remains poorly understood. However, in recent years significant progress has been made in this regard, particularly relating to the formation and release of new enveloped virions, which is an essential step in the viral lifecycle. While this process is mediated chiefly by the viral matrix protein Z, recent evidence suggests that for some viruses the nucleoprotein (NP) is also required to enhance the budding process. Here we highlight and compare the distinct budding mechanisms of different arenaviruses, concentrating on the role of the matrix protein Z, its known late domain sequences, and the involvement of cellular endosomal sorting complex required for transport (ESCRT) pathway components. Finally we address the recently described roles for the nucleoprotein NP in budding and ribonucleoprotein complex (RNP) incorporation, as well as discussing possible mechanisms related to its involvement. PMID:23435234
From Metacognition to Practice Cognition: The DNP e-Portfolio to Promote Integrated Learning.
Anderson, Kelley M; DesLauriers, Patricia; Horvath, Catherine H; Slota, Margaret; Farley, Jean Nelson
2017-08-01
Educating Doctor of Nursing Practice (DNP) students for an increasingly complex health care environment requires novel applications of learning concepts and technology. A deliberate and thoughtful process is required to integrate concepts of the DNP program into practice paradigm changes to subsequently improve students' abilities to innovate solutions to complex practice problems. The authors constructed or participated in electronic portfolio development inspired by theories of metacognition and integrated learning. The objective was to develop DNP student's reflection, integration of concepts, and technological capabilities to foster the deliberative competencies related to the DNP Essentials and the foundations of the DNP program. The pedagogical process demonstrates how e-portfolios adapted into the doctoral-level curriculum for DNP students can address the Essentials and foster the development of metacognitive capabilities, which translates into practice changes. The authors suggest that this pedagogical approach has the potential to optimize reflective and deliberative competencies among DNP students. [J Nurs Educ. 2017;56(8):497-500.]. Copyright 2017, SLACK Incorporated.
Thiam, Hawa-Racine; Vargas, Pablo; Carpi, Nicolas; Crespo, Carolina Lage; Raab, Matthew; Terriac, Emmanuel; King, Megan C.; Jacobelli, Jordan; Alberts, Arthur S.; Stradal, Theresia; Lennon-Dumenil, Ana-Maria; Piel, Matthieu
2016-01-01
Cell migration has two opposite faces: although necessary for physiological processes such as immune responses, it can also have detrimental effects by enabling metastatic cells to invade new organs. In vivo, migration occurs in complex environments and often requires a high cellular deformability, a property limited by the cell nucleus. Here we show that dendritic cells, the sentinels of the immune system, possess a mechanism to pass through micrometric constrictions. This mechanism is based on a rapid Arp2/3-dependent actin nucleation around the nucleus that disrupts the nuclear lamina, the main structure limiting nuclear deformability. The cells' requirement for Arp2/3 to pass through constrictions can be relieved when nuclear stiffness is decreased by suppressing lamin A/C expression. We propose a new role for Arp2/3 in three-dimensional cell migration, allowing fast-moving cells such as leukocytes to rapidly and efficiently migrate through narrow gaps, a process probably important for their function. PMID:26975831
Efficient implementation of neural network deinterlacing
NASA Astrophysics Data System (ADS)
Seo, Guiwon; Choi, Hyunsoo; Lee, Chulhee
2009-02-01
Interlaced scanning has been widely used in most broadcasting systems. However, it produces undesirable artifacts such as jagged patterns, flickering, and line twitter. Moreover, most recent TV monitors use flat panel display technologies such as LCD or PDP, and these monitors require progressive formats. Consequently, the conversion of interlaced video into progressive video is required in many applications, and a number of deinterlacing methods have been proposed. Recently, deinterlacing methods based on neural networks have been proposed with good results. On the other hand, with high-resolution video content such as HDTV, the amount of video data to be processed is very large. As a result, processing time and hardware complexity become important issues. In this paper, we propose an efficient implementation of neural network deinterlacing using polynomial approximation of the sigmoid function. Experimental results show that these approximations provide equivalent performance with a considerable reduction in complexity. This implementation of neural network deinterlacing can be efficiently incorporated in hardware implementations.
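The core trick can be illustrated in a few lines: fit a low-order polynomial to the sigmoid over the input range the network actually sees, then evaluate only multiplies and adds. The 7th order and the [-8, 8] range below are illustrative assumptions, not the paper's reported choices.

```python
# Sketch: replace the sigmoid with a fitted polynomial over the assumed
# activation range; polynomial evaluation avoids exp() and division.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-8.0, 8.0, 2001)
coeffs = np.polyfit(x, sigmoid(x), 7)   # least-squares polynomial fit
poly = np.poly1d(coeffs)

print(f"max error on [-8, 8]: {np.max(np.abs(poly(x) - sigmoid(x))):.4f}")
# Evaluating the polynomial needs only multiplies and adds (Horner's rule),
# which is far cheaper in hardware than an exponential and a divide.
```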
Nondimensional parameter for conformal grinding: combining machine and process parameters
NASA Astrophysics Data System (ADS)
Funkenbusch, Paul D.; Takahashi, Toshio; Gracewski, Sheryl M.; Ruckman, Jeffrey L.
1999-11-01
Conformal grinding of optical materials with CNC (Computer Numerical Control) machining equipment can be used to achieve precise control over complex part configurations. However, complications can arise due to the need to fabricate complex geometrical shapes at reasonable production rates. For example, high machine stiffness is essential, but the need to grind 'inside' small or highly concave surfaces may require tooling with less than ideal stiffness characteristics. If grinding generates loads sufficient for significant tool deflection, the programmed removal depth will not be achieved. Moreover, since grinding load is a function of the volumetric removal rate, the amount of load deflection can vary with location on the part, potentially producing complex figure errors. In addition to machine/tool stiffness and removal rate, load generation is a function of the process parameters. For example, by reducing the feed rate of the tool into the part, both the load and the resultant deflection/removal error can be decreased. However, this must be balanced against the need for part throughput. In this paper, a simple model that permits combining machine stiffness and process parameters into a single nondimensional parameter is adapted to a conformal grinding geometry. Errors in removal can be minimized by maintaining this parameter above a critical value. Moreover, since the value of this parameter depends on the local part geometry, it can be used to optimize process settings during grinding. For example, it may be used to guide adjustment of the feed rate as a function of location on the part to eliminate figure errors while minimizing the total grinding time required.
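The abstract does not give the parameter's exact form, so the sketch below illustrates only the generic compliance argument behind it: the achieved depth of cut depends on a dimensionless ratio of machine stiffness to an assumed process stiffness (force per unit achieved depth). This is a generic tool-deflection model, not the authors' formulation.

```python
# Generic compliance illustration: programmed depth splits into achieved
# depth plus tool deflection, governed only by the stiffness ratio.
def achieved_depth(programmed_depth_um, k_machine_N_per_um, k_process_N_per_um):
    """Static force balance: F = k_process * achieved = k_machine * deflection,
    with programmed = achieved + deflection."""
    ratio = k_machine_N_per_um / k_process_N_per_um  # dimensionless
    return programmed_depth_um * ratio / (1.0 + ratio)

for k_m in (5.0, 20.0, 100.0):  # N/um, hypothetical tooling stiffnesses
    d = achieved_depth(10.0, k_m, k_process_N_per_um=10.0)
    print(f"k_machine={k_m:6.1f} N/um -> achieved {d:5.2f} of 10.00 um programmed")
```

A stiff machine (large ratio) removes nearly the programmed depth; a compliant tool inside a concave surface removes much less, which is exactly the figure-error mechanism the paper describes.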
Route to one-step microstructure mold fabrication for PDMS microfluidic chip
NASA Astrophysics Data System (ADS)
Lv, Xiaoqing; Geng, Zhaoxin; Fan, Zhiyuan; Wang, Shicai; Su, Yue; Fang, Weihao; Pei, Weihua; Chen, Hongda
2018-04-01
Microstructure mold fabrication for PDMS microfluidic chips remains a complex and time-consuming process requiring special equipment and protocols: photolithography and etching. Thus, a rapid and cost-effective method is highly needed. Compared with the traditional microfluidic chip fabrication process based on micro-electromechanical systems (MEMS), this method is simple and easy to implement, and the whole fabrication process requires only 1-2 h. Microstructures of different sizes, from 100 to 1000 μm, were fabricated and used to culture four breast cancer cell lines. Cell viability and morphology were assessed when the cells were cultured in the micro straight channels, micro square holes, and the bonded PDMS-glass microfluidic chip. The experimental results indicate that the microfluidic chip performs well and meets the experimental requirements. This method can greatly reduce the processing time and cost of microfluidic chips, and it provides a simple and effective route for structure design in the field of biological microfabrication and microfluidic chips.
1987-03-01
3/4 hours. Performance tests evaluated simple and choice reaction time to visual stimuli, vigilance, and processing of symbolic, numerical, verbal...minimize the adverse consequences of these stressors. Tyrosine enhanced performance (e.g. complex information processing, vigilance, and reaction time)... processes inherent in many real-world tasks. For example, Map Compass requires association of direction and degree
Use Zircon-Ilmenite Concentrate in Steelmaking
NASA Astrophysics Data System (ADS)
Fedoseev, S. N.; Volkova, T. N.
2016-08-01
Market requirements drive a constant search for new materials and technologies for immediate use, reflecting increasing demands for material and energy efficiency as well as for steel quality. In steelmaking practice, there has recently been a trend toward more stringent requirements for the chemical composition of steel and for its contamination by nonmetallic inclusions, gases, and non-ferrous metals. The main ways of increasing the strength and performance characteristics of fabricated metal products involve a profound and effective influence on the crystallizing metal structure through furnace processing of the melt with refining and modifying additives. It can be argued that furnace processing of steel and iron with chemically active metals (alkali-earth metals, rare-earth metals, and others) is an integral part of modern production of high-quality products and competitive technologies. An important condition for the development of secondary metallurgy methods is the use of relatively inexpensive materials in a variety of complex alloys and blends, allowing targeted control of the physical and chemical state of the molten metal and, therefore, producing steel with improved performance. In this connection, the development of technologies for modifying steel with natural materials, represented here by complex ores containing titanium and zirconium, is a very urgent task.
Santini, Emanuela; Huynh, Thu N.; Klann, Eric
2018-01-01
The complexity of memory formation and its persistence is a phenomenon that has been studied intensely for centuries. Memory exists in many forms and is stored in various brain regions. Generally speaking, memories are reorganized into broadly distributed cortical networks over time through systems level consolidation. At the cellular level, storage of information is believed to initially occur via altered synaptic strength by processes such as long-term potentiation (LTP). New protein synthesis is required for long-lasting synaptic plasticity as well as for the formation of long-term memory. The mammalian target of rapamycin complex 1 (mTORC1) is a critical regulator of cap-dependent protein synthesis and is required for numerous forms of long-lasting synaptic plasticity and long-term memory. As such, the study of mTORC1 and protein factors that control translation initiation and elongation have enhanced our understanding of how the process of protein synthesis is regulated during memory formation. Herein we will discuss the molecular mechanisms that regulate protein synthesis as well as pharmacological and genetic manipulations that demonstrate the requirement for proper translational control in long-lasting synaptic plasticity and long-term memory formation. PMID:24484700
Specialized Environmental Chamber Test Complex: User Test Planning Guide
NASA Technical Reports Server (NTRS)
Montz, Michael E.
2011-01-01
Test process, milestones and inputs are unknowns to first-time users of the Specialized Environmental Test Complex. The User Test Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their test engineering personnel in test planning and execution. Material covered includes a roadmap of the test process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, test article interfaces, and inputs necessary to define test scope, cost, and schedule are included as an appendix to the guide.
Koerner, John F; Coleman, C Norman; Murrain-Hill, Paula; FitzGerald, Denis J; Sullivan, Julie M
2014-06-01
Effective decision making during a rapidly evolving emergency such as a radiological or nuclear incident requires timely interim decisions and communications from onsite decision makers while further data processing, consultation, and review are ongoing by reachback experts. The authors have recently proposed a medical decision model for use during a radiological or nuclear disaster, which is similar in concept to that used in medical care, especially when delay in action can have disastrous effects. For decision makers to function most effectively during a complex response, they require access to onsite subject matter experts who can provide information, recommendations, and participate in public communication efforts. However, in the time before this expertise is available or during the planning phase, just-in-time tools are essential that provide critical overview of the subject matter written specifically for the decision makers. Recognizing the complexity of the science, risk assessment, and multitude of potential response assets that will be required after a nuclear incident, the Office of the Assistant Secretary for Preparedness and Response, in collaboration with other government and non-government experts, has prepared a practical guide for decision makers. This paper illustrates how the medical decision model process could facilitate onsite decision making that includes using the deliberative reachback process from science and policy experts and describes the tools now available to facilitate timely and effective incident management.
Visual short-term memory capacity for simple and complex objects.
Luria, Roy; Sessa, Paola; Gotler, Alex; Jolicoeur, Pierre; Dell'Acqua, Roberto
2010-03-01
Does the capacity of visual short-term memory (VSTM) depend on the complexity of the objects represented in memory? Although some previous findings indicated lower capacity for more complex stimuli, other results suggest that complexity effects arise during retrieval (due to errors in the comparison process with what is in memory) and are not related to storage limitations of VSTM per se. We used ERPs to track neuronal activity specifically related to retention in VSTM by measuring the sustained posterior contralateral negativity during a change detection task (which required detecting whether an item had changed between a memory and a test array). The sustained posterior contralateral negativity during the retention interval was larger for complex objects than for simple objects, suggesting that neurons mediating VSTM needed to work harder to maintain more complex objects. This, in turn, is consistent with the view that VSTM capacity depends on complexity.
Parallel processing via a dual olfactory pathway in the honeybee.
Brill, Martin F; Rosenbaum, Tobias; Reus, Isabelle; Kleineidam, Christoph J; Nawrot, Martin P; Rössler, Wolfgang
2013-02-06
In their natural environment, animals face complex and highly dynamic olfactory input. Thus vertebrates as well as invertebrates require fast and reliable processing of olfactory information. Parallel processing has been shown to improve processing speed and power in other sensory systems and is characterized by extraction of different stimulus parameters along parallel sensory information streams. Honeybees possess an elaborate olfactory system with unique neuronal architecture: a dual olfactory pathway comprising a medial projection-neuron (PN) antennal lobe (AL) protocerebral output tract (m-APT) and a lateral PN AL output tract (l-APT) connecting the olfactory lobes with higher-order brain centers. We asked whether this neuronal architecture serves parallel processing and employed a novel technique for simultaneous multiunit recordings from both tracts. The results revealed response profiles from a high number of PNs of both tracts to floral, pheromonal, and biologically relevant odor mixtures tested over multiple trials. PNs from both tracts responded to all tested odors, but with different characteristics indicating parallel processing of similar odors. Both PN tracts were activated by widely overlapping response profiles, which is a requirement for parallel processing. The l-APT PNs had broad response profiles suggesting generalized coding properties, whereas the responses of m-APT PNs were comparatively weaker and less frequent, indicating higher odor specificity. Comparison of response latencies within and across tracts revealed odor-dependent latencies. We suggest that parallel processing via the honeybee dual olfactory pathway provides enhanced odor processing capabilities serving sophisticated odor perception and olfactory demands associated with a complex olfactory world of this social insect.
To repair or not to repair: with FAVOR there is no question
NASA Astrophysics Data System (ADS)
Garetto, Anthony; Schulz, Kristian; Tabbone, Gilles; Himmelhaus, Michael; Scheruebl, Thomas
2016-10-01
In the mask shop, the challenges associated with today's advanced technology nodes, both technical and economic, are becoming increasingly difficult. The constant drive to continue shrinking features means more masks per device, smaller manufacturing tolerances, and more complexity along the manufacturing line with respect to the number of manufacturing steps required. Furthermore, the extremely competitive nature of the industry makes it critical for mask shops to optimize asset utilization and processes in order to maximize their competitive advantage and, in the end, profitability. Full maximization of profitability in such a complex and technologically sophisticated environment simply cannot be achieved without the use of smart automation. Smart automation allows productivity to be maximized through better asset utilization and process optimization. Reliability is improved through the minimization of manual interactions, leading to fewer human error contributions and a more efficient manufacturing line. In addition to these improvements in productivity and reliability, extra value can be added through the collection and cross-verification of data from multiple sources, which provides more information about products and processes. When it comes to handling mask defects, for instance, the process consists largely of time-consuming manual interactions that are error-prone and often require quick decisions from operators and engineers who are under pressure. The handling of defects is itself a multistep process consisting of several iterations of inspection, disposition, repair, review, and cleaning. Smaller manufacturing tolerances and more complex features contribute both to a higher number of defects that must be handled and to a higher level of complexity in handling them. In this paper, the recent efforts undertaken by ZEISS to provide solutions which address these challenges, particularly those associated with defectivity, are presented. From automation of aerial image analysis to the use of data-driven decision making to predict and propose an optimized back-end-of-line process flow, productivity and reliability improvements are targeted by smart automation. Additionally, the generation of the ideal aerial image from the design and several repair enhancement features offer additional capabilities to improve the efficiency and yield associated with defect handling.
Hoskinson, Anne-Marie
2010-01-01
Biological problems in the twenty-first century are complex and require mathematical insight, often resulting in mathematical models of biological systems. Building mathematical-biological models requires cooperation among biologists and mathematicians, and mastery of building models. A new course in mathematical modeling presented the opportunity to build both content and process learning of mathematical models, the modeling process, and the cooperative process. There was little guidance from the literature on how to build such a course. Here, I describe the iterative process of developing such a course, beginning with objectives and choosing content and process competencies to fulfill the objectives. I include some inductive heuristics for instructors seeking guidance in planning and developing their own courses, and I illustrate with a description of one instructional model cycle. Students completing this class reported gains in learning of modeling content, the modeling process, and cooperative skills. Student content and process mastery increased, as assessed on several objective-driven metrics in many types of assessments.
A modular assembling platform for manufacturing of microsystems by optical tweezers
NASA Astrophysics Data System (ADS)
Ksouri, Sarah Isabelle; Aumann, Andreas; Ghadiri, Reza; Prüfer, Michael; Baer, Sebastian; Ostendorf, Andreas
2013-09-01
Due to the increased complexity of microsystems in terms of materials and geometries, new assembling techniques are required. Assembling techniques from the semiconductor industry are often very specific and cannot fulfill all specifications of more complex microsystems. Therefore, holographic optical tweezers are applied to manipulate structures in the micrometer range with high flexibility and precision. As is well known, non-spherical assemblies can be trapped and controlled by laser light and assembled with an additional light-modulator application, in which the incident laser beam is rearranged into flexible light patterns in order to generate multiple spots. The complementary building blocks are generated by a two-photon polymerization (2PP) process. The possibility of manufacturing arbitrary microstructures and the potential of optical tweezers lead to the idea of combining manufacturing techniques with manipulation processes into "microrobotic" processes. This work presents the manipulation of complex microstructures with optical tools as well as a storage solution for 2PP assemblies. A sample holder has been developed for the manual feeding of 2PP building blocks. Furthermore, a modular assembling platform has been constructed for an 'all-in-one' 2PP manufacturing process as a dedicated storage system. The long-term objective is the automation of feeding and storage of several different 2PP micro-assemblies to realize an automated assembly process.
Arenavirus Stable Signal Peptide Is the Keystone Subunit for Glycoprotein Complex Organization
Bederka, Lydia H.; Bonhomme, Cyrille J.; Ling, Emily L.
2014-01-01
The rodent arenavirus glycoprotein complex encodes a stable signal peptide (SSP) that is an essential structural component of mature virions. The SSP, GP1, and GP2 subunits of the trimeric glycoprotein complex noncovalently interact to stud the surface of virions and initiate arenavirus infectivity. Nascent glycoprotein production undergoes two proteolytic cleavage events: first within the endoplasmic reticulum (ER) to cleave SSP from the remaining precursor GP1/2 (glycoprotein complex [GPC]) glycoprotein and second within the Golgi stacks by the cellular SKI-1/S1P for GP1/2 processing to yield GP1 and GP2 subunits. Cleaved SSP is not degraded but retained as an essential glycoprotein subunit. Here, we defined functions of the 58-amino-acid lymphocytic choriomeningitis virus (LCMV) SSP in regard to glycoprotein complex processing and maturation. Using molecular biology techniques, confocal microscopy, and flow cytometry, we detected SSP at the plasma membrane of transfected cells. Further, we identified a sorting signal (FLLL) near the carboxyl terminus of SSP that is required for glycoprotein maturation and trafficking. In the absence of SSP, the glycoprotein accumulated within the ER and was unable to undergo processing by SKI-1/S1P. Mutation of this highly conserved FLLL motif showed impaired glycoprotein processing and secretory pathway trafficking, as well as defective surface expression and pH-dependent membrane fusion. Immunoprecipitation of SSP confirmed an interaction between the signal peptide and the GP2 subunit; however, mutations within this FLLL motif disrupted the association of the GP1 subunit with the remaining glycoprotein complex. PMID:25352624
The Importance of Water for High Fidelity Information Processing and for Life
NASA Technical Reports Server (NTRS)
Hoehler, Tori M.; Pohorille, Andrew
2011-01-01
Is water an absolute prerequisite for life? Life depends on a variety of non-covalent interactions among molecules, the nature of which is determined as much by the solvent in which they occur as by the molecules themselves. Catalysis and information processing, two essential functions of life, require non-covalent molecular recognition with very high specificity. For example, to correctly reproduce a string consisting of 600,000 units of information (e.g., 600 kilobases, equivalent to the genome of the smallest free-living terrestrial organisms) with a 90% success rate requires specificity > 10^7:1 for the target molecule vs. incorrect alternatives. Such specificity requires (i) that the correct molecular association is energetically stabilized by at least 40 kJ/mol relative to alternatives, and (ii) that the system is able to sample among possible states (alternative molecular associations) rapidly enough to allow the system to fall under thermodynamic control and express the energetic stabilization. We argue that electrostatic interactions are required to confer the necessary energetic stabilization vs. a large library of molecular alternatives, and that a solvent with polarity and dielectric properties comparable to water is required for the system to sample among possible states and express thermodynamic control. Electrostatic associations can be made in non-polar solvents, but the resulting complexes are too stable to be "unmade" with sufficient frequency to confer thermodynamic control on the system. An electrostatic molecular complex representing 3 units of information (e.g., 3 base pairs) with specificity > 10^7 per unit has a stability in non-polar solvent comparable to that of a carbon-carbon bond at room temperature. These considerations suggest that water, or a solvent with properties very like water, is necessary to support high-fidelity information processing, and can therefore be considered a critical prerequisite for life.
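The quoted numbers can be checked with a short calculation, assuming independent per-unit copying errors and a Boltzmann-weighted two-state choice at 298 K (the temperature is an assumption; the abstract says only "room temperature"):

```python
# Worked check of the per-unit specificity and the ~40 kJ/mol figure.
import math

N, target = 600_000, 0.90
per_unit_error = 1.0 - target ** (1.0 / N)   # allowed error rate per unit
print(f"per-unit error ~ {per_unit_error:.1e}, "
      f"required specificity ~ {1.0 / per_unit_error:.1e} : 1")

R, T = 8.314, 298.0                          # J/(mol K); room temperature
dG = R * T * math.log(1e7) / 1000.0          # kJ/mol for 10^7:1 selectivity
print(f"stabilization for 10^7:1 ~ {dG:.0f} kJ/mol")
# -> roughly 2e-07 per-unit error, ~6e+06:1 odds, and ~40 kJ/mol
```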
ERIC Educational Resources Information Center
Bifuh-Ambe, Elizabeth
2013-01-01
Writing is a complex, recursive and difficult process that requires strategic decision-making across multiple domains (Graham, 2006; Pritchard & Honeycutt, 2006). Students are expected to use this process to communicate with a variety of audiences for a variety of purposes. Modelling and providing effective instruction is critical, especially…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernardin, John D; Baca, Allen G
This paper presents the mechanical design, fabrication, and dynamic testing of an electrostatic analyzer spacecraft instrument. The functional and environmental requirements, combined with limited spacecraft accommodations, resulted in complex component geometries, unique material selections, and difficult fabrication processes. The challenging aspects of the mechanical design and several of the more difficult production processes are discussed. In addition, the successes, failures, and lessons learned from acoustic and random vibration testing of a full-scale prototype instrument are presented.
Gsflow-py: An integrated hydrologic model development tool
NASA Astrophysics Data System (ADS)
Gardner, M.; Niswonger, R. G.; Morton, C.; Henson, W.; Huntington, J. L.
2017-12-01
Integrated hydrologic modeling encompasses a vast number of processes and specifications, variable in time and space, and development of model datasets can be arduous. Model input construction techniques have not been formalized or made easily reproducible. Creating the input files for integrated hydrologic models (IHMs) requires complex GIS processing of raster and vector datasets from various sources. Developing stream network topology that is consistent with the model-resolution digital elevation model is important for robust simulation of surface water and groundwater exchanges. Distribution of meteorological parameters over the model domain is difficult in complex terrain at the model resolution scale but is necessary to drive realistic simulations. Historically, development of input data for IHMs has required extensive GIS and computer programming expertise, which has restricted the use of IHMs to research groups with available financial, human, and technical resources. Here we present a series of Python scripts that provide a formalized technique for the parameterization and development of integrated hydrologic model inputs for GSFLOW. With some modifications, this process could be applied to any regular-grid hydrologic model. This Python toolkit automates many of the necessary and laborious processes of parameterization, including stream network development and cascade routing, land coverages, and meteorological distribution over the model domain.
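As an example of one laborious step such a toolkit can automate, the sketch below distributes station precipitation onto a model grid by inverse-distance weighting. The station coordinates, values, and grid spacing are invented, and the actual Gsflow-py methods may differ.

```python
# Hedged sketch: spread station observations over model grid cells by
# inverse-distance weighting (IDW). All numbers are illustrative.
import numpy as np

stations = np.array([[2.0, 3.0], [8.0, 1.0], [5.0, 9.0]])  # x, y in km
precip = np.array([12.0, 7.0, 15.0])                       # mm/day observed

nx, ny, power = 10, 10, 2.0
xs, ys = np.meshgrid(np.arange(nx) + 0.5, np.arange(ny) + 0.5)  # cell centers

# distance from every cell center to every station -> shape (ny, nx, n_sta)
d = np.hypot(xs[..., None] - stations[:, 0], ys[..., None] - stations[:, 1])
w = 1.0 / np.maximum(d, 1e-6) ** power     # guard the cell containing a station
grid = (w * precip).sum(axis=-1) / w.sum(axis=-1)

print(grid.round(1))                       # one value per model cell
```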
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoshikatsu, Yuki; Ishida, Yo-ichi; Sudo, Haruka
Nuclear VCP-like 2 (NVL2) is a member of the chaperone-like AAA-ATPase family and is involved in the biosynthesis of 60S ribosomal subunits in mammalian cells. We previously showed the interaction of NVL2 with a DExD/H-box RNA helicase MTR4/DOB1, which is a known cofactor for an exoribonuclease complex, the exosome. This finding implicated NVL2 in RNA metabolic processes during ribosome biogenesis. In the present study, we found that a series of mutations within the ATPase domain of NVL2 causes a defect in pre-rRNA processing into mature 28S and 5.8S rRNAs. Co-immunoprecipitation analysis showed that NVL2 was associated with the nuclear exosome complex, which includes RRP6 as a nucleus-specific catalytic subunit. This interaction was prevented by depleting either MTR4 or RRP6, indicating their essential role in mediating this interaction with NVL2. Additionally, knockdown of MPP6, another cofactor for the nuclear exosome, also prevented the interaction by causing MTR4 to dissociate from the nuclear exosome. These results suggest that NVL2 is involved in pre-rRNA processing by associating with the nuclear exosome complex and that MPP6 is required for maintaining the integrity of this rRNA processing complex. Highlights: ATPase-deficient mutants of NVL2 have decreased pre-rRNA processing. NVL2 associates with the nuclear exosome through interactions with MTR4 and RRP6. MPP6 stabilizes the MTR4-RRP6 interaction and allows NVL2 to interact with the complex.
A Metrics-Based Approach to Intrusion Detection System Evaluation for Distributed Real-Time Systems
2002-04-01
Based Approach to Intrusion Detection System Evaluation for Distributed Real-Time Systems. Authors: G. A. Fink, B. L. Chappell, T. G. Turner, and... Distributed, Security. 1 Introduction: Processing and cost requirements are driving future naval combat platforms to use distributed, real-time systems of... distributed, real-time systems. As these systems grow more complex, the timing requirements do not diminish; indeed, they may become more constrained
Morris, Kevin J; Corbett, Anita H
2018-06-15
The polyadenosine RNA-binding protein ZC3H14 is important in RNA processing. Although ZC3H14 is ubiquitously expressed, mutation of the ZC3H14 gene causes a non-syndromic form of intellectual disability. Here, we examine the function of ZC3H14 in the brain by identifying ZC3H14-interacting proteins using unbiased mass spectrometry. Through this analysis, we identified physical interactions between ZC3H14 and multiple RNA processing factors. Notably, proteins that comprise the THO complex were amongst the most enriched proteins. We demonstrate that ZC3H14 physically interacts with THO components and that these proteins are required for proper RNA processing, as loss of ZC3H14 or THO components leads to extended bulk poly(A) tail length. Furthermore, we identified the transcripts Atp5g1 and Psd95 as shared RNA targets of ZC3H14 and the THO complex. Our data suggest that ZC3H14 and the THO complex are important for proper processing of Atp5g1 and Psd95 RNA, as depletion of ZC3H14 or THO components leads to decreased steady-state levels of each mature transcript accompanied by accumulation of Atp5g1 and Psd95 pre-mRNA in the cytoplasm. Taken together, this work provides the first unbiased identification of nuclear ZC3H14-interacting proteins from the brain and links the functions of ZC3H14 and the THO complex in the processing of RNA.
Thermal Control Technologies for Complex Spacecraft
NASA Technical Reports Server (NTRS)
Swanson, Theodore D.
2004-01-01
Thermal control is a generic need for all spacecraft. In response to ever more demanding science and exploration requirements, spacecraft are becoming ever more complex, and hence their thermal control systems must evolve. This paper briefly discusses the process of technology development, the state-of-the-art in thermal control, recent experiences with on-orbit two-phase systems, and the emerging thermal control technologies to meet these evolving needs. Some "lessons learned" based on experience with on-orbit systems are also presented.
[Visual hygiene in LED lighting. Modern scientific imaginations].
Deynego, V N; Kaptsov, V A
2014-01-01
The classic and modern paradigms of light perception and its impact on human health are considered. The perception of light is treated as a complex, self-organizing, synergistic system that compresses information in the process of sequencing it. This made it possible to develop a complex of interrelated measures, which may become the basis for modern hygiene, and to define requirements for an LED lamp with a biologically adequate light spectrum, for which patents have been obtained in Russia, Europe, and the USA.
The Detection Method of Escherichia coli in Water Resources: A Review
NASA Astrophysics Data System (ADS)
Nurliyana, M. R.; Sahdan, M. Z.; Wibowo, K. M.; Muslihati, A.; Saim, H.; Ahmad, S. A.; Sari, Y.; Mansor, Z.
2018-04-01
This article reviews several approaches for Escherichia coli (E. coli) bacteria detection, from conventional methods and emerging methods to biosensor-based techniques. Detection and enumeration of E. coli bacteria usually require a long time to obtain results, since a laboratory-based approach is normally used for assessment. It takes 24 to 72 hours after sampling to culture the samples before results are available. Although faster techniques for detecting E. coli in water, such as Polymerase Chain Reaction (PCR) and Enzyme-Linked Immunosorbent Assay (ELISA), have been developed, they still require transporting the samples from the water source to the laboratory and suffer from high cost, complicated equipment, complex procedures, and the need for skilled specialists, which limits their widespread use in water quality monitoring. Recently, the development of biosensor devices that are easy to operate, portable, and highly sensitive and selective has become indispensable for detecting extremely low concentrations of pathogenic E. coli bacteria in water samples.
Metamodeling and optimization of the THF process with pulsating pressure
NASA Astrophysics Data System (ADS)
Bucconi, Marco; Strano, Matteo
2018-05-01
Tube hydroforming (THF) is a process used in various applications to form a tube into a desired complex shape by combining internal pressure, which provides the stress required to yield the material, and axial feeding, which helps the material flow toward the bulging zone. Many studies have demonstrated that wrinkling and bursting defects can be severely reduced by means of a pulsating pressure, and that this so-called hammering hydroforming enhances the formability of the material. Defining the optimal pressure and axial feeding profiles is a daunting challenge in the design phase of the hydroforming operation for a new part. The quality of the formed part is highly dependent on the amplitude and peak value of the pulsating pressure, along with the axial stroke. In this paper, a study is reported that uses explicit finite element simulations of a hammering THF operation and metamodeling techniques to optimize the process parameters for the production of a complex part. The improved formability is explored for different factors, and an optimization strategy is used to determine the most convenient pressure and axial feed profile curves for the hammering THF process of the examined part. It is shown how the pulsating pressure allows the energy input to the process to be minimized while still respecting final quality requirements.
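A minimal sketch of the surrogate-based strategy follows, assuming a quadratic response surface, an invented set of FE results, pressure amplitude as a crude energy proxy, and a 15% thinning limit as the quality requirement; none of these numbers come from the paper.

```python
# Surrogate-based optimization sketch: fit a quadratic response surface to a
# handful of FE results, then pick the lowest-energy feasible design.
import numpy as np

# (pressure amplitude [MPa], axial feed [mm]) -> max thinning [%] from FE runs
X = np.array([[5, 10], [5, 20], [10, 10], [10, 20], [7.5, 15]], float)
y = np.array([22.0, 14.0, 18.0, 9.0, 12.5])

def basis(p, f):  # quadratic response-surface terms
    return np.column_stack([np.ones_like(p), p, f, p * f, p**2, f**2])

beta, *_ = np.linalg.lstsq(basis(X[:, 0], X[:, 1]), y, rcond=None)

# Evaluate the surrogate on a dense grid of candidate designs.
P, F = np.meshgrid(np.linspace(5, 10, 51), np.linspace(10, 20, 51))
thin = (basis(P.ravel(), F.ravel()) @ beta).reshape(P.shape)

cost = np.where(thin <= 15.0, P, np.inf)   # energy proxy: pressure amplitude
i = np.argmin(cost)
print(f"pick p={P.ravel()[i]:.2f} MPa, feed={F.ravel()[i]:.2f} mm, "
      f"predicted thinning={thin.ravel()[i]:.1f} %")
```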
System architectures for telerobotic research
NASA Technical Reports Server (NTRS)
Harrison, F. Wallace
1989-01-01
Several activities related to the definition and creation of telerobotic systems are described. The effort and investment required to create architectures for these complex systems can be enormous; however, the magnitude of the process can be reduced if structured design techniques are applied. A number of informal methodologies supporting certain aspects of the design process are available. More recently, prototypes of integrated tools supporting all phases of system design, from requirements analysis to code generation and hardware layout, have begun to appear. Activities related to the system architecture of telerobots are described, including current activities designed to provide a methodology for the comparison and quantitative analysis of alternative system architectures.
NASA Technical Reports Server (NTRS)
Dean, Edwin B.; Unal, Resit
1991-01-01
Designing for cost is a state of mind. Of course, a lot of technical knowledge is required, and the use of appropriate tools will improve the process. Unfortunately, the extensive use of weight-based cost estimating relationships has generated a perception in the aerospace community that the primary way to reduce cost is to reduce weight. Wrong! Based upon an approximation of an industry-accepted formula, the PRICE H (tm) production equation, Dean demonstrated theoretically that the optimal trajectory for cost reduction is predominantly in the direction of system complexity reduction, not system weight reduction. Thus the phrase "keep it simple" is a primary state of mind required for reducing cost throughout the design process.
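The geometric argument can be illustrated with a hypothetical power-law cost model C = k W^a X^b for weight W and complexity X; the exponents below are invented, not PRICE H values. The relative cost sensitivity to each variable is its exponent, so when the complexity exponent exceeds the weight exponent, the steepest cost-reduction trajectory leans toward complexity.

```python
# Hypothetical power-law cost model; exponents are illustrative only.
import numpy as np

k, a, b = 1.0, 0.8, 2.5        # assumed weight/complexity elasticities
W, X = 1000.0, 5.0             # current design point

C = k * W**a * X**b
grad = np.array([a * C / W, b * C / X])   # (dC/dW, dC/dX)
elasticities = grad * np.array([W, X]) / C  # recovers (a, b)
print(f"cost={C:.3g}, dC/dW={grad[0]:.3g}, dC/dX={grad[1]:.3g}, "
      f"elasticities={elasticities}")
# A 1% complexity reduction cuts cost by ~b percent versus ~a percent for
# weight, so with b > a the optimal trajectory favors complexity reduction.
```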
Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong
2015-02-01
Integration of heterogeneous systems is the key to hospital information construction due to the complexity of the healthcare environment. Currently, during the process of healthcare information system integration, the people participating in an integration project usually communicate through free-format documents, which impairs the efficiency and adaptability of integration. A method utilizing Business Process Model and Notation (BPMN) to model integration requirements and automatically transform them into an executable integration configuration was proposed in this paper. Based on the method, a tool was developed to model integration requirements and transform them into integration configurations. In addition, an integration case in a radiology scenario was used to verify the method.
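A toy version of the transform is sketched below, assuming a hypothetical output schema; real BPMN models and the paper's actual mapping rules are far richer.

```python
# Hedged sketch: read BPMN task and sequence-flow elements and emit a simple
# integration-engine configuration. The snippet and schema are invented.
import xml.etree.ElementTree as ET
import json

BPMN = """<definitions xmlns:bpmn="http://www.omg.org/spec/BPMN/20100524/MODEL">
  <bpmn:process id="radiology_order">
    <bpmn:task id="t1" name="Receive HL7 order"/>
    <bpmn:task id="t2" name="Route to RIS"/>
    <bpmn:sequenceFlow id="f1" sourceRef="t1" targetRef="t2"/>
  </bpmn:process>
</definitions>"""

ns = {"bpmn": "http://www.omg.org/spec/BPMN/20100524/MODEL"}
proc = ET.fromstring(BPMN).find("bpmn:process", ns)

config = {
    "pipeline": proc.get("id"),
    "steps": [{"id": t.get("id"), "action": t.get("name")}
              for t in proc.findall("bpmn:task", ns)],
    "routes": [{"from": f.get("sourceRef"), "to": f.get("targetRef")}
               for f in proc.findall("bpmn:sequenceFlow", ns)],
}
print(json.dumps(config, indent=2))
```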
Szydzik, C; Gavela, A F; Herranz, S; Roccisano, J; Knoerzer, M; Thurgood, P; Khoshmanesh, K; Mitchell, A; Lechuga, L M
2017-08-08
A primary limitation preventing practical implementation of photonic biosensors within point-of-care platforms is their integration with fluidic automation subsystems. For most diagnostic applications, photonic biosensors require complex fluid handling protocols; this is especially prominent in the case of competitive immunoassays, commonly used for detection of low-concentration, low-molecular weight biomarkers. For this reason, complex automated microfluidic systems are needed to realise the full point-of-care potential of photonic biosensors. To fulfil this requirement, we propose an on-chip valve-based microfluidic automation module, capable of automating such complex fluid handling. This module is realised through application of a PDMS injection moulding fabrication technique, recently described in our previous work, which enables practical fabrication of normally closed pneumatically actuated elastomeric valves. In this work, these valves are configured to achieve multiplexed reagent addressing for an on-chip diaphragm pump, providing the sample and reagent processing capabilities required for automation of cyclic competitive immunoassays. Application of this technique simplifies fabrication and introduces the potential for mass production, bringing point-of-care integration of complex automated microfluidics into the realm of practicality. This module is integrated with a highly sensitive, label-free bimodal waveguide photonic biosensor, and is demonstrated in the context of a proof-of-concept biosensing assay, detecting the low-molecular weight antibiotic tetracycline.
Novel technologies for decontamination of fresh and minimally processed fruits and vegetables
USDA-ARS?s Scientific Manuscript database
The complex challenges facing producers and processors of fresh produce require creative applications of conventional treatments and innovative approaches to develop entirely novel treatments. The varied nature of fresh and fresh-cut produce demands solutions that are adapted and optimized for each ...
Building a Community for Science.
ERIC Educational Resources Information Center
Walton, Emma L.
Professional development for effecting school change and school improvement is a community endeavor. While effective professional development requires all components of the local setting to be considered, the complexity of the educational system prohibits simple solutions. Building a community of leaders helps insure success in the change process.…
Tracking Decimal Misconceptions: Strategic Instructional Choices
ERIC Educational Resources Information Center
Griffin, Linda B.
2016-01-01
Understanding the decimal system is challenging, requiring coordination of place-value concepts with features of whole-number and fraction knowledge (Moloney and Stacey 1997). Moreover, the learner must discern if and how previously learned concepts and procedures apply. The process is complex, and misconceptions will naturally arise. In a…
Understanding the visual resource
Floyd L. Newby
1971-01-01
Understanding our visual resources involves a complex interweaving of motivational and cognitive processes; but, more important, it requires that we understand and can identify those characteristics of a landscape that influence the image formation process. From research conducted in Florida, three major variables were identified that appear to have significant effect...
The Building Commissioning Handbook.
ERIC Educational Resources Information Center
Heinz, John A.; Casault, Rick
This book discusses building commissioning, which is the process of certifying that a new facility meets the required specifications. As buildings have become more complex, the traditional methods for building start-up and final acceptance have been proven inadequate, and building commissioning has been developed, which often necessitates the use…
Networked Professional Learning: Relating the Formal and the Informal
ERIC Educational Resources Information Center
Vaessen, Matthieu; van den Beemt, Antoine; de Laat, Maarten
2014-01-01
The increasing complexity of the workplace environment requires teachers and professionals in general to tap into their social networks, inside and outside circles of direct colleagues and collaborators, for finding appropriate knowledge and expertise. This collective process of sharing and constructing knowledge can be considered "networked…
Growth stimulation of Porphyromonas endodontalis by hemoglobin and protoporphyrin IX.
Zerr, M A; Cox, C D; Johnson, W T; Drake, D R
2000-12-01
Porphyromonas endodontalis, like other Porphyromonas species, has a complex set of nutritional requirements. In addition to being an obligate anaerobe, the bacterium must be grown in a complex medium consisting of amino acids, reducing agents, and heme compounds. P. endodontalis accumulates high concentrations of heme pigments, to the extent that colonies appear black on blood agar. This accumulation of heme and the need for these compounds have been characterized as an iron requirement in these species. However, in our studies, P. endodontalis demonstrated growth dependence on hemoglobin or protoporphyrin IX but not on free iron. Iron added to other heme compounds actually decreased growth stimulation by porphyrin-containing compounds. P. endodontalis actively transported free iron, but this process did not appear to be critical for growth. The maximum stimulation of growth by protoporphyrin IX under conditions of iron deprivation suggests that P. endodontalis requires the porphyrin moiety as a growth factor.
Hammond, Thomas M.; Xiao, Hua; Boone, Erin C.; Perdue, Tony D.; Pukkila, Patricia J.; Shiu, Patrick K. T.
2011-01-01
In Neurospora crassa, genes lacking a pairing partner during meiosis are suppressed by a process known as meiotic silencing by unpaired DNA (MSUD). To identify novel MSUD components, we have developed a high-throughput reverse-genetic screen for use with the N. crassa knockout library. Here we describe the screening method and the characterization of a gene (sad-3) subsequently discovered. SAD-3 is a putative helicase required for MSUD and sexual spore production. It exists in a complex with other known MSUD proteins in the perinuclear region, a center for meiotic silencing activity. Orthologs of SAD-3 include Schizosaccharomyces pombe Hrr1, a helicase required for RNAi-induced heterochromatin formation. Both SAD-3 and Hrr1 interact with an RNA-directed RNA polymerase and an Argonaute, suggesting that certain aspects of silencing complex formation may be conserved between the two fungal species. PMID:22384347
Real-time automated failure identification in the Control Center Complex (CCC)
NASA Technical Reports Server (NTRS)
Kirby, Sarah; Lauritsen, Janet; Pack, Ginger; Ha, Anhhoang; Jowers, Steven; Mcnenny, Robert; Truong, The; Dell, James
1993-01-01
A system which will provide real-time failure management support to the Space Station Freedom program is described. The system's use of a simplified form of model-based reasoning qualifies it as an advanced automation system. However, it differs from most such systems in that it was designed from the outset to meet two sets of requirements. First, it must provide a useful increment to the fault management capabilities of the Johnson Space Center (JSC) Control Center Complex (CCC) Fault Detection Management system. Second, it must satisfy CCC operational environment constraints such as cost, computer resource requirements, and verification and validation. The need to meet both requirement sets presents a much greater design challenge than would have been the case had functionality been the sole design consideration. The choice of technology is overviewed, including aspects of that choice and the process for migrating the system into the control center.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Posseme, N., E-mail: nicolas.posseme@cea.fr; Pollet, O.; Barnola, S.
2014-08-04
Silicon nitride spacer etching is considered today one of the most challenging etch processes for new device fabrication. For this step, atomic etch precision is required to stop on silicon or silicon germanium with perfect anisotropy (no foot formation). The situation is that none of the current plasma technologies can meet all these requirements. To overcome these issues and meet the highly complex requirements imposed by device fabrication processes, we recently proposed an alternative etching process to the current plasma etch chemistries. This process is based on thin-film modification by light-ion implantation followed by selective removal of the modified layer with respect to the non-modified material. In this Letter, we demonstrate the benefit of this alternative etch method in terms of film damage control (the silicon germanium recess obtained is less than 6 Å), anisotropy (no foot formation), and its compatibility with other integration steps such as epitaxy. The etch mechanisms of this approach are also addressed.
NASA Astrophysics Data System (ADS)
Calvo, Juan; Nieto, Juanjo
2016-09-01
The management of human crowds in extreme situations is a complex subject which requires taking into account a variety of factors. To name a few, the understanding of human behaviour, the psychological and behavioural features of individuals, the quality of the venue, and the stress level of the pedestrians need to be addressed in order to select the most appropriate action during an evacuation process in a complex venue. In this sense, the mathematical modeling of such complex phenomena can be regarded as a very useful tool to understand and predict these situations. As presented in [4], mathematical models can provide guidance to the personnel in charge of managing evacuation processes by helping to design a set of protocols, among which the most appropriate is then chosen during a given critical situation.
Spreading dynamics on complex networks: a general stochastic approach.
Noël, Pierre-André; Allard, Antoine; Hébert-Dufresne, Laurent; Marceau, Vincent; Dubé, Louis J
2014-12-01
Dynamics on networks is considered from the perspective of Markov stochastic processes. We partially describe the state of the system through network motifs and infer any missing data using the available information. This versatile approach is especially well adapted for modelling spreading processes and/or population dynamics. In particular, the generality of our framework and the fact that its assumptions are explicitly stated suggests that it could be used as a common ground for comparing existing epidemics models too complex for direct comparison, such as agent-based computer simulations. We provide many examples for the special cases of susceptible-infectious-susceptible and susceptible-infectious-removed dynamics (e.g., epidemics propagation) and we observe multiple situations where accurate results may be obtained at low computational cost. Our perspective reveals a subtle balance between the complex requirements of a realistic model and its basic assumptions.
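For context, the sketch below is the kind of explicit agent-level simulation (here SIS dynamics on a small network) that such a stochastic framework aims to approximate analytically; the graph, rates, and seed are arbitrary choices, not from the paper.

```python
# Agent-level SIS simulation on a ring lattice with random shortcuts.
import random

random.seed(1)
N, beta, gamma, steps = 200, 0.3, 0.1, 100   # size, infection, recovery, time

adj = {i: {(i - 1) % N, (i + 1) % N} for i in range(N)}  # ring lattice
for _ in range(40):                                       # random shortcuts
    a, b = random.randrange(N), random.randrange(N)
    if a != b:
        adj[a].add(b); adj[b].add(a)

infected = {random.randrange(N)}                          # single seed case
for t in range(steps):
    new = set(infected)
    for i in infected:
        for j in adj[i]:                     # transmission along each edge
            if j not in infected and random.random() < beta:
                new.add(j)
        if random.random() < gamma:          # recovery back to susceptible
            new.discard(i)
    infected = new
    if t % 20 == 0:
        print(f"t={t:3d} prevalence={len(infected)/N:.2f}")
```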
Hypothalamic Survival Circuits: Blueprints for Purposive Behaviors
Sternson, Scott M.
2015-01-01
Neural processes that direct an animal’s actions toward environmental goals are critical elements for understanding behavior. The hypothalamus is closely associated with motivated behaviors required for survival and reproduction. Intense feeding, drinking, aggressive, and sexual behaviors can be produced by a simple neuronal stimulus applied to discrete hypothalamic regions. What can these “evoked behaviors” teach us about the neural processes that determine behavioral intent and intensity? Small populations of neurons sufficient to evoke a complex motivated behavior may be used as entry points to identify circuits that energize and direct behavior to specific goals. Here, I review recent applications of molecular genetic, optogenetic, and pharmacogenetic approaches that overcome previous limitations for analyzing anatomically complex hypothalamic circuits and their interactions with the rest of the brain. These new tools have the potential to bridge the gaps between neurobiological and psychological thinking about the mechanisms of complex motivated behavior. PMID:23473313
Hypothalamic survival circuits: blueprints for purposive behaviors.
Sternson, Scott M
2013-03-06
Neural processes that direct an animal's actions toward environmental goals are critical elements for understanding behavior. The hypothalamus is closely associated with motivated behaviors required for survival and reproduction. Intense feeding, drinking, aggressive, and sexual behaviors can be produced by a simple neuronal stimulus applied to discrete hypothalamic regions. What can these "evoked behaviors" teach us about the neural processes that determine behavioral intent and intensity? Small populations of neurons sufficient to evoke a complex motivated behavior may be used as entry points to identify circuits that energize and direct behavior to specific goals. Here, I review recent applications of molecular genetic, optogenetic, and pharmacogenetic approaches that overcome previous limitations for analyzing anatomically complex hypothalamic circuits and their interactions with the rest of the brain. These new tools have the potential to bridge the gaps between neurobiological and psychological thinking about the mechanisms of complex motivated behavior. Copyright © 2013 Elsevier Inc. All rights reserved.
Cepeda, Nicholas J.; Blackwell, Katharine A.; Munakata, Yuko
2012-01-01
The rate at which people process information appears to influence many aspects of cognition across the lifespan. However, many commonly accepted measures of “processing speed” may require goal maintenance, manipulation of information in working memory, and decision-making, blurring the distinction between processing speed and executive control and resulting in overestimation of processing-speed contributions to cognition. This concern may apply particularly to studies of developmental change, as even seemingly simple processing speed measures may require executive processes to keep children and older adults on task. We report two new studies and a re-analysis of a published study, testing predictions about how different processing speed measures influence conclusions about executive control across the life span. We find that the choice of processing speed measure affects the relationship observed between processing speed and executive control, in a manner that changes with age, and that choice of processing speed measure affects conclusions about development and the relationship among executive control measures. Implications for understanding processing speed, executive control, and their development are discussed. PMID:23432836
Pointo - a Low Cost Solution to Point Cloud Processing
NASA Astrophysics Data System (ADS)
Houshiar, H.; Winkler, S.
2017-11-01
With advances in technology, access to data, especially 3D point cloud data, is becoming an everyday task. 3D point clouds are usually captured with very expensive tools such as 3D laser scanners, or with very time-consuming methods such as photogrammetry. Most of the available software for 3D point cloud processing is designed for experts and specialists in this field and usually comes as a large package containing a variety of methods and tools. The result is software that is expensive to acquire and difficult to use, the difficulty arising from the complicated user interfaces required to accommodate long feature lists. The aim of these complex packages is to provide a powerful tool for a specific group of specialists, but such tools are not necessarily required by the majority of upcoming average users of point clouds. In addition to their complexity and high cost, these packages generally rely on expensive, modern hardware and are compatible with only one specific operating system. Many point cloud customers are not point cloud processing experts, nor are they willing to bear the high acquisition costs of such software and hardware. In this paper we introduce a solution for low-cost point cloud processing. Our approach is designed to accommodate the needs of the average point cloud user. To reduce cost and complexity, it focuses on one functionality at a time, in contrast with most available software and tools that aim to solve as many problems as possible at once. This simple, user-oriented design improves the user experience and allows us to optimize our methods to produce efficient software. We introduce the Pointo family, a series of connected software tools with simple designs that address different point cloud processing requirements. PointoVIEWER and PointoCAD are presented as the first components of the Pointo family, providing fast and efficient visualization with the ability to add annotation and documentation to the point clouds.
Drunk, but not blind: the effects of alcohol intoxication on change blindness.
Colflesh, Gregory J H; Wiley, Jennifer
2013-03-01
Alcohol use has long been assumed to alter cognition via attentional processes. To better understand the cognitive consequences of intoxication, the present study tested the effects of moderate intoxication (average BAC between .071 and .082) on attentional processing using complex working memory capacity (WMC) span tasks and a change blindness task. Intoxicated and sober participants were matched on baseline WMC performance, and intoxication significantly decreased performance on the complex span tasks. Surprisingly, intoxication improved performance on the change blindness task. The results are interpreted as evidence that intoxication decreases attentional control, causing a shift toward more passive processing and/or a more diffuse attentional state. This may decrease performance on tasks where attentional control or focus is required, but may actually facilitate performance in some contexts. Copyright © 2013 Elsevier Inc. All rights reserved.
Geotechnical approaches to coal ash content control in mining of complex structure deposits
NASA Astrophysics Data System (ADS)
Batugin, SA; Gavrilov, VL; Khoyutanov, EA
2017-02-01
Coal deposits of complex structure with nonuniform-quality coal reserves require improved production quality control processes. The paper proposes a method to represent coal ash content as components of natural and technological dilution. The studies are carried out on the western site of the Elginsk coal deposit, composed of four coal beds of complex structure. The reported estimates of coal ash content in the beds with respect to five components point to the need to account for such data in confirmation exploration, mine planning, and actual mining. Basic means of analyzing and controlling overall ash content and its components are discussed.
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Preheim, Larry E.
1990-01-01
Data systems requirements in the Earth Observing System (EOS) and Space Station Freedom (SSF) eras indicate increasing data volume, increased discipline interplay, higher complexity, and broader data integration and interpretation. A response to the needs of the interdisciplinary investigator is proposed, considering the increasing complexity and rising costs of scientific investigation. The EOS Data Information System, conceived as a widely distributed system with reliable communication links between central processing and the science user community, is described. Details are provided on information architecture, system models, intelligent data management of large complex databases, and standards for archiving ancillary data, using a research library, a laboratory, and collaboration services.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, M. Hope; Truex, Mike; Freshley, Mark
Complex sites are defined as those with difficult subsurface access, deep and/or thick zones of contamination, large areal extent, subsurface heterogeneities that limit the effectiveness of remediation, or where long-term remedies are needed to address contamination (e.g., because of long-term sources or large extent). The Test Area North at the Idaho National Laboratory, developed for nuclear fuel operations and heavy metal manufacturing, is used as a case study. Liquid wastes and sludge from experimental facilities were disposed of in an injection well, which contaminated the subsurface aquifer located deep within fractured basalt. The wastes included organic, inorganic, and low-level radioactive constituents, with the focus of this case study on trichloroethylene. The site is used as an example of a systems-based framework that provides a structured approach to regulatory processes established for remediation under existing regulations. The framework is intended to facilitate remedy decisions and implementation at complex sites where restoration may be uncertain, require long timeframes, or involve use of adaptive management approaches. The framework facilitates site, regulator, and stakeholder interactions during remedial planning and implementation by using a conceptual model description as a technical foundation for decisions; by identifying endpoints, which are interim remediation targets or intermediate decision points on the path to an ultimate end; and by maintaining protectiveness during the remediation process. At the Test Area North, using a structured approach to implementing concepts in the endpoint framework, a three-component remedy is largely functioning as intended and is projected to meet remedial action objectives by 2095, as required. The remedy approach is being adjusted as new data become available. The framework provides a structured process for evaluating and adjusting the remediation approach, allowing site owners, regulators, and stakeholders to manage contamination at complex sites where adaptive remedies are needed.
Vaseem, Mohammad; McKerricher, Garret; Shamim, Atif
2016-01-13
Currently, silver-nanoparticle-based inkjet ink is commercially available. This type of ink has several serious problems such as a complex synthesis protocol, high cost, high sintering temperatures (∼200 °C), particle aggregation, nozzle clogging, poor shelf life, and jetting instability. For the emerging field of printed electronics, these shortcomings in conductive inks are barriers for their widespread use in practical applications. Formulating particle-free silver inks has potential to solve these issues and requires careful design of the silver complexation. The ink complex must meet various requirements, such as in situ reduction, optimum viscosity, storage and jetting stability, smooth uniform sintered films, excellent adhesion, and high conductivity. This study presents a robust formulation of silver-organo-complex (SOC) ink, where complexing molecules act as reducing agents. The 17 wt % silver loaded ink was printed and sintered on a wide range of substrates with uniform surface morphology and excellent adhesion. The jetting stability was monitored for 5 months to confirm that the ink was robust and highly stable with consistent jetting performance. Radio frequency inductors, which are highly sensitive to metal quality, were demonstrated as a proof of concept on flexible PEN substrate. This is a major step toward producing high-quality electronic components with a robust inkjet printing process.
A system level model for preliminary design of a space propulsion solid rocket motor
NASA Astrophysics Data System (ADS)
Schumacher, Daniel M.
Preliminary design of space propulsion solid rocket motors entails a combination of components and subsystems. Expert design tools exist to find near-optimal performance of subsystems and components. However, there is no system-level preliminary design process for space propulsion solid rocket motors that is capable of synthesizing customer requirements into a high-utility design. The preliminary design process for these motors typically builds on existing designs and pursues a feasible rather than the most favorable design. Classical optimization is extremely challenging when dealing with the complex behavior of an integrated system: the complexity and combinations of system configurations make the number of design parameters to be traded off unmanageable when manual techniques are used, and existing multi-disciplinary optimization approaches generally rely on estimated ratios and correlations rather than on mathematical models. The developed system-level model uses a genetic algorithm to perform the population searches needed to efficiently replace the human iterations required during a typical solid rocket motor preliminary design. This research augments, automates, and increases the fidelity of the existing preliminary design process for space propulsion solid rocket motors. The system-level aspect of this preliminary design process, and the ability to synthesize space propulsion solid rocket motor requirements into a near-optimal design, is achievable. The process of developing the motor performance estimate and the system-level model of a space propulsion solid rocket motor is described in detail. The results of this research indicate that the model is valid for use and able to manage a very large number of variable inputs and constraints in pursuit of the best possible design.
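As an illustration of the population search described above, the sketch below runs a bare-bones genetic algorithm over two hypothetical design variables. The variables, bounds, and merit function are invented for the example; this is not the dissertation's motor model.

```python
# Toy genetic-algorithm design search of the kind the abstract describes.
# The two design variables and the "utility" function are placeholders.
import random

random.seed(0)

def utility(x):
    # Hypothetical merit function of two design variables,
    # e.g. (grain length, nozzle expansion ratio), peaking at (2.0, 8.0).
    L, eps = x
    return -((L - 2.0) ** 2 + 0.1 * (eps - 8.0) ** 2)

def make_individual():
    return [random.uniform(0.5, 4.0), random.uniform(2.0, 20.0)]

pop = [make_individual() for _ in range(40)]
for gen in range(50):
    pop.sort(key=utility, reverse=True)
    parents = pop[:10]                                    # truncation selection
    children = []
    while len(children) < 30:
        a, b = random.sample(parents, 2)
        child = [(ai + bi) / 2 for ai, bi in zip(a, b)]   # blend crossover
        if random.random() < 0.3:                         # Gaussian mutation
            k = random.randrange(2)
            child[k] += random.gauss(0, 0.2)
        children.append(child)
    pop = parents + children

best = max(pop, key=utility)
print("best design:", best, "utility:", utility(best))
```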
El-Jardali, Fadi; Ataya, Nour; Jamal, Diana; Jaafar, Maha
2012-05-06
Limited work has been done to promote knowledge translation (KT) in the Eastern Mediterranean Region (EMR). The objectives of this study are to: (1) assess the climate for evidence use in policy; (2) explore views and practices about current processes and weaknesses of health policymaking; (3) identify priorities, including short-term requirements for policy briefs; and (4) identify country-specific requirements for establishing KT platforms. Senior policymakers, stakeholders, and researchers from Algeria, Bahrain, Egypt, Iran, Jordan, Lebanon, Oman, Sudan, Syria, Tunisia, and Yemen participated in this study. Questionnaires were used to assess the climate for use of evidence and to identify windows of opportunity and requirements for policy briefs and for establishing KT platforms. Current processes and weaknesses of policymaking were appraised using case study scenarios. Closed-ended questions were analyzed descriptively; qualitative data were analyzed using thematic analysis. KT activities were not frequently undertaken by policymakers and researchers in EMR countries; research evidence about high-priority policy issues was rarely made available; interaction between policymakers and researchers was limited; and policymakers rarely identified or created places for utilizing research evidence in decision-making processes. Findings emphasized the complexity of policymaking. Donors, political regimes, economic goals, and outdated laws were identified as key drivers. Policymakers' limited ability to think strategically, the constant need to make quick decisions, limited financial resources, and a lack of competent, trained human resources were suggested as the main weaknesses. Despite the complexity of policymaking processes in countries from this region, the absence of a structured process for decision making, and the limited engagement of policymakers and researchers in KT activities, there are windows of opportunity for moving towards more evidence-informed policymaking.
2012-01-01
Objectives: Limited work has been done to promote knowledge translation (KT) in the Eastern Mediterranean Region (EMR). The objectives of this study are to: (1) assess the climate for evidence use in policy; (2) explore views and practices about current processes and weaknesses of health policymaking; (3) identify priorities, including short-term requirements for policy briefs; and (4) identify country-specific requirements for establishing KT platforms. Methods: Senior policymakers, stakeholders, and researchers from Algeria, Bahrain, Egypt, Iran, Jordan, Lebanon, Oman, Sudan, Syria, Tunisia, and Yemen participated in this study. Questionnaires were used to assess the climate for use of evidence and to identify windows of opportunity and requirements for policy briefs and for establishing KT platforms. Current processes and weaknesses of policymaking were appraised using case study scenarios. Closed-ended questions were analyzed descriptively; qualitative data were analyzed using thematic analysis. Results: KT activities were not frequently undertaken by policymakers and researchers in EMR countries; research evidence about high-priority policy issues was rarely made available; interaction between policymakers and researchers was limited; and policymakers rarely identified or created places for utilizing research evidence in decision-making processes. Findings emphasized the complexity of policymaking. Donors, political regimes, economic goals, and outdated laws were identified as key drivers. Policymakers' limited ability to think strategically, the constant need to make quick decisions, limited financial resources, and a lack of competent, trained human resources were suggested as the main weaknesses. Conclusion: Despite the complexity of policymaking processes in countries from this region, the absence of a structured process for decision making, and the limited engagement of policymakers and researchers in KT activities, there are windows of opportunity for moving towards more evidence-informed policymaking. PMID:22559007
Kümmel, D; Heinemann, U
2008-04-01
The term 'tethering factor' has been coined for a heterogeneous group of proteins that are all required for protein trafficking prior to vesicle docking and SNARE-mediated membrane fusion. Two groups of tethering factors can be distinguished: long coiled-coil proteins and multi-subunit complexes. To date, eight such protein complexes have been identified in yeast, and they are required for different trafficking steps. Homologous complexes are found in all eukaryotic organisms, but conservation seems to be less strict than for other components of the trafficking machinery. In fact, for most proposed multi-subunit tethers, the ability to actually bridge two membranes remains to be shown. Here we discuss recent progress in the structural and functional characterization of tethering complexes and present the emerging view that the different complexes are quite diverse in their structure and in the molecular mechanisms underlying their function. TRAPP and the exocyst are the structurally best characterized tethering complexes, and their comparison fails to reveal any similarity on a structural level. Furthermore, their interactions with regulatory Rab GTPases vary, with TRAPP acting as a nucleotide exchange factor and the exocyst being an effector. Considering these differences among the tethering complexes, as well as between their yeast and mammalian orthologs, which are apparent from recent studies, we suggest that tethering complexes do not mediate a strictly conserved process in vesicular transport but are diverse regulators acting after vesicle budding and prior to membrane fusion.
Ramanathan, Rajesh; Walia, Sumeet; Kandjani, Ahmad Esmaielzadeh; Balendran, Sivacarendran; Mohammadtaheri, Mahsa; Bhargava, Suresh Kumar; Kalantar-zadeh, Kourosh; Bansal, Vipul
2015-02-03
A generalized low-temperature approach for fabricating high aspect ratio nanorod arrays of alkali metal-TCNQ (7,7,8,8-tetracyanoquinodimethane) charge transfer complexes at 140 °C is demonstrated. This facile approach overcomes the current limitation associated with fabrication of alkali metal-TCNQ complexes that are based on physical vapor deposition processes and typically require an excess of 800 °C. The compatibility of soft substrates with the proposed low-temperature route allows direct fabrication of NaTCNQ and LiTCNQ nanoarrays on individual cotton threads interwoven within the 3D matrix of textiles. The applicability of these textile-supported TCNQ-based organic charge transfer complexes toward optoelectronics and gas sensing applications is established.
Structure-based characterization of multiprotein complexes.
Wiederstein, Markus; Gruber, Markus; Frank, Karl; Melo, Francisco; Sippl, Manfred J
2014-07-08
Multiprotein complexes govern virtually all cellular processes. Their 3D structures provide important clues to their biological roles, especially through structural correlations among protein molecules and complexes. The detection of such correlations generally requires comprehensive searches in databases of known protein structures by means of appropriate structure-matching techniques. Here, we present a high-speed structure search engine capable of instantly matching large protein oligomers against the complete and up-to-date database of biologically functional assemblies of protein molecules. We use this tool to reveal unseen structural correlations on the level of protein quaternary structure and demonstrate its general usefulness for efficiently exploring complex structural relationships among known protein assemblies. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Situational Analysis for Complex Systems: Methodological Development in Public Health Research.
Martin, Wanda; Pauly, Bernie; MacDonald, Marjorie
2016-01-01
Public health systems have suffered infrastructure losses worldwide. Strengthening them requires not only good policies and programs but also the development of new research methodologies to support public health systems renewal. Our research team considers public health systems to be complex adaptive systems; as such, new methods are necessary to generate knowledge about the process of implementing public health programs and services. Within our program of research, we have employed situational analysis as a method for studying complex adaptive systems in four distinct research studies on public health program implementation. The purpose of this paper is to demonstrate the use of situational analysis as a method for studying complex systems and to highlight the need for further methodological development.
Jurrus, Elizabeth; Watanabe, Shigeki; Giuly, Richard J.; Paiva, Antonio R. C.; Ellisman, Mark H.; Jorgensen, Erik M.; Tasdizen, Tolga
2013-01-01
Neuroscientists are developing new imaging techniques and generating large volumes of data in an effort to understand the complex structure of the nervous system. The complexity and size of these data make human interpretation a labor-intensive task. To aid in the analysis, new segmentation techniques for identifying neurons in these feature-rich datasets are required. This paper presents a method for neuron boundary detection and nonbranching process segmentation in electron microscopy images, and for visualizing the results in three dimensions. It combines automated segmentation techniques with a graphical user interface for correcting mistakes in the automated process. The automated process first uses machine learning and image processing techniques to identify the neuron membranes that delineate the cells in each two-dimensional section. To segment nonbranching processes, the cell regions in each two-dimensional section are connected in 3D using correlation of regions between sections. The combination of this method with a graphical user interface specially designed for this purpose enables users to quickly segment cellular processes in large volumes. PMID:22644867
PAS Kinase Promotes Cell Survival and Growth Through Activation of Rho1
Cardon, Caleb M.; Beck, Thomas; Hall, Michael N.; Rutter, Jared
2014-01-01
In Saccharomyces cerevisiae, phosphorylation of Ugp1 by either of the yeast PASK family protein kinases (yPASK), Psk1 or Psk2, directs this metabolic enzyme to deliver glucose to the cell periphery for synthesis of the cell wall. However, we isolated PSK1 and PSK2 in a high-copy suppressor screen of a temperature-sensitive mutant of target of rapamycin 2 (TOR2). Posttranslational activation of yPASK, either by cell integrity stress or by growth on nonfermentative carbon sources, also suppressed the growth defect resulting from tor2 mutation. Although suppression of the tor2 mutant growth phenotype by activation of yPASK kinase activity required phosphorylation of the metabolic enzyme Ugp1 on serine 11, this resulted in the formation of a complex that induced Rho1 activation, rather than requiring the glucose-partitioning function of Ugp1. In addition to phosphorylated Ugp1, this complex contained Rom2, a Rho1 guanine nucleotide exchange factor, and Ssd1, an mRNA-binding protein. Activation of yPASK-dependent Ugp1 phosphorylation therefore enables two processes that are required for cell growth and stress resistance: synthesis of the cell wall through partitioning glucose to the periphery, and formation of the signaling complex with Rom2 and Ssd1 to promote Rho1-dependent polarized cell growth. This complex may integrate metabolic and signaling responses required for cell growth and survival in suboptimal conditions. PMID:22296835
NASA Astrophysics Data System (ADS)
Azwar; Hussain, M. A.; Abdul-Wahab, A. K.; Zanil, M. F.; Mukhlishien
2018-03-01
One of the major challenges in the bio-hydrogen production process using MEC is that the system is nonlinear and highly complex, mainly due to the presence of microbial interactions and highly complex phenomena in the system. This complexity makes the MEC system difficult to operate and control under optimal conditions. Thus, precise control of the MEC reactor is required so that the amount of current needed to produce hydrogen gas can be regulated according to the composition of the substrate in the reactor. In this work, two schemes for controlling the current and voltage of the MEC were evaluated: a PID controller and an inverse neural network (NN) controller. The comparative study was carried out under optimal conditions for the production of bio-hydrogen gas, wherein the controller output is based on the correlation of optimal current and voltage to the MEC. Various simulation tests involving multiple set-point changes and disturbance rejection were performed, and the performances of both controllers are discussed. The neural-network-based controller achieves faster response times and less overshoot, with minimal offset effects. In conclusion, the inverse-NN-based controller provides better control performance for the MEC system than the PID controller.
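To make the baseline controller concrete, here is a minimal discrete PID loop tracking a set-point change on a toy first-order plant. The plant dynamics and gains are illustrative assumptions standing in for the MEC model, which is not reproduced in the abstract.

```python
# Discrete PID loop on a toy first-order plant (assumed dynamics and gains;
# not the paper's MEC model).
dt = 0.1
Kp, Ki, Kd = 2.0, 0.8, 0.05      # assumed controller gains

y, integ, prev_err = 0.0, 0.0, 0.0
setpoint = 1.0                   # desired (normalized) current

for k in range(200):
    if k == 100:
        setpoint = 0.6           # set-point change, as in the tests described
    err = setpoint - y
    integ += err * dt
    deriv = (err - prev_err) / dt
    u = Kp * err + Ki * integ + Kd * deriv   # PID control law
    prev_err = err
    # First-order plant: tau * dy/dt = -y + u, with tau assumed = 1 s.
    y += dt * (-y + u)
    if k % 25 == 0:
        print(f"t={k*dt:5.1f}s  setpoint={setpoint:.2f}  y={y:.3f}")
```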
Castiñeira Reis, Marta; López, Carlos Silva; Kraka, Elfi; Cremer, Dieter; Faza, Olalla Nieto
2016-09-06
β-Hydride eliminations for ethylgold(III) dichloride complexes are identified as reactions with an unusually long prechemical stage corresponding to the conformational preparation of the reaction complex and spanning six phases. The prechemical process is characterized by a geared rotation of the L-Au-L group (L = Cl) driving methyl group rotation and causing a repositioning of the ligands. This requires more than 28 kcal/mol of the total barrier of 34.0 kcal/mol, according to the unified reaction valley approach, which also determines that the energy requirements of the actual chemical process leading to the β-elimination product are only about 5.5 kcal/mol. A detailed mechanistic analysis was used as a basis for a rational design of substrates (via substituents on the ethyl group) and/or ligands, which can significantly reduce the reaction barrier. This strategy takes advantage of either a higher trans activity of the ligands or a tuned electronic demand of the ethyl group. The β-hydride elimination of gold(I) was found to suffer from strong Coulomb and exchange repulsion when a positively charged hydrogen atom enforces a coordination position in a d(10)-configured gold atom, thus triggering an unassisted σ-π Au(I)-C conversion.
Low complexity lossless compression of underwater sound recordings.
Johnson, Mark; Partan, Jim; Hurst, Tom
2013-03-01
Autonomous listening devices are increasingly used to study vocal aquatic animals, and there is a constant need to record longer or with greater bandwidth, requiring efficient use of memory and battery power. Real-time compression of sound has the potential to extend recording durations and bandwidths at the expense of increased processing operations and therefore power consumption. Whereas lossy methods such as MP3 introduce undesirable artifacts, lossless compression algorithms (e.g., FLAC) guarantee exact data recovery. But these algorithms are relatively complex due to the wide variety of signals they are designed to compress. A simpler lossless algorithm is shown here to provide compression factors of three or more for underwater sound recordings over a range of noise environments. The compressor was evaluated using samples from drifting and animal-borne sound recorders with sampling rates of 16-240 kHz. It achieves >87% of the compression of more-complex methods but requires about 1/10 of the processing operations, resulting in less than 1 mW power consumption at a sampling rate of 192 kHz on a low-power microprocessor. The potential to triple recording duration with a minor increase in power consumption and no loss in sound quality may be especially valuable for battery-limited tags and robotic vehicles.
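The published algorithm itself is not given in the abstract. As a generic illustration of the low-complexity approach it describes, the sketch below combines a first-difference predictor with Rice coding of the residuals; the toy samples and the parameter k are arbitrary assumptions.

```python
# Generic low-complexity lossless scheme in the spirit of the abstract:
# first-difference prediction followed by Rice coding of residuals.
# Illustrative sketch only, not the published compressor.

def zigzag(n):
    # Map signed residual to unsigned: 0, -1, 1, -2, 2 -> 0, 1, 2, 3, 4.
    return (n << 1) if n >= 0 else ((-n) << 1) - 1

def rice_encode(values, k):
    # Rice code: unary-coded quotient + k-bit remainder, as a bit string.
    out = []
    for v in values:
        u = zigzag(v)
        q, r = u >> k, u & ((1 << k) - 1)
        out.append("1" * q + "0" + format(r, f"0{k}b"))
    return "".join(out)

samples = [100, 102, 105, 107, 106, 104, 101, 99]   # toy audio samples
# First sample is coded against a zero predictor; the rest are differences.
residuals = [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]
bits = rice_encode(residuals, k=2)
print(len(bits), "bits vs", 16 * len(samples), "bits uncompressed")
```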
Systems Engineering and Integration for Advanced Life Support System and HST
NASA Technical Reports Server (NTRS)
Kamarani, Ali K.
2005-01-01
The systems engineering (SE) discipline has revolutionized the way engineers and managers think about solving issues related to the design of complex systems. With the continued development of state-of-the-art technologies, systems are becoming more complex; therefore, a systematic approach is essential to control and manage their integrated design and development. This complexity is driven by integration issues: subsystems must interact with one another in order to achieve integration objectives and the overall system's required performance. The systems engineering process addresses these issues at multiple levels. It is a technology and management process dedicated to controlling all aspects of the system life cycle to assure integration at all levels. The Advanced Integration Matrix (AIM) project serves as the systems engineering and integration function for the Human Support Technology (HST) program. AIM provides integrated test facilities and personnel for performance trade studies, analyses, integrated models, test results, and validated requirements for the integration of HST. The goal of AIM is to address systems-level integration issues for exploration missions. It will use an incremental systems integration approach to yield technologies, baselines for further development, and possible breakthrough concepts in the areas of technological and organizational interfaces, total information flow, system-wide controls, technical synergism, mission operations protocols and procedures, and human-machine interfaces.
Development of a Rubric to Improve Critical Thinking
ERIC Educational Resources Information Center
Hildenbrand, Kasee J.; Schultz, Judy A.
2012-01-01
Context: Health care professionals, including athletic trainers are confronted daily with multiple complex problems that require critical thinking. Objective: This research attempts to develop a reliable process to assess students' critical thinking in a variety of athletic training and kinesiology courses. Design: Our first step was to create a…
2007-04-01
Figure 18: Achieving Self-Synchronization; Figure 19: Planning Maturity Model. …and to leverage shared awareness and understanding. Planning, a process that creates the necessary conditions for synchronizing actions and effects… …centric thinking holds that self-synchronization requires some level of shared awareness. In this case, that means cross-domain awareness as well
An Ethnomethodological Perspective on How Middle School Students Addressed a Water Quality Problem
ERIC Educational Resources Information Center
Belland, Brian R.; Gu, Jiangyue; Kim, Nam Ju; Turner, David J.
2016-01-01
Science educators increasingly call for students to address authentic scientific problems in science class. One form of authentic science problem--socioscientific issue--requires that students engage in complex reasoning by considering both scientific and social implications of problems. Computer-based scaffolding can support this process by…
Designing the Regional College Management Information System.
ERIC Educational Resources Information Center
Kin Maung Kywe; And Others
Beginning in 1976, Regional Colleges were formed in Burma to implement career and technical education at the post-secondary level. This paper describes the Regional Colleges and explores the possible use of a systemic management information process that could assist in the complex planning required to develop second-year vocational and technical…
A protocol for parameterization and calibration of RZWQM2 in field research
USDA-ARS?s Scientific Manuscript database
Use of agricultural system models in field research requires a full understanding of both the model and the system it simulates. Since the 1960s, agricultural system models have increased tremendously in their complexity due to greater understanding of the processes simulated, their application to r...
Teaching High-Accuracy Global Positioning System to Undergraduates Using Online Processing Services
ERIC Educational Resources Information Center
Wang, Guoquan
2013-01-01
High-accuracy Global Positioning System (GPS) has become an important geoscientific tool used to measure ground motions associated with plate movements, glacial movements, volcanoes, active faults, landslides, subsidence, slow earthquake events, as well as large earthquakes. Complex calculations are required in order to achieve high-precision…
Biochemistry of the Envenomation Response--A Generator Theme for Interdisciplinary Integration
ERIC Educational Resources Information Center
Montagna, Erik; Guerreiro, Juliano R.; Torres, Bayardo B.
2010-01-01
The understanding of complex physiological processes requires information from many different areas of knowledge. To meet this interdisciplinary scenario, the ability to integrate and articulate information is demanded. The difficulty of such an approach arises because, more often than not, information is fragmented through undergraduate…
Exploring component-based approaches in forest landscape modeling
H. S. He; D. R. Larsen; D. J. Mladenoff
2002-01-01
Forest management issues are increasingly required to be addressed in a spatial context, which has led to the development of spatially explicit forest landscape models. The numerous processes, complex spatial interactions, and diverse applications in spatial modeling make the development of forest landscape models difficult for any single research group. New...
Molecular Thermodynamics for Cell Biology as Taught with Boxes
ERIC Educational Resources Information Center
Mayorga, Luis S.; Lopez, Maria Jose; Becker, Wayne M.
2012-01-01
Thermodynamic principles are basic to an understanding of the complex fluxes of energy and information required to keep cells alive. These microscopic machines are nonequilibrium systems at the micron scale that are maintained in pseudo-steady-state conditions by very sophisticated processes. Therefore, several nonstandard concepts need to be…
The Integration of Children Dependent on Medical Technology into Public Schools
ERIC Educational Resources Information Center
Raymond, Jill A.
2009-01-01
Advances in medicine have increased the survival rates of children with complex medical conditions, including those who are dependent on technology such as ventilators and tracheostomies. The process of integrating children dependent on medical technology into public schools requires the collaboration of a multidisciplinary team to ensure that…
Teaching Consolidations Accounting: An Approach to Easing the Challenge
ERIC Educational Resources Information Center
Murphy, Elizabeth A.; McCarthy, Mark A.
2010-01-01
Teaching and learning accounting for consolidations is a challenging endeavor. Students not only need to understand the conceptual underpinnings of the accounting requirements for consolidations, but also must master the complex accounting needed to prepare consolidated financial statements. To add to the challenge, the consolidation process is…
Identifying Teaching Methods that Engage Entrepreneurship Students
ERIC Educational Resources Information Center
Balan, Peter; Metcalfe, Mike
2012-01-01
Purpose: Entrepreneurship education particularly requires student engagement because of the complexity of the entrepreneurship process. The purpose of this paper is to describe how an established measure of engagement can be used to identify relevant teaching methods that could be used to engage any group of entrepreneurship students.…
Transforming Professional Learning and Practice in Assessment for Learning
ERIC Educational Resources Information Center
Poskitt, Jenny
2014-01-01
Assessing student learning is a complex process requiring teachers to have deep knowledge of the curriculum, assessment, and pedagogy. Changing political climates mean that teachers are expected to respond to new approaches or systems and adjust their classroom practice accordingly. Teachers often engage in professional learning (PL) to assist…
Evaluation of a Complex, Multisite, Multilevel Grants Initiative
ERIC Educational Resources Information Center
Rollison, Julia; Hill, Gary; Yu, Ping; Murray, Stephen; Mannix, Danyelle; Mathews-Younes, Anne; Wells, Michael E.
2012-01-01
The Safe Schools/Healthy Students (SS/HS) national evaluation seeks to assess both the implementation process and the results of the SS/HS initiative, exploring factors that have contributed to or detracted from grantee success. Each site is required to forge partnerships with representatives from education, mental health, juvenile justice, and…
Managing complex research datasets using electronic tools: A meta-analysis exemplar
Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.
2013-01-01
Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256
Managing complex research datasets using electronic tools: a meta-analysis exemplar.
Brown, Sharon A; Martin, Ellen E; Garcia, Theresa J; Winter, Mary A; García, Alexandra A; Brown, Adama; Cuevas, Heather E; Sumlin, Lisa L
2013-06-01
Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, for example, EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process as well as enhancing communication among research team members. The purpose of this article is to describe the electronic processes designed, using commercially available software, for an extensive, quantitative model-testing meta-analysis. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to decide on which electronic tools to use, determine how these tools would be used, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members.
Towards Hybrid Online On-Demand Querying of Realtime Data with Stateful Complex Event Processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Qunzhi; Simmhan, Yogesh; Prasanna, Viktor K.
Emerging Big Data applications in areas like e-commerce and the energy industry require both online and on-demand queries to be performed over vast and fast data arriving as streams. These present novel challenges to Big Data management systems. Complex Event Processing (CEP) is recognized as a high-performance online query scheme which in particular deals with the velocity aspect of the 3 V's of Big Data. However, traditional CEP systems do not consider data variety and lack the capability to embed ad hoc queries over the volume of data streams. In this paper, we propose H2O, a stateful complex event processing framework, to support hybrid online and on-demand queries over realtime data. We propose a semantically enriched event and query model to address data variety. A formal query algebra is developed to precisely capture the stateful and containment semantics of online and on-demand queries. We describe techniques to achieve interactive query processing over realtime data, featuring efficient online querying, dynamic stream data persistence, and on-demand access. The system architecture is presented and the current implementation status reported.
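H2O's own query algebra and API are not given in the abstract. As a generic illustration of the stateful online side of CEP, the sketch below keeps a bounded window of events and fires on a simple spike-then-drop pattern; the pattern, thresholds, and event stream are all invented.

```python
# Minimal stateful event matcher illustrating online CEP (generic sketch,
# not H2O). Pattern: a price SPIKE followed by a DROP within a 3-event window.
from collections import deque

class SpikeThenDrop:
    def __init__(self, window=3):
        # Bounded per-query state, the "stateful" aspect described above.
        self.buffer = deque(maxlen=window)

    def on_event(self, price):
        self.buffer.append(price)
        b = list(self.buffer)
        # Fire when the middle value spikes >10% then falls >5%.
        if len(b) == 3 and b[1] > b[0] * 1.1 and b[2] < b[1] * 0.95:
            print(f"match: spike {b[0]}->{b[1]} then drop to {b[2]}")

m = SpikeThenDrop()
for p in [100, 101, 115, 108, 109, 125, 118]:   # toy event stream
    m.on_event(p)
```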
Modeling of Inelastic Collisions in a Multifluid Plasma: Excitation and Deexcitation
2016-05-31
Approved for public release; distribution unlimited. For publication in Physics of Plasmas, Vol. 22. …the fundamental physical processes may be individually known, it is not always clear how their combination affects the overall operation, or at what… …arises from the complexity of the physical processes needed to be captured in the model. The required level of detail of the CR model is typically not…
Modeling of Inelastic Collisions in a Multifluid Plasma: Excitation and Deexcitation (Preprint)
2015-06-01
Approved for public release; distribution unlimited. For publication in Physics of Plasmas (PA Case…). …the fundamental physical processes may be individually known, it is not always clear how their combination affects the overall operation, or at what… …arises from the complexity of the physical processes needed to be captured in the model. The required level of detail of the CR model is typically not…
Modeling of additive manufacturing processes for metals: Challenges and opportunities
Francois, Marianne M.; Sun, Amy; King, Wayne E.; ...
2017-01-09
Here, with the technology being developed to manufacture metallic parts using increasingly advanced additive manufacturing processes, a new era has opened up for designing novel structural materials, from designing shapes and complex geometries to controlling the microstructure (alloy composition and morphology). The material properties used within specific structural components are also designable in order to meet specific performance requirements that are not imaginable with traditional metal forming and machining (subtractive) techniques.
Infrasound as a Depth Discriminant
2010-09-01
Arrowsmith, Stephen J.; Hartse, Hans E.; Taylor, Steven R.; Stead, Richard J.; Whitaker, Rod W. (Los Alamos…) The identification of a signature relating depth to a remotely recorded infrasound signal requires a dataset of… …can generate infrasound via a variety of processes, which have occasionally been confused in past studies due to the complexity of the process; (2…
Complex multidisciplinary system composition for aerospace vehicle conceptual design
NASA Astrophysics Data System (ADS)
Gonzalez, Lex
Although there exists a vast amount of work concerning the analysis, design, and integration of aerospace vehicle systems, there is no standard for how this data and knowledge should be combined in order to create a synthesis system. Each institution creating a synthesis system has in-house vehicle and hardware components it is attempting to model, and proprietary methods with which to model them. Consequently, synthesis systems begin as one-off creations meant to answer a specific problem. As the scope of a synthesis system grows to encompass more and more problems, so do its size and complexity; for a single synthesis system to answer multiple questions, the number of methods and method interfaces must increase. To break the link between growth in an aircraft synthesis system's capability and growth in its size and complexity, this research effort focuses on the idea that each problem in aerospace requires its own analysis framework. By centering the methodology on matching an analysis framework to the problem being solved, the complexity of the analysis framework is decoupled from the complexity of the system that creates it. The derived methodology allows for the composition of complex multi-disciplinary systems (CMDS) through the automatic creation and implementation of system and disciplinary method interfaces. The CMDS Composition process follows a four-step methodology meant to take a problem definition and progress towards the creation of an analysis framework that answers said problem. The implementation of the CMDS Composition process takes user-selected disciplinary analysis methods and automatically integrates them to create a syntactically composable analysis framework. As a means of assessing the validity of the CMDS Composition process, a prototype system (AVDDBMS) has been developed. AVDDBMS has been used to model the Generic Hypersonic Vehicle (GHV), an open-source family of hypersonic vehicles originating from the Air Force Research Laboratory. AVDDBMS has been applied in three different ways in order to assess its validity: verification using GHV disciplinary data, validation using selected disciplinary analysis methods, and application of the CMDS Composition process to assess the design solution space for the GHV hardware. The research demonstrates the holistic effect that the selection of individual disciplinary analysis methods has on the structure and integration of the analysis framework.
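To illustrate the general idea of automatically composing user-selected disciplinary methods through their declared interfaces, here is a toy forward-chaining composer. The method names and their input/output variables are invented placeholders, not AVDDBMS itself.

```python
# Toy composition of disciplinary methods by matching declared inputs and
# outputs (all names are hypothetical; this is not AVDDBMS).

methods = {
    "geometry":   {"in": {"mission"},          "out": {"planform"}},
    "aero":       {"in": {"planform"},         "out": {"L_over_D"}},
    "propulsion": {"in": {"mission"},          "out": {"Isp"}},
    "sizing":     {"in": {"L_over_D", "Isp"},  "out": {"TOGW"}},
}

def compose(methods, given, goal):
    # Greedy forward chaining: run any method whose inputs are available,
    # until the goal variable has been produced.
    have, order = set(given), []
    pending = dict(methods)
    while goal not in have and pending:
        runnable = [n for n, m in pending.items() if m["in"] <= have]
        if not runnable:
            raise ValueError("no composable path to goal")
        n = runnable[0]
        order.append(n)
        have |= pending.pop(n)["out"]
    return order

print(compose(methods, given={"mission"}, goal="TOGW"))
# -> ['geometry', 'aero', 'propulsion', 'sizing']
```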
An optical Fourier transform coprocessor with direct phase determination.
Macfaden, Alexander J; Gordon, George S D; Wilkinson, Timothy D
2017-10-20
The Fourier transform is a ubiquitous mathematical operation which arises naturally in optics. We propose and demonstrate a practical method to optically evaluate a complex-to-complex discrete Fourier transform. By implementing the Fourier transform optically we can overcome the limiting O(nlogn) complexity of fast Fourier transform algorithms. Efficiently extracting the phase from the well-known optical Fourier transform is challenging. By appropriately decomposing the input and exploiting symmetries of the Fourier transform we are able to determine the phase directly from straightforward intensity measurements, creating an optical Fourier transform with O(n) apparent complexity. Performing larger optical Fourier transforms requires higher resolution spatial light modulators, but the execution time remains unchanged. This method could unlock the potential of the optical Fourier transform to permit 2D complex-to-complex discrete Fourier transforms with a performance that is currently untenable, with applications across information processing and computational physics.
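The exact input decomposition used in the paper is not reproduced in the abstract. The sketch below numerically illustrates the general principle behind direct phase determination: recovering a complex spectrum from intensity-only measurements by adding known reference fields, with an 8-point FFT standing in for the optical transform.

```python
# Recovering a complex DFT from three intensity-only measurements via known
# reference fields (general principle only; not the paper's decomposition).
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(8) + 1j * rng.standard_normal(8)
F = np.fft.fft(x)                  # ground truth (optically: the lens output)

I0 = np.abs(F) ** 2                # plain intensity
I1 = np.abs(F + 1.0) ** 2          # reference beam, phase 0
I2 = np.abs(F + 1j) ** 2           # reference beam, phase pi/2

re = (I1 - I0 - 1.0) / 2.0         # since |z+1|^2 = |z|^2 + 1 + 2*Re(z)
im = (I2 - I0 - 1.0) / 2.0         # since |z+i|^2 = |z|^2 + 1 + 2*Im(z)
F_rec = re + 1j * im

print(np.allclose(F_rec, F))       # True: phase recovered from intensities
```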
Adaptive evolution of complex innovations through stepwise metabolic niche expansion.
Szappanos, Balázs; Fritzemeier, Jonathan; Csörgő, Bálint; Lázár, Viktória; Lu, Xiaowen; Fekete, Gergely; Bálint, Balázs; Herczeg, Róbert; Nagy, István; Notebaart, Richard A; Lercher, Martin J; Pál, Csaba; Papp, Balázs
2016-05-20
A central challenge in evolutionary biology concerns the mechanisms by which complex metabolic innovations requiring multiple mutations arise. Here, we propose that metabolic innovations accessible through the addition of a single reaction serve as stepping stones towards the later establishment of complex metabolic features in another environment. We demonstrate the feasibility of this hypothesis through three complementary analyses. First, using genome-scale metabolic modelling, we show that complex metabolic innovations in Escherichia coli can arise via changing nutrient conditions. Second, using phylogenetic approaches, we demonstrate that the acquisition patterns of complex metabolic pathways during the evolutionary history of bacterial genomes support the hypothesis. Third, we show how adaptation of laboratory populations of E. coli to one carbon source facilitates the later adaptation to another carbon source. Our work demonstrates how complex innovations can evolve through series of adaptive steps without the need to invoke non-adaptive processes.
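As a toy version of the genome-scale modelling mentioned above, the sketch below solves a two-metabolite flux-balance problem as a linear program; the network, bounds, and objective are invented for illustration.

```python
# Toy flux-balance analysis (FBA) of the kind used in genome-scale metabolic
# modelling. The 2-metabolite network and bounds are invented for this sketch.
import numpy as np
from scipy.optimize import linprog

# Metabolites: A, B. Reactions: R1: ->A (uptake), R2: A->B, R3: B-> (biomass).
S = np.array([[ 1, -1,  0],     # steady-state mass balance on A
              [ 0,  1, -1]])    # steady-state mass balance on B

c = [0, 0, -1]                  # maximize biomass flux v3 (linprog minimizes)
bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 (assumed)

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("biomass flux:", res.x[2])           # 10.0: growth limited by uptake
# Setting the uptake bound to (0, 0) models a nutrient shift; adding a single
# new reaction column can restore flux -- the "stepping stone" idea above.
```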
Adaptive evolution of complex innovations through stepwise metabolic niche expansion
Szappanos, Balázs; Fritzemeier, Jonathan; Csörgő, Bálint; Lázár, Viktória; Lu, Xiaowen; Fekete, Gergely; Bálint, Balázs; Herczeg, Róbert; Nagy, István; Notebaart, Richard A.; Lercher, Martin J.; Pál, Csaba; Papp, Balázs
2016-01-01
A central challenge in evolutionary biology concerns the mechanisms by which complex metabolic innovations requiring multiple mutations arise. Here, we propose that metabolic innovations accessible through the addition of a single reaction serve as stepping stones towards the later establishment of complex metabolic features in another environment. We demonstrate the feasibility of this hypothesis through three complementary analyses. First, using genome-scale metabolic modelling, we show that complex metabolic innovations in Escherichia coli can arise via changing nutrient conditions. Second, using phylogenetic approaches, we demonstrate that the acquisition patterns of complex metabolic pathways during the evolutionary history of bacterial genomes support the hypothesis. Third, we show how adaptation of laboratory populations of E. coli to one carbon source facilitates the later adaptation to another carbon source. Our work demonstrates how complex innovations can evolve through series of adaptive steps without the need to invoke non-adaptive processes. PMID:27197754
A Program Office Guide to Technology Transfer
1988-11-01
Contents: Requirements; 2.4.1 Equipment Complexity; 2.4.2 Industrial Capabilities; 2.4.3 Logistics Requirements/Configuration Control; 2.4.4 Schedule. …with the leverage of the FSD and production programs… …accomplishment of these milestones results in second source full production capability… For more… Table 1.2-1: AMRAAM Technology Transfer (manufacturing processes; build up competitive production rate capability during Lot III). The leader-follower approach is
Wu, Rentian; Wang, Jiafeng; Liang, Chun
2012-01-01
Regulation of DNA replication initiation is essential for the faithful inheritance of genetic information. Replication initiation is a multi-step process involving many factors, including ORC, Cdt1p, Mcm2-7p, and other proteins that bind to replication origins to form a pre-replicative complex (pre-RC). As a prerequisite for pre-RC assembly, Cdt1p and the Mcm2-7p heterohexameric complex accumulate in the nucleus in G1 phase in an interdependent manner in budding yeast. However, the nature of this interdependence is not clear, nor is it known whether Cdt1p is required for the assembly of the MCM complex. In this study, we provide the first evidence that Cdt1p, through an interaction between the C-terminal regions of Cdt1p and Mcm6p, is crucial for the formation of the MCM complex in both the cytoplasm and nucleoplasm. We demonstrate that disruption of the interaction between Cdt1p and Mcm6p prevents the formation of the MCM complex, excludes Mcm2-7p from the nucleus, and inhibits pre-RC assembly and DNA replication. Our findings suggest a function for Cdt1p in promoting the assembly of the MCM complex and maintaining its integrity by interacting with Mcm6p.
Sobol, Margarita; Yildirim, Sukriye; Philimonenko, Vlada V; Marášek, Pavel; Castaño, Enrique; Hozák, Pavel
2013-01-01
To maintain growth and division, cells require a large-scale production of rRNAs which occurs in the nucleolus. Recently, we have shown the interaction of nucleolar phosphatidylinositol 4,5-bisphosphate (PIP2) with proteins involved in rRNA transcription and processing, namely RNA polymerase I (Pol I), UBF, and fibrillarin. Here we extend the study by investigating transcription-related localization of PIP2 in regards to transcription and processing complexes of Pol I. To achieve this, we used either physiological inhibition of transcription during mitosis or inhibition by treatment the cells with actinomycin D (AMD) or 5,6-dichloro-1β-d-ribofuranosyl-benzimidazole (DRB). We show that PIP2 is associated with Pol I subunits and UBF in a transcription-independent manner. On the other hand, PIP2/fibrillarin colocalization is dependent on the production of rRNA. These results indicate that PIP2 is required not only during rRNA production and biogenesis, as we have shown before, but also plays a structural role as an anchor for the Pol I pre-initiation complex during the cell cycle. We suggest that throughout mitosis, PIP2 together with UBF is involved in forming and maintaining the core platform of the rDNA helix structure. Thus we introduce PIP2 as a novel component of the NOR complex, which is further engaged in the renewed rRNA synthesis upon exit from mitosis. PMID:24513678
Held, Jürgen; Manser, Tanja
2005-02-01
This article outlines how a Palm- or Newton-based PDA (personal digital assistant) system for online event recording was used to record and analyze concurrent events. We describe the features of this PDA-based system, called the FIT-System (flexible interface technique), and its application to the analysis of concurrent events in complex behavioral processes--in this case, anesthesia work processes. The patented FIT-System has a unique user interface design allowing the user to design an interface template with a pencil and paper or using a transparency film. The template usually consists of a drawing or sketch that includes icons or symbols that depict the observer's representation of the situation to be observed. In this study, the FIT-System allowed us to create a design for fast, intuitive online recording of concurrent events using a set of 41 observation codes. An analysis of concurrent events leads to a description of action density, and our results revealed a characteristic distribution of action density during the administration of anesthesia in the operating room. This distribution indicated the central role of the overlapping operations in the action sequences of medical professionals as they deal with the varying requirements of this complex task. We believe that the FIT-System for online recording of concurrent events in complex behavioral processes has the potential to be useful across a broad spectrum of research areas.
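As an illustration of the action-density analysis described above, the sketch below counts concurrent activities over time from interval-stamped event records using a standard sweep-line pass; the event codes and times are invented examples, not FIT-System data.

```python
# Computing "action density" (number of concurrent activities over time)
# from timestamped event intervals. Records are invented; the sweep-line
# method is a standard choice, not necessarily the study's own analysis.

events = [(0, 30, "induction"), (10, 25, "monitoring"),
          (20, 40, "airway"), (35, 50, "documentation")]   # (start, end, code)

# Sweep line: +1 at each start, -1 at each end, then accumulate.
changes = sorted([(s, +1) for s, e, _ in events] +
                 [(e, -1) for s, e, _ in events])
density, current = [], 0
for t, d in changes:
    current += d
    density.append((t, current))

for t, n in density:
    print(f"t={t:3d}s  concurrent actions={n}")
```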
Lau, Julia B; Stork, Simone; Moog, Daniel; Sommer, Maik S; Maier, Uwe G
2015-05-01
Nuclear-encoded pre-proteins being imported into complex plastids of red algal origin have to cross up to five membranes. Transport across the second-outermost, or periplastidal, membrane (PPM) is facilitated by SELMA (symbiont-specific ERAD-like machinery), a machinery derived from endoplasmic-reticulum-associated degradation (ERAD). Core components of SELMA are enzymes involved in ubiquitination (E1-E3), a Cdc48 ATPase complex, and Derlin proteins. These components are present in all investigated organisms with four-membrane-bound complex plastids of red algal origin, suggesting a ubiquitin-dependent translocation process for substrates mechanistically similar to retro-translocation in ERAD. Even if, according to the current model, translocation via SELMA does not end in classical poly-ubiquitination, transient mono-/oligo-ubiquitination of pre-proteins might be required for translocation. We investigated the import mechanism of SELMA and were able to show that protein transport across the PPM depends on lysines in the N-terminal, but not the C-terminal, part of pre-proteins. These lysines are predicted to be targets of ubiquitination during the translocation process. As proteins lacking the N-terminal lysines get stuck in the PPM, a 'frozen intermediate' of the translocation process could be envisioned and initially characterized. © 2015 John Wiley & Sons Ltd.
Future fundamental combustion research for aeropropulsion systems
NASA Technical Reports Server (NTRS)
Mularz, E. J.
1985-01-01
The physical fluid mechanics, heat transfer, and chemical kinetic processes which occur in the combustion chamber of aeropropulsion systems were investigated. With component requirements becoming more severe for future engines, the current design methodology needs new tools to obtain the optimum configuration in a reasonable design and development cycle. Research efforts in the last few years were encouraging, but to achieve these benefits, research into the fundamental aerothermodynamic processes of combustion is required. It is recommended that research continue in the areas of flame stabilization, combustor aerodynamics, heat transfer, multiphase flow and atomization, turbulent reacting flows, and chemical kinetics. Associated with each of these engineering sciences is the need for research into computational methods to accurately describe and predict these complex physical processes. Research needs in each of these areas are highlighted.
Reducing the Requirements and Cost of Astronomical Telescopes
NASA Technical Reports Server (NTRS)
Smith, W. Scott; Whitaker, Ann F. (Technical Monitor)
2002-01-01
Limits on astronomical telescope apertures are being rapidly approached. These limits result from logistics, increasing complexity, and finally budgetary constraints. From a historical perspective, great strides have been made in the areas of aperture, adaptive optics, wavefront sensors, detectors, stellar interferometers, and image reconstruction. What will be the next advances? Emerging data analysis techniques based on communication theory hold the promise of yielding more information from observational data through significant computer post-processing. This paper explores some of the current telescope limitations and ponders the possibilities of increasing the yield of scientific data by migrating computer post-processing techniques to higher dimensions. Some of these processes hold the promise of reducing the requirements on the basic telescope hardware, making the next generation of instruments more affordable.
Quantitative Analysis of HIV-1 Preintegration Complexes
Engelman, Alan; Oztop, Ilker; Vandegraaff, Nick; Raghavendra, Nidhanapati K.
2009-01-01
Retroviral replication proceeds through the formation of a provirus, an integrated DNA copy of the viral RNA genome. The linear cDNA product of reverse transcription is the integration substrate and two different integrase activities, 3′ processing and DNA strand transfer, are required for provirus formation. Integrase nicks the cDNA ends adjacent to phylogenetically-conserved CA dinucleotides during 3′ processing. After nuclear entry and locating a suitable chromatin acceptor site, integrase joins the recessed 3′-OHs to the 5′-phosphates of a double-stranded staggered cut in the DNA target. Integrase functions in the context of a large nucleoprotein complex, called the preintegration complex (PIC), and PICs are analyzed to determine levels of integrase 3′ processing and DNA strand transfer activities that occur during acute virus infection. Denatured cDNA end regions are monitored by indirect end-labeling to measure the extent of 3′ processing. Native PICs can efficiently integrate their viral cDNA into exogenously added target DNA in vitro, and Southern blotting or nested PCR assays are used to quantify the resultant DNA strand transfer activity. This study details HIV-1 infection, PIC extraction, partial purification, and quantitative analyses of integrase 3′ processing and DNA strand transfer activities. PMID:19233280
Child–Adult Differences in Using Dual-Task Paradigms to Measure Listening Effort
Charles, Lauren M.; Ricketts, Todd A.
2017-01-01
Purpose The purpose of the project was to investigate the effects of modifying the secondary task in a dual-task paradigm used to measure objective listening effort. Specifically, the complexity and depth of processing were increased relative to a simple secondary task. Method Three dual-task paradigms were developed for school-age children. The primary task was word recognition. The secondary task was a physical response to a visual probe (simple task), a physical response to a complex probe (increased complexity), or word categorization (increased depth of processing). Sixteen adults (22–32 years, M = 25.4) and 22 children (9–17 years, M = 13.2) were tested using the 3 paradigms in quiet and noise. Results For both groups, manipulations of the secondary task did not affect word recognition performance. For adults, increasing depth of processing increased the calculated effect of noise; however, for children, results with the deep secondary task were the least stable. Conclusions Manipulations of the secondary task differentially affected adults and children. Consistent with previous findings, increased depth of processing enhanced paradigm sensitivity for adults. However, younger participants were more likely to demonstrate the expected effects of noise on listening effort using a secondary task that did not require deep processing. PMID:28346816
The processivity factor complex of feline herpes virus-1 is a new drug target.
Zhukovskaya, Natalia L; Guan, Hancheng; Saw, Yih Ling; Nuth, Manunya; Ricciardi, Robert P
2015-03-01
Feline herpes virus-1 (FHV-1) is ubiquitous in the cat population and is a major cause of blindness for which antiviral drugs, including acyclovir, are not completely effective. Recurrent infections, due to reactivation of latent FHV-1 residing in the trigeminal ganglia, can lead to epithelial keratitis and stromal keratitis and eventually loss of sight. This has prompted the medical need for an antiviral drug that will specifically inhibit FHV-1 infection. A new antiviral target is the DNA polymerase and its associated processivity factor, which forms a complex that is essential for extended DNA strand synthesis. In this study we have cloned and expressed the FHV-1 DNA polymerase (f-UL30) and processivity factor (f-UL42) and demonstrated that both proteins are required to completely synthesize the 7249 nucleotide full-length DNA from the M13 primed-DNA template in vitro. Significantly, a known inhibitor of human herpes simplex virus-1 (HSV-1) processivity complex was shown to inhibit FHV-1 processive DNA synthesis in vitro and block infection of cells. This validates using f-UL42/f-UL30 as a new antiviral drug target to treat feline ocular herpes infection. Copyright © 2015 Elsevier B.V. All rights reserved.
Learning Efficient Sparse and Low Rank Models.
Sprechmann, P; Bronstein, A M; Sapiro, G
2015-09-01
Parsimony, including sparsity and low rank, has been shown to successfully model data in numerous machine learning and signal processing tasks. Traditionally, such modeling approaches rely on an iterative algorithm that minimizes an objective function with parsimony-promoting terms. The inherently sequential structure and data-dependent complexity and latency of iterative optimization constitute a major limitation in many applications requiring real-time performance or involving large-scale data. Another limitation encountered by these modeling techniques is the difficulty of their inclusion in discriminative learning scenarios. In this work, we propose to move the emphasis from the model to the pursuit algorithm, and develop a process-centric view of parsimonious modeling, in which a learned deterministic fixed-complexity pursuit process is used in lieu of iterative optimization. We show a principled way to construct learnable pursuit process architectures for structured sparse and robust low rank models, derived from the iteration of proximal descent algorithms. These architectures learn to approximate the exact parsimonious representation at a fraction of the complexity of the standard optimization methods. We also show that appropriate training regimes allow parsimonious models to be naturally extended to discriminative settings. State-of-the-art results are demonstrated on several challenging problems in image and audio processing with several orders of magnitude speed-up compared to the exact optimization algorithms.
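As an illustration of this process-centric idea, the following minimal numpy sketch unrolls a fixed number of ISTA iterations into a feed-forward pursuit of fixed complexity. The initialization of W and S from a dictionary D is the classical one; in the learned setting described above, W, S, and the threshold would be trained rather than fixed as here.

```python
import numpy as np

def soft_threshold(x, theta):
    # Proximal operator of the l1 norm: the nonlinearity of each unrolled layer.
    return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

def lista_forward(y, W, S, theta, n_layers=5):
    """Fixed-complexity pursuit: n_layers unrolled ISTA iterations.

    y: input signal; W: input filter (k x m); S: mutual-inhibition
    matrix (k x k); theta: threshold. In the learned variant, W, S and
    theta are trained to mimic the exact sparse code.
    """
    b = W @ y
    z = soft_threshold(b, theta)
    for _ in range(n_layers - 1):
        z = soft_threshold(b + S @ z, theta)
    return z

# Classical ISTA initialization for a random dictionary D (m x k):
rng = np.random.default_rng(0)
m, k = 20, 50
D = rng.standard_normal((m, k))
L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
W = D.T / L
S = np.eye(k) - (D.T @ D) / L
z = lista_forward(rng.standard_normal(m), W, S, theta=0.1)
```

With the classical initialization the network reproduces plain ISTA; training the parameters is what buys the reported speed-up at equal approximation quality.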
Changing to Concept-Based Curricula: The Process for Nurse Educators.
Baron, Kristy A
2017-01-01
The complexity of health care today requires nursing graduates to use effective thinking skills. Many nursing programs are revising curricula to include concept-based learning that encourages problem-solving, effective thinking, and the ability to transfer knowledge to a variety of situations, requiring nurse educators to modify their teaching styles and methods to promote student-centered learning. Changing from teacher-centered learning to student-centered learning requires a major shift in thinking and application. The focus of this qualitative study was to understand the process of changing to concept-based curricula for nurse educators who previously taught in traditional curriculum designs. The sample included eight educators from two institutions in one Western state, using a grounded theory design. The themes that emerged from participants' experiences consisted of the overarching concept, support for change, and the central concept, finding meaning in the change. Finding meaning is supported by three main themes: preparing for the change, teaching in a concept-based curriculum, and understanding the teaching-learning process. Changing to a concept-based curriculum required a major shift in thinking and application. Through support, educators discovered meaning to make the change by constructing authentic learning opportunities that mirrored practice, refining the change process, and reinforcing the benefits of teaching.
Supporting BPMN choreography with system integration artefacts for enterprise process collaboration
NASA Astrophysics Data System (ADS)
Nie, Hongchao; Lu, Xudong; Duan, Huilong
2014-07-01
Business Process Model and Notation (BPMN) choreography modelling depicts externally visible message exchanges between collaborating processes of enterprise information systems. Implementation of choreography relies on designing system integration solutions to realise message exchanges between independently developed systems. Enterprise integration patterns (EIPs) are widely accepted artefacts to design integration solutions. If the choreography model represents coordination requirements between processes with behaviour mismatches, the integration designer needs to analyse the routing requirements and address these requirements by manually designing EIP message routers. As collaboration scales and complexity increases, manual design becomes inefficient. Thus, the research problem of this paper is to explore a method to automatically identify routing requirements from BPMN choreography model and to accordingly design routing in the integration solution. To achieve this goal, recurring behaviour mismatch scenarios are analysed as patterns, and corresponding solutions are proposed as EIP routers. Using this method, a choreography model can be analysed by computer to identify occurrences of mismatch patterns, leading to corresponding router selection. A case study demonstrates that the proposed method enables computer-assisted integration design to implement choreography. A further experiment reveals that the method is effective to improve the design quality and reduce time cost.
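A toy sketch of the computer-assisted selection step follows, assuming an invented two-pattern catalogue and a deliberately simplified message-link model; the paper's actual mismatch patterns and BPMN choreography analysis are richer than this.

```python
# Hypothetical sketch of mapping detected behaviour mismatches to EIP
# routers. Pattern names, the MessageLink model and the detection
# heuristics are invented for illustration only.
from dataclasses import dataclass

@dataclass
class MessageLink:
    sender: str
    receiver: str
    expected_order: int   # order required by the receiving process
    actual_order: int     # order produced by the sending process
    conditional: bool     # message sent only on some branches

EIP_ROUTER = {
    "reordering": "Resequencer",
    "optionality": "Message Filter + Content-Based Router",
}

def select_routers(links):
    """Scan message links for behaviour mismatches and pick an EIP router."""
    plan = []
    for link in links:
        if link.actual_order != link.expected_order:
            plan.append((link, EIP_ROUTER["reordering"]))
        elif link.conditional:
            plan.append((link, EIP_ROUTER["optionality"]))
    return plan

links = [MessageLink("Buyer", "Seller", 1, 2, False),
         MessageLink("Seller", "Shipper", 1, 1, True)]
for link, router in select_routers(links):
    print(f"{link.sender} -> {link.receiver}: insert {router}")
```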
Software Process Assurance for Complex Electronics (SPACE)
NASA Technical Reports Server (NTRS)
Plastow, Richard A.
2007-01-01
Complex Electronics (CE) are now programmed to perform tasks that were previously handled in software, such as communication protocols. Many of the methods used to develop software bear a close resemblance to CE development. For instance, Field Programmable Gate Arrays (FPGAs) can have over a million logic gates, while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs such as incorrect design, faulty logic, and unexpected interactions within the logic is great. Since CE devices are obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized with slight modifications in the development of these devices. Software Process Assurance for Complex Electronics (SPACE) is a research project that looks at using standardized S/W Assurance/Engineering practices to provide an assurance framework for development activities. Tools such as checklists, best practices, and techniques can be used to detect missing requirements and bugs earlier in the development cycle, creating a development process for CE that will be more easily maintained, consistent, and configurable based on the device used.
Cochrane, Anita J; Dick, Bob; King, Neil A; Hills, Andrew P; Kavanagh, David J
2017-10-16
There have been consistent recommendations for multicomponent and multidisciplinary approaches for obesity management. However, there is no clear agreement on the components, disciplines or processes to be considered within such an approach. In this study, we explored multicomponent and multidisciplinary approaches through an examination of knowledge, skills, beliefs, and recommendations of stakeholders involved in obesity management. These stakeholders included researchers, practitioners, educators, and patients. We used qualitative action research methods, including convergent interviewing and observation, to assist the process of inquiry. The consensus was that a multicomponent and multidisciplinary approach should be based on four central meta-components (patient, practitioner, process, and environmental factors), and specific components of these factors were identified. Psychologists, dieticians, exercise physiologists and general practitioners were nominated as key practitioners to be included. A complex condition like obesity requires that multiple components be addressed, and that both patients and multiple disciplines are involved in developing solutions. Implementing cycles of continuous improvement to deal with complexity, instead of trying to control for it, offers an effective way to deal with complex, changing multisystem problems like obesity.
Karabiyikoglu, Sedef; Boon, Byron A; Merlic, Craig A
2017-08-04
The Pauson-Khand reaction is a powerful tool for the synthesis of cyclopentenones through the efficient [2 + 2 + 1] cycloaddition of dicobalt alkyne complexes with alkenes. While intermolecular and intramolecular variants are widely known, transannular versions of this reaction are unknown and the basis of this study. Macrocyclic enyne and dienyne complexes were readily synthesized by palladium(II)-catalyzed oxidative macrocyclizations of bis(vinyl boronate esters) or ring-closing metathesis reactions followed by complexation with dicobalt octacarbonyl. Several reaction modalities of these macrocyclic complexes were uncovered. In addition to the first successful transannular Pauson-Khand reactions, other intermolecular and transannular cycloaddition reactions included intermolecular Pauson-Khand reactions, transannular [4 + 2] cycloaddition reactions, intermolecular [2 + 2 + 2] cycloaddition reactions, and intermolecular [2 + 2 + 1 + 1] cycloaddition reactions. The structural and reaction requirements for each process are presented.
Merging OLTP and OLAP - Back to the Future
NASA Astrophysics Data System (ADS)
Lehner, Wolfgang
When the terms "Data Warehousing" and "Online Analytical Processing" were coined in the 1990s by Kimball, Codd, and others, there was an obvious need for separating data and workload for operational transactional-style processing and decision-making implying complex analytical queries over large and historic data sets. Large data warehouse infrastructures have been set up to cope with the special requirements of analytical query answering for multiple reasons: For example, analytical thinking heavily relies on predefined navigation paths to guide the user through the data set and to provide different views on different aggregation levels. Multi-dimensional queries exploiting hierarchically structured dimensions lead to complex star queries at a relational backend, which could hardly be handled by classical relational systems.
Low-complexity camera digital signal imaging for video document projection system
NASA Astrophysics Data System (ADS)
Hsia, Shih-Chang; Tsai, Po-Shien
2011-04-01
We present high-performance and low-complexity algorithms for real-time camera imaging applications. The main functions of the proposed camera digital signal processing (DSP) involve color interpolation, white balance, adaptive binary processing, auto gain control, and edge and color enhancement for video projection systems. A series of simulations demonstrate that the proposed method can achieve good image quality while keeping computation cost and memory requirements low. On the basis of the proposed algorithms, the cost-effective hardware core is developed using Verilog HDL. The prototype chip has been verified with one low-cost programmable device. The real-time camera system can achieve 1270 × 792 resolution with the combination of extra components and can demonstrate each DSP function.
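The abstract does not spell out the individual algorithms; as a hedged illustration of one low-complexity DSP stage of this kind, the sketch below implements gray-world white balance in numpy. The gray-world assumption is ours, not necessarily the method used in the paper.

```python
import numpy as np

def gray_world_white_balance(img):
    """Low-complexity white balance: scale each channel so its mean
    matches the global mean (gray-world assumption).
    img: H x W x 3 float array in [0, 1].
    """
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gain = channel_means.mean() / np.maximum(channel_means, 1e-6)
    return np.clip(img * gain, 0.0, 1.0)

# One multiply per pixel per channel keeps the hardware cost low,
# in the spirit of the low-complexity goal stated above.
frame = np.random.rand(792, 1270, 3)
balanced = gray_world_white_balance(frame)
```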
Harnessing Thin-Film Continuous-Flow Assembly Lines.
Britton, Joshua; Castle, Jared W; Weiss, Gregory A; Raston, Colin L
2016-07-25
Inspired by nature's ability to construct complex molecules through sequential synthetic transformations, an assembly line synthesis of α-aminophosphonates has been developed. In this approach, simple starting materials are continuously fed through a thin-film reactor where the intermediates accrue molecular complexity as they progress through the flow system. Flow chemistry allows rapid multistep transformations to occur via reaction compartmentalization, an approach not amenable to using conventional flasks. Thin-film processing can also access facile in situ solvent exchange to drive reaction efficiency, and through this method, α-aminophosphonate synthesis requires only a 443 s residence time to produce 3.22 g h⁻¹. Assembly-line synthesis allows unprecedented reaction flexibility and processing efficiency. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Technical Reports Server (NTRS)
Stehle, Roy H.; Ogier, Richard G.
1993-01-01
Alternatives for realizing a packet-based network switch for use on a frequency division multiple access/time division multiplexed (FDMA/TDM) geostationary communication satellite were investigated. Each of the eight downlink beams supports eight directed dwells. The design needed to accommodate multicast packets with very low probability of loss due to contention. Three switch architectures were designed and analyzed. An output-queued, shared bus system yielded a functionally simple system, utilizing a first-in, first-out (FIFO) memory per downlink dwell, but at the expense of a large total memory requirement. A shared memory architecture offered the most efficiency in memory requirements, requiring about half the memory of the shared bus design. The processing requirement for the shared-memory system adds system complexity that may offset the benefits of the smaller memory. An alternative design using a shared memory buffer per downlink beam decreases circuit complexity through a distributed design, and requires at most 1000 packets of memory more than the completely shared memory design. Modifications to the basic packet switch designs were proposed to accommodate circuit-switched traffic, which must be served on a periodic basis with minimal delay. Methods for dynamically controlling the downlink dwell lengths were developed and analyzed. These methods adapt quickly to changing traffic demands, and do not add significant complexity or cost to the satellite and ground station designs. Methods for reducing the memory requirement by not requiring the satellite to store full packets were also proposed and analyzed. In addition, optimal packet and dwell lengths were computed as functions of memory size for the three switch architectures.
When fast logic meets slow belief: Evidence for a parallel-processing model of belief bias.
Trippas, Dries; Thompson, Valerie A; Handley, Simon J
2017-05-01
Two experiments pitted the default-interventionist account of belief bias against a parallel-processing model. According to the former, belief bias occurs because a fast, belief-based evaluation of the conclusion pre-empts a working-memory demanding logical analysis. In contrast, according to the latter both belief-based and logic-based responding occur in parallel. Participants were given deductive reasoning problems of variable complexity and instructed to decide whether the conclusion was valid on half the trials or to decide whether the conclusion was believable on the other half. When belief and logic conflict, the default-interventionist view predicts that it should take less time to respond on the basis of belief than logic, and that the believability of a conclusion should interfere with judgments of validity, but not the reverse. The parallel-processing view predicts that beliefs should interfere with logic judgments only if the processing required to evaluate the logical structure exceeds that required to evaluate the knowledge necessary to make a belief-based judgment, and vice versa otherwise. Consistent with this latter view, for the simplest reasoning problems (modus ponens), judgments of belief resulted in lower accuracy than judgments of validity, and believability interfered more with judgments of validity than the converse. For problems of moderate complexity (modus tollens and single-model syllogisms), the interference was symmetrical, in that validity interfered with belief judgments to the same degree that believability interfered with validity judgments. For the most complex (three-term multiple-model syllogisms), conclusion believability interfered more with judgments of validity than vice versa, in spite of the significant interference from conclusion validity on judgments of belief.
NASA Astrophysics Data System (ADS)
Brecher, Christian; Baum, Christoph; Bastuck, Thomas
2015-03-01
Economically advantageous microfabrication technologies for lab-on-a-chip diagnostic devices, substituting commonly used glass etching or injection molding processes, are one of the key enablers for the emerging market of microfluidic devices. On-site detection in the fields of life sciences, point-of-care diagnostics, and environmental analysis requires compact, disposable, and highly functionalized systems. Roll-to-roll production as a high-volume process has become the emerging fabrication technology for integrated, complex high-technology products within recent years (e.g. fuel cells). Differently functionalized polymer films enable researchers to create a new generation of lab-on-a-chip devices by combining electronic, microfluidic, and optical functions in a multilayer architecture. For replication of microfluidic and optical functions via the roll-to-roll production process, competitive approaches are available. One of them is to imprint fluidic channels and optical structures of micro- or nanometer scale from embossing rollers into ultraviolet (UV) curable lacquers on polymer substrates. Depending on the dimension, shape, and quantity of those structures, there are alternative manufacturing technologies for the embossing roller. Ultra-precise diamond turning, electroforming, or casting of polymer materials are used either for direct structuring or for manufacturing of roller sleeves. Mastering methods are selected for an application by considering the required replication quality and structure complexity. Criteria for the replication quality are surface roughness and contour accuracy. Structure complexity is evaluated by the shapes producible (e.g. linear, circular) and the aspect ratio. Costs for the mastering process and structure lifetime are major cost factors. The alternative replication approaches are introduced and analyzed according to the criteria presented. Advantages and drawbacks of each technology are discussed and exemplary applications are presented.
X-33 Environmental Impact Statement: A Fast Track Approach
NASA Technical Reports Server (NTRS)
McCaleb, Rebecca C.; Holland, Donna L.
1998-01-01
NASA is required by the National Environmental Policy Act (NEPA) to prepare an appropriate level of environmental analysis for its major projects. Development of the X-33 Technology Demonstrator and its associated flight test program required an environmental impact statement (EIS) under the NEPA. The EIS process consists of four parts: the "Notice of Intent" to prepare an EIS and scoping; the draft EIS, which is distributed for review and comment; the final EIS; and the "Record of Decision." Completion of this process normally takes 2 to 3 years, depending on the complexity of the proposed action. Many of the agency's newest fast track, technology demonstration programs require NEPA documentation but cannot sustain the lengthy time requirement between program concept development and implementation. Marshall Space Flight Center, in cooperation with Kennedy Space Center, accomplished the NEPA process for the X-33 Program in 13 months from Notice of Intent to Record of Decision. In addition, the environmental team implemented an extensive public involvement process, conducting a total of 23 public meetings for scoping and draft EIS comment, along with numerous informal meetings with public officials, civic organizations, and Native American Indians. This paper will discuss the fast track approach used to successfully accomplish the NEPA process for X-33 on time.
The translational landscape of Arabidopsis mitochondria.
Planchard, Noelya; Bertin, Pierre; Quadrado, Martine; Dargel-Graffin, Céline; Hatin, Isabelle; Namy, Olivier; Mireau, Hakim
2018-06-05
Messenger RNA translation is a complex process that is still poorly understood in eukaryotic organelles like mitochondria. Growing evidence indicates, though, that mitochondrial translation differs from its bacterial counterpart in many key aspects. In this analysis, we have used ribosome profiling technology to generate a genome-wide snapshot view of mitochondrial translation in Arabidopsis. We show that, unlike in humans, most Arabidopsis mitochondrial ribosome footprints measure 27 and 28 bases. We also reveal that mRNAs encoding respiratory subunits show much higher ribosome association than other mitochondrial mRNAs, implying that they are translated at higher levels. Homogeneous ribosome densities were generally detected within each respiratory complex except for complex V, where higher ribosome coverage correlated with higher requirements for specific subunits. In complex I respiratory mutants, a reorganization of mitochondrial mRNA ribosome association was detected, involving increased ribosome densities for certain ribosomal protein encoding transcripts and a reduction in translation of a few complex V mRNAs. Taken together, our observations reveal that plant mitochondrial translation is a dynamic process and that translational control is important for gene expression in plant mitochondria. This study paves the way for future advances in the understanding of translation in higher plant mitochondria.
A software development and evolution model based on decision-making
NASA Technical Reports Server (NTRS)
Wild, J. Christian; Dong, Jinghuan; Maly, Kurt
1991-01-01
Design is a complex activity whose purpose is to construct an artifact which satisfies a set of constraints and requirements. However, the design process is not well understood. The software design and evolution process is the focus of interest, and a three-dimensional software development space organized around a decision-making paradigm is presented. An initial instantiation of this model, called 3DPM_p, which was partly implemented, is presented. A discussion of the use of this model in software reuse and process management is given.
NASA Astrophysics Data System (ADS)
Lebiedz, Dirk; Brandt-Pollmann, Ulrich
2004-09-01
Specific external control of chemical reaction systems and both dynamic control and signal processing as central functions in biochemical reaction systems are important issues of modern nonlinear science. For example, nonlinear input-output behavior and its regulation are crucial for the maintenance of the life process, which requires extensive communication between cells and their environment. An important question is how the dynamical behavior of biochemical systems is controlled and how they process information transmitted by incoming signals. But also from a general point of view, external forcing of complex chemical reaction processes is important in many application areas ranging from chemical engineering to biomedicine. In order to study such control issues numerically, we choose here a well characterized chemical system, the CO oxidation on Pt(110), which is interesting per se as an externally forced chemical oscillator model. We show numerically that tuning of temporal self-organization by input signals in this simple nonlinear chemical reaction exhibiting oscillatory behavior can in principle be exploited both for specific external control of dynamical system behavior and for processing of complex information.
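As a generic stand-in for an externally forced chemical oscillator (not the Pt(110) CO-oxidation surface model used in the paper), the sketch below integrates a periodically forced van der Pol oscillator to illustrate how an input signal u(t) can entrain self-sustained oscillations.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Stand-in model: a van der Pol oscillator with a sinusoidal control
# input u(t). Parameter values are illustrative, not fitted to any
# chemical system.
def forced_oscillator(t, y, mu=2.0, amp=0.8, omega=1.1):
    x, v = y
    u = amp * np.sin(omega * t)               # external control input
    return [v, mu * (1 - x**2) * v - x + u]

sol = solve_ivp(forced_oscillator, (0, 200), [0.5, 0.0],
                max_step=0.05, dense_output=True)
x_late = sol.y[0, sol.t > 150]                 # discard the transient
print("late-time oscillation amplitude:", x_late.max() - x_late.min())
```

Sweeping amp and omega and checking whether the response locks to the forcing frequency is the numerical analogue of the tuning experiments described above.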
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jurrus, Elizabeth R.; Watanabe, Shigeki; Giuly, Richard J.
2013-01-01
Neuroscientists are developing new imaging techniques and generating large volumes of data in an effort to understand the complex structure of the nervous system. The complexity and size of this data makes human interpretation a labor-intensive task. To aid in the analysis, new segmentation techniques for identifying neurons in these feature-rich datasets are required. This paper presents a method for neuron boundary detection and nonbranching process segmentation in electron microscopy images, and for visualizing them in three dimensions. It combines automated segmentation techniques with a graphical user interface for correction of mistakes in the automated process. The automated process first uses machine learning and image processing techniques to identify neuron membranes that delineate the cells in each two-dimensional section. To segment nonbranching processes, the cell regions in each two-dimensional section are connected in 3D using correlation of regions between sections. The combination of this method with a graphical user interface specially designed for this purpose enables users to quickly segment cellular processes in large volumes.
Co-optimization of lithographic and patterning processes for improved EPE performance
NASA Astrophysics Data System (ADS)
Maslow, Mark J.; Timoshkov, Vadim; Kiers, Ton; Jee, Tae Kwon; de Loijer, Peter; Morikita, Shinya; Demand, Marc; Metz, Andrew W.; Okada, Soichiro; Kumar, Kaushik A.; Biesemans, Serge; Yaegashi, Hidetami; Di Lorenzo, Paolo; Bekaert, Joost P.; Mao, Ming; Beral, Christophe; Larivière, Stephane
2017-03-01
Complementary lithography is already being used for advanced logic patterns. The tight pitches for 1D metal layers are expected to be created using spacer-based multiple-patterning ArF-i exposures, and the more complex cut/block patterns are made using EUV exposures. At the same time, control requirements for CDU, pattern shift, and pitch-walk are approaching sub-nanometer levels to meet edge placement error (EPE) requirements. Local variability, such as Line Edge Roughness (LER), local CDU, and Local Placement Error (LPE), are dominant factors in the total edge placement error budget. In the lithography process, improving the imaging contrast when printing the core pattern has been shown to improve the local variability. In the etch process, it has been shown that the fusion of atomic-level etching and deposition can also improve these local variations. Co-optimization of lithography and etch processing is expected to further improve the performance over individual optimizations alone. To meet the scaling requirements and keep process complexity to a minimum, EUV is increasingly seen as the platform for delivering the exposures for both the grating and the cut/block patterns beyond N7. In this work, we evaluated the overlay and pattern fidelity of an EUV block printed in a negative-tone resist on an ArF-i SAQP grating. High-order overlay modeling and corrections during the exposure can reduce overlay error after development, a significant component of the total EPE. During etch, additional degrees of freedom are available to improve the pattern placement error in single-layer processes. Process control of advanced-pitch nanoscale multi-patterning techniques as described above is exceedingly complicated in a high-volume manufacturing environment. Incorporating potential patterning optimizations into both design and HVM controls for the lithography process is expected to bring a combined benefit over individual optimizations. In this work we show the EPE performance improvement for a 32 nm pitch SAQP + block patterned Metal 2 layer by co-optimizing the lithography and etch processes. Recommendations for further improvements and alternative processes are given.
Nonisothermal glass molding for the cost-efficient production of precision freeform optics
NASA Astrophysics Data System (ADS)
Vu, Anh-Tuan; Kreilkamp, Holger; Dambon, Olaf; Klocke, Fritz
2016-07-01
Glass molding has become a key replication-based technology to satisfy the intensively growing demands for complex precision optics in today's photonic market. However, the state-of-the-art replicative technologies are still limited, mainly due to their insufficiency to meet the requirements of mass production. This paper introduces a newly developed nonisothermal glass molding process in which a complex-shaped optic is produced in a very short process cycle. The innovative molding technology promises cost-efficient production because of increased mold lifetime, lower energy consumption, and high throughput from a fast process chain. At this early stage of process development, the research focuses on integrating finite element simulation into the process chain to reduce time and labor-intensive cost. By virtue of numerical modeling, defects including chill ripples and glass sticking in the nonisothermal molding process can be predicted and their consequent effects avoided. In addition, the influences of process parameters and glass preforms on the surface quality, form accuracy, and residual stress are discussed. A series of experiments was carried out to validate the simulation results. The successful modeling therefore provides a systematic strategy for glass preform design, mold compensation, and optimization of the process parameters. In conclusion, the integration of simulation into the entire nonisothermal glass molding process chain will significantly increase manufacturing efficiency as well as reduce the time-to-market for the mass production of complex precision yet low-cost glass optics.
NASA Astrophysics Data System (ADS)
Hong, Y.; Curteza, A.; Zeng, X.; Bruniaux, P.; Chen, Y.
2016-06-01
Material selection is the most difficult step in the customized garment product design and development process. This study aims to create a hierarchical framework for material selection. Analytic hierarchy process and fuzzy set theories have been applied to capture the diverse requirements of the customer and the inherent interactions/interdependencies among these requirements. Sensory evaluation ensures a quick and effective selection without complex laboratory tests such as KES and FAST, using the professional knowledge of the designers. A real empirical application for physically disabled people is carried out to demonstrate the proposed method. Both the theoretical and practical background of this paper indicate that the fuzzy analytic network process can capture expert knowledge existing in the form of incomplete, ambiguous, and vague information about the mutual influence of the attributes and criteria of material selection.
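A minimal sketch of the crisp core of the analytic hierarchy process, on which the fuzzy network extension builds: priorities are the principal eigenvector of a pairwise comparison matrix, checked with Saaty's consistency ratio. The comparison values below are invented, not taken from the study.

```python
import numpy as np

# Invented pairwise comparisons among three criteria, e.g.
# comfort vs. ease-of-use vs. cost (Saaty's 1-9 scale).
A = np.array([[1,   3,   5  ],
              [1/3, 1,   2  ],
              [1/5, 1/2, 1  ]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                      # priority weights, sum to 1

n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)   # consistency index
CR = CI / 0.58                    # 0.58: Saaty's random index for n = 3
print("weights:", w.round(3), " consistency ratio:", round(CR, 3))
```

A consistency ratio below about 0.1 is conventionally taken to mean the judgments are usable; the fuzzy ANP layers set-membership functions and criteria interdependencies on top of this eigenvector step.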
Unlocking Potentials of Microwaves for Food Safety and Quality
Tang, Juming
2015-01-01
Microwave is an effective means to deliver energy to food through polymeric package materials, offering potential for developing short-time in-package sterilization and pasteurization processes. The complex physics related to microwave propagation and microwave heating require special attention to the design of process systems and development of thermal processes in compliance with regulatory requirements for food safety. This article describes the basic microwave properties relevant to heating uniformity and system design, and provides a historical overview on the development of microwave-assisted thermal sterilization (MATS) and pasteurization systems in research laboratories and used in food plants. It presents recent activities on the development of 915 MHz single-mode MATS technology, the procedures leading to regulatory acceptance, and sensory results of the processed products. The article discusses needs for further efforts to bridge remaining knowledge gaps and facilitate transfer of academic research to industrial implementation. PMID:26242920
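For orientation, the standard dielectric-heating relations behind these design considerations (textbook results, not taken from this article) are

\[
P = 2\pi f\,\varepsilon_0\,\varepsilon_r''\,|E|^2 ,
\qquad
d_p = \frac{c}{2\pi f\sqrt{2\varepsilon_r'}}\left[\sqrt{1+\left(\frac{\varepsilon_r''}{\varepsilon_r'}\right)^2}-1\right]^{-1/2},
\]

where \(P\) is the volumetric power absorbed, \(f\) the frequency, \(|E|\) the local electric field strength, and \(\varepsilon_r'\), \(\varepsilon_r''\) the relative dielectric constant and loss factor of the food. The lower 915 MHz frequency thus penetrates deeper into the food than 2450 MHz, one motivation for the single-mode MATS design discussed above.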
Integrated aerodynamic-structural design of a forward-swept transport wing
NASA Technical Reports Server (NTRS)
Haftka, Raphael T.; Grossman, Bernard; Kao, Pi-Jen; Polen, David M.; Sobieszczanski-Sobieski, Jaroslaw
1989-01-01
The introduction of composite materials is having a profound effect on aircraft design. Since these materials permit the designer to tailor material properties to improve structural, aerodynamic, and acoustic performance, they require an integrated multidisciplinary design process. Furthermore, because of the complexity of the design process, numerical optimization methods are required. The utilization of integrated multidisciplinary design procedures for improving aircraft design is not currently feasible because of software coordination problems and the enormous computational burden. Even with the expected rapid growth of supercomputers and parallel architectures, these tasks will not be practical without the development of efficient methods for cross-disciplinary sensitivities and efficient optimization procedures. The present research is part of an on-going effort which is focused on the processes of simultaneous aerodynamic and structural wing design as a prototype for design integration. A sequence of integrated wing design procedures has been developed in order to investigate various aspects of the design process.
Inversion of 2-D DC resistivity data using rapid optimization and minimal complexity neural network
NASA Astrophysics Data System (ADS)
Singh, U. K.; Tiwari, R. K.; Singh, S. B.
2010-02-01
The backpropagation (BP) artificial neural network (ANN) technique of optimization, based on the steepest descent algorithm, is known to perform poorly and does not ensure global convergence. Nonlinear and complex DC resistivity data require an efficient ANN model and more intensive optimization procedures for better results and interpretations. Improvements in the computational ANN modeling process are described with the goals of enhancing the optimization process and reducing ANN model complexity. Well-established optimization methods, such as the radial basis algorithm (RBA) and the Levenberg-Marquardt algorithm (LMA), have frequently been used to deal with complexity and nonlinearity in such complex geophysical records. We examined here the efficiency of trained LMA and RB networks by using 2-D synthetic resistivity data and then finally applied them to actual field vertical electrical resistivity sounding (VES) data collected from the Puga Valley, Jammu and Kashmir, India. The resulting ANN resistivity reconstructions are compared with the results of existing inversion approaches and are in good agreement. The depths and resistivity structures obtained by the ANN methods also correlate well with the known drilling results and geologic boundaries. The application of the above ANN algorithms proves to be robust and could be used for fast estimation of resistive structures for other complex earth models as well.
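A sketch of the Levenberg-Marquardt alternative to steepest-descent backpropagation, fitting a tiny one-hidden-layer network to synthetic 1-D data via scipy's LM least-squares solver. The network size and data are invented for illustration and are not the paper's resistivity setup.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 40)
y = np.tanh(3 * x) + 0.05 * rng.standard_normal(40)   # synthetic target

H = 5  # hidden units

def unpack(p):
    # Flat parameter vector -> weights/biases of a 1-H-1 network.
    w1, b1 = p[:H], p[H:2*H]
    w2, b2 = p[2*H:3*H], p[3*H]
    return w1, b1, w2, b2

def residuals(p):
    w1, b1, w2, b2 = unpack(p)
    hidden = np.tanh(np.outer(x, w1) + b1)   # shape (40, H)
    return hidden @ w2 + b2 - y              # residual vector for LM

p0 = 0.1 * rng.standard_normal(3 * H + 1)
fit = least_squares(residuals, p0, method="lm")   # Levenberg-Marquardt
print("final RMS misfit:", np.sqrt np.mean(fit.fun**2)) if False else None
print("final RMS misfit:", np.sqrt(np.mean(fit.fun**2)))
```

LM minimizes the sum of squared residuals with a damped Gauss-Newton step, which is what gives it the faster, more reliable convergence on small networks that the abstract contrasts with plain BP.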
Effects of sentence-structure complexity on speech initiation time and disfluency.
Tsiamtsiouris, Jim; Cairns, Helen Smith
2013-03-01
There is general agreement that stuttering is caused by a variety of factors, and language formulation and speech motor control are two important factors that have been implicated in previous research, yet the exact nature of their effects is still not well understood. Our goal was to test the hypothesis that sentences of high structural complexity would incur greater processing costs than sentences of low structural complexity and these costs would be higher for adults who stutter than for adults who do not stutter. Fluent adults and adults who stutter participated in an experiment that required memorization of a sentence classified as low or high structural complexity followed by production of that sentence upon a visual cue. Both groups of speakers initiated most sentences significantly faster in the low structural complexity condition than in the high structural complexity condition. Adults who stutter were over-all slower in speech initiation than were fluent speakers, but there were no significant interactions between complexity and group. However, adults who stutter produced significantly more disfluencies in sentences of high structural complexity than in those of low complexity. After reading this article, the learner will be able to: (a) identify integral parts of all well-known models of adult sentence production; (b) summarize the way that sentence structure might negatively influence the speech production processes; (c) discuss whether sentence structure influences speech initiation time and disfluencies. Copyright © 2012 Elsevier Inc. All rights reserved.
Künzler, Markus; Gerstberger, Thomas; Stutz, Françoise; Bischoff, F. Ralf; Hurt, Ed
2000-01-01
The RanGTP-binding protein RanBP1, which is located in the cytoplasm, has been implicated in release of nuclear export complexes from the cytoplasmic side of the nuclear pore complex. Here we show that Yrb1 (the yeast homolog of RanBP1) shuttles between the nucleus and the cytoplasm. Nuclear import of Yrb1 is a facilitated process that requires a short basic sequence within the Ran-binding domain (RBD). By contrast, nuclear export of Yrb1 requires an intact RBD, which forms a ternary complex with the Xpo1 (Crm1) NES receptor in the presence of RanGTP. Nuclear export of Yrb1, however, is insensitive towards leptomycin B, suggesting a novel type of substrate recognition between Yrb1 and Xpo1. Taken together, these data suggest that ongoing nuclear import and export is an important feature of Yrb1 function in vivo. PMID:10825193
Inheritance of yeast nuclear pore complexes requires the Nsp1p subcomplex
Makio, Tadashi; Lapetina, Diego L.
2013-01-01
In the yeast Saccharomyces cerevisiae, organelles and macromolecular complexes are delivered from the mother to the emerging daughter during cell division, thereby ensuring progeny viability. Here, we have shown that during mitosis nuclear pore complexes (NPCs) in the mother nucleus are actively delivered through the bud neck and into the daughter cell concomitantly with the nuclear envelope. Furthermore, we show that NPC movement into the daughter cell requires members of an NPC subcomplex containing Nsp1p and its interacting partners. NPCs lacking these nucleoporins (Nups) were blocked from entry into the daughter by a putative barrier at the bud neck. This selection process could be observed within individual cells such that NPCs containing Nup82p (an Nsp1p-interacting Nup) were transferred to the daughter cells while functionally compromised NPCs lacking Nup82p were retained in the mother. This mechanism is proposed to facilitate the inheritance of functional NPCs by daughter cells. PMID:24165935
Liberek, K; Osipiuk, J; Zylicz, M; Ang, D; Skorko, J; Georgopoulos, C
1990-02-25
The process of initiation of lambda DNA replication requires the assembly of the proper nucleoprotein complex at the origin of replication, ori lambda. The complex is composed of both phage- and host-coded proteins. The lambda O initiator protein binds specifically to ori lambda. The lambda P initiator protein binds to both lambda O and the host-coded dnaB helicase, giving rise to an ori lambda DNA·lambda O·lambda P·dnaB structure. The dnaK and dnaJ heat shock proteins have been shown capable of dissociating this complex. The thus-freed dnaB helicase unwinds the duplex DNA template at the replication fork. In this report, through cross-linking, size chromatography, and protein affinity chromatography, we document some of the protein-protein interactions occurring at ori lambda. Our results show that the dnaK protein specifically interacts with both lambda O and lambda P, and that the dnaJ protein specifically interacts with the dnaB helicase.
Examples of current radar technology and applications, chapter 5, part B
NASA Technical Reports Server (NTRS)
1975-01-01
Basic principles and tradeoff considerations for SLAR are summarized. There are two fundamental types of SLAR sensors available to the remote sensing user: real aperture and synthetic aperture. The primary difference between the two types is that a synthetic aperture system is capable of significant improvements in target resolution but requires equally significant added complexity and cost. The advantages of real aperture SLAR include long range coverage, all-weather operation, in-flight processing and image viewing, and lower cost. The fundamental limitation of the real aperture approach is target resolution. Synthetic aperture processing is the most practical approach for remote sensing problems that require resolution higher than 30 to 40 m.
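For orientation, the textbook azimuth-resolution relations (not stated in the chapter) quantify this tradeoff for an antenna of length \(D\), wavelength \(\lambda\), and range \(R\):

\[
\delta_{a,\mathrm{real}} \approx \frac{R\lambda}{D},
\qquad
\delta_{a,\mathrm{SAR}} \approx \frac{D}{2}.
\]

Real-aperture resolution thus degrades linearly with range, while a focused synthetic aperture is, to first order, range-independent; this is why resolution finer than the 30 to 40 m limit at long range effectively requires synthetic aperture processing.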
Snapin mediates insulin secretory granule docking, but not trans-SNARE complex formation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Somanath, Sangeeta; Partridge, Christopher J.; Marshall, Catriona
Secretory granule exocytosis is a tightly regulated process requiring granule targeting, tethering, priming, and membrane fusion. At the heart of this process is the SNARE complex, which drives fusion through a coiled-coil zippering effect mediated by the granule v-SNARE protein, VAMP2, and the plasma membrane t-SNAREs, SNAP-25 and syntaxin-1A. Here we demonstrate that in pancreatic β-cells the C-terminal H2 domain of the SNAP-25 accessory protein snapin binds SNAP-25 through the latter's N-terminal Sn-1 domain. Interestingly, whilst snapin binds SNAP-25, there is only modest binding of this complex with syntaxin-1A under resting conditions. Instead, syntaxin-1A appears to be recruited in response to secretory stimulation. These results indicate that snapin plays a role in tethering insulin granules to the plasma membrane through coiled-coil interaction of snapin with SNAP-25, with full granule fusion competency only resulting after subsequent syntaxin-1A recruitment triggered by secretory stimulation. Highlights: • Snapin mediates granule docking. • Snapin binds SNAP-25. • The SNARE complex forms downstream.
Instrument control software requirement specification for Extremely Large Telescopes
NASA Astrophysics Data System (ADS)
Young, Peter J.; Kiekebusch, Mario J.; Chiozzi, Gianluca
2010-07-01
Engineers in several observatories are now designing the next generation of optical telescopes, the Extremely Large Telescopes (ELT). These are very complex machines that will host sophisticated astronomical instruments to be used for a wide range of scientific studies. In order to carry out scientific observations, a software infrastructure is required to orchestrate the control of the multiple subsystems and functions. This paper will focus on describing the considerations, strategies and main issues related to the definition and analysis of the software requirements for the ELT's Instrument Control System using modern development processes and modelling tools like SysML.
NASA Astrophysics Data System (ADS)
Korotkova, T. I.; Popova, V. I.
2017-11-01
The generalized mathematical model of decision-making in the problem of planning and mode selection for providing the required heat loads in a large heat supply system is considered. The system is multilevel, decomposed into levels of main and distribution heating networks with intermediate control stages. Evaluation of the effectiveness, reliability, and safety of such a complex system is carried out simultaneously according to several indicators, in particular pressure, flow rate, and temperature. This global multicriteria optimization problem with constraints is decomposed into a number of local optimization problems and a coordination problem. A coordinated solution of the local problems provides a solution to the global multicriteria decision-making problem in a complex system. The choice of the optimal operating mode of a complex heat supply system is made on the basis of an iterative coordination process, which converges to the coordinated solution of the local optimization tasks. The interactive principle of multicriteria decision-making includes, in particular, periodic adjustments that, if necessary, guarantee optimal safety, reliability, and efficiency of the system as a whole during operation. The degree of accuracy of the solution, for example, the degree of deviation of the internal air temperature from the required value, can also be changed interactively. This allows adjustment activities to be carried out in the best way and improves the quality of heat supply to consumers. At the same time, an energy-saving task is solved to determine the minimum required values of heads at sources and pumping stations.
Wood, Bradley M; Jia, Guang; Carmichael, Owen; McKlveen, Kevin; Homberger, Dominique G
2018-05-12
3D imaging techniques enable the non-destructive analysis and modeling of complex structures. Among these, MRI exhibits good soft tissue contrast, but is currently less commonly used for non-clinical research than x-ray CT, even though the latter requires contrast-staining that shrinks and distorts soft tissues. When the objective is the creation of a realistic and complete 3D model of soft tissue structures, MRI data are more demanding to acquire and visualize and require extensive post-processing because they comprise non-cubic voxels with dimensions that represent a trade-off between tissue contrast and image resolution. Therefore, thin soft tissue structures with complex spatial configurations are not always visible in a single MRI dataset, so that standard segmentation techniques are not sufficient for their complete visualization. By using the example of the thin and spatially complex connective tissue myosepta in lampreys, we developed a workflow protocol for the selection of the appropriate parameters for the acquisition of MRI data and for the visualization and 3D modeling of soft tissue structures. This protocol includes a novel recursive segmentation technique for supplementing missing data in one dataset with data from another dataset to produce realistic and complete 3D models. Such 3D models are needed for the modeling of dynamic processes, such as the biomechanics of fish locomotion. However, our methodology is applicable to the visualization of any thin soft tissue structures with complex spatial configurations, such as fasciae, aponeuroses, and small blood vessels and nerves, for clinical research and the further exploration of tensegrity. This article is protected by copyright. All rights reserved. © 2018 Wiley Periodicals, Inc.
Endothelin-converting enzyme 1 degrades neuropeptides in endosomes to control receptor recycling.
Roosterman, Dirk; Cottrell, Graeme S; Padilla, Benjamin E; Muller, Laurent; Eckman, Christopher B; Bunnett, Nigel W; Steinhoff, Martin
2007-07-10
Neuropeptide signaling requires the presence of G protein-coupled receptors (GPCRs) at the cell surface. Activated GPCRs interact with beta-arrestins, which mediate receptor desensitization, endocytosis, and mitogenic signaling, and the peptide-receptor-arrestin complex is sequestered into endosomes. Although dissociation of beta-arrestins is required for receptor recycling and resensitization, the critical event that initiates this process is unknown. Here we report that the agonist availability in the endosomes, controlled by the membrane metalloendopeptidase endothelin-converting enzyme 1 (ECE-1), determines stability of the peptide-receptor-arrestin complex and regulates receptor recycling and resensitization. Substance P (SP) binding to the tachykinin neurokinin 1 receptor (NK1R) induced membrane translocation of beta-arrestins followed by trafficking of the SP-NK1R-beta-arrestin complex to early endosomes containing ECE-1a-d. ECE-1 degraded SP in acidified endosomes, disrupting the complex; beta-arrestins returned to the cytosol, and the NK1R, freed from beta-arrestins, recycled and resensitized. An ECE-1 inhibitor, by preventing NK1R recycling in endothelial cells, inhibited resensitization of SP-induced inflammation. This mechanism is a general one because ECE-1 similarly regulated NK3R resensitization. Thus, peptide availability in endosomes, here regulated by ECE-1, determines the stability of the peptide-receptor-arrestin complex. This mechanism regulates receptor recycling, which is necessary for sustained signaling, and it may also control beta-arrestin-dependent mitogenic signaling of endocytosed receptors. We propose that other endosomal enzymes and transporters may similarly control the availability of transmitters in endosomes to regulate trafficking and signaling of GPCRs. Antagonism of these endosomal processes represents a strategy for inhibiting sustained signaling of receptors, and defects may explain the tachyphylaxis of drugs that are receptor agonists.
Grayscale lithography-automated mask generation for complex three-dimensional topography
NASA Astrophysics Data System (ADS)
Loomis, James; Ratnayake, Dilan; McKenna, Curtis; Walsh, Kevin M.
2016-01-01
Grayscale lithography is a relatively underutilized technique that enables fabrication of three-dimensional (3-D) microstructures in photosensitive polymers (photoresists). By spatially modulating ultraviolet (UV) dosage during the writing process, one can vary the depth at which photoresist is developed. This means complex structures and bioinspired designs can readily be produced that would otherwise be cost prohibitive or too time intensive to fabricate. The main barrier to widespread grayscale implementation, however, stems from the laborious generation of the mask files required to create complex surface topography. We present a process and associated software utility for automatically generating grayscale mask files from 3-D models created within industry-standard computer-aided design (CAD) suites. By shifting the microelectromechanical systems (MEMS) design onus to commonly used CAD programs ideal for complex surfacing, engineering professionals already familiar with traditional 3-D CAD software can readily utilize their pre-existing skills to make valuable contributions to the MEMS community. Our conversion process is demonstrated by prototyping several samples on a laser pattern generator, capital equipment already in use in many foundries. Finally, an empirical calibration technique is shown that compensates for nonlinear relationships between UV exposure intensity and photoresist development depth, as well as a thermal reflow technique to help smooth microstructure surfaces.
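A minimal sketch of the two steps described above, with invented calibration data: quantize a 3-D model's heightmap into gray levels by inverting a measured, nonlinear depth-vs-gray calibration curve, so each desired develop depth maps to the exposure level that produces it. The curve values and lens example are illustrative, not the paper's measurements.

```python
import numpy as np

# Invented calibration: gray level written by the pattern generator
# vs. resulting develop depth in micrometers (monotonic, nonlinear).
cal_gray  = np.array([  0,  32,  64,  96, 128, 160, 192, 224, 255])
cal_depth = np.array([0.0, 0.2, 0.5, 1.0, 1.8, 2.9, 4.3, 6.0, 8.0])

def heightmap_to_mask(depth_map):
    """Map desired develop depth (um) to grayscale exposure values."""
    d = np.clip(depth_map, cal_depth[0], cal_depth[-1])
    # np.interp inverts the monotonic calibration curve depth -> gray
    return np.interp(d, cal_depth, cal_gray).astype(np.uint8)

# Example: a spherical microlens dimple, 6 um deep, on a 256 x 256 grid
yy, xx = np.mgrid[-1:1:256j, -1:1:256j]
r2 = np.clip(1 - (xx**2 + yy**2), 0, None)
mask = heightmap_to_mask(6.0 * np.sqrt(r2))
```

Because the calibration table is empirical, re-measuring it per resist and developer batch is what compensates for the nonlinearity the abstract mentions.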
Challenges in treating post-traumatic stress disorder and attachment trauma.
Allen, Jon G
2003-06-01
Treating women suffering from trauma poses significant challenges. The diagnostic prototype of post-traumatic stress disorder (PTSD) is based on single-event trauma, such as sexual assault in adulthood. Several effective cognitive-behavioral treatments for such traumas have been developed, although many treated patients continue to experience residual symptoms. Even more problematic is the complex developmental psychopathology stemming from a lifetime history of multiple traumas, often beginning with maltreatment in early attachment relationships. A history of attachment trauma undermines the development of capacities to regulate emotional distress and thereby complicates the treatment of acute trauma in adulthood. Such complex trauma requires a multifaceted treatment approach that balances the processing of traumatic memories with strategies to contain the intense emotions this processing evokes. Moreover, conducting such treatment places therapists at risk for secondary trauma, so trauma therapists must also process this stressful experience and implement strategies to regulate their own distress.
Computational modelling of oxygenation processes in enzymes and biomimetic model complexes.
de Visser, Sam P; Quesne, Matthew G; Martin, Bodo; Comba, Peter; Ryde, Ulf
2014-01-11
With computational resources becoming more efficient, more powerful, and at the same time cheaper, computational methods have become increasingly popular for studies of biochemical and biomimetic systems. Although the scientific community has invested large efforts in exploring the possibilities of computational methods for studies of large biochemical systems, such studies are not without pitfalls and often cannot be done routinely; they require expert execution. In this review we summarize and highlight advances in computational methodology and its application to enzymatic and biomimetic model complexes. In particular, we emphasize topical, state-of-the-art methodologies that can either accurately reproduce experimental findings, e.g., spectroscopic parameters and rate constants, or predict short-lived intermediates and fast reaction processes in nature. Moreover, we give examples of processes for which certain computational methods dramatically fail.
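A concrete instance of the rate-constant benchmark mentioned above: a computed activation free energy can be converted to a rate constant via transition-state theory, k = (k_B T / h) exp(-ΔG‡/RT), and compared directly with experiment. The short Python sketch below illustrates the arithmetic with an invented barrier; it is not taken from the review:

    import math

    KB = 1.380649e-23    # Boltzmann constant, J/K
    H = 6.62607015e-34   # Planck constant, J*s
    R = 8.314462618      # gas constant, J/(mol*K)

    def eyring_rate(dg_act_kjmol, temp_k=298.15):
        # Transition-state theory: k = (kB*T/h) * exp(-dG/(R*T)).
        return (KB * temp_k / H) * math.exp(-dg_act_kjmol * 1e3 / (R * temp_k))

    # A hypothetical DFT barrier of 65 kJ/mol gives k on the order of
    # 10 s^-1 at room temperature, directly comparable with a measured
    # rate constant.
    print(f"k = {eyring_rate(65.0):.2e} s^-1")

Because the rate depends exponentially on the barrier, an error of only about 6 kJ/mol shifts k by roughly an order of magnitude at room temperature, which is why methods that reproduce rate constants accurately are singled out here.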
Collaborative Working Architecture for IoT-Based Applications.
Mora, Higinio; Signes-Pont, María Teresa; Gil, David; Johnsson, Magnus
2018-05-23
New sensing applications need enhanced computing capabilities to handle the requirements of complex, large-scale data processing. The Internet of Things (IoT) concept brings processing and communication features to devices, while the Cloud Computing paradigm provides the resources and infrastructure for performing the computations and outsourcing the work from the IoT devices. This scenario opens new opportunities for designing advanced IoT-based applications; however, much research remains to be done to make all these systems work together properly. This work proposes a collaborative model and an architecture that take advantage of the available computing resources. The resulting architecture involves a novel network design with different levels that combines sensing and processing capabilities based on the Mobile Cloud Computing (MCC) paradigm. An experiment demonstrates that this approach can be used in diverse real applications, and the results show the flexibility of the architecture to perform the complex computational tasks of advanced applications.
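The layered design ultimately rests on a per-task choice: execute on the sensing device itself or offload the work to an edge or cloud tier. The article does not include code, so the following Python sketch is a hypothetical cost model for that decision; the tier parameters and names are invented:

    from dataclasses import dataclass

    @dataclass
    class Task:
        cycles: float        # estimated CPU cycles to complete
        input_bytes: float   # payload that must travel if offloaded

    @dataclass
    class Tier:
        name: str
        cpu_hz: float        # processing speed of this tier
        uplink_bps: float    # bandwidth from the device to this tier
        rtt_s: float         # round-trip network latency

    def completion_time(task, tier):
        # Estimated latency: payload transfer time plus compute time.
        transfer = task.input_bytes * 8 / tier.uplink_bps + tier.rtt_s
        return transfer + task.cycles / tier.cpu_hz

    def choose_tier(task, tiers):
        # Pick the tier (device, edge, or cloud) with the lowest latency.
        return min(tiers, key=lambda t: completion_time(task, t))

    # Hypothetical three-level hierarchy: the local device pays no network
    # cost but has the least compute power.
    device = Tier("device", cpu_hz=1e9, uplink_bps=float("inf"), rtt_s=0.0)
    edge = Tier("edge", cpu_hz=8e9, uplink_bps=5e6, rtt_s=0.02)
    cloud = Tier("cloud", cpu_hz=50e9, uplink_bps=2e6, rtt_s=0.12)

    task = Task(cycles=4e9, input_bytes=2e5)
    print(choose_tier(task, [device, edge, cloud]).name)  # -> "edge"

A real deployment would also fold in energy budgets, monetary cost, and load balancing, but latency minimization captures the basic trade-off such a multi-level architecture exploits.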