Sample records for process typically requires

  1. Thermal Design, Analysis, and Testing of the Quench Module Insert Bread Board

    NASA Technical Reports Server (NTRS)

    Breeding, Shawn; Khodabandeh, Julia; Turner, Larry D. (Technical Monitor)

    2001-01-01

    The science requirement for materials processing is to provide the thermal gradient and solid/liquid interface front velocity specified by the Principal Investigator (PI) at the desired processing temperature. Processing is performed by translating the furnace while the sample remains stationary, to minimize disturbances to the solid/liquid interface front during steady-state processing. Typical sample materials for this metals-and-alloys furnace are lead-tin alloys, lead-antimony alloys, and aluminum alloys. Samples must be safe to process and are therefore typically contained within hermetically sealed cartridge tubes (gas tight) with inner ceramic liners (liquid tight) to prevent contamination and/or reaction of the sample material with the cartridge tube.
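
    The abstract gives no formula, but the three controlled quantities are tied together by the standard directional-solidification relation (stated here as background, not quoted from the report):

    ```latex
    % With the sample held stationary and the furnace translated at rate V,
    % the solid/liquid front advances at V through the imposed axial thermal
    % gradient G, so the cooling rate seen at the interface is
    \dot{T} = G \, V
    % e.g. G = 50 K/cm and V = 10 \mu m/s give \dot{T} = 0.05 K/s.
    ```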

  2. Study for identification of Beneficial uses of Space (BUS). Volume 3: Appendices

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The quantification of required specimen(s) from space processing experiments, the typical EMI measurements and estimates of a typical RF source, and the integration of commercial payloads into Spacelab were considered.

  3. Lesson 6: Registration

    EPA Pesticide Factsheets

    Lesson 6 provides CROMERR checklist items grouped under the Registration Process, where users establish their accounts in the system. This process typically requires users to provide information about themselves.

  4. Face-to-face interference in typical and atypical development

    PubMed Central

    Riby, Deborah M; Doherty-Sneddon, Gwyneth; Whittle, Lisa

    2012-01-01

    Visual communication cues facilitate interpersonal communication. It is important that we look at faces to retrieve and subsequently process such cues. It is also important that we sometimes look away from faces as they increase cognitive load that may interfere with online processing. Indeed, when typically developing individuals hold face gaze it interferes with task completion. In this novel study we quantify face interference for the first time in Williams syndrome (WS) and Autism Spectrum Disorder (ASD). These disorders of development impact on cognition and social attention, but how do faces interfere with cognitive processing? Individuals developing typically as well as those with ASD (n = 19) and WS (n = 16) were recorded during a question and answer session that involved mathematics questions. In phase 1 gaze behaviour was not manipulated, but in phase 2 participants were required to maintain eye contact with the experimenter at all times. Looking at faces decreased task accuracy for individuals who were developing typically. Critically, the same pattern was seen in WS and ASD, whereby task performance decreased when participants were required to hold face gaze. The results show that looking at faces interferes with task performance in all groups. This finding requires the caveat that individuals with WS and ASD found it harder than individuals who were developing typically to maintain eye contact throughout the interaction. Individuals with ASD struggled to hold eye contact at all points of the interaction while those with WS found it especially difficult when thinking. PMID:22356183

  5. Ground robotic measurement of aeolian processes

    USDA-ARS's Scientific Manuscript database

    Models of aeolian processes rely on accurate measurements of the rates of sediment transport by wind, and careful evaluation of the environmental controls of these processes. Existing field approaches typically require intensive, event-based experiments involving dense arrays of instruments. These d...

  6. Attention, Working Memory, and Grammaticality Judgment in Typical Young Adults

    ERIC Educational Resources Information Center

    Smith, Pamela A.

    2011-01-01

    Purpose: To examine resource allocation and sentence processing, this study examined the effects of auditory distraction on grammaticality judgment (GJ) of sentences varied by semantics (reversibility) and short-term memory requirements. Method: Experiment 1: Typical young adult females (N = 60) completed a whole-sentence GJ task in distraction…

  7. 41 CFR 102-36.35 - What is the typical process for disposing of excess personal property?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What is the typical... agency property or by obtaining excess property from other federal agencies in lieu of new procurements... eligible non-federal activities. Title 40 of the United States Code requires that surplus personal property...

  8. SAR operational aspects

    NASA Astrophysics Data System (ADS)

    Holmdahl, P. E.; Ellis, A. B. E.; Moeller-Olsen, P.; Ringgaard, J. P.

    1981-12-01

    The basic requirements of the SAR ground segment of ERS-1 are discussed. A system configuration for the real time data acquisition station and the processing and archive facility is depicted. The functions of a typical SAR processing unit (SPU) are specified, and inputs required for near real time and full precision, deferred time processing are described. Inputs and the processing required for provision of these inputs to the SPU are dealt with. Data flow through the systems, and normal and nonnormal operational sequence, are outlined. Prerequisites for maintaining overall performance are identified, emphasizing quality control. The most demanding tasks to be performed by the front end are defined in order to determine types of processors and peripherals which comply with throughput requirements.

  9. Simulation of Simple Controlled Processes with Dead-Time.

    ERIC Educational Resources Information Center

    Watson, Keith R.; And Others

    1985-01-01

    The determination of closed-loop response of processes containing dead-time is typically not covered in undergraduate process control, possibly because the solution by Laplace transforms requires the use of Pade approximation for dead-time, which makes the procedure lengthy and tedious. A computer-aided method is described which simplifies the…

  10. Energy-Performance-Based Design-Build Process: Strategies for Procuring High-Performance Buildings on Typical Construction Budgets: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scheib, J.; Pless, S.; Torcellini, P.

    NREL experienced a significant increase in employees and facilities on our 327-acre main campus in Golden, Colorado over the past five years. To support this growth, researchers developed and demonstrated a new building acquisition method that successfully integrates energy efficiency requirements into the design-build requests for proposals and contracts. We piloted this energy performance based design-build process with our first new construction project in 2008. We have since replicated and evolved the process for large office buildings, a smart grid research laboratory, a supercomputer, a parking structure, and a cafeteria. Each project incorporated aggressive efficiency strategies using contractual energy use requirements in the design-build contracts, all on typical construction budgets. We have found that when energy efficiency is a core project requirement as defined at the beginning of a project, innovative design-build teams can integrate the most cost effective and high performance efficiency strategies on typical construction budgets. When the design-build contract includes measurable energy requirements and is set up to incentivize design-build teams to focus on achieving high performance in actual operations, owners can now expect their facilities to perform. As NREL completed the new construction in 2013, we have documented our best practices in training materials and a how-to guide so that other owners and owner's representatives can replicate our successes and learn from our experiences in attaining market viable, world-class energy performance in the built environment.

  11. Application of laser anemometry in turbine engine research

    NASA Technical Reports Server (NTRS)

    Seasholtz, R. G.

    1983-01-01

    The application of laser anemometry to the study of flow fields in turbine engine components is reviewed. Included are discussions of optical configurations, seeding requirements, electronic signal processing, and data processing. Some typical results are presented along with a discussion of ongoing work.

  12. Application of laser anemometry in turbine engine research

    NASA Technical Reports Server (NTRS)

    Seasholtz, R. G.

    1982-01-01

    The application of laser anemometry to the study of flow fields in turbine engine components is reviewed. Included are discussions of optical configurations, seeding requirements, electronic signal processing, and data processing. Some typical results are presented along with a discussion of ongoing work.

  13. On the energy budget in the current disruption region [of the geomagnetic tail]

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Birn, Joachim

    1993-01-01

    This study investigates the energy budget in the current disruption region of the magnetotail, coincident with a pre-onset thin current sheet, around substorm onset time using published observational data and theoretical estimates. We find that the current disruption/dipolarization process typically requires energy inflow into the primary disruption region. The disruption/dipolarization process is therefore endoenergetic, i.e., it requires energy input to operate. We therefore argue that some other simultaneously operating process, possibly a large-scale magnetotail instability, is required to provide the necessary energy input into the current disruption region.

  14. Improving Competition: Reforming the Requirements Process

    DTIC Science & Technology

    2016-07-01

    Roy Wood, Ph.D. Wood is the Acting Vice President...professional. Typical acquisition reform efforts have been focused in the margins, achieving marginal results. The evidence of decades of acquisition reform indicates that the marginal reforms typically taken are not making the fundamental changes needed by the Department of Defense (DoD

  15. Developing lettuce with improved quality for processed salads.

    USDA-ARS's Scientific Manuscript database

    Lettuce is increasingly consumed as minimally processed salads. Cultivars grown for this market may require breeding for improved shelf-life and resistance to physiological defects such as tipburn (TB). Tipburn is a calcium deficiency related defect causing necrosis on the leaf margins, typically on...

  16. Improved Edge Performance in MRF

    NASA Technical Reports Server (NTRS)

    Shorey, Aric; Jones, Andrew; Durnas, Paul; Tricard, Marc

    2004-01-01

    The fabrication of large segmented optics requires a polishing process that can correct the figure of a surface to within a short distance from its edges, typically a few millimeters. The work here is to develop QED's Magnetorheological Finishing (MRF) precision polishing process to minimize residual edge effects.

  17. Power and Performance Trade-offs for Space Time Adaptive Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gawande, Nitin A.; Manzano Franco, Joseph B.; Tumeo, Antonino

    Computational efficiency – performance relative to power or energy – is one of the most important concerns when designing RADAR processing systems. This paper analyzes power and performance trade-offs for a typical Space Time Adaptive Processing (STAP) application. We study STAP implementations for CUDA and OpenMP on two computationally efficient architectures, Intel Haswell Core i7-4770TE and NVIDIA Kayla with a GK208 GPU. We analyze the power and performance of STAP's computationally intensive kernels across the two hardware testbeds. We also show the impact and trade-offs of GPU optimization techniques. We show that data parallelism can be exploited for efficient implementation on the Haswell CPU architecture. The GPU architecture is able to process large data sets without an increase in power requirement. The use of shared memory has a significant impact on the power requirement for the GPU. A balance between the use of shared memory and main memory access leads to improved performance in a typical STAP application.
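
    As a rough illustration of the kernel being profiled, the sketch below implements the textbook STAP weight computation in NumPy; the problem sizes, steering vector, and diagonal loading are assumptions for illustration, not details taken from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_dof = 64      # spatial channels x Doppler taps (assumed problem size)
    n_train = 256   # training snapshots

    # Simulated interference-plus-noise training data, shape (n_dof, n_train).
    X = (rng.standard_normal((n_dof, n_train))
         + 1j * rng.standard_normal((n_dof, n_train))) / np.sqrt(2)

    R = X @ X.conj().T / n_train        # sample covariance estimate
    R += 1e-3 * np.eye(n_dof)           # diagonal loading for numerical stability

    v = np.ones(n_dof, dtype=complex)   # space-time steering vector (assumed)
    w = np.linalg.solve(R, v)           # R^{-1} v: the O(n^3) kernel that
    w /= v.conj() @ w                   # dominates the power/performance profile

    y = w.conj() @ X                    # apply the adaptive filter to the snapshots
    ```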

  18. Biological Hydrogen Production: Simultaneous Saccharification and Fermentation with Nitrogen and Phosphorus Removal from Wastewater Effluent

    DTIC Science & Technology

    2010-07-29

    be directly catalyzed to monosaccharides by cellulases without requiring thermochemical pretreatment, as would typically be required with lignocellulosic ...of a similar process with lignocellulosic biomass, although such biomass would likely require thermochemical pretreatment prior to enzymatic...by the automatic addition of 0.1 N NaOH. Total organic carbon (TOC), ammonia nitrogen, nitrate nitrogen, nitrite nitrogen and phosphorus analyses

  19. Biological Hydrogen Production: Simultaneous Saccharification and Fermentation With Nitrogen and Phosphorus Removal from Wastewater Effluent

    DTIC Science & Technology

    2010-01-01

    requiring thermochemical pretreatment, as would typically be required with lignocellulosic feedstocks. Therefore it offers a readily-processed and...Standards and Technology. The pH of the reactors was controlled throughout all fermentations by the automatic addition of 0.1 N NaOH. Total organic...nutrients. The optimized conditions developed with paper as a substrate may also convey to the use of a similar process with lignocellulosic biomass

  20. Near infrared spectroscopy in the forest products industry

    Treesearch

    Chi-Leung So; Brian K. Via; Leslie H. Groom; Lawrence R. Schimleck; Todd F. Shupe; Stephen S. Kelley; Timothy G. Rials

    2004-01-01

    Improving manufacturing efficiency and increasing product worth requires the right combination of actions throughout the manufacturing process. Many innovations have been developed over the last several decades to achieve these goals. Innovations typically work their way backwards in the manufacturing process, with an increasing level of monitoring occurring at the end...

  1. A Design Rationale Capture Using REMAP/MM

    DTIC Science & Technology

    1994-06-01

    company-wide down-sizing, the power company has determined that an automated service order processing system is the most economical solution. This new...service order processing system for a large power company can easily be modeled. A system of this complexity would typically require three to five years

  2. Orbital construction support equipment

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Approximately 200 separate construction steps were defined for the three solar power satellite (SPS) concepts. Detailed construction scenarios were developed which describe the specific tasks to be accomplished, and identify general equipment requirements. The scenarios were used to perform a functional analysis, which resulted in the definition of 100 distinct SPS elements. These elements are the components, parts, subsystems, or assemblies upon which construction activities take place. The major SPS elements for each configuration are shown. For those elements, 300 functional requirements were identified in seven generic processes. Cumulatively, these processes encompass all functions required during SPS construction/assembly. Individually each process is defined such that it includes a specific type of activity. Each SPS element may involve activities relating to any or all of the generic processes. The processes are listed, and examples of the requirements defined for a typical element are given.

  3. Defect printability for high-exposure dose advanced packaging applications

    NASA Astrophysics Data System (ADS)

    Mikles, Max; Flack, Warren; Nguyen, Ha-Ai; Schurz, Dan

    2003-12-01

    Pellicles are used in semiconductor lithography to minimize printable defects and reduce reticle cleaning frequency. However, there are a growing number of microlithography applications, such as advanced packaging and nanotechnology, where it is not clear that pellicles always offer a significant benefit. These applications have relatively large critical dimensions and require ultra thick photoresists with extremely high exposure doses. Given that the lithography is performed in Class 100 cleanroom conditions, it is possible that the risk of defects from contamination is sufficiently low that pellicles would not be required on certain process layer reticles. The elimination of the pellicle requirement would provide a cost reduction by saving the original pellicle cost and eliminating future pellicle replacement and repair costs. This study examines the imaging potential of defects with reticle patterns and processes typical for gold-bump and solder-bump advanced packaging lithography. The test reticle consists of 30 to 90 μm octagonal contact patterns representative of advanced packaging reticles. Programmed defects are added that represent the range of particle sizes (3 to 30 μm) normally protected by the pellicle and that are typical of advanced packaging lithography cleanrooms. The reticle is exposed using an Ultratech Saturn Spectrum 300e2 1X stepper on wafers coated with a variety of ultra thick (30 to 100 μm) positive and negative-acting photoresists commonly used in advanced packaging. The experimental results show that in many cases smaller particles continue to be yield issues for the feature size and density typical of advanced packaging processes. For the two negative photoresists studied it appears that a pellicle is not required for protection from defects smaller than 10 to 15 μm, depending on the photoresist thickness. Thus the decision on pellicle usage for these materials would need to be made based on the device fabrication process and the cleanliness of a fabrication facility. For the two positive photoresists studied it appears that a pellicle is required to protect from defects down to 3 μm, depending on the photoresist thickness. This suggests that a pellicle should always be used for these materials. Since a typical fabrication facility would use both positive and negative photoresists it may be advantageous to use pellicles on all reticles simply to avoid confusion. The cost savings of not using a pellicle could easily be outweighed by the yield benefits of using one.

  4. Development of Integrated Programs for Aerospace-vehicle design (IPAD): Integrated information processing requirements

    NASA Technical Reports Server (NTRS)

    Southall, J. W.

    1979-01-01

    The engineering-specified requirements for integrated information processing by means of the Integrated Programs for Aerospace-Vehicle Design (IPAD) system are presented. A data model is described and is based on the design process of a typical aerospace vehicle. General data management requirements are specified for data storage, retrieval, generation, communication, and maintenance. Information management requirements are specified for a two-component data model. In the general portion, data sets are managed as entities, and in the specific portion, data elements and the relationships between elements are managed by the system, allowing user access to individual elements for the purpose of query. Computer program management requirements are specified for support of a computer program library, control of computer programs, and installation of computer programs into IPAD.

  5. "Hunting with a knife and ... fork": examining central coherence in autism, attention deficit/hyperactivity disorder, and typical development with a linguistic task.

    PubMed

    Booth, Rhonda; Happé, Francesca

    2010-12-01

    A local processing bias, referred to as "weak central coherence," has been postulated to underlie key aspects of autism spectrum disorder (ASD). Little research has examined whether individual differences in this cognitive style can be found in typical development, independent of intelligence, and how local processing relates to executive control. We present a brief and easy-to-administer test of coherence requiring global sentence completions. We report results from three studies assessing (a) 176 typically developing (TD) 8- to 25-year-olds, (b) individuals with ASD and matched controls, and (c) matched groups with ASD or attention deficit/hyperactivity disorder (ADHD). The results suggest that the Sentence Completion Task can reveal individual differences in cognitive style unrelated to IQ in typical development, that most (but not all) people with ASD show weak coherence on this task, and that performance is not related to inhibitory control. The Sentence Completion Task was found to be a useful test instrument, capable of tapping local processing bias in a range of populations. (c) 2010 Elsevier Inc. All rights reserved.

  6. Near Infrared Spectroscopy in the Forest Products Industry, Forest Products Journal

    Treesearch

    Chi-Leung So; Brian K. Via; Leslie H. Groom; Laurence R. Schimleck; Todd F. Shupe; Stephen S. Kelley; Timothy G. Rials

    2004-01-01

    Improving manufacturing efficiency and increasing product worth requires the right combination of actions throughout the manufacturing process. Many innovations have been developed over the last several decades to achieve these goals. Innovations typically work their way backwards in the manufacturing process, with an increasing level of monitoring occurring at the...

  7. Neural Evidence of Allophonic Perception in Children at Risk for Dyslexia

    ERIC Educational Resources Information Center

    Noordenbos, M. W.; Segers, E.; Serniclaes, W.; Mitterer, H.; Verhoeven, L.

    2012-01-01

    Learning to read is a complex process that develops normally in the majority of children and requires the mapping of graphemes to their corresponding phonemes. Problems with the mapping process nevertheless occur in about 5% of the population and are typically attributed to poor phonological representations, which are--in turn--attributed to…

  8. 42 CFR 137.294 - What is the typical IHS environmental review process for construction projects?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... impact on the environment, and therefore do not require environmental impact statements (EIS). Under..., state, and local officials and interested parties on potential environmental effects; (2) Document...

  9. 42 CFR 137.294 - What is the typical IHS environmental review process for construction projects?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... impact on the environment, and therefore do not require environmental impact statements (EIS). Under..., state, and local officials and interested parties on potential environmental effects; (2) Document...

  10. OPPE, FPPE, QPS, and why the alphabet soup of physician assessment is essential for safer patient care.

    PubMed

    Loftus, Michael L

    Creating a successful quality and patient safety program requires a multifaceted approach that systematically reviews overall systems and processes, but also creates a standardized framework for evaluating individual practitioner performance on a routine basis. There are two required elements of competency assessment that are typically tied to the hospital credentialing process: ongoing professional practice evaluation (OPPE) and focused professional practice evaluation (FPPE). Each of these processes is mandated by the Joint Commission, and forms an important cornerstone for ensuring adequate physician performance and knowledge base. Copyright © 2017. Published by Elsevier Inc.

  11. Silvicultural treatments

    Treesearch

    Carl E. Fiedler

    2000-01-01

    Sustainable, ecologically-based management of pine/fir forests requires silviculturists to integrate several treatments that emulate historic disturbance processes. Restoration prescriptions typically include cleaning or heavy understory thinning, improvement cutting to reduce the proportion of firs, and modified selection cutting to reduce overall stand density,...

  12. Pilot information system for cross-border hazmat transportation.

    DOT National Transportation Integrated Search

    2009-10-01

    Under NAFTA requirements, all hazardous materials that are shipped into Mexico or generated during the manufacturing process must be shipped back to their point of origin, typically the United States. Thus, the delivery and return of hazardous mate...

  13. Narrative comprehension in 4-7-year-old children with autism: testing the Weak Central Coherence account.

    PubMed

    Nuske, Heather Joy; Bavin, Edith L

    2011-01-01

    Despite somewhat spared structural language development in high-functioning autism, communicative comprehension deficits persist. Comprehension involves the integration of meaning: global processing is required. The Weak Central Coherence theory suggests that individuals with autism are biased to process information locally. This cognitive style may impair comprehension, particularly if inferencing is required. However, task performance may be facilitated by this cognitive style if local processing is required. The current study was designed to examine the extent to which the 'weak central coherence' cognitive style affects comprehension and inferential processing of spoken narratives. The children with autism were expected to perform comparatively poorer on inferences relating to event scripts and comparatively better on inferences requiring deductive reasoning. Fourteen high-functioning children with autism were recruited from databases of various autism organizations (mean age = 6:7, 13 males, one female) and were matched on a receptive vocabulary and a picture-completion task with 14 typically developing children recruited from a local childcare centre (mean age = 4:10, seven males, seven females). The children were read short stories and asked questions about the stories. Results indicated that the children with autism were less able to make inferences based on event scripts, but the groups did not differ significantly on inferences requiring deductive logical reasoning. Despite similar group performance on questions relating to the main idea of the stories, only for the typically developing group was good performance on extracting the main idea of the narratives significantly correlated with performance on all other comprehension tasks. Findings provide some support for the Weak Central Coherence theory and demonstrate that young children with autism do not spontaneously integrate information in order to make script inferences, as do typically developing children. These findings may help to explain communicative problems of young children with autism and can be applied to intervention programme development. More research on the link between a 'weak central coherence' cognitive style and communicative comprehension in autism will be valuable in understanding the comprehension deficits associated with autism. © 2010 Royal College of Speech & Language Therapists.

  14. Gaussian Processes for Data-Efficient Learning in Robotics and Control.

    PubMed

    Deisenroth, Marc Peter; Fox, Dieter; Rasmussen, Carl Edward

    2015-02-01

    Autonomous learning has been a promising direction in control and robotics for more than a decade, since data-driven learning allows one to reduce the amount of engineering knowledge that is otherwise required. However, autonomous reinforcement learning (RL) approaches typically require many interactions with the system to learn controllers, which is a practical limitation in real systems, such as robots, where many interactions can be impractical and time consuming. To address this problem, current learning approaches typically require task-specific knowledge in the form of expert demonstrations, realistic simulators, pre-shaped policies, or specific knowledge about the underlying dynamics. In this paper, we follow a different approach and speed up learning by extracting more information from data. In particular, we learn a probabilistic, non-parametric Gaussian process transition model of the system. By explicitly incorporating model uncertainty into long-term planning and controller learning our approach reduces the effects of model errors, a key problem in model-based learning. Compared to state-of-the-art RL our model-based policy search method achieves an unprecedented speed of learning. We demonstrate its applicability to autonomous learning in real robot and control tasks.
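
    A minimal sketch of the core ingredient, a GP transition model that returns both a prediction and its uncertainty, is given below; the kernel choice, hyperparameters, and toy dynamics are illustrative assumptions, not details from the paper:

    ```python
    import numpy as np

    def rbf(A, B, ell=1.0, sf2=1.0):
        """Squared-exponential kernel between row-stacked input sets A and B."""
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return sf2 * np.exp(-0.5 * d2 / ell**2)

    rng = np.random.default_rng(1)
    X = rng.uniform(-3, 3, (30, 2))    # observed (state, action) pairs
    y = np.sin(X[:, 0]) + 0.1 * X[:, 1] + 0.05 * rng.standard_normal(30)  # next states

    sn2 = 0.05 ** 2                    # observation-noise variance
    K = rbf(X, X) + sn2 * np.eye(len(X))
    alpha = np.linalg.solve(K, y)

    Xq = np.array([[0.5, -1.0]])       # query (state, action)
    Kq = rbf(Xq, X)
    mean = Kq @ alpha                                   # predicted next state
    var = rbf(Xq, Xq) - Kq @ np.linalg.solve(K, Kq.T)   # model uncertainty
    ```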

  15. Recent developments in membrane-based separations in biotechnology processes: review.

    PubMed

    Rathore, A S; Shirke, A

    2011-01-01

    Membrane-based separations are the most ubiquitous unit operations in biotech processes. There are several key reasons for this. First, they can be used with a large variety of applications including clarification, concentration, buffer exchange, purification, and sterilization. Second, they are available in a variety of formats, such as depth filtration, ultrafiltration, diafiltration, nanofiltration, reverse osmosis, and microfiltration. Third, they are simple to operate and are generally robust toward normal variations in feed material and operating parameters. Fourth, membrane-based separations typically require lower capital cost when compared to other processing options. As a result of these advantages, a typical biotech process has anywhere from 10 to 20 membrane-based separation steps. In this article we review the major developments that have occurred on this topic with a focus on developments in the last 5 years.

  16. Implications of Decreased Nitrite Concentrations on Clostridium perfringens Outgrowth during Cooling of Ready-to-Eat Meats.

    PubMed

    Myers, Megan I; Sebranek, Joseph G; Dickson, James S; Shaw, Angela M; Tarté, Rodrigo; Adams, Kristin R; Neibuhr, Steve

    2016-01-01

    Increased popularity of natural and organic processed meats can be attributed to the growing consumer demand for preservative-free foods, including processed meats. To meet this consumer demand, meat processors have begun using celery juice concentrate in place of sodium nitrite to create products labeled as no-nitrate or no-nitrite-added meat products while maintaining the characteristics unique to conventionally cured processed meats. Because of flavor limitations, natural cures with celery concentrate typically provide lower ingoing nitrite concentrations for ready-to-eat processed meats than do conventional cures, which could allow for increased growth of pathogens, such as Clostridium perfringens, during cooked product cooling such as that required by the U.S. Department of Agriculture. The objective of this study was to investigate the implications associated with reduced nitrite concentrations for preventing C. perfringens outgrowth during a typical cooling cycle used for cooked products. Nitrite treatments of 0, 50, and 100 ppm were tested in a broth system inoculated with a three-strain C. perfringens cocktail and heated with a simulated product thermal process followed by a typical cooling-stabilization process. The nitrite concentration of 50 ppm was more effective for preventing C. perfringens outgrowth than was 0 ppm but was not as effective as 100 ppm. The interaction between nitrite and temperature significantly affected (P < 0.05) C. perfringens outgrowth in both total population and number of vegetative cells. Both temperature and nitrite concentration significantly affected (P < 0.05) C. perfringens spore survival, but the interaction between nitrite and temperature did not have a significant effect (P > 0.05) on spore outgrowth. Results indicate that decreased nitrite concentrations (50 ppm) have increased potential for total C. perfringens population outgrowth during cooling and may require additional protective measures, such as faster chilling rates.

  17. Juggling Act: Re-Planning and Building an Observatory...Simultaneously!

    NASA Technical Reports Server (NTRS)

    Zavala, Eddie; Daws, Patricia

    2011-01-01

    SOFIA (Stratospheric Observatory for Infrared Astronomy) is a major SMD program that has been required to meet several new requirements and implement major planning and business initiatives over the past 1 1/2 years, in the midst of system development and flight test phases. The program was required to implement JCL and EVM simultaneously, as well as undergo a major replan and Standing Review Board, all without impacting technical schedule progress. The team developed innovative processes that met all the requirements and improved Program Management process toolsets. The SOFIA team, being subject to all the typical budget constraints, found ways to leverage existing roles in new ways to meet the requirements without creating unmanageable overhead. The team developed strategies and value-added processes, such as improved risk identification, structured reserves management, and cost/risk integration, so that the effort expended resulted in a positive return to the program.

  18. A requirements index for information processing in hospitals.

    PubMed

    Ammenwerth, E; Buchauer, A; Haux, R

    2002-01-01

    Reference models describing typical information processing requirements in hospitals do not currently exist. This leads to high hospital information system (HIS) management expenses, for example, during tender processes for the acquisition of software application programs. Our aim was, therefore, to develop a comprehensive, lasting, technology-independent, and sufficiently detailed index of requirements for information processing in hospitals in order to reduce respective expenses. Two dozen German experts established an index of requirements for information processing in university hospitals. This was done in a consensus-based, top-down, cyclic manner. Each functional requirement was derived from information processing functions and sub-functions of a hospital. The result is the first official German version of a requirements index, containing 233 functional requirements and 102 function-independent requirements, focusing on German needs. The functional requirements are structured according to the primary care process from admission to discharge and supplemented by requirements for handling patient records, work organization and resource planning, hospital management, research and education. Both the German version and its English translation are available on the Internet. The index of requirements contains general information processing requirements in hospitals which are formulated independently of information processing tools or of HIS architectures. It aims at supporting HIS management, especially HIS strategic planning, HIS evaluation, and tender processes. The index can be regarded as a draft, which must, however, be refined according to the specific aims of a particular project. Although focused on German needs, we expect that it can also be useful in other countries. The high level of interest shown in the index supports its usefulness.

  19. Three-phase flow? Consider helical-coil heat exchangers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haraburda, S.S.

    1995-07-01

    In recent years, chemical process plants are increasingly encountering processes that require heat exchange in three-phase fluids. A typical application, for example, is heating liquids containing solid catalyst particles and non-condensable gases. Heat exchangers designed for three-phase flow generally have tubes with large diameters (typically greater than two inches), because solids can build up inside the tube and lead to plugging. At the same time, in order to keep heat-transfer coefficients high, the velocity of the process fluid within the tube should also be high. As a result, heat exchangers for three-phase flow may require fewer than five tubes -- each having a required linear length that could exceed several hundred feet. Given these limitations, it is obvious that a basic shell-and-tube heat exchanger is not the most practical solution for this purpose. An alternative for three-phase flow is a helical-coil heat exchanger. The helical-coil units offer a number of advantages, including perpendicular, counter-current flow and flexible overall dimensions for the exchanger itself. The paper presents equations for calculating the tube-side heat-transfer coefficient, the shell-side heat-transfer coefficient, the heat-exchanger size, the tube-side pressure drop, and the shell-side pressure drop.
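
    The paper's equations are not reproduced in the abstract; as a hedged sketch, a commonly used form of the tube-side relation corrects the straight-tube Dittus-Boelter result for coil curvature:

    ```latex
    \mathrm{Nu}_{\mathrm{straight}} = 0.023\,\mathrm{Re}^{0.8}\,\mathrm{Pr}^{0.4},
    \qquad
    \mathrm{Nu}_{\mathrm{coil}} = \mathrm{Nu}_{\mathrm{straight}}
      \left( 1 + 3.5\,\frac{d_i}{D_c} \right),
    \qquad
    h_i = \frac{\mathrm{Nu}_{\mathrm{coil}}\,k}{d_i}
    % d_i: tube inside diameter; D_c: coil (helix) diameter; k: fluid thermal
    % conductivity. The (1 + 3.5 d_i/D_c) factor accounts for the secondary
    % flow induced by the coil curvature.
    ```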

  20. Manufacture and quality control of interconnecting wire harnesses, Volume 1

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A standard is presented for manufacture, installation, and quality control of eight types of interconnecting wire harnesses. The processes, process controls, and inspection and test requirements reflected are based on acknowledgment of harness design requirements, acknowledgment of harness installation requirements, identification of the various parts, materials, etc., utilized in harness manufacture, and formulation of a typical manufacturing flow diagram for identification of each manufacturing and quality control process, operation, inspection, and test. The document covers interconnecting wire harnesses defined in the design standard, including type 1, enclosed in fluorocarbon elastomer convolute tubing; type 2, enclosed in TFE convolute tubing lined with fiberglass braid; type 3, enclosed in TFE convolute tubing; and type 5, combination of types 3 and 4. Knowledge gained through experience on the Saturn 5 program coupled with recent advances in techniques, materials, and processes was incorporated.

  1. Semantic Typicality Effects in Acquired Dyslexia: Evidence for Semantic Impairment in Deep Dyslexia.

    PubMed

    Riley, Ellyn A; Thompson, Cynthia K

    2010-06-01

    BACKGROUND: Acquired deep dyslexia is characterized by impairment in grapheme-phoneme conversion and production of semantic errors in oral reading. Several theories have attempted to explain the production of semantic errors in deep dyslexia, some proposing that they arise from impairments in both grapheme-phoneme and lexical-semantic processing, and others proposing that such errors stem from a deficit in phonological production. Whereas both views have gained some acceptance, the limited evidence available does not clearly eliminate the possibility that semantic errors arise from a lexical-semantic input processing deficit. AIMS: To investigate semantic processing in deep dyslexia, this study examined the typicality effect in deep dyslexic individuals, phonological dyslexic individuals, and controls using an online category verification paradigm. This task requires explicit semantic access without speech production, focusing observation on semantic processing from written or spoken input. METHODS & PROCEDURES: To examine the locus of semantic impairment, the task was administered in visual and auditory modalities with reaction time as the primary dependent measure. Nine controls, six phonological dyslexic participants, and five deep dyslexic participants completed the study. OUTCOMES & RESULTS: Controls and phonological dyslexic participants demonstrated a typicality effect in both modalities, while deep dyslexic participants did not demonstrate a typicality effect in either modality. CONCLUSIONS: These findings suggest that deep dyslexia is associated with a semantic processing deficit. Although this does not rule out the possibility of concomitant deficits in other modules of lexical-semantic processing, this finding suggests a direction for treatment of deep dyslexia focused on semantic processing.

  2. Enhanced pure-tone pitch discrimination among persons with autism but not Asperger syndrome.

    PubMed

    Bonnel, Anna; McAdams, Stephen; Smith, Bennett; Berthiaume, Claude; Bertone, Armando; Ciocca, Valter; Burack, Jacob A; Mottron, Laurent

    2010-07-01

    Persons with Autism spectrum disorders (ASD) display atypical perceptual processing in visual and auditory tasks. In vision, Bertone, Mottron, Jelenic, and Faubert (2005) found that enhanced and diminished visual processing is linked to the level of neural complexity required to process stimuli, as proposed in the neural complexity hypothesis. Based on these findings, Samson, Mottron, Jemel, Belin, and Ciocca (2006) proposed to extend the neural complexity hypothesis to the auditory modality. They hypothesized that persons with ASD should display enhanced performance for simple tones that are processed in primary auditory cortical regions, but diminished performance for complex tones that require additional processing in associative auditory regions, in comparison to typically developing individuals. To assess this hypothesis, we designed four auditory discrimination experiments targeting pitch, non-vocal and vocal timbre, and loudness. Stimuli consisted of spectro-temporally simple and complex tones. The participants were adolescents and young adults with autism, Asperger syndrome, and typical developmental histories, all with IQs in the normal range. Consistent with the neural complexity hypothesis and enhanced perceptual functioning model of ASD (Mottron, Dawson, Soulières, Hubert, & Burack, 2006), the participants with autism, but not with Asperger syndrome, displayed enhanced pitch discrimination for simple tones. However, no discrimination-threshold differences were found between the participants with ASD and the typically developing persons across spectrally and temporally complex conditions. These findings indicate that enhanced pure-tone pitch discrimination may be a cognitive correlate of speech delay among persons with ASD. However, auditory discrimination among this group does not appear to be directly contingent on the spectro-temporal complexity of the stimuli. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  3. Zombie algorithms: a timesaving remote sensing systems engineering tool

    NASA Astrophysics Data System (ADS)

    Ardanuy, Philip E.; Powell, Dylan C.; Marley, Stephen

    2008-08-01

    In modern horror fiction, zombies are generally undead corpses brought back from the dead by supernatural or scientific means, and are rarely under anyone's direct control. They typically have very limited intelligence, and hunger for the flesh of the living [1]. Typical spectroradiometric or hyperspectral instruments provide calibrated radiances for a number of remote sensing algorithms. The algorithms typically must meet specified latency and availability requirements while yielding products at the required quality. These systems, whether research, operational, or a hybrid, are typically cost constrained. Complexity of the algorithms can be high, and may evolve and mature over time as sensor characterization changes, product validation occurs, and areas of scientific basis improvement are identified and completed. This suggests the need for a systems engineering process for algorithm maintenance that is agile, cost efficient, repeatable, and predictable. Experience on remote sensing science data systems suggests the benefits of "plug-n-play" concepts of operation. The concept, while intuitively simple, can be challenging to implement in practice. The use of zombie algorithms, empty shells that outwardly resemble the form, fit, and function of a "complete" algorithm without the implemented theoretical basis, provides the ground system advantages equivalent to those obtained by integrating sensor engineering models onto the spacecraft bus. Combining this with a mature, repeatable process for incorporating the theoretical basis, or scientific core, into the "head" of the zombie algorithm, along with associated scripting and registration, provides an easy "on ramp" for the rapid and low-risk integration of scientific applications into operational systems.
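
    The paper describes a concept rather than code, but a minimal sketch may make the "empty shell" idea concrete; everything here, names included, is hypothetical:

    ```python
    from typing import Callable
    import numpy as np

    class ZombieAlgorithm:
        """Shell with the form, fit, and function of a product algorithm;
        the scientific core is plugged in later without touching the shell."""

        def __init__(self, name: str):
            self.name = name
            # Until a core is registered, pass calibrated radiances through unchanged.
            self.core: Callable[[np.ndarray], np.ndarray] = lambda radiances: radiances

        def register_core(self, core: Callable[[np.ndarray], np.ndarray]) -> None:
            """Swap in the matured theoretical basis (the zombie gets a 'head')."""
            self.core = core

        def run(self, radiances: np.ndarray) -> np.ndarray:
            # Shell responsibilities -- validation, latency/availability
            # bookkeeping, product formatting -- would live here.
            assert radiances.ndim == 1, "expected a 1-D vector of calibrated radiances"
            return self.core(radiances)

    # The shell integrates into the ground system first, science later:
    alg = ZombieAlgorithm("sst_retrieval")         # hypothetical product name
    alg.run(np.array([1.0, 2.0, 3.0]))             # runs end-to-end as a no-op
    alg.register_core(lambda r: 0.5 * r + 1.0)     # stand-in "scientific core"
    product = alg.run(np.array([1.0, 2.0, 3.0]))
    ```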

  4. Index change of chalcogenide materials from precision glass molding processes

    NASA Astrophysics Data System (ADS)

    Deegan, J.; Walsh, K.; Lindberg, G.; Benson, R.; Gibson, D.; Bayya, S.; Sanghera, J.; Stover, E.

    2015-05-01

    With the increase in demand for infrared optics for thermal applications and the use of glass molding of chalcogenide materials to support these higher volume optical designs, an investigation of changes to the optical properties of these materials is required. Typical precision glass molding requires specific thermal conditions for proper lens molding of any type of optical glass. With these conditions a change (reduction) of optical index occurs after molding of all oxide glass types and it is presumed that a similar behavior will happen with chalcogenide based materials. We will discuss the effects of a typical molding thermal cycle for use with commercially and newly developed chalcogenide materials and show results of index variation from nominally established material data.

  5. ISO 9001: 2008 Quality Assurance Assessment of Defense Acquisition University Processes

    DTIC Science & Technology

    2012-09-27

    required documents are in the package. A typical PR package includes a Supply Requisition, Purchase Description (PD)/Statement of Work (SOW)...service requests such as major interior decorations, painting, air conditioning, and bug infestations. In general, at least 90 percent of the

  6. Error-Monitoring in Response to Social Stimuli in Individuals with Higher-Functioning Autism Spectrum Disorder

    PubMed Central

    McMahon, Camilla M.; Henderson, Heather A.

    2014-01-01

    Error-monitoring, or the ability to recognize one's mistakes and implement behavioral changes to prevent further mistakes, may be impaired in individuals with Autism Spectrum Disorder (ASD). Children and adolescents (ages 9-19) with ASD (n = 42) and typical development (n = 42) completed two face processing tasks that required discrimination of either the gender or affect of standardized face stimuli. Post-error slowing and the difference in Error-Related Negativity amplitude between correct and incorrect responses (ERNdiff) were used to index error-monitoring ability. Overall, ERNdiff increased with age. On the Gender Task, individuals with ASD had a smaller ERNdiff than individuals with typical development; however, on the Affect Task, there were no significant diagnostic group differences on ERNdiff. Individuals with ASD may have ERN amplitudes similar to those observed in individuals with typical development in more social contexts compared to less social contexts due to greater consequences for errors, more effortful processing, and/or reduced processing efficiency in these contexts. Across all participants, more post-error slowing on the Affect Task was associated with better social cognitive skills. PMID:25066088
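
    As a sketch of how the two indices are typically computed (toy data and variable names assumed; this is not the authors' pipeline):

    ```python
    import numpy as np

    rt = np.array([520., 480., 610., 505., 650., 490.])   # response times (ms)
    correct = np.array([1, 1, 0, 1, 0, 1], dtype=bool)    # accuracy per trial
    erp = np.array([-6.2, -1.1, -8.0, -0.9, -7.5, -1.4])  # response-locked amplitude (uV)

    # Post-error slowing: mean RT on trials following an error minus mean RT
    # on trials following a correct response.
    pes = rt[1:][~correct[:-1]].mean() - rt[1:][correct[:-1]].mean()

    # ERNdiff: mean ERP amplitude on error trials minus mean amplitude on
    # correct trials, computed per participant.
    ern_diff = erp[~correct].mean() - erp[correct].mean()
    ```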

  7. Application of fuzzy logic to the control of wind tunnel settling chamber temperature

    NASA Technical Reports Server (NTRS)

    Gwaltney, David A.; Humphreys, Gregory L.

    1994-01-01

    The application of Fuzzy Logic Controllers (FLC's) to the control of nonlinear processes, typically controlled by a human operator, is a topic of much study. Recent application of a microprocessor-based FLC to the control of temperature processes in several wind tunnels has proven to be very successful. The control of temperature processes in the wind tunnels requires the ability to monitor temperature feedback from several points and to accommodate varying operating conditions in the wind tunnels. The FLC has an intuitive and easily configurable structure which incorporates the flexibility required to have such an ability. The design and implementation of the FLC is presented along with process data from the wind tunnels under automatic control.
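
    A minimal sketch of this style of controller, with triangular membership functions and a weighted-average defuzzifier, is shown below; the linguistic classes, ranges, and output levels are assumptions, not the NASA implementation:

    ```python
    def tri(x, a, b, c):
        """Triangular membership function: rises from a, peaks at b, falls to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def heater_command(error_K):
        """Map settling-chamber temperature error (setpoint - measured, in K)
        to a heater command in [0, 1]."""
        e = max(-7.5, min(7.5, error_K))   # clamp into the universe of discourse
        # Fuzzify: degree of membership in each linguistic class (ranges assumed).
        mu = {
            "cold": tri(e, 0.0, 5.0, 10.0),    # chamber below setpoint
            "ok":   tri(e, -2.0, 0.0, 2.0),
            "hot":  tri(e, -10.0, -5.0, 0.0),  # chamber above setpoint
        }
        # Rule consequents: one crisp heater level per class (values assumed).
        level = {"cold": 1.0, "ok": 0.5, "hot": 0.0}
        num = sum(mu[k] * level[k] for k in mu)
        den = sum(mu.values())
        return 0.5 if den == 0.0 else num / den

    print(heater_command(1.0))   # blends the "cold" and "ok" rules -> ~0.64
    ```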

  8. Simulated Single Tooth Bending of High Temperature Alloys

    NASA Technical Reports Server (NTRS)

    Handschuh, Robert F.; Burke, Christopher

    2012-01-01

    Future unmanned space missions will require mechanisms to operate at extreme conditions in order to be successful. In some of these mechanisms, very high gear reductions will be needed to permit very small motors to drive other components at low rotational speed with high output torque. Therefore gearing components are required that can meet the mission requirements. In mechanisms such as this, bending fatigue strength capacity of the gears is very important. The bending fatigue capacity of a high-temperature, nickel-based alloy typically used for turbine disks in gas turbine engines, and of two tool steel materials with high vanadium content, was compared to that of a typical aerospace alloy, AISI 9310. Test specimens were fabricated by electro-discharge machining without post-machining processing. Tests were run at 24 and at 490 C. As test temperature increased from 24 to 490 C, the bending fatigue strength was reduced by a factor of five.

  9. EVA Development and Verification Testing at NASA's Neutral Buoyancy Laboratory

    NASA Technical Reports Server (NTRS)

    Jairala, Juniper; Durkin, Robert

    2012-01-01

    As an early step in preparing for future EVAs, astronauts perform neutral buoyancy testing to develop and verify EVA hardware and operations. To date, neutral buoyancy demonstrations at NASA JSC's Sonny Carter Training Facility have primarily evaluated assembly and maintenance tasks associated with several elements of the ISS. With the retirement of the Space Shuttle, completion of ISS assembly, and introduction of commercial participants for human transportation into space, evaluations at the NBL will take on a new focus. In this session, Juniper Jairala briefly discussed the design of the NBL and, in more detail, described the requirements and process for performing a neutral buoyancy test, including typical hardware and support equipment requirements, personnel and administrative resource requirements, examples of ISS systems and operations that are evaluated, and typical operational objectives that are evaluated. Robert Durkin discussed the new and potential types of uses for the NBL, including those by non-NASA external customers.

  10. Mini-Satellites for Affordable Space Science

    NASA Astrophysics Data System (ADS)

    Phipps, Andy; da Silva Curiel, Alex; Gibbon, Dave; Richardson, Guy; Cropp, Alex; Sweeting, Martin, Sir

    Magnetospheric science missions are a key component of solar terrestrial physics programmes, charged with the unravelling of these fundamental processes. These missions require distributed science gathering in a wide variety of alternative orbits. Missions typically require constellations of high delta-v, formation-flying spacecraft; single launch vehicles are usually mandated. Typical missions baseline space-standard technology and standard communication and operations architectures, all driving up programme cost. By trading on the requirements, and applying prudent analysis of performance as well as selection of subsystems outside the traditional space range, most of the mission objectives can be met for a reduced overall mission cost. This paper describes Surrey's platform solution, which has been studied for a future NASA opportunity. It will emphasise SSTL's proven spacecraft engineering philosophies and the use of terrestrial commercial off-the-shelf technology in this demanding environment. This will lead to a cost-capped science mission, and extend the philosophy of affordable access to space beyond Low Earth Orbit.

  11. Comparative evaluation of urban storm water quality models

    NASA Astrophysics Data System (ADS)

    Vaze, J.; Chiew, Francis H. S.

    2003-10-01

    The estimation of urban storm water pollutant loads is required for the development of mitigation and management strategies to minimize impacts to receiving environments. Event pollutant loads are typically estimated using either regression equations or "process-based" water quality models. The relative merit of using regression models compared to process-based models is not clear. A modeling study is carried out here to evaluate the comparative ability of the regression equations and process-based water quality models to estimate event diffuse pollutant loads from impervious surfaces. The results indicate that, once calibrated, both the regression equations and the process-based model can estimate event pollutant loads satisfactorily. In fact, the loads estimated using the regression equation as a function of rainfall intensity and runoff rate are better than the loads estimated using the process-based model. Therefore, if only estimates of event loads are required, regression models should be used because they are simpler and require less data compared to process-based models.
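
    As an illustration of the regression alternative (the functional form and numbers are assumed, not taken from the study), an event-load model of the form L = a·I^b·V^c can be fit by ordinary least squares in log space:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 40
    I = rng.uniform(2, 50, n)      # event rainfall intensity (mm/h)
    V = rng.uniform(1, 100, n)     # event runoff volume (m^3)
    # Synthetic "observed" event loads (kg) with lognormal scatter.
    L = 0.8 * I**0.6 * V**0.9 * np.exp(0.1 * rng.standard_normal(n))

    # Fit log L = log a + b log I + c log V by ordinary least squares.
    A = np.column_stack([np.ones(n), np.log(I), np.log(V)])
    (log_a, b, c), *_ = np.linalg.lstsq(A, np.log(L), rcond=None)

    def predict_load(intensity, volume):
        """Predicted event pollutant load (kg) for a new storm."""
        return np.exp(log_a) * intensity**b * volume**c

    print(predict_load(10.0, 20.0))
    ```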

  12. Hardware development process for Human Research Facility applications

    NASA Astrophysics Data System (ADS)

    Bauer, Liz

    2000-01-01

    The simple goal of the Human Research Facility (HRF) is to conduct human research experiments on International Space Station (ISS) astronauts during long-duration missions. This is accomplished by providing integration and operation of the necessary hardware and software capabilities. A typical hardware development flow consists of five stages: functional inputs and requirements definition, market research, design life cycle through hardware delivery, crew training, and mission support. The purpose of this presentation is to guide the audience through the early hardware development process: requirement definition through selecting a development path. Specific HRF equipment is used to illustrate the hardware development paths.

  13. Perception of shapes targeting local and global processes in autism spectrum disorders.

    PubMed

    Grinter, Emma J; Maybery, Murray T; Pellicano, Elizabeth; Badcock, Johanna C; Badcock, David R

    2010-06-01

    Several researchers have found evidence for impaired global processing in the dorsal visual stream in individuals with autism spectrum disorders (ASDs). However, support for a similar pattern of visual processing in the ventral visual stream is less consistent. Critical to resolving the inconsistency is the assessment of local and global form processing ability. Within the visual domain, radial frequency (RF) patterns - shapes formed by sinusoidally varying the radius of a circle to add 'bumps' of a certain number to a circle - can be used to examine local and global form perception. Typically developing children and children with an ASD discriminated between circles and RF patterns that are processed either locally (RF24) or globally (RF3). Children with an ASD required greater shape deformation to identify RF3 shapes compared to typically developing children, consistent with difficulty in global processing in the ventral stream. No group difference was observed for RF24 shapes, suggesting intact local ventral-stream processing. These outcomes support the position that a deficit in global visual processing is present in ASDs, consistent with the notion of Weak Central Coherence.
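
    The stimuli have a standard closed form in the radial frequency literature (the symbols here are the conventional ones, assumed rather than quoted from the paper):

    ```latex
    r(\theta) = r_0 \left[ 1 + A \sin( \omega \theta + \varphi ) \right]
    % r_0: mean radius; A: modulation amplitude; \omega: radial frequency
    % (\omega = 3 for the globally pooled RF3 shape, \omega = 24 for the
    % locally processed RF24); \varphi: phase. A = 0 recovers the circle,
    % and the discrimination threshold is the smallest A at which the
    % pattern can be told apart from an unmodulated circle.
    ```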

  15. Military Medical Decision Support for Homeland Defense During Emergency

    DTIC Science & Technology

    2004-12-01

    abstraction hierarchy, three levels of information requirement for designing emergency training interface are recognized. These are epistemological ...support human decision making process is considered to be decision-centric. A typical decision-centric interface is supported by at least four design ...

  16. Space station microscopy: Beyond the box

    NASA Technical Reports Server (NTRS)

    Hunter, N. R.; Pierson, Duane L.; Mishra, S. K.

    1993-01-01

    Microscopy aboard Space Station Freedom poses many unique challenges for in-flight investigations. Disciplines such as material processing, plant and animal research, human research, environmental monitoring, health care, and biological processing have diverse microscope requirements. The typical microscope not only does not meet the comprehensive needs of these varied users, but also tends to require excessive crew time. To assess user requirements, a comprehensive survey was conducted among investigators with experiments requiring microscopy. The survey examined requirements such as light sources, objectives, stages, focusing systems, eye pieces, video accessories, etc. The results of this survey and the application of an Intelligent Microscope Imaging System (IMIS) may address these demands for efficient microscopy service in space. The proposed IMIS can accommodate multiple users with varied requirements, operate in several modes, reduce crew time needed for experiments, and take maximum advantage of the restrictive data/instruction transmission environment on Freedom.

  17. High Performance Computing Assets for Ocean Acoustics Research

    DTIC Science & Technology

    2016-11-18

    independently on processing units with access to a typically available amount of memory, say 16 or 32 gigabytes. Our models require each processor to...allow results to be obtained with limited amounts of memory available to individual processing units (with no time frame for successful completion...put into use. One file server computer to store simulation output has also been purchased. The first workstation has 28 CPU cores, dual-thread, (56

  18. Apollo experience report: The command and service module milestone review process

    NASA Technical Reports Server (NTRS)

    Brendle, H. L.; York, J. A.

    1974-01-01

    The sequence of the command and service module milestone review process is given, and the Customer Acceptance Readiness Review and Flight Readiness Review plans are presented. Contents of the System Summary Acceptance Documents for the two formal spacecraft reviews are detailed, and supplemental data required for presentation to the review boards are listed. Typical forms, correspondence, supporting documentation, and minutes of a board meeting are included.

  19. Variety in emotional life: within-category typicality of emotional experiences is associated with neural activity in large-scale brain networks

    PubMed Central

    Barrett, Lisa Feldman; Barsalou, Lawrence W.

    2015-01-01

    The tremendous variability within categories of human emotional experience receives little empirical attention. We hypothesized that atypical instances of emotion categories (e.g. pleasant fear of thrill-seeking) would be processed less efficiently than typical instances of emotion categories (e.g. unpleasant fear of violent threat) in large-scale brain networks. During a novel fMRI paradigm, participants immersed themselves in scenarios designed to induce atypical and typical experiences of fear, sadness or happiness (scenario immersion), and then focused on and rated the pleasant or unpleasant feeling that emerged (valence focus) in most trials. As predicted, reliably greater activity in the ‘default mode’ network (including medial prefrontal cortex and posterior cingulate) was observed for atypical (vs typical) emotional experiences during scenario immersion, suggesting atypical instances require greater conceptual processing to situate the socio-emotional experience. During valence focus, reliably greater activity was observed for atypical (vs typical) emotional experiences in the ‘salience’ network (including anterior insula and anterior cingulate), suggesting atypical instances place greater demands on integrating shifting body signals with the sensory and social context. Consistent with emerging psychological construction approaches to emotion, these findings demonstrate that it is important to study the variability within common categories of emotional experience. PMID:24563528

  20. Improved Assembly for Gas Shielding During Welding or Brazing

    NASA Technical Reports Server (NTRS)

    Gradl, Paul; Baker, Kevin; Weeks, Jack

    2009-01-01

    An improved assembly for inert-gas shielding of a metallic joint is designed to be usable during any of a variety of both laser-based and traditional welding and brazing processes. The basic purpose of this assembly or of a typical prior related assembly is to channel the flow of a chemically inert gas to a joint to prevent environmental contamination of the joint during the welding or brazing process and, if required, to accelerate cooling upon completion of the process.

  1. The Goal-Based Scenario Builder: Experiences with Novice Instructional Designers.

    ERIC Educational Resources Information Center

    Bell, Benjamin; Korcuska, Michael

    Creating educational software generally requires a great deal of computer expertise, and as a result, educators lacking such knowledge have largely been excluded from the design process. Recently, researchers have been designing tools for automating some aspects of building instructional applications. These tools typically aim for generality,…

  2. Business Models for Training and Performance Improvement Departments

    ERIC Educational Resources Information Center

    Carliner, Saul

    2004-01-01

    Although typically applied to entire enterprises, the concept of business models applies to training and performance improvement groups. Business models are "the method by which firm[s] build and use [their] resources to offer ... value." Business models affect the types of projects, services offered, skills required, business processes, and type of…

  3. When Time Makes a Difference: Addressing Ergodicity and Complexity in Education

    ERIC Educational Resources Information Center

    Koopmans, Matthijs

    2015-01-01

    The detection of complexity in behavioral outcomes often requires an estimation of their variability over a prolonged time spectrum to assess processes of stability and transformation. Conventional scholarship typically relies on time-independent measures, "snapshots", to analyze those outcomes, assuming that group means and their…

  4. Friction Stir Welding Development at NASA, Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    McGill, Preston; Gentz, Steve (Technical Monitor)

    2001-01-01

    Friction stir welding (FSW) is a solid-state process that can be used to join materials without melting. The process was invented by The Welding Institute (TWI), Cambridge, England. Friction stir welding exhibits several advantages over fusion welding in that it produces welds with fewer defects and higher joint efficiency and is capable of joining alloys that are generally considered non-weldable with a fusion weld process. In 1994, NASA-Marshall began collaborating with TWI to transform FSW from a laboratory curiosity to a viable metal joining process suitable for manufacturing hardware. While teamed with TWI, NASA-Marshall began its own FSW research and development effort to investigate possible aerospace applications for the FSW process. The work involved nearly all aspects of FSW development, including process modeling, scale-up issues, applications to advanced materials and development of tooling to use FSW on components of the Space Shuttle with particular emphasis on aluminum tanks. The friction stir welding process involves spinning a pin-tool at an appropriate speed, plunging it into the base metal pieces to be joined, and then translating it along the joint of the work pieces. In aluminum alloys the rotating speed typically ranges from 200 to 400 revolutions per minute and the translation speed is approximately two to five inches per minute. The pin-tool is inserted at a small lead angle from the axis normal to the work piece and requires significant loading along the axis of the tool. An anvil or reaction structure is required behind the welded material to react the load along the axis of the pin-tool. The process requires no external heat input, filler material, protective shielding gas or inert atmosphere typical of fusion weld processes. The FSW solid-state weld process has resulted in aluminum welds with significantly higher strengths, higher joint efficiencies and fewer defects than fusion welds used to join similar alloys.
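
    A quick derived figure from the parameter ranges quoted above (the "advance per revolution" metric is our illustrative addition, not taken from the report):

        # Advance per revolution = translation speed / rotation speed,
        # using the FSW parameter ranges quoted above for aluminum alloys.
        rpm_range = (200.0, 400.0)          # pin-tool rotation, rev/min
        feed_range = (2.0, 5.0)             # translation speed, in/min

        lo = feed_range[0] / rpm_range[1]   # slowest feed, fastest rotation
        hi = feed_range[1] / rpm_range[0]   # fastest feed, slowest rotation
        print(f"advance per revolution: {lo:.4f} to {hi:.4f} in/rev")
        # -> roughly 0.005 to 0.025 inches of tool travel per revolution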

  5. Note: Quasi-real-time analysis of dynamic near field scattering data using a graphics processing unit

    NASA Astrophysics Data System (ADS)

    Cerchiari, G.; Croccolo, F.; Cardinaux, F.; Scheffold, F.

    2012-10-01

    We present an implementation of the analysis of dynamic near field scattering (NFS) data using a graphics processing unit. We introduce an optimized data management scheme thereby limiting the number of operations required. Overall, we reduce the processing time from hours to minutes, for typical experimental conditions. Previously the limiting step in such experiments, the processing time is now comparable to the data acquisition time. Our approach is applicable to various dynamic NFS methods, including shadowgraph, Schlieren and differential dynamic microscopy.
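
    As a hedged illustration of one of the named methods, differential dynamic microscopy computes the image structure function D(q, Δt), the time-averaged power spectrum of frame differences. The sketch below is a minimal CPU version in NumPy under our own assumptions, not the authors' GPU implementation:

        import numpy as np

        def ddm_structure_function(frames, lag, n_bins=20):
            """Azimuthally averaged image structure function D(q) at one lag.

            frames: 3-D array (time, ny, nx) of image intensities.
            """
            diffs = frames[lag:] - frames[:-lag]            # I(t + lag) - I(t)
            power = np.abs(np.fft.fft2(diffs)) ** 2         # |FFT of difference|^2
            mean_power = power.mean(axis=0)                 # average over frame pairs

            # Azimuthal average over wavevector magnitude.
            ny, nx = mean_power.shape
            qy = np.fft.fftfreq(ny)[:, None]
            qx = np.fft.fftfreq(nx)[None, :]
            qmag = np.hypot(qx, qy)
            bins = np.linspace(0.0, qmag.max(), n_bins)
            idx = np.digitize(qmag.ravel(), bins) - 1
            sums = np.bincount(idx, weights=mean_power.ravel(), minlength=n_bins)
            counts = np.bincount(idx, minlength=n_bins)
            return bins, sums / np.maximum(counts, 1)

        # Example on synthetic frames:
        frames = np.random.rand(20, 64, 64)
        q, d = ddm_structure_function(frames, lag=1)

    Repeating this over a range of lags and fitting the resulting D(q, Δt) curves is what then yields dynamical quantities such as diffusion coefficients.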

  6. Autistic traits and social anxiety predict differential performance on social cognitive tasks in typically developing young adults

    PubMed Central

    Burk, Joshua A.; Fleckenstein, Katarina; Kozikowski, C. Teal

    2018-01-01

    The current work examined the unique contribution that autistic traits and social anxiety have on tasks examining attention and emotion processing. In Study 1, 119 typically-developing college students completed a flanker task assessing the control of attention to target faces and away from distracting faces during emotion identification. In Study 2, 208 typically-developing college students performed a visual search task which required identification of whether a series of 8 or 16 emotional faces depicted the same or different emotions. Participants with more self-reported autistic traits performed more slowly on the flanker task in Study 1 than those with fewer autistic traits when stimuli depicted complex emotions. In Study 2, participants higher in social anxiety performed less accurately on trials showing all complex faces; participants with autistic traits showed no differences. These studies suggest that traits related to autism and to social anxiety differentially impact social cognitive processing. PMID:29596523

  7. Combination of process and vibration data for improved condition monitoring of industrial systems working under variable operating conditions

    NASA Astrophysics Data System (ADS)

    Ruiz-Cárcel, C.; Jaramillo, V. H.; Mba, D.; Ottewill, J. R.; Cao, Y.

    2016-01-01

    The detection and diagnosis of faults in industrial processes is a very active field of research due to the reduction in maintenance costs achieved by the implementation of process monitoring algorithms such as Principal Component Analysis, Partial Least Squares or more recently Canonical Variate Analysis (CVA). Typically the condition of rotating machinery is monitored separately using vibration analysis or other specific techniques. Conventional vibration-based condition monitoring techniques are based on the tracking of key features observed in the measured signal. Typically steady-state loading conditions are required to ensure consistency between measurements. In this paper, a technique based on merging process and vibration data is proposed with the objective of improving the detection of mechanical faults in industrial systems working under variable operating conditions. The capabilities of CVA for detection and diagnosis of faults were tested using experimental data acquired from a compressor test rig where different process faults were introduced. Results suggest that the combination of process and vibration data can effectively improve the detectability of mechanical faults in systems working under variable operating conditions.
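
    A minimal sketch of the data-merging idea under our own assumptions: the paper uses CVA, but a simpler PCA model with Hotelling's T² statistic stands in here, and all variable names and sizes are illustrative:

        import numpy as np

        def fit_pca_monitor(X_train, n_components=3):
            """Fit a PCA fault-detection model on healthy-condition data.

            X_train: (samples, variables) matrix of merged process variables
            (pressures, temperatures, flows) and vibration features (e.g. RMS,
            kurtosis per bearing), standardized column-wise below.
            """
            mu = X_train.mean(axis=0)
            sd = X_train.std(axis=0)
            Z = (X_train - mu) / sd
            # Principal directions from the SVD of the standardized data.
            _, s, Vt = np.linalg.svd(Z, full_matrices=False)
            P = Vt[:n_components].T                       # loadings
            lam = (s[:n_components] ** 2) / (len(Z) - 1)  # component variances
            return mu, sd, P, lam

        def t2_statistic(x, mu, sd, P, lam):
            """Hotelling's T^2 for one new sample; large values flag a fault."""
            t = P.T @ ((x - mu) / sd)
            return float(np.sum(t ** 2 / lam))

        # Healthy training data: 500 samples x 8 merged variables (synthetic).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 8))
        model = fit_pca_monitor(X)
        print(t2_statistic(X[0], *model))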

  8. Managing the Nuclear Fuel Cycle: Policy Implications of Expanding Global Access to Nuclear Power

    DTIC Science & Technology

    2007-11-01

    critical aspect of the nuclear fuel cycle for the United States, where longstanding nonproliferation policy discouraged commercial nuclear fuel...perhaps the most critical question in this decade for strengthening the nuclear nonproliferation regime: how can access to sensitive fuel cycle...process can take advantage of the slight difference in atomic mass between 235U and 238U. The typical enrichment process requires about 10 lbs of uranium

  9. Modeling of Inelastic Collisions in a Multifluid Plasma: Excitation and Deexcitation

    DTIC Science & Technology

    2016-05-31

    AVAILABILITY STATEMENT Approved for public release; distribution unlimited 13. SUPPLEMENTARY NOTES For publication in Physics of Plasma Vol #22, Issue...the fundamental physical processes may be individually known, it is not always clear how their combination affects the overall operation, or at what...arises from the complexity of the physical processes needed to be captured in the model. The required level of detail of the CR model is typically not

  10. Modeling of Inelastic Collisions in a Multifluid Plasma: Excitation and Deexcitation (Preprint)

    DTIC Science & Technology

    2015-06-01

    AVAILABILITY STATEMENT Approved for public release; distribution unlimited 13. SUPPLEMENTARY NOTES For publication in Physics of Plasma PA Case...the fundamental physical processes may be individually known, it is not always clear how their combination affects the overall operation, or at what...arises from the complexity of the physical processes needed to be captured in the model. The required level of detail of the CR model is typically not

  11. The Effects of Visual Degradation on Attended Objects and the Ability to Process Unattended Objects within the Visual Array

    DTIC Science & Technology

    2010-09-01

    field at once (e.g., Biederman, Blickle, Teitelbaum, & Klatsky, 1988), and objects of interest typically receive the attention required to recognize them...field (Biederman & Cooper, 1991) and image size changes (Biederman & Cooper, 1992). Yet, only attended objects are recognized when mirror images...left-right reversals) occur (Biederman & Cooper, 1991). Due to these results, Hummel (2001) proposed that attended images are processed by both

  12. Maneuver Recovery Analysis for the Magnetospheric Multiscale Mission

    NASA Technical Reports Server (NTRS)

    Gramling, Cheryl; Carpenter, Russell; Volle, Michael; Lee, Taesul; Long, Anne

    2007-01-01

    The use of spacecraft formations creates new and more demanding requirements for orbit determination accuracy. In addition to absolute navigation requirements, there are typically relative navigation requirements that are based on the size or shape of the formation. The difficulty in meeting these requirements is related to the relative dynamics of the spacecraft orbits and the frequency of the formation maintenance maneuvers. This paper examines the effects of bi-weekly formation maintenance maneuvers on the absolute and relative orbit determination accuracy for the four-spacecraft Magnetospheric Multiscale (MMS) formation. Results are presented from high-fidelity simulations that include the effects of realistic orbit determination errors in the maneuver planning process. Solutions are determined using a high-accuracy extended Kalman filter designed for onboard navigation. Three different solutions are examined, considering the effects of process noise and measurement rate on the solutions.
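
    For orientation, the generic extended Kalman filter recursion underlying such navigation solutions is sketched below (a textbook skeleton, not the MMS flight filter; the dynamics and measurement functions are placeholders supplied by the caller):

        import numpy as np

        def ekf_step(x, P, z, f, F, h, H, Q, R):
            """One predict/update cycle of an extended Kalman filter.

            x, P : prior state estimate and covariance
            z    : new measurement vector
            f, h : nonlinear dynamics and measurement functions
            F, H : their Jacobians evaluated at the current estimate
            Q, R : process and measurement noise covariances
            """
            # Predict
            x_pred = f(x)
            F_k = F(x)
            P_pred = F_k @ P @ F_k.T + Q
            # Update
            H_k = H(x_pred)
            y = z - h(x_pred)                             # innovation
            S = H_k @ P_pred @ H_k.T + R                  # innovation covariance
            K = P_pred @ H_k.T @ np.linalg.inv(S)         # Kalman gain
            x_new = x_pred + K @ y
            P_new = (np.eye(len(x)) - K @ H_k) @ P_pred
            return x_new, P_new

        # Trivial 1-D example: identity dynamics, direct measurement.
        f = lambda x: x; F = lambda x: np.eye(1)
        h = lambda x: x; H = lambda x: np.eye(1)
        x, P = ekf_step(np.zeros(1), np.eye(1), np.array([0.5]),
                        f, F, h, H, Q=0.01 * np.eye(1), R=0.1 * np.eye(1))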

  13. A General Audiovisual Temporal Processing Deficit in Adult Readers With Dyslexia.

    PubMed

    Francisco, Ana A; Jesse, Alexandra; Groen, Margriet A; McQueen, James M

    2017-01-01

    Because reading is an audiovisual process, reading impairment may reflect an audiovisual processing deficit. The aim of the present study was to test the existence and scope of such a deficit in adult readers with dyslexia. We tested 39 typical readers and 51 adult readers with dyslexia on their sensitivity to the simultaneity of audiovisual speech and nonspeech stimuli, their time window of audiovisual integration for speech (using incongruent /aCa/ syllables), and their audiovisual perception of phonetic categories. Adult readers with dyslexia showed less sensitivity to audiovisual simultaneity than typical readers for both speech and nonspeech events. We found no differences between readers with dyslexia and typical readers in the temporal window of integration for audiovisual speech or in the audiovisual perception of phonetic categories. The results suggest an audiovisual temporal deficit in dyslexia that is not specific to speech-related events. But the differences found for audiovisual temporal sensitivity did not translate into a deficit in audiovisual speech perception. Hence, there seems to be a hiatus between simultaneity judgment and perception, suggesting a multisensory system that uses different mechanisms across tasks. Alternatively, it is possible that the audiovisual deficit in dyslexia is only observable when explicit judgments about audiovisual simultaneity are required.

  14. Home Language Survey Data Quality Self-Assessment. REL 2017-198

    ERIC Educational Resources Information Center

    Henry, Susan F.; Mello, Dan; Avery, Maria-Paz; Parker, Caroline; Stafford, Erin

    2017-01-01

    Most state departments of education across the United States recommend or require that districts use a home language survey as the first step in a multistep process of identifying students who qualify for English learner student services. School districts typically administer the home language survey to parents and guardians during a student's…

  15. Mapping Next Generation Learning Spaces as a Designed Quality Enhancement Process

    ERIC Educational Resources Information Center

    Leonard, Simon N.; Fitzgerald, Robert N.; Bacon, Matt; Munnerley, Danny

    2017-01-01

    The learning spaces of higher education are changing with collaborative, agile and technology-enabled spaces ever more popular. Despite the massive investment required to create these new spaces, current quality systems are poorly placed to account for the value they create. Such learning spaces are typically popular with students but the impact…

  16. Easy Ways to Promote Inquiry in a Laboratory Course: The Power of Student Questions

    ERIC Educational Resources Information Center

    Polacek, Kelly Myer; Keeling, Elena Levine

    2005-01-01

    To teach students to think like scientists, the authors modified their laboratory course to include regular opportunities for student practice of inquiry and the scientific process. Their techniques are simple; they can be implemented without rewriting lab manuals, require little additional grading beyond typical lab reports, and are applicable…

  17. Tying Theory To Practice: Cognitive Aspects of Computer Interaction in the Design Process.

    ERIC Educational Resources Information Center

    Mikovec, Amy E.; Dake, Dennis M.

    The new medium of computer-aided design requires changes to the creative problem-solving methodologies typically employed in the development of new visual designs. Most theoretical models of creative problem-solving suggest a linear progression from preparation and incubation to some type of evaluative study of the "inspiration." These…

  18. Phono-Orthographic Interaction and Attentional Control in Children with Reading Disabilities

    ERIC Educational Resources Information Center

    Cone, Nadia Elise

    2012-01-01

    Fluent reading requires the effective integration of orthographic and phonological information in addition to intact processing of either type. The current study used a rhyme decision task to examine phono-orthographic interaction in children with reading disabilities (RD) as compared to typically achieving (TA) children. Word pairs were presented…

  19. Sources of Cognitive Inflexibility in Set-Shifting Tasks: Insights into Developmental Theories from Adult Data

    ERIC Educational Resources Information Center

    Dick, Anthony Steven

    2012-01-01

    Two experiments examined processes underlying cognitive inflexibility in set-shifting tasks typically used to assess the development of executive function in children. Adult participants performed a Flexible Item Selection Task (FIST) that requires shifting from categorizing by one dimension (e.g., color) to categorizing by a second orthogonal…

  20. A Nationwide Survey of State-Mandated Evaluation Practices for Domestic Violence Agencies

    ERIC Educational Resources Information Center

    Riger, Stephanie; Staggs, Susan L.

    2011-01-01

    Many agencies serving survivors of domestic violence are required to evaluate their services. Three possible evaluation strategies include: a) process measurement, which typically involves a frequency count of agency activities, such as the number of counseling hours given; b) outcome evaluation, which measures the impact of agency activities on…

  1. Deliberate Laterality Practice Facilitates Sensory-Motor Processing in Developing Children

    ERIC Educational Resources Information Center

    Pedersen, Scott J.

    2014-01-01

    Background: The innate ability for typically developing children to attain developmental motor milestones early in life has been a thoroughly researched area of inquiry. Nonetheless, as children grow and are required to perform more complex motor skills in order to experience success in physical activity and sport pursuits, the range of…

  2. The Evolution of a Flipped Classroom: Evidence-Based Recommendations

    ERIC Educational Resources Information Center

    Velegol, Stephanie Butler; Zappe, Sarah E.; Mahoney, Emily

    2015-01-01

    Engineering students benefit from an active and interactive classroom environment where they can be guided through the problem solving process. Typically faculty members spend class time presenting the technical content required to solve problems, leaving students to apply this knowledge and problem solve on their own at home. There has recently…

  3. Preschool-Aged Children Have Difficulty Constructing and Interpreting Simple Utterances Composed of Graphic Symbols

    ERIC Educational Resources Information Center

    Sutton, Ann; Trudeau, Natacha; Morford, Jill; Rios, Monica; Poirier, Marie-Andree

    2010-01-01

    Children who require augmentative and alternative communication (AAC) systems while they are in the process of acquiring language face unique challenges because they use graphic symbols for communication. In contrast to the situation of typically developing children, they use different modalities for comprehension (auditory) and expression…

  4. A system level model for preliminary design of a space propulsion solid rocket motor

    NASA Astrophysics Data System (ADS)

    Schumacher, Daniel M.

    Preliminary design of space propulsion solid rocket motors entails a combination of components and subsystems. Expert design tools exist to find near-optimal performance of subsystems and components. Conversely, there is no system level preliminary design process for space propulsion solid rocket motors that is capable of synthesizing customer requirements into a high utility design for the customer. The preliminary design process for space propulsion solid rocket motors typically builds on existing designs and pursues a feasible rather than the most favorable design. Classical optimization is an extremely challenging method when dealing with the complex behavior of an integrated system. The complexity and combinations of system configurations make the number of design parameters to be traded off unreasonable when manual techniques are used. Existing multi-disciplinary optimization approaches generally address estimating ratios and correlations rather than utilizing mathematical models. The developed system level model utilizes the Genetic Algorithm to perform the necessary population searches to efficiently replace the human iterations required during a typical solid rocket motor preliminary design. This research augments, automates, and increases the fidelity of the existing preliminary design process for space propulsion solid rocket motors. The system level aspect of this preliminary design process, and the ability to synthesize space propulsion solid rocket motor requirements into a near-optimal design, are achievable. The process of developing the motor performance estimate and the system level model of a space propulsion solid rocket motor is described in detail. The results of this research indicate that the model is valid for use and able to manage a very large number of variable inputs and constraints towards the pursuit of the best possible design.
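
    To make the genetic-algorithm search concrete, a generic real-coded sketch follows under our own assumptions; the dissertation's actual design encoding, objective function, and constraints are not reproduced here:

        import random

        def genetic_optimize(fitness, bounds, pop_size=50, generations=200,
                             mutation_rate=0.1):
            """Minimal real-coded genetic algorithm (maximizes `fitness`).

            bounds: list of (low, high) tuples, one per design variable
            (e.g. grain outer radius, throat diameter, chamber pressure).
            """
            def random_design():
                return [random.uniform(lo, hi) for lo, hi in bounds]

            pop = [random_design() for _ in range(pop_size)]
            for _ in range(generations):
                scored = sorted(pop, key=fitness, reverse=True)
                parents = scored[: pop_size // 2]           # truncation selection
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, len(bounds))  # one-point crossover
                    child = a[:cut] + b[cut:]
                    if random.random() < mutation_rate:     # uniform mutation
                        i = random.randrange(len(bounds))
                        child[i] = random.uniform(*bounds[i])
                    children.append(child)
                pop = parents + children
            return max(pop, key=fitness)

        # Toy objective: prefer designs near a target aggregate value (illustrative).
        best = genetic_optimize(lambda d: -abs(sum(d) - 10.0),
                                bounds=[(0.0, 5.0)] * 4)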

  5. Viscous Dissipation and Heat Conduction in Binary Neutron-Star Mergers.

    PubMed

    Alford, Mark G; Bovard, Luke; Hanauske, Matthias; Rezzolla, Luciano; Schwenzer, Kai

    2018-01-26

    Inferring the properties of dense matter is one of the most exciting prospects from the measurement of gravitational waves from neutron star mergers. However, it requires reliable numerical simulations that incorporate viscous dissipation and energy transport as these can play a significant role in the survival time of the post-merger object. We calculate time scales for typical forms of dissipation and find that thermal transport and shear viscosity will not be important unless neutrino trapping occurs, which requires temperatures above 10 MeV and gradients over length scales of 0.1 km or less. On the other hand, if direct-Urca processes remain suppressed, leaving modified-Urca processes to establish flavor equilibrium, then bulk viscous dissipation could provide significant damping to density oscillations right after merger. When comparing with data from state-of-the-art merger simulations, we find that the bulk viscosity takes values close to its resonant maximum in a typical merger, motivating a more careful assessment of the role of bulk viscous dissipation in the gravitational-wave signal from merging neutron stars.

  6. Viscous Dissipation and Heat Conduction in Binary Neutron-Star Mergers

    NASA Astrophysics Data System (ADS)

    Alford, Mark G.; Bovard, Luke; Hanauske, Matthias; Rezzolla, Luciano; Schwenzer, Kai

    2018-01-01

    Inferring the properties of dense matter is one of the most exciting prospects from the measurement of gravitational waves from neutron star mergers. However, it requires reliable numerical simulations that incorporate viscous dissipation and energy transport as these can play a significant role in the survival time of the post-merger object. We calculate time scales for typical forms of dissipation and find that thermal transport and shear viscosity will not be important unless neutrino trapping occurs, which requires temperatures above 10 MeV and gradients over length scales of 0.1 km or less. On the other hand, if direct-Urca processes remain suppressed, leaving modified-Urca processes to establish flavor equilibrium, then bulk viscous dissipation could provide significant damping to density oscillations right after merger. When comparing with data from state-of-the-art merger simulations, we find that the bulk viscosity takes values close to its resonant maximum in a typical merger, motivating a more careful assessment of the role of bulk viscous dissipation in the gravitational-wave signal from merging neutron stars.

  7. Accounting for the influence of salt water in the physics required for processing underwater UXO EMI signals

    NASA Astrophysics Data System (ADS)

    Shubitidze, Fridon; Barrowes, Benjamin E.; Shamatava, Irma; Sigman, John; O'Neill, Kevin A.

    2018-05-01

    Processing electromagnetic induction signals from subsurface targets, for purposes of discrimination, requires accurate physical models. To date, successful approaches for on-land cases have entailed advanced modeling of responses by the targets themselves, with quite adequate treatment of instruments as well. Responses from the environment were typically slight and/or were treated very simply. When objects are immersed in saline solutions, however, more sophisticated modeling of the diffusive EMI physics in the environment is required. One needs to account for the response of the environment itself as well as the environment's frequency and time-dependent effects on both primary and secondary fields, from sensors and targets, respectively. Here we explicate the requisite physics and identify its effects quantitatively via analytical, numerical, and experimental investigations. Results provide a path for addressing the quandaries posed by previous underwater measurements and indicate how the environmental physics may be included in more successful processing.

  8. Design analysis of levitation facility for space processing applications. [Skylab program, space shuttles

    NASA Technical Reports Server (NTRS)

    Frost, R. T.; Kornrumpf, W. P.; Napaluch, L. J.; Harden, J. D., Jr.; Walden, J. P.; Stockhoff, E. H.; Wouch, G.; Walker, L. H.

    1974-01-01

    Containerless processing facilities for the space laboratory and space shuttle are defined. Materials process examples representative of the most severe requirements for the facility in terms of electrical power, radio frequency equipment, and the use of an auxiliary electron beam heater were used to discuss matters having the greatest effect upon the space shuttle pallet payload interfaces and envelopes. Improved weight, volume, and efficiency estimates for the RF generating equipment were derived. Results are particularly significant because of the reduced requirements for heat rejection from electrical equipment, one of the principal envelope problems for shuttle pallet payloads. It is shown that although experiments on containerless melting of high-temperature refractory materials make it desirable to consider the highest peak powers which can be made available on the pallet, total energy requirements are kept relatively low by the very fast processing times typical of containerless experiments, which allows consideration of heat rejection capabilities lower than peak power demand if energy storage in system heat capacitances is considered. Batteries are considered to avoid a requirement for fuel cells capable of furnishing this brief peak power demand.

  9. Herb-drug interactions. Interactions between saw palmetto and prescription medications.

    PubMed

    Bressler, Rubin

    2005-11-01

    Patients over age 50 typically present with one chronic disease per decade. Each chronic disease typically requires long-term drug therapy, meaning most older patients require several drugs to maintain health. Simultaneously, use of complementary and alternative medicine (CAM) has increased in the United States in the last 20 years, reaching 36% in 2002; herbal medicine use accounts for approximately 22% of all CAM use. Older adults often add herbal medicines to prescription medications, yet do not always inform their physicians. The drug metabolizing enzyme systems process all compounds foreign to the body, including prescription and herbal medications. Therefore use of both medicinals simultaneously has a potential for adverse interactions. This review, which discusses saw palmetto, is the last in a series covering the documented interactions among the top 5 efficacious herbal medicines and prescription drugs.

  10. Variety in emotional life: within-category typicality of emotional experiences is associated with neural activity in large-scale brain networks.

    PubMed

    Wilson-Mendenhall, Christine D; Barrett, Lisa Feldman; Barsalou, Lawrence W

    2015-01-01

    The tremendous variability within categories of human emotional experience receives little empirical attention. We hypothesized that atypical instances of emotion categories (e.g. pleasant fear of thrill-seeking) would be processed less efficiently than typical instances of emotion categories (e.g. unpleasant fear of violent threat) in large-scale brain networks. During a novel fMRI paradigm, participants immersed themselves in scenarios designed to induce atypical and typical experiences of fear, sadness or happiness (scenario immersion), and then focused on and rated the pleasant or unpleasant feeling that emerged (valence focus) in most trials. As predicted, reliably greater activity in the 'default mode' network (including medial prefrontal cortex and posterior cingulate) was observed for atypical (vs typical) emotional experiences during scenario immersion, suggesting atypical instances require greater conceptual processing to situate the socio-emotional experience. During valence focus, reliably greater activity was observed for atypical (vs typical) emotional experiences in the 'salience' network (including anterior insula and anterior cingulate), suggesting atypical instances place greater demands on integrating shifting body signals with the sensory and social context. Consistent with emerging psychological construction approaches to emotion, these findings demonstrate that it is important to study the variability within common categories of emotional experience. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  11. Superior memory efficiency of quantum devices for the simulation of continuous-time stochastic processes

    NASA Astrophysics Data System (ADS)

    Elliott, Thomas J.; Gu, Mile

    2018-03-01

    Continuous-time stochastic processes pervade everyday experience, and the simulation of models of these processes is of great utility. Classical models of systems operating in continuous-time must typically track an unbounded amount of information about past behaviour, even for relatively simple models, enforcing limits on precision due to the finite memory of the machine. However, quantum machines can require less information about the past than even their optimal classical counterparts to simulate the future of discrete-time processes, and we demonstrate that this advantage extends to the continuous-time regime. Moreover, we show that this reduction in the memory requirement can be unboundedly large, allowing for arbitrary precision even with a finite quantum memory. We provide a systematic method for finding superior quantum constructions, and a protocol for analogue simulation of continuous-time renewal processes with a quantum machine.

  12. Managing the Nuclear Fuel Cycle: Policy Implications of Expanding Global Access to Nuclear Power

    DTIC Science & Technology

    2008-09-03

    Spent nuclear fuel disposal has remained the most critical aspect of the nuclear fuel cycle for the United States, where longstanding nonproliferation...inalienable right and by and large, neither have U.S. government officials. However, the case of Iran raises perhaps the most critical question in...the enrichment process can take advantage of the slight difference in atomic mass between 235U and 238U. The typical enrichment process requires

  13. Managing the Nuclear Fuel Cycle: Policy Implications of Expanding Global Access to Nuclear Power

    DTIC Science & Technology

    2010-03-05

    However, the case of Iran raises perhaps the most critical question in this decade for strengthening the nuclear nonproliferation regime: How can...enrichment process can take advantage of the slight difference in atomic mass between 235U and 238U. The typical enrichment process requires about 10 lbs of...neutrons but can induce fission in all actinides, including all plutonium isotopes. Therefore, nuclear fuel for a fast reactor must have a higher

  14. Overview of the production of sintered SiC optics and optical sub-assemblies

    NASA Astrophysics Data System (ADS)

    Williams, S.; Deny, P.

    2005-08-01

    The following is an overview of sintered silicon carbide (SSiC) material properties and the processing requirements for manufacturing components for advanced-technology optical systems. The overview will compare SSiC material properties to those of materials typically used for optics and optical structures. In addition, it will review, step by step, the manufacturing processes required to produce optical components. The process overview will illustrate the current manufacturing process and concepts for expanding process size capability, and will include information on the substantial capital equipment employed in the manufacturing of SSiC. This paper will also review common in-process inspection methodology and design rules. The design rules are used to improve production yield, minimize cost, and maximize the inherent benefits of SSiC for optical systems. Optimizing optical system designs for an SSiC manufacturing process will allow systems designers to utilize SSiC as a low-risk, cost-competitive, and fast-cycle-time technology for next-generation optical systems.

  15. A nationwide survey of state-mandated evaluation practices for domestic violence agencies.

    PubMed

    Riger, Stephanie; Staggs, Susan L

    2011-01-01

    Many agencies serving survivors of domestic violence are required to evaluate their services. Three possible evaluation strategies include: a) process measurement, which typically involves a frequency count of agency activities, such as the number of counseling hours given; b) outcome evaluation, which measures the impact of agency activities on clients, such as increased understanding of the dynamics of abuse; or c) performance measurement, which assesses the extent to which agencies achieve their stated goals. Findings of a telephone survey of state funders of domestic violence agencies in the United States revealed that most states (67%) require only process measurement, while fewer than 10% require performance measurement. Most (69%) funders reported satisfaction with their evaluation strategy and emphasized the need for involvement of all stakeholders, especially grantees, in developing an evaluation.

  16. Financing biotechnology projects: lender due diligence requirements and the role of independent technical consultants.

    PubMed

    Keller, J B; Plath, P B

    1999-01-01

    An increasing number of biotechnology projects are being brought to commercialization using conventional structured finance sources, which have traditionally only been available to proven technologies and primary industries. Attracting and securing competitive cost financing from mainstream lenders, however, will require the sponsor of a new technology or process to undergo a greater level of due diligence. The specific areas and intensity of investigation typically required by lenders in order to secure long-term financing for biotechnology-based manufacturing systems are reviewed. The processes for evaluating the adequacy of prior laboratory testing and pilot plant demonstrations are discussed. Particular emphasis is given to scale-up considerations and the ability of the proposed facility design to accommodate significant modifications, in the event that scale-up problems are encountered.

  17. Low-cost data analysis systems for processing multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Whitely, S. L.

    1976-01-01

    The basic hardware and software requirements are described for four low-cost analysis systems for computer-generated land use maps. The data analysis systems consist of an image display system, a small digital computer, and an output recording device. Software is described together with some of the display and recording devices, and typical costs are cited. Computer requirements are given, and two approaches are described for converting black-white film and electrostatic printer output to inexpensive color output products. Examples of output products are shown.

  18. Multivariate statistical monitoring as applied to clean-in-place (CIP) and steam-in-place (SIP) operations in biopharmaceutical manufacturing.

    PubMed

    Roy, Kevin; Undey, Cenk; Mistretta, Thomas; Naugle, Gregory; Sodhi, Manbir

    2014-01-01

    Multivariate statistical process monitoring (MSPM) is becoming increasingly utilized to further enhance process monitoring in the biopharmaceutical industry. MSPM can play a critical role when there are many measurements and these measurements are highly correlated, as is typical for many biopharmaceutical operations. Specifically, for processes such as cleaning-in-place (CIP) and steaming-in-place (SIP, also known as sterilization-in-place), control systems typically oversee the execution of the cycles, and verification of the outcome is based on offline assays. These offline assays add to delays and corrective actions may require additional setup times. Moreover, this conventional approach does not take interactive effects of process variables into account and cycle optimization opportunities as well as salient trends in the process may be missed. Therefore, more proactive and holistic online continued verification approaches are desirable. This article demonstrates the application of real-time MSPM to processes such as CIP and SIP with industrial examples. The proposed approach has significant potential for facilitating enhanced continuous verification, improved process understanding, abnormal situation detection, and predictive monitoring, as applied to CIP and SIP operations. © 2014 American Institute of Chemical Engineers.
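
    One illustrative addition: alongside Hotelling's T², MSPM schemes commonly track a squared prediction error (Q or SPE statistic) on the residual subspace of a PCA model, which flags variation the healthy-data model cannot explain. A hedged sketch, with all names and sizes illustrative:

        import numpy as np

        def spe_statistic(x, mu, sd, P):
            """Squared prediction error (Q statistic) in the PCA residual space.

            mu, sd: training mean and standard deviation; P: PCA loadings.
            Large SPE flags behavior outside the model subspace, e.g. an
            abnormal CIP conductivity or temperature trajectory.
            """
            z = (x - mu) / sd
            residual = z - P @ (P.T @ z)    # part of x outside the model subspace
            return float(residual @ residual)

        # Example with a synthetic model and sample:
        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 6))
        mu, sd = X.mean(axis=0), X.std(axis=0)
        _, _, Vt = np.linalg.svd((X - mu) / sd, full_matrices=False)
        P = Vt[:2].T
        print(spe_statistic(X[0], mu, sd, P))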

  19. Novel MRF fluid for ultra-low roughness optical surfaces

    NASA Astrophysics Data System (ADS)

    Dumas, Paul; McFee, Charles

    2014-08-01

    Over the past few years there have been an increasing number of applications calling for ultra-low roughness (ULR) surfaces. A critical demand has been driven by EUV optics, EUV photomasks, X-Ray, and high-energy laser applications. Achieving ULR results on complex shapes like aspheres and X-Ray mirrors is extremely challenging with conventional polishing techniques. To achieve both tight figure and roughness specifications, substrates typically undergo iterative global and local polishing processes. Typically the local polishing process corrects the figure or flatness but cannot achieve the required surface roughness, whereas the global polishing process produces the required roughness but degrades the figure. Magnetorheological Finishing (MRF) is a local polishing technique based on a magnetically sensitive fluid that removes material through a shearing mechanism with minimal normal load, thus removing sub-surface damage. The lowest surface roughness produced by current MRF is close to 3 Å RMS. A new ULR MR fluid uses a nano-based cerium as the abrasive in a proprietary aqueous solution, the combination of which reliably produces under 1.5 Å RMS roughness on fused silica as measured by atomic force microscopy. In addition to the highly convergent figure correction achieved with MRF, we show results of our novel MR fluid achieving <1.5 Å RMS roughness on fused silica and other materials.

  20. The Dependence of CNT Aerogel Synthesis on Sulfur-driven Catalyst Nucleation Processes and a Critical Catalyst Particle Mass Concentration.

    PubMed

    Hoecker, Christian; Smail, Fiona; Pick, Martin; Weller, Lee; Boies, Adam M

    2017-11-06

    The floating catalyst chemical vapor deposition (FC-CVD) process permits macro-scale assembly of nanoscale materials, enabling continuous production of carbon nanotube (CNT) aerogels. Despite the intensive research in the field, fundamental uncertainties remain regarding how catalyst particle dynamics within the system influence the CNT aerogel formation, thus limiting effective scale-up. While aerogel formation in FC-CVD reactors requires a catalyst (typically iron, Fe) and a promoter (typically sulfur, S), their synergistic roles are not fully understood. This paper presents a paradigm shift in the understanding of the role of S in the process, with new experimental studies identifying that S lowers the nucleation barrier of the catalyst nanoparticles. Furthermore, CNT aerogel formation requires a critical threshold of Fe_xC_y > 160 mg/m^3, but is surprisingly independent of the initial catalyst diameter or number concentration. The robustness of the critical catalyst mass concentration principle is proved further by producing CNTs using alternative catalyst systems: Fe nanoparticles from a plasma spark generator and cobaltocene and nickelocene precursors. This finding provides evidence that low-cost and high-throughput CNT aerogel routes may be achieved by decoupled and enhanced catalyst production and control, opening up new possibilities for large-scale CNT synthesis.

  1. Documentation of a restart option for the U.S. Geological Survey coupled Groundwater and Surface-Water Flow (GSFLOW) model

    USGS Publications Warehouse

    Regan, R. Steve; Niswonger, Richard G.; Markstrom, Steven L.; Barlow, Paul M.

    2015-10-02

    The spin-up simulation should be run for the length of time necessary to establish antecedent conditions throughout a model domain. Each GSFLOW application can require a different length of time for the hydrologic stresses to propagate through a coupled groundwater and surface-water system. Typically, groundwater hydrologic processes require many years to come into equilibrium with dynamic climate and other forcing (or stress) data, such as precipitation and well pumping, whereas runoff-dominated surface-water processes respond relatively quickly. Use of a spin-up simulation can substantially reduce execution-time requirements for applications where the time period of interest is small compared to the time for hydrologic memory; thus, use of the restart option can be an efficient strategy for forecast and calibration simulations that require multiple simulations starting from the same day.

  2. Evaluation of four methods for estimating leaf area of isolated trees

    Treesearch

    P.J. Peper; E.G. McPherson

    2003-01-01

    The accurate modeling of the physiological and functional processes of urban forests requires information on the leaf area of urban tree species. Several non-destructive, indirect leaf area sampling methods have shown good performance for homogenous canopies. These methods have not been evaluated for use in urban settings where trees are typically isolated and...

  3. 6 Project-Management Tips

    ERIC Educational Resources Information Center

    Demski, Jennifer

    2012-01-01

    When it comes to project management, the IT department is typically its own worst enemy. When project requests are pushed through the budgeting process by different departments, it's up to IT to make them all work. The staff is required to be "heroic" to get the project load done. People get to work over weekends and postpone their vacations. The…

  4. Net Operating Working Capital, Capital Budgeting, and Cash Budgets: A Teaching Example

    ERIC Educational Resources Information Center

    Tuner, James A.

    2016-01-01

    Many introductory finance texts present information on the capital budgeting process, including estimation of project cash flows. Typically, estimation of project cash flows begins with a calculation of net income. Getting from net income to cash flows requires accounting for non-cash items such as depreciation. Also important is the effect of…

  5. Motor Skill Assessment of Children: Is There an Association between Performance-Based, Child-Report, and Parent-Report Measures of Children's Motor Skills?

    ERIC Educational Resources Information Center

    Kennedy, Johanna; Brown, Ted; Chien, Chi-Wen

    2012-01-01

    Client-centered practice requires therapists to actively seek the perspectives of children and families. Several assessment tools are available to facilitate this process. However, when evaluating motor skill performance, therapists typically concentrate on performance-based assessment. To improve understanding of the information provided by the…

  6. Processing Satellite Images on Tertiary Storage: A Study of the Impact of Tile Size on Performance

    NASA Technical Reports Server (NTRS)

    Yu, JieBing; DeWitt, David J.

    1996-01-01

    Before raw data from a satellite can be used by an Earth scientist, it must first undergo a number of processing steps including basic processing, cleansing, and geo-registration. Processing actually expands the volume of data collected by a factor of 2 or 3 and the original data is never deleted. Thus processing and storage requirements can exceed 2 terabytes/day. Once processed data is ready for analysis, a series of algorithms (typically developed by the Earth scientists) is applied to a large number of images in a data set. The focus of this paper is how best to handle such images stored on tape using the following assumptions: (1) all images of interest to a scientist are stored on a single tape, (2) images are accessed and processed in the order that they are stored on tape, and (3) the analysis requires access to only a portion of each image and not the entire image.

  7. The Paperless Solution

    NASA Technical Reports Server (NTRS)

    2001-01-01

    REI Systems, Inc. developed a software solution that uses the Internet to eliminate the paperwork typically required to document and manage complex business processes. The data management solution, called Electronic Handbooks (EHBs), is presently used for all SBIR program processes at NASA. The EHB-based system is ideal for programs and projects whose users are geographically distributed and are involved in complex management processes and procedures. EHBs provide flexible access control and increased communications while maintaining security for systems of all sizes. Through Internet Protocol-based access, user authentication and user-based access restrictions, role-based access control, and encryption/decryption, EHBs provide the level of security required for confidential data transfer. EHBs contain electronic forms and menus, which can be used in real time to execute the described processes. EHBs use standard word processors that generate ASCII HTML code to set up electronic forms that are viewed within a web browser. EHBs require no end-user software distribution, significantly reducing operating costs. Each interactive handbook simulates a hard-copy version containing chapters with descriptions of participants' roles in the online process.

  8. AMPS data management concepts. [Atmospheric, Magnetospheric and Plasma in Space experiment

    NASA Technical Reports Server (NTRS)

    Metzelaar, P. N.

    1975-01-01

    Five typical AMPS experiments were formulated to allow simulation studies to verify data management concepts. Design studies were conducted to analyze these experiments in terms of the applicable procedures, data processing and displaying functions. Design concepts for AMPS data management system are presented which permit both automatic repetitive measurement sequences and experimenter-controlled step-by-step procedures. Extensive use is made of a cathode ray tube display, the experimenters' alphanumeric keyboard, and the computer. The types of computer software required by the system and the possible choices of control and display procedures available to the experimenter are described for several examples. An electromagnetic wave transmission experiment illustrates the methods used to analyze data processing requirements.

  9. The DalHouses: 100 new photographs of houses with ratings of typicality, familiarity, and degree of similarity to faces.

    PubMed

    Filliter, Jillian H; Glover, Jacqueline M; McMullen, Patricia A; Salmon, Joshua P; Johnson, Shannon A

    2016-03-01

    Houses have often been used as comparison stimuli in face-processing studies because of the many attributes they share with faces (e.g., distinct members of a basic category, consistent internal features, mono-orientation, and relative familiarity). Despite this, no large, well-controlled databases of photographs of houses that have been developed for research use currently exist. To address this gap, we photographed 100 houses and carefully edited these images. We then asked 41 undergraduate students (18 to 31 years of age) to rate each house on three dimensions: typicality, likeability, and face-likeness. The ratings had a high degree of face validity, and analyses revealed a significant positive correlation between typicality and likeability. We anticipate that this stimulus set (i.e., the DalHouses) and the associated ratings will prove useful to face-processing researchers by minimizing the effort required to acquire stimuli and allowing for easier replication and extension of studies. The photographs of all 100 houses and their ratings data can be obtained at http://dx.doi.org/10.6084/m9.figshare.1279430.

  10. Validating the Use of Deep Learning Neural Networks for Correction of Large Hydrometric Datasets

    NASA Astrophysics Data System (ADS)

    Frazier, N.; Ogden, F. L.; Regina, J. A.; Cheng, Y.

    2017-12-01

    Collection and validation of Earth systems data can be time-consuming and labor-intensive. In particular, high-resolution hydrometric data, including rainfall and streamflow measurements, are difficult to obtain due to a multitude of complicating factors. Measurement equipment is subject to clogs, environmental disturbances, and sensor drift. Manual intervention is typically required to identify, correct, and validate these data. Weirs can become clogged and the pressure transducer may float or drift over time. We typically employ a graphical tool called Time Series Editor to manually remove clogs and sensor drift from the data. However, this process is highly subjective and requires hydrological expertise; two different people may produce two different data sets. To use these data for scientific discovery and model validation, a more consistent processing method is needed. Deep learning neural networks have proved to be excellent mechanisms for recognizing patterns in data. We explore the use of Recurrent Neural Networks (RNN) to capture the patterns in the data over time using various gating mechanisms (LSTM and GRU), network architectures, and hyper-parameters to build an automated data correction model. We also explore the amount of manually corrected training data required to train the network to reasonable accuracy. The benefits of this approach are that the time to process a data set is significantly reduced and the results are 100% reproducible after training is complete. Additionally, we train the RNN and calibrate a physically-based hydrological model against the same portion of data. Both the RNN and the model are applied to the remaining data using a split-sample methodology. Performance of the machine learning is evaluated for plausibility by comparing with the output of the hydrological model, and this analysis identifies potential periods where additional investigation is warranted.
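
    As a hedged illustration of the modeling idea only (the abstract does not specify the authors' architecture, features, or hyper-parameters; PyTorch, the layer sizes, and the training loop below are our assumptions):

        import torch
        import torch.nn as nn

        class GRUCorrector(nn.Module):
            """Sequence-to-sequence GRU mapping raw sensor readings to
            corrected values (a sketch; all sizes are illustrative)."""
            def __init__(self, n_features=1, hidden=32):
                super().__init__()
                self.gru = nn.GRU(n_features, hidden, num_layers=2,
                                  batch_first=True)
                self.head = nn.Linear(hidden, n_features)

            def forward(self, x):              # x: (batch, time, features)
                out, _ = self.gru(x)
                return self.head(out)          # corrected series, same shape

        model = GRUCorrector()
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.MSELoss()

        # Synthetic stand-in for (raw, manually corrected) training pairs.
        raw = torch.randn(8, 200, 1)           # 8 series, 200 time steps
        corrected = raw.clone()                # placeholder targets

        for epoch in range(10):
            optimizer.zero_grad()
            loss = loss_fn(model(raw), corrected)
            loss.backward()
            optimizer.step()

    In practice, the manually corrected series produced with a tool such as Time Series Editor would supply the training targets.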

  11. Quantifying induced effects of subsurface renewable energy storage

    NASA Astrophysics Data System (ADS)

    Bauer, Sebastian; Beyer, Christof; Pfeiffer, Tilmann; Boockmeyer, Anke; Popp, Steffi; Delfs, Jens-Olaf; Wang, Bo; Li, Dedong; Dethlefsen, Frank; Dahmke, Andreas

    2015-04-01

    New methods and technologies for energy storage are required for the transition to renewable energy sources. Subsurface energy storage systems such as salt caverns or porous formations offer the possibility of storing large amounts of energy or substances. When employing these systems, an adequate system and process understanding is required in order to assess the feasibility of the individual storage option at the respective site and to predict the complex and interacting effects induced. This understanding is the basis for assessing the potential as well as the risks connected with a sustainable usage of these storage options, especially when considering possible mutual influences. For achieving this aim, in this work synthetic scenarios for the use of the geological underground as an energy storage system are developed and parameterized. The scenarios are designed to represent typical conditions in North Germany. The types of subsurface use investigated here include gas storage and heat storage in porous formations. The scenarios are numerically simulated and interpreted with regard to risk analysis and effect forecasting. For this, the numerical simulators Eclipse and OpenGeoSys are used. The latter is enhanced to include the required coupled hydraulic, thermal, geomechanical and geochemical processes. Using the simulated and interpreted scenarios, the induced effects are quantified individually and monitoring concepts for observing these effects are derived. This presentation will detail the general investigation concept used and analyze the parameter availability for this type of model application. Then the process implementation and numerical methods required and applied for simulating the induced effects of subsurface storage are detailed and explained. Application examples show the developed methods and quantify induced effects and storage sizes for the typical settings parameterized. This work is part of the ANGUS+ project, funded by the German Ministry of Education and Research (BMBF).

  12. IEC 61511 and the capital project process--a protective management system approach.

    PubMed

    Summers, Angela E

    2006-03-17

    This year, the process industry has reached an important milestone in process safety: the acceptance of an internationally recognized standard for safety instrumented systems (SIS). This standard, IEC 61511, documents good engineering practice for the assessment, design, operation, maintenance, and management of SISs. The foundation of the standard is established by several requirements in Part 1, Clauses 5-7, which cover the development of a management system aimed at ensuring that functional safety is achieved. The management system includes a quality assurance process for the entire SIS lifecycle, requiring the development of procedures, identification of resources and acquisition of tools. For maximum benefit, the deliverables and quality control checks required by the standard should be integrated into the capital project process, addressing safety, environmental, plant productivity, and asset protection. Industry has become inundated with a multitude of programs focusing on safety, quality, and cost performance. This paper introduces a protective management system, which builds upon the work process identified in IEC 61511. Typical capital project phases are integrated with the management system to yield one comprehensive program to efficiently manage process risk. Finally, the paper highlights areas where internal practices or guidelines should be developed to improve program performance and cost effectiveness.

  13. Disorders of representation and control in semantic cognition: Effects of familiarity, typicality, and specificity

    PubMed Central

    Rogers, Timothy T.; Patterson, Karalyn; Jefferies, Elizabeth; Lambon Ralph, Matthew A.

    2015-01-01

    We present a case-series comparison of patients with cross-modal semantic impairments consequent on either (a) bilateral anterior temporal lobe atrophy in semantic dementia (SD) or (b) left-hemisphere fronto-parietal and/or posterior temporal stroke in semantic aphasia (SA). Both groups were assessed on a new test battery designed to measure how performance is influenced by concept familiarity, typicality and specificity. In line with previous findings, performance in SD was strongly modulated by all of these factors, with better performance for more familiar items (regardless of typicality), for more typical items (regardless of familiarity) and for tasks that did not require very specific classification, consistent with the gradual degradation of conceptual knowledge in SD. The SA group showed significant impairments on all tasks but their sensitivity to familiarity, typicality and specificity was more variable and governed by task-specific effects of these factors on controlled semantic processing. The results are discussed with reference to theories about the complementary roles of representation and manipulation of semantic knowledge. PMID:25934635

  14. Cylinder Test Specification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard Catanach; Larry Hill; Herbert Harry

    1999-10-01

    The purpose of the cylinder test is two-fold: (1) to characterize the metal-pushing ability of an explosive relative to that of other explosives, as evaluated by the E{sub 19} cylinder energy and the G{sub 19} Gurney energy, and (2) to help establish the explosive product equation-of-state (historically, the Jones-Wilkins-Lee (JWL) equation). This specification details the material requirements and procedures necessary to assemble and fire a typical Los Alamos National Laboratory (LANL) cylinder test. Strict adherence to the cylinder material properties, machining tolerances, material heat-treatment and etching processes, and high explosive machining tolerances is essential for test-to-test consistency and to maximize radial wall expansions. Assembly and setup of the cylinder test require precise attention to detail, especially when placing intricate pin wires on the cylinder wall. The cylinder test is typically fired outdoors and at ambient temperature.

  15. From Sensory Perception to Lexical-Semantic Processing: An ERP Study in Non-Verbal Children with Autism.

    PubMed

    Cantiani, Chiara; Choudhury, Naseem A; Yu, Yan H; Shafer, Valerie L; Schwartz, Richard G; Benasich, April A

    2016-01-01

    This study examines electrocortical activity associated with visual and auditory sensory perception and lexical-semantic processing in nonverbal (NV) or minimally-verbal (MV) children with Autism Spectrum Disorder (ASD). Currently, there is no agreement on whether these children comprehend incoming linguistic information and whether their perception is comparable to that of typically developing children. Event-related potentials (ERPs) of 10 NV/MV children with ASD and 10 neurotypical children were recorded during a picture-word matching paradigm. Atypical ERP responses were evident at all levels of processing in children with ASD. Basic perceptual processing was delayed in both visual and auditory domains but overall was similar in amplitude to typically-developing children. However, significant differences between groups were found at the lexical-semantic level, suggesting more atypical higher-order processes. The results suggest that although basic perception is relatively preserved in NV/MV children with ASD, higher levels of processing, including lexical- semantic functions, are impaired. The use of passive ERP paradigms that do not require active participant response shows significant potential for assessment of non-compliant populations such as NV/MV children with ASD.

  17. LABORATORY PROCESS CONTROLLER USING NATURAL LANGUAGE COMMANDS FROM A PERSONAL COMPUTER

    NASA Technical Reports Server (NTRS)

    Will, H.

    1994-01-01

    The complex environment of the typical research laboratory requires flexible process control. This program provides natural language process control from an IBM PC or compatible machine. Process control schedules sometimes require frequent changes, even several times per day; these changes may include adding, deleting, and rearranging steps in a process. This program sets up a process control system that can either run without an operator or be run by workers with limited programming skills. The software system includes three programs. Two of the programs, written in FORTRAN77, record data and control research processes. The third program, written in Pascal, generates the FORTRAN subroutines used by the other two programs to identify the user commands with the user-written device drivers. The software system also includes an input data set which allows the user to define the user commands which are to be executed by the computer. To set the system up, the operator writes device driver routines for all of the controlled devices. Once set up, this system requires only an input file containing natural language command lines which tell the system what to do and when to do it. Custom commands for operating and taking data from external research equipment can be defined to run at any time of the day or night without an operator in attendance. This process control system requires a personal computer operating under MS-DOS with suitable hardware interfaces to all controlled devices. The program requires a FORTRAN77 compiler and user-written device drivers. This program was developed in 1989 and has a memory requirement of about 62 Kbytes.
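
    As a purely hypothetical illustration of the dispatch idea (the actual system generates FORTRAN77 subroutines and calls user-written device drivers), a schedule interpreter might look like the Python sketch below; the file format, command names, and driver stubs are all invented.

    # Hypothetical sketch of timed natural-language command dispatch; the
    # real system used FORTRAN77, not Python.
    import time

    def set_heater(arg):                  # stand-in device driver
        print("heater set to", arg, "C")

    def read_probe(arg):                  # stand-in device driver
        print("probe reading logged")

    DRIVERS = {"SET HEATER": set_heater, "READ PROBE": read_probe}

    def run(schedule_path):
        """Execute lines like '5 SET HEATER 80' (seconds, command, argument)."""
        start = time.time()
        for line in open(schedule_path):
            when, *words = line.split()
            arg = words.pop() if words[-1].isdigit() else ""
            command = " ".join(words)
            while time.time() - start < float(when):
                time.sleep(0.1)           # idle until the scheduled time
            DRIVERS[command](arg)

    # run("schedule.txt") would then execute the command lines in order.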

  18. Experience from start-ups of the first ANITA Mox plants.

    PubMed

    Christensson, M; Ekström, S; Andersson Chan, A; Le Vaillant, E; Lemaire, R

    2013-01-01

    ANITA™ Mox is a new one-stage deammonification Moving-Bed Biofilm Reactor (MBBR) developed for partial nitrification to nitrite and autotrophic N-removal from N-rich effluents. This deammonification process offers many advantages, such as dramatically reduced oxygen requirements, no chemical oxygen demand requirement, lower sludge production, and no pre-treatment or chemical requirement, making it an energy- and cost-efficient nitrogen removal process. An innovative seeding strategy, the 'BioFarm concept', has been developed in order to decrease the start-up time of new ANITA Mox installations. New ANITA Mox installations are started with typically 3-15% of the added carriers coming from the 'BioFarm', with already established anammox biofilm, the rest being new carriers. The first ANITA Mox plant, started up in 2010 at Sjölunda wastewater treatment plant (WWTP) in Malmö, Sweden, proved this seeding concept, reaching an ammonium removal rate of 1.2 kgN/m³ d and approximately 90% ammonia removal within 4 months from start-up. This first ANITA Mox plant is also the BioFarm used for forthcoming installations. Typical features of this first installation were low energy consumption (1.5 kWh/kg NH4-N removed), low N₂O emissions (<1% of the reduced nitrogen), and a very stable and robust process with respect to variations in loads and process conditions. The second ANITA Mox plant, started up at Sundets WWTP in Växjö, Sweden, reached full capacity with more than 90% ammonia removal within 2 months from start-up. By applying a nitrogen loading strategy to the reactor that matches the capacity of the seeding carriers, more than 80% nitrogen removal could be obtained throughout the start-up period.

  19. Characterization of Developer Application Methods Used in Fluorescent Penetrant Inspection

    NASA Astrophysics Data System (ADS)

    Brasche, L. J. H.; Lopez, R.; Eisenmann, D.

    2006-03-01

    Fluorescent penetrant inspection (FPI) is the most widely used inspection method for aviation components, used for production as well as in-service inspection applications. FPI is a multiple-step process requiring attention to the process parameters of each step in order to enable a successful inspection. A multiyear program is underway to evaluate the most important factors affecting the performance of FPI, to determine whether existing industry specifications adequately address control of the process parameters, and to provide the needed engineering data to the public domain. The final step prior to the inspection is the application of developer, with typical aviation inspections involving the use of dry powder (form d), usually applied using either a pressure wand or a dust storm chamber. Results from several typical dust storm chambers and wand applications have shown less than optimal performance. Measurements of indication brightness, recording of the UVA image, and in some cases formal probability of detection (POD) studies were used to assess the developer application methods. Key conclusions and initial recommendations are provided.

  20. Transfer of the epoxidation of soybean oil from batch to flow chemistry guided by cost and environmental issues.

    PubMed

    Kralisch, Dana; Streckmann, Ina; Ott, Denise; Krtschil, Ulich; Santacesaria, Elio; Di Serio, Martino; Russo, Vincenzo; De Carlo, Lucrezia; Linhart, Walter; Christian, Engelbert; Cortese, Bruno; de Croon, Mart H J M; Hessel, Volker

    2012-02-13

    The simple transfer of established chemical production processes from batch to flow chemistry does not automatically result in more sustainable ones. Detailed process understanding and the motivation to scrutinize known process conditions are necessary factors for success. Although the focus is usually "only" on intensifying transport phenomena to operate under intrinsic kinetics, there is also a large intensification potential in chemistry under harsh conditions and in the specific design of flow processes. Such an understanding and proposed processes are required at an early stage of process design, because decisions on the best-suited tools and parameters required to convert green engineering concepts into practice (typically with little chance of substantial changes later) are made during this period. Herein, we present a holistic and interdisciplinary process design approach that combines the concept of novel process windows with process modeling, simulation, and simplified cost and lifecycle assessment for the deliberate development of a cost-competitive and environmentally sustainable alternative to an existing production process for epoxidized soybean oil. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Conversion of direct process high-boiling residue to monosilanes

    DOEpatents

    Brinson, Jonathan Ashley; Crum, Bruce Robert; Jarvis, Jr., Robert Frank

    2000-01-01

    A process for the production of monosilanes from the high-boiling residue resulting from the reaction of hydrogen chloride with silicon metalloid in a process typically referred to as the "direct process." The process comprises contacting a high-boiling residue resulting from the reaction of hydrogen chloride and silicon metalloid, with hydrogen gas in the presence of a catalytic amount of aluminum trichloride effective in promoting conversion of the high-boiling residue to monosilanes. The present process results in conversion of the high-boiling residue to monosilanes. At least a portion of the aluminum trichloride catalyst required for conduct of the process may be formed in situ during conduct of the direct process and isolation of the high-boiling residue.

  2. Description of the AILS Alerting Algorithm

    NASA Technical Reports Server (NTRS)

    Samanant, Paul; Jackson, Mike

    2000-01-01

    This document provides a complete description of the Airborne Information for Lateral Spacing (AILS) alerting algorithms. The purpose of AILS is to provide separation assurance between aircraft during simultaneous approaches to closely spaced parallel runways. AILS will allow independent approaches to be flown in situations where dependent approaches were previously required (typically under Instrument Meteorological Conditions (IMC)). This is achieved by providing multiple levels of alerting for pairs of aircraft that are in parallel approach situations. This document's scope is comprehensive and covers everything from general overviews, definitions, and concepts down to algorithmic elements and equations. The entire algorithm is presented in complete and detailed pseudo-code format, which software programmers can use to implement AILS in a programming language. Additional supporting information is provided in the form of coordinate frame definitions, data requirements, and calling requirements, as well as all necessary pre-processing and post-processing requirements. This is important and required information for the implementation of AILS in an analysis, a simulation, or a real-time system.
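
    Purely to make the multi-level alerting idea concrete, here is a hypothetical two-level classifier; the thresholds, projection time, and miss-distance predictor are invented for illustration and are not the published AILS algorithm, which the document itself gives in pseudo-code.

    # Hypothetical two-level lateral alerting sketch (NOT the AILS algorithm;
    # all numbers are invented for illustration).
    def alert_level(lateral_sep_ft, closure_rate_fps, tau_s=20.0,
                    caution_ft=1500.0, warning_ft=800.0):
        """Project lateral separation tau_s seconds ahead and classify."""
        predicted = lateral_sep_ft - closure_rate_fps * tau_s
        if predicted < warning_ft:
            return "WARNING: evade"
        if predicted < caution_ft:
            return "CAUTION: monitor"
        return "CLEAR"

    # 2500 ft apart, closing at 60 ft/s -> projected 1300 ft -> CAUTION
    print(alert_level(2500.0, 60.0))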

  3. Scalable DB+IR Technology: Processing Probabilistic Datalog with HySpirit.

    PubMed

    Frommholz, Ingo; Roelleke, Thomas

    2016-01-01

    Probabilistic Datalog (PDatalog, proposed in 1995) is a probabilistic variant of Datalog and a nice conceptual idea for modelling Information Retrieval in a logical, rule-based programming paradigm. Making PDatalog work in real-world applications requires more than probabilistic facts and rules and the semantics associated with the evaluation of the programs. We report in this paper some of the key features of the HySpirit system required to scale the execution of PDatalog programs. Firstly, there is the requirement to express probability estimation in PDatalog. Secondly, fuzzy-like predicates are required to model vague predicates (e.g. vague match of attributes such as age or price). Thirdly, to handle large data sets there are scalability issues to be addressed, and therefore HySpirit provides probabilistic relational indexes and parallel and distributed processing. The main contribution of this paper is a consolidated view of the methods of the HySpirit system that make PDatalog applicable in real-scale applications involving a wide range of requirements typical for data (information) management and analysis.
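
    To give a flavour of the paradigm, the sketch below evaluates a toy probabilistic rule in Python under an assumed independence semantics; PDatalog syntax and HySpirit's actual evaluation semantics are richer, so this is only an illustration with invented facts.

    # Toy probabilistic-rule evaluation in the spirit of PDatalog (assumed
    # independence; invented facts and simplified semantics).
    facts = {("term", "doc1", "retrieval"): 0.7,   # 0.7 term(doc1, retrieval)
             ("term", "doc1", "logic"): 0.5}       # 0.5 term(doc1, logic)

    def about(doc, topic):
        """about(D,T) :- term(D,T). Probability passes through the fact."""
        return facts.get(("term", doc, topic), 0.0)

    def relevant(doc, topics):
        """relevant(D) :- about(D,T1), about(D,T2). Under independence,
        the conjunction multiplies the body probabilities."""
        p = 1.0
        for t in topics:
            p *= about(doc, t)
        return p

    print(relevant("doc1", ["retrieval", "logic"]))   # 0.35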

  4. An aspect-oriented approach for designing safety-critical systems

    NASA Astrophysics Data System (ADS)

    Petrov, Z.; Zaykov, P. G.; Cardoso, J. P.; Coutinho, J. G. F.; Diniz, P. C.; Luk, W.

    The development of avionics systems is typically a tedious and cumbersome process. In addition to the required functions, developers must consider various and often conflicting non-functional requirements such as safety, performance, and energy efficiency. Certainly, an integrated approach with a seamless design flow that is capable of requirements modelling and supporting refinement down to an actual implementation in a traceable way, may lead to a significant acceleration of development cycles. This paper presents an aspect-oriented approach supported by a tool chain that deals with functional and non-functional requirements in an integrated manner. It also discusses how the approach can be applied to development of safety-critical systems and provides experimental results.

  5. Event-Based Processing of Neutron Scattering Data

    DOE PAGES

    Peterson, Peter F.; Campbell, Stuart I.; Reuter, Michael A.; ...

    2015-09-16

    Many of the world's time-of-flight spallation neutron sources are migrating to the recording of individual neutron events. This provides new opportunities in data processing, not the least of which is to filter the events by correlating them with logs of the sample environment and other ancillary equipment. This paper describes techniques for processing neutron scattering data acquired in event mode that preserve event information all the way to the final spectrum, including any necessary corrections or normalizations. This results in smaller final errors while significantly reducing processing time and memory requirements in typical experiments. Results with traditional histogramming techniques will be shown for comparison.

  6. Affinity+: Semi-Structured Brainstorming on Large Displays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burtner, Edwin R.; May, Richard A.; Scarberry, Randall E.

    2013-04-27

    Affinity diagramming is a powerful method for encouraging and capturing lateral thinking in a group environment. The Affinity+ concept was designed to improve the collaborative brainstorming process through the use of large display surfaces in conjunction with mobile devices like smart phones and tablets. The system works by capturing the ideas digitally and allowing users to sort and group them manually on a large touch screen. Additionally, Affinity+ incorporates theme detection, topic clustering, and other processing algorithms that help bring structured analytic techniques to the process without requiring explicit leadership roles and other overhead typically involved in these activities.

  7. Membrane thickening aerobic digestion processes.

    PubMed

    Woo, Bryen

    2014-01-01

    Sludge management accounts for approximately 60% of the total wastewater treatment plant expenditure, and laws for sludge disposal are becoming increasingly stringent; therefore, much consideration is required when designing a solids handling process. A membrane thickening aerobic digestion process integrates a controlled aerobic digestion process with pre-thickening of waste activated sludge using membrane technology. This process typically features an anoxic tank, an aerated membrane thickener operating in loop with a first-stage digester, followed by second-stage digestion. Membrane thickening aerobic digestion processes can handle sludge from any liquid treatment process and are best suited for facilities obligated to meet low total phosphorus and nitrogen discharge limits. Membrane thickening aerobic digestion processes offer many advantages, including: producing a reusable quality permeate with minimal levels of total phosphorus and nitrogen that can be recycled to the head works of a plant, protecting the performance of a biological nutrient removal liquid treatment process without requiring chemical addition, providing reliable thickening up to 4% solids concentration without the use of polymers or attention to decanting, increasing sludge storage capacities in existing tanks, minimizing the footprint of new tanks, reducing disposal costs, and providing Class B stabilization.

  8. Potential for yield improvement in combined rip-first and crosscut-first rough mill processing

    Treesearch

    Ed Thomas; Urs Buehlmann

    2016-01-01

    Traditionally, lumber cutting systems in rough mills have either first ripped lumber into wide strips and then crosscut the resulting strips into component lengths (rip-first), or first crosscut the lumber into component lengths, then ripped the segments to the required widths (crosscut-first). Each method has its advantages and disadvantages. Crosscut-first typically...

  9. The Components of Working Memory Updating: An Experimental Decomposition and Individual Differences

    ERIC Educational Resources Information Center

    Ecker, Ullrich K. H.; Lewandowsky, Stephan; Oberauer, Klaus; Chee, Abby E. H.

    2010-01-01

    Working memory updating (WMU) has been identified as a cognitive function of prime importance for everyday tasks and has also been found to be a significant predictor of higher mental abilities. Yet, little is known about the constituent processes of WMU. We suggest that operations required in a typical WMU task can be decomposed into 3 major…

  10. A non-parametric, supervised classification of vegetation types on the Kaibab National Forest using decision trees

    Treesearch

    Suzanne M. Joy; R. M. Reich; Richard T. Reynolds

    2003-01-01

    Traditional land classification techniques for large areas that use Landsat Thematic Mapper (TM) imagery are typically limited to the fixed spatial resolution of the sensors (30m). However, the study of some ecological processes requires land cover classifications at finer spatial resolutions. We model forest vegetation types on the Kaibab National Forest (KNF) in...

  11. Designing High Performance Steel Castings Today: Proceedings of the Steel Founders Society of America, Technical and Operating Conference December 7-10, 2016Chicago, IL

    DTIC Science & Technology

    2016-12-10

    will be 2 x failure (critical) depth. G. INSPECTION REQUIREMENTS Either the No-Bake sand or Investment process is selected based on which... Bake sand and the Investment Casting Handbook by the Investment Casting Institute has the tolerance values for investment castings. Typically there

  12. Effect of Sentence Length and Complexity on Working Memory Performance in Hungarian Children with Specific Language Impairment (SLI): A Cross-Linguistic Comparison

    ERIC Educational Resources Information Center

    Marton, Klara; Schwartz, Richard G.; Farkas, Lajos; Katsnelson, Valeriya

    2006-01-01

    Background: English-speaking children with specific language impairment (SLI) perform more poorly than their typically developing peers in verbal working memory tasks where processing and storage are simultaneously required. Hungarian is a language with a relatively free word order and a rich agglutinative morphology. Aims: To examine the effect…

  13. The Missing Ingredients in Reflective Supervision: Helping Staff Members Learn about and Fully Participate in the Supervisory Process

    ERIC Educational Resources Information Center

    Heffron, Mary Claire; Murch, Trudi

    2018-01-01

    Successful implementation of a reflective supervision (RS) model in an agency or system requires careful attention to the learning needs of supervisees. Although supervisors and managers typically receive orientation and training to help them understand and implement RS, their staff rarely do. In this article, the authors explore supervisees'…

  14. Annotation: What Electrical Brain Activity Tells Us about Brain Function that Other Techniques Cannot Tell Us--A Child Psychiatric Perspective

    ERIC Educational Resources Information Center

    Banaschewski, Tobias; Brandeis, Daniel

    2007-01-01

    Background: Monitoring brain processes in real time requires genuine subsecond resolution to follow the typical timing and frequency of neural events. Non-invasive recordings of electric (EEG/ERP) and magnetic (MEG) fields provide this time resolution. They directly measure neural activations associated with a wide variety of brain states and…

  15. Overview 1993: Computational applications

    NASA Technical Reports Server (NTRS)

    Benek, John A.

    1993-01-01

    Computational applications include projects that apply or develop computationally intensive computer programs. Such programs typically require supercomputers to obtain solutions in a timely fashion. This report describes two CSTAR projects involving Computational Fluid Dynamics (CFD) technology. The first, the Parallel Processing Initiative, is a joint development effort and the second, the Chimera Technology Development, is a transfer of government developed technology to American industry.

  16. Funding Solar Projects at Federal Agencies: Mechanisms and Selection Criteria (Brochure)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Implementing solar energy projects at federal facilities is a process. The project planning phase of the process includes determining goals, building a team, determining site feasibility, and selecting the appropriate project funding tool. This fact sheet gives practical guidance to assist decision-makers with understanding and selecting the funding tool that would best address their site goals. Because project funding tools are complex, federal agencies should seek project assistance before making final decisions. High capital requirements combined with limits on federal agency energy contracts create challenges for funding solar projects. Solar developers typically require long-term contracts (15-20 years) to spread out the initial investment and to enable payments similar to conventional utility bill payments. In the private sector, 20-year contracts have been developed, vetted, and accepted, but the General Services Administration (GSA) contract authority (federal acquisition regulation [FAR] part 41) typically limits contract terms to 10 years. Payments on shorter-term contracts make solar economically unattractive compared with conventional generation. However, in several instances, the federal sector has utilized innovative funding tools that allow long-term contracts or has created a project package that is economically attractive within a shorter contract term.

  17. Water requirements of the iron and steel industry

    USGS Publications Warehouse

    Walling, Faulkner B.; Otts, Louis Ethelbert

    1967-01-01

    Twenty-nine steel plants surveyed during 1957 and 1958 withdrew from various sources about 1,400 billion gallons of water annually and produced 40.8 million tons of ingot steel. This is equivalent to about 34,000 gallons of water per ton of steel. Fifteen iron ore mines and fifteen ore concentration plants together withdrew annually about 89,000 million gallons to produce 15 million tons of iron ore concentrate, or 5,900 gallons per ton of concentrate. About 97 percent of the water used in the steel plants came from surface sources, 2.2 percent was reclaimed sewage, and 1.2 percent was ground water. Steel plants supplied about 96 percent of their own water requirements, although only three plants used self-supplied water exclusively. Water used by the iron ore mines and concentration plants was also predominantly self-supplied from surface sources. Water use in the iron and steel industry varied widely and depended on the availability of water, age and condition of plants and equipment, kinds of processes, and plant operating procedures. Gross water use in integrated steel plants ranged from 11,200 to 110,000 gallons per ton of steel ingots, and in steel processing plants it ranged from 4,180 to 26,700 gallons per ton. Water reuse also varied widely, from 0 to 18 times in integrated steel plants and from 0 to 44 times in steel processing plants. Availability of water seemed to be the principal factor in determining the rate of reuse. Of the units within steel plants, a typical (median) blast furnace required 20,500 gallons of water per ton of pig iron. At the 1956-60 average rate of pig iron consumption, this amounts to about 13,000 gallons per ton of steel ingots, or about 40 percent of that required by a typical integrated steel plant (33,200 gallons per ton). Different processes of iron ore concentration are devised specifically for the various kinds of ore. These processes result in a wide range of water use, from 124 to 11,300 gallons of water per ton of iron ore concentrate. Water use in concentration plants is related to the physical state of the ore. The data in this report indicate that grain size of the ore is the most important factor; the very fine-grained taconite and jasper required the greatest amount of water. Reuse was not widely practiced in the iron ore industry. Consumption of water by integrated steel plants ranged from 0 to 2,010 gallons per ton of ingot steel, and by steel processing plants from 120 to 3,420 gallons per ton. Consumption by a typical integrated steel plant was 681 gallons per ton of ingot steel, about 1.8 percent of the intake and about 1 percent of the gross water use. Consumption by a typical steel processing plant was 646 gallons per ton, 18 percent of the intake, and 3.2 percent of the gross water use. The quality of available water was found not to be a critical factor in choosing the location of steel plants, although changes in equipment and in operating procedures are necessary when poor-quality water is used. The use of saline water having a concentration of dissolved solids as much as 10,400 ppm (parts per million) was reported. This very saline water was used for cooling furnaces and for quenching slag. In operations such as rolling steel, in which the water comes into contact with the steel being processed, better quality water is used, although water containing as much as 3,410 ppm dissolved solids has been used for this purpose. Treatment of water for use in the iron and steel industry was not widely practiced. Disinfection and treatment for scale and corrosion control were the most frequently used treatment methods.

  18. Medium Deep High Temperature Heat Storage

    NASA Astrophysics Data System (ADS)

    Bär, Kristian; Rühaak, Wolfram; Schulte, Daniel; Welsch, Bastian; Chauhan, Swarup; Homuth, Sebastian; Sass, Ingo

    2015-04-01

    Heating of buildings accounts for more than 25% of the total end energy consumption in Germany. Shallow geothermal systems for indirect use, as well as shallow geothermal heat storage systems like aquifer thermal energy storage (ATES) or borehole thermal energy storage (BTES), typically provide low-exergy heat, and their temperature levels and ranges typically require coupling with heat pumps. By storing hot water from solar panels or thermal power stations at temperatures of up to 110 °C, a medium deep high temperature heat storage (MDHTS) can be operated at relatively high temperature levels of more than 45 °C. Storage depths of 500 m to 1,500 m below surface avoid conflicts with groundwater use for drinking water or other purposes. Permeability also typically decreases with greater depth, especially in the crystalline basement, so conduction becomes the dominant heat transport process. Solar-thermal charging of a MDHTS is a very beneficial option for supplying heat in urban and rural systems. Feasibility and design criteria of different system configurations (depth, distance and number of BHE) are discussed. One system is designed to store and supply heat (300 kW) for an office building. The required boreholes are located in granodioritic bedrock. Several challenges result from this setup: in particular, the drilling and completion have to be planned carefully under consideration of the geological and tectonic situation at the specific site.

  19. Parallel workflow tools to facilitate human brain MRI post-processing

    PubMed Central

    Cui, Zaixu; Zhao, Chenxi; Gong, Gaolang

    2015-01-01

    Multi-modal magnetic resonance imaging (MRI) techniques are widely applied in human brain studies. To obtain specific brain measures of interest from MRI datasets, a number of complex image post-processing steps are typically required. Parallel workflow tools have recently been developed, concatenating individual processing steps and enabling fully automated processing of raw MRI data to obtain the final results. These workflow tools are also designed to make optimal use of available computational resources and to support the parallel processing of different subjects or of independent processing steps for a single subject. Automated, parallel MRI post-processing tools can greatly facilitate relevant brain investigations and are being increasingly applied. In this review, we briefly summarize these parallel workflow tools and discuss relevant issues. PMID:26029043
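
    As a minimal sketch of the subject-level parallelism such workflow tools provide, the following Python code runs an identical per-subject pipeline across subjects with a process pool from the standard library; the step functions are placeholders, not the API of any particular workflow tool.

    # Minimal sketch of subject-level parallelism in an MRI post-processing
    # workflow; step names are placeholders, not a specific tool's API.
    from concurrent.futures import ProcessPoolExecutor

    def skull_strip(subject):   return f"{subject}: stripped"
    def register(subject):      return f"{subject}: registered"

    def pipeline(subject):
        # Steps for one subject run in sequence; subjects run in parallel.
        skull_strip(subject)
        return register(subject)

    if __name__ == "__main__":
        subjects = [f"sub-{i:02d}" for i in range(1, 9)]
        with ProcessPoolExecutor(max_workers=4) as pool:
            for result in pool.map(pipeline, subjects):
                print(result)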

  20. Remedial investigation work plan for Bear Creek Valley Operable Unit 2 (Rust Spoil Area, SY-200 Yard, Spoil Area 1) at the Oak Ridge Y-12 Plant, Oak Ridge, Tennessee. Environmental Restoration Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-05-01

    The enactment of the Resource Conservation and Recovery Act (RCRA) in 1976 and the Hazardous and Solid Waste Amendments (HSWA) to RCRA in 1984 created management requirements for hazardous waste facilities. The facilities within the Oak Ridge Reservation (ORR) were in the process of meeting the RCRA requirements when ORR was placed on the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) National Priorities List (NPL) on November 21, 1989. Under RCRA, the actions typically follow the RCRA Facility Assessment (RFA)/RCRA Facility Investigation (RFI)/Corrective Measures Study (CMS)/Corrective Measures Implementation process. Under CERCLA, the actions follow the PA/SI/Remedial Investigation (RI)/Feasibility Study (FS)/Remedial Design/Remedial Action process. The development of this document will incorporate requirements under both RCRA and CERCLA into an RI work plan for the characterization of Bear Creek Valley (BCV) Operable Unit (OU) 2.

  1. Structural Safety of a Hubble Space Telescope Science Instrument

    NASA Technical Reports Server (NTRS)

    Lou, M. C.; Brent, D. N.

    1993-01-01

    This paper gives an overview of safety requirements related to structural design and verification of payloads to be launched and/or retrieved by the Space Shuttle. To demonstrate the general approach used to implement these requirements in the development of a typical Shuttle payload, the Wide Field/Planetary Camera II, a second generation science instrument currently being developed by the Jet Propulsion Laboratory (JPL) for the Hubble Space Telescope, is used as an example. In addition to verification of strength and dynamic characteristics, special emphasis is placed upon the fracture control implementation process, including parts classification and fracture control acceptability.

  2. Object-based attention benefits reveal selective abnormalities of visual integration in autism.

    PubMed

    Falter, Christine M; Grant, Kate C Plaisted; Davis, Greg

    2010-06-01

    A pervasive integration deficit could provide a powerful and elegant account of cognitive processing in autism spectrum disorders (ASD). However, in the case of visual Gestalt grouping, typically assessed by tasks that require participants explicitly to introspect on their own grouping perception, clear evidence for such a deficit remains elusive. To resolve this issue, we adopt an index of Gestalt grouping from the object-based attention literature that does not require participants to assess their own grouping perception. Children with ASD and mental- and chronological-age matched typically developing children (TD) performed speeded orientation discriminations of two diagonal lines. The lines were superimposed on circles that were either grouped together or segmented on the basis of color, proximity or these two dimensions in competition. The magnitude of performance benefits evident for grouped circles, relative to ungrouped circles, provided an index of grouping under various conditions. Children with ASD showed comparable grouping by proximity to the TD group, but reduced grouping by similarity. ASD seems characterized by a selective bias away from grouping by similarity combined with typical levels of grouping by proximity, rather than by a pervasive integration deficit.

  3. System requirements for a computerised patient record information system at a busy primary health care clinic.

    PubMed

    Blignaut, P J; McDonald, T; Tolmie, C J

    2001-05-01

    A prototyping approach was used to determine the essential system requirements of a computerised patient record information system for a typical township primary health care clinic. A pilot clinic was identified, and the existing manual system and business processes in this clinic were studied intensively before the first prototype was implemented. Interviews with users, incidental observations, and analysis of actual data entered were used as primary techniques to refine the prototype system iteratively until a system with an acceptable data set and adequate functionality was in place. Several non-functional and user-related requirements were also discovered during the prototyping period.

  4. Applying the Theory of Constraints to a Base Civil Engineering Operations Branch

    DTIC Science & Technology

    1991-09-01

    [List of figures from the report: Figure 1, Typical Work Order Processing; Figure 2, Typical Job Order Processing; Figure 3, Typical Simplified In-Service Work Plan. Figure 1 traces a customer request through the service planning unit, production control center, material control, and scheduling to the CE shops.]

  5. Turboexpander plant designs can provide high ethane recovery without inlet CO/sub 2/ removal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilkinson, J.D.; Hudson, H.M.

    1982-05-03

    New turboexpander plant designs can process natural gas streams containing moderate amounts of carbon dioxide (CO/sub 2/) for high ethane recovery without inlet gas treating. The designs will handle a wide range of inlet ethane-plus fractions. They also offer reduced horsepower requirements compared to other processes. CO/sub 2/ is a typical component of most natural gas streams. In many cases, processing of these gas streams in a turboexpander plant for high ethane recovery requires pre-treatment of the gas for CO/sub 2/ removal. This is required to avoid the formation of solid CO/sub 2/ (freezing) in the cold sections of the process and/or to meet necessary residue gas and liquid product CO/sub 2/ specifications. Depending on the quantities involved, the CO/sub 2/ removal system is generally a significant portion of both the installed cost and operating cost of the ethane recovery facility. Therefore, turboexpander plant designs that are capable of handling increased quantities of CO/sub 2/ in the feed gas without freezing can offer the gas processor substantial economic benefits.

  6. The Face-Processing Network Is Resilient to Focal Resection of Human Visual Cortex

    PubMed Central

    Jonas, Jacques; Gomez, Jesse; Maillard, Louis; Brissart, Hélène; Hossu, Gabriela; Jacques, Corentin; Loftus, David; Colnat-Coulbois, Sophie; Stigliani, Anthony; Barnett, Michael A.; Grill-Spector, Kalanit; Rossion, Bruno

    2016-01-01

    Human face perception requires a network of brain regions distributed throughout the occipital and temporal lobes with a right hemisphere advantage. Present theories consider this network as either a processing hierarchy beginning with the inferior occipital gyrus (occipital face area; IOG-faces/OFA) or a multiple-route network with nonhierarchical components. The former predicts that removing IOG-faces/OFA will detrimentally affect downstream stages, whereas the latter does not. We tested this prediction in a human patient (Patient S.P.) requiring removal of the right inferior occipital cortex, including IOG-faces/OFA. We acquired multiple fMRI measurements in Patient S.P. before and after a preplanned surgery and multiple measurements in typical controls, enabling both within-subject/across-session comparisons (Patient S.P. before resection vs Patient S.P. after resection) and between-subject/across-session comparisons (Patient S.P. vs controls). We found that the spatial topology and selectivity of downstream ipsilateral face-selective regions were stable 1 and 8 month(s) after surgery. Additionally, the reliability of distributed patterns of face selectivity in Patient S.P. before versus after resection was not different from across-session reliability in controls. Nevertheless, postoperatively, representations of visual space were typical in dorsal face-selective regions but atypical in ventral face-selective regions and V1 of the resected hemisphere. Diffusion weighted imaging in Patient S.P. and controls identifies white matter tracts connecting retinotopic areas to downstream face-selective regions, which may contribute to the stable and plastic features of the face network in Patient S.P. after surgery. Together, our results support a multiple-route network of face processing with nonhierarchical components and shed light on stable and plastic features of high-level visual cortex following focal brain damage. SIGNIFICANCE STATEMENT Brain networks consist of interconnected functional regions commonly organized in processing hierarchies. Prevailing theories predict that damage to the input of the hierarchy will detrimentally affect later stages. We tested this prediction with multiple brain measurements in a rare human patient requiring surgical removal of the putative input to a network processing faces. Surprisingly, the spatial topology and selectivity of downstream face-selective regions are stable after surgery. Nevertheless, representations of visual space were typical in dorsal face-selective regions but atypical in ventral face-selective regions and V1. White matter connections from outside the face network may support these stable and plastic features. As processing hierarchies are ubiquitous in biological and nonbiological systems, our results have pervasive implications for understanding the construction of resilient networks. PMID:27511014

  7. Improvements in surface singularity analysis and design methods. [applicable to airfoils

    NASA Technical Reports Server (NTRS)

    Bristow, D. R.

    1979-01-01

    The coupling of the combined source vortex distribution of Green's potential flow function with contemporary numerical techniques is shown to provide accurate, efficient, and stable solutions to subsonic inviscid analysis and design problems for multi-element airfoils. The analysis problem is solved by direct calculation of the surface singularity distribution required to satisfy the flow tangency boundary condition. The design or inverse problem is solved by an iteration process. In this process, the geometry and the associated pressure distribution are iterated until the pressure distribution most nearly corresponding to the prescribed design distribution is obtained. Typically, five iteration cycles are required for convergence. A description of the analysis and design method is presented, along with supporting examples.
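
    The design iteration described here can be pictured with a toy one-parameter analogue: adjust a geometry parameter until the computed pressure matches the prescribed value. The linear "analysis" function below is a hypothetical stand-in for the surface-singularity solution, not the method itself.

    # Toy sketch of an inverse-design iteration (invented one-parameter
    # "solver"; the real method iterates full airfoil geometries).
    def pressure(thickness):           # hypothetical analysis: Cp vs thickness
        return -2.0 * thickness

    target_cp = -0.3
    thickness = 0.1
    for cycle in range(5):             # the abstract reports ~5 cycles
        cp = pressure(thickness)
        thickness += 0.4 * (cp - target_cp)    # relaxed correction step
        print(cycle, round(thickness, 4), round(cp, 3))
    # thickness converges toward 0.15, where Cp matches the target of -0.3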

  8. Ultra-low roughness magneto-rheological finishing for EUV mask substrates

    NASA Astrophysics Data System (ADS)

    Dumas, Paul; Jenkins, Richard; McFee, Chuck; Kadaksham, Arun J.; Balachandran, Dave K.; Teki, Ranganath

    2013-09-01

    EUV mask substrates, made of titania-doped fused silica, ideally require sub-Angstrom surface roughness, sub-30 nm flatness, and no bumps/pits larger than 1 nm in height/depth. To achieve the above specifications, substrates must undergo iterative global and local polishing processes. Magnetorheological finishing (MRF) is a local polishing technique which can accurately and deterministically correct substrate figure, but it typically results in a higher surface roughness than the current requirements for EUV substrates. We describe a new super-fine MRF® polishing fluid which is able to meet both flatness and roughness specifications for EUV mask blanks. This eases the burden on the subsequent global polishing process by decreasing the polishing time, and hence the defectivity and extent of figure distortion.

  9. Laser displacement sensor to monitor the layup process of composite laminate production

    NASA Astrophysics Data System (ADS)

    Miesen, Nick; Groves, Roger M.; Sinke, Jos; Benedictus, Rinze

    2013-04-01

    Several types of flaw can occur during the layup process of prepreg composite laminates. Quality control after the production process checks the end product by testing the specimens for flaws introduced during the layup or curing process; however, by then these flaws are already irreversibly embedded in the laminate. This paper demonstrates the use of a laser displacement sensor technique applied during the layup process of prepreg laminates for in-situ detection of typical flaws that can occur during composite production. An incorrect number of layers and fibre wrinkling are dominant flaws during the layup process. These and other dominant flaws have been modeled to determine the requirements for in-situ monitoring during the layup process of prepreg laminates.

  10. Flight Systems Integration and Test

    NASA Technical Reports Server (NTRS)

    Wright, Michael R.

    2011-01-01

    Topics to be covered in this presentation are: (1) Integration and Test (I&T) Planning, (2) Integration and Test Flows, (3) Overview of Typical Mission I&T, (4) Supporting Elements, (5) Lessons Learned and Helpful Hints, (6) I&T Mishaps and Failures, (7) The Lighter Side of I&T, and (8) Small-Group Activity. This presentation highlights a typical NASA "in-house" I&T program: (1) for flight systems that are developed by NASA at a space flight center (like GSFC); (2) requirements are well-defined: qualification/acceptance, documentation, configuration management; (3) factors include precedents, human flight, risk aversion ("failure-phobia"), taxpayer dollars, and jobs; and (4) some differences exist among NASA centers, but it is generally a resource-intensive process.

  11. Process simulation during the design process makes the difference: process simulations applied to a traditional design.

    PubMed

    Traversari, Roberto; Goedhart, Rien; Schraagen, Jan Maarten

    2013-01-01

    The objective is the evaluation of a traditionally designed operating room using simulation of various surgical workflows. A literature search showed that there is no evidence for an optimal operating room layout regarding the position and size of an ultraclean ventilation (UCV) canopy with a separate preparation room for laying out instruments and in which patients are induced in the operating room itself. Neither was literature found reporting on process simulation being used for this application. Many technical guidelines and designs have mainly evolved over time, and there is no evidence on whether the proposed measures are also effective for the optimization of the layout for workflows. The study was conducted by applying observational techniques to simulated typical surgical procedures. Process simulations which included complete surgical teams and equipment required for the intervention were carried out for four typical interventions. Four observers used a form to record conflicts with the clean area boundaries and the height of the supply bridge. Preferences for particular layouts were discussed with the surgical team after each simulated procedure. We established that a clean area measuring 3 × 3 m and a supply bridge height of 2.05 m was satisfactory for most situations, provided a movable operation table is used. The only cases in which conflicts with the supply bridge were observed were during the use of a surgical robot (Da Vinci) and a surgical microscope. During multiple trauma interventions, bottlenecks regarding the dimensions of the clean area will probably arise. The process simulation of four typical interventions has led to significantly different operating room layouts than were arrived at through the traditional design process. Key words: evidence-based design, human factors, work environment, operating room, traditional design, process simulation, surgical workflows. Preferred citation: Traversari, R., Goedhart, R., & Schraagen, J. M. (2013). Process simulation during the design process makes the difference: Process simulations applied to a traditional design. Health Environments Research & Design Journal, 6(2), 58-76.

  12. Convergence and Extrusion Are Required for Normal Fusion of the Mammalian Secondary Palate

    PubMed Central

    Kim, Seungil; Lewis, Ace E.; Singh, Vivek; Ma, Xuefei; Adelstein, Robert; Bush, Jeffrey O.

    2015-01-01

    The fusion of two distinct prominences into one continuous structure is common during development and typically requires integration of two epithelia and subsequent removal of that intervening epithelium. Using confocal live imaging, we directly observed the cellular processes underlying tissue fusion, using the secondary palatal shelves as a model. We find that convergence of a multi-layered epithelium into a single-layer epithelium is an essential early step, driven by cell intercalation, and is concurrent to orthogonal cell displacement and epithelial cell extrusion. Functional studies in mice indicate that this process requires an actomyosin contractility pathway involving Rho kinase (ROCK) and myosin light chain kinase (MLCK), culminating in the activation of non-muscle myosin IIA (NMIIA). Together, these data indicate that actomyosin contractility drives cell intercalation and cell extrusion during palate fusion and suggest a general mechanism for tissue fusion in development. PMID:25848986

  13. Semantic Service Matchmaking in the ATM Domain Considering Infrastructure Capability Constraints

    NASA Astrophysics Data System (ADS)

    Moser, Thomas; Mordinyi, Richard; Sunindyo, Wikan Danar; Biffl, Stefan

    In a service-oriented environment business processes flexibly build on software services provided by systems in a network. A key design challenge is the semantic matchmaking of business processes and software services in two steps: 1. Find for one business process the software services that meet or exceed the BP requirements; 2. Find for all business processes the software services that can be implemented within the capability constraints of the underlying network, which poses a major problem since even for small scenarios the solution space is typically very large. In this chapter we analyze requirements from mission-critical business processes in the Air Traffic Management (ATM) domain and introduce an approach for semi-automatic semantic matchmaking for software services, the “System-Wide Information Sharing” (SWIS) business process integration framework. A tool-supported semantic matchmaking process like SWIS can provide system designers and integrators with a set of promising software service candidates and therefore strongly reduces the human matching effort by focusing on a much smaller space of matchmaking candidates. We evaluate the feasibility of the SWIS approach in an industry use case from the ATM domain.
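
    A minimal sketch of the two matchmaking steps, with invented data structures: step 1 keeps the services that meet each business process's requirements, and step 2 checks a simple network-capacity constraint. This illustrates the idea only; the actual SWIS matchmaking is semantic and far richer.

    # Toy two-step matchmaking sketch (invented requirements, capabilities,
    # and constraint; not the SWIS framework itself).
    processes = {"surveillance": {"latency_ms": 100, "bandwidth_kbps": 64}}
    services = {"svcA": {"latency_ms": 50, "bandwidth_kbps": 64},
                "svcB": {"latency_ms": 200, "bandwidth_kbps": 32}}
    network_capacity_kbps = 128

    def meets(req, cap):
        return (cap["latency_ms"] <= req["latency_ms"]
                and cap["bandwidth_kbps"] >= req["bandwidth_kbps"])

    # Step 1: candidate services per business process
    candidates = {bp: [s for s, cap in services.items() if meets(req, cap)]
                  for bp, req in processes.items()}

    # Step 2: pick one candidate per process and check network capacity
    assignment = {bp: cands[0] for bp, cands in candidates.items() if cands}
    total = sum(services[s]["bandwidth_kbps"] for s in assignment.values())
    print(assignment, "fits network:", total <= network_capacity_kbps)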

  14. Advanced Survivable Radiator Development Program

    DTIC Science & Technology

    1993-03-01

    [Snippet residue from the report.] The ceramic fiber, produced by a pyrolytic process, is amorphous with a typical elemental composition of 57% silicon, 28% nitrogen, 10% carbon, and 4% oxygen... Optimum choice is dependent on mission, operational requirements, and threat environment. Configurations: fibers, rods, fins; carbon-carbon... carbon fiber area density... stainless, aluminum, Be, diamond, BC bond, copper. Weave: Be, Al, SS, Ti, Nitinol. Configurations: low Z or high Z, depending...

  15. Resource Production on the Moon

    NASA Technical Reports Server (NTRS)

    Landis, Geoffrey A.

    2014-01-01

    A self-sustaining settlement on the moon, or on other airless bodies such as asteroids, will require the ability to refine desired raw materials from available resources, such as lunar or asteroidal regolith. This work will focus on the example case of production from lunar regolith. The same process sequences could be used at other locations. Stony asteroids typically have regolith similar to that of the moon, and refining of asteroidal material could use the same techniques, adapted for microgravity. Likewise, Martian rock and soil could also be processed by the techniques discussed here.

  16. Avionics System Architecture for the NASA Orion Vehicle

    NASA Technical Reports Server (NTRS)

    Baggerman, Clint; McCabe, Mary; Verma, Dinesh

    2009-01-01

    It has been 30 years since the National Aeronautics and Space Administration (NASA) last developed a crewed spacecraft capable of launch, on-orbit operations, and landing. During that time, aerospace avionics technologies have greatly advanced in capability, and these technologies have enabled integrated avionics architectures for aerospace applications. The inception of NASA's Orion Crew Exploration Vehicle (CEV) spacecraft offers the opportunity to leverage the latest integrated avionics technologies into crewed space vehicle architecture. The outstanding question is to what extent to implement these advances in avionics while still meeting the unique crewed spaceflight requirements for safety, reliability and maintainability. Historically, aircraft and spacecraft have very similar avionics requirements. Both aircraft and spacecraft must have high reliability. They also must have as much computing power as possible and provide low latency between user control and effector response while minimizing weight, volume, and power. However, there are several key differences between aircraft and spacecraft avionics. Typically, the overall spacecraft operational time is much shorter than aircraft operation time, but the typical mission time (and hence, the time between preventive maintenance) is longer for a spacecraft than an aircraft. Also, the radiation environment is typically more severe for spacecraft than aircraft. A "loss of mission" scenario (i.e., the mission is not a success, but there are no casualties) arguably has a greater impact on a multi-million dollar spaceflight mission than on a typical commercial flight. Such differences need to be weighed when determining if an aircraft-like integrated modular avionics (IMA) system is suitable for a crewed spacecraft. This paper will explore the preliminary design process of the Orion vehicle avionics system by first identifying the Orion driving requirements and the differences between Orion requirements and those of other previous crewed spacecraft avionics systems. Common systems engineering methods will be used to evaluate the value propositions, or the factors that weigh most heavily in design consideration, of Orion and other aerospace systems. Then, the current Orion avionics architecture will be presented and evaluated.

  17. Thermal Remote Sensing with Uav-Based Workflows

    NASA Astrophysics Data System (ADS)

    Boesch, R.

    2017-08-01

    Climate change will have a significant influence on vegetation health and growth. Predictions of higher mean summer temperatures and prolonged summer droughts may pose a threat to agricultural areas and forest canopies. Rising canopy temperatures can be an indicator of plant stress because of the closure of stomata and a decrease in the transpiration rate. Thermal cameras have been available for decades, but they are still often used only for single-image analysis, in an oblique-view manner, or with visual evaluations of video sequences. Remote sensing with a thermal camera can therefore be an important data source for understanding transpiration processes. Photogrammetric workflows make it possible to process thermal images much like RGB data, but the low spatial resolution of thermal cameras, significant optical distortion, and typically low contrast require an adapted workflow. The temperature distribution in forest canopies is typically completely unknown and less distinct than for urban or industrial areas, where metal constructions and surfaces yield high contrast and sharp edge information. The aim of this paper is to investigate the influence of interior camera orientation, tie-point matching, and ground control points on the resulting accuracy of bundle adjustment and dense cloud generation with a typically used photogrammetric workflow for UAV-based thermal imagery in natural environments.

  18. Integrating Safety and Mission Assurance into Systems Engineering Modeling Practices

    NASA Technical Reports Server (NTRS)

    Beckman, Sean; Darpel, Scott

    2015-01-01

    During the early development of products, flight, or experimental hardware, emphasis is often given to the identification of technical requirements, utilizing such tools as use case and activity diagrams. Designers and project teams focus on understanding physical and performance demands and challenges. It is typically only later, during the evaluation of preliminary designs, that a first pass, if performed at all, is made to determine the process, safety, and mission quality assurance requirements. Evaluation early in the life cycle, though, can yield requirements that force a fundamental change in design. This paper discusses an alternate paradigm for using the concepts of use case or activity diagrams to identify safety hazard and mission quality assurance risks and concerns using the same systems engineering modeling tools being used to identify technical requirements. It contains two examples of how this process might be used, in the development of a space flight experiment and in the design of a Human Powered Pizza Delivery Vehicle, along with the potential benefits of decreased development time and stronger budget estimates.

  19. Simplified power processing for ion-thruster subsystems

    NASA Technical Reports Server (NTRS)

    Wessel, F. J.; Hancock, D. J.

    1983-01-01

    A design for a greatly simplified power-processing unit (SPPU) for the 8-cm diameter mercury-ion-thruster subsystem is discussed. This SPPU design will provide a tenfold reduction in parts count, a decrease in system mass and cost, and an increase in system reliability compared to the existing power-processing unit (PPU) used in the Hughes/NASA Lewis Research Center Ion Auxiliary Propulsion Subsystem. The simplifications achieved in this design will greatly increase the attractiveness of ion propulsion in near-term and future spacecraft propulsion applications. A description of a typical ion-thruster subsystem is given. An overview of the thruster/power-processor interface requirements is given. Simplified thruster power processing is discussed.

  20. Fuzzy logic particle tracking velocimetry

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.

    1993-01-01

    Fuzzy logic has proven to be a simple and robust method for process control. Instead of requiring a complex model of the system, a user-defined rule base is used to control the process. In this paper the principles of fuzzy logic control are applied to Particle Tracking Velocimetry (PTV). Two frames of digitally recorded, single-exposure particle imagery are used as input. The fuzzy processor uses the local particle displacement information to determine the correct particle tracks. Fuzzy PTV is an improvement over traditional PTV techniques, which typically require a sequence of more than two image frames to track particles accurately. The fuzzy processor executes in software on a PC without the use of specialized array or fuzzy logic processors. A pair of sample input images with roughly 300 particle images each results in more than 200 velocity vectors in under 8 seconds of processing time.
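
    The record describes the matching rule only at a high level. As an illustrative sketch (not Wernet's implementation), the core idea can be written as scoring each candidate pairing with a fuzzy membership function on how well its displacement agrees with a consensus flow; the function and parameter names below are hypothetical, and a real fuzzy PTV processor would use local rather than global consensus and a full rule base.

    ```python
    import numpy as np

    def triangular(x, half_width):
        # Triangular fuzzy membership: 1.0 at x = 0, falling to 0 at |x| = half_width.
        return np.maximum(0.0, 1.0 - np.abs(x) / half_width)

    def fuzzy_match(frame_a, frame_b, search_radius=10.0):
        """frame_a, frame_b: (N, 2) particle centroids from two single-exposure frames.
        Returns a list of (start_position, end_position) track pairs."""
        # First pass: nearest-neighbor displacements give a crude flow estimate.
        nn_disp = []
        for pa in frame_a:
            d = frame_b - pa
            r = np.linalg.norm(d, axis=1)
            if r.min() < search_radius:
                nn_disp.append(d[r.argmin()])
        mean_disp = np.mean(nn_disp, axis=0)  # consensus displacement

        # Second pass: fuzzy-score every candidate within the search radius;
        # membership is high when a candidate matches the consensus flow.
        tracks = []
        for pa in frame_a:
            d = frame_b - pa
            r = np.linalg.norm(d, axis=1)
            cand = np.where(r < search_radius)[0]
            if cand.size == 0:
                continue
            score = triangular(np.linalg.norm(d[cand] - mean_disp, axis=1),
                               half_width=search_radius)
            tracks.append((pa, frame_b[cand[score.argmax()]]))
        return tracks
    ```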

  1. Mars Atmospheric Capture and Gas Separation

    NASA Technical Reports Server (NTRS)

    Muscatello, Anthony; Santiago-Maldonado, Edgardo; Gibson, Tracy; Devor, Robert; Captain, James

    2011-01-01

    The Mars atmospheric capture and gas separation project is selecting, developing, and demonstrating techniques to capture and purify Martian atmospheric gases for their utilization in the production of hydrocarbons, oxygen, and water in ISRU systems. Trace gases will need to be separated from Martian atmospheric gases to provide pure CO2 to processing elements. In addition, other Martian gases, such as nitrogen and argon, occur in concentrations high enough to be useful as buffer gas and should be captured as well. To achieve these goals, highly efficient gas separation processes will be required. These gas separation techniques are also required across various areas within the ISRU project to support various consumable production processes. The development of innovative gas separation techniques will evaluate the current state of the art for the gas separation required, with the objective to develop and demonstrate lightweight, low-power methods for gas separation. Gas separation requirements include, but are not limited to, the selective separation of: (1) methane and water from unreacted carbon oxides (CO2-CO) and hydrogen typical of a Sabatier-type process, (2) carbon oxides and water from unreacted hydrogen from a Reverse Water-Gas Shift process, (3) carbon oxides from oxygen from a trash/waste processing reaction, and (4) helium from hydrogen or oxygen from a propellant scavenging process. Potential technologies for the separations include freezers, selective membranes, selective solvents, polymeric sorbents, zeolites, and new technologies. This paper and presentation will summarize the results of an extensive literature review and laboratory evaluations of candidate technologies for the capture and separation of CO2 and other relevant gases.
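
    For reference, the two reaction chemistries named in items (1) and (2) have the following standard stoichiometries (textbook chemistry, stated here as background rather than quoted from the record):

    ```latex
    \text{Sabatier:}\quad \mathrm{CO_2 + 4\,H_2 \longrightarrow CH_4 + 2\,H_2O}
    \qquad
    \text{Reverse water-gas shift:}\quad \mathrm{CO_2 + H_2 \longrightarrow CO + H_2O}
    ```

    The Sabatier product stream is why item (1) pairs methane and water against unreacted CO2-CO and hydrogen, and the RWGS product stream is why item (2) pairs carbon oxides and water against unreacted hydrogen.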

  2. On the contribution of unconscious processes to recognition memory.

    PubMed

    Cleary, Anne M

    2012-01-01

    Voss et al. review work showing unconscious contributions to recognition memory. An electrophysiological effect, the N300, appears to signify an unconscious recognition process. Whether such unconscious recognition requires highly specific experimental circumstances or can occur in typical types of recognition testing situations has remained a question. The fact that the N300 has also been shown to be the sole electrophysiological correlate of the recognition-without-identification effect that occurs with visual word fragments suggests that unconscious processes may contribute to a wider range of recognition testing situations than those originally investigated by Voss and colleagues. Some implications of this possibility are discussed.

  3. Considerations of technology transfer barriers in the modification of strategic superalloys for aircraft turbine engines

    NASA Technical Reports Server (NTRS)

    Stephens, J. R.; Tien, J. K.

    1983-01-01

    A typical innovation-to-commercialization process for the development of a new hot section gas turbine material requires one to two decades with attendant costs in the tens of millions of dollars. This transfer process is examined to determine the potential rate-controlling steps for introduction of future low strategic metal content alloys or processes. Case studies are used to highlight the barriers to commercialization as well as to identify the means by which these barriers can be surmounted. The opportunities for continuing joint government-university-industry partnerships in planning and conducting strategic materials R&D programs are also discussed.

  4. Design tool for multiprocessor scheduling and evaluation of iterative dataflow algorithms

    NASA Technical Reports Server (NTRS)

    Jones, Robert L., III

    1995-01-01

    A graph-theoretic design process and software tool is defined for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described with a dataflow graph and are intended to be executed repetitively on a set of identical processors. Typical applications include signal processing and control law problems. Graph-search algorithms and analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool applies the design process to a given problem and includes performance optimization through the inclusion of additional precedence constraints among the schedulable tasks.
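
    The record does not reproduce the analysis, but the kind of performance bound it refers to for repetitively executed dataflow graphs is, in its classical form (stated here as background, not quoted from the paper), the iteration-period bound set by the graph's directed loops:

    ```latex
    T_{\min} \;=\; \max_{l \,\in\, \mathrm{loops}} \; \frac{\sum_{v \in l} t(v)}{d(l)}
    ```

    where t(v) is the execution time of task v and d(l) is the number of inter-iteration delays on loop l; no schedule, on any number of processors, can repeat the graph faster than this bound.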

  5. Jet fuels from synthetic crudes

    NASA Technical Reports Server (NTRS)

    Antoine, A. C.; Gallagher, J. P.

    1977-01-01

    An investigation was conducted to determine the technical problems in the conversion of a significant portion of a barrel of either a shale oil or a coal synthetic crude oil into a suitable aviation turbine fuel. Three syncrudes were used, one from shale and two from coal, chosen as representative of typical crudes from future commercial production. The material was used to produce jet fuels of varying specifications by distillation, hydrotreating, and hydrocracking. Attention is given to process requirements, hydrotreating process conditions, the methods used to analyze the final products, the conditions for shale oil processing, and the coal liquid processing conditions. The results of the investigation show that jet fuels of defined specifications can be made from oil shale and coal syncrudes using readily available commercial processes.

  6. The MiniCell(TM) irradiator: A new system for a new market

    NASA Astrophysics Data System (ADS)

    Clouser, James F.; Beers, Eric W.

    1998-06-01

    Since the commissioning of the first industrial gamma irradiator design, designers and operators of irradiation systems have been attempting to meet the specific production requirements and challenges presented to them. This objective has resulted in many different versions of irradiators currently in service today, all of which had original charters and many of which still perform very well even within the new requirements of this industry. Continuing changes in the marketplace have, however, placed pressures on existing designs due to a combination of changing dose requirements for sterilization, increased economic pressures from the specific industries served for both time and location, and the increasing variety of product types requiring processing. Additionally, certain market areas that could never economically support a typical gamma processing facility have either not been serviced or have forced potential gamma users to transport product long distances to one of these existing facilities. The MiniCell(TM) removes many of the traditional barriers previously accepted in the radiation processing industry for building a processing facility in a location. Its reduced size and cost have allowed many potential users to consider in-house processing, and its ability to be quickly assembled allows it to meet market needs in a much more timely fashion than previous designs. The MiniCell system can cost-effectively meet many of the current market needs of reducing total cost of processing and is also flexible enough to process product in a wide range of industries effectively.

  7. The NASA Commercial Crew Program (CCP) Mission Assurance Process

    NASA Technical Reports Server (NTRS)

    Canfield, Amy

    2016-01-01

    In 2010, NASA established the Commercial Crew Program in order to provide human access to the International Space Station and low Earth orbit via the commercial (non-governmental) sector. A particular challenge for NASA has been how to determine that a commercial provider's transportation system complies with Programmatic safety requirements. The process used in this determination is the Safety Technical Review Board, which reviews and approves provider-submitted Hazard Reports. One significant product of the review is a set of hazard control verifications. In past NASA programs, 100 percent of these safety-critical verifications were typically confirmed by NASA. The traditional Safety and Mission Assurance (SMA) model does not support the nature of the Commercial Crew Program. To that end, NASA SMA is implementing a Risk Based Assurance (RBA) process to determine which hazard control verifications require NASA authentication. Additionally, a Shared Assurance Model is also being developed to efficiently use the available resources to execute the verifications. This paper will describe the evolution of the CCP Mission Assurance process from the beginning of the Program to its current incarnation. Topics to be covered include a short history of the CCP; the development of the Programmatic mission assurance requirements; the current safety review process; and a description of the RBA process and its products, ending with a description of the Shared Assurance Model.

  8. r-process nucleosynthesis in dynamic helium-burning environments

    NASA Technical Reports Server (NTRS)

    Cowan, J. J.; Cameron, A. G. W.; Truran, J. W.

    1985-01-01

    The results of an extended examination of r-process nucleosynthesis in helium-burning environments are presented. Using newly calculated nuclear rates, dynamical r-process calculations have been made of thermal runaways in helium cores typical of low-mass stars and in the helium zones of stars undergoing supernova explosions. These calculations show that, for a sufficient flux of neutrons produced by the C-13 neutron source, r-process nuclei in solar proportions can be produced. The conditions required for r-process production are found to be 10^20-10^21 neutrons per cubic centimeter for times of 0.01-0.1 s and neutron number densities in excess of 10^19 per cubic centimeter for times of about 1 s. The amount of C-13 required is found to be exceedingly high - larger than is found to occur in any current stellar evolutionary model. It is thus unlikely that these helium-burning environments are responsible for producing the bulk of the r-process elements seen in the solar system.

  9. Remote sensing requirements as suggested by watershed model sensitivity analyses

    NASA Technical Reports Server (NTRS)

    Salomonson, V. V.; Rango, A.; Ormsby, J. P.; Ambaruch, R.

    1975-01-01

    A continuous simulation watershed model has been used to perform sensitivity analyses that provide guidance in defining remote sensing requirements for the monitoring of watershed features and processes. The results show that out of 26 input parameters having meaningful effects on simulated runoff, six appear to be obtainable with existing remote sensing techniques. Of these six parameters, three require the measurement of the areal extent of surface features (impervious areas, water bodies, and the extent of forested area), two require the discrimination of land use that can be related to the overland flow roughness coefficient or to the density of vegetation so as to estimate the magnitude of precipitation interception, and one requires the measurement of distance to get the length over which overland flow typically occurs. Observational goals are also suggested for monitoring such fundamental watershed processes as precipitation, soil moisture, and evapotranspiration. A case study on the Patuxent River in Maryland shows that runoff simulation is improved if recent satellite land use observations are used as model inputs as opposed to less timely topographic map information.

  10. The components of working memory updating: an experimental decomposition and individual differences.

    PubMed

    Ecker, Ullrich K H; Lewandowsky, Stephan; Oberauer, Klaus; Chee, Abby E H

    2010-01-01

    Working memory updating (WMU) has been identified as a cognitive function of prime importance for everyday tasks and has also been found to be a significant predictor of higher mental abilities. Yet, little is known about the constituent processes of WMU. We suggest that operations required in a typical WMU task can be decomposed into 3 major component processes: retrieval, transformation, and substitution. We report a large-scale experiment that instantiated all possible combinations of those 3 component processes. Results show that the 3 components make independent contributions to updating performance. We additionally present structural equation models that link WMU task performance and working memory capacity (WMC) measures. These feature the methodological advancement of estimating interindividual covariation and experimental effects on mean updating measures simultaneously. The modeling results imply that WMC is a strong predictor of WMU skills in general, although some component processes-in particular, substitution skills-were independent of WMC. Hence, the reported predictive power of WMU measures may rely largely on common WM functions also measured in typical WMC tasks, although substitution skills may make an independent contribution to predicting higher mental abilities. (PsycINFO Database Record (c) 2009 APA, all rights reserved).

  11. Words with and without internal structure: what determines the nature of orthographic and morphological processing?

    PubMed Central

    Velan, Hadas; Frost, Ram

    2010-01-01

    Recent studies suggest that basic effects which are markers of visual word recognition in Indo-European languages cannot be obtained in Hebrew or in Arabic. Although Hebrew has an alphabetic writing system, just like English, French, or Spanish, a series of studies has consistently suggested that simple form-orthographic priming and letter-transposition priming are not found in Hebrew. In four experiments, we tested the hypothesis that this is due to the fact that Semitic words have an underlying structure that constrains the possible alignment of phonemes and their respective letters. The experiments contrasted typical Semitic words, which are root-derived, with Hebrew words of non-Semitic origin, which are morphologically simple and resemble base words in European languages. Using RSVP, TL priming, and form-priming manipulations, we show that Hebrew readers process morphologically simple Hebrew words similarly to the way they process English words. These words indeed reveal the typical form-priming and TL priming effects reported in European languages. In contrast, words with internal structure are processed differently and require a different code for lexical access. We discuss the implications of these findings for current models of visual word recognition. PMID:21163472

  12. Group Contribution Methods for Phase Equilibrium Calculations.

    PubMed

    Gmehling, Jürgen; Constantinescu, Dana; Schmid, Bastian

    2015-01-01

    The development and design of chemical processes are carried out by solving the balance equations of a mathematical model for sections of, or the whole of, a chemical plant with the help of process simulators. For process simulation, besides kinetic data for the chemical reaction, various pure component and mixture properties are required. Because of the great importance of separation processes for a chemical plant in particular, reliable knowledge of the phase equilibrium behavior is required. The phase equilibrium behavior can be calculated with the help of modern equations of state or g(E)-models using only binary parameters. Unfortunately, only a very small part of the experimental data needed for fitting the required binary model parameters is available, so very often these models cannot be applied directly. To solve this problem, powerful predictive thermodynamic models have been developed. Group contribution methods allow the prediction of the required phase equilibrium data using only a limited number of group interaction parameters. A prerequisite for fitting the required group interaction parameters is a comprehensive database. That is why, for the development of powerful group contribution methods, almost all published pure component properties, phase equilibrium data, excess properties, etc., were stored in computerized form in the Dortmund Data Bank. In this review, the present status, weaknesses, advantages and disadvantages, possible applications, and typical results of the different group contribution methods for the calculation of phase equilibria are presented.
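
    As background (standard thermodynamics rather than a quotation from the review), the place of a g(E)-model in a phase equilibrium calculation is visible in the low-pressure vapor-liquid equilibrium relation, where a group contribution method such as UNIFAC supplies the activity coefficients from group interaction parameters:

    ```latex
    x_i \,\gamma_i\, P_i^{\mathrm{sat}} \;=\; y_i\, P,
    \qquad
    \frac{g^{E}}{RT} \;=\; \sum_i x_i \ln \gamma_i
    ```

    Here x_i and y_i are the liquid and vapor mole fractions, gamma_i the activity coefficient, P_i^sat the pure-component vapor pressure, and P the system pressure.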

  13. Coherent diffractive imaging of time-evolving samples with improved temporal resolution

    DOE PAGES

    Ulvestad, A.; Tripathi, A.; Hruszkewycz, S. O.; ...

    2016-05-19

    Bragg coherent x-ray diffractive imaging is a powerful technique for investigating dynamic nanoscale processes in nanoparticles immersed in reactive, realistic environments. Its temporal resolution is limited, however, by the oversampling requirements of three-dimensional phase retrieval. Here, we show that incorporating the entire measurement time series, which is typically a continuous physical process, into phase retrieval allows the oversampling requirement at each time step to be reduced, leading to an improvement in the temporal resolution by a factor of 2-20. The increased time resolution will allow imaging of faster dynamics and of radiation-dose-sensitive samples. Furthermore, this approach, which we call "chrono CDI," may find use in improving the time resolution in other imaging techniques.

  14. A hybrid life cycle inventory of nano-scale semiconductor manufacturing.

    PubMed

    Krishnan, Nikhil; Boyd, Sarah; Somani, Ajay; Raoux, Sebastien; Clark, Daniel; Dornfeld, David

    2008-04-15

    The manufacturing of modern semiconductor devices involves a complex set of nanoscale fabrication processes that are energy and resource intensive, and generate significant waste. It is important to understand and reduce the environmental impacts of semiconductor manufacturing because these devices are ubiquitous components in electronics. Furthermore, the fabrication processes used in the semiconductor industry are finding increasing application in other products, such as microelectromechanical systems (MEMS), flat panel displays, and photovoltaics. In this work we develop a library of typical gate-to-gate materials and energy requirements, as well as emissions associated with a complete set of fabrication process models used in manufacturing a modern microprocessor. In addition, we evaluate upstream energy requirements associated with chemicals and materials using both existing process life cycle assessment (LCA) databases and an economic input-output (EIO) model. The result is a comprehensive data set and methodology that may be used to estimate and improve the environmental performance of a broad range of electronics and other emerging applications that involve nano and micro fabrication.

  15. An Approach for Implementation of Project Management Information Systems

    NASA Astrophysics Data System (ADS)

    Běrziša, Solvita; Grabis, Jānis

    Project management is governed by project management methodologies, standards, and other regulatory requirements. This chapter proposes an approach for implementing and configuring project management information systems according to requirements defined by these methodologies. The approach uses a project management specification framework to describe project management methodologies in a standardized manner. This specification is used to automatically configure the project management information system by applying appropriate transformation mechanisms. Development of the standardized framework is based on an analysis of typical project management concepts and processes and of existing XML-based representations of project management. A demonstration example of a project management information system's configuration is provided.

  16. Strategic Adaptation of SCA for STRS

    NASA Technical Reports Server (NTRS)

    Quinn, Todd; Kacpura, Thomas

    2007-01-01

    The Space Telecommunication Radio System (STRS) architecture is being developed to provide a standard framework for future NASA space radios with greater degrees of interoperability and flexibility to meet new mission requirements. The space environment imposes unique operational requirements with restrictive size, weight, and power constraints that are significantly smaller than terrestrial-based military communication systems. With the harsh radiation environment of space, the computing and processing resources are typically one or two generations behind current terrestrial technologies. Despite these differences, there are elements of the SCA that can be adapted to facilitate the design and implementation of the STRS architecture.

  17. Subsetting Tools for Enabling Easy Access to International Airborne Chemistry Data

    NASA Astrophysics Data System (ADS)

    Northup, E. A.; Chen, G.; Quam, B. M.; Beach, A. L., III; Silverman, M. L.; Early, A. B.

    2017-12-01

    In response to the Research Opportunities in Earth and Space Science (ROSES) 2015 release announcement for Advancing Collaborative Connections for Earth System Science (ACCESS), researchers at NASA Langley Research Center (LaRC) proposed to extend the capabilities of the existing Toolsets for Airborne Data (TAD) to include subsetting functionality to allow for easier access to international airborne field campaign data. Airborne field studies are commonly used to gain a detailed understanding of atmospheric processes for scientific research on international climate change and air quality issues. To accommodate the rigorous process for manipulating airborne field study chemistry data, and to lessen barriers for researchers, TAD was created with the ability to geolocate data from various sources measured on different time scales from a single flight. The analysis of airborne chemistry data typically requires data subsetting, which can be challenging and resource-intensive for end users. In an effort to streamline this process, new data subsetting features and updates to the current database model will be added to the TAD toolset. These will include two subsetters: temporal and spatial, and vertical profile. The temporal and spatial subsetter will allow users to both focus on data from a specific location and/or time period. The vertical profile subsetter will retrieve data collected during an individual aircraft ascent or descent spiral. These new web-based tools will allow for automation of the typically labor-intensive manual data subsetting process, which will provide users with data tailored to their specific research interests. The system has been designed to allow for new in-situ airborne missions to be added as they become available, with only minor pre-processing required. The development of these enhancements will be discussed in this presentation.
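
    As a minimal sketch of what the two subsetters do (illustrative only; the column names and function signatures below are hypothetical, not the actual TAD schema or API):

    ```python
    import numpy as np
    import pandas as pd

    def subset_temporal_spatial(df, t_start, t_end, lat_bounds, lon_bounds):
        """Keep measurements inside a time window and a lat/lon bounding box."""
        mask = (df["time"].between(t_start, t_end)
                & df["lat"].between(*lat_bounds)
                & df["lon"].between(*lon_bounds))
        return df[mask]

    def subset_vertical_profiles(df, alt_col="alt_m", min_span=2000.0):
        """Crude vertical-profile extraction: split the flight into runs of
        monotonic ascent/descent and keep those spanning >= min_span meters."""
        direction = np.sign(df[alt_col].diff().fillna(0.0))
        run_id = (direction != direction.shift()).cumsum()
        return [seg for _, seg in df.groupby(run_id)
                if seg[alt_col].max() - seg[alt_col].min() >= min_span]
    ```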

  18. Rectenna System Design. [energy conversion solar power satellites

    NASA Technical Reports Server (NTRS)

    Woodcock, G. R.; Andryczyk, R. W.

    1980-01-01

    The fundamental processes involved in the operation of the rectenna system designed for the solar power satellite system are described. The basic design choices are presented based on the desired microwave RF field concentration prior to rectification and on the ground clearance requirements for the rectenna structure. A nonconcentrating inclined planar panel with a 2-meter minimum clearance configuration is selected as representative of the typical rectenna.

  19. Neurologic manifestations of electrolyte disturbances.

    PubMed

    Riggs, Jack E

    2002-02-01

    Electrolyte disturbances occur commonly and are associated with a variety of characteristic neurologic manifestations involving both the central and peripheral nervous systems. Electrolyte disturbances are essentially always secondary processes. Effective management requires identification and treatment of the underlying primary disorder. Since neurological symptoms of electrolyte disorders are generally functional rather than structural, the neurologic manifestations of electrolyte disturbances are typically reversible. The neurologic manifestations of serum sodium, potassium, calcium, and magnesium disturbances are reviewed.

  20. Embedded Figures Detection in Autism and Typical Development: Preliminary Evidence of a Double Dissociation in Relationships with Visual Search

    ERIC Educational Resources Information Center

    Jarrold, Christopher; Gilchrist, Iain D.; Bender, Alison

    2005-01-01

    Individuals with autism show relatively strong performance on tasks that require them to identify the constituent parts of a visual stimulus. This is assumed to be the result of a bias towards processing the local elements in a display that follows from a weakened ability to integrate information at the global level. The results of the current…

  1. Implementing ARFORGEN: Installation Capability and Feasibility Study of Meeting ARFORGEN Guidelines

    DTIC Science & Technology

    2007-07-26

    aligning troop requirements with the Army's new strategic mission, the force stabilization element of ARFORGEN was developed to raise the morale of... a discrete event simulation model developed for the project to mirror the reset process. The Unit Reset model is implemented in Java as a discrete... and transportation. Further, the typical installation support staff is manned by a Table of Distribution and Allowance (TDA) designed to

  2. Discrete event simulation and the resultant data storage system response in the operational mission environment of Jupiter-Saturn /Voyager/ spacecraft

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, A. K.

    1978-01-01

    The Data Storage Subsystem Simulator (DSSSIM) simulating (by ground software) occurrence of discrete events in the Voyager mission is described. Functional requirements for Data Storage Subsystems (DSS) simulation are discussed, and discrete event simulation/DSSSIM processing is covered. Four types of outputs associated with a typical DSSSIM run are presented, and DSSSIM limitations and constraints are outlined.

  3. Extreme temperature packaging: challenges and opportunities

    NASA Astrophysics Data System (ADS)

    Johnson, R. Wayne

    2016-05-01

    Consumer electronics account for the majority of electronics manufactured today. Given the temperature limits of humans, consumer electronics are typically rated for operation from -40°C to +85°C. Military applications extend the range to -65°C to +125°C while underhood automotive electronics may see +150°C. With the proliferation of the Internet of Things (IoT), the goal of instrumenting (sensing, computation, transmission) to improve safety and performance in high temperature environments such as geothermal wells, nuclear reactors, combustion chambers, industrial processes, etc. requires sensors, electronics and packaging compatible with these environments. Advances in wide bandgap semiconductors (SiC and GaN) allow the fabrication of high temperature compatible sensors and electronics. Integration and packaging of these devices is required for implementation into actual applications. The basic elements of packaging are die attach, electrical interconnection and the package or housing. Consumer electronics typically use conductive adhesives or low melting point solders for die attach, wire bonds or low melting solder for electrical interconnection and epoxy for the package. These materials melt or decompose in high temperature environments. This paper examines materials and processes for high temperature packaging including liquid transient phase and sintered nanoparticle die attach, high melting point wires for wire bonding and metal and ceramic packages. The limitations of currently available solutions will also be discussed.

  4. Atmospheric Capture On Mars (and Processing)

    NASA Technical Reports Server (NTRS)

    Muscatello, Tony

    2017-01-01

    The ultimate destination of NASA's human exploration program is Mars. In Situ Resource Utilization (ISRU) is a key technology required to enable such missions, as first proposed by Prof. Robert Ash in 1976. This presentation will review progress in the systems required to produce rocket propellant, oxygen, and other consumables on Mars using the carbon dioxide atmosphere and other potential resources. For many years, NASA, commercial companies, and academia have been developing and demonstrating techniques to capture and purify Martian atmospheric gases for the production of hydrocarbons, oxygen, and water in ISRU systems. Other gases will need to be separated from Martian atmospheric gases to provide pure CO2 for processing elements. Significant progress has been demonstrated in CO2 collection via adsorption by molecular sieves, freezing, and direct compression. Early-stage work on adsorption in ionic liquids followed by electrolysis to oxygen is also underway. In addition, other Martian gases, such as nitrogen and argon, occur in concentrations high enough to be useful as buffer gas and could be captured as well. Gas separation requirements include, but are not limited to, the selective separation of: (1) methane and water from unreacted carbon oxides (CO2-CO) and hydrogen typical of a Sabatier-type process, (2) carbon oxides and water from unreacted hydrogen from a Reverse Water-Gas Shift process, and (3) carbon oxides from oxygen from a trash/waste processing reaction.

  5. Multiobjective optimisation design for enterprise system operation in the case of scheduling problem with deteriorating jobs

    NASA Astrophysics Data System (ADS)

    Wang, Hongfeng; Fu, Yaping; Huang, Min; Wang, Junwei

    2016-03-01

    The operation process design is one of the key issues in the manufacturing and service sectors. As a typical operation process, scheduling with consideration of the deteriorating effect has been widely studied; however, the current literature has only studied a single function requirement and has rarely considered the multiple function requirements that are critical for a real-world scheduling process. In this article, two function requirements are involved in the design of a scheduling process with consideration of the deteriorating effect and are then formulated into two objectives of a mathematical programming model. A novel multiobjective evolutionary algorithm is proposed to solve this model with a combination of three strategies: a multiple population scheme, a rule-based local search method, and an elitist preserve strategy. To validate the proposed model and algorithm, a series of randomly generated instances are tested, and the experimental results indicate that the model is effective and that the proposed algorithm achieves satisfactory performance, outperforming other state-of-the-art multiobjective evolutionary algorithms, such as the nondominated sorting genetic algorithm II and the multiobjective evolutionary algorithm based on decomposition, on all the test instances.
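
    The heart of any such multiobjective comparison is the Pareto-dominance test used by nondominated sorting (as in NSGA-II). A minimal, self-contained sketch for two minimized objectives, such as makespan and total tardiness (illustrative; not the authors' code):

    ```python
    def dominates(a, b):
        """True if objective vector a is no worse than b in every objective
        and strictly better in at least one (all objectives minimized)."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def first_pareto_front(points):
        """Return the nondominated subset of a list of objective vectors."""
        return [p for p in points
                if not any(dominates(q, p) for q in points if q is not p)]

    print(first_pareto_front([(3, 9), (4, 5), (6, 4), (5, 6)]))
    # -> [(3, 9), (4, 5), (6, 4)]   ((5, 6) is dominated by (4, 5))
    ```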

  6. A two-dimensionally coincident second difference cosmic ray spike removal method for the fully automated processing of Raman spectra.

    PubMed

    Schulze, H Georg; Turner, Robin F B

    2014-01-01

    Charge-coupled device detectors are vulnerable to cosmic rays that can contaminate Raman spectra with positive-going spikes. Because spikes can adversely affect spectral processing and data analyses, they must be removed. Although both hardware-based and software-based spike removal methods exist, they typically require parameter and threshold specification dependent on well-considered user input. Here, we present a fully automated spike removal algorithm that proceeds without requiring user input. It is minimally dependent on sample attributes, and those that are required (e.g., the standard deviation of spectral noise) can be determined with other fully automated procedures. At the core of the method is the identification and location of spikes with coincident second derivatives along both the spectral and spatiotemporal dimensions of two-dimensional datasets. The method can be applied to spectra that are relatively inhomogeneous because it provides fairly effective and selective targeting of spikes, resulting in minimal distortion of spectra. Relatively effective spike removal obtained with full automation could provide substantial benefits to users where large numbers of spectra must be processed.
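
    The coincident second-difference idea translates almost directly into array operations. Below is a minimal sketch, not the authors' code; it assumes rows are successive spectra, columns are spectral channels, `noise_sd` is the known noise standard deviation, and the threshold factor `k` is a hypothetical parameter rather than the paper's value.

    ```python
    import numpy as np

    def detect_spikes(data, noise_sd, k=5.0):
        """Flag points whose peak-like second differences stand out along BOTH
        the spectral and the spatiotemporal dimension at once."""
        d2_spec = np.zeros_like(data, dtype=float)
        d2_time = np.zeros_like(data, dtype=float)
        # Second difference along the spectral dimension (columns)...
        d2_spec[:, 1:-1] = 2 * data[:, 1:-1] - data[:, :-2] - data[:, 2:]
        # ...and along the spatiotemporal dimension (rows).
        d2_time[1:-1, :] = 2 * data[1:-1, :] - data[:-2, :] - data[2:, :]
        thresh = k * noise_sd
        return (d2_spec > thresh) & (d2_time > thresh)

    def remove_spikes(data, mask):
        """Replace flagged points with the mean of their spectral neighbors."""
        out = data.astype(float).copy()
        for r, c in zip(*np.where(mask)):
            left, right = max(c - 1, 0), min(c + 1, data.shape[1] - 1)
            out[r, c] = 0.5 * (data[r, left] + data[r, right])
        return out
    ```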

  7. Algal biochar enhances the re-vegetation of stockpiled mine soils with native grass.

    PubMed

    Roberts, David A; Cole, Andrew J; Paul, Nicholas A; de Nys, Rocky

    2015-09-15

    In most countries the mining industry is required to rehabilitate disturbed land with native vegetation. A typical approach is to stockpile soils during mining and then use this soil to recreate landforms after mining. Soil that has been stockpiled for an extended period typically contains little or no organic matter and nutrient, making soil rehabilitation a slow and difficult process. Here, we take freshwater macroalgae (Oedogonium) cultivated in waste water at a coal-fired power station and use it as a feedstock for the production of biochar, then use this biochar to enhance the rehabilitation of two types of stockpiled soil - a ferrosol and a sodosol - from the adjacent coal mine. While the biomass had relatively high concentrations of some metals, due to its cultivation in waste water, the resulting biochar did not leach metals into the pore water of soil-biochar mixtures. The biochar did, however, contribute essential trace elements (particularly K) to soil pore water. The biochar had very strong positive effects on the establishment and growth of a native plant (Kangaroo grass, Themeda australis) in both of the soils. The addition of the algal biochar to both soils at 10 t ha(-1) reduced the time to germination by the grass and increased the growth and production of plant biomass. Somewhat surprisingly, there was no beneficial effect of a higher application rate (25 t ha(-1)) of the biochar in the ferrosol, which highlights the importance of matching biochar application rates to the requirements of different types of soil. Nevertheless, we demonstrate that algal biochar can be produced from biomass cultivated in waste water and used at low application rates to improve the rehabilitation of a variety of soils typical of coal mines. This novel process links biomass production in waste water to end use of the biomass in land rehabilitation, simultaneously addressing two environmental issues associated with coal-mining and processing. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Insights to primitive replication derived from structures of small oligonucleotides

    NASA Technical Reports Server (NTRS)

    Smith, G. K.; Fox, G. E.

    1995-01-01

    Available information on the structure of small oligonucleotides is surveyed. It is observed that even small oligomers typically exhibit defined structures over a wide range of pH and temperature. These structures rely on a plethora of non-standard base-base interactions in addition to the traditional Watson-Crick pairings. Stable duplexes, though typically antiparallel, can be parallel or staggered and perfect complementarity is not essential. These results imply that primitive template directed reactions do not require high fidelity. Hence, the extensive use of Watson-Crick complementarity in genes rather than being a direct consequence of the primitive condensation process, may instead reflect subsequent selection based on the advantage of accuracy in maintaining the primitive genetic machinery once it arose.

  9. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches.

    PubMed

    Almutairy, Meznah; Torng, Eric

    2018-01-01

    Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method.
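
    A toy illustration of the two sampling schemes being compared (not the study's code; the k-mer size `k` and window size `w` are free parameters):

    ```python
    def fixed_sample(seq, k, w):
        """Fixed sampling: keep the k-mer starting at every w-th position."""
        return {(i, seq[i:i + k]) for i in range(0, len(seq) - k + 1, w)}

    def minimizer_sample(seq, k, w):
        """Minimizer sampling: in every window of w consecutive k-mers, keep
        the lexicographically smallest; adjacent windows often share their
        minimizer, but the index is still roughly twice the fixed-sampled one."""
        kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
        picked = set()
        for start in range(len(kmers) - w + 1):
            window = kmers[start:start + w]
            j = min(range(w), key=lambda t: window[t])
            picked.add((start + j, window[j]))
        return picked

    seq = "ACGTACGTGACCTGAACGT"
    print(len(fixed_sample(seq, k=4, w=3)), len(minimizer_sample(seq, k=4, w=3)))
    ```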

  11. EVA Development and Verification Testing at NASA's Neutral Buoyancy Laboratory

    NASA Technical Reports Server (NTRS)

    Jairala, Juniper C.; Durkin, Robert; Marak, Ralph J.; Sipila, Stephanie A.; Ney, Zane A.; Parazynski, Scott E.; Thomason, Arthur H.

    2012-01-01

    As an early step in the preparation for future Extravehicular Activities (EVAs), astronauts perform neutral buoyancy testing to develop and verify EVA hardware and operations. Neutral buoyancy demonstrations at NASA Johnson Space Center's Sonny Carter Training Facility to date have primarily evaluated assembly and maintenance tasks associated with several elements of the International Space Station (ISS). With the retirement of the Shuttle, completion of ISS assembly, and introduction of commercial players for human transportation to space, evaluations at the Neutral Buoyancy Laboratory (NBL) will take on a new focus. Test objectives are selected for their criticality, lack of previous testing, or design changes that justify retesting. Assembly tasks investigated are performed using procedures developed by the flight hardware providers and the Mission Operations Directorate (MOD). Orbital Replacement Unit (ORU) maintenance tasks are performed using a more systematic set of procedures, EVA Concept of Operations for the International Space Station (JSC-33408), also developed by the MOD. This paper describes the requirements and process for performing a neutral buoyancy test, including typical hardware and support equipment requirements, personnel and administrative resource requirements, examples of ISS systems and operations that are evaluated, and typical operational objectives that are evaluated.

  12. Key issues in the thermal design of spaceborne cryogenic infrared instruments

    NASA Astrophysics Data System (ADS)

    Schember, Helene R.; Rapp, Donald

    1992-12-01

    Thermal design and analysis play an integral role in the development of spaceborne cryogenic infrared (IR) instruments. From conceptual sketches to final testing, both direct and derived thermal requirements place significant constraints on the instrument design. Although in practice these thermal requirements are interdependent, the sources of most thermal constraints may be grouped into six distinct categories. These are: (1) Detector temperatures, (2) Optics temperatures, (3) Pointing or alignment stability, (4) Mission lifetime, (5) Orbit, and (6) Test and Integration. In this paper, we discuss these six sources of thermal requirements with particular regard to development of instrument packages for low background infrared astronomical observatories. In the end, the thermal performance of these instruments must meet a set of thermal requirements. The development of these requirements is typically an ongoing and interactive process, however, and the thermal design must maintain flexibility and robustness throughout the process. The thermal (or cryogenic) engineer must understand the constraints imposed by the science requirements, the specific hardware, the observing environment, the mission design, and the testing program. By balancing these often competing factors, the system-oriented thermal engineer can work together with the experiment team to produce an effective overall design of the instrument.

  13. Processing of false belief passages during natural story comprehension: An fMRI study.

    PubMed

    Kandylaki, Katerina D; Nagels, Arne; Tune, Sarah; Wiese, Richard; Bornkessel-Schlesewsky, Ina; Kircher, Tilo

    2015-11-01

    The neural correlates of theory of mind (ToM) are typically studied using paradigms which require participants to draw explicit, task-related inferences (e.g., in the false belief task). In a natural setup, such as listening to stories, false belief mentalizing occurs incidentally as part of narrative processing. In our experiment, participants listened to auditorily presented stories with false belief passages (implicit false belief processing) and immediately after each story answered comprehension questions (explicit false belief processing), while neural responses were measured with functional magnetic resonance imaging (fMRI). All stories included (among other situations) one false belief condition and one closely matched control condition. For the implicit ToM processing, we modeled the hemodynamic response during the false belief passages in the story and compared it to the hemodynamic response during the closely matched control passages. For implicit mentalizing, we found activation in typical ToM processing regions, that is, the angular gyrus (AG), superior medial frontal gyrus (SmFG), precuneus (PCUN), and middle temporal gyrus (MTG), as well as in the inferior frontal gyrus (IFG) bilaterally. For explicit ToM, we only found AG activation. The conjunction analysis highlighted the left AG and MTG as well as the bilateral IFG as overlapping ToM processing regions for both implicit and explicit modes. Implicit ToM processing during listening to false belief passages recruits the left SmFG and bilateral PCUN in addition to the "mentalizing network" known from explicit processing tasks. © 2015 Wiley Periodicals, Inc.

  14. Microstructure and Mechanical Behavior of 17-4 Precipitation Hardenable Steel Processed by Selective Laser Melting

    NASA Astrophysics Data System (ADS)

    Rafi, H. Khalid; Pal, Deepankar; Patil, Nachiket; Starr, Thomas L.; Stucker, Brent E.

    2014-12-01

    The mechanical behavior and the microstructural evolution of 17-4 precipitation hardenable (PH) stainless steel processed using selective laser melting have been studied. Test coupons were produced from 17-4 PH stainless steel powder in argon and nitrogen atmospheres. Characterization studies were carried out using mechanical testing, optical microscopy, scanning electron microscopy, and x-ray diffraction. The results show that post-process heat treatment is required to obtain typically desired tensile properties. Columnar grains of smaller diameters (<2 µm) emerged within the melt pool with a mixture of martensite and retained austenite phases. It was found that the phase content of the samples is greatly influenced by the powder chemistry, processing environment, and grain diameter.

  15. Method for fabricating beryllium-based multilayer structures

    DOEpatents

    Skulina, Kenneth M.; Bionta, Richard M.; Makowiecki, Daniel M.; Alford, Craig S.

    2003-02-18

    Beryllium-based multilayer structures and a process for fabricating beryllium-based multilayer mirrors, useful in the wavelength region greater than the beryllium K-edge (111 Å or 11.1 nm). The process includes alternating sputter deposition of beryllium and a metal, typically from the fifth row of the periodic table, such as niobium (Nb), molybdenum (Mo), ruthenium (Ru), and rhodium (Rh). The process includes not only the method of sputtering the materials, but the industrial hygiene controls for safe handling of beryllium. The mirrors made in accordance with the process may be utilized in soft x-ray and extreme-ultraviolet projection lithography, which requires mirrors of high reflectivity (>60%) for x-rays in the range of 60-140 Å (6.0-14.0 nm).

  16. A Human Factors Framework for Payload Display Design

    NASA Technical Reports Server (NTRS)

    Dunn, Mariea C.; Hutchinson, Sonya L.

    1998-01-01

    During missions to space, one charge of the astronaut crew is to conduct research experiments. These experiments, referred to as payloads, typically are controlled by computers. Crewmembers interact with payload computers by using visual interfaces or displays. To enhance the safety, productivity, and efficiency of crewmember interaction with payload displays, particular attention must be paid to the usability of these displays. Enhancing display usability requires adoption of a design process that incorporates human factors engineering principles at each stage. This paper presents a proposed framework for incorporating human factors engineering principles into the payload display design process.

  17. Evaluation of reinitialization-free nonvolatile computer systems for energy-harvesting Internet of things applications

    NASA Astrophysics Data System (ADS)

    Onizawa, Naoya; Tamakoshi, Akira; Hanyu, Takahiro

    2017-08-01

    In this paper, reinitialization-free nonvolatile computer systems are designed and evaluated for energy-harvesting Internet of things (IoT) applications. In energy-harvesting applications, power supplies generated from renewable power sources cause frequent power failures, so data being processed must be backed up when power failures occur. Unless data are safely backed up before power supplies diminish, reinitialization processes are required when power supplies are recovered, which results in low energy efficiency and slow operation. Using nonvolatile devices in processors and memories can realize a faster backup than a conventional volatile computer system, leading to a higher energy efficiency. To evaluate the energy efficiency under frequent power failures, typical computer systems including processors and memories are designed using 90 nm CMOS or CMOS/magnetic tunnel junction (MTJ) technologies. Nonvolatile ARM Cortex-M0 processors with 4 kB MRAMs are evaluated using a typical computing benchmark program, Dhrystone, which shows energy reductions of a few orders of magnitude in comparison with a volatile processor with SRAM.

  18. Multimicrometer Noncovalent Monolayer Domains on Layered Materials through Thermally Controlled Langmuir-Schaefer Conversion for Noncovalent 2D Functionalization.

    PubMed

    Hayes, Tyler R; Bang, Jae Jin; Davis, Tyson C; Peterson, Caroline F; McMillan, David G; Claridge, Shelley A

    2017-10-18

    As functionalized 2D materials are incorporated into hybrid materials, ensuring large-area structural control in noncovalently adsorbed films becomes increasingly important. Noncovalent functionalization avoids disrupting electronic structure in 2D materials; however, relatively weak molecular interactions in such monolayers typically reduce stability toward solution processing and other common material handling conditions. Here, we find that controlling substrate temperature during Langmuir-Schaefer conversion of a standing phase monolayer of diynoic amphiphiles on water to a horizontally oriented monolayer on a 2D substrate routinely produces multimicrometer domains, at least an order of magnitude larger than those typically achieved through drop-casting. Following polymerization, these highly ordered monolayers retain their structures during vigorous washing with solvents including water, ethanol, tetrahydrofuran, and toluene. These findings point to a convenient and broadly applicable strategy for noncovalent functionalization of 2D materials in applications that require large-area structural control, for instance, to minimize desorption at defects during subsequent solution processing.

  19. The Face-Processing Network Is Resilient to Focal Resection of Human Visual Cortex.

    PubMed

    Weiner, Kevin S; Jonas, Jacques; Gomez, Jesse; Maillard, Louis; Brissart, Hélène; Hossu, Gabriela; Jacques, Corentin; Loftus, David; Colnat-Coulbois, Sophie; Stigliani, Anthony; Barnett, Michael A; Grill-Spector, Kalanit; Rossion, Bruno

    2016-08-10

    Human face perception requires a network of brain regions distributed throughout the occipital and temporal lobes with a right hemisphere advantage. Present theories consider this network as either a processing hierarchy beginning with the inferior occipital gyrus (occipital face area; IOG-faces/OFA) or a multiple-route network with nonhierarchical components. The former predicts that removing IOG-faces/OFA will detrimentally affect downstream stages, whereas the latter does not. We tested this prediction in a human patient (Patient S.P.) requiring removal of the right inferior occipital cortex, including IOG-faces/OFA. We acquired multiple fMRI measurements in Patient S.P. before and after a preplanned surgery and multiple measurements in typical controls, enabling both within-subject/across-session comparisons (Patient S.P. before resection vs Patient S.P. after resection) and between-subject/across-session comparisons (Patient S.P. vs controls). We found that the spatial topology and selectivity of downstream ipsilateral face-selective regions were stable 1 and 8 month(s) after surgery. Additionally, the reliability of distributed patterns of face selectivity in Patient S.P. before versus after resection was not different from across-session reliability in controls. Nevertheless, postoperatively, representations of visual space were typical in dorsal face-selective regions but atypical in ventral face-selective regions and V1 of the resected hemisphere. Diffusion weighted imaging in Patient S.P. and controls identifies white matter tracts connecting retinotopic areas to downstream face-selective regions, which may contribute to the stable and plastic features of the face network in Patient S.P. after surgery. Together, our results support a multiple-route network of face processing with nonhierarchical components and shed light on stable and plastic features of high-level visual cortex following focal brain damage. Brain networks consist of interconnected functional regions commonly organized in processing hierarchies. Prevailing theories predict that damage to the input of the hierarchy will detrimentally affect later stages. We tested this prediction with multiple brain measurements in a rare human patient requiring surgical removal of the putative input to a network processing faces. Surprisingly, the spatial topology and selectivity of downstream face-selective regions are stable after surgery. Nevertheless, representations of visual space were typical in dorsal face-selective regions but atypical in ventral face-selective regions and V1. White matter connections from outside the face network may support these stable and plastic features. As processing hierarchies are ubiquitous in biological and nonbiological systems, our results have pervasive implications for understanding the construction of resilient networks. Copyright © 2016 the authors 0270-6474/16/368426-16$15.00/0.

  20. Multi-Disciplinary, Multi-Fidelity Discrete Data Transfer Using Degenerate Geometry Forms

    NASA Technical Reports Server (NTRS)

    Olson, Erik D.

    2016-01-01

    In a typical multi-fidelity design process, different levels of geometric abstraction are used for different analysis methods, and transitioning from one phase of design to the next often requires a complete re-creation of the geometry. To maintain consistency between lower-order and higher-order analysis results, Vehicle Sketch Pad (OpenVSP) recently introduced the ability to generate and export several degenerate forms of the geometry, representing the type of abstraction required to perform low- to medium-order analysis for a range of aeronautical disciplines. In this research, the functionality of these degenerate models was extended, so that in addition to serving as repositories for the geometric information that is required as input to an analysis, the degenerate models can also store the results of that analysis mapped back onto the geometric nodes. At the same time, the results are also mapped indirectly onto the nodes of lower-order degenerate models using a process called aggregation, and onto higher-order models using a process called disaggregation. The mapped analysis results are available for use by any subsequent analysis in an integrated design and analysis process. A simple multi-fidelity analysis process for a single-aisle subsonic transport aircraft is used as an example case to demonstrate the value of the approach.

  1. Predictive Rate-Distortion for Infinite-Order Markov Processes

    NASA Astrophysics Data System (ADS)

    Marzen, Sarah E.; Crutchfield, James P.

    2016-06-01

    Predictive rate-distortion analysis suffers from the curse of dimensionality: clustering arbitrarily long pasts to retain information about arbitrarily long futures requires resources that typically grow exponentially with length. The challenge is compounded for infinite-order Markov processes, since conditioning on finite sequences cannot capture all of their past dependencies. Spectral arguments confirm a popular intuition: algorithms that cluster finite-length sequences fail dramatically when the underlying process has long-range temporal correlations and can fail even for processes generated by finite-memory hidden Markov models. We circumvent the curse of dimensionality in rate-distortion analysis of finite- and infinite-order processes by casting predictive rate-distortion objective functions in terms of the forward- and reverse-time causal states of computational mechanics. Examples demonstrate that the resulting algorithms yield substantial improvements.
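
    As a compact reminder of the objective being optimized, predictive rate-distortion is commonly written as an information-bottleneck trade-off between coding cost and predictive power; the notation below is the standard textbook form, not necessarily the authors' exact symbols.

```latex
% Predictive rate-distortion as an information bottleneck (illustrative
% notation): compress the past \overleftarrow{X} into a codebook variable
% \mathcal{R} while retaining information about the future \overrightarrow{X}.
\[
  \min_{p(\mathcal{R}\mid\overleftarrow{X})}
    \; I[\overleftarrow{X};\mathcal{R}] \;-\; \beta\, I[\mathcal{R};\overrightarrow{X}]
\]
% \beta > 0 sets the trade-off between rate and predictive distortion.
% Recasting the problem in terms of forward- and reverse-time causal states
% replaces arbitrarily long pasts with a sufficient statistic, which is what
% lets the algorithms above sidestep the curse of dimensionality.
```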

  2. Performance of Adsorption - Based CO2 Acquisition Hardware for Mars ISRU

    NASA Technical Reports Server (NTRS)

    Finn, John E.; Mulloth, Lila M.; Borchers, Bruce A.; Luna, Bernadette (Technical Monitor)

    2000-01-01

    Chemical processing of the dusty, low-pressure Martian atmosphere typically requires conditioning and compression of the gases as first steps. A temperature-swing adsorption process can perform these tasks using nearly solid-state hardware and with relatively low power consumption compared to alternative processes. In addition, the process can separate the atmospheric constituents, producing both pressurized CO2 and a buffer gas mixture of nitrogen and argon. To date we have developed and tested adsorption compressors at scales appropriate for the near-term robotic missions that will lead the way to ISRU-based human exploration missions. In this talk we describe the characteristics, testing, and performance of these devices. We also discuss scale-up issues associated with meeting the processing demands of sample return and human missions.

  3. Boosting pitch encoding with audiovisual interactions in congenital amusia.

    PubMed

    Albouy, Philippe; Lévêque, Yohana; Hyde, Krista L; Bouchet, Patrick; Tillmann, Barbara; Caclin, Anne

    2015-01-01

    The combination of information across senses can enhance perception, as revealed for example by decreased reaction times or improved stimulus detection. Interestingly, these facilitatory effects have been shown to be maximal when responses to unisensory modalities are weak. The present study investigated whether audiovisual facilitation can be observed in congenital amusia, a music-specific disorder primarily ascribed to impairments of pitch processing. Amusic individuals and their matched controls performed two tasks. In Task 1, they were required to detect auditory, visual, or audiovisual stimuli as rapidly as possible. In Task 2, they were required to detect as accurately and as rapidly as possible a pitch change within an otherwise monotonic 5-tone sequence that was presented either only auditorily (A condition), or simultaneously with a temporally congruent, but otherwise uninformative visual stimulus (AV condition). Results of Task 1 showed that amusics exhibit typical auditory and visual detection, and typical audiovisual integration capacities: both amusics and controls exhibited shorter response times for audiovisual stimuli than for either auditory stimuli or visual stimuli. Results of Task 2 revealed that both groups benefited from simultaneous uninformative visual stimuli to detect pitch changes: accuracy was higher and response times shorter in the AV condition than in the A condition. The audiovisual improvements of response times were observed for different pitch interval sizes depending on the group. These results suggest that both typical listeners and amusic individuals can benefit from multisensory integration to improve their pitch processing abilities and that this benefit varies as a function of task difficulty. These findings constitute a first step toward exploiting multisensory paradigms to reduce pitch-related deficits in congenital amusia, notably by suggesting that audiovisual paradigms are effective within an appropriate range of unimodal performance. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. An Investigation of Low Earth Orbit Internal Charging

    NASA Technical Reports Server (NTRS)

    Parker, Linda Neergaard; Minow, Joseph; Willis, Emily

    2014-01-01

    Internal charging is not generally considered a threat in low Earth orbit due to the relatively short exposure times and low flux of electrons with energies of a few MeV encountered in typical orbits. There are configurations, however, where insulators and ungrounded conductors used on the outside of a spacecraft hull may charge when exposed to much lower energy electrons of a few hundred keV in a process that is better characterized as internal charging than surface charging. We investigate the conditions required for this internal charging process to occur in low Earth orbit using a one-dimensional charging model and evaluate the environments for which the process may be a threat to spacecraft.
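
    The one-dimensional model class referred to above can be understood through the textbook charge-balance estimate for a dielectric: deposited current charges the material while ohmic conduction bleeds charge away. The relation below is standard dielectric-charging physics, not the paper's specific formulation.

```latex
% Charge balance in a dielectric of resistivity \rho and permittivity
% \varepsilon = \varepsilon_0\varepsilon_r under a deposited current density J:
\[
  \varepsilon \frac{dE}{dt} = J - \frac{E}{\rho}
  \quad\Longrightarrow\quad
  E_\infty = J\rho, \qquad \tau = \varepsilon_0\varepsilon_r\rho
\]
% The hazard appears when the equilibrium field E_\infty approaches the
% dielectric strength of the insulator; highly resistive materials have long
% time constants \tau, so charge deposited by few-hundred-keV electrons can
% accumulate across many successive orbit passes.
```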

  5. Cyber-physical system for a water reclamation plant: Balancing aeration, energy, and water quality to maintain process resilience

    NASA Astrophysics Data System (ADS)

    Zhu, Junjie

    Aeration accounts for a large fraction of energy consumption in conventional water reclamation plants (WRPs). Although process operations at older WRPs can satisfy effluent permit requirements, they typically operate with excess aeration. More effective process control at older WRPs can be challenging as operators work to balance higher energy costs and more stringent effluent limitations while managing fluctuating loads. Therefore, an understanding of process resilience, the ability of a WRP to quickly return to its original operating conditions, is important. A state-of-the-art WRP should maintain process resilience against different kinds of perturbations even after its energy demands have been optimized. This work evaluated the applicability and feasibility of a cyber-physical system (CPS) for improving operation at the Metropolitan Water Reclamation District of Greater Chicago (MWRDGC) Calumet WRP. A process model was developed and used to better understand current conditions at the Calumet WRP, supplemented by two dissolved oxygen field measurement campaigns. In parallel, a classification system based on cluster analysis and cross-tabulation analysis was developed to reveal patterns in historical influent scenarios. Based on the classification results, typical process control options were investigated. To ensure the feasibility of information acquisition, the reliability and flexibility of soft sensors were assessed under typical influent conditions. Finally, process resilience was investigated to better balance influent perturbations, energy demands, and effluent quality over long-term operations. These investigations show that energy demands change with influent conditions and process controls. In general, aeration savings of up to 50% of current consumption are achievable; with more complex process controls, savings could reach 70% under relatively steady-state conditions and at least 40% under challenging transient conditions. The soft sensors provide reliable and flexible target predictions, and the plant can maintain a similar level of process resilience after a 50% aeration saving, even during long-term perturbations. Overall, this work shows that more cost-effective operation of the Calumet WRP is feasible while keeping influent perturbations, effluent quality, and process resilience in balance.
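
    The influent-scenario classification step pairs cluster analysis with cross-tabulation. A minimal sketch of that idea follows; the feature names, the choice of k = 4, the synthetic data, and the use of scikit-learn's k-means are all illustrative assumptions, not the dissertation's actual implementation.

```python
# Sketch: cluster historical influent records into scenario classes, then
# cross-tabulate the clusters against operating context (season here).
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n = 1000
influent = pd.DataFrame({
    "flow_mgd": rng.gamma(9.0, 30.0, n),       # daily influent flow (synthetic)
    "bod_mg_l": rng.normal(180.0, 40.0, n),    # organic load (synthetic)
    "tkn_mg_l": rng.normal(35.0, 8.0, n),      # nitrogen load (synthetic)
    "season": rng.choice(["dry", "wet"], n),
})

X = StandardScaler().fit_transform(influent[["flow_mgd", "bod_mg_l", "tkn_mg_l"]])
influent["scenario"] = KMeans(n_clusters=4, n_init=10,
                              random_state=0).fit_predict(X)

# Cross-tabulation shows how scenarios line up with operating context, which
# is what makes them usable as triggers for aeration control options.
print(pd.crosstab(influent["scenario"], influent["season"]))
```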

  6. A Medical Area Network of Virtual Technology (MANVT)

    DTIC Science & Technology

    2011-10-01

    translational research projects be captured, undergo quality control and are stored/managed in such a way that they can be mined to test and generate new...translational research is that access to detailed clinical data typically requires proper informed consent and IRB approval; however, in order to design ...attributes of interest at an early stage of the process. i2b2 overcomes these challenges by enabling researchers to design and execute queries

  7. Materials Genome Initiative Element

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2015-01-01

    NASA is committed to developing new materials and manufacturing methods that can enable new missions with ever increasing mission demands. Typically, the development and certification of new materials and manufacturing methods in the aerospace industry have required more than 20 years of development time with a costly testing and certification program. To reduce the cost and time to mature these emerging technologies, NASA is developing computational materials tools to improve understanding of the material and guide the certification process.

  8. Spectral Processing Analysis System (SPANS).

    DTIC Science & Technology

    1980-11-01

    Approximately 750 pounds Temperature Range: 60 - 80 degrees Fahrenheit Humidity: 40 - 70 percent (relative) Duty Cycle: Continuous Power Requirements: 5 wire, 3...displayed per display frame, local or absolute scaling, number of display points per line and waveform averaging. A typical display is shown in Figure 3...the waveform. In the case of white noise, a high degree of correlation is found at zero lag only with the remaining lags showing little correlation

  9. New Tests to Measure Individual Differences in Matching and Labelling Facial Expressions of Emotion, and Their Association with Ability to Recognise Vocal Emotions and Facial Identity

    PubMed Central

    Palermo, Romina; O’Connor, Kirsty B.; Davis, Joshua M.; Irons, Jessica; McKone, Elinor

    2013-01-01

    Although good tests are available for diagnosing clinical impairments in face expression processing, there is a lack of strong tests for assessing “individual differences” – that is, differences in ability between individuals within the typical, nonclinical, range. Here, we develop two new tests, one for expression perception (an odd-man-out matching task in which participants select which one of three faces displays a different expression) and one additionally requiring explicit identification of the emotion (a labelling task in which participants select one of six verbal labels). We demonstrate validity (careful check of individual items, large inversion effects, independence from nonverbal IQ, convergent validity with a previous labelling task), reliability (Cronbach’s alphas of .77 and .76 respectively), and wide individual differences across the typical population. We then demonstrate the usefulness of the tests by addressing theoretical questions regarding the structure of face processing, specifically the extent to which the following processes are common or distinct: (a) perceptual matching and explicit labelling of expression (modest correlation between matching and labelling supported partial independence); (b) judgement of expressions from faces and voices (results argued labelling tasks tap into a multi-modal system, while matching tasks tap distinct perceptual processes); and (c) expression and identity processing (results argued for a common first step of perceptual processing for expression and identity). PMID:23840821

  10. New tests to measure individual differences in matching and labelling facial expressions of emotion, and their association with ability to recognise vocal emotions and facial identity.

    PubMed

    Palermo, Romina; O'Connor, Kirsty B; Davis, Joshua M; Irons, Jessica; McKone, Elinor

    2013-01-01

    Although good tests are available for diagnosing clinical impairments in face expression processing, there is a lack of strong tests for assessing "individual differences"--that is, differences in ability between individuals within the typical, nonclinical, range. Here, we develop two new tests, one for expression perception (an odd-man-out matching task in which participants select which one of three faces displays a different expression) and one additionally requiring explicit identification of the emotion (a labelling task in which participants select one of six verbal labels). We demonstrate validity (careful check of individual items, large inversion effects, independence from nonverbal IQ, convergent validity with a previous labelling task), reliability (Cronbach's alphas of .77 and .76 respectively), and wide individual differences across the typical population. We then demonstrate the usefulness of the tests by addressing theoretical questions regarding the structure of face processing, specifically the extent to which the following processes are common or distinct: (a) perceptual matching and explicit labelling of expression (modest correlation between matching and labelling supported partial independence); (b) judgement of expressions from faces and voices (results argued labelling tasks tap into a multi-modal system, while matching tasks tap distinct perceptual processes); and (c) expression and identity processing (results argued for a common first step of perceptual processing for expression and identity).
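
    The reliability figures quoted in both records (Cronbach's alphas of .77 and .76) come from the standard internal-consistency formula, which is easy to compute from a participants-by-items score matrix. The sketch below uses simulated data purely to show the calculation.

```python
# Sketch: Cronbach's alpha for a participants x items score matrix.
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)).
# The data are simulated; only the formula matches the statistic cited above.
import numpy as np

rng = np.random.default_rng(0)
scores = rng.integers(0, 2, size=(200, 40))   # 200 participants, 40 binary items

def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

print(f"alpha = {cronbach_alpha(scores):.2f}")    # near 0 for random data
```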

  11. Manned geosynchronous mission requirements and systems analysis study extension. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1981-01-01

    A study was performed to determine the types of manned missions that will likely be performed in the late 1980's or early 1990's timeframe, to define MOTV configurations which satisfy these mission requirements, and to develop a program plan for its development. Twenty generic missions were originally defined for MOTV but, to simplify the selection process, five of these missions were selected as typical and used as Design Reference Missions. Systems and subsystems requirements were re-examined and sensitivity analyses performed to determine optimum point designs. Turnaround modes were considered to determine the most effective combination of ground-based and space-based activities. A preferred concept for the crew capsule and for the mission mode was developed.

  12. Approaches for advancing scientific understanding of macrosystems

    USGS Publications Warehouse

    Levy, Ofir; Ball, Becky A.; Bond-Lamberty, Ben; Cheruvelil, Kendra S.; Finley, Andrew O.; Lottig, Noah R.; Surangi W. Punyasena,; Xiao, Jingfeng; Zhou, Jizhong; Buckley, Lauren B.; Filstrup, Christopher T.; Keitt, Tim H.; Kellner, James R.; Knapp, Alan K.; Richardson, Andrew D.; Tcheng, David; Toomey, Michael; Vargas, Rodrigo; Voordeckers, James W.; Wagner, Tyler; Williams, John W.

    2014-01-01

    The emergence of macrosystems ecology (MSE), which focuses on regional- to continental-scale ecological patterns and processes, builds upon a history of long-term and broad-scale studies in ecology. Scientists face the difficulty of integrating the many elements that make up macrosystems, which consist of hierarchical processes at interacting spatial and temporal scales. Researchers must also identify the most relevant scales and variables to be considered, the required data resources, and the appropriate study design to provide the proper inferences. The large volumes of multi-thematic data often associated with macrosystem studies typically require validation, standardization, and assimilation. Finally, analytical approaches need to describe how cross-scale and hierarchical dynamics and interactions relate to macroscale phenomena. Here, we elaborate on some key methodological challenges of MSE research and discuss existing and novel approaches to meet them.

  13. Typical action perception and interpretation without motor simulation.

    PubMed

    Vannuscorps, Gilles; Caramazza, Alfonso

    2016-01-05

    Every day, we interact with people synchronously, immediately understand what they are doing, and easily infer their mental state and the likely outcome of their actions from their kinematics. According to various motor simulation theories of perception, such efficient perceptual processing of others' actions cannot be achieved by visual analysis of the movements alone but requires a process of motor simulation--an unconscious, covert imitation of the observed movements. According to this hypothesis, individuals incapable of simulating observed movements in their motor system should have difficulty perceiving and interpreting observed actions. Contrary to this prediction, we found across eight sensitive experiments that individuals born with absent or severely shortened upper limbs (upper limb dysplasia), despite some variability, could perceive, anticipate, predict, comprehend, and memorize upper limb actions, which they cannot simulate, as efficiently as typically developed participants. We also found that, like the typically developed participants, the dysplasic participants systematically perceived the position of moving upper limbs slightly ahead of their real position but only when the anticipated position was not biomechanically awkward. Such anticipatory bias and its modulation by implicit knowledge of the body biomechanical constraints were previously considered as indexes of the crucial role of motor simulation in action perception. Our findings undermine this assumption and the theories that place the locus of action perception and comprehension in the motor system and invite a shift in the focus of future research to the question of how the visuo-perceptual system represents and processes observed body movements and actions.

  14. Typical action perception and interpretation without motor simulation

    PubMed Central

    Vannuscorps, Gilles; Caramazza, Alfonso

    2016-01-01

    Every day, we interact with people synchronously, immediately understand what they are doing, and easily infer their mental state and the likely outcome of their actions from their kinematics. According to various motor simulation theories of perception, such efficient perceptual processing of others’ actions cannot be achieved by visual analysis of the movements alone but requires a process of motor simulation—an unconscious, covert imitation of the observed movements. According to this hypothesis, individuals incapable of simulating observed movements in their motor system should have difficulty perceiving and interpreting observed actions. Contrary to this prediction, we found across eight sensitive experiments that individuals born with absent or severely shortened upper limbs (upper limb dysplasia), despite some variability, could perceive, anticipate, predict, comprehend, and memorize upper limb actions, which they cannot simulate, as efficiently as typically developed participants. We also found that, like the typically developed participants, the dysplasic participants systematically perceived the position of moving upper limbs slightly ahead of their real position but only when the anticipated position was not biomechanically awkward. Such anticipatory bias and its modulation by implicit knowledge of the body biomechanical constraints were previously considered as indexes of the crucial role of motor simulation in action perception. Our findings undermine this assumption and the theories that place the locus of action perception and comprehension in the motor system and invite a shift in the focus of future research to the question of how the visuo-perceptual system represents and processes observed body movements and actions. PMID:26699468

  15. D 2 and DT Liquid-Layer Target Shots on NIF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walters, Curtis; Alger, Ethan; Bhandarkar, Suhas

    Experiments at the National Ignition Facility (NIF) using targets containing a Deuterium-Tritium (DT) fuel layer have, until recently, required that a high-quality layer of solid deuterium-tritium (herein referred to as an "ice-layer") be formed in the capsule. The development of a process to line the inner surface of a target capsule with a foam layer of a thickness that is typical of ice-layers has resulted in the ability to field targets with liquid layers wetting the foam. Successful fielding of liquid-layer targets on NIF required not only a foam-lined capsule, but also changes to the capsule filling process and the manner in which the inventory is maintained in the capsule. Additionally, changes to target heater power and the temperature drops across target components were required in order to achieve the desired range of shot temperatures. These changes, and the target's performance during four target shots on NIF, will be discussed.

  16. Cost-effective lightweight mirrors for aerospace and defense

    NASA Astrophysics Data System (ADS)

    Woodard, Kenneth S.; Comstock, Lovell E.; Wamboldt, Leonard; Roy, Brian P.

    2015-05-01

    The demand for high performance, lightweight mirrors was historically driven by aerospace and defense (A&D) but now we are also seeing similar requirements for commercial applications. These applications range from aerospace-like platforms such as small unmanned aircraft for agricultural, mineral and pollutant aerial mapping to an eye tracking gimbaled mirror for optometry offices. While aerospace and defense businesses can often justify the high cost of exotic, low density materials, commercial products rarely can. Also, to obtain high performance with low overall optical system weight, aspheric surfaces are often prescribed. This may drive the manufacturing process to diamond machining thus requiring the reflective side of the mirror to be a diamond machinable material. This paper summarizes the diamond machined finishing and coating of some high performance, lightweight designs using non-exotic substrates to achieve cost effective mirrors. The results indicate that these processes can meet typical aerospace and defense requirements but may also be competitive in some commercial applications.

  17. Rights of Conscience Protections for Armed Forces Service Members and Their Chaplains

    DTIC Science & Technology

    2015-07-22

    established five categories of religious accommodation requests: dietary, grooming, medical, uniform, and worship practices.2 • Dietary: typically, these... Medical: typically, these are requests for a waiver of mandatory immunizations. • Uniform: typically, these are requests to wear religious jewelry or...service members in their units. Requirements A chaplain applicant is required to meet DoD medical and physical standards for commissioning as an

  18. ERP correlates of object recognition memory in Down syndrome: Do active and passive tasks measure the same thing?

    PubMed

    Van Hoogmoed, A H; Nadel, L; Spanò, G; Edgin, J O

    2016-02-01

    Event related potentials (ERPs) can help to determine the cognitive and neural processes underlying memory functions and are often used to study populations with severe memory impairment. In healthy adults, memory is typically assessed with active tasks, while in patient studies passive memory paradigms are generally used. In this study we examined whether active and passive continuous object recognition tasks measure the same underlying memory process in typically developing (TD) adults and in individuals with Down syndrome (DS), a population with known hippocampal impairment. We further explored how ERPs in these tasks relate to behavioral measures of memory. Data-driven analysis techniques revealed large differences in old-new effects in the active versus passive task in TD adults, but no difference between these tasks in DS. The group with DS required additional processing in the active task in comparison to the TD group in two ways. First, the old-new effect started 150 ms later. Second, more repetitions were required to show the old-new effect. In the group with DS, performance on a behavioral measure of object-location memory was related to ERP measures across both tasks. In total, our results suggest that active and passive ERP memory measures do not differ in DS and likely reflect the use of implicit memory, but not explicit processing, on both tasks. Our findings highlight the need for a greater understanding of the comparison between active and passive ERP paradigms before they are inferred to measure similar functions across populations (e.g., infants or intellectual disability). Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. A computer program for uncertainty analysis integrating regression and Bayesian methods

    USGS Publications Warehouse

    Lu, Dan; Ye, Ming; Hill, Mary C.; Poeter, Eileen P.; Curtis, Gary

    2014-01-01

    This work develops a new functionality in UCODE_2014 to evaluate Bayesian credible intervals using the Markov Chain Monte Carlo (MCMC) method. The MCMC capability in UCODE_2014 is based on the FORTRAN version of the differential evolution adaptive Metropolis (DREAM) algorithm of Vrugt et al. (2009), which estimates the posterior probability density function of model parameters in high-dimensional and multimodal sampling problems. The UCODE MCMC capability provides eleven prior probability distributions and three ways to initialize the sampling process. It evaluates parametric and predictive uncertainties and it has parallel computing capability based on multiple chains to accelerate the sampling process. This paper tests and demonstrates the MCMC capability using a 10-dimensional multimodal mathematical function, a 100-dimensional Gaussian function, and a groundwater reactive transport model. The use of the MCMC capability is made straightforward and flexible by adopting the JUPITER API protocol. With the new MCMC capability, UCODE_2014 can be used to calculate three types of uncertainty intervals, which all can account for prior information: (1) linear confidence intervals which require linearity and Gaussian error assumptions and typically 10s–100s of highly parallelizable model runs after optimization, (2) nonlinear confidence intervals which require a smooth objective function surface and Gaussian observation error assumptions and typically 100s–1,000s of partially parallelizable model runs after optimization, and (3) MCMC Bayesian credible intervals which require few assumptions and commonly 10,000s–100,000s or more partially parallelizable model runs. Ready access allows users to select methods best suited to their work, and to compare methods in many circumstances.
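
    UCODE_2014 wraps the multi-chain FORTRAN DREAM sampler; as a much smaller illustration of how Bayesian credible intervals emerge from posterior samples, the sketch below runs a single-chain random-walk Metropolis sampler on a toy one-parameter problem. It is not DREAM and not the UCODE interface.

```python
# Sketch: random-walk Metropolis yielding a 95% Bayesian credible interval
# for the mean of Gaussian data (flat prior, known unit variance). Burn-in
# length, proposal scale, and the toy model are all arbitrary choices.
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=2.0, scale=1.0, size=50)    # synthetic observations

def log_posterior(theta: float) -> float:
    return -0.5 * np.sum((data - theta) ** 2)     # log-likelihood + flat prior

samples, theta = [], 0.0
for _ in range(20000):
    proposal = theta + rng.normal(scale=0.3)      # random-walk proposal
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
        theta = proposal                          # Metropolis accept step
    samples.append(theta)

post = np.array(samples[5000:])                   # discard burn-in
print("95% credible interval:", np.percentile(post, [2.5, 97.5]))
```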

  20. Optimization of nonlinear, non-Gaussian Bayesian filtering for diagnosis and prognosis of monotonic degradation processes

    NASA Astrophysics Data System (ADS)

    Corbetta, Matteo; Sbarufatti, Claudio; Giglio, Marco; Todd, Michael D.

    2018-05-01

    The present work critically analyzes the probabilistic definition of dynamic state-space models subject to Bayesian filters used for monitoring and predicting monotonic degradation processes. The study focuses on the selection of the random process, often called process noise, which is a key perturbation source in the evolution equation of particle filtering. Despite the large number of applications of particle filtering predicting structural degradation, the adequacy of the picked process noise has not been investigated. This paper reviews existing process noise models that are typically embedded in particle filters dedicated to monitoring and predicting structural damage caused by fatigue, which is monotonic in nature. The analysis emphasizes that existing formulations of the process noise can jeopardize the performance of the filter in terms of state estimation and remaining life prediction (i.e., damage prognosis). This paper subsequently proposes an optimal and unbiased process noise model and a list of requirements that the stochastic model must satisfy to guarantee high prognostic performance. These requirements are useful for future and further implementations of particle filtering for monotonic system dynamics. The validity of the new process noise formulation is assessed against experimental fatigue crack growth data from a full-scale aeronautical structure using dedicated performance metrics.
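
    The monotonicity requirement is easy to see in a toy particle filter for Paris-law crack growth: if the process noise can go negative, particles predict physically impossible crack "healing". The sketch below enforces a non-negative multiplicative (lognormal) noise; the material constants, noise levels, and measurements are assumed, and this is not the paper's optimal noise formulation.

```python
# Sketch: bootstrap particle filter for Paris-law crack growth with a
# lognormal multiplicative process noise, so every particle's crack length is
# non-decreasing. All parameters and measurements are illustrative.
import numpy as np

rng = np.random.default_rng(2)
C, m, dsigma = 1e-10, 3.0, 80.0                   # assumed Paris-law constants

def grow(a, dN=1000.0):
    dK = dsigma * np.sqrt(np.pi * a)              # stress-intensity range
    noise = rng.lognormal(0.0, 0.2, size=a.shape) # strictly positive noise
    return a + C * dK**m * dN * noise             # monotonic state update

particles = rng.uniform(1e-3, 2e-3, size=5000)    # initial crack sizes [m]
for z in (1.6e-3, 1.7e-3, 1.9e-3):                # assumed measurements [m]
    particles = grow(particles)
    w = np.exp(-0.5 * ((z - particles) / 1e-4) ** 2)  # Gaussian meas. model
    w /= w.sum()
    particles = particles[rng.choice(particles.size, particles.size, p=w)]
print("median crack estimate [m]:", np.median(particles))
```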

  1. Self-Reacting Friction Stir Welding for Aluminum Alloy Circumferential Weld Applications

    NASA Technical Reports Server (NTRS)

    Bjorkman, Gerry; Cantrell, Mark; Carter, Robert

    2003-01-01

    Friction stir welding is an innovative weld process that continues to grow in use, in the commercial, defense, and space sectors. It produces high quality and high strength welds in aluminum alloys. The process consists of a rotating weld pin tool that plasticizes material through friction. The plasticized material is welded by applying a high weld forge force through the weld pin tool against the material during pin tool rotation. The high weld forge force is reacted against an anvil and a stout tool structure. A variation of friction stir welding currently being evaluated is self-reacting friction stir welding. Self-reacting friction stir welding incorporates two opposing shoulders on the crown and root sides of the weld joint. In self-reacting friction stir welding, the weld forge force is reacted against the crown shoulder portion of the weld pin tool by the root shoulder. This eliminates the need for a stout tooling structure to react the high weld forge force required in the typical friction stir weld process. Therefore, the self-reacting feature reduces tooling requirements and, therefore, process implementation costs. This makes the process attractive for aluminum alloy circumferential weld applications. To evaluate the application of self-reacting friction stir welding for aluminum alloy circumferential welding, a feasibility study was performed. The study consisted of performing a fourteen-foot diameter aluminum alloy circumferential demonstration weld using typical fusion weld tooling. To accomplish the demonstration weld, weld and tack weld development were performed and fourteen-foot diameter rings were fabricated. Weld development consisted of weld pin tool selection and the generation of a process map and envelope. Tack weld development evaluated gas tungsten arc welding and friction stir welding for tack welding rings together for circumferential welding. As a result of the study, a successful circumferential demonstration weld was produced leading the way for future circumferential weld implementation.

  2. Martial Art Training and Cognitive Performance in Middle-Aged Adults.

    PubMed

    Douris, Peter; Douris, Christopher; Balder, Nicole; LaCasse, Michael; Rand, Amir; Tarapore, Freya; Zhuchkan, Aleskey; Handrakis, John

    2015-09-29

    Cognitive performance includes the processes of attention, memory, processing speed, and executive functioning, which typically decline with aging. Previous research has demonstrated that aerobic and resistance exercise improves cognitive performance immediately following exercise. However, there is limited research examining the effect that a cognitively complex exercise such as martial art training has on these cognitive processes. Our study compared the acute effects of 2 types of martial art training to aerobic exercise on cognitive performance in middle-aged adults. We utilized a repeated measures design with the order of the 3 exercise conditions randomly assigned and counterbalanced. Ten recreational middle-aged martial artists (mean age = 53.5 ± 8.6 years) participated in 3 treatment conditions: a typical martial art class, an atypical martial art class, and a one-hour walk at a self-selected speed. Cognitive performance was assessed by the Stroop Color and Word test. While all 3 exercise conditions improved attention and processing speed, only the 2 martial art conditions improved the highest order of cognitive performance, executive function. The effect of the 2 martial art conditions on executive function was not different. The improvement in executive function may be due to the increased cortical demand required by the more complex, coordinated motor tasks of martial art exercise compared to the more repetitive actions of walking.

  3. Martial Art Training and Cognitive Performance in Middle-Aged Adults

    PubMed Central

    Douris, Peter; Douris, Christopher; Balder, Nicole; LaCasse, Michael; Rand, Amir; Tarapore, Freya; Zhuchkan, Aleskey; Handrakis, John

    2015-01-01

    Cognitive performance includes the processes of attention, memory, processing speed, and executive functioning, which typically decline with aging. Previous research has demonstrated that aerobic and resistance exercise improves cognitive performance immediately following exercise. However, there is limited research examining the effect that a cognitively complex exercise such as martial art training has on these cognitive processes. Our study compared the acute effects of 2 types of martial art training to aerobic exercise on cognitive performance in middle-aged adults. We utilized a repeated measures design with the order of the 3 exercise conditions randomly assigned and counterbalanced. Ten recreational middle-aged martial artists (mean age = 53.5 ± 8.6 years) participated in 3 treatment conditions: a typical martial art class, an atypical martial art class, and a one-hour walk at a self-selected speed. Cognitive performance was assessed by the Stroop Color and Word test. While all 3 exercise conditions improved attention and processing speed, only the 2 martial art conditions improved the highest order of cognitive performance, executive function. The effect of the 2 martial art conditions on executive function was not different. The improvement in executive function may be due to the increased cortical demand required by the more complex, coordinated motor tasks of martial art exercise compared to the more repetitive actions of walking. PMID:26672872

  4. Activation of sputter-processed indium-gallium-zinc oxide films by simultaneous ultraviolet and thermal treatments.

    PubMed

    Tak, Young Jun; Ahn, Byung Du; Park, Sung Pyo; Kim, Si Joon; Song, Ae Ran; Chung, Kwun-Bum; Kim, Hyun Jae

    2016-02-23

    Indium-gallium-zinc oxide (IGZO) films, deposited by sputtering at room temperature, still require activation to achieve satisfactory semiconductor characteristics. Thermal treatment is typically carried out at temperatures above 300 °C. Here, we propose activating sputter-processed IGZO films using simultaneous ultraviolet and thermal (SUT) treatments to decrease the required temperature and enhance their electrical characteristics and stability. SUT treatment effectively decreased the amount of carbon residues and the number of defect sites related to oxygen vacancies and increased the number of metal oxide (M-O) bonds through the decomposition-rearrangement of M-O bonds and oxygen radicals. Activation of IGZO TFTs using the SUT treatment reduced the processing temperature to 150 °C and improved various electrical performance metrics including mobility, on-off ratio, and threshold voltage shift (positive bias stress for 10,000 s) from 3.23 to 15.81 cm(2)/Vs, 3.96 × 10(7) to 1.03 × 10(8), and 11.2 to 7.2 V, respectively.

  5. The Evolution of the NASA Commercial Crew Program Mission Assurance Process

    NASA Technical Reports Server (NTRS)

    Canfield, Amy C.

    2016-01-01

    In 2010, the National Aeronautics and Space Administration (NASA) established the Commercial Crew Program (CCP) in order to provide human access to the International Space Station and low Earth orbit via the commercial (non-governmental) sector. A particular challenge to NASA has been how to determine that the Commercial Provider's transportation system complies with programmatic safety requirements. The process used in this determination is the Safety Technical Review Board which reviews and approves provider submitted hazard reports. One significant product of the review is a set of hazard control verifications. In past NASA programs, 100% of these safety critical verifications were typically confirmed by NASA. The traditional Safety and Mission Assurance (S&MA) model does not support the nature of the CCP. To that end, NASA S&MA is implementing a Risk Based Assurance process to determine which hazard control verifications require NASA authentication. Additionally, a Shared Assurance Model is also being developed to efficiently use the available resources to execute the verifications.

  6. Collaborative Manufacturing for Small-Medium Enterprises

    NASA Astrophysics Data System (ADS)

    Irianto, D.

    2016-02-01

    Manufacturing systems involve decisions concerning production processes, capacity, planning, and control. In an MTO (make-to-order) manufacturing system, strategic decisions concerning fulfilment of customer requirements, manufacturing cost, and delivery due dates are the most important. In order to accelerate the decision-making process, research on the decision-making structure for receiving orders and sequencing activities under limited capacity is required. An effective decision-making process is typically required by small-medium component and tool makers serving as supporting industries to large industries. On one side, metal small-medium enterprises are expected to produce parts, components, or tools (e.g., jigs, fixtures, molds, and dies) with high precision, low cost, and exact delivery times. On the other side, a metal small-medium enterprise may have a weak bargaining position due to aspects such as low production capacity, a limited budget for material procurement, and limited high-precision machines and equipment. Instead of receiving orders exclusively, a small-medium enterprise can collaborate with other small-medium enterprises to fulfill requirements of high quality, low manufacturing cost, and just-in-time delivery. Small-medium enterprises can share their best capabilities to form effective supporting industries. An independent body, such as a university community service unit, can take the role of collaboration manager. The Laboratory of Production Systems at Bandung Institute of Technology has implemented shared manufacturing systems for small-medium enterprise collaboration.

  7. Global Processing Speed in Children With Low Reading Ability and in Children and Adults With Typical Reading Ability: Exploratory Factor Analytic Models

    PubMed Central

    Peter, Beate; Matsushita, Mark; Raskind, Wendy H.

    2013-01-01

    Purpose To investigate processing speed as a latent dimension in children with dyslexia and children and adults with typical reading skills. Method Exploratory factor analysis (FA) was based on a sample of multigenerational families, each ascertained through a child with dyslexia. Eleven measures—6 of them timed—represented verbal and nonverbal processes, alphabet writing, and motor sequencing in the hand and oral motor system. FA was conducted in 4 cohorts (all children, a subset of children with low reading scores, a subset of children with typical reading scores, and adults with typical reading scores; total N = 829). Results Processing speed formed the first factor in all cohorts. Both measures of motor sequencing speed loaded on the speed factor with the other timed variables. Children with poor reading scores showed lower speed factor scores than did typical peers. The speed factor was negatively correlated with age in the adults. Conclusions The speed dimension was observed independently of participant cohort, gender, and reading ability. Results are consistent with a unified theory of processing speed as a quadratic function of age in typical development and with slowed processing in poor readers. PMID:21081672
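
    The "speed forms the first factor" result comes from exploratory factor analysis over a mixed battery of timed and untimed measures. A minimal sketch with simulated scores follows; the estimator (scikit-learn's maximum-likelihood FactorAnalysis) and the battery composition are stand-ins for the study's actual procedure.

```python
# Sketch: simulate a battery where 6 timed measures share a latent speed
# component and 5 untimed measures do not, then check that the timed
# measures load together on one extracted factor.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
n = 400
speed = rng.normal(size=n)                              # latent speed score
timed = 0.8 * speed[:, None] + rng.normal(scale=0.5, size=(n, 6))
untimed = rng.normal(size=(n, 5))                       # no speed loading
battery = np.hstack([timed, untimed])

fa = FactorAnalysis(n_components=2, random_state=0).fit(battery)
print("factor-1 loadings:", np.round(fa.components_[0], 2))
# The first 6 loadings should be large and the last 5 near zero, mirroring
# a common speed dimension across the timed tasks.
```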

  8. Managing the Nuclear Fuel Cycle: Policy Implications of Expanding Global Access to Nuclear Power

    DTIC Science & Technology

    2008-01-20

    critical aspect of the nuclear fuel cycle for the United States, where longstanding nonproliferation policy discouraged commercial nuclear fuel...have U.S. government officials. However, the case of Iran raises perhaps the most critical question in this decade for strengthening the nuclear...slight difference in atomic mass between 235U and 238U. The typical enrichment process requires about 10 lbs of uranium U3O8 to produce 1 lb of low

  9. The Military Spouse Education and Career Opportunities Program: Recommendations for an Internal Monitoring System

    DTIC Science & Technology

    2016-01-01

    Family Policy’s SECO program, which reviewed existing SECO metrics and data sources, as well as analytic methods of previous research, to determine ...process that requires an iterative cycle of assessment of collected data (typically, but not solely, quantitative data) to determine whether SECO...RAND suggests five steps to develop and implement the SECO internal monitoring system: Step 1. Describe the logic or theory of how activities are

  10. Copper-catalyzed decarboxylative trifluoromethylation of allylic bromodifluoroacetates.

    PubMed

    Ambler, Brett R; Altman, Ryan A

    2013-11-01

    The development of new synthetic fluorination reactions has important implications in medicinal, agricultural, and materials chemistries. Given the prevalence and accessibility of alcohols, methods to convert alcohols to trifluoromethanes are desirable. However, this transformation typically requires four-step processes, specialty chemicals, and/or stoichiometric metals to access the trifluoromethyl-containing product. A two-step copper-catalyzed decarboxylative protocol for converting allylic alcohols to trifluoromethanes is reported. Preliminary mechanistic studies distinguish this reaction from previously reported Cu-mediated reactions.

  11. Feasibility demonstration of a hyperfiltration technique to reclaim shower wastewater at elevated temperatures

    NASA Technical Reports Server (NTRS)

    Hester, J. C.; Brandon, C. A.

    1972-01-01

    A feasibility demonstration of a hyperfiltration technique to determine its capability to reclaim shower wastewater at elevated temperature was conducted. Approximately twenty (20) gallons of typical shower water were processed through a dynamically formed membrane at a temperature of 167 F. Chemical and bacterial analyses of the product water are presented which show compliance with all potable water requirements established for extended manned space missions. In addition, subsystem characteristics and capabilities are discussed.

  12. Parallel processing implementations of a contextual classifier for multispectral remote sensing data

    NASA Technical Reports Server (NTRS)

    Siegel, H. J.; Swain, P. H.; Smith, B. W.

    1980-01-01

    Contextual classifiers are being developed as a method to exploit the spatial/spectral context of a pixel to achieve accurate classification. Classification algorithms such as the contextual classifier typically require large amounts of computation time. One way to reduce the execution time of these tasks is through the use of parallelism. The applicability of the CDC flexible processor system and of a proposed multimicroprocessor system (PASM) for implementing contextual classifiers is examined.
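
    The data-parallel pattern these machines exploited can be sketched on modern hardware: partition the image into strips, classify each strip in a separate process, and let each pixel's decision use a small spatial neighborhood. The 3x3 mean rule below is a toy stand-in for the real statistical context function, and the strip decomposition is only one of the mappings the CDC and PASM studies considered.

```python
# Sketch: strip-parallel contextual classification. Each worker labels a band
# of rows; each pixel's label depends on its 3x3 neighborhood (the "context").
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def classify_rows(args):
    image, r0, r1 = args
    out = np.zeros((r1 - r0, image.shape[1]), dtype=np.int8)
    for r in range(r0, r1):
        for c in range(image.shape[1]):
            ctx = image[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
            out[r - r0, c] = 1 if ctx.mean() > 0.5 else 0  # toy decision rule
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    image = rng.random((512, 512))                 # stand-in spectral band
    strips = [(image, i, i + 128) for i in range(0, 512, 128)]
    with ProcessPoolExecutor() as pool:            # one strip per worker
        labels = np.vstack(list(pool.map(classify_rows, strips)))
    print(labels.shape)
```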

  13. Aerodynamic instability: A case history

    NASA Technical Reports Server (NTRS)

    Eisenmann, R. C.

    1985-01-01

    The identification, diagnosis, and final correction of complex machinery malfunctions typically require the correlation of many parameters such as mechanical construction, process influence, maintenance history, and vibration response characteristics. The progression of field testing, diagnosis, and final correction of a specific machinery instability problem is reviewed. The case history presented addresses a unique low frequency instability problem on a high pressure barrel compressor. The malfunction was eventually diagnosed as a fluidic mechanism that manifested as an aerodynamic disturbance to the rotor assembly.

  14. Spacelab mission dependent training parametric resource requirements study

    NASA Technical Reports Server (NTRS)

    Ogden, D. H.; Watters, H.; Steadman, J.; Conrad, L.

    1976-01-01

    Training flows were developed for typical missions, resource relationships analyzed, and scheduling optimization algorithms defined. Parametric analyses were performed to study the effect of potential changes in mission model, mission complexity and training time required on the resource quantities required to support training of payload or mission specialists. Typical results of these analyses are presented both in graphic and tabular form.

  15. A high-speed linear algebra library with automatic parallelism

    NASA Technical Reports Server (NTRS)

    Boucher, Michael L.

    1994-01-01

    Parallel or distributed processing is key to getting the highest performance from workstations. However, designing and implementing efficient parallel algorithms is difficult and error-prone. It is even more difficult to write code that is both portable to and efficient on many different computers. Finally, it is harder still to satisfy the above requirements and include the reliability and ease of use required of commercial software intended for use in a production environment. As a result, the application of parallel processing technology to commercial software has been extremely small even though there are numerous computationally demanding programs that would significantly benefit from application of parallel processing. This paper describes DSSLIB, which is a library of subroutines that perform many of the time-consuming computations in engineering and scientific software. DSSLIB combines the high efficiency and speed of parallel computation with a serial programming model that eliminates many undesirable side-effects of typical parallel code. The result is a simple way to incorporate the power of parallel processing into commercial software without compromising maintainability, reliability, or ease of use. This gives significant advantages over less powerful non-parallel entries in the market.
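
    The design point worth noticing is the serial-looking call that hides parallel execution. The sketch below illustrates that idea generically; `par_matmul` is a hypothetical stand-in and bears no relation to DSSLIB's actual interface.

```python
# Sketch of the design idea only: the caller writes an ordinary serial-style
# matrix multiply, while the implementation farms block rows out to threads.
# (NumPy releases the GIL inside @, so threads give real concurrency here.)
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def par_matmul(A: np.ndarray, B: np.ndarray, workers: int = 4) -> np.ndarray:
    out = np.empty((A.shape[0], B.shape[1]), dtype=A.dtype)
    row_blocks = np.array_split(np.arange(A.shape[0]), workers)

    def fill(rows):
        out[rows] = A[rows] @ B        # each thread fills disjoint rows

    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(fill, row_blocks))
    return out                          # same result and call shape as serial

A = np.random.rand(600, 600)
B = np.random.rand(600, 600)
assert np.allclose(par_matmul(A, B), A @ B)   # no parallel side effects leak out
```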

  16. Processes involved in the development of latent fingerprints using the cyanoacrylate fuming method.

    PubMed

    Lewis, L A; Smithwick, R W; Devault, G L; Bolinger, B; Lewis, S A

    2001-03-01

    Chemical processes involved in the development of latent fingerprints using the cyanoacrylate fuming method have been studied. Two major types of latent prints have been investigated: clean and oily prints. Scanning electron microscopy (SEM) has been used as a tool for determining the morphology of the polymer developed separately on clean and oily prints after cyanoacrylate fuming. A correlation between the chemical composition of an aged latent fingerprint, prior to development, and the quality of a developed fingerprint has been observed in the morphology. The moisture in the print prior to fuming has been found to be more important than the moisture in the air during fuming for the development of a useful latent print. In addition, the amount of time required to develop a high quality latent print has been found to be within 2 min. The cyanoacrylate polymerization process is extremely rapid. When heat is used to accelerate the fuming process, typically a period of 2 min is required to develop the print. The optimum development time depends upon the concentration of cyanoacrylate vapors within the enclosure.

  17. High Cycle Fatigue Crack Initiation Study of Cast Blade Alloy Rene 125

    NASA Technical Reports Server (NTRS)

    Kantzos, P.; Gayda, J.; Miner, R. V.; Telesman, J.; Dickerson, P.

    2000-01-01

    This study was conducted in order to investigate and document the high cycle fatigue crack initiation characteristics of blade alloy Rene 125 as cast by three commercially available processes. This alloy is typically used in turbine blade applications. It is currently being considered as a candidate alloy for high T3 compressor airfoil applications. This effort is part of NASA's Advanced Subsonic Technology (AST) program which aims to develop improved capabilities for the next generation subsonic gas turbine engine for commercial carriers. Wrought alloys, which are customarily used for airfoils in the compressor, cannot meet the property goals at the higher compressor exit temperatures that would be required for advanced ultra-high bypass engines. As a result, cast alloys are currently being considered for such applications. Traditional blade materials such as Rene 125 have the high temperature capabilities required for such applications. However, the implementation of cast alloys in compressor airfoil applications, where airfoils are typically much thinner, does raise some issues of concern such as thin wall castability, casting cleanliness, and susceptibility to high-cycle fatigue (HCF) loading.

  18. Inkjet-Printed Porous Silver Thin Film as a Cathode for a Low-Temperature Solid Oxide Fuel Cell.

    PubMed

    Yu, Chen-Chiang; Baek, Jong Dae; Su, Chun-Hao; Fan, Liangdong; Wei, Jun; Liao, Ying-Chih; Su, Pei-Chen

    2016-04-27

    In this work we report a porous silver thin film cathode that was fabricated by a simple inkjet printing process for low-temperature solid oxide fuel cell applications. The electrochemical performance of the inkjet-printed silver cathode was studied at 300-450 °C and was compared with that of silver cathodes that were fabricated by the typical sputtering method. Inkjet-printed silver cathodes showed lower electrochemical impedance due to their porous structure, which facilitated oxygen gaseous diffusion and oxygen surface adsorption-dissociation reactions. A typical sputtered nanoporous silver cathode became essentially dense after the operation and showed high impedance due to a lack of oxygen supply. The results of long-term fuel cell operation show that the cell with an inkjet-printed cathode had a more stable current output for more than 45 h at 400 °C. A porous silver cathode is required for high fuel cell performance, and the simple inkjet printing technique offers an alternative method of fabrication for such a desirable porous structure with the required thermal-morphological stability.

  19. Low-temperature atomic layer deposition of TiO{sub 2} thin layers for the processing of memristive devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Porro, Samuele, E-mail: samuele.porro@polito.it; Conti, Daniele; Guastella, Salvatore

    2016-01-15

    Atomic layer deposition (ALD) represents one of the most fundamental techniques capable of satisfying the strict technological requirements imposed by the rapidly evolving electronic components industry. The actual scaling trend is rapidly leading to the fabrication of nanoscaled devices able to overcome limits of the present microelectronic technology, of which the memristor is one of the principal candidates. Since their development in 2008, TiO{sub 2} thin film memristors have been identified as the future technology for resistive random access memories because of their numerous advantages in producing dense, low power-consuming, three-dimensional memory stacks. The typical features of ALD, such as self-limiting and conformal deposition without line-of-sight requirements, are strong assets for fabricating these nanosized devices. This work focuses on the realization of memristors based on low-temperature ALD TiO{sub 2} thin films. In this process, the oxide layer was directly grown on a polymeric photoresist, thus simplifying the fabrication procedure with a direct liftoff patterning instead of a complex dry etching process. The TiO{sub 2} thin films deposited in a temperature range of 120–230 °C were characterized via Raman spectroscopy and x-ray photoelectron spectroscopy, and electrical current–voltage measurements taken in voltage sweep mode were employed to confirm the existence of resistive switching behaviors typical of memristors. These measurements showed that these low-temperature devices exhibit an ON/OFF ratio comparable to that of a high-temperature memristor, thus exhibiting similar performances with respect to memory applications.

  20. Needs Assessment for the Use of NASA Remote Sensing Data in the Development and Implementation of Estuarine and Coastal Water Quality Standards

    NASA Technical Reports Server (NTRS)

    Spiering, Bruce; Underwood, Lauren; Ellis, Chris; Lehrter, John; Hagy, Jim; Schaeffer, Blake

    2010-01-01

    The goals of the project are to provide information from satellite remote sensing to support numeric nutrient criteria development and to determine data processing methods and data quality requirements to support nutrient criteria development and implementation. The approach is to identify water quality indicators that are used by decision makers to assess water quality and that are related to optical properties of the water; to develop remotely sensed data products based on algorithms relating remote sensing imagery to field-based observations of indicator values; to develop methods to assess estuarine water quality, including trends, spatial and temporal variability, and seasonality; and to develop tools to assist in the development and implementation of estuarine and coastal nutrient criteria. Additional slides present process, criteria development, typical data sources and analyses for criteria process, the power of remote sensing data for the process, examples from Pensacola Bay, spatial and temporal variability, pixel matchups, remote sensing validation, remote sensing in coastal waters, requirements for remotely sensed data products, and needs assessment. An additional presentation examines group engagement and information collection. Topics include needs assessment purpose and objectives, understanding water quality decision making, determining information requirements, and next steps.

  1. Updating National Topographic Data Base Using Change Detection Methods

    NASA Astrophysics Data System (ADS)

    Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.

    2016-06-01

    The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time, and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCA), the updating cycle takes a few years. Today, the reality is dynamic and changes occur every day; therefore, users expect the existing database to portray the current reality. Global mapping projects based on community volunteers, such as OSM, update their databases every day through crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process involved comparing images from different periods. The success rates in identifying objects were low, and most were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, developments in mapping technologies, advances in image processing and computer vision algorithms, together with digital aerial cameras with an NIR band and Very High Resolution satellites, have allowed the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS) classification, MS segmentation, object analysis, and shape forming algorithms. This article reviews the results of a novel change detection methodology as a first step for updating the NTDB at the Survey of Israel.
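
    One heuristic from such a pipeline can be sketched compactly: flag candidate new construction where the surface model rose between epochs while NDVI stays low, so vegetation growth is not mistaken for buildings. The array inputs, band order, and thresholds below are assumptions for illustration, not the Survey of Israel's production rules.

```python
# Sketch: combine DSM differencing with a crude NDVI-based MS classification
# to nominate change pixels for editor review. Thresholds are illustrative.
import numpy as np

def change_mask(dsm_t0, dsm_t1, red_t1, nir_t1,
                min_rise_m=2.5, max_ndvi=0.2):
    ndvi = (nir_t1 - red_t1) / np.clip(nir_t1 + red_t1, 1e-6, None)
    rose = (dsm_t1 - dsm_t0) > min_rise_m     # DSM analysis step
    built = ndvi < max_ndvi                   # vegetation screened out
    return rose & built                       # candidate new structures

rng = np.random.default_rng(5)
shape = (256, 256)
dsm0 = rng.random(shape) * 3
dsm1 = dsm0 + rng.random(shape) * 4           # some areas rise between epochs
red, nir = rng.random(shape), rng.random(shape)
print("candidate pixels:", int(change_mask(dsm0, dsm1, red, nir).sum()))
```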

  2. Perspective: Evolutionary design of granular media and block copolymer patterns

    NASA Astrophysics Data System (ADS)

    Jaeger, Heinrich M.; de Pablo, Juan J.

    2016-05-01

    The creation of new materials "by design" is a process that starts from desired materials properties and proceeds to identify requirements for the constituent components. Such a process is challenging because it inverts the typical modeling approach, which starts from given micro-level components to predict macro-level properties. We describe how to tackle this inverse problem using concepts from evolutionary computation. These concepts have widespread applicability and open up new opportunities for design as well as discovery. Here we apply them to design tasks involving two very different classes of soft materials: shape-optimized granular media and nanopatterned block copolymer thin films.
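
    A compact sketch of the evolutionary loop behind such inverse design: score a population of candidate designs against a target property, keep the best, and mutate them to form the next generation. The toy scalar objective stands in for a real materials simulator, and all parameters are illustrative assumptions.

    ```python
    import random

    def fitness(design, target=0.75):
        # Stand-in for a property predicted by simulation (e.g., packing stiffness)
        predicted = sum(design) / len(design)
        return -abs(predicted - target)   # higher is better

    def evolve(pop_size=50, genes=10, generations=100, mutation=0.1):
        pop = [[random.random() for _ in range(genes)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]        # selection: keep the best half
            children = []
            for p in parents:                      # mutation: perturb each parent
                child = [g + random.gauss(0, mutation) for g in p]
                children.append([min(max(g, 0.0), 1.0) for g in child])
            pop = parents + children
        return max(pop, key=fitness)

    best = evolve()
    print("best fitness:", round(fitness(best), 4))
    ```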

  3. Impact of Site Elevation on Mg Smelter Design

    NASA Astrophysics Data System (ADS)

    Baker, Phillip W.

    Site elevation has many surprising and significant impacts on the engineering design of metallurgical plants of all types. Electrolytic magnesium smelters may be built at high elevation for a variety of reasons, including the availability of raw materials, energy or electric power. Because of the unit processes they typically involve, Mg smelters can be extensively impacted by site elevation. In this paper, generic examples of the design changes required to adapt a smelter originally designed for sea level to operate at 2700 m are presented. While the examples are drawn from a magnesium plant design case, these changes are generically applicable to all industrial plants utilizing similar unit processes, irrespective of product.

  4. DEVELOPMENT OF IMPROVED FABRICATION METHODS, PROCESS AND TECHNIQUES FOR PRODUCING TYPICAL AIRCRAFT SHAPES FROM BERYLLIUM. Interim Technical Documentary Progress Report for the Period ending October 31, 1962

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jenkins, R.G.; Siergiej, J.M.

    1962-12-28

    In a program to develop a complete manufacturing process for the production of beryllium channels, techniques are being sought for drawing to obtain a final product meeting specifications more rigorous than are obtainable by direct extrusion. Progress in designing and procuring the special tooling required to draw complex shapes at elevated temperature is described, and the first set of draw dies is evaluated with respect to design and quality. Three experimental draw attempts have been made on U-channels, in addition to draw tests on flats. (auth)

  5. Protein Folding Using a Vortex Fluidic Device.

    PubMed

    Britton, Joshua; Smith, Joshua N; Raston, Colin L; Weiss, Gregory A

    2017-01-01

    Essentially all biochemistry and most molecular biology experiments require recombinant proteins. However, large, hydrophobic proteins typically aggregate into insoluble and misfolded species, and are directed into inclusion bodies. Current techniques to fold proteins recovered from inclusion bodies rely on denaturation followed by dialysis or rapid dilution. Such approaches can be time consuming, wasteful, and inefficient. Here, we describe rapid protein folding using a vortex fluidic device (VFD). This process uses mechanical energy introduced into thin films to rapidly and efficiently fold proteins. With the VFD in continuous flow mode, large volumes of protein solution can be processed per day with 100-fold reductions in both folding times and buffer volumes.

  6. Continuous welding of unidirectional fiber reinforced thermoplastic tape material

    NASA Astrophysics Data System (ADS)

    Schledjewski, Ralf

    2017-10-01

    Continuous welding techniques like thermoplastic tape placement with in situ consolidation offer several advantages over traditional manufacturing processes like autoclave consolidation, thermoforming, etc. However, several important processing issues still need to be solved before it becomes an economically viable process. Intensive process analysis and optimization have been carried out in the past through experimental investigation, model definition and simulation development. Today, process simulation is capable of predicting the resulting consolidation quality. The effects of material imperfections and process parameter variations are well known. But using this knowledge to control the process, based on online process monitoring and corresponding adaptation of the process parameters, is still challenging. Solving inverse problems and using automated code generation methods that allow fast implementation of algorithms on target hardware are required. The paper explains the placement technique in general. Process-material-property relationships and typical material imperfections are described. Furthermore, online monitoring techniques and how to use them for a model-based process control system are presented.

  7. Development of optical-electronic system for the separation of cullet

    NASA Astrophysics Data System (ADS)

    Solovey, Alexey A.; Alekhin, Artem A.

    2017-06-01

    Broken glass, a waste product in many fields of production, is usually used as a raw material in the production of construction materials. The purity level of collected and processed glass cullet is, as a rule, quite low. Direct usage of these materials without preliminary processing leads to defects in the end product or sometimes even to technological downtime. That is why the purity of cullet should be strictly controlled. The study presents the method of construction and the requirements for an optical-electronic system designed for cullet separation. The authors also propose a registration channel scheme and a scheme of the controlled exposure area. Issues of image processing for the implementation of a typical system are examined as well.

  8. Aspects of forming metal-clad melt-processed Y-Ba-Cu-O tapes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kozlowski, G.; Oberly, C.E.; Ho, J.

    1991-03-01

    This paper reports on melt-processing of the Y-Ba-Cu-O superconductor into a form usable for magnet winding, which requires the development of a cladding with demanding properties. Numerous recent efforts in cold forming Bi-based superconductor tapes have been successful because a silver tube can be used to constrain the ceramic material, which is sintered at a much lower temperature than Y-Ba-Cu-O. Typical high-temperature metals that can be used to encase Y-Ba-Cu-O during sintering do not permit ready diffusion of oxygen as silver does. Recently, the full or partial recovery of superconductivity has been achieved in transition-metal-doped Y-Ba-Cu-O due to partial-melt processing.

  9. Techniques for using diazo materials in remote sensor data analysis

    NASA Technical Reports Server (NTRS)

    Whitebay, L. E.; Mount, S.

    1978-01-01

    The use of data derived from LANDSAT is facilitated when special products or computer-enhanced images can be analyzed. However, the facilities required to produce and analyze such products prevent many users from taking full advantage of the LANDSAT data. A simple, low-cost method is presented by which users can make their own specially enhanced composite images from the four-band black-and-white LANDSAT images by using the diazo process. The diazo process is described, and a detailed procedure for making various color composites, such as color infrared, false natural color, and false color, is provided. The advantages and limitations of the diazo process are discussed. A brief discussion of the interpretation of diazo composites for land-use mapping, with some typical examples, is included.

  10. Successfully use agglomeration for size enlargement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pietsch, W.

    1996-04-01

    The processing of fine and ultrafine particles by size enlargement finds an ever increasing application. At the same time, undesirable agglomeration such as buildup, caking, bridging, and uncontrolled aggregation of fine particles can occur during processing and handling of these particulate solids. This article will provide a survey of the phenomena of agglomeration and discuss the unit operation of size enlargement by agglomeration. This article is also an invitation, particularly to young engineers, to become interested in agglomeration. Considering that mechanical process technologies require more energy every year than any other group of consumers, and that efficiencies are typically in the single digits or teens at best, considerable rewards can be expected from the development of scientifically modified, more energy-efficient methods and equipment.

  11. Computational problems and signal processing in SETI

    NASA Technical Reports Server (NTRS)

    Deans, Stanley R.; Cullers, D. K.; Stauduhar, Richard

    1991-01-01

    The Search for Extraterrestrial Intelligence (SETI), currently being planned at NASA, will require that an enormous amount of data (on the order of 10^11 distinct signal paths for a typical observation) be analyzed in real time by special-purpose hardware. Even though the SETI system design is not based on maximum entropy and Bayesian methods (partly due to the real-time processing constraint), it is expected that enough data will be saved to be able to apply these and other methods off line where computational complexity is not an overriding issue. Interesting computational problems that relate directly to the system design for processing such an enormous amount of data have emerged. Some of these problems are discussed, along with the current status on their solution.

  12. Methodology for the systems engineering process. Volume 3: Operational availability

    NASA Technical Reports Server (NTRS)

    Nelson, J. H.

    1972-01-01

    A detailed description and explanation of the operational availability parameter is presented. The fundamental mathematical basis for operational availability is developed, and its relationship to a system's overall performance effectiveness is illustrated within the context of identifying specific availability requirements. Thus, in attempting to provide a general methodology for treating both hypothetical and existing availability requirements, the concept of an availability state, in conjunction with the more conventional probability-time capability, is investigated. In this respect, emphasis is focused upon a balanced analytical and pragmatic treatment of operational availability within the system design process. For example, several applications of operational availability to typical aerospace systems are presented, encompassing the techniques of Monte Carlo simulation, system performance availability trade-off studies, analytical modeling of specific scenarios, as well as the determination of launch-on-time probabilities. Finally, an extensive bibliography is provided to indicate further levels of depth and detail of the operational availability parameter.
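
    The fundamental quantity referred to here is commonly written as the steady-state availability, A = MTBF / (MTBF + MTTR). The short Monte Carlo sketch below, with illustrative failure and repair rates, shows how that analytical value can be reproduced by simulating alternating up/down cycles, echoing the report's mention of Monte Carlo simulation; it is not the report's actual model.

    ```python
    import random

    MTBF = 500.0   # mean time between failures (hours), illustrative
    MTTR = 20.0    # mean time to repair (hours), illustrative

    analytic = MTBF / (MTBF + MTTR)

    # Monte Carlo: alternate exponentially distributed up and down intervals
    random.seed(1)
    up = down = 0.0
    for _ in range(100_000):
        up += random.expovariate(1.0 / MTBF)
        down += random.expovariate(1.0 / MTTR)

    print(f"analytic availability:  {analytic:.4f}")
    print(f"simulated availability: {up / (up + down):.4f}")
    ```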

  13. Confessions of a robot lobotomist

    NASA Technical Reports Server (NTRS)

    Gottshall, R. Marc

    1994-01-01

    Since their inception, numerically controlled (NC) machining methods have been used throughout the aerospace industry to mill, drill, and turn complex shapes by sequentially stepping through motion programs. However, the recent demand for more precision, faster feeds, exotic sensors, and branching execution has existing computer numerical control (CNC) and distributed numerical control (DNC) systems running at maximum controller capacity. Typical disadvantages of current CNCs include fixed memory capacities, limited communication ports, and the use of multiple control languages. The need to tailor CNCs to meet specific applications, whether it be expanded memory, additional communications, or integrated vision, often requires replacing the original controller supplied with the commercial machine tool with a more powerful and capable system. This paper briefly describes the process and equipment requirements for new controllers and their evolutionary implementation in an aerospace environment. The process of controller retrofit with currently available machines is examined, along with several case studies and their computational and architectural implications.

  14. Optimally designing games for behavioural research

    PubMed Central

    Rafferty, Anna N.; Zaharia, Matei; Griffiths, Thomas L.

    2014-01-01

    Computer games can be motivating and engaging experiences that facilitate learning, leading to their increasing use in education and behavioural experiments. For these applications, it is often important to make inferences about the knowledge and cognitive processes of players based on their behaviour. However, designing games that provide useful behavioural data is a difficult task that typically requires significant trial and error. We address this issue by creating a new formal framework that extends optimal experiment design, used in statistics, to apply to game design. In this framework, we use Markov decision processes to model players' actions within a game, and then make inferences about the parameters of a cognitive model from these actions. Using a variety of concept learning games, we show that in practice, this method can predict which games will result in better estimates of the parameters of interest. The best games require only half as many players to attain the same level of precision. PMID:25002821

  15. PI controller design for indirect vector controlled induction motor: A decoupling approach.

    PubMed

    Jain, Jitendra Kr; Ghosh, Sandip; Maity, Somnath; Dworak, Pawel

    2017-09-01

    Decoupling of the stator currents is important for a smoother torque response of indirect vector controlled induction motors. Typically, feedforward decoupling is used to take care of current coupling, but it requires exact knowledge of motor parameters, additional circuitry, and signal processing. In this paper, a method is proposed to design the regulating proportional-integral gains that minimize coupling without requiring an additional decoupler. The variation of the coupling terms for a change in load torque is considered as the performance measure. An iterative linear matrix inequality based H∞ control design approach is used to obtain the controller gains. A comparison between the feedforward and the proposed decoupling schemes is presented through simulation and experimental results. The results show that the proposed scheme is simple yet effective, even without an additional decoupling block or extra burden on signal processing. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  16. Plug and Process Loads Capacity and Power Requirements Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheppy, M.; Gentile-Polese, L.

    2014-09-01

    This report addresses gaps in actionable knowledge that would help reduce the plug load capacities designed into buildings. Prospective building occupants and real estate brokers lack accurate references for plug and process load (PPL) capacity requirements, so they often request 5-10 W/ft2 in their lease agreements. Limited initial data, however, suggest that actual PPL densities in leased buildings are substantially lower. Overestimating PPL capacity leads designers to oversize electrical infrastructure and cooling systems. Better guidance will enable improved sizing and design of these systems, decrease upfront capital costs, and allow systems to operate more energy efficiently. The main focus of this report is to provide industry with reliable, objective third-party guidance to address the information gap in typical PPL densities for commercial building tenants. This could drive changes in negotiations about PPL energy demands.

  17. Characterization of Monomethylhydrazine (MMH) Non-Volatile Residue

    NASA Technical Reports Server (NTRS)

    Davis, Chuck; Howard, Philip M.

    2009-01-01

    The Space Shuttle program has a unique propellant purity requirement for the determination of nonvolatile residue (NVR) in monomethylhydrazine (MMH). This requirement differs from the Military Specification procurement specification by requiring an NVR analysis with a limit of less than or equal to 10 milligrams per liter. In June 2008, a routine MMH replenishment delivery was transferred into a NASA KSC-owned tanker for future delivery to the Space Shuttle pad MMH storage tank. Per Shuttle standard operating procedure, the receiving tanker was sampled and analyzed for purity, and surprisingly it failed the Shuttle-use NVR specification limit. Detailed examination of the NVR revealed that it was fundamentally different from the typical MMH NVR. This paper will examine various aspects of NVR determination in MMH and the analytical characterization processes used to identify the NVR.

  18. Optical design and performance of F-Theta lenses for high-power and high-precision applications

    NASA Astrophysics Data System (ADS)

    Yurevich, V. I.; Grimm, V. A.; Afonyushkin, A. A.; Yudin, K. V.; Gorny, S. G.

    2015-09-01

    F-Theta lenses are widely used in remote laser processing. Nowadays, a large variety of scanning systems utilizing these devices are commercially available. In this paper, we demonstrate that seemingly trivial practical issues become nontrivial when designing high-performance F-Theta scanning systems. Laser power scaling requires attention to thermally induced phenomena and ghost reflections. This requirement considerably complicates optimization of the optical configuration of the system and primary aberration correction, even during preliminary design. Obtaining high positioning accuracy requires taking into consideration all probable causes of processing field distortion. We briefly describe the key engineering relationships and invariants, the typical design of a scanner lens, and the main field-flattening techniques. Specific emphasis is directed to consideration of the fundamental nonlinearity of two-mirror scanners. To the best of our knowledge, this issue has not yet been studied. We also demonstrate the benefits of our F-Theta lens optimization technique, which uses a plurality of entrance pupils. The problems of eliminating focused ghost reflections and the effects of thermally induced processes in high-power F-Theta lenses are considered. A set of multi-path 3D processing and laser cutting experiments were conducted and are presented herein to demonstrate the impact of laser beam degradation on process performance. A selection of our non-standard optical designs is presented.

  19. Implementation of Statistical Process Control: Evaluating the Mechanical Performance of a Candidate Silicone Elastomer Docking Seal

    NASA Technical Reports Server (NTRS)

    Oravec, Heather Ann; Daniels, Christopher C.

    2014-01-01

    The National Aeronautics and Space Administration has been developing a novel docking system to meet the requirements of future exploration missions to low-Earth orbit and beyond. A dynamic gas pressure seal is located at the main interface between the active and passive mating components of the new docking system. This seal is designed to operate in the harsh space environment, but must also perform within strict loading requirements while maintaining an acceptable level of leak rate. In this study, a candidate silicone elastomer seal was designed, and multiple subscale test articles were manufactured for evaluation purposes. The force required to fully compress each test article at room temperature was quantified and found to be below the maximum allowable load for the docking system. However, a significant amount of scatter was observed in the test results. Due to the stochastic nature of the mechanical performance of this candidate docking seal, a statistical process control technique was implemented to isolate unusual compression behavior from typical mechanical performance. The results of this statistical analysis indicated a lack of process control, suggesting a variation in the manufacturing phase of the process. Further investigation revealed that changes in the manufacturing molding process had occurred which may have influenced the mechanical performance of the seal. This knowledge improves the chance of this and future space seals to satisfy or exceed design specifications.
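
    A minimal sketch of the statistical process control idea applied here: an individuals (X) control chart whose limits come from the average moving range, flagging compression loads outside the usual three-sigma band. The data values, units, and limits are fabricated placeholders, not the actual seal test results.

    ```python
    import numpy as np

    # Illustrative compression-load measurements (N), not the actual seal data
    loads = np.array([812, 805, 820, 798, 815, 809, 950, 811, 803, 818], float)

    # Individuals (X) chart: estimate sigma from the average moving range
    mr = np.abs(np.diff(loads))
    sigma_hat = mr.mean() / 1.128          # d2 constant for subgroup size n = 2
    center = loads.mean()
    ucl = center + 3 * sigma_hat
    lcl = center - 3 * sigma_hat

    for i, x in enumerate(loads):
        flag = "OUT OF CONTROL" if (x > ucl or x < lcl) else "ok"
        print(f"sample {i}: {x:6.1f}  {flag}")
    print(f"center={center:.1f}  LCL={lcl:.1f}  UCL={ucl:.1f}")
    ```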

  20. Fast processing of microscopic images using object-based extended depth of field.

    PubMed

    Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Pannarut, Montri; Shaw, Philip J; Tongsima, Sissades

    2016-12-22

    Microscopic analysis requires that foreground objects of interest, e.g. cells, are in focus. In a typical microscopic specimen, the foreground objects may lie on different depths of field necessitating capture of multiple images taken at different focal planes. The extended depth of field (EDoF) technique is a computational method for merging images from different depths of field into a composite image with all foreground objects in focus. Composite images generated by EDoF can be applied in automated image processing and pattern recognition systems. However, current algorithms for EDoF are computationally intensive and impractical, especially for applications such as medical diagnosis where rapid sample turnaround is important. Since foreground objects typically constitute a minor part of an image, the EDoF technique could be made to work much faster if only foreground regions are processed to make the composite image. We propose a novel algorithm called object-based extended depths of field (OEDoF) to address this issue. The OEDoF algorithm consists of four major modules: 1) color conversion, 2) object region identification, 3) good contrast pixel identification and 4) detail merging. First, the algorithm employs color conversion to enhance contrast followed by identification of foreground pixels. A composite image is constructed using only these foreground pixels, which dramatically reduces the computational time. We used 250 images obtained from 45 specimens of confirmed malaria infections to test our proposed algorithm. The resulting composite images with all in-focus objects were produced using the proposed OEDoF algorithm. We measured the performance of OEDoF in terms of image clarity (quality) and processing time. The features of interest selected by the OEDoF algorithm are comparable in quality with equivalent regions in images processed by the state-of-the-art complex wavelet EDoF algorithm; however, OEDoF required four times less processing time. This work presents a modification of the extended depth of field approach for efficiently enhancing microscopic images. This selective object processing scheme used in OEDoF can significantly reduce the overall processing time while maintaining the clarity of important image features. The empirical results from parasite-infected red cell images revealed that our proposed method efficiently and effectively produced in-focus composite images. With the speed improvement of OEDoF, this proposed algorithm is suitable for processing large numbers of microscope images, e.g., as required for medical diagnosis.
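
    A simplified sketch of the OEDoF idea: restrict per-pixel focus selection to a foreground mask so only object regions are merged across the focal stack. Sharpness is scored here with a local Laplacian response; the mask construction, scoring choice, and array shapes are illustrative assumptions, not the published implementation.

    ```python
    import numpy as np
    from scipy import ndimage

    def oedof_merge(stack, fg_mask):
        """Merge a focal stack (k, h, w) using only foreground pixels.

        stack   : grayscale images from k focal planes
        fg_mask : boolean (h, w) mask of foreground objects
        """
        # Sharpness score per plane: magnitude of the Laplacian response
        sharp = np.stack([np.abs(ndimage.laplace(img)) for img in stack])

        # For each foreground pixel, pick the plane with the best contrast
        best = np.argmax(sharp, axis=0)
        merged = stack[0].copy()                  # background: any single plane
        ys, xs = np.nonzero(fg_mask)              # process foreground only
        merged[ys, xs] = stack[best[ys, xs], ys, xs]
        return merged

    rng = np.random.default_rng(0)
    stack = rng.uniform(0, 1, (3, 64, 64))        # toy 3-plane focal stack
    mask = np.zeros((64, 64), bool)
    mask[20:40, 20:40] = True                     # toy foreground region
    out = oedof_merge(stack, mask)
    print(out.shape)
    ```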

  1. Managing the Nuclear Fuel Cycle: Policy Implications of Expanding Global Access to Nuclear Power

    DTIC Science & Technology

    2009-07-01

    inalienable right and, by and large, neither have U.S. government officials. However, the case of Iran raises perhaps the most critical question in this...slight difference in atomic mass between 235U and 238U. The typical enrichment process requires about 10 lbs of uranium U3O8 to produce 1 lb of low...thermal neutrons but can induce fission in all actinides, including all plutonium isotopes. Therefore, nuclear fuel for a fast reactor must have a
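
    The quoted feed-to-product figure follows from the standard enrichment mass balance, F·x_f = P·x_p + W·x_w with F = P + W. A back-of-the-envelope sketch with assumed assays (0.711% natural feed, 4% product, 0.3% tails); the result is on a uranium-metal basis, so the U3O8 figure in the text comes out somewhat higher.

    ```python
    # Enrichment mass balance: F*x_f = P*x_p + W*x_w and F = P + W
    x_f = 0.00711   # natural uranium feed assay (U-235 fraction)
    x_p = 0.04      # low-enriched product assay, illustrative
    x_w = 0.003     # tails assay, illustrative

    # Feed required per unit of product
    feed_per_product = (x_p - x_w) / (x_f - x_w)
    print(f"~{feed_per_product:.1f} units of natural U feed per unit of LEU")
    ```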

  2. A new active solder for joining electronic components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SMITH,RONALD W.; VIANCO,PAUL T.; HERNANDEZ,CYNTHIA L.

    Electronic components and micro-sensors utilize ceramic substrates, copper and aluminum interconnects, and silicon. The joining of these combinations requires pre-metallization so that solders with fluxes can wet such combinations of metals and ceramics. The paper will present a new solder alloy that can bond metals, ceramics and composites. The alloy directly wets and bonds in air without the use of flux or premetallized layers. The paper will present typical processing steps and joint microstructures in copper, aluminum, aluminum oxide, aluminum nitride, and silicon joints.

  3. [Intraparotid first branchial arch cyst: complex diagnostic and therapeutic process].

    PubMed

    Gilabert Rodríguez, R; Berenguer, B; González Meli, B; Marín Molina, C; de Tomás Palacios, E; Buitrago Weiland, G; Aguado del Hoyo, A

    2013-01-01

    First branchial arch cysts are uncommon. Therefore, together with their variable clinical and age presentation, they are often misdiagnosed at first. The treatment is surgical, requiring a correct procedure to avoid future recurrences. In this paper we describe a typical case of a first branchial arch cyst in which, as described in other reports, we initially made several misdiagnoses and therefore provided inadequate treatment; finally, with the correct diagnosis, we performed a meticulous complete excision under facial nerve monitoring.

  4. Neurofeedback Training for BCI Control

    NASA Astrophysics Data System (ADS)

    Neuper, Christa; Pfurtscheller, Gert

    Brain-computer interface (BCI) systems detect changes in brain signals that reflect human intention, then translate these signals to control monitors or external devices (for a comprehensive review, see [1]). BCIs typically measure electrical signals resulting from neural firing (i.e., neuronal action potentials, the electrocorticogram (ECoG), or the electroencephalogram (EEG)). Sophisticated pattern recognition and classification algorithms convert neural activity into the required control signals. BCI research has focused heavily on developing powerful signal processing and machine learning techniques to accurately classify neural activity [2-4].
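
    A toy sketch of the pipeline described: extract a band-power feature from an EEG epoch and classify it with the simplest possible linear rule. The sampling rate, band edges, and synthetic data are assumptions for illustration; real BCIs use far richer feature sets and classifiers.

    ```python
    import numpy as np
    from numpy.fft import rfft, rfftfreq

    def bandpower(epoch, fs=250.0, band=(8.0, 12.0)):
        """Mean spectral power of one EEG epoch in a band (e.g., the mu rhythm)."""
        spectrum = np.abs(rfft(epoch)) ** 2
        freqs = rfftfreq(len(epoch), 1.0 / fs)
        sel = (freqs >= band[0]) & (freqs <= band[1])
        return spectrum[sel].mean()

    # Synthetic two-class data: "intent" epochs carry stronger 10 Hz activity
    rng = np.random.default_rng(0)
    t = np.arange(500) / 250.0
    epochs, labels = [], []
    for k in range(40):
        amp = 2.0 if k % 2 else 0.5
        epochs.append(amp * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size))
        labels.append(k % 2)

    feats = np.array([bandpower(e) for e in epochs])
    threshold = feats.mean()              # simplest possible linear classifier
    pred = (feats > threshold).astype(int)
    print("accuracy:", (pred == np.array(labels)).mean())
    ```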

  5. Solar energy research and utilization

    NASA Technical Reports Server (NTRS)

    Cherry, W. R.

    1974-01-01

    The role of solar energy is visualized in the heating and cooling of buildings, in the production of renewable gaseous, liquid and solid fuels, and in the production of electric power over the next 45 years. Potential impacts of solar energy on various energy markets, and estimated costs of such solar energy systems are discussed. Some typical solar energy utilization processes are described in detail. It is expected that at least 20% of the U.S. total energy requirements by 2020 will be delivered from solar energy.

  6. The National Shipbuilding Research Program, 1991 Ship Production Symposium Proceedings: Paper No. IVA-3: Improving Your Competitive Position Through Total Quality Management (TQM)

    DTIC Science & Technology

    1991-09-01

    process of TQM, it will expect help from suppliers in the pursuit of increased product and service quality. So if your customers are describing their...customer expectations typically prompt the changes. Remaining competitive in today's global economy requires an increased level of product and service quality at lower cost. In government the motivation often arises from Presidential Order #12552, or more importantly, constrained budgets. The

  7. Automated video surveillance: teaching an old dog new tricks

    NASA Astrophysics Data System (ADS)

    McLeod, Alastair

    1993-12-01

    The automated video surveillance market is booming with new players, new systems, new hardware and software, and an extended range of applications. This paper reviews available technology and describes the features required for a good automated surveillance system. Both hardware and software are discussed. An overview of typical applications is also given. A shift towards PC-based hybrid systems, the use of parallel processing, neural networks, and the exploitation of modern telecommunications are introduced, highlighting the evolution of modern video surveillance systems.

  8. Lock hopper values for coal gasification plant service

    NASA Technical Reports Server (NTRS)

    Schoeneweis, E. F.

    1977-01-01

    Although the operating principle of the lock hopper system is extremely simple, valve applications involving this service in coal gasification plants are extremely difficult. The difficulties center on the requirement of handling highly erosive pulverized coal or char (either in dry or slurry form) combined with the requirement of providing tight sealing against high-pressure (possibly very hot) gas. Operating pressures and temperatures in these applications typically range up to 1600 psi (110 bar) and 600 F (316 C), with certain process requirements going even higher. In addition, and of primary concern, is the need for reliable operation over long service periods with provision for practical and economical maintenance. Currently available data indicate a requirement for something on the order of 20,000 to 30,000 open-close cycles per year and a desire to operate at least that long without valve failure.

  9. The fluid dynamics of microjet explosions caused by extremely intense X-ray pulses

    NASA Astrophysics Data System (ADS)

    Stan, Claudiu; Laksmono, Hartawan; Sierra, Raymond; Milathianaki, Despina; Koglin, Jason; Messerschmidt, Marc; Williams, Garth; Demirci, Hasan; Botha, Sabine; Nass, Karol; Stone, Howard; Schlichting, Ilme; Shoeman, Robert; Boutet, Sebastien

    2014-11-01

    Femtosecond X-ray scattering experiments at free-electron laser facilities typically require liquid jet delivery methods to bring samples to the region of interaction with the X-rays. We have optically imaged the damage process in water microjets caused by intense hard X-ray pulses at the Linac Coherent Light Source (LCLS), using time-resolved imaging techniques to record movies at rates up to half a billion frames per second. For pulse energies larger than a few percent of the maximum pulse energy available at LCLS, the X-rays deposit energy much larger than the latent heat of vaporization of water, and induce a phase explosion that opens a gap in the jet. The LCLS pulses last a few tens of femtoseconds, but the full evolution of the broken jet is orders of magnitude slower - typically in the microsecond range - due to complex fluid dynamics processes triggered by the phase explosion. Although the explosion results in a complex sequence of phenomena, these lead to an approximately self-similar flow of the liquid in the jet.

  10. Learning new faces in typical and atypical populations of children.

    PubMed

    Jones, Rebecca R; Blades, Mark; Coleman, Mike; Pascalis, Olivier

    2013-02-01

    Recognizing an individual as familiar is an important aspect of our social cognition, which requires both learning a face and recalling it. It has been suggested that children with autistic spectrum disorder (ASD) have deficits and abnormalities in face processing. We investigated whether the process by which unfamiliar faces become familiar differs in typically developing (TD) children, children with ASD, and children with developmental delay. Children were familiarized with a set of moving novel faces presented over a three-day period. Recognition of the learned faces was assessed at five time points during the three-day period. Both immediate and delayed recall of faces was tested. All groups showed improvements in face recognition at immediate recall, which indicated that learning had occurred. The TD population showed slightly better performance than the two other groups; however, no difference was specific to the ASD group. All groups showed similar levels of improvement with time. Our results are discussed in terms of learning in ASD. © 2013 The Authors. Scandinavian Journal of Psychology © 2013 The Scandinavian Psychological Associations.

  11. Image simulation for automatic license plate recognition

    NASA Astrophysics Data System (ADS)

    Bala, Raja; Zhao, Yonghui; Burry, Aaron; Kozitsky, Vladimir; Fillion, Claude; Saunders, Craig; Rodríguez-Serrano, José

    2012-01-01

    Automatic license plate recognition (ALPR) is an important capability for traffic surveillance applications, including toll monitoring and detection of different types of traffic violations. ALPR is a multi-stage process comprising plate localization, character segmentation, optical character recognition (OCR), and identification of originating jurisdiction (i.e. state or province). Training of an ALPR system for a new jurisdiction typically involves gathering vast amounts of license plate images and associated ground truth data, followed by iterative tuning and optimization of the ALPR algorithms. The substantial time and effort required to train and optimize the ALPR system can result in excessive operational cost and overhead. In this paper we propose a framework to create an artificial set of license plate images for accelerated training and optimization of ALPR algorithms. The framework comprises two steps: the synthesis of license plate images according to the design and layout for a jurisdiction of interest; and the modeling of imaging transformations and distortions typically encountered in the image capture process. Distortion parameters are estimated by measurements of real plate images. The simulation methodology is successfully demonstrated for training of OCR.
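
    A minimal sketch of the second step of such a framework: degrading a clean synthetic plate image with capture-like distortions (blur plus additive sensor noise). The kernel size and noise level stand in for parameters one would estimate from real plate images; this is not the authors' actual distortion model.

    ```python
    import numpy as np

    def degrade(plate, blur_size=3, noise_sigma=8.0, seed=0):
        """Apply simple capture distortions to a synthetic plate image (uint8)."""
        img = plate.astype(float)

        # Motion/defocus stand-in: box blur via a separable moving average
        kernel = np.ones(blur_size) / blur_size
        img = np.apply_along_axis(lambda r: np.convolve(r, kernel, "same"), 1, img)
        img = np.apply_along_axis(lambda c: np.convolve(c, kernel, "same"), 0, img)

        # Additive sensor noise with a level estimated from real captures
        rng = np.random.default_rng(seed)
        img += rng.normal(0.0, noise_sigma, img.shape)
        return np.clip(img, 0, 255).astype(np.uint8)

    plate = np.full((40, 120), 255, np.uint8)   # white plate background
    plate[12:28, 20:100:10] = 0                 # crude stand-in "characters"
    noisy = degrade(plate)
    print(noisy.dtype, noisy.shape)
    ```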

  12. Production of Methane and Water from Crew Plastic Waste

    NASA Technical Reports Server (NTRS)

    Captain, Janine; Santiago, Eddie; Parrish, Clyde; Strayer, Richard F.; Garland, Jay L.

    2008-01-01

    Recycling is a technology that will be key to creating a self-sustaining lunar outpost. The plastics used for food packaging provide a source of material that could be recycled to produce water and methane. The recycling of these plastics will require some additional resources that will affect the initial estimate of starting materials that have to be transported from Earth, mainly in terms of oxygen, energy and mass. These requirements will vary depending on the recycling conditions. The degradation products of these plastics will vary under different atmospheric conditions. An estimate of the production rate of methane and water using typical ISRU processes along with the plastic recycling will be presented.

  13. Reproducible analyses of microbial food for advanced life support systems

    NASA Technical Reports Server (NTRS)

    Petersen, Gene R.

    1988-01-01

    The use of yeasts in controlled ecological life support systems (CELSS) for microbial food regeneration in space required the accurate and reproducible analysis of intracellular carbohydrate and protein levels. The reproducible analysis of glycogen was a key element in estimating the overall content of edibles in candidate yeast strains. Typical analytical methods for estimating glycogen in Saccharomyces were not found to be entirely applicable to other candidate strains. Rigorous cell lysis coupled with acid/base fractionation followed by specific enzymatic glycogen analyses was required to obtain accurate results in two strains of Candida. A profile of edible fractions of these strains was then determined. The suitability of yeasts as food sources in CELSS food production processes is discussed.

  14. Maintainability Program Requirements for Space Systems

    NASA Technical Reports Server (NTRS)

    1987-01-01

    This document is established to provide common general requirements for all NASA programs to: design maintainability into all systems where maintenance is a factor in system operation and mission success; and ensure that maintainability characteristics are developed through the systems engineering process. These requirements are not new. Design for ease of maintenance and minimization of repair time have always been fundamental requirements of the systems engineering process. However, new or reusable orbital manned and in-flight maintainable unmanned space systems demand special emphasis on maintainability, and this document has been prepared to meet that need. Maintainability requirements on many NASA programs differ in phasing and task emphasis from requirements promulgated by other Government agencies. This difference is due to the research and development nature of NASA programs, where quantities produced are generally small; therefore, the depth of logistics support typical of many programs is generally not warranted. The cost of excessive maintenance is very high due to the logistics problems associated with the space environment. The ability to provide timely maintenance often involves safety considerations for manned space flight applications. This document represents a basic set of requirements that will achieve a design for maintenance. These requirements are directed primarily at manned and unmanned orbital space systems. To be effective, maintainability requirements should be tailored to meet specific NASA program and project needs and constraints. NASA activities shall invoke the requirements of this document consistent with program planning in procurements or on in-house development efforts.

  15. How to justify small-refinery info/control system modernization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haskins, D.E.

    1993-05-01

    Information and control systems modernization can be justified by the successful implementation of advanced process control (APC) in nearly all refineries, even small ones. However, small refineries require special solutions to meet the challenges of limited resources in both finance and manpower. Based on a number of case studies, a typical small refinery as it operates today is described. A sample information and control system modernization plan is presented, and the typical costs and benefits show how the project cost can be justified. The business objectives of an HPI plant are to satisfy customers by providing specific products, to satisfy the owners by maximizing profits, and to satisfy the public by being safe and environmentally correct. Managers have always tried to meet these objectives with functions for the total plant.

  16. Evaluation of Mars CO2 Capture and Gas Separation Technologies

    NASA Technical Reports Server (NTRS)

    Muscatello, Anthony C.; Santiago-Maldonado, Edgardo; Gibson, Tracy; Devor, Robert; Captain, James

    2011-01-01

    Recent national policy statements have established that the ultimate destination of NASA's human exploration program is Mars. In Situ Resource Utilization (ISRU) is a key technology required to enable such missions, and it is appropriate to review progress in this area and continue to advance the systems required to produce rocket propellant, oxygen, and other consumables on Mars using the carbon dioxide atmosphere and other potential resources. The Mars Atmospheric Capture and Gas Separation project is selecting, developing, and demonstrating techniques to capture and purify Martian atmospheric gases for the production of hydrocarbons, oxygen, and water in ISRU systems. Trace gases must be separated from Martian atmospheric gases to provide pure CO2 to processing elements. In addition, other Martian gases, such as nitrogen and argon, occur in concentrations high enough to be useful as buffer gas and should be captured as well. To achieve these goals, highly efficient gas separation processes will be required. These gas separation techniques are also required across various areas within the ISRU project to support various consumable production processes. The development of innovative gas separation techniques will evaluate the current state of the art for the gas separations required, with the objective of demonstrating and developing lightweight, low-power methods. Gas separation requirements include, but are not limited to, the selective separation of: (1) methane and water from unreacted carbon oxides (CO2-CO) and hydrogen typical of a Sabatier-type process, (2) carbon oxides and water from unreacted hydrogen from a Reverse Water-Gas Shift process, (3) carbon oxides from oxygen from a trash/waste processing reaction, and (4) helium from hydrogen or oxygen from a propellant scavenging process. Potential technologies for the separations include freezers, selective membranes, selective solvents, polymeric sorbents, zeolites, and new technologies. This paper summarizes the results of an extensive literature review of candidate technologies for the capture and separation of CO2 and other relevant gases. This information will be used to prioritize the technologies to be developed further during this and other ISRU projects.

  17. Reviewed approach to defining the Active Interlock Envelope for Front End ray tracing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seletskiy, S.; Shaftan, T.

    To protect the NSLS-II Storage Ring (SR) components from damage from synchrotron radiation produced by insertion devices (IDs), the Active Interlock (AI) keeps the electron beam within a safe envelope (a.k.a. the Active Interlock Envelope or AIE) in the transverse phase space. The beamline Front Ends (FEs) are designed under the assumption that above a certain beam current (typically 2 mA) the ID synchrotron radiation (IDSR) fan is produced by the interlocked e-beam. These assumptions also define how the ray tracing for the FE is done. To simplify the FE ray tracing for a typical uncanted ID, it was decided to provide the Mechanical Engineering group with a single set of numbers (x, x', y, y') for the AIE at the center of the long (or short) ID straight section. Such a unified approach to the design of the beamline Front Ends will accelerate the design process and save valuable human resources. In this paper we describe our new approach to defining the AI envelope and provide the resulting numbers required for the design of the typical Front End.

  18. Design of an FMCW radar baseband signal processing system for automotive application.

    PubMed

    Lin, Jau-Jr; Li, Yuan-Ping; Hsu, Wei-Chiang; Lee, Ta-Sung

    2016-01-01

    For a typical FMCW automotive radar system, a new design of the baseband signal processing architecture and algorithms is proposed to overcome the ghost-target and overlapping problems in multi-target detection scenarios. To satisfy the short measurement time constraint without increasing the RF front-end loading, a three-segment waveform with different slopes is utilized. By introducing a new pairing mechanism and a spatial filter design algorithm, the proposed detection architecture not only provides high accuracy and reliability, but also requires low pairing time and computational loading. The proposed baseband signal processing architecture and algorithms balance performance and complexity, and are suitable for implementation in a real automotive radar system. Field measurement results demonstrate that the proposed automotive radar signal processing system can perform well in a realistic application scenario.
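
    A sketch of why multiple chirp slopes disambiguate targets: each slope S maps a target's range R and radial velocity v to a beat frequency f_b = (2R/c)S + (2v/c)f_c, so candidate (R, v) pairings from two slopes can be checked against the beats of a third segment to reject ghosts. The slopes, carrier frequency, and tolerance below are illustrative, not the paper's parameters.

    ```python
    import itertools

    C = 3e8          # speed of light (m/s)
    FC = 77e9        # carrier frequency (Hz), typical automotive band
    SLOPES = (20e12, -20e12, 5e12)   # Hz/s, illustrative three-segment waveform
    TOL = 200.0      # beat-matching tolerance (Hz)

    def beat(r, v, slope):
        return (2 * r / C) * slope + (2 * v / C) * FC

    def solve_rv(f1, f2, s1, s2):
        """Invert two beat equations for range and velocity."""
        r = C * (f1 - f2) / (2 * (s1 - s2))
        v = C * (f1 - (2 * r / C) * s1) / (2 * FC)
        return r, v

    # Two true targets give two beats per segment; pairing across segments
    # is ambiguous, producing both valid solutions and ghosts
    targets = [(50.0, 10.0), (80.0, -5.0)]          # (range m, velocity m/s)
    beats = [sorted(beat(r, v, s) for r, v in targets) for s in SLOPES]

    for f1, f2 in itertools.product(beats[0], beats[1]):
        r, v = solve_rv(f1, f2, SLOPES[0], SLOPES[1])
        f3 = beat(r, v, SLOPES[2])                  # predicted third-segment beat
        ok = any(abs(f3 - fb) < TOL for fb in beats[2])
        print(f"R={r:7.1f} m  v={v:8.1f} m/s  -> {'valid' if ok else 'ghost'}")
    ```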

  19. Model of areas for identifying risks influencing the compliance of technological processes and products

    NASA Astrophysics Data System (ADS)

    Misztal, A.; Belu, N.

    2016-08-01

    The operation of every company is associated with the risk of interference with the proper performance of its fundamental processes. This risk is associated with various internal areas of the company, as well as with the environment in which it operates. From the point of view of ensuring the compliance of specific technological processes and, consequently, product conformity with requirements, it is important to identify these threats and eliminate or reduce the risk of their occurrence. The purpose of this article is to present a model of areas for identifying risks affecting the compliance of processes and products, which is based on multiregional targeted monitoring of typical places of interference and on risk management methods. The model is based on the verification of risk analyses carried out in small and medium-sized manufacturing companies in various industries.

  20. Quantum state conversion in opto-electro-mechanical systems via shortcut to adiabaticity

    NASA Astrophysics Data System (ADS)

    Zhou, Xiao; Liu, Bao-Jie; Shao, L.-B.; Zhang, Xin-Ding; Xue, Zheng-Yuan

    2017-09-01

    Adiabatic processes have found many important applications in modern physics, their distinct merit being that accurate control over process timing is not required. However, such processes are slow, which limits their application in quantum computation, due to the limited coherence times of typical quantum systems. Here, we propose a scheme to implement quantum state conversion in opto-electro-mechanical systems via a shortcut to adiabaticity, where the process can be greatly sped up while precise timing control is still not necessary. In our scheme, by modifying only the coupling strength, we can achieve fast quantum state conversion with high fidelity, where the adiabatic condition does not need to be met. In addition, the population of the unwanted intermediate state can be further suppressed. Therefore, our protocol presents an important step towards practical state conversion between optical and microwave photons, and thus may find many important applications in hybrid quantum information processing.

  1. New Insights on Coastal Foredune Growth: The Relative Contributions of Marine and Aeolian Processes

    NASA Astrophysics Data System (ADS)

    Cohn, Nicholas; Ruggiero, Peter; de Vries, Sierd; Kaminsky, George M.

    2018-05-01

    Coastal foredune growth is typically associated with aeolian sediment transport processes, while foredune erosion is associated with destructive marine processes. New data sets collected at a high energy, dissipative beach suggest that total water levels in the collision regime can cause dunes to accrete—requiring a paradigm shift away from considering collisional wave impacts as unconditionally erosional. From morphologic change data sets, it is estimated that marine processes explain between 9% and 38% of annual dune growth with aeolian processes accounting for the remaining 62% to 91%. The largest wind-driven dune growth occurs during the winter, in response to high wind velocities, but out of phase with summertime beach growth via intertidal sandbar welding. The lack of synchronization between maximum beach sediment supply and wind-driven dune growth indicates that aeolian transport at this site is primarily transport, rather than supply, limited, likely due to a lack of fetch limitations.

  2. Software for biomedical engineering signal processing laboratory experiments.

    PubMed

    Tompkins, Willis J; Wilson, J

    2009-01-01

    In the early 1990s we developed a special computer program called UW DigiScope to provide a mechanism for anyone interested in biomedical digital signal processing to study the field without requiring any instrument other than a personal computer. There are many digital filtering and pattern recognition algorithms used in processing biomedical signals. In general, students have very limited opportunity for hands-on access to the mechanisms of digital signal processing. In a typical course, the filters are designed non-interactively, which does not provide the student with significant understanding of the design constraints of such filters nor of their actual performance characteristics. UW DigiScope 3.0 is the first major update since version 2.0 was released in 1994. This paper provides details on how the new version, based on MATLAB, works with signals, including the filter design tool that is the programming interface between UW DigiScope and processing algorithms.
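
    For flavor, a hedged sketch of the kind of hands-on filter experiment such a package supports, written here with SciPy rather than UW DigiScope itself: design a small FIR bandpass for an ECG-like band and apply it to a synthetic signal contaminated with baseline wander and mains hum. The sampling rate, band edges, and signal components are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.signal import firwin, lfilter

    fs = 250.0                                   # sampling rate (Hz), illustrative
    # 101-tap FIR bandpass passing roughly the 0.5-40 Hz ECG band
    taps = firwin(numtaps=101, cutoff=[0.5, 40.0], pass_zero=False, fs=fs)

    # Synthetic "ECG": spiky quasi-periodic component plus two contaminants
    t = np.arange(0, 10, 1 / fs)
    signal = (np.sin(2 * np.pi * 1.2 * t) ** 15      # spiky heartbeat-like term
              + 0.5 * np.sin(2 * np.pi * 0.2 * t)    # baseline wander (below band)
              + 0.2 * np.sin(2 * np.pi * 60 * t))    # mains interference (above band)
    filtered = lfilter(taps, 1.0, signal)
    print("filtered signal power:",
          round(float(np.mean(filtered[500:] ** 2)), 3))
    ```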

  3. Interference effects of vocalization on dual task performance

    NASA Astrophysics Data System (ADS)

    Owens, J. M.; Goodman, L. S.; Pianka, M. J.

    1984-09-01

    Voice command and control systems have been proposed as a potential means of off-loading the typically overburdened visual information processing system. However, prior to introducing novel human-machine interfacing technologies in high-workload environments, consideration must be given to the integration of the new technologies within existing task structures to ensure that no new sources of workload or interference are systematically introduced. This study examined the use of voice interactive systems technology in the joint performance of two cognitive information processing tasks requiring continuous memory and choice reaction, wherein a basis for intertask interference might be expected. Stimuli for the continuous memory task were presented aurally, and either voice or keyboard responding was required in the choice reaction task. Performance was significantly degraded in each task when voice responding was required in the choice reaction time task. Performance degradation was evident in higher error scores for both the choice reaction and continuous memory tasks. Performance decrements observed under conditions of high intertask stimulus similarity were not statistically significant. The results signal the need to further consider the task requirements for verbal short-term memory when applying speech technology in multitask environments.

  4. Visual Motion Prediction and Verbal False Memory Performance in Autistic Children.

    PubMed

    Tewolde, Furtuna G; Bishop, Dorothy V M; Manning, Catherine

    2018-03-01

    Recent theoretical accounts propose that atypical predictive processing can explain the diverse cognitive and behavioral features associated with autism, and that difficulties in making predictions may be related to reduced contextual processing. In this pre-registered study, 30 autistic children aged 6-14 years and 30 typically developing children matched in age and non-verbal IQ completed visual extrapolation and false memory tasks to assess predictive abilities and contextual processing, respectively. In the visual extrapolation tasks, children were asked to predict when an occluded car would reach the end of a road and when an occluded set of lights would fill up a grid. Autistic children made predictions that were just as precise as those made by typically developing children, across a range of occlusion durations. In the false memory task, autistic and typically developing children did not differ significantly in their discrimination between items presented in a list and semantically related, non-presented items, although the data were insensitive, suggesting the need for larger samples. Our findings help to refine theoretical accounts by challenging the notion that autism is caused by pervasively disordered prediction abilities. Further studies will be required to assess the relationship between predictive processing and context use in autism, and to establish the conditions under which predictive processing may be impaired. Autism Res 2018, 11: 509-518. © 2017 The Authors Autism Research published by International Society for Autism Research and Wiley Periodicals, Inc. It has been suggested that autistic individuals have difficulties making predictions and perceiving the overall gist of things. Yet, here we found that autistic children made similar predictions about hidden objects as non-autistic children. In a memory task, autistic children were slightly less confused about whether they had heard a word before, when words were closely related in meaning. We conclude that autistic children do not show difficulties with this type of prediction.

  5. A Highly Flexible, Automated System Providing Reliable Sample Preparation in Element- and Structure-Specific Measurements.

    PubMed

    Vorberg, Ellen; Fleischer, Heidi; Junginger, Steffen; Liu, Hui; Stoll, Norbert; Thurow, Kerstin

    2016-10-01

    Life science areas require specific sample pretreatment to increase the concentration of the analytes and/or to convert the analytes into an appropriate form for the detection and separation systems. Various workstations are commercially available, allowing for automated biological sample pretreatment. Nevertheless, due to the required temperature, pressure, and volume conditions in typical element- and structure-specific measurements, such automated platforms are not suitable for these analytical processes. Thus, the purpose of the presented investigation was the design, realization, and evaluation of an automated system ensuring high-precision sample preparation for a variety of analytical measurements. The developed system has to enable system adaptation and high performance flexibility. Furthermore, the system has to be capable of dealing with the wide range of required vessels simultaneously, allowing for less costly and time-consuming process steps. The system's functionality has been confirmed in various validation sequences. Using element-specific measurements, the automated system was up to 25% more precise compared to the manual procedure, and as precise as the manual procedure using structure-specific measurements. © 2015 Society for Laboratory Automation and Screening.

  6. low-Cost, High-Performance Alternatives for Target Temperature Monitoring Using the Near-Infrared Spectrum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Virgo, Mathew; Quigley, Kevin J.; Chemerisov, Sergey

    A process is being developed for commercial production of the medical isotope Mo-99 through a photo-nuclear reaction on a Mo-100 target using a high-power electron accelerator. This process requires temperature monitoring of the window through which a high-current electron beam is transmitted to the target. For this purpose, we evaluated two near-infrared technologies: the OMEGA Engineering iR2 pyrometer and the Ocean Optics Maya2000 spectrometer with an infrared-enhanced charge-coupled device (CCD) sensor. Measuring in the near-infrared spectrum, in contrast to the long-wavelength infrared spectrum, offers a few immediate advantages: (1) ordinary glass or quartz optical elements can be used; (2) alignment can be performed without heating the target; and (3) emissivity corrections to temperature are typically less than 10%. If spatial resolution is not required, the infrared pyrometer is attractive because of its accuracy, low cost, and simplicity. If spatial resolution is required, we make recommendations for near-infrared imaging based on our data augmented by calculations.

  7. Preliminary evidence that different mechanisms underlie the anger superiority effect in children with and without Autism Spectrum Disorders

    PubMed Central

    Isomura, Tomoko; Ogawa, Shino; Yamada, Satoko; Shibasaki, Masahiro; Masataka, Nobuo

    2014-01-01

    Previous studies have demonstrated that angry faces capture humans' attention more rapidly than emotionally positive faces. This phenomenon is referred to as the anger superiority effect (ASE). Despite atypical emotional processing, adults and children with Autism Spectrum Disorders (ASD) have been reported to show ASE as well as typically developed (TD) individuals. So far, however, few studies have clarified whether or not the mechanisms underlying ASE are the same for both TD and ASD individuals. Here, we tested how TD and ASD children process schematic emotional faces during detection by employing a recognition task in combination with a face-in-the-crowd task. Results of the face-in-the-crowd task revealed the prevalence of ASE both in TD and ASD children. However, the results of the recognition task revealed group differences: In TD children, detection of angry faces required more configural face processing and disrupted the processing of local features. In ASD children, on the other hand, it required more feature-based processing rather than configural processing. Despite the small sample sizes, these findings provide preliminary evidence that children with ASD, in contrast to TD children, show quick detection of angry faces by extracting local features in faces. PMID:24904477

  8. Optimization of Primary Drying in Lyophilization during Early Phase Drug Development using a Definitive Screening Design with Formulation and Process Factors.

    PubMed

    Goldman, Johnathan M; More, Haresh T; Yee, Olga; Borgeson, Elizabeth; Remy, Brenda; Rowe, Jasmine; Sadineni, Vikram

    2018-06-08

    Development of optimal drug product lyophilization cycles is typically accomplished via multiple engineering runs to determine appropriate process parameters. These runs require significant time and product investments, which are especially costly during early phase development, when the drug product formulation and lyophilization process are often defined simultaneously. Even small changes in the formulation may require a new set of engineering runs to define lyophilization process parameters. In order to overcome these development difficulties, an eight-factor definitive screening design (DSD), including both formulation and process parameters, was executed on a fully human monoclonal antibody (mAb) drug product. The DSD enables evaluation of several interdependent factors to define critical parameters that affect primary drying time and product temperature. From these parameters, a lyophilization development model is defined from which near-optimal process parameters can be derived for many different drug product formulations. This concept is demonstrated on a mAb drug product where statistically predicted cycle responses agree well with those measured experimentally. This design of experiments (DoE) approach for early phase lyophilization cycle development offers a workflow that significantly decreases the development time of clinically and potentially commercially viable lyophilization cycles for a platform formulation that still has a variable range of compositions. Copyright © 2018. Published by Elsevier Inc.
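
    A DSD supports a model with an intercept, main effects, and pure quadratic terms. The sketch below fits such a model to coded factor settings with ordinary least squares; the three factors, run table, and drying-time responses are hypothetical stand-ins for the eight-factor study described above.

        import numpy as np

        # Coded (-1, 0, +1) settings for a 3-factor DSD: foldover pairs, each
        # with one factor at its mid level, plus a center run (2m+1 = 7 runs).
        X = np.array([[ 1,  0, -1],
                      [-1,  0,  1],
                      [ 0,  1,  1],
                      [ 0, -1, -1],
                      [ 1,  1,  0],
                      [-1, -1,  0],
                      [ 0,  0,  0]], dtype=float)
        y = np.array([32.0, 51.0, 40.5, 44.0, 30.5, 55.0, 42.0])  # drying time, h

        # Screening model: intercept + main effects + pure quadratics.
        D = np.hstack([np.ones((len(X), 1)), X, X**2])
        coef, *_ = np.linalg.lstsq(D, y, rcond=None)

        def predict(x):
            return np.hstack([1.0, x, x**2]) @ coef

        print(predict(np.array([0.5, -1.0, 0.0])))  # predicted drying time, h

    In a real study the fitted coefficients would identify which process and formulation factors dominate primary drying, and the model would then be searched for near-optimal settings.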

  9. Simplified signal processing for impedance spectroscopy with spectrally sparse sequences

    NASA Astrophysics Data System (ADS)

    Annus, P.; Land, R.; Reidla, M.; Ojarand, J.; Mughal, Y.; Min, M.

    2013-04-01

    The classical method for measuring electrical bio-impedance involves excitation with a sinusoidal waveform. Sinusoidal excitation at fixed frequency points enables a wide variety of signal processing options, the most general of them being the Fourier transform. Multiplication with two quadrature waveforms at the desired frequency can be easily accomplished both in the analogue and in the digital domain; even the simplest quadrature square waves can be considered, which reduces the signal processing task in the analogue domain to synchronous switching followed by a low-pass filter, and in the digital domain requires only additions. So-called spectrally sparse excitation sequences (SSS), which have recently been introduced into the bio-impedance measurement domain, are a very reasonable choice when simultaneous multifrequency excitation is required. They have many good properties, such as ease of generation and a good crest factor compared to similar multisinusoids. To date, the discrete or fast Fourier transform has typically been used in the signal processing step. Simplified methods would nevertheless reduce the computational burden and enable simpler, less costly, and less energy-hungry signal processing platforms. The accuracy of the measurement with SSS excitation when using different waveforms for quadrature demodulation is compared in order to evaluate the feasibility of the simplified signal processing. A sigma-delta modulated sinusoid (a binary signal) is considered a good alternative for synchronous demodulation.
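
    The square-wave variant of the demodulation described above can be sketched in a few lines of NumPy; the sample rate, test frequency, amplitude, and noise level below are arbitrary assumptions. Multiplying by a ±1 reference costs only sign flips, and the final average plays the role of the low-pass filter.

        import numpy as np

        rng = np.random.default_rng(0)
        fs, f0, n = 1.0e6, 10e3, 100_000   # sample rate (Hz), test bin, samples
        t = np.arange(n) / fs
        # Synthetic response: amplitude 0.5, phase 30 degrees at f0, plus noise
        sig = 0.5 * np.cos(2*np.pi*f0*t + np.pi/6) + 0.01*rng.standard_normal(n)

        # Quadrature square-wave references: values are only +/-1, so the
        # multiply reduces to sign flips (additions on a small processor).
        ref_i = np.sign(np.cos(2*np.pi*f0*t))
        ref_q = np.sign(np.sin(2*np.pi*f0*t))

        # Averaging over an integer number of periods acts as the low-pass
        # filter of a synchronous detector.
        i_comp = np.mean(sig * ref_i)
        q_comp = np.mean(sig * ref_q)

        # A square wave's fundamental has amplitude 4/pi, so rescale.
        amp = (np.pi / 2) * np.hypot(i_comp, q_comp)   # ~0.5
        phase = np.arctan2(-q_comp, i_comp)            # ~pi/6 rad

    The pi/2 rescaling accounts for the 4/pi fundamental of the square wave; on a real SSS measurement the odd harmonics of the reference also pick up any signal energy at those frequencies, which is the main accuracy trade-off such a comparison has to evaluate.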

  10. Constraint of the 13C(α,n) Cross Section Toward Astrophysical Energies for the Main s-Process

    NASA Astrophysics Data System (ADS)

    Toomey, Rebecca; Febbraro, Michael T.; Pain, Steven D.; Peters, William A.; Cizewski, Jolie A.; Havener, Charles C.; Bannister, Mark E.; Chipps, Kelly A.; Walter, David G.; Ummel, Chad C.; Sims, Harrison

    2017-09-01

    The slow neutron capture process (s-process) typically occurs in relatively low neutron flux environments, such as AGB stars, and is a key mechanism in heavy-element synthesis. The dominant source of neutrons for the main s-process is the 13C(α,n) reaction, which proceeds at stellar temperatures (~0.1 GK, ~200 keV) via reactions well below the Coulomb barrier. Direct measurement of the reaction rate in the Gamow window (~140-230 keV) is difficult, complicated by the low yields and high beam currents required. Current measurements have constrained the cross section down to approximately 320 keV, still well above stellar conditions, with significant statistical uncertainties. These uncertainties, and the influence of a near-threshold 1/2+ state at 6.4 MeV, mean that extrapolation of the data into the Gamow window is unreliable. These measurements typically use high-efficiency moderated neutron counter detectors, meaning energy information of the incident neutrons is lost. A quasi-spectroscopic approach has been used to measure the 13C(α,n) reaction rate at energies between 300-350 keV with the aim of reducing uncertainties in current measurements. Work supported in part by U.S. D.O.E., the National Science Foundation and the LDRD Program of ORNL, managed by UT-Battelle, LLC.
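
    The quoted window follows from the standard Gamow peak expressions. As a check (this arithmetic is not from the abstract), for 13C + α with Z1 = 6, Z2 = 2, reduced mass μ ≈ 3.06 u, and kT ≈ 8.6 keV at 0.1 GK:

        E_G = 2\mu c^2 (\pi \alpha Z_1 Z_2)^2 \approx 0.43\ \mathrm{GeV},
        \qquad
        E_0 = \left(\frac{E_G\,(kT)^2}{4}\right)^{1/3} \approx 200\ \mathrm{keV},
        \qquad
        \Delta = 4\sqrt{\frac{E_0\,kT}{3}} \approx 95\ \mathrm{keV},

    giving a window of roughly E_0 ± Δ/2 ≈ 150-250 keV, in line with the ~140-230 keV quoted above.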

  11. Incorporating Non-Linear Sorption into High Fidelity Subsurface Reactive Transport Models

    NASA Astrophysics Data System (ADS)

    Matott, L. S.; Rabideau, A. J.; Allen-King, R. M.

    2014-12-01

    A variety of studies, including multiple NRC (National Research Council) reports, have stressed the need for simulation models that can provide realistic predictions of contaminant behavior during the groundwater remediation process, most recently highlighting the specific technical challenges of "back diffusion and desorption in plume models". For a typically sized remediation site, a minimum of about 70 million grid cells is required to resolve, at the desired cm-level thickness, the low-permeability lenses responsible for driving the back-diffusion phenomena. Such discretization is nearly three orders of magnitude finer than is typically seen in modeling practice using public domain codes like RT3D (Reactive Transport in Three Dimensions). Consequently, various extensions have been made to the RT3D code to support efficient modeling of recently proposed dual-mode non-linear sorption processes (e.g. Polanyi with linear partitioning) at high-fidelity scales of grid resolution. These extensions have facilitated development of exploratory models in which contaminants are introduced into an aquifer via an extended multi-decade "release period" and allowed to migrate under natural conditions for centuries. These realistic simulations of contaminant loading and migration provide a high-fidelity representation of the underlying diffusion and sorption processes that control remediation. Coupling such models with decision support processes is expected to facilitate improved long-term management of complex remediation sites that have proven intractable to conventional remediation strategies.
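
    A dual-mode isotherm of the kind mentioned above combines linear partitioning with a Polanyi-type adsorption term. The sketch below uses one common Polanyi-Dubinin form; all parameter values (and the solubility Cs) are hypothetical illustrations, not those used in the RT3D extensions.

        import numpy as np

        R = 8.314  # J/(mol*K)

        def dual_mode_sorbed(C, T=283.0, Kd=0.5, Q0=100.0, a=-0.01, b=1.5,
                             Cs=1100.0):
            # Sorbed concentration q (mg/kg) for aqueous concentration
            # 0 < C < Cs (mg/L): linear partitioning plus a Polanyi-Dubinin
            # adsorption term driven by the effective sorption potential
            # eps = R*T*ln(Cs/C), in kJ/mol.
            eps = R * T * np.log(Cs / C) / 1000.0
            q_ads = Q0 * 10.0 ** (a * eps**b)
            return Kd * C + q_ads

        print(dual_mode_sorbed(np.array([0.01, 0.1, 1.0, 10.0])))

    The nonlinearity matters because the effective retardation then depends on concentration, which is precisely what complicates desorption-limited plume tailing.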

  12. Detection of Subsurface Defects in Levees in Correlation to Weather Conditions Utilizing Ground Penetrating Radar

    NASA Astrophysics Data System (ADS)

    Martinez, I. A.; Eisenmann, D.

    2012-12-01

    Ground Penetrating Radar (GPR) has been used for many years in successful subsurface detection of conductive and non-conductive objects in all types of material, including different soils and concrete. Typical defect detection is based on subjective examination of processed scans, using data collection and analysis software to acquire and analyze the data, and often requires developed expertise or an awareness of how a GPR works while collecting data. Processing programs, such as GSSI's RADAN analysis software, are then used to validate the collected information. Iowa State University's Center for Nondestructive Evaluation (CNDE) has built a test site, resembling a typical levee used near rivers, which contains known sub-surface targets of varying size, depth, and conductivity. Scientists at CNDE have developed software with enhanced capabilities to decipher a hyperbola's magnitude and amplitude for GPR signal processing. With this capability, the signal processing and defect detection capabilities of GPR can be greatly enhanced. This study will examine the effects of test parameters, antenna frequency (400 MHz), data manipulation methods (which include data filters and restricting the range of depth in which the chosen antenna's signal can reach), and real-world conditions at this test site (such as varying weather conditions), with the goal of improving GPR test sensitivity for differing soil conditions.
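
    The diffraction hyperbola that such software exploits has a simple closed form: a point target at depth d in a medium with propagation velocity v produces a two-way travel time t(x) = (2/v)·sqrt(d² + x²) at antenna offset x from the apex, so t² is linear in x². A minimal fitting sketch (with hypothetical picks, not CNDE's algorithm):

        import numpy as np

        def fit_hyperbola(x, t):
            # x: antenna offsets from the apex (m); t: two-way times (ns).
            # Fit t^2 = (4/v^2)*x^2 + 4*d^2/v^2 by linear least squares.
            m, c = np.polyfit(x**2, t**2, 1)
            v = 2.0 / np.sqrt(m)        # velocity, m/ns
            d = np.sqrt(c / m)          # target depth, m
            return v, d

        # Hypothetical picks: v = 0.1 m/ns, target 0.5 m deep
        x = np.linspace(-1.0, 1.0, 21)
        t = (2.0 / 0.1) * np.sqrt(0.25 + x**2)
        print(fit_hyperbola(x, t))      # ~(0.1, 0.5)

    The recovered velocity also calibrates depth estimates for the soil moisture at hand, which is why weather conditions matter for sensitivity.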

  13. A Cognitive Systems Engineering Approach to Developing HMI Requirements for New Technologies

    NASA Technical Reports Server (NTRS)

    Fern, Lisa Carolynn

    2016-01-01

    This document examines the challenges inherent in designing and regulating to support human-automation interaction for new technologies that will be deployed into complex systems. A key question for new technologies is how work will be accomplished by the human and machine agents. This question has traditionally been framed as how functions should be allocated between humans and machines. Such framing misses the coordination and synchronization that is needed for the different human and machine roles in the system to accomplish their goals. Coordination and synchronization demands are driven by the underlying human-automation architecture of the new technology, which is typically not specified explicitly by the designers. The human-machine interface (HMI), which is intended to facilitate human-machine interaction and cooperation, however, typically is defined explicitly and therefore serves as a proxy for human-automation cooperation requirements with respect to technical standards for technologies. Unfortunately, mismatches between the HMI and the coordination and synchronization demands of the underlying human-automation architecture can lead to system breakdowns. A methodology is needed that both designers and regulators can utilize to evaluate the expected performance of a new technology given potential human-automation architectures. Three experiments were conducted to inform the minimum HMI requirements for a detect and avoid system for unmanned aircraft systems (UAS). The results of the experiments provided empirical input to specific minimum operational performance standards that UAS manufacturers will have to meet in order to operate UAS in the National Airspace System (NAS). These studies represent a success story for how to objectively and systematically evaluate prototype technologies as part of the process for developing regulatory requirements. They also provide an opportunity to reflect on the lessons learned from a recent research effort in order to improve the methodology for defining technology requirements for regulators in the future. The biggest shortcoming of the presented research program was the absence of the explicit definition, generation, and analysis of potential human-automation architectures. Failure to execute this step in the research process resulted in less efficient evaluation of the candidate prototype technologies, in addition to the complete absence of different approaches to human-automation cooperation. For example, all of the prototype technologies that were evaluated in the research program assumed a human-automation architecture that relied on serial processing from the automation to the human. While this type of human-automation architecture is typical across many different technologies and in many different domains, it ignores different architectures where humans and automation work in parallel. Defining potential human-automation architectures a priori also allows regulators to develop scenarios that will stress the performance boundaries of the technology during the evaluation phase. The importance of adding this step of generating and evaluating candidate human-automation architectures prior to formal empirical evaluation is discussed.

  14. Transitioning from conceptual design to construction performance specification

    NASA Astrophysics Data System (ADS)

    Jeffers, Paul; Warner, Mark; Craig, Simon; Hubbard, Robert; Marshall, Heather

    2012-09-01

    On successful completion of a conceptual design review by a funding agency or customer, there is a transition phase before construction contracts can be placed. The nature of this transition phase depends on the project's approach to construction and the particular subsystem being considered. There are generically two approaches: project retention of design authority and issuance of build-to-print contracts, or issuance of subsystem performance specifications with controlled interfaces. This paper relates to the latter, where a proof of concept (conceptual or reference design) is translated into performance-based sub-system specifications for competitive tender. This translation is not a straightforward process, and there are a number of different issues to consider. This paper deals primarily with the telescope mount and enclosure subsystems. The main subjects considered in this paper are: • Typical status of design at Conceptual Design Review compared with the desired status of Specifications and Interface Control Documents at Request for Quotation. • Options for capture and tracking of system requirements flow-down from science/operating requirements and sub-system requirements, and functional requirements derived from the reference design. • Requirements that may come specifically from the contracting approach. • Methods for effective use of reference design work without compromising a performance-based specification. • Management of the project team's expectations relating to design. • Effects on cost estimates from reference design to actual. This paper is based on experience and lessons learned through this process on both the VISTA and the ATST projects.

  15. Procurement engineering - the productivity factor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bargerstock, S.B.

    1993-01-01

    The industry is several years on the road to implementation of the Nuclear Management and Resources Council (NUMARC) initiatives on commercial-grade item dedication and procurement. Utilities have taken several approaches to involve engineering in the procurement process. A common result for the approaches is the additional operations and maintenance (O&M) cost imposed by the added resource requirements. Procurement engineering productivity is a key element in controlling this business area. Experience shows that 400 to 500% improvements in productivity are possible within a 2-yr period. Improving the productivity of the procurement engineering function is important in today's competitive utility environment. Procurement engineering typically involves four distinct technical evaluation responsibilities along with several administrative areas. Technical evaluations include the functionally based safety classification of replacement components and parts (lacking a master parts list), the determination of dedication requirements for safety-related commercial-grade items, the preparation of a procurement specification to maintain the licensed design bases, and the equivalency evaluation of alternate items not requiring the design-change process. Administrative duties include obtaining technical review of vendor-supplied documentation, identifying obsolete parts and components, resolving material nonconformances, initiating the design-change process for replacement items (as needed), and providing technical support to O&M. Although most utilities may not perform or require all the noted activities, a large percentage will apply to each utility station.

  16. Intra-individual cognitive imbalance in ASD between perceptual reasoning and ambiguity-solving related to tool use: Comparison among children exhibiting ASD, AD/HD, and typical development.

    PubMed

    Wakusawa, Keisuke; Nara, Chieko; Kubota, Yuki; Tomizawa, Yayoi; Taki, Yasuyuki; Sassa, Yuko; Kobayashi, Satoru; Suzuki-Muromoto, Sato; Hirose, Mieko; Yokoyama, Hiroyuki; Nara, Takahiro; Kure, Shigeo; Mori, Norio; Takei, Noriyoshi; Kawashima, Ryuta

    2018-01-01

    Several studies have suggested that objective deficits in the processing of abstract information in conjunction with an enhanced ability to process concrete information is a definitive characteristic of autism spectrum disorder (ASD). However, this cognitive imbalance is not necessarily clear in high-functioning autistic individuals who do not display absolute differences relative to typically developing (TD) populations. Thus, the purpose of this study was to identify this cognitive tendency in high-functioning autistic individuals using intra-individual cognitive comparisons. The reaction times (RTs) of TD children, children with ASD, and children with attention deficit hyperactivity disorder (AD/HD) (n = 17 in each group, mean age = 11.9 years, age range = 9.8-15.8 years) were compared using the Which/How-to-Apply Tools (W/HAT) test, which consists of tasks requiring the adaptive use of novel tools and familiar tools in atypical and typical situations. Differences in RTs between the atypical and typical trials ([A-T]) were used to assess intra-individual cognitive imbalances. As predicted, the [A-T] scores of the ASD group were significantly higher than those of the TD group even though the RTs in the atypical and typical trials did not differ. Additionally, the [A-T] values were significantly higher in the ASD group than in the AD/HD group, which indicates that the cognitive imbalance was specific to ASD individuals. No significant interaction was detected between the trial and subject group. The finding of a cognitive imbalance in ASD individuals may enhance the current understanding of the pathophysiology of this disorder, which is found in a range of individuals, from those with obvious cortical dysfunction to those with only intra-individual imbalances. Copyright © 2017 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.

  17. A method for reduction of Acoustic Emission (AE) data with application in machine failure detection and diagnosis

    NASA Astrophysics Data System (ADS)

    Vicuña, Cristián Molina; Höweler, Christoph

    2017-12-01

    The use of AE in machine failure diagnosis has increased over the last years. Most AE-based failure diagnosis strategies use digital signal processing and thus require the sampling of AE signals. High sampling rates are required for this purpose (e.g. 2 MHz or higher), leading to streams of large amounts of data. This situation is aggravated if fine resolution and/or multiple sensors are required. These facts combine to produce bulky data, typically in the range of GBytes, for which sufficient storage space and efficient signal processing algorithms are required. This situation probably explains why, in practice, AE-based methods consist mostly of the calculation of scalar quantities such as RMS and kurtosis, and the analysis of their evolution in time. While the scalar-based approach offers the advantage of maximum data reduction, it has the disadvantage that most of the information contained in the raw AE signal is lost unrecoverably. This work presents a method offering large data reduction while keeping the most important information conveyed by the raw AE signal, useful for failure detection and diagnosis. The proposed method consists of the construction of a synthetic, unevenly sampled signal which envelops the AE bursts present in the raw AE signal in a triangular shape. The constructed signal, which we call TriSignal, also permits the estimation of most scalar quantities typically used for failure detection. More importantly, it contains the information on the time of occurrence of the bursts, which is key for failure diagnosis. The Lomb-Scargle normalized periodogram is used to construct the TriSignal spectrum, which reveals the frequency content of the TriSignal and provides the same information as the classic AE envelope. The paper includes application examples in planetary gearbox and low-speed rolling element bearing.
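
    Because a TriSignal-like envelope is unevenly sampled, an even-grid FFT does not apply directly; the Lomb-Scargle periodogram handles arbitrary sample times. A minimal sketch with synthetic burst data (the 13 Hz fault rate, amplitudes, and sample times are hypothetical):

        import numpy as np
        from scipy.signal import lombscargle

        # Hypothetical envelope samples: one point per AE burst, unevenly
        # spaced in time, with burst strength modulated at a 13 Hz fault rate.
        rng = np.random.default_rng(0)
        t = np.sort(rng.uniform(0.0, 10.0, 400))   # uneven sample times (s)
        y = 1.0 + np.cos(2*np.pi*13.0*t) + 0.2*rng.standard_normal(t.size)

        freqs = np.linspace(0.5, 50.0, 2000)       # Hz
        # lombscargle expects angular frequencies; remove the mean first.
        pgram = lombscargle(t, y - y.mean(), 2*np.pi*freqs, normalize=True)
        print(freqs[np.argmax(pgram)])             # ~13 Hz

    The peak at the repetition rate of the bursts is what carries the diagnostic information, playing the role of the classic AE envelope spectrum.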

  18. Multi-Product Microalgae Biorefineries: From Concept Towards Reality.

    PubMed

    't Lam, G P; Vermuë, M H; Eppink, M H M; Wijffels, R H; van den Berg, C

    2018-02-01

    Although microalgae are a promising biobased feedstock, industrial scale production is still far off. To enhance the economic viability of large-scale microalgae processes, all biomass components need to be valorized, requiring a multi-product biorefinery. However, this concept is still too expensive. Typically, downstream processing of industrial biotechnological bulk products accounts for 20-40% of the total production costs, while for a microalgae multi-product biorefinery the costs are substantially higher (50-60%). These costs are high due to the lack of appropriate and mild technologies to access the different product fractions such as proteins, carbohydrates, and lipids. To reduce the costs, simplified processes need to be developed for the main unit operations including harvesting, cell disruption, extraction, and possibly fractionation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Qualification of a rapid readout biological indicator with moist heat sterilization.

    PubMed

    McCormick, Patrick; Finocchario, Catherine; Manchester, Robert; Glasgow, Louis; Costanzo, Stephen

    2003-01-01

    Biological indicators are recognized as an important component in the validation and routine monitoring of moist heat (steam) sterilization processes. Due to the need to allow for the recovery and outgrowth of test organisms that may have been sub-lethally injured, between 2 and 5 days of incubation are typically required before the outcome of sterilization processing can be reliably interpreted. Rapid readout biological indicators that incorporate the response of a heat-resistant enzyme provide a means for assessing the efficacy of moist heat sterilization within hours of processing. This study describes the qualification of the 3M Attest 1292 Rapid Readout Biological Indicator with moist heat sterilization according to the procedures described in PDA Technical Report No. 33, "Evaluation, Validation and Implementation of New Microbiological Testing Methods".

  20. Methodology for stereoscopic motion-picture quality assessment

    NASA Astrophysics Data System (ADS)

    Voronov, Alexander; Vatolin, Dmitriy; Sumin, Denis; Napadovsky, Vyacheslav; Borisov, Alexey

    2013-03-01

    Creating and processing stereoscopic video imposes additional quality requirements related to view synchronization. In this work we propose a set of algorithms for detecting typical stereoscopic-video problems, which appear owing to imprecise setup of capture equipment or incorrect postprocessing. We developed a methodology for analyzing the quality of S3D motion pictures and for revealing their most problematic scenes. We then processed 10 modern stereo films, including Avatar, Resident Evil: Afterlife and Hugo, and analyzed changes in S3D-film quality over the years. This work presents real examples of common artifacts (color and sharpness mismatch, vertical disparity and excessive horizontal disparity) in the motion pictures we processed, as well as possible solutions for each problem. Our results enable improved quality assessment during the filming and postproduction stages.
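
    Of the artifacts listed above, vertical disparity is the most mechanical to detect: corresponding points in a properly aligned stereo pair should differ only horizontally. A plausible sketch using OpenCV feature matching (not the authors' algorithm; the feature counts and match limit are assumptions):

        import cv2
        import numpy as np

        def vertical_disparity_px(left_gray, right_gray, max_matches=200):
            # Match ORB keypoints between the views and report the median
            # row difference; a value far from zero flags a misaligned rig
            # or faulty postprocessing.
            orb = cv2.ORB_create(nfeatures=1000)
            kp_l, des_l = orb.detectAndCompute(left_gray, None)
            kp_r, des_r = orb.detectAndCompute(right_gray, None)
            matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
            matches = sorted(matcher.match(des_l, des_r),
                             key=lambda m: m.distance)
            dy = [kp_l[m.queryIdx].pt[1] - kp_r[m.trainIdx].pt[1]
                  for m in matches[:max_matches]]
            return float(np.median(dy))

    Running such a measure per scene is one way to locate the "most problematic scenes" of a film; color and sharpness mismatch require analogous per-scene statistics over matched regions.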

  1. Natural Language Processing Methods and Systems for Biomedical Ontology Learning

    PubMed Central

    Liu, Kaihong; Hogan, William R.; Crowley, Rebecca S.

    2010-01-01

    While the biomedical informatics community widely acknowledges the utility of domain ontologies, there remain many barriers to their effective use. One important requirement of domain ontologies is that they must achieve a high degree of coverage of the domain concepts and concept relationships. However, the development of these ontologies is typically a manual, time-consuming, and often error-prone process. Limited resources result in missing concepts and relationships as well as difficulty in updating the ontology as knowledge changes. Methodologies developed in the fields of natural language processing, information extraction, information retrieval and machine learning provide techniques for automating the enrichment of an ontology from free-text documents. In this article, we review existing methodologies and developed systems, and discuss how existing methods can benefit the development of biomedical ontologies. PMID:20647054

  2. Saving Material with Systematic Process Designs

    NASA Astrophysics Data System (ADS)

    Kerausch, M.

    2011-08-01

    Global competition is forcing the stamping industry to further increase quality, to shorten time-to-market, and to reduce total cost. Continuous balancing between these classical time-cost-quality targets throughout the product development cycle is required to ensure future economic success. In today's industrial practice, die layout standards are typically assumed to implicitly ensure the balancing of company-specific time-cost-quality targets. Although die layout standards are a very successful approach, they have two methodical disadvantages. First, the capabilities for tool design have to be continuously adapted to technological innovations, e.g. to take advantage of the full forming capability of new materials. Secondly, the great variety of die design aspects has to be reduced to a generic rule or guideline, e.g. binder shape, draw-in conditions, or the use of drawbeads. Therefore, it is important not to overlook cost or quality opportunities when applying die design standards. This paper describes a systematic workflow with a focus on minimizing material consumption. The starting point of the investigation is a full process plan for a typical structural part, defined so that a predefined set of die design standards with industrial relevance is fulfilled. In a first step, binder and addendum geometry is systematically checked for material saving potentials. In a second step, blank shape and draw-in are adjusted to meet thinning, wrinkling, and springback targets for a minimum blank solution. Finally, the identified die layout is validated with respect to production robustness versus splits, wrinkles, and springback. For all three steps the applied methodology is based on finite element simulation combined with stochastic variation of input variables. With the proposed workflow, a well-balanced (time-cost-quality) production process assuring minimal material consumption can be achieved.

  3. Models of unit operations used for solid-waste processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savage, G.M.; Glaub, J.C.; Diaz, L.F.

    1984-09-01

    This report documents the unit operations models that have been developed for typical refuse-derived-fuel (RDF) processing systems. These models, which represent the mass balances, energy requirements, and economics of the unit operations, are derived, where possible, from basic principles. Empiricism has been invoked where a governing theory has yet to be developed. Field test data and manufacturers' information, where available, supplement the analytical development of the models. A literature review has also been included for the purpose of compiling and discussing in one document the available information pertaining to the modeling of front-end unit operations. Separate analyses have been done for each task.

  4. Dosimetry procedures for an industrial irradiation plant

    NASA Astrophysics Data System (ADS)

    Grahn, Ch.

    Accurate and reliable dosimetry procedures constitute a very important part of process control and quality assurance at a radiation processing plant. γ-Dose measurements were made on the GBS 84 irradiator for food and other products on pallets or in containers. Chemical dosimeters were exposed in the facility under conditions of typical plant operation. The choice of the dosimeter systems employed was based on the experience in chemical dosimetry gained over several years. Dose uniformity information was obtained in air, spices, bulbs, feeds, cosmetics, plastics, and surgical goods. Most products currently irradiated require a dose uniformity which can be efficiently provided by pallet or box irradiators like the GBS 84. The radiation performance characteristics and some dosimetry procedures are discussed.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, D.J.; Warner, J.A.; LeBarron, N.

    Processes that use energetic ions for large substrates require that the time-averaged erosion effects from the ion flux be uniform across the surface. A numerical model has been developed to determine this flux and its effects on surface etching of a silica/photoresist combination. The geometry of the source and substrate is very similar to a typical deposition geometry with single or planetary substrate rotation. The model was used to tune an inert ion-etching process that used single or multiple Kaufman sources to less than 3% uniformity over a 30-cm aperture after etching 8 µm of material. The same model can be used to predict uniformity for ion-assisted deposition (IAD).

  6. Luminance- and Texture-Defined Information Processing in School-Aged Children with Autism

    PubMed Central

    Rivest, Jessica B.; Jemel, Boutheina; Bertone, Armando; McKerral, Michelle; Mottron, Laurent

    2013-01-01

    According to the complexity-specific hypothesis, the efficacy with which individuals with autism spectrum disorder (ASD) process visual information varies according to the extensiveness of the neural network required to process stimuli. Specifically, adults with ASD are less sensitive to texture-defined (or second-order) information, which necessitates the implication of several cortical visual areas. Conversely, the sensitivity to simple, luminance-defined (or first-order) information, which mainly relies on primary visual cortex (V1) activity, has been found to be either superior (static material) or intact (dynamic material) in ASD. It is currently unknown if these autistic perceptual alterations are present in childhood. In the present study, behavioural (threshold) and electrophysiological measures were obtained for static luminance- and texture-defined gratings presented to school-aged children with ASD and compared to those of typically developing children. Our behavioural and electrophysiological (P140) results indicate that luminance processing is likely unremarkable in autistic children. With respect to texture processing, there was no significant threshold difference between groups. However, unlike typical children, autistic children did not show reliable enhancements of brain activity (N230 and P340) in response to texture-defined gratings relative to luminance-defined gratings. This suggests reduced efficiency of neuro-integrative mechanisms operating at a perceptual level in autism. These results are in line with the idea that visual atypicalities mediated by intermediate-scale neural networks emerge before or during the school-age period in autism. PMID:24205355

  7. Luminance- and texture-defined information processing in school-aged children with autism.

    PubMed

    Rivest, Jessica B; Jemel, Boutheina; Bertone, Armando; McKerral, Michelle; Mottron, Laurent

    2013-01-01

    According to the complexity-specific hypothesis, the efficacy with which individuals with autism spectrum disorder (ASD) process visual information varies according to the extensiveness of the neural network required to process stimuli. Specifically, adults with ASD are less sensitive to texture-defined (or second-order) information, which necessitates the implication of several cortical visual areas. Conversely, the sensitivity to simple, luminance-defined (or first-order) information, which mainly relies on primary visual cortex (V1) activity, has been found to be either superior (static material) or intact (dynamic material) in ASD. It is currently unknown if these autistic perceptual alterations are present in childhood. In the present study, behavioural (threshold) and electrophysiological measures were obtained for static luminance- and texture-defined gratings presented to school-aged children with ASD and compared to those of typically developing children. Our behavioural and electrophysiological (P140) results indicate that luminance processing is likely unremarkable in autistic children. With respect to texture processing, there was no significant threshold difference between groups. However, unlike typical children, autistic children did not show reliable enhancements of brain activity (N230 and P340) in response to texture-defined gratings relative to luminance-defined gratings. This suggests reduced efficiency of neuro-integrative mechanisms operating at a perceptual level in autism. These results are in line with the idea that visual atypicalities mediated by intermediate-scale neural networks emerge before or during the school-age period in autism.

  8. The role of temporo-parietal junction (TPJ) in global Gestalt perception.

    PubMed

    Huberle, Elisabeth; Karnath, Hans-Otto

    2012-07-01

    Grouping processes enable the coherent perception of our environment. A number of brain areas have been suggested to be involved in the integration of elements into objects, including early and higher visual areas along the ventral visual pathway as well as motion-processing areas of the dorsal visual pathway. However, integration is not only required for the cortical representation of individual objects, but is also essential for the perception of more complex visual scenes consisting of several different objects and/or shapes. The present fMRI experiments aimed to address such integration processes. We investigated the neural correlates underlying the global Gestalt perception of hierarchically organized stimuli that allowed parametric degrading of the object at the global level. The comparison of intact versus disturbed perception of the global Gestalt revealed a network of cortical areas including the temporo-parietal junction (TPJ), anterior cingulate cortex, and the precuneus. The TPJ location corresponds well with the areas known to be typically lesioned in stroke patients with simultanagnosia following bilateral brain damage. These patients typically show a deficit in identifying the global Gestalt of a visual scene. Further, we found the closest relation between behavioral performance and fMRI activation for the TPJ. Our data thus argue for a significant role of the TPJ in human global Gestalt perception.

  9. Micro Machining of Injection Mold Inserts for Fluidic Channel of Polymeric Biochips

    PubMed Central

    Jung, Woo-Chul; Heo, Young-Moo; Yoon, Gil-Sang; Shin, Kwang-Ho; Chang, Sung-Ho; Kim, Gun-Hee; Cho, Myeong-Woo

    2007-01-01

    Recently, the polymeric micro-fluidic biochip, often called an LOC (lab-on-a-chip), has attracted attention as a cheap, rapid, and simplified way to replace existing biochemical laboratory work. With the development of MEMS technologies, it has become possible to form miniaturized lab functionalities on a chip. The micro-fluidic chips contain many micro-channels for the flow of sample and reagents, mixing, and detection tasks. Typical substrate materials for the chip are glass and polymers. Typical techniques for microfluidic chip fabrication utilize various micro-pattern forming methods, such as wet etching, micro-contact printing, hot embossing, micro injection molding, LIGA, and micro powder blasting processes. In this study, to establish the basis for micro-pattern fabrication and mass production of polymeric micro-fluidic chips using the injection molding process, a micro-machining method was applied to form micro-channels on the LOC molds. A series of machining experiments using micro end-mills was performed to determine optimum machining conditions to improve the surface roughness and shape accuracy of the designed, simplified micro-channels. The obtained conditions were used to machine the required mold inserts for micro-channels using micro end-mills. Test injection processes using the machined molds and COC polymer were performed, and the results were investigated.

  10. Fast mapping semantic features: performance of adults with normal language, history of disorders of spoken and written language, and attention deficit hyperactivity disorder on a word-learning task.

    PubMed

    Alt, Mary; Gutmann, Michelle L

    2009-01-01

    This study was designed to test the word learning abilities of adults with typical language abilities, those with a history of disorders of spoken or written language (hDSWL), and hDSWL plus attention deficit hyperactivity disorder (+ADHD). Sixty-eight adults were required to associate a novel object with a novel label, and then recognize semantic features of the object and phonological features of the label. Participants were tested for overt ability (accuracy) and covert processing (reaction time). The +ADHD group was less accurate at mapping semantic features and slower to respond to lexical labels than both other groups. Different factors correlated with word learning performance for each group. Adults with language and attention deficits are more impaired at word learning than adults with language deficits only. Despite behavioral profiles like typical peers, adults with hDSWL may use different processing strategies than their peers. Readers will be able to: (1) recognize the influence of a dual disability (hDSWL and ADHD) on word learning outcomes; (2) identify factors that may contribute to word learning in adults in terms of (a) the nature of the words to be learned and (b) the language processing of the learner.

  11. Enabling Low-Power, Multi-Modal Neural Interfaces Through a Common, Low-Bandwidth Feature Space.

    PubMed

    Irwin, Zachary T; Thompson, David E; Schroeder, Karen E; Tat, Derek M; Hassani, Ali; Bullard, Autumn J; Woo, Shoshana L; Urbanchek, Melanie G; Sachs, Adam J; Cederna, Paul S; Stacey, William C; Patil, Parag G; Chestek, Cynthia A

    2016-05-01

    Brain-Machine Interfaces (BMIs) have shown great potential for generating prosthetic control signals. Translating BMIs into the clinic requires fully implantable, wireless systems; however, current solutions have high power requirements which limit their usability. Lowering this power consumption typically limits the system to a single neural modality, or signal type, and thus to a relatively small clinical market. Here, we address both of these issues by investigating the use of signal power in a single narrow frequency band as a decoding feature for extracting information from electrocorticographic (ECoG), electromyographic (EMG), and intracortical neural data. We have designed and tested the Multi-modal Implantable Neural Interface (MINI), a wireless recording system which extracts and transmits signal power in a single, configurable frequency band. In prerecorded datasets, we used the MINI to explore low frequency signal features and any resulting tradeoff between power savings and decoding performance losses. When processing intracortical data, the MINI achieved a power consumption 89.7% less than a more typical system designed to extract action potential waveforms. When processing ECoG and EMG data, the MINI achieved similar power reductions of 62.7% and 78.8%. At the same time, using the single signal feature extracted by the MINI, we were able to decode all three modalities with less than a 9% drop in accuracy relative to using high-bandwidth, modality-specific signal features. We believe this system architecture can be used to produce a viable, cost-effective, clinical BMI.
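
    A minimal sketch of the core idea, extracting mean signal power in one configurable band, is shown below. The band edges, sample rate, and data are hypothetical, and an implanted device like the one described would use analog or fixed-point filters rather than offline scipy processing.

        import numpy as np
        from scipy.signal import butter, filtfilt

        def narrowband_power(x, fs, f_lo, f_hi, order=4):
            # Mean power of x in [f_lo, f_hi] Hz: the single scalar feature
            # per channel that a narrowband recorder would transmit instead
            # of full-bandwidth waveforms.
            b, a = butter(order, [f_lo, f_hi], btype="bandpass", fs=fs)
            y = filtfilt(b, a, x)
            return np.mean(y**2)

        # e.g., a gamma-band feature for a 30 s, 1 kHz recording (synthetic)
        fs = 1000.0
        x = np.random.randn(int(30 * fs))
        print(narrowband_power(x, fs, 70.0, 110.0))

    Transmitting one such number per decoding interval, rather than raw samples, is what drives the large power savings reported above.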

  12. Lasers for industrial production processing: tailored tools with increasing flexibility

    NASA Astrophysics Data System (ADS)

    Rath, Wolfram

    2012-03-01

    High-power fiber lasers are the newest generation of diode-pumped solid-state lasers. Due to their all-fiber design they are compact, efficient, and robust. Rofin's fiber lasers are available with the highest beam qualities, and the use of different process fiber core sizes additionally enables the user to adapt the beam quality, focus size, and Rayleigh length to the requirements for best processing results. Multi-mode fibers from 50 μm to 600 μm with corresponding beam qualities of 2.5 mm·mrad to 25 mm·mrad are typically used. The integrated beam switching modules can make the laser power available to four different manufacturing systems or can share the power between two processing heads for parallel processing. CO2 slab lasers likewise combine high power with either "single-mode" beam quality or higher-order modes. The well-established technique is in use for a large number of industrial applications, processing either metals or non-metallic materials. For many of these applications CO2 lasers remain the best choice of laser source, driven either by the specific requirements of the application or by its cost structure. The actual technical properties of these lasers are presented, including an overview of the wavelength-driven differences in application results and examples of current industrial practice such as cutting, welding, and surface processing, including the flexible use of scanners and classical optics processing heads.

  13. Basic Auditory Processing Skills and Phonological Awareness in Low-IQ Readers and Typically Developing Controls

    ERIC Educational Resources Information Center

    Kuppen, Sarah; Huss, Martina; Fosker, Tim; Fegan, Natasha; Goswami, Usha

    2011-01-01

    We explore the relationships between basic auditory processing, phonological awareness, vocabulary, and word reading in a sample of 95 children, 55 typically developing children, and 40 children with low IQ. All children received nonspeech auditory processing tasks, phonological processing and literacy measures, and a receptive vocabulary task.…

  14. Activation of sputter-processed indium–gallium–zinc oxide films by simultaneous ultraviolet and thermal treatments

    PubMed Central

    Tak, Young Jun; Du Ahn, Byung; Park, Sung Pyo; Kim, Si Joon; Song, Ae Ran; Chung, Kwun-Bum; Kim, Hyun Jae

    2016-01-01

    Indium–gallium–zinc oxide (IGZO) films, deposited by sputtering at room temperature, still require activation to achieve satisfactory semiconductor characteristics. Thermal treatment is typically carried out at temperatures above 300 °C. Here, we propose activating sputter-processed IGZO films using simultaneous ultraviolet and thermal (SUT) treatments to decrease the required temperature and enhance their electrical characteristics and stability. SUT treatment effectively decreased the amount of carbon residues and the number of defect sites related to oxygen vacancies and increased the number of metal oxide (M–O) bonds through the decomposition-rearrangement of M–O bonds and oxygen radicals. Activation of IGZO TFTs using the SUT treatment reduced the processing temperature to 150 °C and improved various electrical performance metrics, including mobility, on-off ratio, and threshold voltage shift (positive bias stress for 10,000 s), from 3.23 to 15.81 cm²/V·s, 3.96 × 10⁷ to 1.03 × 10⁸, and 11.2 to 7.2 V, respectively. PMID:26902863

  15. Efficient SRAM yield optimization with mixture surrogate modeling

    NASA Astrophysics Data System (ADS)

    Zhongjian, Jiang; Zuochang, Ye; Yan, Wang

    2016-12-01

    Largely repeated cells such as SRAM cells usually require extremely low failure rates to ensure a moderate chip yield. Though fast Monte Carlo methods such as importance sampling and its variants can be used for yield estimation, they are still very expensive if one needs to perform optimization based on such estimations. The yield calculation typically requires a large number of SPICE simulations, and this circuit simulation accounts for the largest proportion of the yield calculation time. In this paper, a new method is proposed to address this issue. The key idea is to establish an efficient mixture surrogate model based on the design variables and process variables. The model is constructed by running SPICE simulations to obtain a certain number of sample points, on which the mixture surrogate model is trained with the lasso algorithm. Experimental results show that the proposed model is able to calculate accurate yield successfully and brings significant speed-ups to the calculation of the failure rate. Based on the model, we developed a further accelerated algorithm to enhance the speed of the yield calculation. It is suitable for high-dimensional process variables and multi-performance applications.
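
    The surrogate idea can be illustrated with a sparse polynomial model selected by the lasso: fit once on a modest number of expensive simulations, then run Monte Carlo on the cheap model. This is a sketch of the general pattern, not the paper's mixture model; the analytic stand-in for SPICE and all dimensions are assumptions.

        import numpy as np
        from sklearn.linear_model import Lasso
        from sklearn.preprocessing import PolynomialFeatures

        rng = np.random.default_rng(1)

        # Stand-in for SPICE: read margin (a.u.) vs. process variables. In
        # practice each row of X is a sampled corner and each y one SPICE run.
        def spice_read_margin(X):
            return 0.3 - 0.8*X[:, 0] + 0.5*X[:, 1]*X[:, 2] - 0.4*X[:, 3]**2

        X_train = rng.standard_normal((500, 6))        # 6 process variables
        y_train = spice_read_margin(X_train) + 0.01*rng.standard_normal(500)

        # Sparse degree-2 polynomial surrogate selected by the lasso.
        poly = PolynomialFeatures(degree=2, include_bias=False)
        model = Lasso(alpha=1e-3).fit(poly.fit_transform(X_train), y_train)

        # Cheap Monte Carlo on the surrogate instead of more SPICE runs.
        X_mc = rng.standard_normal((200_000, 6))
        fail = model.predict(poly.transform(X_mc)) < 0.0   # margin < 0 fails
        print("estimated failure rate:", fail.mean())

    The lasso's coefficient shrinkage prunes irrelevant polynomial terms, which is what keeps the approach workable as the number of process variables grows.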

  16. Computational efficiency improvements for image colorization

    NASA Astrophysics Data System (ADS)

    Yu, Chao; Sharma, Gaurav; Aly, Hussein

    2013-03-01

    We propose an efficient algorithm for colorization of greyscale images. As in prior work, colorization is posed as an optimization problem: a user specifies the color for a few scribbles drawn on the greyscale image, and the color image is obtained by propagating color information from the scribbles to surrounding regions while maximizing the local smoothness of colors. In this formulation, colorization is obtained by solving a large sparse linear system, which normally requires substantial computation and memory resources. Our algorithm improves the computational performance through three innovations over prior colorization implementations. First, the linear system is solved iteratively without explicitly constructing the sparse matrix, which significantly reduces the required memory. Second, we formulate each iteration in terms of integral images obtained by dynamic programming, reducing repetitive computation. Third, we use a coarse-to-fine framework, where a lower-resolution subsampled image is first colorized and this low-resolution color image is upsampled to initialize the colorization process at the fine level. The improvements we develop provide significant speedup and memory savings compared to the conventional approach of solving the linear system directly using off-the-shelf sparse solvers, and allow us to colorize images with typical sizes encountered in realistic applications on typical commodity computing platforms.
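
    The first innovation, solving the system without assembling the sparse matrix, can be illustrated with a Jacobi-style fixed-point iteration: each non-scribble pixel is repeatedly replaced by the similarity-weighted mean of its neighbours. This sketch assumes 4-neighbour weights and wrap-around borders and omits the integral-image and coarse-to-fine accelerations, so it is not the authors' exact scheme:

        import numpy as np

        def colorize_channel(Y, scribble, mask, sigma=0.05, n_iter=500):
            # Y: greyscale image in [0, 1]; scribble: chrominance values
            # where mask is True. Elsewhere, values are propagated so each
            # pixel converges to the similarity-weighted mean of its
            # neighbours. No sparse matrix is ever assembled.
            u = np.where(mask, scribble, 0.0)
            shifts = [(1, 0), (-1, 0), (0, 1), (0, -1)]
            # Neighbour weights from greyscale similarity (borders wrap,
            # which is acceptable for a sketch).
            w = [np.exp(-(Y - np.roll(Y, s, axis=(0, 1)))**2 / (2*sigma**2))
                 for s in shifts]
            wsum = sum(w)
            for _ in range(n_iter):
                acc = sum(wi * np.roll(u, s, axis=(0, 1))
                          for wi, s in zip(w, shifts))
                u = np.where(mask, scribble, acc / wsum)
            return u

    Each sweep does O(1) work per pixel, so memory stays at a few image-sized arrays rather than an N-by-N sparse system.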

  17. Optical radiation hazards of laser welding processes. Part 1: Neodymium-YAG laser.

    PubMed

    Rockwell, R J; Moss, C E

    1983-08-01

    High-power laser devices are being used for numerous metalworking processes such as welding, cutting, and heat treating. Such laser devices are totally enclosed either by the manufacturer or the end-user. When this is done, the total laser system is usually certified by the manufacturer following the federal requirements of the Code of Federal Regulations (CFR) 1040.10 and 1040.11 as a Class I laser system. Similarly, the end-user may also reclassify an enclosed high-power laser into the Class I category following the requirements of the American National Standards Institute (ANSI) Z-136.1 (1980) standard. There are, however, numerous industrial laser applications where Class IV systems are required to be used in an unenclosed manner. In such applications, there is concern for both ocular and skin hazards caused by direct and scattered laser radiation, as well as potential hazards caused by the optical radiation created by the laser beam's interaction with the metal (i.e. the plume radiation). Radiant energy measurements are reported for both the scattered laser radiation and the resultant plume radiation produced during typical unenclosed Class IV Neodymium-YAG laser welding processes. Evaluation of the plume radiation was done with both radiometric and spectroradiometric measurement equipment. The data obtained were compared to applicable safety standards.

  18. Low-Carbon Metallurgical Concepts for Seamless Octg Pipe

    NASA Astrophysics Data System (ADS)

    Mohrbacher, Hardy

    Seamless pipes are available with wall gages of up to 100 mm and outer diameters up to around 700 mm. Such pipes are typically used for oil country tubular goods as well as for structural applications. Due to market requirements, the demand for high-strength seamless pipe grades is increasing. Many applications need high toughness in addition to high strength. The different rolling processes applied in production depend on wall gage and pipe diameter. The continuous mandrel mill process is used to produce smaller gages and diameters; plug mill processing covers medium gages and diameters; Pilger mill processing allows producing larger diameters and heavy wall gages. In all these processes only a limited degree of thermo-mechanical rolling can be achieved. Therefore, strengthening and toughening by severe grain refinement employing a conventional niobium-based microalloying concept is not easily achievable. Accordingly, high-strength, high-toughness seamless pipe is typically produced via a quench and temper process route. This route, however, is costly and, moreover, often constitutes a capacity bottleneck in the mill. Innovative low-carbon alloy concepts, however, do allow producing strength up to grade X70 at very high toughness directly off the rolling plant, i.e., without quench and temper treatment. The low carbon content also greatly facilitates welding. The paper reveals the metallurgical principles, which are based on appropriate niobium and molybdenum alloying. Additionally, the paper demonstrates how heavy-gage seamless pipes up to 70 mm wall thickness can be produced based on a low-carbon Nb-Mo approach using quench and temper treatment.

  19. An Analysis of the Air Force Government Operated Civil Engineering Supply Store Logistic System: How Can It Be Improved?

    DTIC Science & Technology

    1990-09-01

    Topics: Logistics Systems; GOCESS Operation; Work Order Processing; Job Order Processing. [Fragmented scan; recoverable text: the routing of work orders (WOs) and job orders (JOs) to the Material Control Section is discussed separately. Figure 2 illustrates typical WO processing in a GOCESS operation, and Figure 3 illustrates typical JO processing, which is similar.]

  20. Beyond Self-Report: Tools to Compare Estimated and Real-World Smartphone Use

    PubMed Central

    Andrews, Sally; Ellis, David A.; Shaw, Heather; Piwek, Lukasz

    2015-01-01

    Psychologists typically rely on self-report data when quantifying mobile phone usage, despite little evidence of its validity. In this paper we explore the accuracy of using self-reported estimates when compared with actual smartphone use. We also include source code to process and visualise these data. We compared 23 participants’ actual smartphone use over a two-week period with self-reported estimates and the Mobile Phone Problem Use Scale. Our results indicate that estimated time spent using a smartphone may be an adequate measure of use, unless a greater resolution of data are required. Estimates concerning the number of times an individual used their phone across a typical day did not correlate with actual smartphone use. Neither estimated duration nor number of uses correlated with the Mobile Phone Problem Use Scale. We conclude that estimated smartphone use should be interpreted with caution in psychological research. PMID:26509895

  1. Heat flux sensor research and development: The cool film calorimeter

    NASA Technical Reports Server (NTRS)

    Abtahi, A.; Dean, P.

    1990-01-01

    The goal was to meet the measurement requirement of the NASP program for a gauge capable of measuring heat flux into a 'typical' structure in a 'typical' hypersonic flight environment. A device is conceptually described that has fast response times and is small enough to fit in leading edge or cowl lip structures. The device relies heavily on thin film technology. The main conclusion is the description of the limitations of thin film technology both in the art of fabrication and in the assumption that thin films have the same material properties as the original bulk material. Three gauges were designed and fabricated. Thin film deposition processes were evaluated. The effect of different thin film materials on the performance and fabrication of the gauge was studied. The gauges were tested in an arcjet facility. Survivability and accuracy were determined under various hostile environment conditions.

  2. Visualization of multi-INT fusion data using Java Viewer (JVIEW)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Aved, Alex; Nagy, James; Scott, Stephen

    2014-05-01

    Visualization is important for multi-intelligence fusion and we demonstrate issues for presenting physics-derived (i.e., hard) and human-derived (i.e., soft) fusion results. Physics-derived solutions (e.g., imagery) typically involve sensor measurements that are objective, while human-derived (e.g., text) typically involve language processing. Both results can be geographically displayed for user-machine fusion. Attributes of an effective and efficient display are not well understood, so we demonstrate issues and results for filtering, correlation, and association of data for users - be they operators or analysts. Operators require near-real time solutions while analysts have the opportunities of non-real time solutions for forensic analysis. In a use case, we demonstrate examples using the JVIEW concept that has been applied to piloting, space situation awareness, and cyber analysis. Using the open-source JVIEW software, we showcase a big data solution for multi-intelligence fusion application for context-enhanced information fusion.

  3. Beyond Self-Report: Tools to Compare Estimated and Real-World Smartphone Use.

    PubMed

    Andrews, Sally; Ellis, David A; Shaw, Heather; Piwek, Lukasz

    2015-01-01

    Psychologists typically rely on self-report data when quantifying mobile phone usage, despite little evidence of its validity. In this paper we explore the accuracy of using self-reported estimates when compared with actual smartphone use. We also include source code to process and visualise these data. We compared 23 participants' actual smartphone use over a two-week period with self-reported estimates and the Mobile Phone Problem Use Scale. Our results indicate that estimated time spent using a smartphone may be an adequate measure of use, unless a greater resolution of data are required. Estimates concerning the number of times an individual used their phone across a typical day did not correlate with actual smartphone use. Neither estimated duration nor number of uses correlated with the Mobile Phone Problem Use Scale. We conclude that estimated smartphone use should be interpreted with caution in psychological research.

  4. Hardware Architecture Study for NASA's Space Software Defined Radios

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Scardelletti, Maximilian C.; Mortensen, Dale J.; Kacpura, Thomas J.; Andro, Monty; Smith, Carl; Liebetreu, John

    2008-01-01

    This study defines a hardware architecture approach for software defined radios to enable commonality among NASA space missions. The architecture accommodates a range of reconfigurable processing technologies including general purpose processors, digital signal processors, field programmable gate arrays (FPGAs), and application-specific integrated circuits (ASICs) in addition to flexible and tunable radio frequency (RF) front-ends to satisfy varying mission requirements. The hardware architecture consists of modules, radio functions, and interfaces. The modules are a logical division of common radio functions that comprise a typical communication radio. This paper describes the architecture details, module definitions, and the typical functions on each module, as well as the module interfaces. Trade-offs between a component-based, custom architecture and a functional-based, open architecture are described. The architecture does not specify the internal physical implementation within each module, nor does it mandate the standards or ratings of the hardware used to construct the radios.

  5. Space Telecommunications Radio Systems (STRS) Hardware Architecture Standard: Release 1.0 Hardware Section

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Kacpura, Thomas J.; Smith, Carl R.; Liebetreu, John; Hill, Gary; Mortensen, Dale J.; Andro, Monty; Scardelletti, Maximilian C.; Farrington, Allen

    2008-01-01

    This report defines a hardware architecture approach for software-defined radios to enable commonality among NASA space missions. The architecture accommodates a range of reconfigurable processing technologies including general-purpose processors, digital signal processors, field programmable gate arrays, and application-specific integrated circuits (ASICs) in addition to flexible and tunable radiofrequency front ends to satisfy varying mission requirements. The hardware architecture consists of modules, radio functions, and interfaces. The modules are a logical division of common radio functions that compose a typical communication radio. This report describes the architecture details, the module definitions, the typical functions on each module, and the module interfaces. Tradeoffs between component-based, custom architecture and a functional-based, open architecture are described. The architecture does not specify a physical implementation internally on each module, nor does the architecture mandate the standards or ratings of the hardware used to construct the radios.

  6. Megawatt-Scale Application of Thermoelectric Devices in Thermal Power Plants

    NASA Astrophysics Data System (ADS)

    Knox, A. R.; Buckle, J.; Siviter, J.; Montecucco, A.; McCulloch, E.

    2013-07-01

    Despite the recent investment in renewable and sustainable energy sources, over 95% of the UK's electrical energy generation relies on the use of thermal power plants utilizing the Rankine cycle. Advanced supercritical Rankine cycle power plants typically have a steam temperature in excess of 600°C at a pressure of 290 bar and yet still have an overall efficiency below 50%, with much of this wasted energy being rejected to the environment through the condenser/cooling tower. This paper examines the opportunity for large-scale application of thermoelectric heat pumps to modify the Rankine cycle in such plants by preheating the boiler feedwater using energy recovered from the condenser system at a rate of approximately 1 MWth per °C temperature rise. A derivation of the improved process cycle efficiency and breakeven coefficient of performance required for economic operation is presented for a typical supercritical 600-MWe installation.
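    The headline figure above (roughly 1 MWth recovered per °C of feedwater temperature rise) invites a back-of-envelope look at the breakeven coefficient of performance. The following is our own simplified energy bookkeeping under stated assumptions (heat pumped into the feedwater displaces an equal amount of boiler fuel heat, which the plant would convert to electricity at efficiency eta), not the paper's derivation.

    ```python
    # Rough breakeven-COP bookkeeping; all values are assumptions except the
    # ~1 MWth per degC figure quoted in the abstract.
    eta = 0.45           # assumed overall plant efficiency
    q_per_degC = 1.0e6   # W_th recovered per degC of feedwater rise (abstract)
    delta_T = 10.0       # assumed feedwater temperature rise, degC

    q_recovered = q_per_degC * delta_T   # heat delivered to feedwater, W_th
    cop_breakeven = 1.0 / eta            # input power equals value of fuel saved
    for cop in (1.5, cop_breakeven, 4.0):
        w_in = q_recovered / cop         # heat-pump electrical input, W_e
        w_saved = eta * q_recovered      # electricity-equivalent of fuel displaced
        print(f"COP={cop:.2f}: input {w_in/1e6:.2f} MWe vs saving {w_saved/1e6:.2f} MWe")
    ```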

  7. Stalled RNAP-II molecules bound to non-coding rDNA spacers are required for normal nucleolus architecture.

    PubMed

    Freire-Picos, M A; Landeira-Ameijeiras, V; Mayán, María D

    2013-07-01

    The correct distribution of nuclear domains is critical for the maintenance of normal cellular processes such as transcription and replication, which are regulated depending on their location and surroundings. The most well-characterized nuclear domain, the nucleolus, is essential for cell survival and metabolism. Alterations in nucleolar structure affect nuclear dynamics; however, how the nucleolus and the rest of the nuclear domains are interconnected is largely unknown. In this report, we demonstrate that RNAP-II is vital for the maintenance of the typical crescent-shaped structure of the nucleolar rDNA repeats and rRNA transcription. When stalled RNAP-II molecules are not bound to the chromatin, the nucleolus loses its typical crescent-shaped structure. However, the RNAP-II interaction with Seh1p, or cryptic transcription by RNAP-II, is not critical for morphological changes. Copyright © 2013 John Wiley & Sons, Ltd.

  8. Synthesis and evaluation of a series of 6-chloro-4-methylumbelliferyl glycosides as fluorogenic reagents for screening metagenomic libraries for glycosidase activity.

    PubMed

    Chen, Hong-Ming; Armstrong, Zachary; Hallam, Steven J; Withers, Stephen G

    2016-02-08

    Screening of large enzyme libraries such as those derived from metagenomic sources requires sensitive substrates. Fluorogenic glycosides offer the best sensitivity but typically must be used in a stopped format to generate good signal. Use of fluorescent phenols of pKa < 7, such as halogenated coumarins, allows direct screening at neutral pH. The synthesis and characterisation of a set of nine different glycosides of 6-chloro-4-methylumbelliferone are described. The use of these substrates in a pooled format for screening of expressed metagenomic libraries yielded a "hit rate" of 1 in 60. Hits were then readily deconvoluted with the individual substrates in a single plate to identify specific activities within each clone. The use of such a collection of substrates greatly accelerates the screening process. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Design and Manufacture of Wood Blades for Windtunnel Fans

    NASA Technical Reports Server (NTRS)

    Richardson, S. E.

    1998-01-01

    Many windtunnels use wooden fan blades; however, because of their long service life (often in excess of 50 years), wooden blades rarely need to be replaced, and the expertise for designing and building wooden windtunnel fan blades is being lost. The purpose of this report is to document the design and build process so that when replacement blades are eventually needed, the critical information is available. Information useful to fan-blade designers, fabricators, inspectors, and windtunnel operations personnel is included. Fixed-pitch and variable-pitch fans, as well as fans ranging in size from a few feet to over 40 ft. in diameter, are described. Woods, adhesives, and coverings are discussed.

  10. Doing Systems Engineering Without Thinking About It at NASA Dryden Flight Research Center

    NASA Technical Reports Server (NTRS)

    Bohn-Meyer, Marta; Kilp, Stephen; Chun, Peggy; Mizukami, Masashi

    2004-01-01

    When asked about his processes in designing a new airplane, Burt Rutan responded: ...there is always a performance requirement. So I start with the basic physics of an airplane that can get those requirements, and that pretty much sizes an airplane... Then I look at the functionality... And then I try a lot of different configurations to meet that, and then justify one at a time, throwing them out... Typically I'll have several different configurations... But I like to experiment, certainly. I like to see if there are other ways to provide the utility. This kind of thinking, treating engineering as a total systems engineering approach, is what is being instilled in all engineers at the NASA Dryden Flight Research Center.

  11. Tailoring Functional Chitosan-based Composites for Food Applications.

    PubMed

    Nunes, Cláudia; Coimbra, Manuel A; Ferreira, Paula

    2018-03-08

    Chitosan-based functional materials are emerging for food applications. Covalent bonding of molecular entities has been shown to enhance resistance to the typical acidity of food while conferring mechanical and moisture/gas barrier properties. Moreover, grafting functional molecules such as phenolic compounds or essential oils onto chitosan confers antioxidant and antimicrobial properties, among others. The addition of nanofillers to chitosan and other biopolymers improves the properties required for food applications and can impart electrical conductivity and magnetic properties for active and intelligent packaging. Electrical conductivity is a required property for the processing of food at low temperature using electric fields or for sensor applications. © 2018 The Chemical Society of Japan & Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Numerical Studies of Impurities in Fusion Plasmas

    DOE R&D Accomplishments Database

    Hulse, R. A.

    1982-09-01

    The coupled partial differential equations used to describe the behavior of impurity ions in magnetically confined controlled fusion plasmas require numerical solution for cases of practical interest. Computer codes developed for impurity modeling at the Princeton Plasma Physics Laboratory are used as examples of the types of codes employed for this purpose. These codes solve for the impurity ionization state densities and associated radiation rates using atomic physics appropriate for these low-density, high-temperature plasmas. The simpler codes solve local equations in zero spatial dimensions while more complex cases require codes which explicitly include transport of the impurity ions simultaneously with the atomic processes of ionization and recombination. Typical applications are discussed and computational results are presented for selected cases of interest.
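    As an illustration of the zero-spatial-dimension class of codes described here, the sketch below integrates a minimal ionization/recombination balance across four charge states. The rate coefficients are placeholders, not the atomic data used in the Princeton codes.

    ```python
    # dn_Z/dt = n_e * (S_{Z-1} n_{Z-1} - (S_Z + R_Z) n_Z + R_{Z+1} n_{Z+1})
    import numpy as np
    from scipy.integrate import solve_ivp

    S = np.array([1e-14, 5e-15, 1e-15, 0.0])   # ionization coeffs, m^3/s (placeholders)
    R = np.array([0.0, 1e-15, 5e-15, 1e-14])   # recombination coeffs (placeholders)
    n_e = 1e19                                 # electron density, m^-3

    def rhs(t, n):
        dn = np.empty_like(n)
        for z in range(len(n)):
            dn[z] = -n_e * (S[z] + R[z]) * n[z]
            if z > 0:
                dn[z] += n_e * S[z - 1] * n[z - 1]   # ionization from stage below
            if z < len(n) - 1:
                dn[z] += n_e * R[z + 1] * n[z + 1]   # recombination from above
        return dn

    n0 = np.array([1e16, 0.0, 0.0, 0.0])       # impurities start neutral
    sol = solve_ivp(rhs, (0.0, 1.0), n0, method="LSODA")
    print(sol.y[:, -1])                        # charge-state densities at t = 1 s
    ```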

  13. The longevity of habitable planets and the development of intelligent life

    NASA Astrophysics Data System (ADS)

    Simpson, Fergus

    2017-07-01

    Why did the emergence of our species require a timescale similar to the entire habitable period of our planet? Our late appearance has previously been interpreted by Carter (2008) as evidence that observers typically require a very long development time, implying that intelligent life is a rare occurrence. Here we present an alternative explanation, which simply asserts that many planets possess brief periods of habitability. We also propose that the rate-limiting step for the formation of observers is the enlargement of species from an initially microbial state. In this scenario, the development of intelligent life is a slow but almost inevitable process, greatly enhancing the prospects of future search for extra-terrestrial intelligence (SETI) experiments such as the Breakthrough Listen project.

  14. User Interactive Software for Analysis of Human Physiological Data

    NASA Technical Reports Server (NTRS)

    Cowings, Patricia S.; Toscano, William; Taylor, Bruce C.; Acharya, Soumydipta

    2006-01-01

    Ambulatory physiological monitoring has been used to study human health and performance in space and in a variety of Earth-based environments (e.g., military aircraft, armored vehicles, small groups in isolation, and patients). Large, multi-channel data files are typically recorded in these environments, and these files often require the removal of contaminated data prior to processing and analyses. Physiological data processing can now be performed with user-friendly, interactive software developed by the Ames Psychophysiology Research Laboratory. This software, which runs on a Windows platform, contains various signal-processing routines for both time- and frequency-domain data analyses (e.g., peak detection, differentiation and integration, digital filtering, adaptive thresholds, Fast Fourier Transform power spectrum, auto-correlation, etc.). Data acquired with any ambulatory monitoring system that provides text or binary file format are easily imported to the processing software. The application provides a graphical user interface where one can manually select and correct data artifacts utilizing linear and zero interpolation and adding trigger points for missed peaks. Block and moving average routines are also provided for data reduction. Processed data in numeric and graphic format can be exported to Excel. This software, PostProc (for post-processing), requires the Dadisp engineering spreadsheet (DSP Development Corp), or equivalent, for implementation. Specific processing routines were written for electrocardiography, electroencephalography, electromyography, blood pressure, skin conductance level, impedance cardiography (cardiac output, stroke volume, thoracic fluid volume), temperature, and respiration.
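    A rough sketch of two of the routines named above (peak detection and artifact interpolation), written in generic scipy/numpy terms, may clarify what such software does; it is illustrative only, not the PostProc/Dadisp implementation.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks

    def bandpass(x, fs, lo, hi, order=4):
        """Zero-phase band-pass filter."""
        b, a = butter(order, [lo, hi], btype="band", fs=fs)
        return filtfilt(b, a, x)

    def r_peaks(ecg, fs):
        """Indices of candidate R-peaks in an ECG trace."""
        filtered = bandpass(ecg, fs, 5.0, 30.0)                  # typical QRS band
        peaks, _ = find_peaks(filtered, distance=int(0.4 * fs))  # >= 0.4 s apart
        return peaks

    def interpolate_artifacts(x, bad_mask):
        """Replace samples flagged as artifact with linear interpolation."""
        x = x.astype(float).copy()
        idx = np.arange(len(x))
        x[bad_mask] = np.interp(idx[bad_mask], idx[~bad_mask], x[~bad_mask])
        return x
    ```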

  15. Mechanical and biological behavior of ultrafine-grained Ti alloy aneurysm clip processed using high-pressure torsion.

    PubMed

    Um, Ho Yong; Park, Byung Ho; Ahn, Dong-Hyun; Abd El Aal, Mohamed Ibrahim; Park, Jaechan; Kim, Hyoung Seop

    2017-04-01

    Severe plastic deformation (SPD) has recently been advanced as the main process for fabricating bulk ultrafine grained or nanocrystalline metallic materials, which present much higher strength and better bio-compatibility than coarse-grained counterparts. Medical devices, such as aneurysm clips and dental implants, require high mechanical and biological performance (e.g., stiffness, yield strength, fatigue resistance, and bio-compatibility). These requirements match well the characteristics of SPD-processed materials. Typical aneurysm clips are made of a commercial Ti-6Al-4V alloy, which has higher yield strength than Ti. In this work, Ti and Ti-6Al-4V workpieces were processed by high-pressure torsion (HPT) to enhance their mechanical properties. Tensile tests and hardness tests were performed to evaluate their mechanical properties, and their microstructure was investigated. The hardness and yield stress of the HPT-processed Ti are comparable to those of the initial Ti-6Al-4V due to significantly refined microstructure. Finite element analyses for evaluating the opening performance of a specific geometry of the YASARGIL aneurysm clip were carried out using mechanical properties of the initial and HPT-processed Ti and Ti-6Al-4V. These results indicate that SPD-processed Ti could be a good candidate to substitute for Ti-6Al-4V in aneurysm clips. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Firmware Development Improves System Efficiency

    NASA Technical Reports Server (NTRS)

    Chern, E. James; Butler, David W.

    1993-01-01

    Most manufacturing processes require physical pointwise positioning of the components or tools from one location to another. Typical mechanical systems utilize either stop-and-go or fixed feed-rate progression to accomplish the task. The first approach achieves positional accuracy but prolongs overall time and increases wear on the mechanical system. The second approach sustains the throughput but compromises positional accuracy. A computer firmware approach has been developed to optimize this pointwise mechanism by utilizing programmable interrupt controls to synchronize engineering processes 'on the fly'. This principle has been implemented in an eddy current imaging system to demonstrate the improvement. Software programs were developed that enable a mechanical controller card to transmit interrupts to a system controller as a trigger signal to initiate an eddy current data acquisition routine. The advantages are: (1) optimized manufacturing processes, (2) increased throughput of the system, (3) improved positional accuracy, and (4) reduced wear and tear on the mechanical system.

  17. Fabrication of multilayered conductive polymer structures via selective visible light photopolymerization

    NASA Astrophysics Data System (ADS)

    Cullen, Andrew T.; Price, Aaron D.

    2017-04-01

    Electropolymerization of pyrrole is commonly employed to fabricate intrinsically conductive polymer films that exhibit desirable electromechanical properties. Due to their monolithic nature, electroactive polypyrrole films produced via this process are typically limited to simple linear or bending actuation modes, which has hindered their application in complex actuation tasks. This initiative aims to develop the specialized fabrication methods and polymer formulations required to realize three-dimensional conductive polymer structures capable of more elaborate actuation modes. Our group has previously reported the application of the digital light processing additive manufacturing process for the fabrication of three-dimensional conductive polymer structures using ultraviolet radiation. In this investigation, we further expand upon this initial work and present an improved polymer formulation designed for digital light processing additive manufacturing using visible light. This technology enables the design of novel electroactive polymer sensors and actuators with enhanced capabilities and brings us one step closer to realizing more advanced electroactive polymer enabled devices.

  18. Fabrication of Circuit QED Quantum Processors, Part 2: Advanced Semiconductor Manufacturing Perspectives

    NASA Astrophysics Data System (ADS)

    Michalak, D. J.; Bruno, A.; Caudillo, R.; Elsherbini, A. A.; Falcon, J. A.; Nam, Y. S.; Poletto, S.; Roberts, J.; Thomas, N. K.; Yoscovits, Z. R.; Dicarlo, L.; Clarke, J. S.

    Experimental quantum computing is rapidly approaching the integration of sufficient numbers of quantum bits for interesting applications, but many challenges still remain. These challenges include: realization of an extensible design for large array scale up, sufficient material process control, and discovery of integration schemes compatible with industrial 300 mm fabrication. We present recent developments in extensible circuits with vertical delivery. Toward the goal of developing a high-volume manufacturing process, we will present recent results on a new Josephson junction process that is compatible with current tooling. We will then present the improvements in NbTiN material uniformity that typical 300 mm fabrication tooling can provide. While initial results on few-qubit systems are encouraging, advanced processing control is expected to deliver the improvements in qubit uniformity, coherence time, and control required for larger systems. Research funded by Intel Corporation.

  19. DWPF Recycle Evaporator Simulant Tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stone, M

    2005-04-05

    Testing was performed to determine the feasibility and processing characteristics of an evaporation process to reduce the volume of the recycle stream from the Defense Waste Processing Facility (DWPF). The concentrated recycle would be returned to DWPF while the overhead condensate would be transferred to the Effluent Treatment Plant. Various blends of evaporator feed were tested using simulants developed from characterization of actual recycle streams from DWPF and input from DWPF-Engineering. The simulated feed was evaporated in laboratory-scale apparatus to target a 30X volume reduction. Condensate and concentrate samples from each run were analyzed, and the process characteristics (foaming, scaling, etc.) were visually monitored during each run. The following conclusions were made from the testing: Concentration of the "typical" recycle stream in DWPF by 30X was feasible. The addition of DWTT recycle streams to the typical recycle stream raises the solids content of the evaporator feed considerably and lowers the amount of concentration that can be achieved. Foaming was noted during all evaporation tests and must be addressed prior to operation of the full-scale evaporator. Tests were conducted that identified Dow Corning 2210 as an antifoam candidate that warrants further evaluation. The condensate has the potential to exceed the ETP WAC for mercury, silicon, and TOC. Controlling the amount of equipment decontamination recycle in the evaporator blend would help meet the TOC limits. The evaporator condensate will be saturated with mercury, and elemental mercury will collect in the evaporator condensate collection vessel. No scaling on heating surfaces was noted during the tests, but splatter onto the walls of the evaporation vessels led to a buildup of solids. These solids were difficult to remove with 2M nitric acid. Precipitation of solids was not noted during the testing. Some of the aluminum present in the recycle streams was converted from gibbsite to aluminum oxide during the evaporation process. The following recommendations were made: Recycle from the DWTT should be metered in slowly to the "typical" recycle streams to avoid spikes in solids content, to allow consistent processing, and to avoid process upsets. Additional studies should be conducted to determine acceptable volume ratios for the HEME dissolution and decontamination solutions in the evaporator feed. Dow Corning 2210 antifoam should be evaluated for use to control foaming. Additional tests are required to determine the concentration of antifoam required to prevent foaming during startup, the frequency of antifoam additions required to control foaming during steady-state processing, and the ability of the antifoam to control foam over a range of potential feed compositions. This evaluation should also include evaluation of the degradation of the antifoam and its impact on the silicon and TOC content of the condensate. The caustic HEME dissolution recycle stream should be neutralized to at least pH 7 prior to blending with the acidic recycle streams. Dow Corning 2210 should be used during the evaporation testing using the radioactive recycle samples received from DWPF. Evaluation of additional antifoam candidates should be conducted as a backup for Dow Corning 2210. A camera and/or foam detection instrument should be included in the evaporator design to allow monitoring of the foaming behavior during operation. The potential for foam formation and high solids content should be considered during the design of the evaporator vessel.

  20. Single-step affinity purification of enzyme biotherapeutics: a platform methodology for accelerated process development.

    PubMed

    Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean

    2014-01-01

    Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT). © 2014 American Institute of Chemical Engineers.
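    To make the predictive-analytics step concrete, here is a minimal sketch of fitting a model that maps attributes measured on single-step affinity-purified material to the drug-substance-equivalent values a multi-step train would yield. The data and feature names are synthetic stand-ins; the paper does not publish its model.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    affinity_cqa = rng.normal(size=(30, 3))   # e.g., charge variants, glycans, activity
    # Synthetic "ground truth" drug-substance attribute, for demonstration only.
    ds_cqa = affinity_cqa @ np.array([0.9, 0.1, 0.0]) + rng.normal(0.0, 0.02, 30)

    model = LinearRegression().fit(affinity_cqa, ds_cqa)
    predicted = model.predict(affinity_cqa[:5])   # DS-equivalent values without making DS
    ```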

  1. The SpaceCube Family of Hybrid On-Board Science Data Processors: An Update

    NASA Astrophysics Data System (ADS)

    Flatley, T.

    2012-12-01

    SpaceCube is an FPGA based on-board hybrid science data processing system developed at the NASA Goddard Space Flight Center (GSFC). The goal of the SpaceCube program is to provide 10x to 100x improvements in on-board computing power while lowering relative power consumption and cost. The SpaceCube design strategy incorporates commercial rad-tolerant FPGA technology and couples it with an upset mitigation software architecture to provide "order of magnitude" improvements in computing power over traditional rad-hard flight systems. Many of the missions proposed in the Earth Science Decadal Survey (ESDS) will require "next generation" on-board processing capabilities to meet their specified mission goals. Advanced laser altimeter, radar, lidar and hyper-spectral instruments are proposed for at least ten of the ESDS missions, and all of these instrument systems will require advanced on-board processing capabilities to facilitate the timely conversion of Earth Science data into Earth Science information. Both an "order of magnitude" increase in processing power and the ability to "reconfigure on the fly" are required to implement algorithms that detect and react to events, to produce data products on-board for applications such as direct downlink, quick look, and "first responder" real-time awareness, to enable "sensor web" multi-platform collaboration, and to perform on-board "lossless" data reduction by migrating typical ground-based processing functions on-board, thus reducing on-board storage and downlink requirements. This presentation will highlight a number of SpaceCube technology developments to date and describe current and future efforts, including the collaboration with the U.S. Department of Defense - Space Test Program (DoD/STP) on the STP-H4 ISS experiment pallet (launch June 2013) that will demonstrate SpaceCube 2.0 technology on-orbit.

  2. N-ViroTech--a novel process for the treatment of nutrient limited wastewaters.

    PubMed

    Slade, A H; Gapes, D J; Stuthridge, T R; Anderson, S M; Dare, P H; Pearson, H G W; Dennis, M

    2004-01-01

    As pulp and paper wastewaters are mostly deficient in nitrogen and phosphorus, historical practice has dictated that they cannot be effectively treated using microbiological processes without the addition of supplementary nutrients, such as urea and phosphoric acid. Supplementation is a difficult step to manage efficiently, requiring extensive post-treatment monitoring and some degree of overdosing to ensure sufficient nutrient availability under all conditions. As a result, treated wastewaters usually contain excess amounts of both nutrients, leading to potential impacts on the receiving waters such as eutrophication. N-ViroTech is a highly effective alternative treatment technology which overcomes this nutrient deficiency/excess paradox. The process relies on communities of nitrogen-fixing bacteria, which are able to directly fix nitrogen from the atmosphere, thus satisfying their cellular nitrogen requirements. The process relies on manipulation of growth conditions within the biological system to maintain a nitrogen-fixing population whilst achieving target wastewater treatment performance. The technology has significant advantages over conventional activated sludge operation, including: Improved environmental performance. Nutrient loadings in the final treated effluent for selected nitrogen and phosphorus species (particularly ammonium and orthophosphate) may be reduced by over 90% compared to conventional systems; Elimination of nitrogen supplementation, and minimisation of phosphorus supplementation, thus achieving significant chemical savings and resulting in between 25% and 35% savings in operational costs for a typical system; Self-regulation of nutrient requirements, as the bacteria only use as much nitrogen as they require, allowing for substantially less operator intervention and monitoring. This paper will summarise critical performance outcomes of the N-ViroTech process utilising results from laboratory-, pilot-scale and recent alpha-adopter, full-scale trials.

  3. Automated Extraction of Flow Features

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne (Technical Monitor); Haimes, Robert

    2005-01-01

    Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to efficiently and effectively use the results of a CFD simulation, visualization tools are often used. These tools are used in all stages of the CFD simulation including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device, as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest are detecting features such as shocks, re-circulation zones, and vortices (which will highlight areas of stress and loss). As the demand for CFD analyses continues to increase the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools like iso-surface, cuts, and streamlines were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.
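    As one concrete instance of automated feature extraction, the sketch below flags vortex candidates on a 2D velocity field using the Q-criterion. It illustrates the class of techniques discussed, not the authors' specific detector.

    ```python
    import numpy as np

    def q_criterion_2d(u, v, dx, dy):
        """Q > 0 marks rotation-dominated regions; u, v indexed as [x, y]."""
        du_dx, du_dy = np.gradient(u, dx, dy, edge_order=2)
        dv_dx, dv_dy = np.gradient(v, dx, dy, edge_order=2)
        s11, s22 = du_dx, dv_dy          # strain-rate components
        s12 = 0.5 * (du_dy + dv_dx)
        w12 = 0.5 * (du_dy - dv_dx)      # rotation-rate component
        # Q = 0.5 * (||Omega||^2 - ||S||^2) for the velocity-gradient tensor
        return 0.5 * (2 * w12**2 - (s11**2 + s22**2 + 2 * s12**2))

    # vortex_mask = q_criterion_2d(u, v, dx, dy) > 0
    ```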

  4. Automated Extraction of Flow Features

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne (Technical Monitor); Haimes, Robert

    2004-01-01

    Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to efficiently and effectively use the results of a CFD simulation, visualization tools are often used. These tools are used in all stages of the CFD simulation including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device, as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest are detecting features such as shocks, recirculation zones, and vortices (which will highlight areas of stress and loss). As the demand for CFD analyses continues to increase the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools like iso-surface, cuts, and streamlines were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.

  5. Space station needs, attributes and architectural options study. Volume 3: Requirements

    NASA Technical Reports Server (NTRS)

    1983-01-01

    A typical system specification format is presented and requirements are compiled. A Program Specification Tree is presented, showing a high-inclination space station and a low-inclination space station with their typical element breakdowns; the blocks along the top represent interfaces with other systems. The specification format is directed at the low-inclination space station.

  6. Disappearance of patulin during alcoholic fermentation of apple juice.

    PubMed

    Stinson, E E; Osman, S F; Huhtanen, C N; Bills, D D

    1978-10-01

    Eight yeast strains were used in three typical American processes to ferment apple juice containing 15 mg of added patulin per liter. Patulin was reduced to less than the minimum detectable level of 50 microgram/liter in all but two cases; in all cases, the level of patulin was reduced by over 99% during alcoholic fermentation. In unfermented samples of apple juice, the concentration of added patulin declined by only 10% when the juice was held for 2 weeks, a period equivalent to the time required for fermentation.

  7. Role of Microbes in the Smectite-to-Illite Reaction

    USGS Publications Warehouse

    Kim, J.; Dong, H.; Seabaugh, J.; Newell, Steven W.; Eberl, D.D.

    2004-01-01

    Temperature, pressure, and time have been thought to control the smectite-to-illite (S-I) reaction, an important diagenetic process used for petroleum exploration. We demonstrated that microorganisms can promote the S-I reaction by dissolving smectite through reduction of structural Fe(III) at room temperature and 1 atmosphere within 14 days. This reaction typically requires conditions of 300° to 350°C, 100 megapascals, and 4 to 5 months in the absence of microbial activity. These results challenge the conventional concept of the S-I reaction and of reaction kinetic models.

  8. Role of microbes in the smectite-to-illite reaction

    NASA Technical Reports Server (NTRS)

    Kim, Jinwook; Dong, Hailiang; Seabaugh, Jennifer; Newell, Steven W.; Eberl, Dennis D.

    2004-01-01

    Temperature, pressure, and time have been thought to control the smectite-to-illite (S-I) reaction, an important diagenetic process used for petroleum exploration. We demonstrated that microorganisms can promote the S-I reaction by dissolving smectite through reduction of structural Fe(III) at room temperature and 1 atmosphere within 14 days. This reaction typically requires conditions of 300 degrees to 350 degrees C, 100 megapascals, and 4 to 5 months in the absence of microbial activity. These results challenge the conventional concept of the S-I reaction and of reaction kinetic models.

  9. Versatile, High Quality and Scalable Continuous Flow Production of Metal-Organic Frameworks

    PubMed Central

    Rubio-Martinez, Marta; Batten, Michael P.; Polyzos, Anastasios; Carey, Keri-Constanti; Mardel, James I.; Lim, Kok-Seng; Hill, Matthew R.

    2014-01-01

    Further deployment of Metal-Organic Frameworks in applied settings requires their ready preparation at scale. Expansion of typical batch processes can lead to unsuccessful or low quality synthesis for some systems. Here we report how continuous flow chemistry can be adapted as a versatile route to a range of MOFs, by emulating conditions of lab-scale batch synthesis. This delivers ready synthesis of three different MOFs, with surface areas that closely match theoretical maxima, with production rates of 60 g/h at extremely high space-time yields. PMID:24962145

  10. Using a value chain approach for effective decision making.

    PubMed

    Wilner, N A

    1997-09-01

    Effectively managing costs in a healthcare environment may require taking a new look at how those costs are evaluated. The price of a product is not necessarily the most effective or efficient way of determining the actual cost. Using a value chain approach takes into consideration the functional costs of using a product as well, including both the "process" and "downstream" costs to an organization. In this article, Associate Professor Neil A. Wilner examines the differences between price and cost using a typical purchase in a healthcare environment.

  11. Method Developed for Improving the Thermomechanical Properties of Silicon Carbide Matrix Composites

    NASA Technical Reports Server (NTRS)

    Bhatt, Ramakrishna T.; DiCarlo, James A.

    2004-01-01

    Today, a major thrust for achieving engine components with improved thermal capability is the development of fiber-reinforced silicon-carbide (SiC) matrix composites. These materials are not only lighter and capable of higher use temperatures than state-of-the-art metallic alloys and oxide matrix composites (approx. 1100 C), but they can provide significantly better static and dynamic toughness than unreinforced silicon-based monolithic ceramics. However, for successful application in advanced engine systems, the SiC matrix composites should be able to withstand component service stresses and temperatures for the desired component lifetime. Since the high-temperature structural life of ceramic materials is typically controlled by creep-induced flaw growth, a key composite property requirement is the ability to display high creep resistance under these conditions. Also, because of the possibility of severe thermal gradients in the components, the composites should provide maximum thermal conductivity to minimize the development of thermal stresses. State-of-the-art SiC matrix composites are typically fabricated via a three-step process: (1) fabrication of a component-shaped architectural preform reinforced by high-performance fibers, (2) chemical vapor infiltration of a fiber coating material such as boron nitride (BN) into the preform, and (3) infiltration of a SiC matrix into the remaining porous areas in the preform. Generally, the highest performing composites have matrices fabricated by the CVI process, which produces a SiC matrix typically more thermally stable and denser than matrices formed by other approaches. As such, the CVI SiC matrix is able to provide better environmental protection to the coated fibers, plus provide the composite with better resistance to crack propagation. Also, the denser CVI SiC matrix should provide optimal creep resistance and thermal conductivity to the composite. However, for adequate preform infiltration, the CVI SiC matrix process typically has to be conducted at temperatures below 1100 C, which results in a SiC matrix that is fairly dense, but contains metastable atomic defects and is nonstoichiometric because of a small amount of excess silicon. Because these defects typically exist at the matrix grain boundaries, they can scatter thermal phonons and degrade matrix creep resistance by enhancing grain-boundary sliding. To eliminate these defects and improve the thermomechanical properties of ceramic composites with CVI SiC matrices, researchers at the NASA Glenn Research Center developed a high-temperature treatment process that can be used after the CVI SiC matrix is deposited into the fiber preform.

  12. Optimal post-experiment estimation of poorly modeled dynamic systems

    NASA Technical Reports Server (NTRS)

    Mook, D. Joseph

    1988-01-01

    Recently, a novel strategy for post-experiment state estimation of discretely-measured dynamic systems has been developed. The method accounts for errors in the system dynamic model equations in a more general and rigorous manner than do filter-smoother algorithms. The dynamic model error terms do not require the usual process noise assumptions of zero-mean, symmetrically distributed random disturbances. Instead, the model error terms require no prior assumptions other than piecewise continuity. The resulting state estimates are more accurate than filters for applications in which the dynamic model error clearly violates the typical process noise assumptions, and the available measurements are sparse and/or noisy. Estimates of the dynamic model error, in addition to the states, are obtained as part of the solution of a two-point boundary value problem, and may be exploited for numerous reasons. In this paper, the basic technique is explained, and several example applications are given. Included among the examples are both state estimation and exploitation of the model error estimates.

  13. CMM Data Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Due to the increase in the use of Coordinate Measuring Machines (CMMs) to measure fine details and complex geometries in manufacturing, many programs have been made to compile and analyze the data. These programs typically require extensive setup to determine the expected results in order to not only track the pass/fail of a dimension, but also to use statistical process control (SPC). These extra steps and setup times have been addressed through the CMM Data Analysis Tool, which only requires the output of the CMM to provide both pass/fail analysis on all parts run to the same inspection program as well as provide graphs which help visualize where the part measures within the allowed tolerances. This provides feedback not only to the customer for approval of a part during development, but also to machining process engineers to identify when any dimension is drifting towards an out of tolerance condition during production. This program can handle hundreds of parts with complex dimensions and will provide an analysis within minutes.
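    A minimal sketch of the pass/fail and SPC logic described above, assuming a hypothetical CSV export with columns part_id, dimension, measured, nominal, tol_minus, and tol_plus; this is not the tool's actual code.

    ```python
    import pandas as pd

    def analyze(cmm_csv):
        df = pd.read_csv(cmm_csv)
        # Pass/fail against the drawing tolerance band for each measurement.
        df["pass"] = df["measured"].between(df["nominal"] - df["tol_minus"],
                                            df["nominal"] + df["tol_plus"])
        # Shewhart-style 3-sigma limits per dimension across all parts run
        # against the same inspection program, to flag drift toward a limit.
        spc = df.groupby("dimension")["measured"].agg(["mean", "std"])
        spc["ucl"] = spc["mean"] + 3 * spc["std"]
        spc["lcl"] = spc["mean"] - 3 * spc["std"]
        return df, spc
    ```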

  14. Preparing for in situ processing on upcoming leading-edge supercomputers

    DOE PAGES

    Kress, James; Churchill, Randy Michael; Klasky, Scott; ...

    2016-10-01

    High performance computing applications are producing increasingly large amounts of data and placing enormous stress on current capabilities for traditional post-hoc visualization techniques. Because of the growing compute and I/O imbalance, data reductions, including in situ visualization, are required. These reduced data are used for analysis and visualization in a variety of different ways. Many of the visualization and analysis requirements are known a priori, but when they are not, scientists are dependent on the reduced data to accurately represent the simulation in post hoc analysis. The contribution of this paper is a description of the directions we are pursuing to help a large-scale fusion simulation code succeed on the next generation of supercomputers. These directions include the role of in situ processing for performing data reductions, as well as the tradeoffs between data size and data integrity within the context of complex operations in a typical scientific workflow.

  15. Advancing the Evidence Base of Rehabilitation Treatments: A Developmental Approach

    PubMed Central

    Whyte, John; Barrett, A.M.

    2013-01-01

    Translational research refers to the development of new scientific discoveries into evidence-based treatments for human diseases and conditions. This developmental process requires that a number of scientific, as well as social and psychological obstacles, be overcome during a sequence of research stages that address different goals. Rehabilitation, like other biomedical disciplines, requires this kind of developmental process. For a variety of reasons, however, development of rehabilitation treatments is less linear than the familiar phases of pharmaceutical research. In addition, research on treatments intended to address impairments (body structure/function, in terms of the International Classification of Functioning, Disability and Health), faces the challenge of determining the likely impact of an impairment-level treatment on the multifaceted activities and aspects of participation that are the typical goals of rehabilitation treatments. This article describes the application of treatment theory and enablement theory to the development of new impairment-based treatments, and examines similarities and differences between the developmental sequence needed for rehabilitation treatment research versus pharmaceutical research in other areas of medicine. PMID:22683206

  16. Continuum and discrete approach in modeling biofilm development and structure: a review.

    PubMed

    Mattei, M R; Frunzo, L; D'Acunto, B; Pechaud, Y; Pirozzi, F; Esposito, G

    2018-03-01

    The scientific community has recognized that almost 99% of the microbial life on earth is represented by biofilms. Considering the impacts of their sessile lifestyle on both natural and human activities, extensive experimental activity has been carried out to understand how biofilms grow and interact with the environment. Many mathematical models have also been developed to simulate and elucidate the main processes characterizing the biofilm growth. Two main mathematical approaches for biomass representation can be distinguished: continuum and discrete. This review is aimed at exploring the main characteristics of each approach. Continuum models can simulate the biofilm processes in a quantitative and deterministic way. However, they require a multidimensional formulation to take into account the biofilm spatial heterogeneity, which makes the models quite complicated, requiring significant computational effort. Discrete models are more recent and can represent the typical multidimensional structural heterogeneity of biofilm reflecting the experimental expectations, but they generate computational results including elements of randomness and introduce stochastic effects into the solutions.
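    For flavor, a drastically simplified continuum-style sketch: biofilm thickness grows by Monod-limited biomass production and erodes by a detachment term. The parameter values are placeholders, and this toy model is far cruder than the 1D and multidimensional formulations the review covers.

    ```python
    from scipy.integrate import solve_ivp

    mu_max, K_s, S = 0.3, 2.0, 5.0   # 1/h, g/m^3, g/m^3 (placeholder values)
    k_det = 0.02                     # detachment coefficient (placeholder)

    def dL_dt(t, L):
        growth = mu_max * S / (K_s + S) * L[0]   # substrate-limited growth
        detachment = k_det * L[0] ** 2           # erosion grows with thickness
        return [growth - detachment]

    sol = solve_ivp(dL_dt, (0.0, 200.0), [1e-5])
    print(sol.y[0, -1])              # thickness approaching growth/detachment balance
    ```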

  17. Continued Data Acquisition Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwellenbach, David

    This task focused on improving techniques for integrating data acquisition of secondary particles correlated in time with detected cosmic-ray muons. Scintillation detectors with Pulse Shape Discrimination (PSD) capability show the most promise as a detector technology based on work in FY13. Typically, PSD parameters are determined prior to an experiment and the results are based on these parameters. By saving data in list mode, including the fully digitized waveform, any experiment can effectively be replayed to adjust PSD and other parameters for the best data capture. List mode requires time synchronization of two independent data acquisition (DAQ) systems: the muon tracker and the particle detector system. Techniques to synchronize these systems were studied. Two basic techniques were identified: real-time mode and sequential mode. Real-time mode is the preferred approach but has proven to be a significant challenge since two FPGA systems with different clocking parameters must be synchronized. Sequential processing is expected to work with virtually any DAQ but requires more post-processing to extract the data.
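    Offline ("sequential mode") correlation of the two independently clocked streams can be sketched as coincidence matching within a time window after removing a fitted clock offset. The function below is illustrative only; the offset and window values would come from calibration.

    ```python
    import numpy as np

    def coincidences(t_tracker, t_detector, offset, window):
        """(i, j) pairs with |t_detector[j] - t_tracker[i] - offset| <= window.
        t_detector must be sorted ascending."""
        pairs = []
        for i, t in enumerate(t_tracker):
            lo = np.searchsorted(t_detector, t + offset - window)
            hi = np.searchsorted(t_detector, t + offset + window)
            pairs.extend((i, j) for j in range(lo, hi))
        return pairs
    ```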

  18. Rheology of corn stover slurries during fermentation to ethanol

    NASA Astrophysics Data System (ADS)

    Ghosh, Sanchari; Epps, Brenden; Lynd, Lee

    2017-11-01

    In typical processes that convert cellulosic biomass into ethanol fuel, solubilization of the biomass is carried out by saccharolytic enzymes; however, these enzymes require an expensive pretreatment step to make the biomass accessible for solubilization (and subsequent fermentation). We have proposed a potentially-less-expensive approach using the bacterium Clostridium thermocellum, which can initiate fermentation without pretreatment. Moreover, we have proposed a ``cotreatment'' process, in which fermentation and mechanical milling occur alternately so as to achieve the highest ethanol yield for the least milling energy input. In order to inform the energetic requirements of cotreatment, we experimentally characterized the rheological properties of corn stover slurries at various stages of fermentation. Results show that a corn stover slurry is a yield stress fluid, with shear thinning behavior well described by a power law model. Viscosity decreases dramatically upon fermentation, controlling for variables such as solids concentration and particle size distribution. To the authors' knowledge, this is the first study to characterize the changes in the physical properties of biomass during fermentation by a thermophilic bacterium.
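    The yield-stress, shear-thinning behavior described above is commonly captured by a Herschel-Bulkley fit, tau = tau_y + K * gamma_dot**n. The sketch below fits synthetic data as a stand-in for measured flow curves; the authors' power-law model may differ in detail.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def herschel_bulkley(gamma_dot, tau_y, K, n):
        return tau_y + K * gamma_dot**n

    gamma_dot = np.logspace(-1, 2, 20)                   # shear rates, 1/s
    tau = herschel_bulkley(gamma_dot, 15.0, 4.0, 0.45)   # synthetic "measurements"
    tau += np.random.default_rng(1).normal(0.0, 0.5, tau.size)

    (tau_y, K, n), _ = curve_fit(herschel_bulkley, gamma_dot, tau, p0=(1.0, 1.0, 0.5))
    print(f"yield stress {tau_y:.1f} Pa, consistency {K:.2f}, flow index {n:.2f}")
    ```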

  19. Electrical Properties of Reactive Liquid Crystal Semiconductors

    NASA Astrophysics Data System (ADS)

    McCulloch, Iain; Coelle, Michael; Genevicius, Kristijonas; Hamilton, Rick; Heckmeier, Michael; Heeney, Martin; Kreouzis, Theo; Shkunov, Maxim; Zhang, Weimin

    2008-01-01

    Fabrication of display products by low cost printing technologies such as ink jet, gravure offset lithography and flexography requires solution processable semiconductors for the backplane electronics. The products will typically be of lower performance than polysilicon transistors, but comparable to amorphous silicon. A range of prototypes are under development, including rollable electrophoretic displays, active matrix liquid crystal displays (AMLCD's), and flexible organic light-emitting diode (OLED) displays. Organic semiconductors that offer both electrical performance and stability with respect to storage and operation under ambient conditions are required. This work describes the initial evaluation of reactive mesogen semiconductors, which can polymerise within mesophase temperatures, “freezing in” the order in crosslinked domains. These crosslinked domains offer mechanical stability and are inert to solvent exposure in further processing steps. Reactive mesogens containing conjugated aromatic cores, designed to facilitate charge transport and provide good oxidative stability, were prepared and their liquid crystalline properties evaluated. Both time-of-flight and field effect transistor devices were prepared and their electrical characterisation reported.

  20. A hydromorphological framework for the evaluation of e-flows

    NASA Astrophysics Data System (ADS)

    Bussettini, Martina; Rinaldi, Massimo; Grant, Gordon

    2017-04-01

    Anthropogenic alteration of hydromorphological processes in rivers is a major factor that diminishes river health and undermines environmental objectives envisaged by river protection policies. Specifying environmental flows to address those impacts can be a key strategy for the maintenance of functional river processes and the achievement of those objectives. Environmental flows are determined by various methods and approaches, based primarily on hydrological and/or hydraulic evaluations, although holistic methodologies, considering the many interacting factors that structure aquatic ecosystems, including sediments, are increasingly used. Hydrological and geomorphological processes are highly coupled and any change in one typically affects the other. The coupling varies over different spatial and temporal scales, and changing either hydrological or geomorphological processes can result in alteration of river habitats, ultimately impacting ecological processes. In spite of these linkages, current restoration approaches typically focus only on changes in the hydrological regime as a means of promoting ecological enhancements. Neglecting sediment transport and its interaction with flow in shaping riverine habitats is likely to result not only in minor or no enhancement of the ecology, but may also increase the costs of water use. A more integrated view of how human activities jointly affect sediment regime, river morphology and river flows is therefore needed in order to determine the most effective actions to rehabilitate river processes to desired states. These states involve considerations of the combination of intrinsic ("natural") conditions (e.g. river sensitivity and morphological potential, off-site conditions) and socio-economic constraints. The evaluation of such factors, the analysis of different scenarios, and the selection of appropriate actions require the contextualization of river reaches within a wider spatial-temporal hydromorphological framework. Here we present such a general multiscale, process-based hydromorphological framework, and discuss its application to the problem of how best to analyse and estimate e-flows.

  1. Activation of bean (Phaseolus vulgaris) α-amylase inhibitor requires proteolytic processing of the proprotein

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pueyo, J.J.; Hunt, D.C.; Chrispeels, M.J.

    Seeds of the common bean (Phaseolus vulgaris) contain a plant defense protein that inhibits the α-amylases of mammals and insects. This α-amylase inhibitor (αAI) is synthesized as a proprotein on the endoplasmic reticulum and is proteolytically processed after arrival in the protein storage vacuoles to polypeptides of relative molecular weight (Mr) 15,000 to 18,000. The authors report two types of evidence that proteolytic processing is linked to activation of the inhibitory activity. First, by surveying seed extracts of wild accessions of P. vulgaris and other species in the genus Phaseolus, they found that antibodies to αAI recognize large (Mr 30,000-35,000) polypeptides as well as typical αAI processing products (Mr 15,000-18,000). αAI activity was found in all extracts that had the typical αAI processed polypeptides, but was absent from seed extracts that lacked such polypeptides. Second, they made a mutant αAI in which asparagine-77 is changed to aspartic acid-77. This mutation slows down the proteolytic processing of pro-αAI when the gene is expressed in tobacco. When pro-αAI was separated from mature αAI by gel filtration, pro-αAI was found not to have α-amylase inhibitory activity. The authors interpret these results to mean that formation of the active inhibitor is causally related to proteolytic processing of the proprotein. They suggest that the polypeptide cleavage removes a conformational constraint on the precursor to produce the biochemically active molecule. 43 refs., 5 figs., 1 tab.

  2. Topology optimization for nonlinear dynamic problems: Considerations for automotive crashworthiness

    NASA Astrophysics Data System (ADS)

    Kaushik, Anshul; Ramani, Anand

    2014-04-01

    Crashworthiness of automotive structures is most often engineered after an optimal topology has been arrived at using other design considerations. This study is an attempt to incorporate crashworthiness requirements upfront in the topology synthesis process using a mathematically consistent framework. It proposes the use of equivalent linear systems from the nonlinear dynamic simulation in conjunction with a discrete-material topology optimizer. Velocity and acceleration constraints are consistently incorporated in the optimization set-up. Issues specific to crash problems due to the explicit solution methodology employed, nature of the boundary conditions imposed on the structure, etc. are discussed and possible resolutions are proposed. A demonstration of the methodology on two-dimensional problems that address some of the structural requirements and the types of loading typical of frontal and side impact is provided in order to show that this methodology has the potential for topology synthesis incorporating crashworthiness requirements.

  3. An Analysis of an Automatic Coolant Bypass in the International Space Station Node 2 Internal Active Thermal Control System

    NASA Technical Reports Server (NTRS)

    Clanton, Stephen E.; Holt, James M.; Turner, Larry D. (Technical Monitor)

    2001-01-01

    A challenging part of International Space Station (ISS) thermal control design is the ability to incorporate design changes into an integrated system without negatively impacting performance. The challenge presents itself in that the typical ISS Internal Active Thermal Control System (IATCS) consists of an integrated hardware/software system that provides active coolant resources to a variety of users. Software algorithms control the IATCS to specific temperatures, flow rates, and pressure differentials in order to meet the user-defined requirements. What may seem to be small design changes imposed on the system may in fact result in system instability or the temporary inability to meet user requirements. The purpose of this paper is to provide a brief description of the solution process and analyses used to implement one such design change that required the incorporation of an automatic coolant bypass in the ISS Node 2 element.

  4. Creation of digital contours that approach the characteristics of cartographic contours

    USGS Publications Warehouse

    Tyler, Dean J.; Greenlee, Susan K.

    2012-01-01

    The capability to easily create digital contours using commercial off-the-shelf (COTS) software has existed for decades. Out-of-the-box raw contours are suitable for many scientific applications without pre- or post-processing; however, cartographic applications typically require additional improvements. For example, raw contours generally require smoothing before placement on a map. Cartographic contours must also conform to certain spatial/logical rules; for example, contours may not cross waterbodies. The objective was to create contours that match as closely as possible the cartographic contours produced by manual methods on the 1:24,000-scale, 7.5-minute Topographic Map series. This report outlines the basic approach, describes a variety of problems that were encountered, and discusses solutions. Many of the challenges described herein were the result of imperfect input raster elevation data and the requirement to have the contours integrated with hydrographic features from the National Hydrography Dataset (NHD).
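    One typical post-processing step named above is smoothing raw contour vertices before cartographic placement. The sketch below applies a simple moving average to a contour polyline; a real workflow would also enforce the spatial/logical rules mentioned, such as contours not crossing waterbodies.

    ```python
    import numpy as np

    def smooth_contour(xy, window=5):
        """Moving-average smoothing of an (N, 2) vertex array; window must be odd."""
        kernel = np.ones(window) / window
        pad = window // 2
        padded = np.pad(xy, ((pad, pad), (0, 0)), mode="edge")  # hold endpoints
        return np.column_stack([np.convolve(padded[:, k], kernel, mode="valid")
                                for k in range(2)])
    ```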

  5. Component-Level Selection and Qualification for the Global Ecosystem Dynamics Investigation (GEDI) Laser Altimeter Transmitter

    NASA Technical Reports Server (NTRS)

    Frese, Erich A.; Chiragh, Furqan L.; Switzer, Robert; Vasilyev, Aleksey A.; Thomes, Joe; Coyle, D. Barry; Stysley, Paul R.

    2018-01-01

    Flight-quality solid-state lasers require a unique and extensive set of testing and qualification processes, at both the system and component levels, to ensure the laser's promised performance. As important as the overall laser transmitter design is, the quality and performance of individual subassemblies, optics, and electro-optics dictate the final laser unit's quality. The Global Ecosystem Dynamics Investigation (GEDI) laser transmitters employ all the usual components typical for a diode-pumped, solid-state laser, yet each must go through its own individual process of specification, modeling, performance demonstration, inspection, and destructive testing. These qualification processes and results for the laser crystals, laser diode arrays, electro-optics, and optics will be reviewed, as well as the relevant critical issues encountered, prior to their installation in the GEDI flight laser units.

  6. A Core Plug and Play Architecture for Reusable Flight Software Systems

    NASA Technical Reports Server (NTRS)

    Wilmot, Jonathan

    2006-01-01

    The Flight Software Branch, at Goddard Space Flight Center (GSFC), has been working on a run-time approach to facilitate a formal software reuse process. The reuse process is designed to enable rapid development and integration of high-quality software systems and to more accurately predict development costs and schedule. Previous reuse practices have been somewhat successful when the same teams are moved from project to project. But this typically requires taking the software system in an all-or-nothing approach where useful components cannot be easily extracted from the whole. As a result, the system is less flexible and scalable with limited applicability to new projects. This paper will focus on the rationale behind, and implementation of the run-time executive. This executive is the core for the component-based flight software commonality and reuse process adopted at Goddard.

  7. Pupillometry reveals reduced unconscious emotional reactivity in autism.

    PubMed

    Nuske, Heather J; Vivanti, Giacomo; Hudry, Kristelle; Dissanayake, Cheryl

    2014-09-01

    Recent theoretical conceptualisations have suggested that emotion processing impairments in autism stem from disruption to the sub-cortical, rapid emotion-processing system. We argue that a clear way to ascertain whether this system is affected in autism is by measuring unconscious emotional reactivity. Using backwards masking, we presented fearful expressions non-consciously (subliminally) as well as consciously (supraliminally), and measured pupillary responses as an index of emotional reactivity in 19 children with autism and 19 typically developing children, aged 2-5 years. The pupillary responses of the children with autism revealed reduced unconscious emotional reactivity, with no group differences on consciously presented emotion. Together, these results indicate a hyporesponsiveness to non-consciously presented emotion suggesting a fundamental difference in emotion processing in autism, which requires consciousness and more time. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Electrical-assisted double side incremental forming and processes thereof

    DOEpatents

    Roth, John; Cao, Jian

    2014-06-03

    A process for forming a sheet metal component using an electric current passing through the component is provided. The process can include providing a double side incremental forming machine, the machine operable to perform a plurality of double side incremental deformations on the sheet metal component and also apply an electric direct current to the sheet metal component during at least part of the forming. The direct current can be applied before or after the forming has started and/or be terminated before or after the forming has stopped. The direct current can be applied to any portion of the sheet metal. The electrical assistance can reduce the magnitude of force required to produce a given amount of deformation, increase the amount of deformation exhibited before failure and/or reduce any springback typically exhibited by the sheet metal component.

  9. Real-time optimizations for integrated smart network camera

    NASA Astrophysics Data System (ADS)

    Desurmont, Xavier; Lienard, Bruno; Meessen, Jerome; Delaigle, Jean-Francois

    2005-02-01

    We present an integrated real-time smart network camera. The system is composed of an image sensor, an embedded PC-based electronic card for image processing, and network capabilities. The application detects events of interest in visual scenes, raises alarms, and computes statistics. The system also produces metadata that can be shared with other cameras in a network. We describe the requirements of such a system and then show how its design is optimized to process and compress video in real time. Typical video-surveillance algorithms such as background differencing, tracking, and event detection must be highly optimized and simplified to run on this hardware. To achieve a good fit between hardware and software in this lightweight embedded system, the software is managed on top of the Java-based middleware specification established by the OSGi Alliance. Software and hardware can be integrated easily in complex environments thanks to the Java Real-Time specification for the virtual machine and several network- and service-oriented Java specifications (such as RMI and Jini). Finally, we report outcomes and typical case studies for such a camera, such as counter-flow detection.
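    A minimal sketch of the background-differencing step such a camera performs, written with OpenCV's Python bindings purely for illustration (the authors' system is Java/OSGi; the camera index, blur size, and 500-pixel event threshold are assumptions):

    ```python
    import cv2

    # MOG2 maintains a per-pixel Gaussian-mixture model of the background.
    subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=False)

    cap = cv2.VideoCapture(0)  # local camera; substitute a video file path
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)   # nonzero pixels = moving foreground
        mask = cv2.medianBlur(mask, 5)   # suppress speckle noise
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        # Flag an "event" when a sufficiently large moving region appears.
        if any(cv2.contourArea(c) > 500 for c in contours):
            print("event: moving object detected")
    cap.release()
    ```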

  10. Computer model to simulate testing at the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Owens, Lewis R., Jr.; Wahls, Richard A.; Hannon, Judith A.

    1995-01-01

    A computer model has been developed to simulate the processes involved in the operation of the National Transonic Facility (NTF), a large cryogenic wind tunnel at the Langley Research Center. The simulation was verified by comparing the simulated results with previously acquired data from three experimental wind tunnel test programs in the NTF. The comparisons suggest that the computer model simulates reasonably well the processes that determine the liquid nitrogen (LN2) consumption, electrical consumption, fan-on time, and the test time required to complete a test plan at the NTF. From these limited comparisons, it appears that the results from the simulation model are generally within about 10 percent of the actual NTF test results. The use of actual data acquisition times in the simulation produced better estimates of the LN2 usage, as expected. Additional comparisons are needed to refine the model constants. The model will typically produce optimistic results since the times and rates included in the model are typically the optimum values. Any deviation from the optimum values will lead to longer times or increased LN2 and electrical consumption for the proposed test plan. Computer code operating instructions and listings of sample input and output files have been included.

  11. Improving bioaerosol exposure assessments of composting facilities — Comparative modelling of emissions from different compost ages and processing activities

    NASA Astrophysics Data System (ADS)

    Taha, M. P. M.; Drew, G. H.; Tamer, A.; Hewings, G.; Jordinson, G. M.; Longhurst, P. J.; Pollard, S. J. T.

    We present bioaerosol source term concentrations from passive and active composting sources and compare emissions from green waste compost aged 1, 2, 4, 6, 8, 12 and 16 weeks. Results reveal that the age of compost has little effect on the bioaerosol concentrations emitted for passive windrow sources. However, emissions from turning compost during the early stages may be higher than during the later stages of the composting process. The bioaerosol emissions from passive sources were in the range of 10³-10⁴ cfu m⁻³, with releases from active sources typically 1-log higher. We propose improvements to current risk assessment methodologies by examining emission rates and the differences between two air dispersion models for the prediction of downwind bioaerosol concentrations at off-site points of exposure. The SCREEN3 model provides a more precautionary estimate of the source depletion curves of bioaerosol emissions in comparison to ADMS 3.3. The results from both models predict that bioaerosol concentrations decrease to below typical background concentrations before 250 m, the distance at which the regulator in England and Wales may require a risk assessment to be completed.

  12. Attention, working memory, and grammaticality judgment in typical young adults.

    PubMed

    Smith, Pamela A

    2011-06-01

    To examine resource allocation and sentence processing, this study examined the effects of auditory distraction on grammaticality judgment (GJ) of sentences varied by semantics (reversibility) and short-term memory requirements. Experiment 1: Typical young adult females (N = 60) completed a whole-sentence GJ task in distraction (Quiet, Noise, or Talk). Participants judged grammaticality of Passive sentences varied by length, grammaticality, and reversibility. Reaction time (RT) data were analyzed using a mixed analysis of variance. Experiment 2: A similar group completed a self-paced reading GJ task using similar materials. Experiment 1: Participants responded faster to Bad and to Nonreversible sentences, and in the Talk distraction. The slowest RTs were noted for Good-Reversible-Padded sentences in the Quiet condition. Experiment 2: Distraction did not differentially affect RTs for sentence components. Verb RTs were slower for Reversible sentences. Results suggest that narrative distraction affected GJ, but by speeding responses, not slowing them. Sentence variables of memory and reversibility slowed RTs, but narrative distraction resulted in faster processing times regardless of individual sentence variables. More explicit, deliberate tasks (self-paced reading) showed less effect from distraction. Results are discussed in terms of recent theories about auditory distraction.

  13. Effect of Surface Treatments on Electron Beam Freeform Fabricated Aluminum Structures

    NASA Technical Reports Server (NTRS)

    Taminger, Karen M. B.; Hafley, Robert A.; Fahringer, David T.; Martin, Richard E.

    2004-01-01

    Electron beam freeform fabrication (EBF3) parts exhibit a ridged surface finish typical of many layer-additive processes. Thus, post-processing is required to produce a net shape with a smooth surface finish. High-speed milling, wire electrical discharge machining (EDM), electron beam glazing, and glass bead blasting were performed on EBF3-built 2219 aluminum alloy parts to reduce or eliminate the ridged surface features. Surface roughness, surface residual stress state, and microstructural characteristics were examined for each of the different surface treatments to assess the quality and effect of the surface treatments on the underlying material. The analysis evaluated the effectiveness of the different surface finishing techniques for achieving a smooth surface finish on an electron beam freeform fabricated part.

  14. Computing diffusivities from particle models out of equilibrium

    NASA Astrophysics Data System (ADS)

    Embacher, Peter; Dirr, Nicolas; Zimmer, Johannes; Reina, Celia

    2018-04-01

    A new method is proposed to numerically extract the diffusivity of a (typically nonlinear) diffusion equation from underlying stochastic particle systems. The proposed strategy requires the system to be in local equilibrium and have Gaussian fluctuations but it is otherwise allowed to undergo arbitrary out-of-equilibrium evolutions. This could be potentially relevant for particle data obtained from experimental applications. The key idea underlying the method is that finite, yet large, particle systems formally obey stochastic partial differential equations of gradient flow type satisfying a fluctuation-dissipation relation. The strategy is here applied to three classic particle models, namely independent random walkers, a zero-range process and a symmetric simple exclusion process in one space dimension, to allow the comparison with analytic solutions.
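    For the simplest of the three benchmark systems, independent random walkers, the extracted diffusivity can be checked against a textbook mean-squared-displacement (MSD) estimate. The sketch below is that baseline sanity check only, not the fluctuation-based method the paper proposes; all parameters are arbitrary:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Independent 1-D random walkers: each step is +/- dx with equal
    # probability, so the exact diffusivity is D = dx**2 / (2 * dt).
    n_walkers, n_steps, dx, dt = 5_000, 500, 1.0, 1.0
    steps = rng.choice([-dx, dx], size=(n_walkers, n_steps))
    positions = steps.cumsum(axis=1)

    # Estimate D from the mean squared displacement, MSD(t) = 2 * D * t.
    t = dt * np.arange(1, n_steps + 1)
    msd = (positions ** 2).mean(axis=0)
    D_est = np.polyfit(t, msd, 1)[0] / 2.0

    print(f"estimated D = {D_est:.3f}, exact D = {dx**2 / (2 * dt):.3f}")
    ```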

  15. SeqPig: simple and scalable scripting for large sequencing data sets in Hadoop.

    PubMed

    Schumacher, André; Pireddu, Luca; Niemenmaa, Matti; Kallio, Aleksi; Korpelainen, Eija; Zanetti, Gianluigi; Heljanko, Keijo

    2014-01-01

    Hadoop MapReduce-based approaches have become increasingly popular due to their scalability in processing large sequencing datasets. However, as these methods typically require in-depth expertise in Hadoop and Java, they are still out of reach of many bioinformaticians. To solve this problem, we have created SeqPig, a library and a collection of tools to manipulate, analyze and query sequencing datasets in a scalable and simple manner. SeqPig scripts use the Hadoop-based distributed scripting engine Apache Pig, which automatically parallelizes and distributes data processing tasks. We demonstrate SeqPig's scalability over many computing nodes and illustrate its use with example scripts. Available under the open-source MIT license at http://sourceforge.net/projects/seqpig/

  16. Models of stochastic gene expression

    NASA Astrophysics Data System (ADS)

    Paulsson, Johan

    2005-06-01

    Gene expression is an inherently stochastic process: Genes are activated and inactivated by random association and dissociation events, transcription is typically rare, and many proteins are present in low numbers per cell. The last few years have seen an explosion in the stochastic modeling of these processes, predicting protein fluctuations in terms of the frequencies of the probabilistic events. Here I discuss commonalities between theoretical descriptions, focusing on a gene-mRNA-protein model that includes most published studies as special cases. I also show how expression bursts can be explained as simplistic time-averaging, and how generic approximations can allow for concrete interpretations without requiring concrete assumptions. Measures and nomenclature are discussed to some extent and the modeling literature is briefly reviewed.
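    A minimal Gillespie (stochastic simulation algorithm) sketch of the gene-mRNA-protein picture discussed here, with arbitrary illustrative rate constants; real analyses would add promoter on/off switching and fitted parameters:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Two-stage model: mRNA born at rate k_m, degraded at g_m * m;
    # protein born at k_p * m, degraded at g_p * p.
    # Steady-state means: <m> = k_m / g_m and <p> = <m> * k_p / g_p.
    k_m, g_m, k_p, g_p = 2.0, 1.0, 10.0, 0.1

    def gillespie(t_end):
        t, m, p = 0.0, 0, 0
        while t < t_end:
            rates = np.array([k_m, g_m * m, k_p * m, g_p * p])
            total = rates.sum()          # always > 0 because k_m > 0
            t += rng.exponential(1.0 / total)
            r = rng.choice(4, p=rates / total)
            if r == 0:   m += 1
            elif r == 1: m -= 1
            elif r == 2: p += 1
            else:        p -= 1
        return m, p

    samples = np.array([gillespie(100.0) for _ in range(200)])
    print("mean mRNA   ", samples[:, 0].mean(), "theory:", k_m / g_m)
    print("mean protein", samples[:, 1].mean(), "theory:", k_m / g_m * k_p / g_p)
    ```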

  17. Detecting Phase Boundaries in Hard-Sphere Suspensions

    NASA Technical Reports Server (NTRS)

    McDowell, Mark; Rogers, Richard B.; Gray, Elizabeth

    2009-01-01

    A special image-data-processing technique has been developed for use in experiments that involve observation, via optical microscopes equipped with electronic cameras, of moving boundaries between the colloidal-solid and colloidal-liquid phases of colloidal suspensions of monodisperse hard spheres. During an experiment, it is necessary to adjust the position of a microscope to keep the phase boundary within view. A boundary typically moves at a speed of the order of microns per hour. Because an experiment can last days or even weeks, it is impractical to require human intervention to keep the phase boundary in view. The present image-data-processing technique yields results within a computation time short enough to enable generation of automated-microscope-positioning commands to track the moving phase boundary.

  18. How to write an educational research grant: AMEE Guide No. 101.

    PubMed

    Blanco, Maria A; Gruppen, Larry D; Artino, Anthony R; Uijtdehaage, Sebastian; Szauter, Karen; Durning, Steven J

    2016-01-01

    Writing an educational research grant in health profession education is challenging, not only for those doing it for the first time but also for more experienced scholars. The intensity of the competition, the peculiarities of the grant format, the risk of rejection, and the time required are among the many obstacles that can prevent educational researchers with interesting and important ideas from writing a grant that could provide the funding needed to turn their scholarly ideas into reality. The aim of this AMEE Guide is to clarify the grant-writing process by (a) explaining the mechanics and structure of a typical educational research grant proposal, and (b) sharing tips and strategies for making the process more manageable.

  19. A study of swing-curve physics in diffraction-based overlay

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, Kaustuve; den Boef, Arie; Storms, Greet; van Heijst, Joost; Noot, Marc; An, Kevin; Park, Noh-Kyoung; Jeon, Se-Ra; Oh, Nang-Lyeom; McNamara, Elliott; van de Mast, Frank; Oh, SeungHwa; Lee, Seung Yoon; Hwang, Chan; Lee, Kuntack

    2016-03-01

    With the increase of process complexity in advanced nodes, the requirements on process robustness in overlay metrology continue to tighten. Especially with the introduction of newer materials in the film stack, along with typical stack variations (thickness, optical properties, profile asymmetry, etc.), the signal formation physics in diffraction-based overlay (DBO) becomes an important aspect to apply in overlay metrology target and recipe selection. In order to address the signal formation physics, an effort is made to study the swing-curve phenomena across wavelength and polarization on production stacks, using simulations as well as experimental DBO measurements. The results provide a wealth of information on target and recipe selection for robustness. Details from simulation and measurements will be reported in this technical publication.

  20. Development of Level 2 Calibration and Validation Plans for GOES-R; What is a RIMP?

    NASA Technical Reports Server (NTRS)

    Kopp, Thomas J.; Belsma, Leslie O.; Mollner, Andrew K.; Sun, Ziping; Deluccia, Frank

    2017-01-01

    Calibration and Validation (CalVal) plans for Geostationary Operational Environmental Satellite version R (GOES-R) Level 2 (L2) products were documented via Resource, Implementation, and Management Plans (RIMPs) for all of the official L2 products required from the GOES-R Advanced Baseline Imager (ABI). In 2015 the GOES-R program decided to replace the typical CalVal plans with RIMPs that covered, for a given L2 product, what was required from that product, how it would be validated, and what tools would be used to do so. Similar to Level 1b products, the intent was to cover the full spectrum of planning required for the CalVal of L2 ABI products. Instead of focusing on step-by-step procedures, the RIMPs concentrated on the criteria for each stage of the validation process (Beta, Provisional, and Full Validation) and the many elements required to prove when each stage was reached.

  1. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    PubMed Central

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917
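    A hypothetical example of the kind of digital trait such pipelines extract; this is not Image Harvest's API, and the function name and excess-green threshold are illustrative assumptions:

    ```python
    import numpy as np

    def projected_plant_area(rgb: np.ndarray) -> int:
        """Count plant pixels in an RGB image (H x W x 3, values 0-255)
        using the excess-green index ExG = 2g - r - b on chromatic
        coordinates, a common heuristic for isolating vegetation."""
        r, g, b = [rgb[..., i].astype(float) / 255.0 for i in range(3)]
        total = r + g + b + 1e-9
        exg = 2 * (g / total) - (r / total) - (b / total)
        return int((exg > 0.05).sum())  # threshold chosen empirically

    # Toy example: a synthetic 100x100 image with a 20x20 green "plant".
    img = np.zeros((100, 100, 3), dtype=np.uint8)
    img[40:60, 40:60] = (30, 200, 30)
    print(projected_plant_area(img))  # -> 400
    ```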

  2. Pathogen and Particle Associations in Wastewater: Significance and Implications for Treatment and Disinfection Processes.

    PubMed

    Chahal, C; van den Akker, B; Young, F; Franco, C; Blackbeard, J; Monis, P

    2016-01-01

    Disinfection guidelines exist for pathogen inactivation in potable water and recycled water, but wastewater with high numbers of particles can be more difficult to disinfect, making compliance with the guidelines problematic. Disinfection guidelines specify that drinking water with turbidity ≥1 Nephelometric Turbidity Units (NTU) is not suitable for disinfection and therefore not fit for purpose. Treated wastewater typically has higher concentrations of particles (1-10 NTU for secondary treated effluent). Two processes widely used for disinfecting wastewater are chlorination and ultraviolet radiation. In both cases, particles in wastewater can interfere with disinfection and can significantly increase treatment costs by increasing operational expenditure (chemical demand, power consumption) or infrastructure costs by requiring additional treatment processes to achieve the required levels of pathogen inactivation. Many microorganisms (viruses, bacteria, protozoans) associate with particles, which can allow them to survive disinfection processes and cause a health hazard. Improved understanding of this association will enable development of cost-effective treatment, which will become increasingly important as indirect and direct potable reuse of wastewater becomes more widespread in both developed and developing countries. This review provides an overview of wastewater and associated treatment processes, the pathogens in wastewater, the nature of particles in wastewater and how they interact with pathogens, and how particles can impact disinfection processes. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Critical review of real-time methods for solid waste characterisation: Informing material recovery and fuel production.

    PubMed

    Vrancken, C; Longhurst, P J; Wagland, S T

    2017-03-01

    Waste management processes generally represent a significant loss of material, energy and economic resources, so legislation and financial incentives are being implemented to improve the recovery of these valuable resources whilst reducing contamination levels. Material recovery and waste derived fuels are potentially valuable options being pursued by industry, using mechanical and biological processes incorporating sensor and sorting technologies developed and optimised for recycling plants. In its current state, waste management presents similarities to other industries that could improve their efficiencies using process analytical technology tools. Existing sensor technologies could be used to measure critical waste characteristics, providing data required by existing legislation, potentially aiding waste treatment processes and assisting stakeholders in decision making. Optical technologies offer the most flexible solution to gather real-time information applicable to each of the waste mechanical and biological treatment processes used by industry. In particular, combinations of optical sensors in the visible and near-infrared range from 800 nm to 2500 nm of the spectrum, and different mathematical techniques, are able to provide material information and fuel properties with typical performance levels between 80% and 90%. These sensors could be used not only to aid waste processes, but also to provide most waste quality indicators required by existing legislation, whilst offering better tools to the stakeholders. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Central tendency effects in time interval reproduction in autism

    PubMed Central

    Karaminis, Themelis; Cicchini, Guido Marco; Neil, Louise; Cappagli, Giulia; Aagten-Murphy, David; Burr, David; Pellicano, Elizabeth

    2016-01-01

    Central tendency, the tendency of judgements of quantities (lengths, durations, etc.) to gravitate towards their mean, is one of the most robust perceptual effects. A Bayesian account has recently suggested that central tendency reflects the integration of noisy sensory estimates with prior knowledge representations of a mean stimulus, serving to improve performance. The process is flexible, so prior knowledge is weighted more heavily when sensory estimates are imprecise, requiring more integration to reduce noise. In this study we measure central tendency in autism to evaluate a recent theoretical hypothesis suggesting that autistic perception relies less on prior knowledge representations than typical perception. If true, autistic children should show less central tendency than theoretically predicted from their temporal resolution. We tested autistic and age- and ability-matched typical children in two child-friendly tasks: (1) a time interval reproduction task, measuring central tendency in the temporal domain; and (2) a time discrimination task, assessing temporal resolution. Central tendency decreased with age in typical development, while temporal resolution improved. Autistic children performed far worse in temporal discrimination than the matched controls. Computational simulations suggested that central tendency was much weaker in autistic children than predicted by theoretical modelling, given their poor temporal resolution. PMID:27349722
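    Under the Bayesian account sketched above, the reproduced duration is a reliability-weighted average of the noisy sensory estimate and the prior mean; a standard Gaussian formulation (notation mine, not necessarily the authors') is

    ```latex
    \hat{d} \;=\; w\, d_{\mathrm{sens}} + (1 - w)\,\mu_{\mathrm{prior}},
    \qquad
    w \;=\; \frac{\sigma_{\mathrm{prior}}^{2}}
                 {\sigma_{\mathrm{prior}}^{2} + \sigma_{\mathrm{sens}}^{2}}.
    ```

    Poorer temporal resolution (larger sigma_sens) lowers w and predicts stronger regression to the prior mean, which is exactly the quantitative link the computational simulations test against the autistic children's data.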

  5. Aging Studies of VCE Dismantlement Returns

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Letant, S; Alviso, C; Pearson, M

    2011-10-17

    VCE is an ethylene/vinyl acetate/vinyl alcohol terpolymer binder for filled elastomers which is designed to accept high filler loadings. Filled elastomer parts consist of the binder (VCE), a curing agent (Hylene MP, diphenol-4,4′-methylenebis(phenylcarbamate)), a processing aid (LS, lithium stearate), and filler particles (typically 70% fraction by weight). The curing of the filled elastomer parts occurs from the heat-activated reaction between the hydroxyl groups of VCE with the Hylene MP curing agent, resulting in a cross-linked network. The final vinyl acetate content is typically between 34.9 and 37.9%, while the vinyl alcohol content is typically between 1.27 and 1.78%. Surveillance data for this material is both scarce and scattered, complicating the assessment of any aging trends in systems. In addition, most of the initial surveillance efforts focused on mechanical properties such as hardness and tensile strength, and chemical information is therefore lacking. Material characterization and aging studies had been performed on previous formulations of the VCE material but the Ethylene Vinyl Acetate (EVA) starting copolymer is no longer commercially available. New formulations with replacement EVA materials are currently being established and will require characterization as well as updated aging models.

  6. Mission Engineering of a Rapid Cycle Spacecraft Logistics Fleet

    NASA Technical Reports Server (NTRS)

    Holladay, Jon; McClendon, Randy (Technical Monitor)

    2002-01-01

    The requirement for logistics re-supply of the International Space Station has provided a unique opportunity for engineering the implementation of NASA's first dedicated pressurized logistics carrier fleet. The NASA fleet is comprised of three Multi-Purpose Logistics Modules (MPLM) provided to NASA by the Italian Space Agency in return for operations time aboard the International Space Station. Marshall Space Flight Center was responsible for oversight of the hardware development from preliminary design through acceptance of the third flight unit, and currently manages the flight hardware sustaining engineering and mission engineering activities. The actual MPLM Mission began prior to NASA acceptance of the first flight unit in 1999 and will continue until the de-commission of the International Space Station that is planned for 20xx. Mission engineering of the MPLM program requires a broad focus on three distinct yet inter-related operations processes: pre-flight, flight operations, and post-flight turn-around. Within each primary area exist several complex subsets of distinct and inter-related activities. Pre-flight processing includes the evaluation of carrier hardware readiness for space flight. This includes integration of payload into the carrier, integration of the carrier into the launch vehicle, and integration of the carrier onto the orbital platform. Flight operations include the actual carrier operations during flight and any required real-time ground support. Post-flight processing includes de-integration of the carrier hardware from the launch vehicle, de-integration of the payload, and preparation for returning the carrier to pre-flight staging. Typical space operations are engineered around the requirements and objectives of a dedicated mission on a dedicated operational platform (i.e. Launch or Orbiting Vehicle). The MPLM, however, has expanded this envelope by requiring operations with both vehicles during flight as well as pre-launch and post-landing operations. These unique requirements combined with a success-oriented schedule of four flights within a ten-month period have provided numerous opportunities for understanding and improving operations processes. Furthermore, it has increased the knowledge base of future Payload Carrier and Launch Vehicle hardware and requirement developments. Discussion of the process flows and target areas for process improvement are provided in the subject paper. Special emphasis is also placed on supplying guidelines for hardware development. The combination of process knowledge and hardware development knowledge will provide a comprehensive overview for future vehicle developments as related to integration and transportation of payloads.

  7. Prkci is required for a non-autonomous signal that coordinates cell polarity during cavitation.

    PubMed

    Mah, In Kyoung; Soloff, Rachel; Izuhara, Audrey K; Lakeland, Daniel L; Wang, Charles; Mariani, Francesca V

    2016-08-01

    Polarized epithelia define boundaries, spaces, and cavities within organisms. Cavitation, a process by which multicellular hollow balls or tubes are produced, is typically associated with the formation of organized epithelia. In order for these epithelial layers to form, cells must ultimately establish a distinct apical-basal polarity. Atypical PKCs have been proposed to be required for apical-basal polarity in diverse species. Here we show that while cells null for the Prkci isozyme exhibit some polarity characteristics, they fail to properly segregate apical-basal proteins, form a coordinated ectodermal epithelium, or participate in normal cavitation. A failure to cavitate could be due to an overgrowth of interior cells or to an inability of interior cells to die. Null cells, however, do not show a marked change in proliferation rate and are still capable of undergoing cell death, suggesting that alterations in these processes are not the predominant cause of the failed cavitation. Overexpression of BMP4 or EZRIN can partially rescue the phenotype, possibly by promoting cell death, polarity, and differentiation. However, neither is sufficient to provide the required cues to generate a polarized epithelium and fully rescue cavitation. Interestingly, when wildtype and Prkci(-/-) ES cells are mixed together, a polarized ectodermal epithelium forms and cavitation is rescued, likely due to the ability of wildtype cells to produce non-autonomous polarity cues. We conclude that Prkci is not required for cells to respond to these cues, though it is required to produce them. Together these findings indicate that environmental cues can facilitate the formation of polarized epithelia and that cavitation requires the proper coordination of multiple basic cellular processes, including proliferation, differentiation, cell death, and apical-basal polarization. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Metal mirror TMA, telescopes of the JSS product line: design and analysis

    NASA Astrophysics Data System (ADS)

    Kirschstein, Steffen; Koch, Amelia; Schöneich, Jürgen; Döngi, Frank

    2005-09-01

    For the growing market of low-cost multispectral pushbroom scanners for spaceborne Earth remote sensing, Jena-Optronik GmbH has developed the JSS product line. These instruments are typically operated on micro-satellites with strong resource constraints, which leads to instrument designs optimised for minimum size, mass, power consumption, and cost. From various customer requirements, Jena-Optronik has derived the JSS product line of low-cost optical spaceborne scanners in the visible wavelength range. Three-mirror anastigmat (TMA) telescope designs have become a widespread solution for fields of view from 2 to 12 deg. The design solution chosen by Jena-Optronik is based on all-aluminium telescopes. Novel ultra-precision milling and polishing techniques now make it possible to achieve the necessary optical surface quality for applications in the visible range. The TMA telescope optics of the JSS-56 imager will be accommodated onboard the RapidEye spacecraft. The JSS-56 TMA, with an F-number of 4.3, realises a swath width of 78 km with a ground pixel resolution of 6.5 m × 6.5 m. The aluminium mirrors are Ni-coated to achieve a suitable surface polish quality. This paper discusses typical requirements for the thermal design and the bimetallic effects of the mirrors. To achieve nearly diffraction-limited imaging, the typical surface irregularities due to the turning process have to be addressed in the ray-tracing models. Analysis and integration of real mirror data in the ZEMAX design software are demonstrated and compared with built-in standard tolerance concepts.

  9. Cookie- versus cracker-baking--what's the difference? Flour functionality requirements explored by SRC and alveography.

    PubMed

    Kweon, Meera; Slade, Louise; Levine, Harry; Gannon, Diane

    2014-01-01

    The many differences between cookie- and cracker-baking are discussed and described in terms of the functionality, and functional requirements, of the major biscuit ingredients--flour and sugar. Both types of products are similar in their major ingredients, but different in their formulas and processes. One of the most important and consequential differences between traditional cracker and cookie formulas is sugar (i.e., sucrose) concentration: usually lower than 30% in a typical cracker formula and higher than 30% in a typical cookie formula. Gluten development is facilitated in lower-sugar cracker doughs during mixing and sheeting; this is a critical factor linked to baked-cracker quality. Therefore, soft wheat flours with greater gluten quality and strength are typically preferred for cracker production. In contrast, the concentrated aqueous sugar solutions existing in high-sugar cookie doughs generally act as an antiplasticizer, compared with water alone, so gluten development during dough mixing and starch gelatinization/pasting during baking are delayed or prevented in most cookie systems. Traditional cookies and crackers are low-moisture baked goods, which are desirably made from flours with low water absorption [low water-holding capacity (WHC)], and low levels of damaged starch and water-soluble pentosans (i.e., water-accessible arabinoxylans). Rheological (e.g., alveography) and baking tests are often used to evaluate flour quality for baked-goods applications, but the solvent retention capacity (SRC) method (AACC 56-11) is a better diagnostic tool for predicting the functional contribution of each individual flour functional component, as well as the overall functionality of flours for cookie- and/or cracker-baking.

  10. A multilevel reuse system with source separation process for printing and dyeing wastewater treatment: A case study.

    PubMed

    Wang, Rui; Jin, Xin; Wang, Ziyuan; Gu, Wantao; Wei, Zhechao; Huang, Yuanjie; Qiu, Zhuang; Jin, Pengkang

    2018-01-01

    This paper proposes a new system of multilevel reuse with source separation in printing and dyeing wastewater (PDWW) treatment in order to dramatically improve the water reuse rate from the typical 35%. By analysing the characteristics of the sources and concentrations of pollutants produced in different printing and dyeing processes, special, highly, and less contaminated wastewaters (SCW, HCW, and LCW, respectively) were collected and treated separately. In particular, a large quantity of LCW was sequentially reused at multiple levels to meet the water quality requirements for different production processes. Based on this concept, a multilevel reuse system with a source separation process was established in a typical printing and dyeing enterprise. The water reuse rate increased dramatically to 62%, and the reclaimed water was reused in different printing and dyeing processes based on the water quality. This study provides promising leads in water management for wastewater reclamation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Method for sequestering CO2 and SO2 utilizing a plurality of waste streams

    DOEpatents

    Soong, Yee [Monroeville, PA; Allen, Douglas E [Salem, MA; Zhu, Chen [Monroe County, IN

    2011-04-12

    A neutralization/sequestration process is provided for concomitantly addressing capture and sequestration of both CO2 and SO2 from industrial gas byproduct streams. The invented process concomitantly treats and minimizes bauxite residues from aluminum production processes and brine wastewater from oil/gas production processes. The benefits of this integrated approach to coincidental treatment of multiple industrial waste byproduct streams include neutralization of caustic byproducts such as bauxite residue, thereby decreasing the risk associated with the long-term storage and potential environmental impact of storing caustic materials; decreasing or obviating the need for costly treatment of byproduct brines; eliminating the need to purchase CaO or similar scrubber reagents typically required for SO2 treatment of such gasses; and directly using CO2 from flue gas to neutralize bauxite residue/brine mixtures, without the need for costly separation of CO2 from the industrial byproduct gas stream by processes such as liquid amine-based scrubbers.

  12. Successive membrane separation processes simplify concentration of lipases produced by Aspergillus niger by solid-state fermentation.

    PubMed

    Reinehr, Christian Oliveira; Treichel, Helen; Tres, Marcus Vinicius; Steffens, Juliana; Brião, Vandré Barbosa; Colla, Luciane Maria

    2017-06-01

    In this study, we developed a simplified method for producing, separating, and concentrating lipases derived from solid-state fermentation of agro-industrial residues by filamentous fungi. First, we used Aspergillus niger to produce lipases with hydrolytic activity. We analyzed the separation and concentration of enzymes using membrane separation processes. The sequential use of microfiltration and ultrafiltration processes made it possible to obtain concentrates with enzymatic activities much higher than those in the initial extract. The permeate flux was higher than 60 L/(m² h) during microfiltration using 20- and 0.45-µm membranes and during ultrafiltration using 100- and 50-kDa membranes, where fouling was reversible during the filtration steps, thereby indicating that the fouling may be removed by cleaning processes. These results demonstrate the feasibility of lipase production using A. niger by solid-state fermentation of agro-industrial residues, followed by successive tangential filtration with membranes, which simplify the separation and concentration steps that are typically required in downstream processes.
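    The reported figure of merit is the permeate flux J = V / (A · t); a trivial helper makes the units explicit (the numbers below are hypothetical, not the study's data):

    ```python
    def permeate_flux(volume_l: float, area_m2: float, hours: float) -> float:
        """Permeate flux J = V / (A * t) in L/(m^2 h), the quantity the
        study reports exceeding 60 L/(m^2 h) during filtration."""
        return volume_l / (area_m2 * hours)

    # E.g., 30 L of permeate collected over a 0.1 m^2 membrane in 4 h:
    print(permeate_flux(30.0, 0.1, 4.0))  # -> 75.0 L/(m^2 h)
    ```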

  13. An Optimized Multicolor Point-Implicit Solver for Unstructured Grid Applications on Graphics Processing Units

    NASA Technical Reports Server (NTRS)

    Zubair, Mohammad; Nielsen, Eric; Luitjens, Justin; Hammond, Dana

    2016-01-01

    In the field of computational fluid dynamics, the Navier-Stokes equations are often solved using an unstructured-grid approach to accommodate geometric complexity. Implicit solution methodologies for such spatial discretizations generally require frequent solution of large tightly-coupled systems of block-sparse linear equations. The multicolor point-implicit solver used in the current work typically requires a significant fraction of the overall application run time. In this work, an efficient implementation of the solver for graphics processing units is proposed. Several factors present unique challenges to achieving an efficient implementation in this environment. These include the variable amount of parallelism available in different kernel calls, indirect memory access patterns, low arithmetic intensity, and the requirement to support variable block sizes. Here, the solver is reformulated to use standard sparse and dense Basic Linear Algebra Subprograms (BLAS) functions. However, numerical experiments show that the performance of the BLAS functions available in existing CUDA libraries is suboptimal for matrices representative of those encountered in actual simulations. Instead, optimized versions of these functions are developed. Depending on block size, the new implementations show performance gains of up to 7x over the existing CUDA library functions.
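    A minimal NumPy sketch of the multicolor point-implicit idea, with scalar unknowns standing in for the paper's variable-size dense blocks (illustrative only; the actual implementation runs optimized CUDA kernels). Unknowns sharing a color must not couple, so every point of a color can be relaxed simultaneously, which is the parallelism the GPU exploits:

    ```python
    import numpy as np

    def multicolor_point_implicit(A, b, colors, sweeps=200):
        """Colored point-implicit (Gauss-Seidel-like) relaxation:
        x_i <- (b_i - sum_{j != i} A_ij x_j) / A_ii, one color at a time."""
        x = np.zeros_like(b)
        D = np.diag(A)
        for _ in range(sweeps):
            for c in np.unique(colors):
                idx = np.flatnonzero(colors == c)
                # Adding back D*x cancels the self term contained in A @ x.
                r = b[idx] - A[idx] @ x + D[idx] * x[idx]
                x[idx] = r / D[idx]
        return x

    # Toy 1-D Poisson system; red-black coloring decouples neighbors.
    n = 8
    A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    b = np.ones(n)
    colors = np.arange(n) % 2
    x = multicolor_point_implicit(A, b, colors)
    print(np.allclose(A @ x, b, atol=1e-6))  # -> True
    ```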

  14. Labor of love. A model for planning human resource needs.

    PubMed

    Brady, F J

    1989-01-01

    Typically, the annual budgeting process is the hospital's only attempt to forecast human resource requirements. In times of rapid change, this traditional ad hoc approach is incapable of satisfying either the Catholic hospital's ethical obligations as an employer or its responsibilities to provide healthcare to the poor and suffering. Assumptions about future activity, including volume projections on admissions, patient days, and other services, influence the budgeting process to a large degree. Because the amount of work to be done and the number of employees required to do it are related, changes in demand for service immediately and directly affect staffing requirements. A hospital cannot achieve ethical human resource management or provide high-quality healthcare if inadequate planning forces management into a cycle of crisis-coping--reacting to this year's nursing shortage with a major recruiting effort and next year's financial crunch with a traumatic reduction in force. The human resource planning approach outlined here helps the hospital meet legitimate business needs while satisfying its ethical obligations. The model has four phases and covers a charge to the planning committee; committee appointments; announcements; the establishment of ground rules, focus, and task forces; and the work of each task force.

  15. Solid state light engines for bioanalytical instruments and biomedical devices

    NASA Astrophysics Data System (ADS)

    Jaffe, Claudia B.; Jaffe, Steven M.

    2010-02-01

    Lighting subsystems to drive 21st century bioanalysis and biomedical diagnostics face stringent requirements. Industrywide demands for speed, accuracy and portability mean illumination must be intense as well as spectrally pure, switchable, stable, durable and inexpensive. Ideally a common lighting solution could service these needs for numerous research and clinical applications. While this is a noble objective, the current technology of arc lamps, lasers, LEDs and most recently light pipes have intrinsic spectral and angular traits that make a common solution untenable. Clearly a hybrid solution is required to service the varied needs of the life sciences. Any solution begins with a critical understanding of the instrument architecture and specifications for illumination regarding power, illumination area, illumination and emission wavelengths and numerical aperture. Optimizing signal to noise requires careful optimization of these parameters within the additional constraints of instrument footprint and cost. Often the illumination design process is confined to maximizing signal to noise without the ability to adjust any of the above parameters. A hybrid solution leverages the best of the existing lighting technologies. This paper will review the design process for this highly constrained, but typical optical optimization scenario for numerous bioanalytical instruments and biomedical devices.

  16. In the year 2525, if x ray is still alive, if lithography can survive, they may find...

    NASA Astrophysics Data System (ADS)

    Nistler, John L.; Michael, Mark; Hause, Fred N.; Edwards, Richard D.

    1998-12-01

    Data and discussion are presented on the NTRM, the National Technology Roadmap, for reticles from a process-integration perspective. Specifically, two layers are considered in this paper: the gate layer, which is primarily a chrome-geometry mask with a lot of open glass, and a local interconnect layer, which is primarily a chrome plate using clear geometries. Information from other sources is used where appropriate, and actual in-house data is used to illustrate specific points. Demands from different customers for specific types of features tend to drive individual mask makers and their decisions on equipment purchases and processes; recognizing this, we attempt to help predict where integration approaches have either caused a lag in technology pushes or have actually accelerated certain requirements. Integration requirements that tend to push mask makers are discussed, along with typical design approaches in OPC and PSM that will either push technology forward or slow the trend toward smaller geometries. In addition, data will be presented showing how specific stepper characteristics may actually drive the customer's criteria, thus changing the requirements from customer to customer.

  17. Electrophysiological evidence for flexible goal-directed cue processing during episodic retrieval.

    PubMed

    Herron, Jane E; Evans, Lisa H; Wilding, Edward L

    2016-05-15

    A widely held assumption is that memory retrieval is aided by cognitive control processes that are engaged flexibly in service of memory retrieval and memory decisions. While there is some empirical support for this view, a notable exception is the absence of evidence for the flexible use of retrieval control in functional neuroimaging experiments requiring frequent switches between tasks with different cognitive demands. This absence is troublesome in so far as frequent switches between tasks mimic some of the challenges that are typically placed on memory outside the laboratory. In this experiment we instructed participants to alternate frequently between three episodic memory tasks requiring item recognition or retrieval of one of two different kinds of contextual information encoded in a prior study phase (screen location or encoding task). Event-related potentials (ERPs) elicited by unstudied items in the two tasks requiring retrieval of study context were reliably different, demonstrating for the first time that ERPs index task-specific processing of retrieval cues when retrieval goals change frequently. The inclusion of the item recognition task was a novel and important addition in this study, because only the ERPs elicited by unstudied items in one of the two context conditions diverged from those in the item recognition condition. This outcome constrains functional interpretations of the differences that emerged between the two context conditions and emphasises the utility of this baseline in functional imaging studies of retrieval processing operations. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  18. Color measurement of plastics - From compounding via pelletizing, up to injection molding and extrusion

    NASA Astrophysics Data System (ADS)

    Botos, J.; Murail, N.; Heidemeyer, P.; Kretschmer, K.; Ulmer, B.; Zentgraf, T.; Bastian, M.; Hochrein, T.

    2014-05-01

    The typical offline color measurement on injection-molded or pressed specimens is an expensive and time-consuming process. To optimize productivity and quality, it is desirable to measure color already during production. Several systems have therefore been developed to monitor color during the process, e.g., on melts, strands, pellets, the extrudate, or the injection-molded part. Different kinds of inline, online, and atline methods are compared, with their respective advantages and disadvantages. The criteria include the testing time, which ranges from real time to several minutes, the required calibration procedure, the spectral resolution, and the final measuring precision. The latter ranges from 0.05 to 0.5 in the CIE L*a*b* system, depending on the particular measurement system. Due to the high temperatures in typical plastics processes, thermochromism of polymers and dyes has to be taken into account. This effect can influence the color value by on the order of 10% and is so far poorly understood. Different suitable methods to compensate thermochromic effects during compounding or injection molding by using calibration curves or artificial neural networks are presented. Furthermore, it is even possible to control the color during extrusion and compounding almost in real time. The goal is specially developed software for adjusting the color recipe automatically, with the final objective of closed-loop control.
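    The quoted measurement precision is a distance in CIE L*a*b*; the simplest such metric is the CIE76 color difference, the Euclidean distance between two L*a*b* readings (the sample values below are hypothetical):

    ```python
    import math

    def delta_e_cie76(lab1, lab2):
        """CIE76 color difference: Euclidean distance in CIE L*a*b*,
        the space in which the 0.05-0.5 precision figures are quoted."""
        return math.dist(lab1, lab2)

    # Hypothetical inline reading vs. target recipe color:
    print(round(delta_e_cie76((52.1, 4.3, -30.2), (52.0, 4.5, -30.0)), 3))  # 0.3
    ```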

  19. Nonlinear plasma wave models in 3D fluid simulations of laser-plasma interaction

    NASA Astrophysics Data System (ADS)

    Chapman, Thomas; Berger, Richard; Arrighi, Bill; Langer, Steve; Banks, Jeffrey; Brunner, Stephan

    2017-10-01

    Simulations of laser-plasma interaction (LPI) in inertial confinement fusion (ICF) conditions require multi-mm spatial scales due to the typical laser beam size and durations of order 100 ps in order for numerical laser reflectivities to converge. To be computationally achievable, these scales necessitate a fluid-like treatment of light and plasma waves with a spatial grid size on the order of the light wave length. Plasma waves experience many nonlinear phenomena not naturally described by a fluid treatment, such as frequency shifts induced by trapping, a nonlinear (typically suppressed) Landau damping, and mode couplings leading to instabilities that can cause the plasma wave to decay rapidly. These processes affect the onset and saturation of stimulated Raman and Brillouin scattering, and are of direct interest to the modeling and prediction of deleterious LPI in ICF. It is not currently computationally feasible to simulate these Debye length-scale phenomena in 3D across experimental scales. Analytically-derived and/or numerically benchmarked models of processes occurring at scales finer than the fluid simulation grid offer a path forward. We demonstrate the impact of a range of kinetic processes on plasma reflectivity via models included in the LPI simulation code pF3D. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  1. Basic visual perceptual processes in children with typical development and cerebral palsy: The processing of surface, length, orientation, and position.

    PubMed

    Schmetz, Emilie; Magis, David; Detraux, Jean-Jacques; Barisnikov, Koviljka; Rousselle, Laurence

    2018-03-02

    The present study aims to assess how the processing of basic visual perceptual (VP) components (length, surface, orientation, and position) develops in typically developing (TD) children (n = 215, 4-14 years old) and adults (n = 20, 20-25 years old), and in children with cerebral palsy (CP) (n = 86, 5-14 years old) using the first four subtests of the Battery for the Evaluation of Visual Perceptual and Spatial processing in children. Experiment 1 showed that these four basic VP processes follow distinct developmental trajectories in typical development. Experiment 2 revealed that children with CP present global and persistent deficits for the processing of basic VP components when compared with TD children matched on chronological age and nonverbal reasoning abilities.

  2. Diffraction based overlay re-assessed

    NASA Astrophysics Data System (ADS)

    Leray, Philippe; Laidler, David; D'havé, Koen; Cheng, Shaunee

    2011-03-01

    In recent years, numerous authors have reported the advantages of Diffraction Based Overlay (DBO) over Image Based Overlay (IBO), mainly by comparing metrology figures of merit such as TIS and TMU. Some have even gone as far as to say that DBO is the only viable overlay metrology technique for advanced technology nodes: 22 nm and beyond. Typically, the only reported drawback of DBO is the size of the required targets. This severely limits its effective use when all critical layers of a product, including double-patterned layers, need to be measured and in-die overlay measurements are required. In this paper we ask whether target size is the only limitation to the adoption of DBO for overlay characterization and control, or whether there are other metrics that need to be considered: for example, overlay accuracy with respect to scanner baseline, or on-product process overlay control. In this work, we critically re-assess the strengths and weaknesses of DBO for the applications of scanner baseline and on-product process layer overlay control, with a comprehensive comparison to IBO. For on-product process layer control, we compare performance on critical process layers: Gate, Contact, and Metal. In particular, we focus on the response of the scanner to the corrections determined by each metrology technique for each process layer as a measure of accuracy. Our results show that characterizing an overlay metrology technique suitable for advanced technology nodes requires much more than just evaluating the conventional metrology metrics of TIS and TMU.

  3. The Complexities of Complex Memory Span: Storage and Processing Deficits in Specific Language Impairment

    ERIC Educational Resources Information Center

    Archibald, Lisa M. D.; Gathercole, Susan E.

    2007-01-01

    This study investigated the verbal and visuospatial processing and storage skills of children with SLI and typically developing children. Fourteen school-age children with SLI, and two groups of typically developing children matched either for age or language abilities, completed measures of processing speed and storage capacity, and a set of…

  4. Organic electronics with polymer dielectrics on plastic substrates fabricated via transfer printing

    NASA Astrophysics Data System (ADS)

    Hines, Daniel R.

    Printing methods are fast becoming important processing techniques for the fabrication of flexible electronics. Some goals for flexible electronics are to produce cheap, lightweight, disposable radio frequency identification (RFID) tags, very large flexible displays that can be produced in a roll-to-roll process, and wearable electronics for both the clothing and medical industries. Such applications will require fabrication processes for the assembly of dissimilar materials onto a common substrate in ways that are compatible with organic and polymeric materials as well as traditional solid-state electronic materials. A transfer printing method has been developed with these goals and applications in mind. This printing method relies primarily on differential adhesion where no chemical processing is performed on the device substrate. It is compatible with a wide variety of materials with each component printed in exactly the same way, thus avoiding any mixed processing steps on the device substrate. The adhesion requirements of one material printed onto a second are studied by measuring the surface energy of both materials and by surface treatments such as plasma exposure or the application of self-assembled monolayers (SAM). Transfer printing has been developed within the context of fabricating organic electronics onto plastic substrates because these materials introduce unique opportunities associated with processing conditions not typically required for traditional semiconducting materials. Compared to silicon, organic semiconductors are soft materials that require low-temperature processing and are extremely sensitive to chemical processing and environmental contamination. The transfer printing process has been developed for the important and commonly used organic semiconducting materials, pentacene (Pn) and poly(3-hexylthiophene) (P3HT). A three-step printing process has been developed by which these materials are printed onto an electrode subassembly consisting of previously printed electrodes separated by a polymer dielectric layer, all on a plastic substrate. These bottom-contact, flexible organic thin-film transistors (OTFT) have been compared to unprinted (reference) devices consisting of top-contact electrodes and a silicon dioxide dielectric layer on a silicon substrate. Printed Pn and P3HT TFTs have been shown to outperform the reference devices. This enhancement has been attributed to an annealing under pressure of the organic semiconducting material.

  5. A Methodology for Quantifying Certain Design Requirements During the Design Phase

    NASA Technical Reports Server (NTRS)

    Adams, Timothy; Rhodes, Russel

    2005-01-01

    A methodology for developing and balancing quantitative design requirements for safety, reliability, and maintainability has been proposed. Conceived as the basis of a more rational approach to the design of spacecraft, the methodology would also be applicable to the design of automobiles, washing machines, television receivers, or almost any other commercial product. Heretofore, it has been common practice to start by determining the requirements for reliability of elements of a spacecraft or other system to ensure a given design life for the system. Next, safety requirements are determined by assessing the total reliability of the system and adding redundant components and subsystems necessary to attain safety goals. As thus described, common practice leaves the maintainability burden to fall to chance; therefore, there is no control of recurring costs or of the responsiveness of the system. The means that have been used in assessing maintainability have been oriented toward determining the logistical sparing of components so that the components are available when needed. The process established for developing and balancing quantitative requirements for safety (S), reliability (R), and maintainability (M) derives and integrates NASA's top-level safety requirements and the controls needed to obtain program key objectives for safety and recurring cost (see figure). Being quantitative, the process conveniently uses common mathematical models. Even though the process is shown as being worked from the top down, it can also be worked from the bottom up. This process uses three math models: (1) the binomial distribution (greater-than-or-equal-to case), (2) reliability for a series system, and (3) the Poisson distribution (less-than-or-equal-to case). The zero-fail case for the binomial distribution approximates the commonly known exponential distribution or "constant failure rate" distribution. Either model can be used. The binomial distribution was selected for modeling flexibility because it conveniently addresses both the zero-fail and failure cases. The failure case is typically used for unmanned spacecraft, as with missiles.
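
    As a hedged illustration of the three models named above, the sketch below evaluates the binomial greater-than-or-equal-to case, a series-system reliability product, and the Poisson less-than-or-equal-to case; all numbers are hypothetical.

    ```python
    import math

    def binomial_at_least(n, k, p):
        """P(at least k successes in n trials); the >= case named above."""
        return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    def series_reliability(parts):
        """Reliability of a series system: the product of part reliabilities."""
        return math.prod(parts)

    def poisson_at_most(k, lam):
        """P(at most k events) for Poisson mean lam; the <= case named above."""
        return sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k + 1))

    # Zero-fail binomial case: all 10 units work at per-unit reliability 0.99,
    # which approximates the constant-failure-rate (exponential) result.
    print(binomial_at_least(10, 10, 0.99))           # ~0.904
    print(series_reliability([0.999, 0.995, 0.99]))  # ~0.984
    print(poisson_at_most(2, 0.5))                   # ~0.986
    ```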

  6. 16 CFR Figure 5 to Part 1512 - Typical Handbrake Actuator Showing Grip Dimension

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    16 CFR Part 1512, Figure 5: Typical Handbrake Actuator Showing Grip Dimension. Commercial Practices; Consumer Product Safety Commission, Federal Hazardous Substances Act Regulations, Requirements for Bicycles.

  7. 16 CFR Figure 5 to Part 1512 - Typical Handbrake Actuator Showing Grip Dimension

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    16 CFR Part 1512, Figure 5: Typical Handbrake Actuator Showing Grip Dimension. Commercial Practices; Consumer Product Safety Commission, Federal Hazardous Substances Act Regulations, Requirements for Bicycles.

  8. Collaborative Problem Solving in Young Typical Development and HFASD

    ERIC Educational Resources Information Center

    Kimhi, Yael; Bauminger-Zviely, Nirit

    2012-01-01

    Collaborative problem solving (CPS) requires sharing goals/attention and coordinating actions--all deficient in HFASD. Group differences were examined in CPS (HFASD/typical), with a friend versus with a non-friend. Participants included 28 HFASD and 30 typical children aged 3-6 years and their 58 friends and 58 non-friends. Groups were matched on…

  9. 16 CFR Figure 5 to Part 1512 - Typical Handbrake Actuator Showing Grip Dimension

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    16 CFR Part 1512, Figure 5: Typical Handbrake Actuator Showing Grip Dimension. Commercial Practices; Consumer Product Safety Commission, Federal Hazardous Substances Act Regulations, Requirements for Bicycles.

  10. 16 CFR Figure 5 to Part 1512 - Typical Handbrake Actuator Showing Grip Dimension

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    16 CFR Part 1512, Figure 5: Typical Handbrake Actuator Showing Grip Dimension. Commercial Practices; Consumer Product Safety Commission, Federal Hazardous Substances Act Regulations, Requirements for Bicycles.

  11. 16 CFR Figure 5 to Part 1512 - Typical Handbrake Actuator Showing Grip Dimension

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    16 CFR Part 1512, Figure 5: Typical Handbrake Actuator Showing Grip Dimension. Commercial Practices; Consumer Product Safety Commission, Federal Hazardous Substances Act Regulations, Requirements for Bicycles.

  12. 14 CFR Appendix C to Part 1215 - Typical User Activity Timeline

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    14 CFR Part 1215, Appendix C: Typical User Activity Timeline (Tracking and Data Relay Satellite System (TDRSS)). Excerpt: "... mission model. 3 years before launch (Ref. § 1215.109(c)). Submit general user requirements to permit..."

  13. Revision strategies of deaf student writers.

    PubMed

    Livingston, S

    1989-03-01

    Deaf high school students at different schools shared second drafts of their own narratives via an electronic bulletin board after conferencing with their respective teachers. This article characterizes the kinds of questions teachers asked during the conferences and the kinds of revisions the students made between first and second drafts. Results indicate that teachers most often ask questions that require students to provide more information; yet these questions do not affect revision as much as questions which require students to rephrase specific language. Students typically either added or substituted words or phrases that showed both similarities to and differences from the revision patterns of inexperienced writers with normal hearing. In the majority of cases, trained readers rated the deaf students' revised drafts better than their first attempts, signifying the central role revision plays in the composition process.

  14. Microgravity science experiment integration - When the PI and the PED differ

    NASA Technical Reports Server (NTRS)

    Baer-Peckham, M. S.; Mccarley, K. S.

    1991-01-01

    This paper addresses issues related to the integration of principal investigators (PIs) and payload-element developers (PEDs) for conducting effective microgravity experiments. The Crystal Growth Furnace (CGF) is used as an example to demonstrate the key issues related to the integration of a PI's sample into a facility run by a different organization. Attention is given to the typical preflight timeline, documentation required for experimental implementation, and hardware deliverables. A flow chart delineates the payload-integration process flow, and PI inputs required for an experiment include equipment and procedure definitions, detailed design and fabrication of the experiment-specific equipment, and specifications of the contract-end item. The present analysis is of interest to the coordination of effective microgravity experiments on the Space Station Freedom that incorporate PIs and PEDs from different organizations.

  15. Risk-Taking: Individual and Family Interests.

    PubMed

    Iltis, Ana S

    2015-08-01

    Decisions regarding clinical procedures or research participation typically require the informed consent of individuals. When individuals are unable to give consent, the informed permission of a legally authorized representative or surrogate is required. Although many proposed procedures are aimed primarily at benefiting the individual, some are not. I argue that, particularly when individuals are asked to assume risks primarily or exclusively for the benefit of others, family members ought to be engaged in the informed consent process. Examples of procedures in which individuals are asked to assume risks primarily or exclusively for the benefit of others include living organ donation and research participation. © The Author 2015. Published by Oxford University Press, on behalf of the Journal of Medicine and Philosophy Inc. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  16. African Primary Care Research: writing a research report.

    PubMed

    Couper, Ian; Mash, Bob

    2014-06-06

    Presenting a research report is an important way of demonstrating one's ability to conduct research and is a requirement of most research-based degrees. Although known by various names across academic institutions, the structure required is mostly very similar, being based on the Introduction, Methods, Results, Discussion format of scientific articles. This article offers some guidance on the process of writing, aimed at helping readers to start and to continue their writing, and to assist them in presenting a report that is received positively by their readers, including examiners. It also details the typical components of the research report, providing some guidelines for each, as well as the pitfalls to avoid. This article is part of a series on African Primary Care Research that aims to build capacity for research particularly at a Master's level.

  17. Determining The Provenance Of Sedimentary Materials On Mars Through Analog Studies

    NASA Astrophysics Data System (ADS)

    Craddock, R. A.

    2017-12-01

    The amount and types of sedimentary material available for transport can control the types of features that result from aeolian or fluvial processes. For example, if sediment availability increases, dune forms transition from barchans to linear dunes. The availability of sediment and the erodibility of the landscape can influence drainage divides, catchment areas, and stream type. There is abundant evidence of both aeolian and fluvial sediments on Mars with grain sizes ranging from silt/clay to pebbles and cobbles. However, what is unique about Mars is that the dominant rock type on the surface is basalt, and basalt does not typically weather into particle sizes coarser than silt/clay. So where does all the sand come from on Mars? Chemical weathering would produce clays. While mechanical weathering is possible, there are really only two end-member processes: impact cratering and physical abrasion. Impact cratering can produce a wide range of particle sizes from house-sized boulders to fine dust, but how much sand can be expected to be produced from impact craters? Physical abrasion is likely to be inefficient on Mars, resulting in the fast breakdown of sand-sized particles while producing more silt/clay-sized particles. Other processes for generating sand on Mars include hyaloclastic, phreatomagmatic, and pyroclastic activity. These processes typically require the presence of water. This presentation will explore the possible diagnostic characteristics of sediments generated from these different processes. It will also show how basaltic sediments change as they are transported by water, wind, and ice. The image shows the physical characteristics of basaltic sediment transported by different geologic processes.

  18. N400 as an index of uncontrolled categorization processing in brand extension.

    PubMed

    Wang, Xiaoyi; Ma, Qingguo; Wang, Cuicui

    2012-09-06

    This study examined the ERP (event-related potential) correlates of categorization processing in brand extension using a task-irrelevant paradigm. Participants faced two sequential stimuli in a pair consisting of a soft drink brand name (S1) and a product name (S2) which comprised two categories: beverage (typical product of the brand, e.g. Coke branded soda water) and clothing (atypical product of the brand, even though sometimes seen in the real market, e.g. Coke branded sport wear). The N400 was recorded and was more largely distributed in frontal, frontal-central and central areas when S2 was clothing compared with beverage. The study did not require the participants to evaluate whether the brand extension was appropriate or not; the N400 recorded here was, therefore, unrelated to task difficulty and to the conscious categorization process. We speculate that it reflected an integration process related to the mental category. The brand served as a prime that aroused the participants' association of the brand-related typical products and attributes retrieved from long-term memory. The product name activated an unconscious comparison between the brand and the product. In this process, the participant treated the brand as a mental category and classified the product as a member of it. A large cognitive reaction, eliciting the N400, occurred if the product's attributes were atypical of the brand's category. These findings may help us understand the N400 component in unconscious mental categorization, and they support the categorization hypotheses in brand extension theory, which are crucial in consumer psychology. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  19. Ecological Processes of Isolated Wetlands: Ecosystem Services and the Significant Nexus (Invited)

    NASA Astrophysics Data System (ADS)

    Lane, C.; Autrey, B.; D'Amico, E.

    2013-12-01

    Geographically isolated wetlands occur throughout the US and are characterized by a wetland system completely surrounded by uplands. Examples include prairie potholes, woodland seasonal (i.e., vernal) pools, cypress domes, playas, and other such systems. Decisions by the US Supreme Court in 2001 and 2006 have affected the jurisdictional status of geographically isolated wetlands such that those failing to have a demonstrable 'significant nexus' to navigable waters may have no federal protection under the Clean Water Act. These systems are typically small and, as such, may be under-counted in assessments of area and abundance. Areal extent is only part of the information required to characterize the functions associated with geographically isolated wetlands, and an understanding of both site-specific and larger-scale processes is also required to better quantify those functions. In addition, quantifying anthropogenic effects on system processing informs our understanding of the contributions and the connectivity of geographically isolated wetlands to other waters. This presentation focuses both on efforts to quantify the contribution of geographically isolated wetlands to system-scale processes, focusing on nutrient assimilation and hydrologic storage, and on concurrent research to identify their locations at multiple scales. Findings from this research may help elucidate the link between geographically isolated wetlands and other systems, and may inform discussions on ecosystem services provided by geographically isolated wetlands.

  20. Modernizing Earth and Space Science Modeling Workflows in the Big Data Era

    NASA Astrophysics Data System (ADS)

    Kinter, J. L.; Feigelson, E.; Walker, R. J.; Tino, C.

    2017-12-01

    Modeling is a major aspect of Earth and space science research. The development of numerical models of the Earth system, planetary systems or astrophysical systems is essential to linking theory with observations. Optimal use of observations that are quite expensive to obtain and maintain typically requires data assimilation that involves numerical models. In the Earth sciences, models of the physical climate system are typically used for data assimilation, climate projection, and inter-disciplinary research, spanning applications from analysis of multi-sensor data sets to decision-making in climate-sensitive sectors with applications to ecosystems, hazards, and various biogeochemical processes. In space physics, most models are from first principles, require considerable expertise to run, and are frequently modified significantly for each case study. The volume and variety of model output data from modeling Earth and space systems are rapidly increasing and have reached a scale where human interaction with data is prohibitively inefficient. A major barrier to progress is that modeling workflows aren't deemed by practitioners to be a design problem. Existing workflows have been created by a slow accretion of software, typically based on undocumented, inflexible scripts haphazardly modified by a succession of scientists and students not trained in modern software engineering methods. As a result, existing modeling workflows suffer from an inability to onboard new datasets into models; an inability to keep pace with accelerating data production rates; and irreproducibility, among other problems. These factors are creating an untenable situation for those conducting and supporting Earth system and space science. Improving modeling workflows requires investments in hardware, software and human resources. This paper describes the critical path issues that must be targeted to accelerate modeling workflows, including script modularization, parallelization, and automation in the near term, and longer term investments in virtualized environments for improved scalability, tolerance for lossy data compression, novel data-centric memory and storage technologies, and tools for peer reviewing, preserving and sharing workflows, as well as fundamental statistical and machine learning algorithms.
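
    As a minimal sketch of the near-term fixes named above (modularization, parallelization, automation), the fragment below refactors one hypothetical workflow stage into a documented function applied to all inputs in parallel; the file pattern and the regrid placeholder are assumptions for illustration, not anything from the paper.

    ```python
    from concurrent.futures import ProcessPoolExecutor
    from pathlib import Path

    def regrid(path: Path) -> Path:
        """One modular, documented, testable pipeline stage (placeholder body)."""
        out = path.with_name(path.stem + "_regridded.nc")
        # ... invoke the real regridding tool here ...
        return out

    def run_stage(inputs, workers=8):
        """Automate the stage across all inputs instead of hand-editing a script."""
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return list(pool.map(regrid, inputs))

    if __name__ == "__main__":
        outputs = run_stage(sorted(Path("model_output").glob("*.nc")))
        print(f"regridded {len(outputs)} files")
    ```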

  1. Early process development of API applied to poorly water-soluble TBID.

    PubMed

    Meise, Marius; Niggemann, Matthias; Dunens, Alexandra; Schoenitz, Martin; Kuschnerow, Jan C; Kunick, Conrad; Scholl, Stephan

    2018-05-01

    Finding and optimising synthesis processes for active pharmaceutical ingredients (API) is time-consuming. In the finding phase, established methods for synthesis, purification and formulation are used to achieve a high-purity API for biological studies. For promising API candidates, this is followed by pre-clinical and clinical studies requiring sufficient quantities of the active component. Ideally, these should be produced with a process representative of a later production process and suitable for scaling to production capacity. This work presents an overview of different approaches to process synthesis based on an existing lab protocol. This is demonstrated for the production of the model drug 4,5,6,7-tetrabromo-2-(1H-imidazol-2-yl)isoindolin-1,3-dione (TBID). Early batch synthesis and purification procedures typically suffer from low and fluctuating yields and purities due to poor process control. In a first step, the literature synthesis and purification procedure was modified and optimized using solubility measurements, targeting easier and safer processing for subsequent studies. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. A time-driven, activity-based costing methodology for determining the costs of red blood cell transfusion in patients with beta thalassaemia major.

    PubMed

    Burns, K E; Haysom, H E; Higgins, A M; Waters, N; Tahiri, R; Rushford, K; Dunstan, T; Saxby, K; Kaplan, Z; Chunilal, S; McQuilten, Z K; Wood, E M

    2018-04-10

    To describe the methodology to estimate the total cost of administration of a single unit of red blood cells (RBC) in adults with beta thalassaemia major in an Australian specialist haemoglobinopathy centre. Beta thalassaemia major is a genetic disorder of haemoglobin associated with multiple end-organ complications and typically requiring lifelong RBC transfusion therapy. New therapeutic agents are becoming available based on advances in understanding of the disorder and its consequences. Assessment of the true total cost of transfusion, incorporating both product and activity costs, is required in order to evaluate the benefits and costs of these new therapies. We describe the bottom-up, time-driven, activity-based costing methodology used to develop process maps to provide a step-by-step outline of the entire transfusion pathway. Detailed flowcharts for each process are described. Direct observations and timing of the process maps document all activities, resources, staff, equipment and consumables in detail. The analysis will include costs associated with performing these processes, including resources and consumables. Sensitivity analyses will be performed to determine the impact of different staffing levels, timings and probabilities associated with performing different tasks. Thirty-one process maps have been developed, with over 600 individual activities requiring multiple timings. These will be used for future detailed cost analyses. Detailed process maps using bottom-up, time-driven, activity-based costing for determining the cost of RBC transfusion in thalassaemia major have been developed. These could be adapted for wider use to understand and compare the costs and complexities of transfusion in other settings. © 2018 British Blood Transfusion Society.
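
    The costing arithmetic itself is simple once the process maps and timings exist: each activity contributes its observed time multiplied by a capacity cost rate, plus product and consumable costs. The sketch below illustrates this with entirely hypothetical step names, times, and rates; the study's 31 process maps and 600+ activities would simply extend the list.

    ```python
    # Minimal sketch of time-driven activity-based costing (TDABC):
    # cost = sum over activities of (time observed) x (capacity cost rate),
    # plus consumables and the blood product itself. Numbers are hypothetical.

    steps = [
        # (activity, minutes observed, staff cost rate per minute)
        ("pre-transfusion sample collection", 12.0, 1.10),
        ("crossmatch and issue of RBC unit", 25.0, 0.95),
        ("bedside administration and monitoring", 150.0, 0.80),
    ]

    consumables = 95.0    # per-unit consumables, e.g. giving set (hypothetical)
    product_cost = 380.0  # acquisition cost of one RBC unit (hypothetical)

    activity_cost = sum(minutes * rate for _, minutes, rate in steps)
    total_cost = activity_cost + consumables + product_cost
    print(f"Total cost per RBC unit transfused: ${total_cost:.2f}")
    ```

    Sensitivity analyses of the kind the authors describe then amount to re-running this sum while varying the times, rates, and task probabilities.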

  3. cljam: a library for handling DNA sequence alignment/map (SAM) with parallel processing.

    PubMed

    Takeuchi, Toshiki; Yamada, Atsuo; Aoki, Takashi; Nishimura, Kunihiro

    2016-01-01

    Next-generation sequencing can determine DNA bases, and the results of sequence alignments are generally stored in files in the Sequence Alignment/Map (SAM) format or its compressed binary version (BAM). SAMtools is a typical tool for dealing with files in the SAM/BAM format. SAMtools has various functions, including detection of variants, visualization of alignments, indexing, extraction of parts of the data and loci, and conversion of file formats. It is written in C and executes quickly. However, SAMtools requires additional implementation work, for example with OpenMP (Open Multi-Processing) libraries, to be used in parallel. As next-generation sequencing data accumulate, a simple parallelization program that can support cloud and PC cluster environments is required. We have developed cljam using the Clojure programming language, which simplifies parallel programming, to handle SAM/BAM data. Cljam can run in a Java runtime environment (e.g., Windows, Linux, Mac OS X) with Clojure. Cljam can process and analyze SAM/BAM files in parallel and at high speed. The execution time with cljam is almost the same as with SAMtools. The cljam code is written in Clojure and has fewer lines than other similar tools.
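
    cljam itself is written in Clojure; as a language-neutral sketch of the same chunk-and-reduce pattern used for parallel alignment processing, the following Python fragment splits toy SAM-like records into chunks, counts mapped records (FLAG bit 0x4 clear) in worker processes, and sums the partial results. The record layout and the work function are illustrative, not cljam's API.

    ```python
    from concurrent.futures import ProcessPoolExecutor

    def count_mapped(chunk):
        """Count records whose SAM FLAG lacks the 0x4 (unmapped) bit."""
        return sum(1 for name, flag in chunk if not int(flag) & 0x4)

    def parallel_count(records, n_chunks=4):
        """Split records into chunks and reduce the partial counts."""
        chunks = [records[i::n_chunks] for i in range(n_chunks)]
        with ProcessPoolExecutor(max_workers=n_chunks) as pool:
            return sum(pool.map(count_mapped, chunks))

    if __name__ == "__main__":
        toy = [("r1", "0"), ("r2", "4"), ("r3", "16")]  # (QNAME, FLAG) pairs
        print(parallel_count(toy))  # -> 2 mapped records
    ```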

  4. Optical coherence microscopy as a novel, non-invasive method for the 4D live imaging of early mammalian embryos.

    PubMed

    Karnowski, Karol; Ajduk, Anna; Wieloch, Bartosz; Tamborski, Szymon; Krawiec, Krzysztof; Wojtkowski, Maciej; Szkulmowski, Maciej

    2017-06-23

    Imaging of living cells based on traditional fluorescence and confocal laser scanning microscopy has delivered an enormous amount of information critical for understanding biological processes in single cells. However, the requirement for a high numerical aperture and fluorescent markers still limits researchers' ability to visualize the cellular architecture without causing short- and long-term photodamage. Optical coherence microscopy (OCM) is a promising alternative that circumvents the technical limitations of fluorescence imaging techniques and provides unique access to fundamental aspects of early embryonic development, without the requirement for sample pre-processing or labeling. In the present paper, we utilized the internal motion of cytoplasm, as well as custom scanning and signal processing protocols, to effectively reduce the speckle noise typical for standard OCM and enable high-resolution intracellular time-lapse imaging. To test our imaging system we used mouse and pig oocytes and embryos and visualized them through fertilization and the first embryonic division, as well as at selected stages of oogenesis and preimplantation development. Because all morphological and morphokinetic properties recorded by OCM are believed to be biomarkers of oocyte/embryo quality, OCM may represent a new chapter in imaging-based preimplantation embryo diagnostics.

  5. Preface to QoIS 2009

    NASA Astrophysics Data System (ADS)

    Comyn-Wattiau, Isabelle; Thalheim, Bernhard

    Quality assurance is a growing research domain within the Information Systems (IS) and Conceptual Modeling (CM) disciplines. Ongoing research on quality in IS and CM is highly diverse and encompasses theoretical aspects including quality definition and quality models, and practical/empirical aspects such as the development of methods, approaches and tools for quality measurement and improvement. Current research on quality also includes quality characteristics definitions, validation instruments, methodological and development approaches to quality assurance during software and information systems development, quality monitors, quality assurance during information systems development processes and practices, quality assurance both for data and (meta)schemata, quality support for information systems data import and export, quality of query answering, and cost/benefit analysis of quality assurance processes. Quality assurance also depends on the application area and the specific requirements of applications in areas such as the health sector, logistics, the public sector, the financial sector, manufacturing, services, e-commerce, software, etc. Furthermore, quality assurance must also be supported for data aggregation, ETL processes, web content management and other multi-layered applications. Quality assurance typically requires resources and therefore carries, besides its benefits, a computational and economic trade-off. It is therefore also based on a compromise between the value of quality data and the cost of quality assurance.

  6. A Co-modeling Method Based on Component Features for Mechatronic Devices in Aero-engines

    NASA Astrophysics Data System (ADS)

    Wang, Bin; Zhao, Haocen; Ye, Zhifeng

    2017-08-01

    Data-fused and user-friendly design of aero-engine accessories is required because of their structural complexity and stringent reliability requirements. This paper gives an overview of a typical aero-engine control system and the development process of the key mechatronic devices used. Several essential aspects of modeling and simulation in the process are investigated. Considering the limitations of a single theoretical model, a feature-based co-modeling methodology is suggested to satisfy the design requirements and compensate for the diversity of component sub-models for these devices. As an example, a stepper motor controlled Fuel Metering Unit (FMU) is modeled in view of the component physical features using two different software tools. An interface is suggested to integrate the single-discipline models into a synthesized one. Performance simulation of this device using the co-model and parameter optimization for its key components are discussed. Comparison between delivery testing and the simulation shows that the co-model for the FMU has high accuracy and a clear advantage over a single model. Together with its compatible interface with the engine mathematical model, the feature-based co-modeling methodology is shown to be an effective technical measure in the development process of the device.

  7. A COMPARISON OF TRANSIENT INFINITE ELEMENTS AND TRANSIENT KIRCHHOFF INTEGRAL METHODS FOR FAR FIELD ACOUSTIC ANALYSIS

    DOE PAGES

    WALSH, TIMOTHY F.; JONES, ANDREA; BHARDWAJ, MANOJ; ...

    2013-04-01

    Finite element analysis of transient acoustic phenomena on unbounded exterior domains is very common in engineering analysis. In these problems there is a common need to compute the acoustic pressure at points outside of the acoustic mesh, since meshing to points of interest is impractical in many scenarios. In aeroacoustic calculations, for example, the acoustic pressure may be required at tens or hundreds of meters from the structure. In these cases, a method is needed for post-processing the acoustic results to compute the response at far-field points. In this paper, we compare two methods for computing far-field acoustic pressures, one derived directly from the infinite element solution, and the other from the transient version of the Kirchhoff integral. Here, we show that the infinite element approach alleviates the large storage requirements that are typical of Kirchhoff integral and related procedures, and also does not suffer from the loss of accuracy that is an inherent part of computing numerical derivatives in the Kirchhoff integral. In order to further speed up and streamline the process of computing the acoustic response at points outside of the mesh, we also address the nonlinear iterative procedure needed for locating parametric coordinates within the host infinite element of far-field points, the parallelization of the overall process, linear solver requirements, and system stability considerations.
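
    For reference, one common retarded-time form of the transient Kirchhoff integral for a field point outside the surface S is sketched below; signs depend on the normal convention, and this textbook form is not necessarily the paper's exact formulation. Here r is the distance from the surface element to the field point, c the sound speed, n the surface normal, and the bracket is evaluated at the retarded time.

    ```latex
    p(\mathbf{x},t) = \frac{1}{4\pi}\oint_S
      \left[\frac{p}{r^{2}}\frac{\partial r}{\partial n}
      + \frac{1}{c\,r}\frac{\partial r}{\partial n}\frac{\partial p}{\partial t}
      - \frac{1}{r}\frac{\partial p}{\partial n}\right]_{\tau = t - r/c} \mathrm{d}S
    ```

    The normal and time derivatives of p in the bracket are precisely the numerically computed derivatives whose storage and accuracy costs the abstract attributes to Kirchhoff-type post-processing.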

  8. An analytical chemistry laboratory's experiences under Department of Energy Order 5633. 3 - a status report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bingham, C.D.

    The U.S. Department of Energy (DOE) order 5633.3, Control and Accountability of Nuclear Materials, initiated substantial changes to the requirements for operations involving nuclear materials. In the opinion of this author, the two most significant changes are the clarification of and the increased emphasis on the concept of graded safeguards and the implementation of performance requirements. Graded safeguards recognizes that some materials are more attractive than others to potential adversary actions and, thus, should be afforded a higher level of integrated safeguards effort. An analytical chemistry laboratory, such as the New Brunswick Laboratory (NBL), typically has a small total inventory of special nuclear materials compared to, for example, a production or manufacturing facility. The NBL has a laboratory information management system (LIMS) that not only provides sample identification and tracking but also incorporates the essential features of MC&A (material control and accountability) required of NBL operations. As a consequence of order 5633.3, NBL had to modify LIMS to accommodate material attractiveness information in the logging process, to reflect changes in attractiveness as the material was processed through the laboratory, and to enable inventory information to be accumulated by material attractiveness codes.

  9. Phonological Processing in Children with Specific Reading Disorder versus Typical Learners: Factor Structure and Measurement Invariance in a Transparent Orthography

    ERIC Educational Resources Information Center

    Brandenburg, Janin; Klesczewski, Julia; Schuchardt, Kirsten; Fischbach, Anne; Büttner, Gerhard; Hasselhorn, Marcus

    2017-01-01

    Although children with specific reading disorder (RD) have often been compared to typically achieving children on various phonological processing tasks, to our knowledge no study so far has examined whether the structure of phonological processing applies to both groups of children alike. According to Wagner and Torgesen (1987), phonological…

  10. The Next Technology Revolution - Nano Electronic Technology

    NASA Astrophysics Data System (ADS)

    Turlik, Iwona

    2004-03-01

    Nanotechnology is a revolutionary engine that will engender enormous changes in a vast majority of today's industries and markets, while potentially creating whole new industries. The impact of nanotechnology is particularly significant in the electronics industry, which is constantly driven by the need for higher performance, increased functionality, smaller size and lower cost. Nanotechnology can influence many of the hundreds of components that are typically assembled to manufacture modern electronic devices. Motorola manufactures electronics for a wide range of industries and communication products. In this presentation, the typical components of a cellular phone are outlined and technology requirements for future products, the customer benefits, and the potential impact of nanotechnology on many of the components are discussed. Technology needs include reliable materials supply, processes for high volume production, experimental and simulation tools, etc. For example, even routine procedures such as failure characterization may require the development of new tools for investigating nano-scale phenomena. Business needs include the development of an effective, high volume supply chain for nano-materials and devices, disruptive product platforms, and visible performance impact on the end consumer. An equally significant long-term industry need is the availability of science and engineering graduates with a multidisciplinary focus and a deep understanding of the fundamentals of nano-technology, that can harness the technology to create revolutionary products.

  11. A texture-based framework for improving CFD data visualization in a virtual environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bivins, Gerrick O'Ron

    2005-01-01

    In the field of computational fluid dynamics (CFD), accurate representations of fluid phenomena can be simulated but require large amounts of data to represent the flow domain. Most datasets generated from a CFD simulation can be coarse, ~10,000 nodes or cells, or very fine with node counts on the order of 1,000,000. A typical dataset solution can also contain multiple solutions for each node, pertaining to various properties of the flow at a particular node. Scalar properties such as density, temperature, pressure, and velocity magnitude are properties that are typically calculated and stored in a dataset solution. Solutions are not limited to just scalar properties. Vector quantities, such as velocity, are also often calculated and stored for a CFD simulation. Accessing all of this data efficiently during runtime is a key problem for visualization in an interactive application. Understanding simulation solutions requires a post-processing tool to convert the data into something more meaningful. Ideally, the application would present an interactive visual representation of the numerical data for any dataset that was simulated while maintaining the accuracy of the calculated solution. Most CFD applications currently sacrifice interactivity for accuracy, yielding highly detailed flow descriptions but limiting interaction for investigating the field.

  13. Actinomycosis: etiology, clinical features, diagnosis, treatment, and management

    PubMed Central

    Valour, Florent; Sénéchal, Agathe; Dupieux, Céline; Karsenty, Judith; Lustig, Sébastien; Breton, Pierre; Gleizal, Arnaud; Boussel, Loïc; Laurent, Frédéric; Braun, Evelyne; Chidiac, Christian; Ader, Florence; Ferry, Tristan

    2014-01-01

    Actinomycosis is a rare chronic disease caused by Actinomyces spp., anaerobic Gram-positive bacteria that normally colonize the human mouth and digestive and genital tracts. Physicians must be aware of typical clinical presentations (such as cervicofacial actinomycosis following a dental focus of infection, pelvic actinomycosis in women with an intrauterine device, and pulmonary actinomycosis in smokers with poor dental hygiene), but also that actinomycosis may mimic malignancy in various anatomical sites. Bacterial cultures and pathology are the cornerstone of diagnosis, but particular conditions are required in order to reach the correct diagnosis. Prolonged bacterial cultures in anaerobic conditions are necessary to identify the bacterium, and typical microscopic findings include necrosis with yellowish sulfur granules and filamentous Gram-positive fungal-like pathogens. Patients with actinomycosis require prolonged (6- to 12-month) treatment with high doses of penicillin G or amoxicillin (to facilitate drug penetration into abscesses and infected tissues), but the duration of antimicrobial therapy could probably be shortened to 3 months in patients in whom optimal surgical resection of infected tissues has been performed. Preventive measures, such as reduction of alcohol abuse and improvement of dental hygiene, may limit the occurrence of pulmonary, cervicofacial, and central nervous system actinomycosis. In women, intrauterine devices must be changed every 5 years in order to limit the occurrence of pelvic actinomycosis. PMID:25045274

  14. Flexible control techniques for a lunar base

    NASA Technical Reports Server (NTRS)

    Kraus, Thomas W.

    1992-01-01

    The fundamental elements found in every terrestrial control system can be employed in all lunar applications. These elements include sensors which measure physical properties, controllers which acquire sensor data and calculate a control response, and actuators which apply the control output to the process. The unique characteristics of the lunar environment will certainly require the development of new control system technology. However, weightlessness, harsh atmospheric conditions, temperature extremes, and radiation hazards will most significantly impact the design of sensors and actuators. The controller and associated control algorithms, which are the most complex element of any control system, can be derived in their entirety from existing technology. Lunar process control applications -- ranging from small-scale research projects to full-scale processing plants -- will benefit greatly from the controller advances being developed today. In particular, new software technology aimed at commercial process monitoring and control applications will almost completely eliminate the need for custom programs and the lengthy development and testing cycle they require. The applicability of existing industrial software to lunar applications has other significant advantages in addition to cost and quality. This software is designed to run on standard hardware platforms and takes advantage of existing LAN and telecommunications technology. Further, in order to exploit the existing commercial market, the software is being designed to be implemented by users of all skill levels -- typically users who are familiar with their process, but not necessarily with software or control theory. This means that specialized technical support personnel will not need to be on-hand, and the associated costs are eliminated. Finally, the latest industrial software designed for the commercial market is extremely flexible, in order to fit the requirements of many types of processing applications with little or no customization. This means that lunar process control projects will not be delayed by unforeseen problems or last minute process modifications. The software will include all of the tools needed to adapt to virtually any changes. In contrast to other space programs which required the development of tremendous amounts of custom software, lunar-based processing facilities will benefit from the use of existing software technology which is being proven in commercial applications on Earth.

  15. Method of fabricating a uranium-bearing foil

    DOEpatents

    Gooch, Jackie G [Seymour, TN; DeMint, Amy L [Kingston, TN

    2012-04-24

    Methods of fabricating a uranium-bearing foil are described. The foil may be substantially pure uranium, or may be a uranium alloy such as a uranium-molybdenum alloy. The method typically includes a series of hot rolling operations on a cast plate material to form a thin sheet. These hot rolling operations are typically performed using a process where each pass reduces the thickness of the plate by a substantially constant percentage. The sheet is typically then annealed and then cooled. The process typically concludes with a series of cold rolling passes where each pass reduces the thickness of the plate by a substantially constant thickness amount to form the foil.
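
    A minimal sketch of the two rolling schedules described above, with hypothetical thicknesses and reductions (the patent text here gives no numbers): hot passes remove a constant percentage of the current thickness, while cold passes remove a constant absolute increment.

    ```python
    def hot_rolling(t_mm, reduction_frac, passes):
        """Hot passes: each removes a constant PERCENTAGE of the thickness."""
        for _ in range(passes):
            t_mm *= 1.0 - reduction_frac
        return t_mm

    def cold_rolling(t_mm, step_mm, target_mm):
        """Cold passes: each removes a constant absolute thickness increment."""
        while t_mm - step_mm >= target_mm:
            t_mm -= step_mm
        return t_mm

    sheet = hot_rolling(t_mm=5.0, reduction_frac=0.20, passes=10)  # ~0.54 mm
    foil = cold_rolling(sheet, step_mm=0.05, target_mm=0.15)       # ~0.19 mm
    print(f"{sheet:.3f} mm after hot rolling, {foil:.3f} mm after cold rolling")
    ```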

  16. Towards a building typology and terminology for Irish hospitals.

    PubMed

    Grey, T; Kennelly, S; de Freine, P; Mahon, S; Mannion, V; O'Neill, D

    2017-02-01

    The physical form of the hospital environment shapes the care setting and influences the relationship of the hospital to the community. Due to ongoing demographic change, evolving public health needs, and advancing medical practice, typical hospitals are frequently redeveloped, retrofitted, or expanded. It is argued that multi-disciplinary and multi-stakeholder approaches are required to ensure that hospital design matches these increasingly complex needs. To facilitate such a conversation across different disciplines, experts, and community stakeholders, it is helpful to establish a hospital typology and associated terminology as part of any collaborative process. Examine the literature around hospital design, and review the layout and overall form of a range of typical Irish acute public hospitals, to outline an associated building typology, and to establish the terminology associated with the planning and design of these hospitals in Ireland. Searches in 'Academic Search Complete', 'Compendex', 'Google', 'Google Scholar', 'JSTOR', 'PADDI', 'Science Direct', 'Scopus', 'Web of Science', and Trinity College Dublin Library. The search terms included: 'hospital design history'; 'hospital typology'; 'hospital design terminology'; and 'hospital design Ireland'. Typical hospitals are composed of different layouts due to development over time; however, various discrete building typologies can still be determined within many hospitals. This paper presents a typology illustrating distinct layout, circulation, and physical form characteristics, along with a hospital planning and design terminology of key terms and definitions. This typology and terminology define the main components of Irish hospital building design to create a shared understanding around design, and support stakeholder engagement, as part of any collaborative design process.

  17. Real-time product attribute control to manufacture antibodies with defined N-linked glycan levels.

    PubMed

    Zupke, Craig; Brady, Lowell J; Slade, Peter G; Clark, Philip; Caspary, R Guy; Livingston, Brittney; Taylor, Lisa; Bigham, Kyle; Morris, Arvia E; Bailey, Robert W

    2015-01-01

    Pressures for cost-effective new therapies and an increased emphasis on emerging markets require technological advancements and a flexible future manufacturing network for the production of biologic medicines. The safety and efficacy of a product is crucial, and consistent product quality is an essential feature of any therapeutic manufacturing process. The active control of product quality in a typical biologic process is challenging because of measurement lags and nonlinearities present in the system. The current study uses nonlinear model predictive control to maintain a critical product quality attribute at a predetermined value during pilot scale manufacturing operations. This approach to product quality control ensures a more consistent product for patients, enables greater manufacturing efficiency, and eliminates the need for extensive process characterization by providing direct measures of critical product quality attributes for real time release of drug product. © 2015 American Institute of Chemical Engineers.
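
    As a toy, horizon-one stand-in for the nonlinear model predictive control described above, the sketch below picks, at each step, the admissible input whose one-step model prediction lands closest to the quality target. The process model, inputs, and target are hypothetical; a real controller would optimize over a longer horizon and account for the measurement lags the abstract mentions.

    ```python
    import math

    def process_model(x, u):
        """Hypothetical nonlinear response of the quality attribute to input u."""
        return x + 0.8 * math.tanh(u) - 0.05 * (x - 10.0)

    def mpc_step(x, target, candidates):
        """Choose the input whose one-step prediction lands closest to target."""
        return min(candidates, key=lambda u: (process_model(x, u) - target) ** 2)

    target, x = 12.0, 8.0
    candidates = [i / 10.0 for i in range(-20, 21)]  # admissible control inputs
    for step in range(10):
        u = mpc_step(x, target, candidates)
        x = process_model(x, u)  # plant responds; measurement lag ignored here
        print(f"step {step}: u = {u:+.1f}, attribute = {x:.2f}")
    ```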

  18. Nanometer-scale mapping of irreversible electrochemical nucleation processes on solid Li-ion electrolytes

    NASA Astrophysics Data System (ADS)

    Kumar, Amit; Arruda, Thomas M.; Tselev, Alexander; Ivanov, Ilia N.; Lawton, Jamie S.; Zawodzinski, Thomas A.; Butyaev, Oleg; Zayats, Sergey; Jesse, Stephen; Kalinin, Sergei V.

    2013-04-01

    Electrochemical processes associated with changes in structure, connectivity or composition typically proceed via new phase nucleation with subsequent growth of nuclei. Understanding and controlling reactions requires the elucidation and control of nucleation mechanisms. However, factors controlling nucleation kinetics, including the interplay between local mechanical conditions, microstructure and local ionic profile remain inaccessible. Furthermore, the tendency of current probing techniques to interfere with the original microstructure prevents a systematic evaluation of the correlation between the microstructure and local electrochemical reactivity. In this work, the spatial variability of irreversible nucleation processes of Li on a Li-ion conductive glass-ceramics surface is studied with ~30 nm resolution. An increased nucleation rate at the boundaries between the crystalline AlPO4 phase and amorphous matrix is observed and attributed to Li segregation. This study opens a pathway for probing mechanisms at the level of single structural defects and elucidation of electrochemical activities in nanoscale volumes.

  19. Self-Patterning of Silica/Epoxy Nanocomposite Underfill by Tailored Hydrophilic-Superhydrophobic Surfaces for 3D Integrated Circuit (IC) Stacking.

    PubMed

    Tuan, Chia-Chi; James, Nathan Pataki; Lin, Ziyin; Chen, Yun; Liu, Yan; Moon, Kyoung-Sik; Li, Zhuo; Wong, C P

    2017-03-15

    As microelectronics trend toward smaller packages and integrated circuit (IC) stacks, underfill, the polymer composite filled in between the IC chip and the substrate, becomes increasingly important for interconnection reliability. However, traditional underfills cannot meet the requirements for low profile and fine pitch in high-density IC stacking packages. Post-applied underfills have difficulties in flowing into the small gaps between the chip and the substrate, while pre-applied underfills face filler entrapment at bond pads. In this report, we present a self-patterning underfilling technology that uses selective wetting of underfill on Cu bond pads and Si3N4 passivation via surface energy engineering. This novel process, fully compatible with the conventional underfilling process, eliminates the issue of filler entrapment in the typical pre-applied underfilling process, enabling high-density and fine-pitch IC die bonding.

  20. Standardization of pitch-range settings in voice acoustic analysis.

    PubMed

    Vogel, Adam P; Maruff, Paul; Snyder, Peter J; Mundt, James C

    2009-05-01

    Voice acoustic analysis is typically a labor-intensive, time-consuming process that requires the application of idiosyncratic parameters tailored to individual aspects of the speech signal. Such processes limit the efficiency and utility of voice analysis in clinical practice as well as in applied research and development. In the present study, we analyzed 1,120 voice files, using standard techniques (case-by-case hand analysis), taking roughly 10 work weeks of personnel time to complete. The results were compared with the analytic output of several automated analysis scripts that made use of preset pitch-range parameters. After pitch windows were selected to appropriately account for sex differences, the automated analysis scripts reduced processing time of the 1,120 speech samples to less than 2.5 h and produced results comparable to those obtained with hand analysis. However, caution should be exercised when applying the suggested preset values to pathological voice populations.
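
    A sketch of applying sex-specific preset pitch windows of the kind the study describes, using the Praat-backed parselmouth library; the preset floor/ceiling values and the file name are illustrative assumptions rather than the study's actual parameters.

    ```python
    import parselmouth  # pip install praat-parselmouth

    PRESETS = {"male": (75.0, 300.0), "female": (100.0, 500.0)}  # Hz, assumed

    def mean_f0(wav_path, sex):
        """Mean fundamental frequency using a preset pitch window."""
        floor, ceiling = PRESETS[sex]
        snd = parselmouth.Sound(wav_path)
        pitch = snd.to_pitch(pitch_floor=floor, pitch_ceiling=ceiling)
        f0 = pitch.selected_array["frequency"]
        voiced = f0[f0 > 0]  # Praat marks unvoiced frames as 0 Hz
        return voiced.mean() if voiced.size else float("nan")

    print(mean_f0("sample_001.wav", "male"))  # hypothetical recording
    ```

    Batch-applying such a function to a folder of recordings is what collapses weeks of case-by-case hand analysis into a few hours of compute.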

  1. Verification and Validation Methodology of Real-Time Adaptive Neural Networks for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Gupta, Pramod; Loparo, Kenneth; Mackall, Dale; Schumann, Johann; Soares, Fola

    2004-01-01

    Recent research has shown that adaptive neural-network-based control systems are very effective in restoring stability and control of an aircraft in the presence of damage or failures. The application of an adaptive neural network within a flight-critical control system requires a thorough and proven process to ensure safe and proper flight operation. Unique testing tools have been developed as part of a process to perform verification and validation (V&V) of the real-time adaptive neural networks used in a recent adaptive flight control system, and to evaluate the performance of the online-trained neural networks. The tools will help in obtaining FAA certification and in the successful deployment of neural-network-based adaptive controllers in safety-critical applications. The process to perform verification and validation is evaluated against a typical neural adaptive controller, and the results are discussed.

  2. Nanometer-scale mapping of irreversible electrochemical nucleation processes on solid Li-ion electrolytes.

    PubMed

    Kumar, Amit; Arruda, Thomas M; Tselev, Alexander; Ivanov, Ilia N; Lawton, Jamie S; Zawodzinski, Thomas A; Butyaev, Oleg; Zayats, Sergey; Jesse, Stephen; Kalinin, Sergei V

    2013-01-01

    Electrochemical processes associated with changes in structure, connectivity or composition typically proceed via new phase nucleation with subsequent growth of nuclei. Understanding and controlling reactions requires the elucidation and control of nucleation mechanisms. However, factors controlling nucleation kinetics, including the interplay between local mechanical conditions, microstructure and local ionic profile remain inaccessible. Furthermore, the tendency of current probing techniques to interfere with the original microstructure prevents a systematic evaluation of the correlation between the microstructure and local electrochemical reactivity. In this work, the spatial variability of irreversible nucleation processes of Li on a Li-ion conductive glass-ceramics surface is studied with ~30 nm resolution. An increased nucleation rate at the boundaries between the crystalline AlPO4 phase and amorphous matrix is observed and attributed to Li segregation. This study opens a pathway for probing mechanisms at the level of single structural defects and elucidation of electrochemical activities in nanoscale volumes.

  3. Nanometer-scale mapping of irreversible electrochemical nucleation processes on solid Li-ion electrolytes

    PubMed Central

    Kumar, Amit; Arruda, Thomas M.; Tselev, Alexander; Ivanov, Ilia N.; Lawton, Jamie S.; Zawodzinski, Thomas A.; Butyaev, Oleg; Zayats, Sergey; Jesse, Stephen; Kalinin, Sergei V.

    2013-01-01

    Electrochemical processes associated with changes in structure, connectivity or composition typically proceed via new phase nucleation with subsequent growth of nuclei. Understanding and controlling reactions requires the elucidation and control of nucleation mechanisms. However, factors controlling nucleation kinetics, including the interplay between local mechanical conditions, microstructure and local ionic profile remain inaccessible. Furthermore, the tendency of current probing techniques to interfere with the original microstructure prevents a systematic evaluation of the correlation between the microstructure and local electrochemical reactivity. In this work, the spatial variability of irreversible nucleation processes of Li on a Li-ion conductive glass-ceramics surface is studied with ~30 nm resolution. An increased nucleation rate at the boundaries between the crystalline AlPO4 phase and amorphous matrix is observed and attributed to Li segregation. This study opens a pathway for probing mechanisms at the level of single structural defects and elucidation of electrochemical activities in nanoscale volumes. PMID:23563856

  4. Pneumatic Regolith Transfer Systems for In-Situ Resource Utilization

    NASA Technical Reports Server (NTRS)

    Mueller, Robert P.; Townsend, Ivan I., III; Mantovani, James G.

    2010-01-01

    One aspect of In-Situ Resource Utilization (ISRU) in a lunar environment is to extract oxygen and other elements from the minerals that make up the lunar regolith. Typical ISRU oxygen production processes include, but are not limited to, hydrogen reduction, carbothermal reduction, and molten oxide electrolysis. All of these processes require the transfer of regolith from a supply hopper into a reactor for chemical reaction processing, and the subsequent extraction of the reacted regolith from the reactor. This paper will discuss recent activities in the NASA ISRU project aimed at developing pneumatic conveying methods to achieve lunar regolith simulant transfer under 1-g and 1/6-g gravitational environments. Examples will be given of hardware that has been developed and tested by NASA on reduced-gravity flights. Lessons learned and details of pneumatic regolith transfer systems will be examined, as well as their relative performance in a 1/6-g environment.

  5. Quality By Design: Concept To Applications.

    PubMed

    Swain, Suryakanta; Padhy, Rabinarayan; Jena, Bikash Ranjan; Babu, Sitty Manohar

    2018-03-08

    Quality by Design (QbD) is a modern, systematic, and scientific approach built around predefined objectives: it emphasizes product and process understanding and leads to process control. It addresses the design and improvement of the product and the manufacturing process so that the predefined quality characteristics of the final product are met. It is essential to identify the desired and required product performance profile, including the Target Product Profile, the Quality Target Product Profile (QTPP), and the Critical Quality Attributes (CQAs). This review highlights the QbD design space for the critical material attributes (CMAs) and critical process parameters (CPPs) that can affect the CQAs, within which the process shall remain robust and consistently manufacture the required product. Risk assessment tools and design of experiments are its prime components. This paper outlines the basic knowledge of QbD and briefly presents its key elements, steps, and the various tools for QbD implementation in the pharmaceutics field. In addition, a number of applications of QbD in pharmaceutical unit operations are discussed and summarized. The article provides a complete account of, and a road map for, the universal implementation and application of QbD for pharmaceutical products.

  6. Quadratic Polynomial Regression using Serial Observation Processing:Implementation within DART

    NASA Astrophysics Data System (ADS)

    Hodyss, D.; Anderson, J. L.; Collins, N.; Campbell, W. F.; Reinecke, P. A.

    2017-12-01

    Many ensemble-based Kalman filtering (EBKF) algorithms process the observations serially. Serial observation processing views the data assimilation process as an iterative sequence of scalar update equations. This data assimilation algorithm is useful because it has very low memory requirements and does not need complex methods to perform the typical high-dimensional inverse calculation of many other algorithms. Recently, the push has been towards the prediction, and therefore the assimilation of observations, for regions and phenomena for which high resolution is required and/or highly nonlinear physical processes are operating. For these situations, a basic hypothesis is that the use of the EBKF is sub-optimal and performance gains could be achieved by accounting for aspects of the non-Gaussianity. To this end, we develop here a new component of the Data Assimilation Research Testbed [DART] to allow a wide variety of users to test this hypothesis. This new version of DART allows one to run several variants of the EBKF as well as several variants of the quadratic polynomial filter using the same forecast model and observations. Differences between the results of the two systems will then highlight the degree of non-Gaussianity in the system being examined. We will illustrate in this work the differences between the performance of linear versus quadratic polynomial regression in a hierarchy of models from Lorenz-63 to a simple general circulation model.
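
    The scalar-update structure described above is easy to sketch. The following minimal stochastic-EnKF example (an illustration under simplifying assumptions, not DART code) assimilates observations one at a time, so no high-dimensional matrix inverse is ever required:

      import numpy as np

      rng = np.random.default_rng(1)
      Ne, Nx = 50, 3                              # ensemble size, state dimension
      X = rng.normal(0.0, 1.0, size=(Ne, Nx))     # prior (forecast) ensemble
      obs = np.array([0.5, -0.2, 1.0])            # one scalar observation per variable
      obs_var = np.array([0.1, 0.1, 0.1])
      H = np.eye(Nx)                              # linear observation operator

      for j in range(len(obs)):                   # serial loop: one observation at a time
          y = X @ H[j]                            # predicted observation, per member
          y_var = np.var(y, ddof=1)
          # scalar gain per state variable: cov(x_i, y) / (var(y) + obs error)
          K = np.array([np.cov(X[:, i], y, ddof=1)[0, 1] for i in range(Nx)])
          K /= (y_var + obs_var[j])
          # stochastic EnKF: perturb the observation independently for each member
          y_pert = obs[j] + rng.normal(0.0, np.sqrt(obs_var[j]), Ne)
          X += np.outer(y_pert - y, K)            # scalar update, no matrix inverse

      print("posterior ensemble mean:", X.mean(axis=0))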

  7. Headspace-SPME-GC/MS as a simple cleanup tool for sensitive 2,6-diisopropylphenol analysis from lipid emulsions and adaptable to other matrices.

    PubMed

    Pickl, Karin E; Adamek, Viktor; Gorges, Roland; Sinner, Frank M

    2011-07-15

    Due to increased regulatory requirements, the interaction of active pharmaceutical ingredients with various surfaces and solutions during production and storage is gaining interest in the pharmaceutical research field, in particular with respect to the development of new formulations, new packaging materials and the evaluation of cleaning processes. Experimental adsorption/absorption studies as well as the study of cleaning processes require sophisticated analytical methods with high sensitivity for the drug of interest. In the case of 2,6-diisopropylphenol - a small lipophilic drug which is typically formulated as a lipid emulsion for intravenous injection - a highly sensitive method in the concentration range of μg/l suitable to be applied to a variety of different sample matrices including lipid emulsions is needed. We hereby present a headspace solid-phase microextraction (HS-SPME) approach as a simple cleanup procedure for sensitive 2,6-diisopropylphenol quantification from diverse matrices, choosing a lipid emulsion as the most challenging matrix with regard to complexity. By combining the simple and straightforward HS-SPME sample pretreatment with an optimized GC-MS quantification method, a robust and sensitive method for 2,6-diisopropylphenol was developed. This method shows excellent sensitivity in the low μg/l concentration range (5-200 μg/l), good accuracy (94.8-98.8%) and precision (intra-day precision 0.1-9.2%, inter-day precision 2.0-7.7%). The method can be easily adapted to other, less complex, matrices such as water or swab extracts. Hence, the presented method holds the potential to serve as a single and simple analytical procedure for 2,6-diisopropylphenol analysis in various types of samples such as required in, e.g., adsorption/absorption studies, which typically deal with a variety of different surfaces (steel, plastic, glass, etc.) and solutions/matrices including lipid emulsions.

  8. A Comparison of Photocatalytic Oxidation Reactor Performance for Spacecraft Cabin Trace Contaminant Control Applications

    NASA Technical Reports Server (NTRS)

    Perry, Jay L.; Frederick, Kenneth R.; Scott, Joseph P.; Reinermann, Dana N.

    2011-01-01

    Photocatalytic oxidation (PCO) is a maturing process technology that shows potential for spacecraft life support system application. Incorporating PCO into a spacecraft cabin atmosphere revitalization system requires an understanding of basic performance, particularly with regard to partial oxidation product production. Four PCO reactor design concepts have been evaluated for their effectiveness for mineralizing key trace volatile organic compounds (VOC) typically observed in crewed spacecraft cabin atmospheres. Mineralization efficiency and selectivity for partial oxidation products are compared for the reactor design concepts. The role of PCO in a spacecraft's life support system architecture is discussed.

  9. Prototype wash water renovation system integration with government-furnished wash fixture

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The requirement of a significant number of proposed life sciences experiments in Shuttle payloads for available wash water to support cleansing operations has provided the incentive to develop a technique for wash water renovation. A prototype wash water renovation system with the capability to process the waste water and return it to a state adequate for reuse in a typical cleansing fixture designed to support life science experiments was investigated. The resulting technology is to support other development efforts pertaining to water reclamation by serving as a pretreatment step for subsequent reclamation procedures.

  10. Research Spotlight: New method to assess coral reef health

    NASA Astrophysics Data System (ADS)

    Tretkoff, Ernie

    2011-03-01

    Coral reefs around the world are becoming stressed due to rising temperatures, ocean acidification, overfishing, and other factors. Measuring community-level rates of photosynthesis, respiration, and biogenic calcification is essential to assessing the health of coral reef ecosystems because the balance between these processes determines the potential for reef growth and the export of carbon. Measurements of biological productivity have typically been made by tracing changes in dissolved oxygen in seawater as it passes over a reef. However, this is a labor-intensive and difficult method, requiring repeated measurements. (Geophysical Research Letters, doi:10.1029/2010GL046179, 2011)

  11. Optimum spaceborne computer system design by simulation

    NASA Technical Reports Server (NTRS)

    Williams, T.; Kerner, H.; Weatherbee, J. E.; Taylor, D. S.; Hodges, B.

    1973-01-01

    A deterministic simulator is described which models the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. Its use as a tool to study and determine the minimum computer system configuration necessary to satisfy the on-board computational requirements of a typical mission is presented. The paper describes how the computer system configuration is determined in order to satisfy the data processing demand of the various shuttle booster subsystems. The configuration which is developed as a result of studies with the simulator is optimal with respect to the efficient use of computer system resources.

  12. Fully synthetic taped insulation cables

    DOEpatents

    Forsyth, E.B.; Muller, A.C.

    1983-07-15

    The present invention is a cable which, although constructed from inexpensive polyolefin tapes and using typical impregnating oils, furnishes high voltage capability up to 765 kV, and has such excellent dielectric characteristics and heat transfer properties that it is capable of operation at capacities equal to or higher than presently available cables at a given voltage. This is accomplished by using polyethylene, polybutene or polypropylene insulating tape which has been specially processed to attain properties which are not generally found in these materials, but are required for their use in impregnated electrical cables. Chief among these properties is compatibility with impregnating oil.

  13. Automation of the Image Analysis for Thermographic Inspection

    NASA Technical Reports Server (NTRS)

    Plotnikov, Yuri A.; Winfree, William P.

    1998-01-01

    Several data processing procedures for pulse thermal inspection require preliminary determination of an unflawed region. Typically, an initial analysis of the thermal images is performed by an operator to determine the locations of unflawed and defective areas. In the present work, an algorithm is developed for automatically determining a reference point corresponding to an unflawed region. Results are obtained for defects that are arbitrarily located in the inspection region. A comparison is presented of the distributions of derived values under correct and incorrect localization of the reference point. Different algorithms for automatic determination of the reference point are compared.
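
    One simple way to automate the reference-point choice is sketched below, under the assumption that defects distort the local cooling curve (an illustration, not necessarily the paper's algorithm): select the pixel whose time history is closest to the image-wide median response.

      import numpy as np

      rng = np.random.default_rng(2)
      T, H, W = 30, 16, 16                      # frames, image height/width
      # synthetic cooling sequence with a slow-cooling defect patch
      seq = np.exp(-np.linspace(0.1, 3.0, T))[:, None, None] * np.ones((T, H, W))
      seq[:, 4:8, 4:8] *= 1.4                   # defect distorts local response
      seq += rng.normal(0, 0.01, seq.shape)     # measurement noise

      # reference pixel = smallest total deviation from the median cooling curve
      median_curve = np.median(seq.reshape(T, -1), axis=1)
      dev = np.abs(seq - median_curve[:, None, None]).sum(axis=0)
      ref = np.unravel_index(np.argmin(dev), (H, W))
      print("reference (unflawed) pixel:", ref)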

  14. SEM evaluation of metallization on semiconductors. [Scanning Electron Microscope

    NASA Technical Reports Server (NTRS)

    Fresh, D. L.; Adolphsen, J. W.

    1974-01-01

    A test method for the evaluation of metallization on semiconductors is presented and discussed. The method has been prepared in MIL-STD format for submittal as a proposed addition to MIL-STD-883. It is applicable to discrete devices and to integrated circuits and specifically addresses batch-process oriented defects. Quantitative accept/reject criteria are given for contact windows, other oxide steps, and general interconnecting metallization. Figures are provided that illustrate typical types of defects. Apparatus specifications, sampling plans, and specimen preparation and examination requirements are described. Procedures for glassivated devices and for multi-metal interconnection systems are included.

  15. Parallelization and visual analysis of multidimensional fields: Application to ozone production, destruction, and transport in three dimensions

    NASA Technical Reports Server (NTRS)

    Schwan, Karsten

    1994-01-01

    Atmospheric modeling is a grand challenge problem for several reasons, including its inordinate computational requirements and its generation of large amounts of data concurrent with its use of very large data sets derived from measurement instruments like satellites. In addition, atmospheric models are typically run several times, on new data sets or to reprocess existing data sets, to investigate or reinvestigate specific chemical or physical processes occurring in the earth's atmosphere, to understand model fidelity with respect to observational data, or simply to experiment with specific model parameters or components.

  16. Partial information decomposition as a spatiotemporal filter.

    PubMed

    Flecker, Benjamin; Alford, Wesley; Beggs, John M; Williams, Paul L; Beer, Randall D

    2011-09-01

    Understanding the mechanisms of distributed computation in cellular automata requires techniques for characterizing the emergent structures that underlie information processing in such systems. Recently, techniques from information theory have been brought to bear on this problem. Building on this work, we utilize the new technique of partial information decomposition to show that previous information-theoretic measures can confound distinct sources of information. We then propose a new set of filters and demonstrate that they more cleanly separate out the background domains, particles, and collisions that are typically associated with information storage, transfer, and modification in cellular automata.
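
    The redundancy measure underlying partial information decomposition can be computed directly from a joint distribution. The self-contained sketch below implements the Williams-Beer I_min for two sources and a target and evaluates it on XOR, where all of the mutual information is synergistic (illustrative code, not the filters used in the paper):

      from itertools import product
      from math import log2

      # joint distribution p(s1, s2, t) for t = s1 XOR s2, uniform inputs
      p = {(s1, s2, s1 ^ s2): 0.25 for s1, s2 in product((0, 1), repeat=2)}

      def marginal(keep):
          out = {}
          for (s1, s2, t), pr in p.items():
              k = tuple((s1, s2, t)[i] for i in keep)
              out[k] = out.get(k, 0.0) + pr
          return out

      p_t = marginal((2,))

      def specific_info(src):        # I(T = t; S_src) in the Williams-Beer sense
          p_s = marginal((src,))
          p_st = marginal((src, 2))
          si = {}
          for t in (0, 1):
              acc = 0.0
              for s in (0, 1):
                  joint = p_st.get((s, t), 0.0)
                  if joint > 0:
                      p_s_given_t = joint / p_t[(t,)]
                      acc += p_s_given_t * log2(joint / (p_s[(s,)] * p_t[(t,)]))
              si[t] = acc
          return si

      si1, si2 = specific_info(0), specific_info(1)
      i_min = sum(p_t[(t,)] * min(si1[t], si2[t]) for t in (0, 1))
      print(f"I_min (redundancy) for XOR: {i_min:.3f} bits")  # 0.0: purely synergistic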

  17. Proof Rules for Automated Compositional Verification through Learning

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Giannakopoulou, Dimitra; Pasareanu, Corina S.

    2003-01-01

    Compositional proof systems not only enable the stepwise development of concurrent processes but also provide a basis to alleviate the state explosion problem associated with model checking. An assume-guarantee style of specification and reasoning has long been advocated to achieve compositionality. However, this style of reasoning is often non-trivial, typically requiring human input to determine appropriate assumptions. In this paper, we present novel assume-guarantee rules in the setting of finite labelled transition systems with blocking communication. We show how these rules can be applied in an iterative and fully automated fashion within a framework based on learning.
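
    For reference, the classic non-circular assume-guarantee rule that such frameworks automate has the following shape (standard background, not quoted from the paper; the paper's rules for blocking communication generalize this):

      \[
      \frac{\langle A \rangle\, M_1\, \langle P \rangle \qquad
            \langle \mathit{true} \rangle\, M_2\, \langle A \rangle}
           {\langle \mathit{true} \rangle\, M_1 \parallel M_2\, \langle P \rangle}
      \]

    In the learning-based setting, the assumption A is not supplied by a human but is iteratively refined (e.g., with an L*-style learner) until both premises hold or a genuine counterexample is produced.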

  18. A Prototype System for Retrieval of Gene Functional Information

    PubMed Central

    Folk, Lillian C.; Patrick, Timothy B.; Pattison, James S.; Wolfinger, Russell D.; Mitchell, Joyce A.

    2003-01-01

    Microarrays allow researchers to gather data about the expression patterns of thousands of genes simultaneously. Statistical analysis can reveal which genes show statistically significant results. Making biological sense of those results requires the retrieval of functional information about the genes thus identified, typically a manual gene-by-gene retrieval of information from various on-line databases. For experiments generating thousands of genes of interest, retrieval of functional information can become a significant bottleneck. To address this issue, we are currently developing a prototype system to automate the process of retrieval of functional information from multiple on-line sources. PMID:14728346
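
    The kind of automated lookup described above can be built on public web services. A minimal sketch using the NCBI E-utilities esummary endpoint follows (the gene IDs are placeholders, and this is an assumed illustration rather than the prototype's actual architecture):

      import json
      import urllib.request

      BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi"
      gene_ids = ["7157", "672"]                 # placeholder Entrez Gene IDs

      # one batched request replaces manual, gene-by-gene web retrieval
      url = f"{BASE}?db=gene&id={','.join(gene_ids)}&retmode=json"
      with urllib.request.urlopen(url) as resp:
          summaries = json.load(resp)["result"]

      for gid in gene_ids:
          rec = summaries[gid]                   # summary record for this gene
          print(gid, rec.get("name"), "-", rec.get("description"))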

  19. Managing Space System Faults: Coalescing NASA's Views

    NASA Technical Reports Server (NTRS)

    Muirhead, Brian; Fesq, Lorraine

    2012-01-01

    Managing faults and their resultant failures is a fundamental and critical part of developing and operating aerospace systems. Yet, recent studies have shown that the engineering "discipline" required to manage faults is not widely recognized nor evenly practiced within the NASA community. Attempts to simply name this discipline in recent years has been fraught with controversy among members of the Integrated Systems Health Management (ISHM), Fault Management (FM), Fault Protection (FP), Hazard Analysis (HA), and Aborts communities. Approaches to managing space system faults typically are unique to each organization, with little commonality in the architectures, processes and practices across the industry.

  20. [Q:] When would you prefer a SOSSAGE to a SAUSAGE? [A:] At about 100 msec. ERP correlates of orthographic typicality and lexicality in written word recognition.

    PubMed

    Hauk, O; Patterson, K; Woollams, A; Watling, L; Pulvermüller, F; Rogers, T T

    2006-05-01

    Using a speeded lexical decision task, event-related potentials (ERPs), and minimum norm current source estimates, we investigated early spatiotemporal aspects of cortical activation elicited by words and pseudo-words that varied in their orthographic typicality, that is, in the frequency of their component letter pairs (bi-grams) and triplets (tri-grams). At around 100 msec after stimulus onset, the ERP pattern revealed a significant typicality effect, where words and pseudo-words with atypical orthography (e.g., yacht, cacht) elicited stronger brain activation than items characterized by typical spelling patterns (cart, yart). At approximately 200 msec, the ERP pattern revealed a significant lexicality effect, with pseudo-words eliciting stronger brain activity than words. The two main factors interacted significantly at around 160 msec, where words showed a typicality effect but pseudo-words did not. The principal cortical sources of the effects of both typicality and lexicality were localized in the inferior temporal cortex. Around 160 msec, atypical words elicited the stronger source currents in the left anterior inferior temporal cortex, whereas the left perisylvian cortex was the site of greater activation to typical words. Our data support distinct but interactive processing stages in word recognition, with surface features of the stimulus being processed before the word as a meaningful lexical entry. The interaction of typicality and lexicality can be explained by integration of information from the early form-based system and lexicosemantic processes.
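
    The bigram-frequency notion of orthographic typicality is straightforward to operationalize. The toy sketch below (illustrative only; the study derived its counts from a real lexical corpus) scores a letter string by the summed frequency of its component bigrams in a small word list, so SAUSAGE outscores SOSSAGE:

      from collections import Counter

      lexicon = ["sausage", "cart", "yacht", "message", "passage", "carton",
                 "charge", "casual", "cause", "usage"]   # tiny stand-in corpus

      # count all adjacent letter pairs in the lexicon
      bigrams = Counter(w[i:i + 2] for w in lexicon for i in range(len(w) - 1))

      def typicality(word):
          """Summed corpus frequency of the word's bigrams."""
          return sum(bigrams[word[i:i + 2]] for i in range(len(word) - 1))

      for w in ["sausage", "sossage", "cart", "yart"]:
          print(w, typicality(w))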

  1. The Development of Individuation in Autism

    ERIC Educational Resources Information Center

    O'Hearn, Kirsten; Franconeri, Steven; Wright, Catherine; Minshew, Nancy; Luna, Beatriz

    2013-01-01

    Evidence suggests that people with autism rely less on holistic visual information than typical adults. The current studies examine this by investigating core visual processes that contribute to holistic processing--namely, individuation and element grouping--and how they develop in participants with autism and typically developing (TD)…

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lebarbier Dagel, Vanessa M.; Li, J.; Taylor, Charles E.

    This collaborative joint research project is in the area of advanced gasification and conversion, within the Chinese Academy of Sciences (CAS)-National Energy Technology Laboratory (NETL)-Pacific Northwest National Laboratory (PNNL) Memorandum of Understanding. The goal for this subtask is the development of advanced syngas conversion technologies. Two areas of investigation were evaluated. Sorption-Enhanced Synthetic Natural Gas Production from Syngas: The conversion of synthesis gas (syngas) to synthetic natural gas (SNG) is typically catalyzed by nickel catalysts at moderate temperatures (275 to 325°C). The reaction is highly exothermic and substantial heat is liberated, which can lead to process thermal imbalance and destruction of the catalyst. As a result, conversion per pass is typically limited, and substantial syngas recycle is employed. Commercial methanation catalysts and processes have been developed by Haldor Topsoe, and in some reports they have indicated that there is a need and opportunity for thermally more robust methanation catalysts to allow for higher per-pass conversion in methanation units. The SNG process requires a syngas feed with a higher H2/CO ratio than typically produced from gasification processes. Therefore, the water-gas shift (WGS) reaction will be required to tailor the H2/CO ratio. Integration with CO2 separation could potentially eliminate the need for a separate WGS unit, thereby integrating WGS, methanation, and CO2 capture into one single unit operation and, consequently, leading to improved process efficiency. The SNG process also has the benefit of producing a product stream with high CO2 concentrations, which makes CO2 separation more readily achievable. The use of either adsorbents or membranes that selectively separate the CO2 from the H2 and CO would shift the methanation reaction (by driving WGS for hydrogen production) and greatly improve the overall efficiency and economics of the process. The scope of this activity was to develop methods and enabling materials for syngas conversion to SNG with ready CO2 separation. Suitable methanation catalyst and CO2 sorbent materials were developed. Successful proof-of-concept for the combined reaction-sorption process was demonstrated, which culminated in a research publication. With this successful demonstration, a decision was made to switch focus to an area of fuels research of more interest to all three research institutions (CAS-NETL-PNNL). Syngas-to-Hydrocarbon Fuels through Higher Alcohol Intermediates: There are two types of processes in syngas conversion to fuels that are attracting R&D interest: 1) syngas conversion to mixed alcohols; and 2) syngas conversion to gasoline via the methanol-to-gasoline process developed by Exxon-Mobil in the 1970s. The focus of this task was to develop a one-step conversion technology by effectively incorporating both processes, which is expected to reduce the capital and operational cost associated with the conversion of coal-derived syngas to liquid fuels. It should be noted that this work did not further study the classic Fischer-Tropsch reaction pathway. Rather, we focused on studies of unique catalyst pathways that involve direct liquid fuel synthesis enabled by oxygenated intermediates. Recent advances made in the area of higher alcohol synthesis, including the novel catalytic composite materials recently developed by CAS using base metal catalysts, were used.
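
    The stoichiometry behind the H2/CO tailoring and the in-situ CO2 removal is worth making explicit (standard reaction chemistry, not reproduced from the report):

      \[
      \mathrm{CO + 3H_2 \rightarrow CH_4 + H_2O} \qquad (\Delta H^{\circ}_{298} \approx -206\ \mathrm{kJ/mol})
      \]
      \[
      \mathrm{CO + H_2O \rightarrow CO_2 + H_2} \qquad (\Delta H^{\circ}_{298} \approx -41\ \mathrm{kJ/mol})
      \]
      \[
      \text{sum:}\quad \mathrm{2CO + 2H_2 \rightarrow CH_4 + CO_2}
      \]

    Running WGS in situ thus lowers the required feed ratio from H2/CO = 3 to 1, and continuously removing the CO2 product pulls both equilibria forward, which is why combining methanation, WGS, and sorption in a single unit can eliminate the separate shift reactor.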

  3. Brief Report: Preliminary Proposal of a Conceptual Model of a Digital Environment for Developing Mathematical Reasoning in Students with Autism Spectrum Disorders.

    PubMed

    Santos, Maria Isabel; Breda, Ana; Almeida, Ana Margarida

    2015-08-01

    There is clear evidence that in typically developing children reasoning and sense-making are essential in all mathematical learning and understanding processes. In children with autism spectrum disorders (ASD), however, these become much more significant, considering their importance to successful independent living. This paper presents a preliminary proposal of a digital environment, specifically targeted to promote the development of mathematical reasoning in students with ASD. Given the diversity of ASD, the prototyping of this environment requires the study of dynamic adaptation processes and the development of activities adjusted to each user's profile. We present the results obtained during the first phase of this ongoing research, describing a conceptual model of the proposed digital environment. Guidelines for future research are also discussed.

  4. Neural Correlates of Reflection on Present and Past Selves in Autism Spectrum Disorder.

    PubMed

    Cygan, Hanna B; Marchewka, Artur; Kotlewska, Ilona; Nowicka, Anna

    2018-06-05

    Previous studies indicate that autobiographical memory is impaired in individuals with autism spectrum disorder (ASD). Successful recollection of information referring to one's own person requires the intact ability to re-activate representation of the past self. In the current fMRI study we investigated process of conscious reflection on the present self, the past self, and a close-other in the ASD and typically developing groups. Significant inter-group differences were found in the Past-Self condition. In individuals with ASD, reflection on the past self was associated with additional engagement of the posterior cingulate and posterior temporal structures. We hypothesize that this enhanced activation of widely distributed neural network reflects substantial difficulties in processes of reflection on one's own person in the past.

  5. PHOTONICS AND NANOTECHNOLOGY Laser-induced modification of transparent crystals and glasses

    NASA Astrophysics Data System (ADS)

    Bulgakova, N. M.; Stoian, Razvan; Rosenfeld, A.

    2010-12-01

    We analyse the processes taking place in transparent crystals and glasses irradiated by ultrashort laser pulses in the regimes typical of various applications in optoelectronics and photonics. We consider some phenomena, which have been previously described by the authors within the different model representations: charging of the dielectric surface due to electron photoemission resulting in a Coulomb explosion; crater shaping by using an adaptive control of the laser pulse shape; optimisation of the waveguide writing in materials strongly resistant to laser-induced compaction under ordinary irradiation conditions. The developed models and analysis of the processes relying on these models include the elements of the solid-state physics, plasma physics, thermodynamics, theory of elasticity and plasticity. Some important experimental observations which require explanations and adequate description are summarised.

  6. In-situ activation of CuO/ZnO/Al2O3 catalysts in the liquid phase

    DOEpatents

    Brown, Dennis M.; Hsiung, Thomas H.; Rao, Pradip; Roberts, George W.

    1989-01-01

    The present invention relates to a method of activation of a CuO/ZnO/Al2O3 catalyst slurried in a chemically inert liquid. Successful activation of the catalyst requires the use of a process in which the temperature of the system at any time is not allowed to exceed a certain critical value, which is a function of the specific hydrogen uptake of the catalyst at that same time. This process is especially critical for activating highly concentrated catalyst slurries, typically 25 to 50 wt %. Activation of slurries of CuO/ZnO/Al2O3 catalyst is useful in carrying out the liquid phase methanol or the liquid phase shift reactions.

  7. VIII Workshop on Catastrophic Disruption in the Solar System

    NASA Astrophysics Data System (ADS)

    Michel, Patrick; Nakamura, Akiko M.; Bagatin, Adriano Campo

    2015-03-01

    The Catastrophic Disruption (CD) Workshops have become a tradition for the various communities interested in collisional processes. The first one was organized in 1985 by the late Prof. Paolo Farinella from the University of Pisa and his colleague Paolo Paolicchi, who understood the fundamental importance of collisional processes in the history of the Solar System. It was followed by subsequent workshops in Belgrade (Serbia, 1987), Kyoto (Japan, 1990), Gubbio (Italy, 1993), the Timberline Lodge (Oregon, USA, 1998), Cannes (France, 2003) and Alicante (Spain, 2007). The CD workshops are typically separated by 3-6 years, allowing time for substantive advances in the field and motivating the relevant scientific community to get together to discuss new results and evolving directions in the field.

  8. Toward best practices in data processing and analysis for intact biotherapeutics by MS in quantitative bioanalysis.

    PubMed

    Kellie, John F; Kehler, Jonathan R; Karlinsey, Molly Z; Summerfield, Scott G

    2017-12-01

    Typically, quantitation of biotherapeutics from biological matrices by LC-MS is based on a surrogate peptide approach to determine molecule concentration. Recent efforts have focused on quantitation of the intact protein molecules or larger mass subunits of monoclonal antibodies. To date, there has been limited guidance for large or intact protein mass quantitation for quantitative bioanalysis. Intact- and subunit-level analyses of biotherapeutics from biological matrices are performed in the 12-25 kDa mass range, with quantitation data presented. Linearity, bias and other metrics are presented along with recommendations made on the viability of existing quantitation approaches. This communication is intended to start a discussion around intact protein data analysis and processing, recognizing that other published contributions will be required.

  9. Open LED Illuminator: A Simple and Inexpensive LED Illuminator for Fast Multicolor Particle Tracking in Neurons

    PubMed Central

    Bosse, Jens B.; Tanneti, Nikhila S.; Hogue, Ian B.; Enquist, Lynn W.

    2015-01-01

    Dual-color live cell fluorescence microscopy of fast intracellular trafficking processes, such as axonal transport, requires rapid switching of illumination channels. Typical broad-spectrum sources necessitate the use of mechanical filter switching, which introduces delays between acquisition of different fluorescence channels, impeding the interpretation and quantification of highly dynamic processes. Light Emitting Diodes (LEDs), however, allow modulation of excitation light in microseconds. Here we provide a step-by-step protocol to enable any scientist to build a research-grade LED illuminator for live cell microscopy, even without prior experience with electronics or optics. We quantify and compare components, discuss our design considerations, and demonstrate the performance of our LED illuminator by imaging axonal transport of herpes virus particles with high temporal resolution. PMID:26600461

  10. Transforming BIM to BEM: Generation of Building Geometry for the NASA Ames Sustainability Base BIM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Donnell, James T.; Maile, Tobias; Rose, Cody

    Typical processes of whole Building Energy simulation Model (BEM) generation are subjective, labor intensive, time intensive and error prone. Essentially, these typical processes reproduce already existing data, i.e. building models already created by the architect. Accordingly, Lawrence Berkeley National Laboratory (LBNL) developed a semi-automated process that enables reproducible conversions of Building Information Model (BIM) representations of building geometry into a format required by building energy modeling (BEM) tools. This is a generic process that may be applied to all building energy modeling tools but to date has only been used for EnergyPlus. This report describes and demonstrates each stage in the semi-automated process for building geometry using the recently constructed NASA Ames Sustainability Base throughout. This example uses ArchiCAD (Graphisoft, 2012) as the originating CAD tool and EnergyPlus as the concluding whole building energy simulation tool. It is important to note that the process is also applicable for professionals that use other CAD tools such as Revit (“Revit Architecture,” 2012) and DProfiler (Beck Technology, 2012) and can be extended to provide geometry definitions for BEM tools other than EnergyPlus. Geometry Simplification Tool (GST) was used during the NASA Ames project and was the enabling software that facilitated semi-automated data transformations. GST has now been superseded by Space Boundary Tool (SBT-1) and will be referred to as SBT-1 throughout this report. The benefits of this semi-automated process are fourfold: 1) reduce the amount of time and cost required to develop a whole building energy simulation model, 2) enable rapid generation of design alternatives, 3) improve the accuracy of BEMs and 4) result in significantly better performing buildings with significantly lower energy consumption than those created using the traditional design process, especially if the simulation model was used as a predictive benchmark during operation. Developing BIM based criteria to support the semi-automated process should result in significant reliable improvements and time savings in the development of BEMs. In order to define successful BIMs, CAD export of IFC based BIMs for BEM must adhere to a standard Model View Definition (MVD) for simulation as provided by the concept design BIM MVD (buildingSMART, 2011). In order to ensure wide scale adoption, companies would also need to develop their own material libraries to support automated activities and undertake a pilot project to improve understanding of modeling conventions and design tool features and limitations.

  11. The effect of requirements prioritization on avionics system conceptual design

    NASA Astrophysics Data System (ADS)

    Lorentz, John

    This dissertation will provide a detailed approach and analysis of a new collaborative requirements prioritization methodology that has been used successfully on four Coast Guard avionics acquisition and development programs valued at $400M+. A statistical representation of participant study results will be discussed and analyzed in detail. Many technically compliant projects fail to deliver levels of performance and capability that the customer desires. Some of these systems completely meet "threshold" levels of performance; however, the distribution of resources in the process devoted to the development and management of the requirements does not always represent the voice of the customer. This is especially true for technically complex projects such as modern avionics systems. A simplified facilitated process for prioritization of system requirements will be described. The collaborative prioritization process, and resulting artifacts, aids the systems engineer during early conceptual design. All requirements are not the same in terms of customer priority. While there is a tendency to have many thresholds inside of a system design, there is usually a subset of requirements and system performance that is of the utmost importance to the design. These critical capabilities and critical levels of performance typically represent the reason the system is being built. The systems engineer needs processes to identify these critical capabilities, the associated desired levels of performance, and the risks associated with the specific requirements that define the critical capability. The facilitated prioritization exercise is designed to collaboratively draw out these critical capabilities and levels of performance so they can be emphasized in system design. Developing the purpose, scheduling and process for prioritization events are key elements of systems engineering and modern project management. The benefits of early collaborative prioritization flow throughout the project schedule, resulting in greater success during system deployment and operational testing. This dissertation will discuss the data and findings from participant studies, present a literature review of systems engineering and design processes, and test the hypothesis that the prioritization process had no effect on stakeholder sentiment related to the conceptual design. In addition, the "Requirements Rationalization" process will be discussed in detail. Avionics, like many other systems, has transitioned from a discrete electronics engineering, hard engineering discipline to incorporate software engineering as a core process of the technology development cycle. As with other software-based systems, avionics now has significant soft system attributes that must be considered in the design process. The boundless opportunities that exist in software design demand prioritization to focus effort onto the critical functions that the software must provide. This has been a well documented and understood phenomenon in the software development community for many years. This dissertation will attempt to link the effect of software integrated avionics to the benefits of prioritization of requirements in the problem space and demonstrate the sociological and technical benefits of early prioritization practices.

  12. Orbiter Kapton wire operational requirements and experience

    NASA Technical Reports Server (NTRS)

    Peterson, R. V.

    1994-01-01

    The agenda of this presentation includes the Orbiter wire selection requirements, the Orbiter wire usage, fabrication and test requirements, typical wiring installations, Kapton wire experience, NASA Kapton wire testing, summary, and backup data.

  13. Femtosecond laser direct-write of optofluidics in polymer-coated optical fiber

    NASA Astrophysics Data System (ADS)

    Joseph, Kevin A. J.; Haque, Moez; Ho, Stephen; Aitchison, J. Stewart; Herman, Peter R.

    2017-03-01

    Multifunctional lab-in-fiber technology seeks to translate the accomplishments of optofluidic, lab-on-chip devices into silica fibers: a robust, flexible, and ubiquitous optical communication platform that can underpin the 'Internet of Things' with distributed sensors, or enable lab-on-chip functions deep inside our bodies. Femtosecond lasers have driven significant advances in three-dimensional processing, enabling optical circuits, microfluidics, and micro-mechanical structures to be formed around the core of the fiber. However, such processing typically requires the stripping and recoating of the polymer buffer or jacket, increasing processing time and mechanically weakening the device. This paper reports on a comprehensive assessment of laser damage in urethane-acrylate-coated fiber. The results show that a sufficient processing window is available for femtosecond laser processing of the fiber without damaging the polymer jacket. The fiber core, cladding, and buffer could be simultaneously processed without removal of the buffer jacket. Three-dimensional lab-in-fiber devices were successfully fabricated by distortion-free immersion-lens focusing, presenting fiber-cladding optical circuits and progress towards chemically etched channels, microfluidic cavities, and MEMS structures inside buffer-coated fiber.

  14. Potential use of advanced process control for safety purposes during attack of a process plant.

    PubMed

    Whiteley, James R

    2006-03-17

    Many refineries and commodity chemical plants employ advanced process control (APC) systems to improve throughputs and yields. These APC systems utilize empirical process models for control purposes and enable operation closer to constraints than can be achieved with traditional PID regulatory feedback control. Substantial economic benefits are typically realized from the addition of APC systems. This paper considers leveraging the control capabilities of existing APC systems to minimize the potential impact of a terrorist attack on a process plant (e.g., petroleum refinery). Two potential uses of APC are described. The first is a conventional application of APC and involves automatically moving the process to a reduced operating rate when an attack first begins. The second is a non-conventional application and involves reconfiguring the APC system to optimize safety rather than economics. The underlying intent in both cases is to reduce the demands on the operator to allow focus on situation assessment and optimal response planning. An overview of APC is provided along with a brief description of the modifications required for the proposed new applications of the technology.
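
    The reconfiguration idea can be illustrated with a small supervisory routine (a hypothetical sketch; real APC packages expose this differently) that, once an attack is declared, ramps the controller targets from the economic optimum toward a conservatively safe operating state:

      from dataclasses import dataclass

      @dataclass
      class Targets:
          throughput: float      # feed rate setpoint (t/h)
          pressure: float        # reactor pressure setpoint (bar)

      ECONOMIC = Targets(throughput=120.0, pressure=28.0)   # near constraints
      SAFE     = Targets(throughput=40.0,  pressure=15.0)   # well inside envelope

      def supervisor(attack_declared: bool, ramp_fraction: float,
                     current: Targets) -> Targets:
          """Blend current targets toward the safe state over a ramp (0..1)."""
          goal = SAFE if attack_declared else ECONOMIC
          f = min(max(ramp_fraction, 0.0), 1.0)
          return Targets(
              throughput=current.throughput + f * (goal.throughput - current.throughput),
              pressure=current.pressure + f * (goal.pressure - current.pressure),
          )

      # Example: attack declared, 25% of the ramp elapsed
      print(supervisor(True, 0.25, ECONOMIC))

    Ramping rather than stepping the targets respects the same move-suppression logic APC systems already use, so the underlying model-based controller stays within its validated operating region while the operator focuses on situation assessment.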

  15. Fast approach for toner saving

    NASA Astrophysics Data System (ADS)

    Safonov, Ilia V.; Kurilin, Ilya V.; Rychagov, Michael N.; Lee, Hokeun; Kim, Sangho; Choi, Donchul

    2011-01-01

    Reducing toner consumption is an important task in modern printing devices and has a significant positive ecological impact. Existing toner-saving approaches have two main drawbacks: the appearance of the hardcopy in toner-saving mode is worse than in normal mode, and processing of the whole rendered page bitmap incurs significant computational costs. We propose to add small holes of various shapes and sizes at random places inside a character bitmap stored in the font cache. This random perforation scheme is based on the processing pipeline in the RIP of the standard printer languages PostScript and PCL. Processing text characters only, and moreover processing each character for a given font and size only once, is an extremely fast procedure. The approach does not degrade halftoned bitmaps or business graphics and provides toner savings of up to 15-20% for typical office documents. The rate of toner saving is adjustable. The alteration of the resulting characters' appearance is almost indistinguishable from solid black text, owing to the random placement of small holes inside the character regions. The suggested method automatically skips small fonts to preserve their quality. The readability of text processed by the proposed method is good, and OCR programs successfully process scanned hardcopies as well.
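
    The core perforation step is simple to sketch. The following example (illustrative parameters; the actual RIP operates on the printer's font-cache bitmaps) punches random one-pixel holes into the black pixels of a glyph and skips glyphs that are too small:

      import numpy as np

      rng = np.random.default_rng(3)

      def perforate(glyph: np.ndarray, hole_frac=0.15, min_glyph_px=200):
          """Punch random 1-px holes into black pixels; skip small fonts."""
          black = np.argwhere(glyph == 1)
          if len(black) < min_glyph_px:            # preserve small-font quality
              return glyph
          out = glyph.copy()
          idx = rng.choice(len(black), size=int(hole_frac * len(black)),
                           replace=False)
          out[tuple(black[idx].T)] = 0             # holes -> toner not deposited
          return out

      glyph = np.ones((24, 16), dtype=int)          # stand-in "solid" character
      saved = 1 - perforate(glyph).mean()
      print(f"toner saved on this glyph: {saved:.0%}")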

  16. Engineering Complex Embedded Systems with State Analysis and the Mission Data System

    NASA Technical Reports Server (NTRS)

    Ingham, Michel D.; Rasmussen, Robert D.; Bennett, Matthew B.; Moncada, Alex C.

    2004-01-01

    It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer s intent, potentially leading to software errors. This problem is addressed by a systems engineering methodology called State Analysis, which provides a process for capturing system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using State Analysis and how these requirements inform the design of the system software, using representative spacecraft examples.

  17. Automated Ground Umbilical Systems (AGUS) Project

    NASA Technical Reports Server (NTRS)

    Gosselin, Armand M.

    2007-01-01

    All space vehicles require ground umbilical systems for servicing. Servicing requirements can include, but are not limited to, electrical power and control, propellant loading and venting, pneumatic system supply, hazard gas detection and purging, as well as systems checkout capabilities. Of the various types of umbilicals, all require several common subsystems. These typically include an alignment system, a mating and locking system, fluid connectors, electrical connectors, and control/checkout systems. These systems have been designed to various levels of detail based on the needs for manual and/or automation requirements. The Automated Ground Umbilical Systems (AGUS) project is a multi-phase initiative to develop design performance requirements and concepts for launch system umbilicals. The automation aspect minimizes operational time and labor in ground umbilical processing while maintaining reliability. This current phase of the project reviews the design, development, testing and operations of ground umbilicals built for the Saturn, Shuttle, X-33 and Atlas V programs. Based on the design and operations lessons learned from these systems, umbilicals can be optimized for specific applications. The product of this study is a document containing details of existing systems and requirements for future automated umbilical systems with emphasis on design-for-operations (DFO).

  18. SpaceCube v2.0 Space Flight Hybrid Reconfigurable Data Processing System

    NASA Technical Reports Server (NTRS)

    Petrick, Dave

    2014-01-01

    This paper details the design architecture, design methodology, and the advantages of the SpaceCube v2.0 high performance data processing system for space applications. The purpose in building the SpaceCube v2.0 system is to create a superior high performance, reconfigurable, hybrid data processing system that can be used in a multitude of applications including those that require a radiation hardened and reliable solution. The SpaceCube v2.0 system leverages seven years of board design, avionics systems design, and space flight application experiences. This paper shows how SpaceCube v2.0 solves the increasing computing demands of space data processing applications that cannot be attained with a standalone processor approach. The main objective during the design stage is to find a good system balance between power, size, reliability, cost, and data processing capability. These design variables directly impact each other, and it is important to understand how to achieve a suitable balance. This paper will detail how these critical design factors were managed, including the construction of an Engineering Model for an experiment on the International Space Station to test out design concepts. We will describe the designs for the processor card, power card, backplane, and a mission unique interface card. The mechanical design for the box will also be detailed since it is critical in meeting the stringent thermal and structural requirements imposed by the processing system. In addition, the mechanical design uses advanced thermal conduction techniques to solve the internal thermal challenges. The SpaceCube v2.0 processing system is based on an extended version of the 3U cPCI standard form factor where each card is 190 mm x 100 mm in size. The typical power draw of the processor card is 8 to 10 W and scales with application complexity. The SpaceCube v2.0 data processing card features two Xilinx Virtex-5 QV Field Programmable Gate Arrays (FPGA), eight memory modules, a monitor FPGA with analog monitoring, Ethernet, configurable interconnect to the Xilinx FPGAs including gigabit transceivers, and the necessary voltage regulation. The processor board uses a back-to-back design methodology for common parts that maximizes the board real estate available. This paper will show how to meet the IPC 6012B Class 3A standard with a 22-layer board that has two column grid array devices with 1.0 mm pitch. All layout trades such as stack-up options, via selection, and FPGA signal breakout will be discussed with feature size results. The overall board design process will be discussed including parts selection, circuit design, proper signal termination, layout placement and route planning, signal integrity design and verification, and power integrity results. The radiation mitigation techniques will also be detailed including configuration scrubbing options, Xilinx circuit mitigation and FPGA functional monitoring, and memory protection. Finally, this paper will describe how this system is being used to solve the extreme challenges of a robotic satellite servicing mission where typical space-rated processors are not sufficient to meet the intensive data processing requirements. The SpaceCube v2.0 is the main payload control computer and is required to control critical subsystems such as autonomous rendezvous and docking using a suite of vision sensors and object avoidance when controlling two robotic arms.

  19. Dielectrics for long term space exposure and spacecraft charging: A briefing

    NASA Technical Reports Server (NTRS)

    Frederickson, A. R.

    1989-01-01

    Charging of dielectrics is a bulk, not a surface, property. Radiation-driven charge stops within the bulk and is not quickly conducted to the surface. Very large electric fields develop in the bulk due to this stopped charge. At space radiation levels, it typically requires hours or days for the internal electric fields to reach steady state. The resulting electric fields are large enough to produce electrical failure within the insulator. This type of failure is thought to produce nearly all electric discharge anomalies. Radiation also induces bond breakage, creates reactive radicals, displaces atoms and, in general, severely changes the chemistry of the solid-state material. Electric fields can alter this process by reacting with charged species, driving them through the solid. Irradiated polymers often lose as much as a percent of their mass, or more, at exposures typical in space. Very different aging or contaminant emission can be induced by the stopped-charge electric fields. These radiation effects are detailed.
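
    The hours-to-days charging timescale quoted above is consistent with the bulk charge-relaxation (Maxwell) time constant; with illustrative polymer values assumed here (not taken from the briefing) of relative permittivity εr ≈ 3 and dark conductivity σ ≈ 1e-16 S/m:

      \[
      \tau = \frac{\varepsilon_0 \varepsilon_r}{\sigma}
           \approx \frac{(8.85\times 10^{-12}\ \mathrm{F/m})(3)}{10^{-16}\ \mathrm{S/m}}
           \approx 2.7\times 10^{5}\ \mathrm{s} \approx 3\ \mathrm{days},
      \]

    and radiation-induced conductivity, which grows with dose rate, is what ultimately limits the internal field.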

  20. Direct, enantioselective α-alkylation of aldehydes using simple olefins.

    PubMed

    Capacci, Andrew G; Malinowski, Justin T; McAlpine, Neil J; Kuhne, Jerome; MacMillan, David W C

    2017-11-01

    Although the α-alkylation of ketones has already been established, the analogous reaction using aldehyde substrates has proven surprisingly elusive. Despite the structural similarities between the two classes of compounds, the sensitivity and unique reactivity of the aldehyde functionality has typically required activated substrates or specialized additives. Here, we show that the synergistic merger of three catalytic processes-photoredox, enamine and hydrogen-atom transfer (HAT) catalysis-enables an enantioselective α-aldehyde alkylation reaction that employs simple olefins as coupling partners. Chiral imidazolidinones or prolinols, in combination with a thiophenol, iridium photoredox catalyst and visible light, have been successfully used in a triple catalytic process that is temporally sequenced to deliver a new hydrogen and electron-borrowing mechanism. This multicatalytic process enables both intra- and intermolecular aldehyde α-methylene coupling with olefins to construct both cyclic and acyclic products, respectively. With respect to atom and step-economy ideals, this stereoselective process allows the production of high-value molecules from feedstock chemicals in one step while consuming only photons.

  1. Analysis of the control structures for an integrated ethanol processor for proton exchange membrane fuel cell systems

    NASA Astrophysics Data System (ADS)

    Biset, S.; Nieto Deglioumini, L.; Basualdo, M.; Garcia, V. M.; Serra, M.

    The aim of this work is to investigate what would be a good preliminary plantwide control structure for the process of hydrogen production from bioethanol to be used in a proton exchange membrane (PEM) fuel cell, using only steady-state information. The objective is to keep the process at the optimal operating point, that is, performing energy integration to achieve maximum efficiency. Ethanol, produced from renewable feedstocks, feeds a fuel processor investigated for steam reforming, followed by high- and low-temperature shift reactors and preferential oxidation, which are coupled to a polymeric fuel cell. Applying steady-state simulation techniques and using thermodynamic models, the performance of the complete system with two different control structures has been evaluated for the most typical perturbations. A sensitivity analysis for the key process variables, together with the rigorous operability requirements of the fuel cell, is taken into account in defining an acceptable plantwide control structure. This is the first work showing an alternative control structure applied to this kind of process.

  2. 3-d interpolation in object perception: evidence from an objective performance paradigm.

    PubMed

    Kellman, Philip J; Garrigan, Patrick; Shipley, Thomas F; Yin, Carol; Machado, Liana

    2005-06-01

    Object perception requires interpolation processes that connect visible regions despite spatial gaps. Some research has suggested that interpolation may be a 3-D process, but objective performance data and evidence about the conditions leading to interpolation are needed. The authors developed an objective performance paradigm for testing 3-D interpolation and tested a new theory of 3-D contour interpolation, termed 3-D relatability. The theory indicates for a given edge which orientations and positions of other edges in space may be connected to it by interpolation. Results of 5 experiments showed that processing of orientation relations in 3-D relatable displays was superior to processing in 3-D nonrelatable displays and that these effects depended on object formation. 3-D interpolation and 3-D relatability are discussed in terms of their implications for computational and neural models of object perception, which have typically been based on 2-D-orientation-sensitive units.

  3. Lessons learned: design, start-up, and operation of cryogenic systems

    NASA Astrophysics Data System (ADS)

    Bell, W. M.; Bagley, R. E.; Motew, S.; Young, P.-W.

    2014-11-01

    Cryogenic systems involving a pumped cryogenic fluid, such as liquid nitrogen (LN2), require careful design since the cryogen is cold and close to its boiling point. At 1 atmosphere, LN2 boils at 77.4 K (-320.4 F). These systems, typically, are designed to transport the cryogen, use it for process heat removal, or for generation of gas (GN2) for process use. As the design progresses, it is important to consider all aspects of the design including cryogen storage, pressure control and safety relief systems, thermodynamic conditions, equipment and instrument selection, materials, insulation, cooldown, pump start-up, maximum design and minimum flow rates, two-phase flow conditions, heat flow, process control to meet and maintain operating conditions, piping integrity, piping loads on served equipment, warm-up, venting, and shut-down. "Cutting corners" in the design process can result in stalled start-ups, field rework, schedule hits, or operational restrictions. Some of these lessons learned are described in this paper.

  4. Power control electronics for cryogenic instrumentation

    NASA Technical Reports Server (NTRS)

    Ray, Biswajit; Gerber, Scott S.; Patterson, Richard L.; Myers, Ira T.

    1995-01-01

    In order to achieve a high-efficiency, high-density cryogenic instrumentation system, the power processing electronics should be placed in the cold environment along with the sensors and signal-processing electronics. The typical instrumentation system requires low-voltage dc, usually obtained by processing line-frequency ac power. Switch-mode power conversion topologies such as forward, flyback, push-pull, and half-bridge are used for high-efficiency power processing using pulse-width modulation (PWM) or resonant control. This paper presents several PWM and multiresonant power control circuits, implemented using commercially available CMOS and BiCMOS integrated circuits, and their performance at liquid-nitrogen temperature (77 K) as compared to their room-temperature (300 K) performance. The operation of integrated circuits at cryogenic temperatures results in improved performance in terms of increased speed, reduced latch-up susceptibility, reduced leakage current, and reduced thermal noise. However, the switching noise increased at 77 K compared to 300 K. The power control circuits tested in the laboratory did successfully restart at 77 K.

  5. Variable Stars in the Field of V729 Aql

    NASA Astrophysics Data System (ADS)

    Cagaš, P.

    2017-04-01

    Wide field instruments can be used to acquire light curves of tens or even hundreds of variable stars per night, which increases the probability of new discoveries of interesting variable stars and generally increases the efficiency of observations. At the same time, wide field instruments produce a large amount of data, which must be processed using advanced software. The traditional approach, typically used by amateur astronomers, requires an unacceptable amount of time to process each data set. New functionality built into the SIPS software package can shorten the time needed to obtain light curves by several orders of magnitude. Also, the newly introduced SILICUPS software is intended for post-processing of stored light curves. It can be used to visualize observations from many nights, to find variable star periods, to evaluate types of variability, etc. This work provides an overview of the tools used to process data from the large field of view around the variable star V729 Aql and demonstrates the results.

  6. LIQUID EFFLUENT RETENTION FACILITY (LERF) BASIN 42 STUDIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DUNCAN JB

    2004-10-29

    This report documents laboratory results obtained under test plan RPP-21533 for samples submitted by the Effluent Treatment Facility (ETF) from the Liquid Effluent Retention Facility (LERF) Basin 42 (Reference 1). The LERF Basin 42 contains process condensate (PC) from the 242-A Evaporator and landfill leachate. The ETF processes one PC campaign approximately every 12 to 18 months. A typical PC campaign volume can range from 1.5 to 2.5 million gallons. During the September 2003 ETF Basin 42 processing campaign, a recurring problem with 'gelatinous buildup' on the outlet filters from 60A-TK-I (surge tank) was observed (Figure 1). This buildup appeared on the filters after the contents of the surge tank were adjusted to a pH of between 5 and 6 using sulfuric acid. Biological activity in the PC feed was suspected to be the cause of the gelatinous material. Due to this buildup, the filters (10 µm CUNO) required daily change out to maintain process throughput.

  7. Some uses of wavelets for imaging dynamic processes in live cochlear structures

    NASA Astrophysics Data System (ADS)

    Boutet de Monvel, J.

    2007-09-01

    A variety of image and signal processing algorithms based on wavelet filtering tools have been developed during the last few decades that are well adapted to the experimental variability typically encountered in live biological microscopy. A number of processing tools are reviewed, that use wavelets for adaptive image restoration and for motion or brightness variation analysis by optical flow computation. The usefulness of these tools for biological imaging is illustrated in the context of the restoration of images of the inner ear and the analysis of cochlear motion patterns in two and three dimensions. I also report on recent work that aims at capturing fluorescence intensity changes associated with vesicle dynamics at synaptic zones of sensory hair cells. This latest application requires one to separate the intensity variations associated with the physiological process under study from the variations caused by motion of the observed structures. A wavelet optical flow algorithm for doing this is presented, and its effectiveness is demonstrated on artificial and experimental image sequences.
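
    The optical-flow algorithm itself is beyond a short example, but the wavelet shrinkage step that such adaptive restoration builds on is compact. A hedged sketch (a synthetic 1-D signal stands in for the 2-D/3-D microscopy data the paper works on) using PyWavelets:

      import numpy as np
      import pywt

      rng = np.random.default_rng(1)
      signal = np.sin(np.linspace(0.0, 4.0 * np.pi, 1024))
      noisy = signal + rng.normal(0.0, 0.2, signal.size)

      coeffs = pywt.wavedec(noisy, "db4", level=5)        # multilevel decomposition
      sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise estimate from finest scale
      thresh = sigma * np.sqrt(2.0 * np.log(noisy.size))  # universal threshold
      shrunk = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                              for c in coeffs[1:]]
      denoised = pywt.waverec(shrunk, "db4")
      print(f"residual noise std: {np.std(denoised - signal):.3f}")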

  8. Energetic electron injections and dipolarization events in Mercury's magnetotail: Substorm dynamics

    NASA Astrophysics Data System (ADS)

    Dewey, R. M.; Slavin, J. A.; Raines, J. M.; Imber, S.; Baker, D. N.; Lawrence, D. J.

    2017-12-01

    Despite its small size, Mercury's terrestrial-like magnetosphere experiences brief, yet intense, substorm intervals characterized by features similar to those at Earth: loading/unloading of the tail lobes with open magnetic flux, dipolarization of the magnetic field at the inner edge of the plasma sheet, and, the focus of this presentation, energetic electron injection. We use the Gamma-Ray Spectrometer's high-time-resolution (10 ms) energetic electron measurements to determine the relationship between substorm activity and energetic electron injections coincident with dipolarization fronts in the magnetotail. These dipolarizations were detected on the basis of their rapid (~2 s) increase in the northward component of the tail magnetic field (ΔBz ~30 nT), which typically persists for ~10 s. We estimate a typical flow-channel width of ~0.15 RM, planetary convection speed of ~750 km/s, cross-tail potential drop of ~7 kV, and flux transport of ~0.08 MWb for each dipolarization event, suggesting that multiple simultaneous and sequential dipolarizations are required to unload the >1 MWb of magnetic flux typically returned to the dayside magnetosphere during a substorm interval. Indeed, while we observe most dipolarization-injections to be isolated or in small chains of events (i.e., 1-3 events), intervals of sawtooth-like injections with >20 sequential events are also present. The typical separation between dipolarization-injection events is ~10 s. Magnetotail dipolarization, in addition to being a powerful source of electron acceleration, also plays a significant role in the substorm process at Mercury.
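
    The quoted event parameters are mutually consistent, which a few lines of arithmetic confirm: the cross-tail potential follows from U = v * Bz * w, and the per-event flux transport from U * Δt. (Mercury's radius is assumed here to be 2440 km; all other values are taken from the abstract.)

      R_M = 2440e3                       # Mercury radius [m] (assumed)
      v = 750e3                          # convection speed [m/s]
      Bz = 30e-9                         # dipolarization field increase [T]
      w = 0.15 * R_M                     # flow-channel width [m]
      dt = 10.0                          # typical dipolarization duration [s]

      potential = v * Bz * w             # ~8 kV (abstract: ~7 kV)
      flux_per_event = potential * dt    # ~0.08 MWb per event
      events_needed = 1e6 / flux_per_event
      print(f"{potential / 1e3:.1f} kV, {flux_per_event / 1e6:.2f} MWb/event, "
            f"~{events_needed:.0f} events to unload 1 MWb")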

  9. Disentangling inhibition-based and retrieval-based aftereffects of distractors: Cognitive versus motor processes.

    PubMed

    Singh, Tarini; Laub, Ruth; Burgard, Jan Pablo; Frings, Christian

    2018-05-01

    Selective attention refers to the ability to selectively act upon relevant information at the expense of irrelevant information. Yet, in many experimental tasks, what happens to the representation of the irrelevant information is still debated. Typically, 2 approaches to distractor processing have been suggested, namely distractor inhibition and distractor-based retrieval. However, the two processes are typically hard to disentangle. For instance, in the negative priming literature (for a review, see Frings, Schneider, & Fox, 2015) this has been a continuous debate since the early 1980s. In the present study, we attempted to show that both processes exist, but that they reflect distractor processing at different levels of representation. Distractor inhibition impacts stimulus representation, whereas distractor-based retrieval impacts mainly motor processes. We investigated both processes in a distractor-priming task, which enables an independent measurement of both processes. For our argument that both processes impact different levels of distractor representation, we estimated the exponential parameter (τ) and Gaussian components (μ, σ) of the exponential-Gaussian (ex-Gaussian) reaction-time (RT) distribution, which have previously been used to independently test the effects of cognitive and motor processes (e.g., Moutsopoulou & Waszak, 2012). The distractor-based retrieval effect was evident for the Gaussian component, which is typically discussed as reflecting motor processes, but not for the exponential parameter, whereas the inhibition component was evident for the exponential parameter, which is typically discussed as reflecting cognitive processes, but not for the Gaussian parameter. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
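
    Fits of this kind are straightforward to reproduce. As a hedged illustration on synthetic data (not the study's), SciPy's exponnorm distribution implements the ex-Gaussian with shape parameter K = τ/σ, so μ, σ, and τ can be recovered by maximum likelihood:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      mu, sigma, tau = 0.45, 0.05, 0.15          # seconds (made-up values)
      rts = rng.normal(mu, sigma, 2000) + rng.exponential(tau, 2000)

      K, loc, scale = stats.exponnorm.fit(rts)   # SciPy parameterizes by K = tau / sigma
      print(f"mu={loc:.3f}  sigma={scale:.3f}  tau={K * scale:.3f}")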

  10. Carbon dioxide mineralization process design and evaluation: concepts, case studies, and considerations.

    PubMed

    Yuen, Yeo Tze; Sharratt, Paul N; Jie, Bu

    2016-11-01

    Numerous carbon dioxide mineralization (CM) processes have been proposed to overcome the slow rate of natural weathering of silicate minerals. Ten of these proposals are mentioned in this article. The proposals are described in terms of the four major areas relating to CM process design: pre-treatment, purification, carbonation, and reagent recycling operations. Any known specifics based on probable or representative operating and reaction conditions are listed, and a basic analysis of the strengths and shortcomings associated with the individual process designs is given in this article. The processes typically employ physical or chemical pseudo-catalytic methods to enhance the rate of carbon dioxide mineralization; however, each method has its own associated advantages and problems. To examine the feasibility of a CM process, three key aspects should be included in the evaluation criteria: energy use, operational considerations, and product value and economics. Recommendations regarding the optimal level of emphasis and implementation of measures to control these aspects are given, and these will depend very much on the desired process objectives. Ultimately, a mix-and-match approach to process design might be required to provide viable and economic proposals for CM processes.

  11. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    PubMed

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.
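
    The digital traits such pipelines extract are ordinary image statistics. The stand-in below (not IH's actual code; the segmentation rule and the toy image are invented for illustration) shows the flavor of trait extraction: segment plant pixels and report a projected shoot area.

      import numpy as np

      # Synthetic 4x4 RGB "frame" standing in for a phenotyping camera image.
      img = np.zeros((4, 4, 3), dtype=float)
      img[1:3, 1:3] = [40.0, 160.0, 50.0]        # a green "plant" patch

      r, g, b = img[..., 0], img[..., 1], img[..., 2]
      mask = (g > r) & (g > b) & (g > 60)        # crude green-pixel segmentation
      print(int(mask.sum()), "plant pixels")     # digital trait: projected area (-> 4)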

  12. Coagulation chemistries for silica removal from cooling tower water.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nyman, May Devan; Altman, Susan Jeanne; Stewart, Tom

    2010-02-01

    The formation of silica scale is a problem for thermoelectric power generating facilities, and this study investigated the potential for removal of silica by means of chemical coagulation from source water before it is subjected to mineral concentration in cooling towers. In Phase I, a screening of many typical as well as novel coagulants was carried out using concentrated cooling tower water, with and without flocculation aids, at concentrations typical for water purification, with limited results. In Phase II, it was decided that treatment of source or make-up water was more appropriate, and that higher dosing with coagulants delivered promising results. In fact, the less exotic coagulants proved to be more efficacious for reasons not yet fully determined. Some analysis was made of the molecular nature of the precipitated floc, which may aid in process improvements. In Phase III, more detailed study of process conditions for aluminum chloride coagulation was undertaken. Lime-soda water softening and the precipitation of magnesium hydroxide were shown to be too limited in terms of effectiveness, speed, and energy consumption to be considered further for the present application. In Phase IV, sodium aluminate emerged as an effective coagulant for silica, and the most attractive of those tested to date because of its availability, ease of use, and low requirement for additional chemicals. Some process optimization was performed for coagulant concentration and operational pH. It is concluded that silica coagulation with simple aluminum-based agents is effective, simple, and compatible with other industrial processes.

  13. Emission factor for atmospheric ammonia from a typical municipal wastewater treatment plant in South China.

    PubMed

    Zhang, Chunlin; Geng, Xuesong; Wang, Hao; Zhou, Lei; Wang, Boguang

    2017-01-01

    Atmospheric ammonia (NH₃), a common alkaline gas found in air, plays a significant role in atmospheric chemistry, such as in the formation of secondary particles. However, large uncertainties remain in the estimation of ammonia emissions from nonagricultural sources, such as wastewater treatment plants (WWTPs). In this study, the ammonia emission factors from a large WWTP utilizing three typical biological treatment techniques to process wastewater in South China were calculated using the US EPA's WATER9 model with three years of raw sewage measurements and information about the facility. The individual emission factors calculated were 0.15 ± 0.03, 0.24 ± 0.05, 0.29 ± 0.06, and 0.25 ± 0.05 g NH₃ m⁻³ sewage for the adsorption-biodegradation activated sludge treatment process, the UNITANK process (an upgrade of the sequencing batch reactor activated sludge treatment process), and two slightly different anaerobic-anoxic-oxic treatment processes, respectively. The overall emission factor of the WWTP was 0.24 ± 0.06 g NH₃ m⁻³ sewage. The pH of the wastewater influent is likely an important factor affecting ammonia emissions, because higher emission factors existed at higher pH values. Based on the ammonia emission factor generated in this study, sewage treatment accounted for approximately 4% of the ammonia emissions for the urban area of South China's Pearl River Delta (PRD) in 2006, which is much less than the value of 34% estimated in previous studies. To reduce the large uncertainty in the estimation of ammonia emissions in China, more field measurements are required. Copyright © 2016 Elsevier Ltd. All rights reserved.
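
    A one-line calculation turns the overall factor into an annual load for a given throughput; the flow rate below is an assumed, illustrative figure, not a value from the study.

      ef = 0.24                                     # g NH3 per m3 sewage (overall factor above)
      daily_flow = 500_000                          # m3/day, assumed plant throughput
      annual_nh3_t = ef * daily_flow * 365 / 1e6    # grams -> tonnes
      print(f"~{annual_nh3_t:.1f} t NH3 per year")  # ~43.8 t/yr at this throughput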

  14. A National Approach to Reimbursement Decision-Making on Drugs for Rare Diseases in Canada? Insights from Across the Ponds.

    PubMed

    Short, Hilary; Stafinski, Tania; Menon, Devidas

    2015-05-01

    Regardless of the type of health system or payer, coverage decisions on drugs for rare diseases (DRDs) are challenging. While these drugs typically represent the only active treatment option for a progressive and/or life-threatening condition, evidence of clinical benefit is often limited because of small patient populations, and costs are high. Thus, decisions come with considerable uncertainty and risk. In Canada, there is interest in developing a pan-Canadian decision-making approach informed by international experiences. The objective of this study was to develop an inventory of existing policies and processes for making coverage decisions on DRDs around the world. A systematic review of published and unpublished documents describing current policies and processes in the top 20 countries by gross domestic product was conducted. Bibliographic databases, the Internet and government/health technology assessment organization websites in each country were searched. Two researchers independently extracted information and tabulated it to facilitate qualitative comparative analyses. Policy experts from each country were contacted and asked to review the information collected for accuracy and completeness. Almost all countries have multiple mechanisms through which coverage for a DRD may be sought. However, they typically begin with a review that follows the same process as drugs for more common conditions (i.e., the centralized review process), although specific submission requirements could differ (e.g., no need to submit a cost-effectiveness analysis). When drugs fail to receive a positive recommendation/decision, they are reconsidered by "safety net"-type programs. Eligibility criteria vary across countries, as do the decision options, which may be applied to individual patients or patient groups. With few exceptions, countries have not created separate centralized review processes for DRDs. Instead, they have modified components of existing mechanisms and added safety nets. Copyright © 2015 Longwoods Publishing.

  15. Risk Evaluation in the Pre-Phase A Conceptual Design of Spacecraft

    NASA Technical Reports Server (NTRS)

    Fabisinski, Leo L., III; Maples, Charlotte Dauphne

    2010-01-01

    Typically, the most important decisions in the design of a spacecraft are made in the earliest stages of its conceptual design: the Pre-Phase A stages. It is in these stages that the greatest number of design alternatives is considered, and the greatest number of alternatives is rejected. The focus of Pre-Phase A conceptual development is on the evaluation and comparison of whole concepts and the larger-scale systems comprising those concepts. This comparison typically uses general Figures of Merit (FOMs) to quantify the comparative benefits of designs and alternative design features. Along with mass, performance, and cost, risk should be one of the major FOMs in evaluating design decisions during the conceptual design phases. However, risk is often given inadequate consideration in conceptual design practice. The reasons frequently given for this lack of attention to risk include: inadequate mission definition, lack of rigorous design requirements in early concept phases, lack of fidelity in risk assessment methods, and under-evaluation of risk as a viable FOM for design evaluation. In this paper, the role of risk evaluation in early conceptual design is discussed. The various requirements of a viable risk evaluation tool at the Pre-Phase A level are considered in light of the needs of a typical spacecraft design study. A technique for risk identification and evaluation is presented. The application of the risk identification and evaluation approach to the conceptual design process is discussed. Finally, a computational tool for risk profiling is presented and applied to assess the risk for an existing Pre-Phase A proposal. The resulting profile is compared to the risks identified for the proposal by other means.
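
    A minimal sketch of what such a risk figure of merit can look like follows; this is a generic likelihood-times-consequence scoring scheme, not the paper's tool, and the listed risks are invented for illustration.

      risks = [                                     # (name, likelihood 1-5, consequence 1-5)
          ("cryocooler development maturity", 3, 4),
          ("mass growth beyond allocation",   4, 3),
          ("single-string avionics failure",  2, 5),
      ]
      profile = {name: l * c for name, l, c in risks}
      total = sum(profile.values())                 # one coarse FOM for comparing concepts
      print(profile, "-> concept risk FOM:", total)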

  16. V/STOL propulsion control analysis: Phase 2, task 5-9

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Typical V/STOL propulsion control requirements were derived for transition between vertical and horizontal flight using the General Electric RALS (Remote Augmented Lift System) concept. Steady-state operating requirements were defined for a typical Vertical-to-Horizontal transition and for a typical Horizontal-to-Vertical transition. Control mode requirements were established and multi-variable regulators developed for individual operating conditions. Proportional/Integral gain schedules were developed and were incorporated into a transition controller with capabilities for mode switching and manipulated variable reassignment. A non-linear component-level transient model of the engine was developed and utilized to provide a preliminary check-out of the controller logic. An inlet and nozzle effects model was developed for subsequent incorporation into the engine model and an aircraft model was developed for preliminary flight transition simulations. A condition monitoring development plan was developed and preliminary design requirements established. The Phase 1 long-range technology plan was refined and restructured toward the development of a real-time high fidelity transient model of a supersonic V/STOL propulsion system and controller for use in a piloted simulation program at NASA-Ames.
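
    The gain-scheduled proportional/integral element at the core of such a transition controller is simple to sketch. The code below is illustrative only (the RALS controller is not public in this form; the schedule and gains are invented):

      def make_pi(schedule):
          """schedule maps an operating point to (Kp, Ki); returns a stateful PI step."""
          integral = 0.0

          def step(setpoint, measurement, op_point, dt):
              nonlocal integral
              kp, ki = schedule(op_point)           # look up gains for this condition
              error = setpoint - measurement
              integral += error * dt
              return kp * error + ki * integral

          return step

      # Hypothetical schedule: gains blend linearly with a transition fraction x.
      pi = make_pi(lambda x: (0.8 + 0.4 * x, 0.2 + 0.1 * x))
      print(pi(setpoint=1.0, measurement=0.9, op_point=0.5, dt=0.02))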

  17. The influence of geomorphology on the role of women at artisanal and small-scale mine sites

    USGS Publications Warehouse

    Malpeli, Katherine C.; Chirico, Peter G.

    2013-01-01

    The geologic and geomorphic expressions of a mineral deposit determine its location, size, and accessibility, characteristics which in turn greatly influence the success of artisans mining the deposit. Despite the critical information that can be garnered by studying the surficial physical expression of a deposit, the geologic and geomorphic sciences have been largely overlooked in artisanal mining-related research. This study demonstrates that a correlation exists between the roles of female miners at artisanal diamond and gold mining sites in western and central Africa and the physical expression of the deposits. Typically, women perform ore processing and ancillary roles at mine sites. On occasion, however, women participate in the extraction process itself. Women were found to participate in the extraction of ore only when a deposit had a thin overburden layer, thus rendering the mineralized ore more accessible. When deposits required a significant degree of manual labour to access the ore due to thick overburden layers, women were typically relegated to other roles. The identification of this link encourages the establishment of an alternative research avenue in which the physical and social sciences merge to better inform policymakers, so that the most appropriate artisanal mining assistance programs can be developed and implemented.

  18. Toolsets for Airborne Data (TAD): Enhanced Airborne Data Merging Functionality through Spatial and Temporal Subsetting

    NASA Astrophysics Data System (ADS)

    Early, A. B.; Chen, G.; Beach, A. L., III; Northup, E. A.

    2016-12-01

    NASA has conducted airborne tropospheric chemistry studies for over three decades. These field campaigns have generated a great wealth of observations, including a wide range of trace gases and aerosol properties. The Atmospheric Science Data Center (ASDC) at NASA Langley Research Center in Hampton, Virginia, originally developed the Toolsets for Airborne Data (TAD) web application in September 2013 to meet the user community's needs for manipulating aircraft data for scientific research on climate-change and air-quality issues. The analysis of airborne data typically requires data subsetting, which can be challenging and resource intensive for end users. In an effort to streamline this process, the TAD toolset enhancements will include new data subsetting features and updates to the current database model. These include two subsetters: a combined temporal/spatial subsetter and a vertical-profile subsetter. The temporal and spatial subsetter will allow users to focus on data from a specific location and/or time period. The vertical profile subsetter will retrieve data collected during an individual aircraft ascent or descent spiral. This effort will allow for the automation of the typically labor-intensive manual data subsetting process, which will provide users with data tailored to their specific research interests. The development of these enhancements will be discussed in this presentation.
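
    The subsetting itself reduces to simple predicate filters. A hedged sketch (the column names and toy records are assumptions, not the ASDC schema) of temporal plus spatial subsetting with pandas:

      import pandas as pd

      df = pd.DataFrame({                           # toy stand-in for a TAD merge product
          "time": pd.to_datetime(["2016-08-01 12:00", "2016-08-02 09:30",
                                  "2016-08-05 14:10"]),
          "latitude": [36.5, 37.2, 39.0],
          "longitude": [-76.3, -77.1, -74.9],
          "o3_ppbv": [41.2, 55.8, 48.3],
      })

      def subset(df, t0, t1, lat_min, lat_max, lon_min, lon_max):
          m = (df["time"].between(t0, t1)
               & df["latitude"].between(lat_min, lat_max)
               & df["longitude"].between(lon_min, lon_max))
          return df[m]

      roi = subset(df, "2016-08-01", "2016-08-03", 36.0, 38.0, -78.0, -75.0)
      print(roi)                                    # the first two records qualify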

  19. Quantification of the inevitable: the influence of soil macrofauna on soil water movement in rehabilitated open-cut mined lands

    NASA Astrophysics Data System (ADS)

    Arnold, S.; Williams, E. R.

    2016-01-01

    Recolonisation of soil by macrofauna (especially ants, termites and earthworms) in rehabilitated open-cut mine sites is inevitable and, in terms of habitat restoration and function, typically of great value. In these highly disturbed landscapes, soil invertebrates play a major role in soil development (macropore configuration, nutrient cycling, bioturbation, etc.) and can influence hydrological processes such as infiltration, seepage, runoff generation and soil erosion. Understanding and quantifying these ecosystem processes is important in rehabilitation design, establishment and subsequent management to ensure progress to the desired end goal, especially in waste cover systems designed to prevent water reaching and transporting underlying hazardous waste materials. However, soil macrofauna are typically overlooked during hydrological modelling, possibly due to uncertainty about the extent of their influence, which can lead to failure of waste cover systems or rehabilitation activities. We propose that scientific experiments under controlled conditions and field trials on post-mining lands are required to quantify (i) macrofauna-soil structure interactions, (ii) functional dynamics of macrofauna taxa, and (iii) their effects on macrofauna and soil development over time. Such knowledge would provide crucial information for soil water models, which would increase confidence in mine waste cover design recommendations and eventually lead to a higher likelihood of rehabilitation success of open-cut mining land.

  20. Plasticity of Neuron-Glial Transmission: Equipping Glia for Long-Term Integration of Network Activity.

    PubMed

    Croft, Wayne; Dobson, Katharine L; Bellamy, Tomas C

    2015-01-01

    The capacity of synaptic networks to express activity-dependent changes in strength and connectivity is essential for learning and memory processes. In recent years, glial cells (most notably astrocytes) have been recognized as active participants in the modulation of synaptic transmission and synaptic plasticity, implicating these electrically nonexcitable cells in information processing in the brain. While the concept of bidirectional communication between neurons and glia and the mechanisms by which gliotransmission can modulate neuronal function are well established, less attention has been focussed on the computational potential of neuron-glial transmission itself. In particular, whether neuron-glial transmission is itself subject to activity-dependent plasticity and what the computational properties of such plasticity might be has not been explored in detail. In this review, we summarize current examples of plasticity in neuron-glial transmission, in many brain regions and neurotransmitter pathways. We argue that induction of glial plasticity typically requires repetitive neuronal firing over long time periods (minutes-hours) rather than the short-lived, stereotyped trigger typical of canonical long-term potentiation. We speculate that this equips glia with a mechanism for monitoring average firing rates in the synaptic network, which is suited to the longer term roles proposed for astrocytes in neurophysiology.
