Validation Test Report for the Automated Optical Processing System (AOPS) Version 4.8
2013-06-28
... be familiar with UNIX; BASH shell programming; and remote sensing, particularly regarding computer processing of satellite data. The system memory and storage requirements are difficult to gauge. The amount of memory needed depends upon the amount and type of satellite data you wish to process; the larger the area, the larger the memory requirement. For example, the entire Atlantic Ocean will require more processing power than the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernardin, John D; Baca, Allen G
This paper presents the mechanical design, fabrication, and dynamic testing of an electrostatic analyzer spacecraft instrument. The functional and environmental requirements, combined with limited spacecraft accommodations, resulted in complex component geometries, unique material selections, and difficult fabrication processes. The challenging aspects of the mechanical design and several of the more difficult production processes are discussed. In addition, the successes, failures, and lessons learned from acoustic and random vibration testing of a full-scale prototype instrument are presented.
Auditory Processing Disorder (For Parents)
... or other speech-language difficulties? Are verbal (word) math problems difficult for your child? Is your child ... inferences from conversations, understanding riddles, or comprehending verbal math problems — require heightened auditory processing and language levels. ...
Secondary Students' Perceptions about Learning Qualitative Analysis in Inorganic Chemistry
NASA Astrophysics Data System (ADS)
Tan, Kim-Chwee Daniel; Goh, Ngoh-Khang; Chia, Lian-Sai; Treagust, David F.
2001-02-01
Grade 10 students in Singapore find qualitative analysis one of the more difficult topics in their external examinations. Fifty-one grade 10 students (15-17 years old) from three schools were interviewed to investigate their perceptions about learning qualitative analysis and the aspects of qualitative analysis they found difficult. The results showed that students found qualitative analysis tedious and difficult to understand, and that they found the practical sessions unrelated to what they learned in class. They also believed that learning qualitative analysis required a great amount of memory work. It is proposed that their difficulties may arise from not knowing explicitly what is required in qualitative analysis, from not knowing the content of qualitative analysis, from a lack of motivation to understand qualitative analysis, from cognitive overloading, and from a lack of mastery of the required process skills.
Configuration Management, Capacity Planning Decision Support, Modeling and Simulation
1988-12-01
flow includes both top-down and bottom-up requirements. The flow also includes hardware, software, and transfer acquisition, installation, operation ... management and upgrade as required. Satisfaction of a user's needs and requirements is a difficult and detailed process. The key assumptions at this...
Assessing Course Outcomes: It Doesn't Have to Be Difficult!
ERIC Educational Resources Information Center
Hammons, James O.; Hui, Mary Margaret; Keogh, Rochelle
2016-01-01
As the title suggests, this article offers the concept that assessing course outcomes does not have to be difficult. The authors believe that some adaptation of the process they have described in this article should provide students with an opportunity to provide meaningful feedback about what the course covered and what was required of them. The…
A Framework for Business Process Change Requirements Analysis
NASA Astrophysics Data System (ADS)
Grover, Varun; Otim, Samuel
The ability to quickly and continually adapt business processes to accommodate evolving requirements and opportunities is critical for success in competitive environments. Without appropriate linkage between redesign decisions and strategic inputs, identifying processes that need to be modified will be difficult. In this paper, we draw attention to the analysis of business process change requirements in support of process change initiatives. Business process redesign is a multifaceted phenomenon involving processes, organizational structure, management systems, human resource architecture, and many other aspects of organizational life. To be successful, the business process initiative should focus not only on identifying the processes to be redesigned, but also pay attention to various enablers of change. Above all, a framework is just a blueprint; management must lead change. We hope our modest contribution will draw attention to the broader framing of requirements for business process change.
Big data processing in the cloud - Challenges and platforms
NASA Astrophysics Data System (ADS)
Zhelev, Svetoslav; Rozeva, Anna
2017-12-01
Choosing the appropriate architecture and technologies for a big data project is a difficult task, which requires extensive knowledge in both the problem domain and the big data landscape. The paper analyzes the main big data architectures and the most widely implemented technologies used for processing and persisting big data. Clouds provide dynamic resource scaling, which makes them a natural fit for big data applications. Basic cloud computing service models are presented. Two architectures for processing big data are discussed, the Lambda and Kappa architectures. Technologies for big data persistence are presented and analyzed. Stream processing, as the most important and most difficult aspect to manage, is outlined. The paper highlights the main advantages of the cloud and potential problems.
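To make the Lambda architecture named above concrete, here is a minimal sketch of its defining read-time merge: a precomputed batch view combined with a speed-layer view built from events that arrived after the last batch run. All names here are illustrative, not taken from the paper.

```python
# Minimal Lambda-architecture sketch: the serving layer answers queries by
# merging a precomputed batch view with an incrementally updated speed view.
from collections import Counter

batch_view = Counter({"page_a": 1000, "page_b": 250})  # produced by the batch layer
speed_view = Counter()                                 # maintained by the speed layer

def ingest_event(page: str) -> None:
    """Speed layer: fold each new event in as it arrives."""
    speed_view[page] += 1

def query(page: str) -> int:
    """Serving layer: merge the two views at read time."""
    return batch_view[page] + speed_view[page]

ingest_event("page_a")
print(query("page_a"))  # 1001
```

A Kappa architecture, by contrast, drops the batch layer and recomputes views by replaying the event log through the same streaming path.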
The difficult mountain: enriched composition in adjective–noun phrases
Pickering, Martin J.; McElree, Brian
2012-01-01
When readers need to go beyond the straightforward compositional meaning of a sentence (i.e., when enriched composition is required), costly additional processing is the norm. However, this conclusion is based entirely on research that has looked at enriched composition between two phrases or within the verb phrase (e.g., the verb and its complement in … started the book …) where there is a discrepancy between the semantic expectations of the verb and the semantics of the noun. We carried out an eye-tracking experiment investigating enriched composition within a single noun phrase, as in the difficult mountain. As compared with adjective–noun phrases that allow a straightforward compositional interpretation (the difficult exercise), the coerced phrases were more difficult to process. These results indicate that coercion effects can be found in the absence of a typing violation and within a single noun phrase. PMID:21826403
A corporate product integrity assurance process.
Weiler, E D; Keener, R
1991-10-01
One of the more difficult challenges that confronts the chemical industry throughout the industrialized world is how to effectively manage the various and often diverse regulatory requirements. What follows is a description of a process designed to help with new product introductions. The process is generic and is applicable to almost any corporate environment and structure.
The study on knowledge transferring incentive for information system requirement development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yang
2015-03-10
Information system requirement development is a process of sharing and transferring users' knowledge. However, developing tacit requirements is a main problem during the requirement development process, because such requirements are difficult to encode, express, and communicate. Knowledge fusion and cooperative effort are needed to find tacit requirements. Against this background, this paper tries to find the rule governing the dynamic evolution of effort between software developer and user by building an evolutionary game model under an incentive system, and it provides an in-depth discussion at the end.
The Analytic Hierarchy Process and Participatory Decisionmaking
Daniel L. Schmoldt; Daniel L. Peterson; Robert L. Smith
1995-01-01
Managing natural resource lands requires social, as well as biophysical, considerations. Unfortunately, it is extremely difficult to accurately assess and quantify changing social preferences, and to aggregate conflicting opinions held by diverse social groups. The Analytic Hierarchy Process (AHP) provides a systematic, explicit, rigorous, and robust mechanism for...
Hospital renovation projects: phased construction requires planning at its best.
Cox, J C
1986-01-01
Building a new hospital facility is a difficult task, but adding onto and renovating an existing structure while normal activity continues is even more difficult. Project planners, designers, contractors, and hospital managers must carefully program the joint effort of construction and hospital operation. Several factors in the construction process and potential problems for hospital operations are described to help hospital managers better anticipate difficulties before plans are finalized and construction commences.
Practical Unitary Simulator for Non-Markovian Complex Processes
NASA Astrophysics Data System (ADS)
Binder, Felix C.; Thompson, Jayne; Gu, Mile
2018-06-01
Stochastic processes are as ubiquitous throughout the quantitative sciences as they are notorious for being difficult to simulate and predict. In this Letter, we propose a unitary quantum simulator for discrete-time stochastic processes which requires less internal memory than any classical analogue throughout the simulation. The simulator's internal memory requirements equal those of the best previous quantum models. However, in contrast to previous models, it only requires a (small) finite-dimensional Hilbert space. Moreover, since the simulator operates unitarily throughout, it avoids any unnecessary information loss. We provide a stepwise construction for simulators for a large class of stochastic processes hence directly opening the possibility for experimental implementations with current platforms for quantum computation. The results are illustrated for an example process.
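For context, the memory quantities being compared in this line of work can be stated compactly (a reconstruction from the computational-mechanics literature this Letter builds on, not a formula quoted from it): for a stationary process whose optimal classical model occupies causal state i with probability $\pi_i$, and whose quantum simulator encodes that state as $|\sigma_i\rangle$,

```latex
C_\mu = -\sum_i \pi_i \log_2 \pi_i
\qquad \text{(classical memory)}, \qquad
C_q = -\operatorname{Tr}\!\left(\rho \log_2 \rho\right),
\quad \rho = \sum_i \pi_i\, |\sigma_i\rangle\!\langle\sigma_i| ,
```

and the overlap of the encoded states gives $C_q \le C_\mu$, which is the memory advantage the simulator exploits.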
Increasing Cognitive Inhibition with a Difficult Prior Task: Implications for Mathematical Thinking
ERIC Educational Resources Information Center
Attridge, Nina; Inglis, Matthew
2015-01-01
Dual-process theories posit two distinct types of cognitive processing: Type 1, which does not use working memory, making it fast and automatic, and Type 2, which does use working memory, making it slow and effortful. Mathematics often relies on the inhibition of pervasive Type 1 processing to apply new skills or knowledge that require Type 2…
Xie, Yiyue; Dahlin, Jayme L; Oakley, Aaron J; Casarotto, Marco G; Board, Philip G; Baell, Jonathan B
2018-05-10
Early stage drug discovery reporting on relatively new or difficult targets is often associated with insufficient hit triage. Literature reviews of such targets seldom delve into the detail required to critically analyze the associated screening hits reported. Here we take the enzyme glutathione transferase omega-1 (GSTO1-1) as an example of a relatively difficult target and review the associated literature involving small-molecule inhibitors. As part of this process we deliberately pay closer-than-usual attention to assay interference and hit quality aspects. We believe this Perspective will be a useful guide for future development of GSTO1-1 inhibitors, as well as serving as a template for future review formats of new or difficult targets.
A high-speed linear algebra library with automatic parallelism
NASA Technical Reports Server (NTRS)
Boucher, Michael L.
1994-01-01
Parallel or distributed processing is key to getting the highest performance from workstations. However, designing and implementing efficient parallel algorithms is difficult and error-prone. It is even more difficult to write code that is both portable to and efficient on many different computers. Finally, it is harder still to satisfy the above requirements and include the reliability and ease of use required of commercial software intended for use in a production environment. As a result, the application of parallel processing technology to commercial software has been extremely limited, even though there are numerous computationally demanding programs that would significantly benefit from parallel processing. This paper describes DSSLIB, a library of subroutines that perform many of the time-consuming computations in engineering and scientific software. DSSLIB combines the high efficiency and speed of parallel computation with a serial programming model that eliminates many undesirable side effects of typical parallel code. The result is a simple way to incorporate the power of parallel processing into commercial software without compromising maintainability, reliability, or ease of use. This gives significant advantages over less powerful non-parallel entries in the market.
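The "serial programming model" idea can be pictured as a routine whose caller writes an ordinary sequential call while the parallel work stays hidden inside the library. The following is an illustrative sketch of that pattern only; DSSLIB's actual interface is not described in the abstract.

```python
# Illustrative sketch (not DSSLIB): a matrix-vector product that looks like a
# plain serial function to the caller but distributes rows across processes.
from concurrent.futures import ProcessPoolExecutor

def _row_dot(args):
    row, x = args
    return sum(a * b for a, b in zip(row, x))

def matvec(matrix, x, workers=4):
    """Serial-looking API; the parallelism is an internal detail."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(_row_dot, ((row, x) for row in matrix)))

if __name__ == "__main__":            # guard required by process-based pools
    A = [[1, 2], [3, 4], [5, 6]]
    print(matvec(A, [10, 1]))         # [12, 34, 56]
```

Because the caller never sees threads, processes, or synchronization, the usual side effects of parallel code (races, ordering surprises) cannot leak into application logic.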
Cisneros, Carolina; Díaz-Campos, Rocío Magdalena; Marina, Núria; Melero, Carlos; Padilla, Alicia; Pascual, Silvia; Pinedo, Celia; Trisán, Andrea
2017-01-01
This paper, developed by consensus of staff physicians of accredited asthma units for the management of severe asthma, presents information on the process and requirements for already-existing asthma units to achieve official accreditation by the Spanish Society of Pneumology and Thoracic Surgery (SEPAR). Three levels of specialized asthma care have been established based on available resources, which include specialized units for highly complex asthma, specialized asthma units, and basic asthma units. Regardless of the level of accreditation obtained, the distinction of “excellence” could be granted when more requirements in the areas of provision of care, technical and human resources, training in asthma, and teaching and research activities were met at each level. The Spanish experience in the process of accreditation of specialized asthma units, particularly for the care of patients with difficult-to-control asthma, may be applicable to other health care settings. PMID:28533690
Visual field tunneling in aviators induced by memory demands.
Williams, L J
1995-04-01
Aviators are required to rapidly and accurately process enormous amounts of visual information located foveally and peripherally. The present study, expanding upon an earlier study (Williams, 1988), required young aviators to process, within the framework of a single eye fixation, a briefly displayed, foveally presented memory load while simultaneously trying to identify common peripheral targets presented on the same display at locations up to 4.5 degrees of visual angle from the fixation point. This task, as well as a character classification task (Williams, 1985, 1988), has been shown to be very difficult for nonaviators: it results in a tendency toward tunnel vision. Limited preliminary measurements of peripheral accuracy suggested that aviators might be less susceptible than nonaviators to this visual tunneling. The present study demonstrated moderate susceptibility to cognitively induced tunneling in aviators when the foveal task was sufficiently difficult and reaction time was the principal dependent measure.
Manufacturing of glassy thin shell for adaptive optics: results achieved
NASA Astrophysics Data System (ADS)
Poutriquet, F.; Rinchet, A.; Carel, J.-L.; Leplan, H.; Ruch, E.; Geyl, R.; Marque, G.
2012-07-01
Glassy thin shells are key components for the development of adaptive optics and are part of future, innovative projects such as the ELT. However, manufacturing thin shells is a real challenge. Even though optical requirements for the front face (the optical face) are relaxed compared to conventional passive mirrors, the requirements concerning thickness uniformity are difficult to achieve. In addition, the process has to be completely redefined, as thin mirrors generate new manufacturing issues. In particular, the scratch-and-dig requirement is more difficult to meet because such defects could weaken the shell, and handling is an important issue due to the fragility of the mirror. Sagem, through its REOSC program, has recently manufactured different types of thin shells in the framework of European projects: E-ELT M4 prototypes and the VLT Deformable Secondary Mirror (VLT DSM).
DoD Message Protocol Report. Volume I. Message Protocol Specification.
1981-12-15
2.6 Status-Reporting Services; 2.6.1 Acknowledgements and Processing Status ... and data. Envelopes give processing instructions and/or descriptions of their contents. Data are not altered (as regards content) by the CBMS except ... tailored to an individual user's requirements, we view them as application-layer processes. The potential diversity of UAs makes verification difficult
Magnetorheological finishing: a perfect solution to nanofinishing requirements
NASA Astrophysics Data System (ADS)
Sidpara, Ajay
2014-09-01
Finishing optics for different applications is the most important, as well as the most difficult, step in meeting optical specifications. Conventional grinding and other polishing processes are not able to reduce surface roughness beyond a certain limit, due to the high forces acting on the workpiece, embedded abrasive particles, limited control over the process, etc. The magnetorheological finishing (MRF) process provides a new, efficient, and innovative way to finish optical materials, as well as many metals, to their desired level of accuracy. This paper provides an overview of the MRF process for different applications, its important process parameters, the requirements on the magnetorheological fluid with respect to the workpiece material, and some areas that need to be explored to extend the application of the MRF process.
Cross-Correlation-Based Structural System Identification Using Unmanned Aerial Vehicles
Yoon, Hyungchul; Hoskere, Vedhus; Park, Jong-Woong; Spencer, Billie F.
2017-01-01
Computer vision techniques have been employed to characterize dynamic properties of structures, as well as to capture structural motion for system identification purposes. All of these methods leverage image-processing techniques using a stationary camera. This requirement makes finding an effective location for camera installation difficult, because civil infrastructure (e.g., bridges, buildings) is often difficult to access, being constructed over rivers, roads, or other obstacles. This paper seeks to use video from Unmanned Aerial Vehicles (UAVs) to address this problem. As opposed to the traditional way of using stationary cameras, the use of UAVs brings the issue of the camera itself moving; thus, the displacements of the structure obtained by processing UAV video are relative to the UAV camera. Some efforts have been reported to compensate for the camera motion, but they require certain assumptions that may be difficult to satisfy. This paper proposes a new method for structural system identification using the UAV video directly. Several challenges are addressed, including: (1) estimation of an appropriate scale factor; and (2) compensation for the rolling shutter effect. Experimental validation is carried out to validate the proposed approach. The experimental results demonstrate the efficacy and significant potential of the proposed approach. PMID:28891985
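The cross-correlation at the heart of such vision-based identification can be illustrated in a few lines: slide one intensity profile against a reference and take the lag with the highest correlation as the displacement. This is a generic sketch of the technique named in the title, not the authors' implementation.

```python
# Generic cross-correlation displacement estimate between two 1-D intensity
# profiles (e.g., pixel rows tracking a structural edge across video frames).
import numpy as np

def estimate_shift(reference: np.ndarray, current: np.ndarray) -> int:
    ref = reference - reference.mean()          # remove DC offset
    cur = current - current.mean()
    corr = np.correlate(cur, ref, mode="full")  # correlation at every lag
    return int(np.argmax(corr)) - (len(ref) - 1)

x = np.linspace(0.0, 2.0 * np.pi, 200)
profile = np.sin(3.0 * x)
shifted = np.roll(profile, 7)   # simulate 7 pixels of apparent motion
print(estimate_shift(profile, shifted))  # 7
```

In the UAV setting the measured shift mixes structural motion with camera motion, which is exactly why the scale-factor and rolling-shutter corrections discussed in the paper are needed.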
The Effect of Process Writing Activities on the Writing Skills of Prospective Turkish Teachers
ERIC Educational Resources Information Center
Dilidüzgün, Sükran
2013-01-01
Problem statement: Writing an essay is a most difficult creative work and consequently requires detailed instruction. There are in fact two types of instruction that contribute to the development of writing skills: Reading activities analysing texts in content and schematic structure to find out how they are composed and process writing…
Triangulating System Requirements for Users with Severe Motor Disabilities
ERIC Educational Resources Information Center
Randolph, Adriane B.
2012-01-01
By giving a voice to users in the design process of information systems, they often feel more empowered and engaged. The inclusion of users with disabilities in the design process, however, can be markedly more difficult. User profiling allows a user's preferences and interests to be captured and represented. However, for a user with severe motor…
Image Processing Algorithms in the Secondary School Programming Education
ERIC Educational Resources Information Center
Gerják, István
2017-01-01
Learning computer programming for students of the age of 14-18 is difficult and requires endurance and engagement. Being familiar with the syntax of a computer language and writing programs in it are challenges for youngsters, not to mention that understanding algorithms is also a big challenge. To help students in the learning process, teachers…
ERIC Educational Resources Information Center
Barajas-Saavedra, Arturo; Álvarez-Rodriguez, Francisco J.; Mendoza-González, Ricardo; Oviedo-De-Luna, Ana C.
2015-01-01
Development of digital resources is difficult due to their particular complexity relying on pedagogical aspects. Another aspect is the lack of well-defined development processes, experiences documented, and standard methodologies to guide and organize game development. Added to this, there is no documented technique to ensure correct…
CNC Machining Of The Complex Copper Electrodes
NASA Astrophysics Data System (ADS)
Popan, Ioan Alexandru; Balc, Nicolae; Popan, Alina
2015-07-01
This paper presents the machining process for complex copper electrodes. Machining complex shapes in copper is difficult because the material is soft and sticky. This research presents the main steps for processing such copper electrodes with high dimensional accuracy and good surface quality. Special tooling solutions are required for this machining process, and optimal process parameters have been found for the accurate CNC equipment, using smart CAD/CAM software.
NASA Technical Reports Server (NTRS)
1984-01-01
The electroepitaxial process and the Very Large Scale Integration (VLSI) circuits (chips) facilities were chosen because each requires a very high degree of automation, and therefore involves extensive use of teleoperators, robotics, process mechanization, and artificial intelligence. Both cover a raw materials process and a sophisticated multi-step process, and are therefore highly representative of the kinds of difficult operation, maintenance, and repair challenges which can be expected for any type of space manufacturing facility. Generic areas were identified which will require significant further study. The initial design will be based on terrestrial state-of-the-art hard automation. One hundred candidate missions were evaluated on the basis of automation potential and availability of meaningful knowledge. The design requirements and unconstrained design concepts developed for the two missions are presented.
ERIC Educational Resources Information Center
Noguchi, Mary Goebel
Increasingly, foreign nationals living in Japan are sending their children to Japanese elementary schools. This requires that the children's native language be taught outside of school, most often at home. While teaching oral language is not difficult for parents, teaching reading requires different skills. Some difficulties in this process are…
Processing lunar soils for oxygen and other materials
NASA Technical Reports Server (NTRS)
Knudsen, Christian W.; Gibson, Michael A.
1992-01-01
Two types of lunar materials are excellent candidates for lunar oxygen production: ilmenite and silicates such as anorthite. Both are lunar surface minable, occurring in soils, breccias, and basalts. Because silicates are considerably more abundant than ilmenite, they may be preferred as source materials. Depending on the processing method chosen for oxygen production and the feedstock material, various useful metals and bulk materials can be produced as byproducts. Available processing techniques include hydrogen reduction of ilmenite and electrochemical and chemical reductions of silicates. Processes in these categories are generally in preliminary development stages and need significant research and development support to carry them to practical deployment, particularly as a lunar-based operation. The goal of beginning lunar processing operations by 2010 requires that planning and research and development emphasize the simplest processing schemes. However, more complex schemes that now appear to present difficult technical challenges may offer more valuable metal byproducts later. While they require more time and effort to perfect, the more complex or difficult schemes may provide important processing and product improvements with which to extend and elaborate the initial lunar processing facilities. A balanced R&D program should take this into account. The following topics are discussed: (1) ilmenite--semi-continuous process; (2) ilmenite--continuous fluid-bed reduction; (3) utilization of spent ilmenite to produce bulk materials; (4) silicates--electrochemical reduction; and (5) silicates--chemical reduction.
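For reference, the core chemistry of the ilmenite route mentioned above follows standard stoichiometry (supplied here for clarity, not quoted from the paper): hydrogen reduces ilmenite, and electrolysis of the product water recovers the oxygen while recycling the hydrogen.

```latex
\mathrm{FeTiO_3} + \mathrm{H_2} \;\longrightarrow\; \mathrm{Fe} + \mathrm{TiO_2} + \mathrm{H_2O},
\qquad
2\,\mathrm{H_2O} \;\xrightarrow{\text{electrolysis}}\; 2\,\mathrm{H_2} + \mathrm{O_2}.
```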
A process for prototyping onboard payload displays for Space Station Freedom
NASA Technical Reports Server (NTRS)
Moore, Loretta A.
1992-01-01
Significant advances have been made in the area of human-computer interface design. However, there is no well-defined process for going from user interface requirements to user interface design. Developing and designing a clear and consistent user interface for a medium to large scale system is a very challenging and complex task. The task becomes increasingly difficult when there is very little guidance and few procedures on how the development process should flow from one stage to the next. Without a specific sequence of development steps, each design becomes difficult to repeat, to evaluate, to improve, and to articulate to others. This research contributes a process that identifies the phases of development, and the products produced as a result of each phase, for a rapid prototyping process used to develop requirements for the onboard payload displays for Space Station Freedom. The functional components of a dynamic prototyping environment in which this process can be carried out are also discussed. Some of the central questions answered here include: How does one go from specifications to an actual prototype? How is a prototype evaluated? How is usability defined and thus measured? How do we use the information from evaluation in redesign of an interface? And are there techniques which allow for convergence on a design?
NASA Astrophysics Data System (ADS)
Xing, Xi; Rey-de-Castro, Roberto; Rabitz, Herschel
2014-12-01
Optimally shaped femtosecond laser pulses can often be effectively identified in adaptive feedback quantum control experiments, but elucidating the underlying control mechanism can be a difficult task requiring significant additional analysis. We introduce landscape Hessian analysis (LHA) as a practical experimental tool to aid in elucidating control mechanism insights. This technique is applied to the dissociative ionization of CH2BrI using shaped fs laser pulses for optimization of the absolute yields of ionic fragments as well as their ratios for the competing processes of breaking the C-Br and C-I bonds. The experimental results suggest that these nominally complex problems can be reduced to a low-dimensional control space with insights into the control mechanisms. While the optimal yield for some fragments is dominated by a non-resonant intensity-driven process, the optimal generation of other fragments may be explained by a non-resonant process coupled to few-level resonant dynamics. Theoretical analysis and modeling is consistent with the experimental observations.
Prioritizing parts from cutting bills when gang-ripping first
R. Edward Thomas
1996-01-01
Computer optimization of gang-rip-first processing is a difficult problem when working with specific cutting bills. Interactions among board grade and size, arbor setup, and part sizes and quantities greatly complicate the decision making process. Cutting the wrong parts at any moment will mean that more board footage will be required to meet the bill. Using the ROugh...
ISSUES AND CHALLENGES IN MODELING CHILDREN'S LONGITUDINAL EXPOSURES: AN OZONE STUDY
Modeling children's exposures is a complicated, data-intensive process. Modeling longitudinal exposures, which are important for regulatory decision making, especially for most air toxics, adds another level of complexity and data requirements. Because it is difficult to model in...
Multimedia Networks: Mission Impossible?
ERIC Educational Resources Information Center
Weiss, Andrew M.
1996-01-01
Running multimedia on a network, often difficult because of the memory and processing power required, is becoming easier thanks to new protocols and products. Those developing network design criteria may wish to consider making use of Fast Ethernet, Asynchronous Transfer Method (ATM), switches, "fat pipes", additional network…
78 FR 22922 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-17
...-consuming applications annually, 4 applications of medium difficulty, and 10 of the least difficult... the exemptive order application process, including preparation and revision of an application and... the costs required to prepare other more complex and novel applications. See also Political...
Forming YBa2Cu3O7-x Superconductors On Copper Substrates
NASA Technical Reports Server (NTRS)
Mackenzie, J. Devin; Young, Stanley G.
1991-01-01
Experimental process forms layer of high-critical-temperature ceramic superconductor YBa2Cu3O7-x on surface of copper substrate. Offers possible solution to problem of finishing ceramic superconductors to required final sizes and shapes (a difficult problem because these materials are brittle and cannot be machined or bent). Further research necessary to evaluate superconducting qualities of surface layers and optimize process.
Ambiguity in the processing of Mandarin Chinese relative clauses: One factor cannot explain it all
Mansbridge, Michael P.; Tamaoka, Katsuo; Xiong, Kexin; Verdonschot, Rinus G.
2017-01-01
This study addresses the question of whether native Mandarin Chinese speakers process and comprehend subject-extracted relative clauses (SRC) more readily than object-extracted relative clauses (ORC) in Mandarin Chinese. Presently, this has been a hotly debated issue, with various studies producing contrasting results. Using two eye-tracking experiments with ambiguous and unambiguous RCs, this study shows that both ORCs and SRCs have different processing requirements depending on the locus and time course during reading. The results reveal that ORC reading was possibly facilitated by linear/temporal integration and canonicity. On the other hand, similarity-based interference made ORCs more difficult, and expectation-based processing was more prominent for unambiguous ORCs. Overall, RC processing in Mandarin should not be broken down to a single ORC (dis)advantage, but understood as multiple interdependent factors influencing whether ORCs are either more difficult or easier to parse depending on the task and context at hand. PMID:28594939
NASA Astrophysics Data System (ADS)
Nilsson, Thomy H.
2001-09-01
The psychophysical method of limits was used to measure the distance at which observers could distinguish military vehicles photographed in natural landscapes. Obtained from the TNO-TM Search_2 dataset, these pictures either were rear-projected 35-mm slides or were presented on a computer monitor. Based on the rationale that more difficult vehicle targets would require more visual pathways for recognition, difficulty of acquisition was defined in terms of the relative retinal area required for recognition. Relative retinal area was derived from the inverse square of the recognition distance of a particular vehicle relative to the distance of the vehicle that could be seen furthest away. Results are compared with data on the time required to find the vehicles in these pictures. These comparisons indicate that recognition distance thresholds can be a suitable means of defining standards for the effectiveness of vital graphic information, and that the two methods are complementary with respect to distinguishing different degrees of acquisition difficulty; together they may provide a means to measure the total information processing required for recognition.
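Written out, the quoted definition reduces to a one-line formula (a reconstruction from the text, with $d_i$ the recognition distance of vehicle $i$ and $d_{\max}$ the largest recognition distance in the set):

```latex
A_i \;=\; \left(\frac{d_i}{d_{\max}}\right)^{-2} \;=\; \left(\frac{d_{\max}}{d_i}\right)^{2}, \qquad A_i \;\ge\; 1,
```

so the easiest vehicle has $A = 1$ and harder vehicles require proportionally larger retinal areas for recognition.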
Gary Bentrup; Gary Wells
2005-01-01
Despite the use of planting plans and engineering drawings, many landowners find it difficult to conceptualize what a future conservation practice or system will actually look like on their landscape. This lack of understanding can create challenging barriers in the planning process and is exacerbated by the long-term commitment that many conservation systems require...
Exploring component-based approaches in forest landscape modeling
H. S. He; D. R. Larsen; D. J. Mladenoff
2002-01-01
Forest management issues are increasingly required to be addressed in a spatial context, which has led to the development of spatially explicit forest landscape models. The numerous processes, complex spatial interactions, and diverse applications in spatial modeling make the development of forest landscape models difficult for any single research group. New...
Using a Didactic Manipulator in Mechatronics and Industrial Engineering Courses
ERIC Educational Resources Information Center
Stankovski, Stevan; Tarjan, Laslo; Skrinjar, Dragana; Ostojic, Gordana; Senk, Ivana
2010-01-01
One of the most difficult and most important engineering tasks is the integration of a robot-manipulator into material handling, assembly, and production processes, offering the possibility of supervision and control. The knowledge and skills required for these kinds of tasks are purely mechatronic and, thus, multidisciplinary. This paper…
The Spelling Skills of French-Speaking Dyslexic Children
ERIC Educational Resources Information Center
Plisson, Anne; Daigle, Daniel; Montesinos-Gelet, Isabelle
2013-01-01
Learning to spell is very difficult for dyslexic children, a phenomenon explained by a deficit in processing phonological information. However, to spell correctly in an alphabetic language such as French, phonological knowledge is not enough. Indeed, the French written system requires the speller to acquire visuo-orthographical and morphological…
The Evaluation and Research of Multi-Project Programs: Program Component Analysis.
ERIC Educational Resources Information Center
Baker, Eva L.
1977-01-01
It is difficult to base evaluations on concepts irrelevant to state policy making. Evaluation of a multiproject program requires both time and differentiation of method. Data from the California Early Childhood Program illustrate process variables for program component analysis, and research questions for intraprogram comparison. (CP)
Experimental Results of LightSAR Mision Planning Using a Market-Based System
NASA Technical Reports Server (NTRS)
Wessen, R.; Porter, D.; Hilland, J.
1999-01-01
The allocation of scarce spacecraft resources to multiple users has always been a difficult process. This difficulty arises from the fact that there are never enough resources to meet the stated requirements of the scientific investigators who compete to acquire their desired data sets.
Sea Stories: A Collaborative Tool for Articulating Tactical Knowledge.
ERIC Educational Resources Information Center
Radtke, Paul H.; Frey, Paul R.
Having subject matter experts (SMEs) identify the skills and knowledge to be taught is among the more difficult and time-consuming steps in the training development process. A procedure has been developed for identifying specific tactical decision-making knowledge requirements and translating SME knowledge into appropriate multimedia…
Comparative Analysis on Nonlinear Models for Ron Gasoline Blending Using Neural Networks
NASA Astrophysics Data System (ADS)
Aguilera, R. Carreño; Yu, Wen; Rodríguez, J. C. Tovar; Mosqueda, M. Elena Acevedo; Ortiz, M. Patiño; Juarez, J. J. Medel; Bautista, D. Pacheco
The blending process, being nonlinear, is difficult to model, since it may change significantly depending on the components and the process variables of each refinery. Different components can be blended depending on the existing stock, and the chemical characteristics of each component change dynamically; all are blended until the specification for the different properties required by the customer is reached. One of the most relevant properties is octane, which is difficult to control in line (without component storage). Since each refinery process is quite different, a generic gasoline blending model is not useful when in-line blending is to be done for a specific process. A mathematical gasoline blending model, described in state space as a basic gasoline blending process description, is presented in this paper for a given process. The objective is to adjust the parameters so that the blending model describes a signal along its trajectory, represented with the extreme learning machine neural network method and also with the nonlinear autoregressive-moving average (NARMA) neural network method, such that a comparative study can be developed.
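Of the two methods named in the abstract, the extreme learning machine is the simpler to sketch: hidden-layer weights are drawn at random and only the output weights are fit, by ordinary least squares. The toy model below illustrates that method in general; it is not the authors' blending model, and the target function is invented.

```python
# Minimal extreme learning machine (ELM): random fixed hidden layer,
# output weights solved in closed form by least squares.
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y, hidden=50):
    W = rng.normal(size=(X.shape[1], hidden))  # random input weights, never trained
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)                     # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

X = rng.uniform(-1.0, 1.0, size=(200, 3))      # toy stand-in for blend variables
y = np.sin(X[:, 0]) + X[:, 1] * X[:, 2]        # invented nonlinear "property"
W, b, beta = elm_fit(X, y)
print(np.abs(elm_predict(X, W, b, beta) - y).max())  # small training residual
```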
ERIC Educational Resources Information Center
Bifuh-Ambe, Elizabeth
2013-01-01
Writing is a complex, recursive and difficult process that requires strategic decision-making across multiple domains (Graham, 2006; Pritchard & Honeycutt, 2006). Students are expected to use this process to communicate with a variety of audiences for a variety of purposes. Modelling and providing effective instruction is critical, especially…
[Early mother-infant interaction and factors negatively affecting parenting].
Cerezo, María Angeles; Trenado, Rosa María; Pons-Salvador, Gemma
2006-08-01
The social information-processing model contributes to identifying the psychological processes underlying the construct "sensitivity" in early mother-child interaction. Negative emotional states, associated with inadequate self-regulation in coping with stressors, affect the mother's attention skills and the processing of the baby's signals. This leads to less synchronous parental practices, which are particularly unsatisfactory when the baby is unhappy or crying, because the required self-regulation is not provided. This micro-social research studies the sequential profile of maternal reactions to the baby's positive/neutral vs. difficult behaviours and compares them in two groups of dyads: one with mothers who reported high levels of distress and other factors negatively affecting parenting, and another with low levels. The unfavourable circumstances of the high-stress group and their negative effects on interaction were observed in some indiscriminate maternal responses, and particularly as the mothers reacted to their baby's difficult behaviour, when the mother's regulatory role is more necessary.
A Multidisciplinary Approach to Mixer-Ejector Analysis and Design
NASA Technical Reports Server (NTRS)
Hendricks, Eric S.; Seidel, Jonathan A.
2012-01-01
The design of an engine for a civil supersonic aircraft presents a difficult multidisciplinary problem to propulsion system engineers. There are numerous competing requirements for the engine, such as being efficient during cruise while remaining quiet enough at takeoff to meet airport noise regulations. The use of mixer-ejector nozzles presents one possible solution to this challenge. However, designing a mixer-ejector which will successfully address both of these concerns is a difficult proposition. Presented in this paper is an integrated multidisciplinary approach to the analysis and design of these systems. A process that uses several low-fidelity tools to evaluate both the performance and acoustics of mixer-ejector nozzles is described. This process is further expanded to include system-level modeling of engines and aircraft to determine the effects on mission performance and noise near airports. The overall process is developed in the OpenMDAO framework currently being developed by NASA. From the developed process, sample results are given for a notional mixer-ejector design, thereby demonstrating the capabilities of the method.
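As one concrete illustration of how a discipline plugs into OpenMDAO, the sketch below wires a toy noise surrogate into a model. Two caveats: the `openmdao.api` interface shown postdates the 2012 paper, and the component, variables, and the linear noise relation are all invented for illustration.

```python
# Hypothetical single-discipline OpenMDAO component; not the paper's model.
import openmdao.api as om

class EjectorNoise(om.ExplicitComponent):
    """Toy surrogate: more secondary-flow entrainment -> lower jet noise."""
    def setup(self):
        self.add_input("pumping_ratio", val=0.5)    # secondary/primary mass flow
        self.add_output("noise_db", val=100.0)
        self.declare_partials("noise_db", "pumping_ratio", method="fd")

    def compute(self, inputs, outputs):
        outputs["noise_db"] = 100.0 - 20.0 * inputs["pumping_ratio"]  # invented relation

prob = om.Problem()
prob.model.add_subsystem("noise", EjectorNoise(), promotes=["*"])
prob.setup()
prob.set_val("pumping_ratio", 0.8)
prob.run_model()
print(prob.get_val("noise_db"))  # [84.]
```

In the paper's setting, components like this for nozzle performance, engine cycle, and mission analysis would be coupled so that a single optimization sees both cruise efficiency and takeoff noise.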
Cognitive Load Mediates the Effect of Emotion on Analytical Thinking.
Trémolière, Bastien; Gagnon, Marie-Ève; Blanchette, Isabelle
2016-11-01
Although the detrimental effect of emotion on reasoning has been evidenced many times, the cognitive mechanism underlying this effect remains unclear. In the present paper, we explore the cognitive load hypothesis as a potential explanation. In an experiment, participants solved syllogistic reasoning problems with either neutral or emotional contents. Participants were also presented with a secondary task, for which the difficult version requires the mobilization of cognitive resources to be correctly solved. Participants performed overall worse and took longer on emotional problems than on neutral problems. Performance on the secondary task, in the difficult version, was poorer when participants were reasoning about emotional, compared to neutral contents, consistent with the idea that processing emotion requires more cognitive resources. Taken together, the findings afford evidence that the deleterious effect of emotion on reasoning is mediated by cognitive load.
Students Fail to Transfer Knowledge of Chromosome Structure to Topics Pertaining to Cell Division
ERIC Educational Resources Information Center
Newman, Dina L.; Catavero, Christina M.; Wright, L. Kate
2012-01-01
Cellular processes that rely on knowledge of molecular behavior are difficult for students to comprehend. For example, thorough understanding of meiosis requires students to integrate several complex concepts related to chromosome structure and function. Using a grounded theory approach, we have unified classroom observations, assessment data, and…
Applying Project Management Strategies in a Large Curriculum Conversion Project in Higher Education
ERIC Educational Resources Information Center
Gardner, Joel; Bennett, Patrick A.; Hyatt, Niccole; Stoker, Kevin
2017-01-01
Higher education is undergoing great changes that require universities to adapt quickly, and making these changes can be difficult. One discipline that can aid in executing change is project management, which has developed a set of clear processes and strategies for completing initiatives quickly and effectively. Several authors have identified…
Profits or Professionalism: Issues Facing the Professionalization of TESL in Canada
ERIC Educational Resources Information Center
MacPherson, Seonaigh; Kouritzin, Sandra; Kim, Sohee
2005-01-01
TESL is a field in the process of professionalization. As TESL organizations in Canada struggle to gain professional stature for the field, market demands for ESL teachers in Canada and around the world increase exponentially. This creates a dilemma: whereas professionalization requires making the field more difficult to access without specialized…
Determination of the Size and Depth of Craters on the Moon
ERIC Educational Resources Information Center
Grubelnik, Vladimir; Marhl, Marko; Repnik, Robert
2018-01-01
Experimental work in the research of astronomical phenomena is often difficult or even impossible because of long-lasting processes or too distant objects and correspondingly too expensive equipment. In this paper, we present an example of observation of the Moon, which is our nearest astronomic object and therefore does not require professional…
Early Domain-Specific Knowledge? Nonlinear Developmental Trajectories Further Erode a House of Sand
ERIC Educational Resources Information Center
Deak, Gedeon O.
2011-01-01
Rakison and Yermolayeva (this issue) argue that domain specificity is difficult to reconcile with U-, N-, or M-shaped developmental trends. They are justified because: (1) There is no compelling evidence that nonlinear trends require mechanisms beyond general, well-known cognitive processes; and (2) epigenetic neuroscience provides no clear…
Increasing Conceptual Understanding of Glycolysis & the Krebs Cycle Using Role-Play
ERIC Educational Resources Information Center
Ross, Pauline M.; Tronson, Deidre A.; Ritchie, Raymond J.
2008-01-01
Cellular respiration and metabolism are topics that are reportedly poorly understood by students and judged to be difficult by many teachers. Although these topics may not be required learning areas in some high school biology curricula, a grasp of fundamental concepts of cellular metabolic processes is advantageous for students undertaking (or…
International and Immigrant Students in Community Colleges: Who They Are and How To Help Them.
ERIC Educational Resources Information Center
Peterman, Dana
2003-01-01
Provides resources to help counselors assess, understand, and provide appropriate services for international and immigrant student populations in community colleges. States that, because there is so much variation among the population, the adjustment process is difficult and requires administrative intervention. (Contains 11 citations.) (AUTH/NB)
Welding And Cutting A Nickel Alloy By Laser
NASA Technical Reports Server (NTRS)
Banas, C. M.
1990-01-01
Technique effective and energy-efficient. Report describes evaluation of laser welding and cutting of Inconel(R) 718. Notes that electron-beam welding processes developed for In-718, but difficult to use on large or complex structures. Cutting of In-718 by laser fast and produces only narrow kerf. Cut edge requires dressing, to endure fatigue.
Linguistic Skills Involved in Learning to Spell: An Australian Study
ERIC Educational Resources Information Center
Daffern, Tessa
2017-01-01
Being able to accurately spell in Standard English requires efficient coordination of multiple knowledge sources. Therefore, spelling is a word-formation problem-solving process that can be difficult to learn. The present study uses Triple Word Form Theory as a conceptual framework to analyse Standard English spelling performance levels of…
Advanced silver zinc battery development for the SRB and ET range safety subsystems
NASA Technical Reports Server (NTRS)
Adamedes, Zoe
1994-01-01
This document presents in viewgraph format the design and development of silver zinc (AgZn) batteries for the solid rocket booster (SRB) and external tank (ET) range safety subsystems. Various engineering techniques, including composite separator systems, new electrode processing techniques, and new restraint techniques, were used to meet difficult requirements.
ERIC Educational Resources Information Center
She, Hsiao-Ching
2005-01-01
The author explored the potential to promote students' understanding of difficult science concepts through an examination of the inter-relationships among the teachers' instructional approach, students' learning preference styles, and their levels of learning process. The concept "air pressure," which requires an understanding of…
Model-based pH monitor for sensor assessment.
van Schagen, Kim; Rietveld, Luuk; Veersma, Alex; Babuska, Robert
2009-01-01
Owing to the nature of the treatment processes, monitoring the processes based on individual online measurements is difficult or even impossible. However, the measurements (online and laboratory) can be combined with a priori process knowledge, using mathematical models, to objectively monitor the treatment processes and measurement devices. The pH measurement is commonly used at different stages in a drinking water treatment plant, although it is an unreliable instrument requiring significant maintenance. It is shown that, using a grey-box model, it is possible to assess the measurement devices effectively, even if detailed information on the specific processes is unknown.
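The general idea of model-based sensor assessment can be sketched as a residual check: compare what the sensor reports against what the process model predicts and flag the instrument when the residual drifts. This is a generic illustration under assumed names and thresholds, not the authors' grey-box model.

```python
# Generic model-based sensor check: flag a pH sensor when its residual
# against a model prediction exceeds a tolerance (all values invented).
import numpy as np

def sensor_ok(measured: np.ndarray, predicted: np.ndarray, tol: float = 0.3) -> bool:
    """True while the mean absolute model residual stays within tolerance."""
    return float(np.mean(np.abs(measured - predicted))) < tol

model_pH = np.full(24, 7.8)                                    # hourly model predictions
noise = np.random.default_rng(1).normal(0.0, 0.05, 24)
sensor_pH = model_pH + noise + 0.5                             # simulated drifted sensor
print(sensor_ok(sensor_pH, model_pH))                          # False -> needs maintenance
```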
Nuclear data for r-process models from ion trap measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, Jason, E-mail: jclark@anl.gov
2016-06-21
To truly understand how elements are created in the universe via the astrophysical r process, accurate nuclear data are required. Historically, the isotopes involved in the r process have been difficult to access for study, but the development of new facilities and measurement techniques have put many of the r-process isotopes within reach. This paper will discuss the new CARIBU facility at Argonne National Laboratory and two pieces of experimental equipment, the Beta-decay Paul Trap and the Canadian Penning Trap, that will dramatically increase the nuclear data available for models of the astrophysical r process.
Database management systems for process safety.
Early, William F
2006-03-17
Several elements of the process safety management (PSM) regulation require tracking and documentation of actions: process hazard analyses, management of change, process safety information, operating procedures, training, contractor safety programs, pre-startup safety reviews, incident investigations, emergency planning, and compliance audits. These elements can result in hundreds of items annually that require action. This tracking and documentation is commonly a failing identified in compliance audits, and it is difficult to manage through action lists, spreadsheets, or other tools that are comfortably manipulated by plant personnel. This paper discusses the recent implementation of a database management system at a chemical plant and chronicles the improvements accomplished through the introduction of a customized system. The system as implemented modeled the normal plant workflows and provided simple, recognizable user interfaces for ease of use.
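A minimal version of such a tracking system can be a single database table keyed by PSM element, queried for open and overdue items. The schema and field names below are illustrative only, not the plant's actual system.

```python
# Toy PSM action-item tracker: one table, one overdue-items query.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE action_items (
        id          INTEGER PRIMARY KEY,
        psm_element TEXT,            -- e.g. 'PHA', 'MOC', 'incident investigation'
        description TEXT,
        owner       TEXT,
        due_date    TEXT,            -- ISO 8601 date
        closed      INTEGER DEFAULT 0
    )""")
db.execute(
    "INSERT INTO action_items (psm_element, description, owner, due_date) "
    "VALUES ('PHA', 'Install relief valve on T-101', 'jsmith', '2006-05-01')")
overdue = db.execute(
    "SELECT psm_element, description, owner FROM action_items "
    "WHERE closed = 0 AND due_date < date('now')").fetchall()
print(overdue)  # items still open past their due date
```

The real value described in the paper comes from modeling the plant's workflows (approvals, closure evidence, audit trails) on top of such a core.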
Edge systems in the deep ocean
NASA Astrophysics Data System (ADS)
Coon, Andrew; Earp, Samuel L.
2010-04-01
DARPA has initiated a program to explore persistent presence in the deep ocean. The deep ocean is difficult to access and presents a hostile environment. Persistent operations in the deep ocean will require new technology for energy, communications and autonomous operations. Several fundamental characteristics of the deep ocean shape any potential system architecture. The deep sea presents acoustic sensing opportunities that may provide significantly enhanced sensing footprints relative to sensors deployed at traditional depths. Communication limitations drive solutions towards autonomous operation of the platforms and automation of data collection and processing. Access to the seabed presents an opportunity for fixed infrastructure with no important limitations on size and weight. Difficult access and persistence impose requirements for long-life energy sources and potentially energy harvesting. The ocean is immense, so there is a need to scale the system footprint for presence over tens of thousands and perhaps hundreds of thousands of square nautical miles. This paper focuses on the aspect of distributed sensing, and the engineering of networks of sensors to cover the required footprint.
NASA Technical Reports Server (NTRS)
Bredt, J. H.
1974-01-01
Two types of space processing operations may be considered economically justified; they are manufacturing operations that make profits and experiment operations that provide needed applied research results at lower costs than those of alternative methods. Some examples from the Skylab experiments suggest that applied research should become cost effective soon after the space shuttle and Spacelab become operational. In space manufacturing, the total cost of space operations required to process materials must be repaid by the value added to the materials by the processing. Accurate estimates of profitability are not yet possible because shuttle operational costs are not firmly established and the markets for future products are difficult to estimate. However, approximate calculations show that semiconductor products and biological preparations may be processed on a scale consistent with market requirements and at costs that are at least compatible with profitability using the Shuttle/Spacelab system.
Optical radiation measurements: instrumentation and sources of error.
Landry, R J; Andersen, F A
1982-07-01
Accurate measurement of optical radiation is required when sources of this radiation are used in biological research. The most difficult measurements of broadband noncoherent optical radiations usually must be performed by a highly trained specialist using sophisticated, complex, and expensive instruments. Presentation of the results of such measurement requires correct use of quantities and units with which many biological researchers are unfamiliar. The measurement process, physical quantities and units, measurement systems with instruments, and sources of error and uncertainties associated with optical radiation measurements are reviewed.
A Novel Approach to Teaching and Understanding Transformations of Matter in Dynamic Earth Systems
ERIC Educational Resources Information Center
Clark, Scott K.; Sibley, Duncan F.; Libarkin, Julie C.; Heidemann, Merle
2009-01-01
The need to engage K-12 and post-secondary students in considering the Earth as a dynamic system requires explicit discussion of system characteristics. Fundamentally, dynamic systems involve the movement and change of matter, often through processes that are difficult to see and comprehend. We introduce a novel instructional method, termed…
What Makes a Word Difficult? Insights into the Mental Representation of Technical Terms
ERIC Educational Resources Information Center
Jucks, Regina; Paus, Elisabeth
2012-01-01
Learning from texts requires reflection on how far one has mastered the material. Learners use such metacognitive processes to decide whether to engage in deeper learning activities or not. This article examines how the lexical surface of specialist concepts influences their mental representation. Lexical encodings that are the concise wordings of…
From Assessment to Annual Goal: Engaging a Decision-Making Process in Writing Measurable IEPs
ERIC Educational Resources Information Center
Capizzi, Andrea M.
2008-01-01
Despite federal regulations requiring measurable individualized education programs (IEPs), IEPs are often vague and unfocused, making them difficult to use in guiding instructional planning. Although a well-written IEP can be time consuming and labor intensive, a clearly written IEP, based on documented student needs, can and should be a guidepost…
Bridging Physics and Biology Using Resistance and Axons
ERIC Educational Resources Information Center
Dyer, Joshua M.
2014-01-01
When teaching physics, it is often difficult to get biology-oriented students to see the relevance of physics. A complaint often heard is that biology students are required to take physics for the Medical College Admission Test (MCAT) as part of a "weeding out" process, but that they don't feel like they need physics for biology. Despite…
ERIC Educational Resources Information Center
Wood, Brian D.
2009-01-01
Although the multiscale structure of many important processes in engineering is becoming more widely acknowledged, making this connection in the classroom is a difficult task. This is due in part because the concept of multiscale structure itself is challenging and it requires the students to develop new conceptual pictures of physical systems,…
Translating Head Motion into Attention - Towards Processing of Student's Body-Language
ERIC Educational Resources Information Center
Raca, Mirko; Kidzinski, Lukasz; Dillenbourg, Pierre
2015-01-01
Evidence has shown that student's attention is a crucial factor for engagement and learning gain. Although it can be accurately assessed ad-hoc by an experienced teacher, continuous contact with all students in a large class is difficult to maintain and requires training for novice practitioners. We continue our previous work on investigating…
Teaching Historical Research Skills to Generation Y: One Instructor's Approach
ERIC Educational Resources Information Center
Thaler, Valerie S.
2013-01-01
In this article, the author offers a summary of the major research assignment she has developed for HIST 100, as well as the successes and struggles she has had along the way. The project requires students to experience research as a difficult process that demands their patience, perseverance, and assiduousness. Group work in class clearly plays…
ERIC Educational Resources Information Center
Carrington, Linda G.
2012-01-01
Both students and instructors alike will generally agree that intermediate accounting courses are among the most difficult and demanding in an accounting or finance curriculum, and perhaps even on the college campus. Intermediate accounting contains subject matter which requires a higher level of thinking and a greater ability to process prior…
Airbreathing Propulsion System Analysis Using Multithreaded Parallel Processing
NASA Technical Reports Server (NTRS)
Schunk, Richard Gregory; Chung, T. J.; Rodriguez, Pete (Technical Monitor)
2000-01-01
In this paper, parallel processing is used to analyze the mixing and combustion behavior of hypersonic flow. Preliminary work for a sonic transverse hydrogen jet injected from a slot into a Mach 4 airstream in a two-dimensional duct combustor has been completed [Moon and Chung, 1996]. Our aim is to extend this work to a three-dimensional domain using multithreaded domain decomposition parallel processing based on the flowfield-dependent variation theory. Numerical simulations of chemically reacting flows are difficult because of the strong interactions between the turbulent hydrodynamic and chemical processes. The algorithm must provide an accurate representation of the flowfield, since unphysical flowfield calculations will lead to the faulty loss or creation of species mass fractions, or even premature ignition, which in turn alters the flowfield information. Another difficulty arises from the disparity in time scales between the flowfield and chemical reactions, which may require the use of finite rate chemistry. The situation is more complex when there is a disparity in the length scales involved in turbulence. In order to cope with these complicated physical phenomena, it is our plan to utilize the flowfield-dependent variation theory mentioned above, facilitated by large eddy simulation. Undoubtedly, the proposed computation requires the most sophisticated computational strategies. The multithreaded domain decomposition parallel processing will be necessary in order to reduce both computational time and storage. Without special treatments involved in computer engineering, our attempt to analyze the airbreathing combustion appears to be difficult, if not impossible.
The Integrated Airframe/Propulsion Control System Architecture program (IAPSA)
NASA Technical Reports Server (NTRS)
Palumbo, Daniel L.; Cohen, Gerald C.; Meissner, Charles W.
1990-01-01
The Integrated Airframe/Propulsion Control System Architecture program (IAPSA) is a two-phase program which was initiated by NASA in the early 80s. The first phase, IAPSA 1, studied different architectural approaches to the problem of integrating engine control systems with airframe control systems in an advanced tactical fighter. One of the conclusions of IAPSA 1 was that the technology to construct a suitable system was available, yet the ability to create these complex computer architectures has outpaced the ability to analyze the resulting system's performance. With this in mind, the second phase of IAPSA approached the same problem with the added constraint that the system be designed for validation. The intent of the design for validation requirement is that validation requirements should be shown to be achievable early in the design process. IAPSA 2 has demonstrated that despite diligent efforts, integrated systems can retain characteristics which are difficult to model and, therefore, difficult to validate.
A low-cost solid–liquid separation process for enzymatically hydrolyzed corn stover slurries
Sievers, David A.; Lischeske, James J.; Biddy, Mary J.; ...
2015-07-01
Solid-liquid separation of intermediate process slurries is required in some process configurations for the conversion of lignocellulosic biomass to transportation fuels. Thermochemically pretreated and enzymatically hydrolyzed corn stover slurries have proven difficult to filter due to formation of very low permeability cakes that are rich in lignin. Treatment of two different slurries with polyelectrolyte flocculant was demonstrated to increase mean particle size and filterability. Filtration flux was greatly improved, and thus scaled filter unit capacity was increased approximately 40-fold compared with unflocculated slurry. Although additional costs were accrued using polyelectrolyte, techno-economic analysis revealed that the increase in filter capacity significantly reduced overall production costs. Fuel production cost at 95% sugar recovery was reduced by $1.35 US per gallon gasoline equivalent for dilute-acid pretreated and enzymatically hydrolyzed slurries and $3.40 for slurries produced using an additional alkaline de-acetylation preprocessing step that is even more difficult to natively filter.
ERIC Educational Resources Information Center
Rodicio, Hector Garcia; Sanchez, Emilio; Acuna, Santiago R.
2013-01-01
Acquiring complex conceptual knowledge requires learners to self-regulate their learning by planning, monitoring, and adjusting the process but they find it difficult to do so. In one experiment, we examined whether learners need broad systems of support for self-regulation or whether they are also able to learn with more economical support…
ERIC Educational Resources Information Center
Alevriadou, Anastasia; Giaouri, Stergiani
2015-01-01
Written language is a difficult endeavour as the demands of transcription require self-regulatory skills from a motor, cognitive and attention perspective. The purpose of the present study was to investigate the relation between the Test of Writing Difficulties (Porpodas et al., 2007) and the Test of Detection and Investigation of Executive…
ERIC Educational Resources Information Center
Rajabi, Shima; Azizifar, Akbar; Gowhary, Habib
2015-01-01
Learning a foreign language requires students to acquire both grammatical knowledge and socio-pragmatic rules of a language. Pragmatic competence as one of the most difficult aspects of language provides several challenges to L2 learners in the process of learning a foreign language. To overcome this problem, EFL teachers should find the most…
Global Optimization of Low-Thrust Interplanetary Trajectories Subject to Operational Constraints
NASA Technical Reports Server (NTRS)
Englander, Jacob Aldo; Vavrina, Matthew; Hinckley, David
2016-01-01
Low-thrust electric propulsion provides many advantages for missions to difficult targets: comets and asteroids, Mercury, and the outer planets (given a sufficient power supply). Low-thrust electric propulsion is characterized by high power requirements but also very high specific impulse (Isp), leading to very good mass fractions. Low-thrust trajectory design is a very different process from chemical trajectory design.
How do they make it look so easy? The expert orienteer's cognitive advantage.
Eccles, David W; Arsal, Guler
2015-01-01
Expertise in sport can appear so extraordinary that it is difficult to imagine how "normal" individuals may achieve it. However, in this review, we show that experts in the sport of orienteering, which requires on-foot navigation using map and compass through wild terrain, can make the difficult look easy because they have developed a cognitive advantage. Specifically, they have acquired knowledge of cognitive and behavioural strategies that allow them to circumvent natural limitations on attention. Cognitive strategies include avoiding peaks of demand on attention by distributing the processing of map information over time and reducing the need to attend to the map by simplifying the navigation required to complete a race. Behavioural strategies include reducing the visual search required of the map by physically arranging and rearranging the map display during races. It is concluded that expertise in orienteering can be partly attributed to the circumvention of natural limitations on attention achieved via the employment of acquired cognitive and behavioural strategies. Thus, superior performance in sport may not be the possession of only a privileged few; it may be available to all aspiring athletes.
Luo, Mei; Wang, Hao; Lyu, Zhi
2017-12-01
Species distribution models (SDMs) are widely used by researchers and conservationists. Predictions from different models vary significantly, which makes model selection difficult for users. In this study, we evaluated the performance of two commonly used SDMs, Biomod2 and Maximum Entropy (MaxEnt), with real presence/absence data for the giant panda, and used three indicators, i.e., area under the ROC curve (AUC), true skill statistic (TSS), and Cohen's kappa, to evaluate the accuracy of the two models' predictions. The results showed that both models could produce accurate predictions given adequate occurrence inputs and simulation repeats. Compared to MaxEnt, Biomod2 made more accurate predictions, especially when occurrence inputs were few. However, Biomod2 was more difficult to apply, required longer running times, and had less data processing capability. To choose the right model, users should refer to the error requirements of their objectives. MaxEnt should be considered if the error requirement is clear and both models can meet it; otherwise, we recommend using Biomod2 wherever possible.
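To make the comparison concrete, here is a minimal sketch of how the three accuracy indicators named above can be computed for a set of presence/absence predictions; the observations, scores, and the 0.5 threshold are illustrative placeholders, not the study's data.

```python
# Minimal sketch: AUC, TSS, and Cohen's kappa for presence/absence
# predictions from a species distribution model (data invented).
import numpy as np
from sklearn.metrics import roc_auc_score, cohen_kappa_score, confusion_matrix

y_true = np.array([1, 1, 0, 0, 1, 0, 1, 0])                   # observed presence/absence
y_prob = np.array([0.9, 0.7, 0.3, 0.2, 0.6, 0.4, 0.8, 0.1])   # predicted probabilities
y_pred = (y_prob >= 0.5).astype(int)                          # threshold at 0.5

auc = roc_auc_score(y_true, y_prob)                           # area under the ROC curve
kappa = cohen_kappa_score(y_true, y_pred)                     # chance-corrected agreement

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
tss = sensitivity + specificity - 1                           # true skill statistic

print(f"AUC={auc:.2f}  TSS={tss:.2f}  kappa={kappa:.2f}")
```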
Missile signal processing common computer architecture for rapid technology upgrade
NASA Astrophysics Data System (ADS)
Rabinkin, Daniel V.; Rutledge, Edward; Monticciolo, Paul
2004-10-01
Interceptor missiles process IR images to locate an intended target and guide the interceptor towards it. Signal processing requirements have increased as the sensor bandwidth increases and interceptors operate against more sophisticated targets. A typical interceptor signal processing chain comprises two parts. Front-end video processing operates on all pixels of the image and performs such operations as non-uniformity correction (NUC), image stabilization, frame integration and detection. Back-end target processing, which tracks and classifies targets detected in the image, performs such algorithms as Kalman tracking, spectral feature extraction and target discrimination. In the past, video processing was implemented using ASIC components or FPGAs because computation requirements exceeded the throughput of general-purpose processors. Target processing was performed using hybrid architectures that included ASICs, DSPs and general-purpose processors. The resulting systems tended to be function-specific, and required custom software development. They were developed using non-integrated toolsets, and test equipment was developed along with the processor platform. The lifespan of a system utilizing the signal processing platform often spans decades, while the specialized nature of processor hardware and software makes it difficult and costly to upgrade. As a result, the signal processing systems often run on outdated technology, algorithms are difficult to update, and system effectiveness is impaired by the inability to rapidly respond to new threats. A new design approach is made possible by three developments: Moore's Law-driven improvement in computational throughput; a newly introduced vector computing capability in general-purpose processors; and a modern set of open interface software standards. Today's multiprocessor commercial-off-the-shelf (COTS) platforms have sufficient throughput to support interceptor signal processing requirements. This application may be programmed under existing real-time operating systems using parallel processing software libraries, resulting in highly portable code that can be rapidly migrated to new platforms as processor technology evolves. Use of standardized development tools and third-party software upgrades is enabled, as well as rapid upgrade of processing components as improved algorithms are developed. The resulting weapon system will have superior processing capability over a custom approach at the time of deployment as a result of shorter development cycles and the use of newer technology. The signal processing computer may be upgraded over the lifecycle of the weapon system, and can migrate between weapon system variants because modifications are simple. This paper presents a reference design using the new approach that utilizes an Altivec PowerPC parallel COTS platform. It uses a VxWorks-based real-time operating system (RTOS), and application code developed using an efficient parallel vector library (PVL). A quantification of computing requirements and a demonstration of the interceptor algorithm operating on this real-time platform are provided.
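As an illustration of the front-end video processing stage described above, the sketch below implements a textbook two-point non-uniformity correction (NUC) in NumPy; the calibration frames, source levels, and array sizes are invented for the example and are not taken from the paper.

```python
# Two-point non-uniformity correction (NUC) sketch for an IR focal plane.
# Two calibration frames against uniform cold/hot sources give a per-pixel
# gain and offset; all numbers here are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)
cold = 100 + 5 * rng.standard_normal((256, 256))   # per-pixel response, cold source
hot = 200 + 5 * rng.standard_normal((256, 256))    # per-pixel response, hot source
t_cold, t_hot = 10.0, 50.0                         # known uniform source levels

gain = (t_hot - t_cold) / (hot - cold)             # per-pixel gain
offset = t_cold - gain * cold                      # per-pixel offset

raw_frame = 150 + 5 * rng.standard_normal((256, 256))
corrected = gain * raw_frame + offset              # uniformity-corrected frame
```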
NASA Astrophysics Data System (ADS)
Hong, Y.; Curteza, A.; Zeng, X.; Bruniaux, P.; Chen, Y.
2016-06-01
Material selection is the most difficult step in the customized garment design and development process. This study aims to create a hierarchical framework for material selection. Analytic hierarchy process and fuzzy set theories have been applied to reconcile the diverse requirements from the customer and the inherent interactions/interdependencies among these requirements. Sensory evaluation ensures a quick and effective selection, drawing on the professional knowledge of the designers, without complex laboratory tests such as KES and FAST. A real empirical application for physically disabled people is carried out to demonstrate the proposed method. Both the theoretical and practical background of this paper indicate that the fuzzy analytic network process can capture experts' knowledge existing in the form of incomplete, ambiguous, and vague information about the mutual influence of the attributes and criteria of material selection.
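For readers unfamiliar with the analytic hierarchy step, the sketch below derives criterion weights from a pairwise comparison matrix via its principal eigenvector, with a Saaty-style consistency check; the matrix entries are invented for illustration, and the fuzzy extension used in the paper is not reproduced.

```python
# Sketch of the analytic hierarchy process step: criterion weights from a
# pairwise comparison matrix via its principal eigenvector (matrix invented).
import numpy as np

# A[i, j] = how much more important criterion i is than criterion j (Saaty scale).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenvalue index
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                 # normalized priority weights

# Consistency ratio; RI = 0.58 is Saaty's random index for a 3x3 matrix.
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
print("weights:", np.round(w, 3), " CR:", round(ci / 0.58, 3))
```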
The Effect of Gravity on the Combustion Synthesis of Porous Biomaterials
NASA Technical Reports Server (NTRS)
Castillo, M.; Zhang, X.; Moore, J. J.; Schowengerdt, F. D.; Ayers, R. A.
2003-01-01
Production of highly porous composite materials by traditional materials processing is limited by difficult processing techniques. This work investigates the use of self propagating high temperature (combustion) synthesis (SHS) to create porous tricalcium phosphate (Ca3(PO4)2), TiB-Ti, and NiTi in low and microgravity. Combustion synthesis provides the ability to use set processing parameters to engineer the required porous structure suitable for bone repair or replacement. The processing parameters include green density, particle size, gasifying agents, composition, and gravity. The advantage of the TiB-Ti system is the high level of porosity achieved together with a modulus that can be controlled by both composition (TiB-Ti) and porosity. At the same time, NiTi exhibits shape memory properties. SHS of biomaterials allows the engineering of required porosity coupled with resorbtion properties and specific mechanical properties into the composite materials to allow for a better biomaterial.
Laser inactivation of pathogenic viruses in water
NASA Astrophysics Data System (ADS)
Grishkanich, Alexander; Zhevlakov, Alexander; Kascheev, Sergey; Sidorov, Igor; Ruzankina, Julia; Yakovlev, Alexey; Mak, Andrey
2016-03-01
Currently, conditions make it difficult to provide the population with drinking water that meets sanitary and hygienic requirements. One of the urgent problems is the need for water disinfection. The emergence of waterborne pathogens, such as the agents of typhoid and cholera, requires constant purification of water against pathogenic bacteria. The water treatment process destroys up to 98% of germs, but pathogenic viruses may remain among the survivors, and their destruction requires special handling. As a result of the research conducted, the following methods have been proposed for combating harmful microorganisms: sterilization of water by laser radiation and by UV lamp.
Automated plasma control with optical emission spectroscopy
NASA Astrophysics Data System (ADS)
Ward, P. P.
Plasma etching and desmear processes for printed wiring board (PWB) manufacture are difficult to predict and control. Non-uniformity of most plasma processes and sensitivity to environmental changes make it difficult to maintain process stability from day to day. To assure plasma process performance, weight loss coupons or post-plasma destructive testing must be used. These techniques are not real-time methods, however, and do not allow for immediate diagnosis and process correction. These tests often require scrapping some fraction of a batch to ensure the integrity of the rest. Since these tests verify a successful cycle with post-plasma diagnostics, poor test results often determine that a batch is substandard and the resulting parts unusable. These tests add significantly to the overall fabrication cost. A more efficient method of testing would allow for constant monitoring of plasma conditions and process control. Process anomalies should be detected and corrected before the parts being treated are damaged. Real-time monitoring would allow for instantaneous corrections. Multiple-site monitoring would allow for process mapping within one system or simultaneous monitoring of multiple systems. Optical emission spectroscopy conducted external to the plasma apparatus would allow for this sort of multifunctional analysis without perturbing the glow discharge. In this paper, optical emission spectroscopy for non-intrusive, in situ process control will be explored, along with applications of this technique for process control, failure analysis, and endpoint determination in PWB manufacture.
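A minimal sketch of the monitoring idea follows: track a single emission-line intensity over time and call the endpoint when the smoothed signal falls below a fraction of its initial plateau. The synthetic signal, window length, and 10% threshold are placeholders, not values from the paper.

```python
# Illustrative endpoint detection on an optical emission line: declare the
# endpoint when the smoothed intensity drops below 10% of its initial
# plateau (signal and threshold are synthetic placeholders).
import numpy as np

t = np.arange(0.0, 120.0, 0.5)                     # seconds
rng = np.random.default_rng(1)
intensity = 1.0 / (1.0 + np.exp((t - 80) / 4))     # line decays as the etch completes
intensity += 0.01 * rng.standard_normal(t.size)    # measurement noise

win = 20                                           # rolling-average window (samples)
smooth = np.convolve(intensity, np.ones(win) / win, mode="same")
baseline = smooth[win:2 * win].mean()              # initial plateau level

below = np.flatnonzero(smooth < 0.10 * baseline)
if below.size:
    print(f"endpoint at t = {t[below[0]]:.1f} s")
else:
    print("endpoint not reached")
```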
Process and product development in the manufacturing of molecular therapeutics.
Atkinson, E M; Christensen, J R
1999-08-01
In the development of molecular therapies, a great deal of attention has focused on tissue targets, gene delivery vectors, and expression cassettes. In order to become an approved therapy, however, a molecular therapeutic has to pass down the same product registration pathway as any other biological product. Moving from research into industrial production requires careful attention to regulatory, manufacturing and quality concerns. Early work on developing and characterizing robust and scaleable manufacturing processes will ultimately be rewarded by ease of implementation as the product is successful in clinical trials. Regulatory agencies require solid process and product characterization studies to demonstrate control and understanding of the molecular therapeutic. As the gene therapy industry matures, standards will continue to rise, creating an industry that is capable of producing safe, high-quality and effective therapies for many of the world's most difficult disease targets.
Improved Concrete Cutting and Excavation Capabilities for Crater Repair Phase 2
2015-05-01
… production rate and ease of execution. The current ADR techniques, tactics, and procedures (TTPs) indicate cutting of pavement around a small crater … demonstrations and evaluations were used to create the techniques, tactics, and procedures (TTPs) manual describing the processes and requirements of … was more difficult when dowels were present. In general, the OUA demonstration validated that the new materials, equipment, and procedures were …
ERIC Educational Resources Information Center
McPheron, Benjamin D.; Thangaraj, Charles V.; Thomas, Charles R.
2017-01-01
Laboratory courses can be difficult to fit into an engineering program at a liberal arts-focused university, which requires students to be exposed to appropriate breadth, as well as sufficient depth in their engineering education. One possible solution to this issue is to integrate laboratory exercises with lecture in a "studio" format,…
ERIC Educational Resources Information Center
Asaro-Saddler, Kristie; Knox, Haley Muir; Meredith, Holly; Akhmedjanova, Diana
2015-01-01
Writing is an important content area that pervades all subject areas and is required for post-school success, yet many students with autism spectrum disorders (ASD) often struggle in written expression. In this article we discuss the characteristics of students with ASD that make writing difficult, and the strengths, such as the use of technology,…
Mark, Lynette J; Herzer, Kurt R; Cover, Renee; Pandian, Vinciya; Bhatti, Nasir I; Berkow, Lauren C; Haut, Elliott R; Hillel, Alexander T; Miller, Christina R; Feller-Kopman, David J; Schiavi, Adam J; Xie, Yanjun J; Lim, Christine; Holzmueller, Christine; Ahmad, Mueen; Thomas, Pradeep; Flint, Paul W; Mirski, Marek A
2015-07-01
Difficult airway cases can quickly become emergencies, increasing the risk of life-threatening complications or death. Emergency airway management outside the operating room is particularly challenging. We developed a quality improvement program-the Difficult Airway Response Team (DART)-to improve emergency airway management outside the operating room. DART was implemented by a team of anesthesiologists, otolaryngologists, trauma surgeons, emergency medicine physicians, and risk managers in 2005 at The Johns Hopkins Hospital in Baltimore, Maryland. The DART program had 3 core components: operations, safety, and education. The operations component focused on developing a multidisciplinary difficult airway response team, standardizing the emergency response process, and deploying difficult airway equipment carts throughout the hospital. The safety component focused on real-time monitoring of DART activations and learning from past DART events to continuously improve system-level performance. This objective entailed monitoring the paging system, reporting difficult airway events and DART activations to a Web-based registry, and using in situ simulations to identify and mitigate defects in the emergency airway management process. The educational component included development of a multispecialty difficult airway curriculum encompassing case-based lectures, simulation, and team building/communication to ensure consistency of care. Educational materials were also developed for non-DART staff and patients to inform them about the needs of patients with difficult airways and ensure continuity of care with other providers after discharge. Between July 2008 and June 2013, DART managed 360 adult difficult airway events comprising 8% of all code activations. Predisposing patient factors included body mass index >40, history of head and neck tumor, prior difficult intubation, cervical spine injury, airway edema, airway bleeding, and previous or current tracheostomy. Twenty-three patients (6%) required emergent surgical airways. Sixty-two patients (17%) were stabilized and transported to the operating room for definitive airway management. There were no airway management-related deaths, sentinel events, or malpractice claims in adult patients managed by DART. Five in situ simulations conducted in the first program year improved DART's teamwork, communication, and response times and increased the functionality of the difficult airway carts. Over the 5-year period, we conducted 18 airway courses, through which >200 providers were trained. DART is a comprehensive program for improving difficult airway management. Future studies will examine the comparative effectiveness of the DART program and evaluate how DART has impacted patient outcomes, operational efficiency, and costs of care.
CdSe TFT AMLCD manufacturing process
NASA Astrophysics Data System (ADS)
Pritchard, Annette M.
1995-06-01
Active Matrix Liquid Crystal Displays (AMLCDs) based on cadmium selenide thin film transistors have been developed by Litton for a number of defence/avionics applications. Fabrication processes for the thin film transistor (TFT) arrays, color filters, and liquid crystal cell assembly have been developed which enable the end product to meet the difficult environmental and performance specifications of military applications, while maintaining focus on cost and yield issues. The fabrication of the AMLCD products is now transitioning into a new production facility which has been designed specifically to meet the requirements of the defence/avionics marketplace.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, Michel; Archer, Bill; Hendrickson, Bruce
The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.
On the Processing of Martensitic Steels in Continuous Galvanizing Lines: Part 1
NASA Astrophysics Data System (ADS)
Song, Taejin; Kwak, Jaihyun; de Cooman, B. C.
2012-01-01
Whereas low-carbon (<0.2 mass pct) martensitic grades can be produced easily in continuous annealing processing lines equipped with the required cooling capacity, the thermal cycles in continuous galvanizing lines make it difficult to produce hot-dip Zn or Zn-alloy coated high-strength martensitic grades. This is because of the tempering processes occurring during dipping of the strip in the liquid Zn bath and, in the case of galvannealed sheet steel, the short thermal treatment required to achieve the alloying between the Zn and the steel. These short additional thermal treatments last less than 30 seconds but severely degrade the mechanical properties. Using a combination of internal friction, X-ray diffraction, and transmission electron microscopy, it is shown that the ultrafine-grained lath microstructure allows for a rapid dislocation recovery and carbide formation during the galvanizing processes. In addition, the effective dislocation pinning occurring during the galvannealing process results in strain localization and the suppression of strain hardening.
NASA Astrophysics Data System (ADS)
Nerita, S.; Maizeli, A.; Afza, A.
2017-09-01
The Process Evaluation and Learning Outcomes of Biology course discusses the evaluation process in learning and the application of designed and processed learning outcomes. Problems found in this course were that students had difficulty understanding the subject and that no learning resources were available to guide them and support independent study. It is therefore necessary to develop a learning resource that prompts students to think actively and make decisions under the guidance of the lecturer. The purpose of this study is to produce a handout based on the guided discovery method that matches the needs of students. The research used the 4-D model and was limited to the define phase, that is, the analysis of student requirements. Data were obtained from a questionnaire and analyzed descriptively. The results showed that the average student requirement was 91.43%. It can be concluded that students need a handout based on the guided discovery method in the learning process.
Energy conservation using face detection
NASA Astrophysics Data System (ADS)
Deotale, Nilesh T.; Kalbande, Dhananjay R.; Mishra, Akassh A.
2011-10-01
Computerized face detection is concerned with the difficult task of automatically locating human faces in a video signal. It has several applications, such as face recognition, simultaneous multiple-face processing, biometrics, security, video surveillance, human-computer interfaces, and image database management; digital cameras use face detection for autofocus and for selecting regions of interest in photo slideshows that use pan-and-scale effects. The present paper deals with energy conservation using face detection. Automating the process on a computer requires the use of various image processing techniques. There are various methods that can be used for face detection, such as contour tracking, template matching, controlled background, model-based, motion-based, and color-based methods. Basically, the video of the subject is converted into images, which are further selected manually for processing. However, several factors, such as poor illumination, movement of the face, viewpoint-dependent physical appearance, acquisition geometry, imaging conditions, and compression artifacts, make face detection difficult. This paper reports an algorithm for the conservation of energy using face detection for various devices. The present paper suggests that energy conservation can be achieved by detecting the face, reducing the brightness of the complete image, and then adjusting the brightness of the particular area of the image where the face is located using histogram equalization.
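A rough sketch of the suggested scheme using OpenCV's stock Haar cascade follows: darken the whole frame, then restore and histogram-equalize only the detected face region. The file names and detector parameters are illustrative defaults, not the authors' settings.

```python
# Sketch: dim everything except detected faces, equalizing the face region.
# Uses OpenCV's bundled Haar cascade; the input image path is a placeholder.
import cv2
import numpy as np

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = cv2.imread("frame.jpg")                      # placeholder input frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

dimmed = (frame * 0.4).astype(np.uint8)              # reduce overall brightness
for (x, y, w, h) in faces:
    # Restore the face area and enhance it by equalizing the luma channel.
    roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2YCrCb)
    roi[:, :, 0] = cv2.equalizeHist(roi[:, :, 0])
    dimmed[y:y + h, x:x + w] = cv2.cvtColor(roi, cv2.COLOR_YCrCb2BGR)

cv2.imwrite("dimmed.jpg", dimmed)
```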
VARED: Verification and Analysis of Requirements and Early Designs
NASA Technical Reports Server (NTRS)
Badger, Julia; Throop, David; Claunch, Charles
2014-01-01
Requirements are a part of every project life cycle; everything going forward in a project depends on them. Good requirements are hard to write; there are few useful tools to test, verify, or check them; and it is difficult to properly marry them to the subsequent design, especially if the requirements are written in natural language. In fact, the inconsistencies and errors in the requirements, along with the difficulty in finding these errors, contribute greatly to the cost of the testing and verification stage of flight software projects [1]. Large projects tend to have several thousand requirements written at various levels by different groups of people. The design process is distributed, and a lack of widely accepted standards for requirements often results in a product that varies widely in style and quality. A simple way to improve this would be to standardize the design process using a set of tools and widely accepted requirements design constraints. The difficulty with this approach is finding the appropriate constraints and tools. Common complaints against the tools available include ease of use, functionality, and available features. Also, although preferable, it is rare that these tools are capable of testing the quality of the requirements.
Statistical process control methods allow the analysis and improvement of anesthesia care.
Fasting, Sigurd; Gisvold, Sven E
2003-10-01
Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
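For concreteness, the p-chart analysis described above reduces to a few lines: the centre line is the pooled event rate, and the control limits sit three standard deviations of a proportion away, adjusted for each subgroup's size. The monthly counts below are synthetic, not the study's data.

```python
# Minimal p-chart sketch: monthly adverse-event proportions against
# 3-sigma control limits (all counts invented).
import numpy as np

events = np.array([18, 22, 15, 40, 19, 21, 17, 25])          # adverse events per month
cases = np.array([110, 120, 100, 115, 105, 125, 110, 118])   # anesthetics per month

p = events / cases
p_bar = events.sum() / cases.sum()                           # centre line (pooled rate)

sigma = np.sqrt(p_bar * (1 - p_bar) / cases)                 # per-month standard error
ucl = p_bar + 3 * sigma
lcl = np.maximum(p_bar - 3 * sigma, 0)

out_of_control = (p > ucl) | (p < lcl)
print("months out of control:", np.flatnonzero(out_of_control))
```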
Klados, Manousos A; Kanatsouli, Kassia; Antoniou, Ioannis; Babiloni, Fabio; Tsirka, Vassiliki; Bamidis, Panagiotis D; Micheloyannis, Sifis
2013-01-01
The two core systems of mathematical processing (subitizing and retrieval), as well as their functionality, are already known and published. In this study we used graph theory to compare the brain network organization of these two core systems in the cortical layer during difficult calculations. We examined all the EEG frequency bands separately in healthy young individuals and found that the network organization at rest, as well as during mathematical tasks, has the characteristics of small-world networks for all bands, which is the optimum organization required for efficient information processing. The different mathematical stimuli provoked changes in the graph parameters of different frequency bands, especially the low-frequency bands. More specifically, in the delta band the induced network increases its local and global efficiency during the transition from the subitizing to the retrieval system, while the results suggest that difficult mathematics provokes networks with a more cliquish organization due to more specific demands. The theta-band network follows the same pattern, having high nodal and remote organization during difficult mathematics. The spatial distribution of the network's weights also showed more prominent connections in frontoparietal regions, reflecting the working memory load due to the engagement of the retrieval system. The cortical networks of the alpha brainwaves were also more efficient, both locally and globally, during difficult mathematics, while the fact that the alpha network was denser over the frontoparietal regions as well reveals the engagement of the retrieval system again. In conclusion, this study provides more evidence regarding the interaction of the two core systems, exploiting the functional networks produced in the cerebral cortex, especially for difficult mathematics.
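A minimal sketch of the small-world check used in such analyses: compare a network's clustering coefficient and characteristic path length against a size- and density-matched random graph. The graph below is a synthetic stand-in, not an EEG-derived network.

```python
# Small-world index sketch: sigma = (C/C_rand) / (L/L_rand) > 1 suggests
# small-world organization (network here is synthetic, not EEG data).
import networkx as nx

G = nx.connected_watts_strogatz_graph(n=64, k=6, p=0.1, seed=1)  # stand-in network

C = nx.average_clustering(G)
L = nx.average_shortest_path_length(G)

R = nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=1)
if not nx.is_connected(R):                      # path length needs a connected graph
    R = R.subgraph(max(nx.connected_components(R), key=len)).copy()
C_rand = nx.average_clustering(R)
L_rand = nx.average_shortest_path_length(R)

sigma = (C / C_rand) / (L / L_rand)
print(f"C={C:.3f}  L={L:.2f}  sigma={sigma:.2f}")
```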
Estimated water requirements for gold heap-leach operations
Bleiwas, Donald I.
2012-01-01
This report provides a perspective on the amount of water necessary for conventional gold heap-leach operations. Water is required for drilling and dust suppression during mining, for agglomeration and as leachate during ore processing, to support the workforce (requires water in potable form and for sanitation), for minesite reclamation, and to compensate for water lost to evaporation and leakage. Maintaining an adequate water balance is especially critical in areas where surface and groundwater are difficult to acquire because of unfavorable climatic conditions [arid conditions and (or) a high evaporation rate]; where there is competition with other uses, such as for agriculture, industry, and use by municipalities; and where compliance with regulatory requirements may restrict water usage. Estimating the water consumption of heap-leach operations requires an understanding of the heap-leach process itself. The task is fairly complex because, although they all share some common features, each gold heap-leach operation is unique. Also, estimating the water consumption requires a synthesis of several fields of science, including chemistry, ecology, geology, hydrology, and meteorology, as well as consideration of economic factors.
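A back-of-the-envelope version of such a water balance simply sums the demands listed above and subtracts solution recycled back to the heap; every figure in the sketch below is an invented placeholder, not a number from this report.

```python
# Illustrative daily water balance for a heap-leach operation (all numbers
# are placeholders chosen only to show the arithmetic).
ore_tonnes_per_day = 20_000
leach_makeup_m3_per_tonne = 0.3   # makeup water per tonne of ore placed
dust_suppression_m3 = 800
evaporation_loss_m3 = 1_200       # ponds and heap surface
workforce_m3 = 50                 # potable water and sanitation
recycled_m3 = 4_000               # barren solution returned to the heap

gross_demand = (ore_tonnes_per_day * leach_makeup_m3_per_tonne
                + dust_suppression_m3 + evaporation_loss_m3 + workforce_m3)
net_makeup = gross_demand - recycled_m3
print(f"net fresh-water makeup: {net_makeup:,.0f} m3/day")
```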
Experimentally modeling stochastic processes with less memory by the use of a quantum processor
Palsson, Matthew S.; Gu, Mile; Ho, Joseph; Wiseman, Howard M.; Pryde, Geoff J.
2017-01-01
Computer simulation of observable phenomena is an indispensable tool for engineering new technology, understanding the natural world, and studying human society. However, the most interesting systems are often so complex that simulating their future behavior demands storing immense amounts of information regarding how they have behaved in the past. For increasingly complex systems, simulation becomes increasingly difficult and is ultimately constrained by resources such as computer memory. Recent theoretical work shows that quantum theory can reduce this memory requirement beyond ultimate classical limits, as measured by a process' statistical complexity, C. We experimentally demonstrate this quantum advantage in simulating stochastic processes. Our quantum implementation observes a memory requirement of Cq = 0.05 ± 0.01, far below the ultimate classical limit of C = 1. Scaling up this technique would substantially reduce the memory required in simulations of more complex systems.
NASA Astrophysics Data System (ADS)
Bradshaw, A. M.; Reuter, B.; Hamacher, T.
2015-08-01
The energy transformation process beginning to take place in many countries as a response to climate change will reduce substantially the consumption of fossil fuels, but at the same time cause a large increase in the demand for other raw materials. Whereas it is difficult to estimate the quantities of, for example, iron, copper and aluminium required, the situation is somewhat simpler for the rare elements that might be needed in a sustainable energy economy based largely on photovoltaic sources, wind and possibly nuclear fusion. We consider briefly each of these technologies and discuss the supply risks associated with the rare elements required, if they were to be used in the quantities that might be required for a global energy transformation process. In passing, we point out the need in resource studies to define the terms "rare", "scarce" and "critical" and to use them in a consistent way.
Managing unexpected events in the manufacturing of biologic medicines.
Grampp, Gustavo; Ramanan, Sundar
2013-08-01
The manufacturing of biologic medicines (biologics) requires robust process and facility design, rigorous regulatory compliance, and a well-trained workforce. Because of the complex attributes of biologics and their sensitivity to production and handling conditions, manufacturing of these medicines also requires a high-reliability manufacturing organization. As required by regulators, such an organization must monitor the state-of-control for the manufacturing process. A high-reliability organization also invests in an experienced and fully engaged technical support staff and fosters a management culture that rewards in-depth analysis of unexpected results, robust risk assessments, and timely and effective implementation of mitigation measures. Such a combination of infrastructure, technology, human capital, management, and a science-based operations culture does not occur without a strong organizational and financial commitment. These attributes of a high-reliability biologics manufacturer are difficult to achieve and may be differentiating factors as the supply of biologics diversifies in future years.
Process for Selecting System Level Assessments for Human System Technologies
NASA Technical Reports Server (NTRS)
Watts, James; Park, John
2006-01-01
The integration of many life support systems necessary to construct a stable habitat is difficult. The correct identification of the appropriate technologies and corresponding interfaces is an exhaustive process. Once technologies are selected, secondary issues such as mechanical and electrical interfaces must be addressed. The required analytical and testing work must be approached in a piecewise fashion to achieve timely results. A repeatable process has been developed to identify and prioritize system-level assessments and testing needs. This Assessment Selection Process has been defined to assess cross-cutting integration issues on topics at the system or component levels. Assessments are used to identify risks, encourage future actions to mitigate risks, or spur further studies.
Ungers, L J; Moskowitz, P D; Owens, T W; Harmon, A D; Briggs, T M
1982-02-01
Determining occupational health and safety risks posed by emerging technologies is difficult because of limited statistics. Nevertheless, estimates of such risks must be constructed to permit comparison of various technologies to identify the most attractive processes. One way to estimate risks is to use statistics on related industries. Based on process labor requirements and associated occupational health data, risks to workers and to society posed by an emerging technology can be calculated. Using data from the California semiconductor industry, this study applies a five-step occupational risk assessment procedure to four processes for the fabrication of photovoltaic cells. The validity of the occupational risk assessment method is discussed.
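The core of the calculation is simple: scale a related industry's incidence rate by the labor the process requires. The sketch below shows the arithmetic with invented numbers; the report's actual rates and labor data are not reproduced here.

```python
# Occupational risk estimate in miniature: labor requirement times an
# OSHA-style incidence rate (all values invented for illustration).
labor_hours_per_mw = 45_000          # person-hours to fabricate 1 MW of cells
incidence_per_200k_hours = 3.2       # recordable injuries per 200,000 hours

expected_injuries_per_mw = labor_hours_per_mw * incidence_per_200k_hours / 200_000
print(f"expected recordable injuries per MW: {expected_injuries_per_mw:.2f}")
```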
Sung, Kyongje
2008-12-01
Participants searched a visual display for a target among distractors. Each of 3 experiments tested a condition proposed to require attention and for which certain models propose a serial search. Serial versus parallel processing was tested by examining effects on response time means and cumulative distribution functions. In 2 conditions, the results suggested parallel rather than serial processing, even though the tasks produced significant set-size effects. Serial processing was produced only in a condition with a difficult discrimination and a very large set-size effect. The results support C. Bundesen's (1990) claim that an extreme set-size effect leads to serial processing. Implications for parallel models of visual selection are discussed.
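To see why mean response times alone cannot separate the architectures, consider the toy simulation below: a serial self-terminating search and a capacity-limited parallel race both produce set-size effects on mean RT, which is why such studies examine full response-time distributions. All timing parameters are invented.

```python
# Toy illustration: serial self-terminating search vs. a capacity-limited
# parallel race; both show set-size effects on mean RT (parameters invented).
import numpy as np

rng = np.random.default_rng(2)
n, base, rate = 20_000, 0.30, 0.05     # trials, base RT (s), per-item time (s)

for m in (4, 8, 16):
    # Serial: items inspected one at a time until the target, at a random
    # position, is found; RT is a sum of `pos` exponential stage times.
    pos = rng.integers(1, m + 1, n)
    serial = base + rng.gamma(shape=pos, scale=rate)
    # Parallel, limited capacity: m channels share processing capacity, so
    # each channel's finishing time scales with m.
    parallel = base + rng.exponential(rate * m, n)
    print(f"set size {m:2d}: serial mean={serial.mean():.3f}s  "
          f"parallel mean={parallel.mean():.3f}s")
```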
A Software Safety Risk Taxonomy for Use in Retrospective Safety Cases
NASA Technical Reports Server (NTRS)
Hill, Janice L.
2007-01-01
Safety standards contain technical and process-oriented safety requirements. The best time to include these requirements is early in the development lifecycle of the system. When software safety requirements are levied on a legacy system after the fact, a retrospective safety case will need to be constructed for the software in the system. This can be a difficult task because there may be few to no artifacts available to show compliance with the software safety requirements. The risks associated with not meeting safety requirements in a legacy safety-critical computer system must be addressed to give confidence for reuse. This paper introduces a proposal for a software safety risk taxonomy for legacy safety-critical computer systems, created by specializing the Software Engineering Institute's 'Software Development Risk Taxonomy' with safety elements and attributes.
Vitamin C Determination by Indophenol Method
NASA Astrophysics Data System (ADS)
Nielsen, S. Suzanne
Vitamin C is an essential nutrient in the diet, but is easily reduced or destroyed by exposure to heat and oxygen during processing, packaging, and storage of food. The U.S. Food and Drug Administration requires the Vitamin C content to be listed on the nutrition label of foods. The instability of Vitamin C makes it more difficult to ensure an accurate listing of Vitamin C content on the nutrition label.
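The indophenol method's arithmetic is straightforward: the vitamin C content follows from the dye volume needed to reach the endpoint, corrected by a blank and multiplied by the dye's ascorbic-acid titer. The numbers below are invented solely to show the calculation.

```python
# Worked indophenol calculation with placeholder values.
sample_titration_ml = 8.4       # mL of dye to reach the endpoint, sample aliquot
blank_titration_ml = 0.2        # mL of dye for the blank
dye_titer_mg_per_ml = 0.085     # mg ascorbic acid equivalent per mL of dye
aliquot_ml = 10.0               # mL of juice titrated
dilution_factor = 5.0           # aliquot taken from a diluted extract

mg_per_100ml = ((sample_titration_ml - blank_titration_ml) * dye_titer_mg_per_ml
                * dilution_factor / aliquot_ml * 100)
print(f"vitamin C: {mg_per_100ml:.1f} mg/100 mL")
```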
Ethical dilemmas of recording and reviewing neonatal resuscitation.
den Boer, Maria C; Houtlosser, Mirjam; van Zanten, Henriëtte Anje; Foglia, Elizabeth E; Engberts, Dirk P; Te Pas, Arjan B
2018-05-01
Neonatal resuscitation is provided to approximately 3% of neonates. Adequate ventilation is often the key to successful resuscitation, but this can be difficult to provide. There is increasing evidence that inappropriate respiratory support can have severe consequences. Several neonatal intensive care units have recorded and reviewed neonatal resuscitation procedures for quality assessment, education and research; however, ethical dilemmas sometimes make it difficult to implement this review process. We reviewed the literature on the development of recording and reviewing neonatal resuscitation and have summarised the ethical concerns involved. Recording and reviewing vital physiological parameters and video imaging of neonatal resuscitation in the delivery room is a valuable tool for quality assurance, education and research. Furthermore, it can improve the quality of neonatal resuscitation provided. We observed that ethical dilemmas arise as the review process is operating in several domains of healthcare that all have their specific moral framework with requirements and conditions on issues such as consent, privacy and data storage. These moral requirements and conditions vary due to local circumstances. Further research on the ethical aspects of recording and reviewing is desirable before wider implementation of this technique can be recommended.
In-Situ monitoring and modeling of metal additive manufacturing powder bed fusion
NASA Astrophysics Data System (ADS)
Alldredge, Jacob; Slotwinski, John; Storck, Steven; Kim, Sam; Goldberg, Arnold; Montalbano, Timothy
2018-04-01
One of the major challenges in metal additive manufacturing is developing in-situ sensing and feedback control capabilities to eliminate build errors and allow qualified part creation without the need for costly and destructive external testing. Previously, many groups have focused on high-fidelity numerical modeling and true-temperature thermal imaging systems. These approaches require large computational resources or costly hardware that needs complex calibration and is difficult to integrate into commercial systems. In addition, due to the rapid change in the state of the material as well as its surface properties, obtaining a true temperature is complicated and difficult. Here, we describe a different approach in which we implement a low-cost thermal imaging solution allowing relative temperature measurements sufficient for detecting unwanted process variability. We match this with a faster-than-real-time qualitative model that allows the process to be rapidly modeled during the build. The hope is to combine these two, allowing for the detection of anomalies in real time, enabling corrective action to potentially be taken, or parts to be stopped immediately after the error, saving material and time. Here we describe our sensor setup, its costs and abilities. We also show the ability to detect unwanted process deviations in real time. We also show that the output of our high-speed model agrees qualitatively with experimental results. These results lay the groundwork for our vision of an integrated feedback and control scheme that combines low-cost, easy-to-use sensors and fast modeling for process deviation monitoring.
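A minimal sketch of the deviation-monitoring idea: flag samples of the relative-temperature signal that fall outside a few standard deviations of a rolling baseline. The signal, window length, and threshold below are synthetic placeholders, not the authors' values.

```python
# Flag melt-pool signal samples that deviate from a rolling baseline
# (synthetic relative-temperature trace with an injected deviation).
import numpy as np

rng = np.random.default_rng(3)
signal = 1.0 + 0.02 * rng.standard_normal(2_000)   # relative melt-pool intensity
signal[1_200:1_230] += 0.15                        # injected process deviation

win, k = 100, 4.0                                  # baseline window, sigma threshold
pad = np.concatenate([signal[:win], signal])       # warm-up for the rolling mean
baseline = np.convolve(pad, np.ones(win) / win, "valid")[:signal.size]
resid = signal - baseline
flag = np.abs(resid) > k * resid.std()

print("first anomalous samples:", np.flatnonzero(flag)[:5])
```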
Forming Mandrels for X-Ray Mirror Substrates
NASA Technical Reports Server (NTRS)
Blake, Peter N.; Saha, Timo; Zhang, Will; O'Dell, Stephen; Kester, Thomas; Jones, William
2011-01-01
Future x-ray astronomical missions, like the International X-ray Observatory (IXO), will likely require replicated mirrors to reduce both mass and production costs. Accurately figured and measured mandrels - upon which the mirror substrates are thermally formed - are essential to enable these missions. The challenge of making these mandrels within reasonable costs and schedule has led the Goddard and Marshall Space Flight Centers to develop in-house processes and to encourage small businesses to attack parts of the problem. Both Goddard and Marshall have developed full-aperture polishing processes and metrologies that yield high-precision axial traces of the finished mandrels. Outside technologists have been addressing challenges presented by subaperture CNC machining processes: particularly difficult is the challenge of reducing mid-spatial frequency errors below 2 nm rms. The end-product of this approach is a realistic plan for the economically feasible production of mandrels that meet program requirements in both figure and quantity.
Filling of orbital fluid management systems
NASA Technical Reports Server (NTRS)
Merino, F.; Blatt, M. H.; Thies, N. C.
1978-01-01
A study was performed with three objectives: (1) analyze fluid management system fill under orbital conditions; (2) determine what experimentation is needed; and (3) develop an experimental program. The fluid management system was a 1.06m (41.7 in) diameter pressure vessel with screen channel device. Analyses were conducted using liquid hydrogen and N2O4. The influence of helium and autogenous pressurization systems was considered. Analyses showed that fluid management system fill will be more difficult with a cryogen than with an earth storable. The key to a successful fill with cryogens is in devising techniques for filling without vent liquid, and removing trapped vapor from the screen device at tank fill completion. This will be accomplished with prechill, fill, and vapor condensation processes. Refill will require a vent and purge process, to dilute the residual helium, prior to introducing liquid. Neither prechill, chill, nor purge processes will be required for earth storables.
Stiers, Peter; Falbo, Luciana; Goulas, Alexandros; van Gog, Tamara; de Bruin, Anique
2016-05-15
Monitoring of learning is only accurate at some time after learning. It is thought that immediate monitoring is based on working memory, whereas later monitoring requires re-activation of stored items, yielding accurate judgements. Such interpretations are difficult to test because they require reverse inference, which presupposes specificity of brain activity for the hidden cognitive processes. We investigated whether multivariate pattern classification can provide this specificity. We used a word recall task to create single trial examples of immediate and long term retrieval and trained a learning algorithm to discriminate them. Next, participants performed a similar task involving monitoring instead of recall. The recall-trained classifier recognized the retrieval patterns underlying immediate and long term monitoring and classified delayed monitoring examples as long-term retrieval. This result demonstrates the feasibility of decoding cognitive processes, instead of their content.
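In miniature, the cross-task decoding procedure looks like the sketch below: train a classifier on labeled recall-task trials, then apply it to trials from the monitoring task. The features are random stand-ins for single-trial activity patterns, and the classifier choice is illustrative, not the authors'.

```python
# Cross-task decoding sketch: learn immediate vs. long-term retrieval on
# recall trials, then classify monitoring trials (all data synthetic).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X_recall = rng.standard_normal((200, 50))        # stand-in trial patterns
y_recall = rng.integers(0, 2, 200)               # 0 = immediate, 1 = long-term
X_recall[y_recall == 1, :5] += 0.8               # injected class signal

clf = LogisticRegression(max_iter=1_000)
acc = cross_val_score(clf, X_recall, y_recall, cv=5).mean()
print(f"recall-task decoding accuracy: {acc:.2f}")

clf.fit(X_recall, y_recall)
# Delayed monitoring trials carrying the long-term retrieval pattern:
X_monitor = rng.standard_normal((40, 50)) + np.r_[np.full(5, 0.8), np.zeros(45)]
print("fraction classified as long-term retrieval:",
      (clf.predict(X_monitor) == 1).mean())
```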
Eye Tracking and Pupillometry are Indicators of Dissociable Latent Decision Processes
Cavanagh, James F.; Wiecki, Thomas V.; Kochar, Angad; Frank, Michael J.
2014-01-01
Can you predict what someone is going to do just by watching them? This is certainly difficult: it would require a clear mapping between observable indicators and unobservable cognitive states. In this report we demonstrate how this is possible by monitoring eye gaze and pupil dilation, which predict dissociable biases during decision making. We quantified decision making using the Drift Diffusion Model (DDM), which provides an algorithmic account of how evidence accumulation and response caution contribute to decisions through separate latent parameters of drift rate and decision threshold, respectively. We used a hierarchical Bayesian estimation approach to assess the single trial influence of observable physiological signals on these latent DDM parameters. Increased eye gaze dwell time specifically predicted an increased drift rate toward the fixated option, irrespective of the value of the option. In contrast, greater pupil dilation specifically predicted an increase in decision threshold during difficult decisions. These findings suggest that eye tracking and pupillometry reflect the operations of dissociated latent decision processes. PMID:24548281
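The single-trial linkage described here can be expressed with the HDDM package's regression interface; the sketch below assumes a trial table with hypothetical regressor columns 'dwell' and 'pupil' and is not the authors' code:

    import hddm  # hierarchical Bayesian drift diffusion modeling package

    # the trial table must contain 'rt' and 'response'; 'dwell' and
    # 'pupil' are assumed names for standardized physiological signals
    data = hddm.load_csv('trials.csv')

    # drift rate varies with gaze dwell; threshold varies with pupil size
    model = hddm.HDDMRegressor(data, ['v ~ dwell', 'a ~ pupil'])
    model.sample(2000, burn=500)
    model.print_stats()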
Frequency adaptive metadynamics for the calculation of rare-event kinetics
NASA Astrophysics Data System (ADS)
Wang, Yong; Valsson, Omar; Tiwary, Pratyush; Parrinello, Michele; Lindorff-Larsen, Kresten
2018-08-01
The ability to predict accurate thermodynamic and kinetic properties in biomolecular systems is of both scientific and practical utility. While both remain very challenging, predictions of kinetics are particularly difficult because rates, in contrast to free energies, depend on the route taken. For this reason, specific enhanced sampling methods are needed to calculate long-time-scale kinetics. It has recently been demonstrated that it is possible to recover kinetics through so-called "infrequent metadynamics" simulations, where the simulations are biased in a way that minimally corrupts the dynamics of moving between metastable states. This method, however, requires the bias to be added slowly, thus hampering applications to processes with only modest separations of time scales. Here we present a frequency-adaptive strategy which bridges normal and infrequent metadynamics. We show that this strategy can improve the precision and accuracy of rate calculations at fixed computational cost and should be able to extend rate calculations to much slower kinetic processes.
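Kinetics are recovered from such biased runs by rescaling simulation time with the instantaneous bias, as in infrequent metadynamics; a minimal sketch of that bookkeeping (illustrative units and names):

    import numpy as np

    KB = 0.0083144621  # Boltzmann constant, kJ/(mol K)

    def unbiased_time(bias_kjmol, dt_ps, temperature=300.0):
        """Accumulated unbiased time t* = sum_i dt * exp(V_i / kB T),
        where V_i is the metadynamics bias felt at step i."""
        beta = 1.0 / (KB * temperature)
        return float(np.sum(dt_ps * np.exp(beta * np.asarray(bias_kjmol))))

The frequency-adaptive scheme varies how often bias is deposited, but the same rescaling yields the transition times used for rate estimates.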
Lunar exploration for resource utilization
NASA Technical Reports Server (NTRS)
Duke, Michael B.
1992-01-01
The strategy for developing resources on the Moon depends on the stage of space industrialization. A case is made for first developing the resources needed to provide simple materials required in large quantities for space operations. Propellants, shielding, and structural materials fall into this category. As the enterprise grows, it will be feasible to develop additional sources - those more difficult to obtain or required in smaller quantities. Thus, the first materials processing on the Moon will probably take the abundant lunar regolith, extract from it major mineral or glass species, and do relatively simple chemical processing. We need to conduct a lunar remote sensing mission to determine the global distribution of features, geophysical properties, and composition of the Moon, information which will serve as the basis for detailed models of and engineering decisions about a lunar mine.
The ambidextrous organization.
O'Reilly, Charles A; Tushman, Michael L
2004-04-01
Corporate executives must constantly look backward, attending to the products and processes of the past, while also gazing forward, preparing for the innovations that will define the future. This mental balancing act is one of the toughest of all managerial challenges--it requires executives to explore new opportunities even as they work diligently to exploit existing capabilities--and it's no surprise that few companies do it well. But as every businessperson knows, there are companies that do. What's their secret? These organizations separate their new, exploratory units from their traditional, exploitative ones, allowing them to have different processes, structures, and cultures; at the same time, they maintain tight links across units at the senior executive level. Such "ambidextrous organizations," as the authors call them, allow executives to pioneer radical or disruptive innovations while also pursuing incremental gains. Of utmost importance to the ambidextrous organization are ambidextrous managers--executives who have the ability to understand and be sensitive to the needs of very different kinds of businesses. They possess the attributes of rigorous cost cutters and free-thinking entrepreneurs while also maintaining the objectivity required to make difficult trade-offs. Almost every company needs to renew itself through the creation of breakthrough products and processes, but it shouldn't do so at the expense of its traditional business. Building an ambidextrous organization is by no means easy, but the structure itself, combining organizational separation with senior team integration, is not difficult to understand. Given the executive will to make it happen, any company can become ambidextrous.
Use of Analogies in the Study of Diffusion
ERIC Educational Resources Information Center
Letic, Milorad
2014-01-01
Emergent processes, such as diffusion, are considered more difficult to understand than direct processes. In physiology, most processes are presented as direct processes, so emergent processes, when encountered, are even more difficult to understand. It has been suggested that, when studying diffusion, misconceptions about random processes are the…
Wild-Wall, Nele; Falkenstein, Michael
2010-01-01
By using event-related potentials (ERPs), the present study examines whether age-related differences in preparation and processing emerge especially during divided attention. Binaurally presented auditory cues called for focused (valid and invalid) or divided attention to one or both ears. Responses were required to subsequent monaurally presented valid targets (vowels), but had to be suppressed to non-target vowels or invalidly cued vowels. Middle-aged participants were more impaired under divided attention than young ones, likely due to an age-related decline in preparatory attention following cues, as reflected in a decreased CNV. Under divided attention, target processing was increased in the middle-aged, likely reflecting compensatory effort to fulfill task requirements in the difficult condition. Additionally, middle-aged participants processed invalidly cued stimuli more intensely, as reflected by stimulus ERPs. The results suggest an age-related impairment in attentional preparation after auditory cues, especially under divided attention, and latent difficulties in suppressing irrelevant information.
NASA Technical Reports Server (NTRS)
Lindensmith, Chris A.; Briggs, H. Clark; Beregovski, Yuri; Feria, V. Alfonso; Goullioud, Renaud; Gursel, Yekta; Hahn, Inseob; Kinsella, Gary; Orzewalla, Matthew; Phillips, Charles
2006-01-01
SIM PlanetQuest (SIM) is a large optical interferometer for making microarcsecond measurements of the positions of stars and for detecting Earth-sized planets around nearby stars. To achieve this precision, SIM requires stability of optical components to tens of picometers per hour. The combination of SIM's large size (9 meter baseline) and the high stability requirement makes it difficult and costly to measure all aspects of system performance on the ground. To reduce risks and costs, and to allow for a design with fewer intermediate testing stages, the SIM project is developing an integrated thermal, mechanical and optical modeling process that will allow predictions of the system performance to be made at the required high precision. This modeling process uses commercial, off-the-shelf tools and has been validated against experimental results at the precision of the SIM performance requirements. This paper presents the description of the model development, some of the models, and their validation in the Thermo-Opto-Mechanical (TOM3) testbed, which includes full-scale brassboard optical components and the metrology to test them at the SIM performance requirement levels.
Process Studies on Laser Welding of Copper with Brilliant Green and Infrared Lasers
NASA Astrophysics Data System (ADS)
Engler, Sebastian; Ramsayer, Reiner; Poprawe, Reinhart
Copper materials are classified as difficult to weld with state-of-the-art lasers. High thermal conductivity, in combination with low absorption at room temperature, requires high intensities to reach a deep penetration welding process. The low absorption also causes high sensitivity to variations in surface conditions. Green laser radiation shows considerably higher absorption at room temperature, which reduces the threshold intensity for deep penetration welding significantly. The influence of the green wavelength on energy coupling during heat conduction welding and deep penetration welding, as well as its influence on the weld shape, has been investigated.
Plasma process control with optical emission spectroscopy
NASA Astrophysics Data System (ADS)
Ward, P. P.
Plasma processes for cleaning, etching and desmear of electronic components and printed wiring boards (PWB) are difficult to predict and control. Non-uniformity of most plasma processes and sensitivity to environmental changes make it difficult to maintain process stability from day to day. To assure plasma process performance, weight-loss coupons or post-plasma destructive testing must be used. The problem with these techniques is that they are not real-time methods and do not allow for immediate diagnosis and process correction. These methods often require scrapping some fraction of a batch to insure the integrity of the rest. Since these methods verify a successful cycle with post-plasma diagnostics, poor test results often determine that a batch is substandard and the resulting parts unusable. Both of these methods are a costly part of the overall fabrication cost. A more efficient method of testing would allow for constant monitoring of plasma conditions and process control. Process failures should be detected before the parts being treated are damaged. Real-time monitoring would allow for instantaneous corrections. Multiple-site monitoring would allow for process mapping within one system or simultaneous monitoring of multiple systems. Optical emission spectroscopy conducted external to the plasma apparatus would allow for this sort of multifunctional analysis without perturbing the glow discharge. In this paper, optical emission spectroscopy for non-intrusive, in situ process control is explored. The technique is discussed as it applies to process control, failure analysis and endpoint determination. Methods for identifying process failures and for tracking the progress and end of etchback and desmear processes are also discussed.
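A minimal sketch of the kind of endpoint logic such OES monitoring enables (the monitored emission line and thresholds are process-specific assumptions, not values from the paper):

    import numpy as np

    def endpoint_reached(line_intensity, window=30, drop_fraction=0.2):
        """Declare endpoint when a monitored emission line falls a set
        fraction below its recent running average."""
        trace = np.asarray(line_intensity, dtype=float)
        if trace.size < 2 * window:
            return False
        baseline = trace[-2 * window:-window].mean()
        return trace[-window:].mean() < (1.0 - drop_fraction) * baseline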
Battle-Wise: Seeking Time-Information Superiority in Networked Warfare
2006-07-01
idiosyncratic and thus not repeatable—each person's perceptions, experiences, and thought processes are different. This makes it difficult to rely on... 'perception management' is central to the conduct of its war with the West." It does not require dedicated information-network infrastructure or expensive...to leave a house after several failed attempts to fight flare-ups in the first-floor kitchen. The chief attributed his decision to extrasensory
In-network processing of joins in wireless sensor networks.
Kang, Hyunchul
2013-03-11
The join or correlated filtering of sensor readings is one of the fundamental query operations in wireless sensor networks (WSNs). Although the join in centralized or distributed databases is a well-researched problem, join processing in WSNs has quite different characteristics and is much more difficult to perform due to the lack of statistics on sensor readings and the resource constraints of sensor nodes. Since data transmission is orders of magnitude more costly than processing at a sensor node, in-network processing of joins is essential. In this paper, the state-of-the-art techniques for join implementation in WSNs are surveyed. The requirements and challenges, join types, and components of join implementation are described. The open issues for further research are identified.
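One classical building block in distributed join processing of this kind is the semijoin, which ships only join-attribute digests between regions to cut radio traffic; a schematic sketch (a generic technique, not code from the survey):

    def semijoin_filter(local_tuples, remote_keys, key):
        """Forward only the local tuples whose join attribute appears in
        the (small) set of keys received from the other sensor region."""
        remote = set(remote_keys)
        return [t for t in local_tuples if t[key] in remote]

    # e.g., region A broadcasts its timestamps; region B forwards only
    # semijoin_filter(b_readings, a_timestamps, 'ts') toward the sink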
Pandit, J J; Popat, M T; Cook, T M; Wilkes, A R; Groom, P; Cooke, H; Kapila, A; O'Sullivan, E
2011-08-01
Faced with the concern that an increasing number of airway management devices were being introduced into clinical practice with little or no prior evidence of their clinical efficacy or safety, the Difficult Airway Society formed a working party (Airway Device Evaluation Project Team) to establish a process by which the airway management community within the profession could itself lead a process of formal device/equipment evaluation. Although there are several national and international regulations governing which products can come on to the market and be legitimately sold, there has hitherto been no formal professional guidance relating to how products should be selected (i.e. purchased). The Airway Device Evaluation Project Team's first task was to formulate such advice, emphasising evidence-based principles. Team discussions led to a definition of the minimum level of evidence needed to make a pragmatic decision about the purchase or selection of an airway device. The Team concluded that this definition should form the basis of a professional standard, guiding those with responsibility for selecting airway devices. We describe how widespread adoption of this professional standard can act as a driver to create an infrastructure in which the required evidence can be obtained. Essential elements are that: (i) the Difficult Airway Society facilitates a coherent national network of research-active units; and (ii) individual anaesthetists in hospital trusts play a more active role in local purchasing decisions, applying the relevant evidence and communicating their purchasing decisions to the Difficult Airway Society.
Challenges and opportunities in the manufacture and expansion of cells for therapy.
Maartens, Joachim H; De-Juan-Pardo, Elena; Wunner, Felix M; Simula, Antonio; Voelcker, Nicolas H; Barry, Simon C; Hutmacher, Dietmar W
2017-10-01
Laboratory-based ex vivo cell culture methods are largely manual in their manufacturing processes. This makes it extremely difficult to meet regulatory requirements for process validation, quality control and reproducibility. Cell culture concepts with a translational focus need to embrace a more automated approach in which cell yields meet the quantitative production demands, the correct cell lineage and phenotype are readily confirmed, and reagent usage is optimized. Areas covered: This article discusses the obstacles inherent in classical laboratory-based methods, their concomitant impact on cost of goods, and the technology step change required to facilitate translation from bench to bedside. Expert opinion: While traditional bioreactors have demonstrated limited success where adherent cells are used in combination with microcarriers, further process optimization will be required to find solutions for commercial-scale therapies. New cell culture technologies based on 3D-printed cell culture lattices with favourable surface-to-volume ratios have the potential to change the paradigm in industry. An integrated Quality-by-Design/systems engineering approach will be essential to facilitate the scaled-up translation from proof-of-principle to clinical validation.
NASA Astrophysics Data System (ADS)
Kaplita, George A.; Schmitz, Stefan; Ranade, Rajiv; Mathad, Gangadhara S.
1999-09-01
The planarization and recessing of polysilicon to form a plug are processes of increasing importance in silicon IC fabrication. While this technology has been developed and applied to DRAM technology using Trench Storage Capacitors, the need for such processes in other IC applications (i.e. polysilicon studs) has increased. Both planarization and recess processes usually have stringent requirements on etch rate, recess uniformity, and selectivity to underlying films. Additionally, both processes generally must be isotropic, yet must not expand any seams that might be present in the polysilicon fill. These processes should also be insensitive to changes in exposed silicon area (pattern factor) on the wafer. A SF6 plasma process in a polysilicon DPS (Decoupled Plasma Source) reactor has demonstrated the capability of achieving the above process requirements for both planarization and recess etch. The SF6 process in the decoupled plasma source reactor exhibited less sensitivity to pattern factor than in other types of reactors. Control of these planarization and recess processes requires two endpoint systems to work sequentially in the same recipe: one for monitoring the endpoint when blanket polysilicon (100% Si loading) is being planarized and one for monitoring the recess depth while the plug is being recessed (less than 10% Si loading). The planarization process employs an optical emission endpoint system (OES). An interferometric endpoint system (IEP), capable of monitoring lateral interference, is used for determining the recess depth. The ability of using either or both systems is required to make these plug processes manufacturable. Measuring the recess depth resulting from the recess process can be difficult, costly and time-consuming. An Atomic Force Microscope (AFM) can greatly alleviate these problems and can serve as a critical tool in the development of recess processes.
NASA Technical Reports Server (NTRS)
Withey, James V.
1986-01-01
The validity of real-time software is determined by its ability to execute on a computer within the time constraints of the physical system it is modeling. In many applications the time constraints are so critical that the details of process scheduling are elevated to the requirements analysis phase of the software development cycle. It is not uncommon to find specifications for a real-time cyclic executive program included or assumed in such requirements. It was found that preliminary designs structured around this implementation obscure the data flow of the real-world system being modeled and that it is consequently difficult and costly to maintain, update and reuse the resulting software. A cyclic executive is a software component that schedules and implicitly synchronizes the real-time software through periodic and repetitive subroutine calls. Therefore a design method is sought that allows the deferral of process scheduling to the later stages of design. The appropriate scheduling paradigm must be chosen given the performance constraints, the target environment and the software's lifecycle. The concept of process inversion is explored with respect to the cyclic executive.
Adaptation of video game UVW mapping to 3D visualization of gene expression patterns
NASA Astrophysics Data System (ADS)
Vize, Peter D.; Gerth, Victor E.
2007-01-01
Analysis of gene expression patterns within an organism plays a critical role in associating genes with biological processes in both health and disease. During embryonic development, the analysis and comparison of different gene expression patterns allows biologists to identify candidate genes that may regulate the formation of normal tissues and organs and to search for genes associated with congenital diseases. No two individual embryos, or organs, are exactly the same shape or size, so comparing spatial gene expression in one embryo to that in another is difficult. We will present our efforts in comparing gene expression data collected using both volumetric and projection approaches. Volumetric data is highly accurate but difficult to process and compare. Projection methods use UV mapping to align texture maps to standardized spatial frameworks. This approach is less accurate but is very rapid and requires very little processing. We have built a database of over 180 3D models depicting gene expression patterns mapped onto the surface of spline-based embryo models. Gene expression data in different models can easily be compared to determine common regions of activity. Visualization software, in both Java and OpenGL, optimized for viewing 3D gene expression data will also be demonstrated.
Implications of learning theory for developing programs to decrease overeating
Boutelle, Kerri N.; Bouton, Mark E.
2015-01-01
Childhood obesity is associated with medical and psychological comorbidities, and interventions targeting overeating could be pragmatic and have a significant impact on weight. Calorically dense foods are easily available, variable, and tasty, which provides effective opportunities to learn to associate behaviors and cues in the environment with food through fundamental conditioning processes, resulting in measurable psychological and physiological food cue reactivity in vulnerable children. Basic research suggests that initial learning is difficult to erase and that it is vulnerable to a number of phenomena that allow the original learning to re-emerge after it is suppressed or replaced. These processes may help explain why it may be difficult to change food cue reactivity and overeating over the long term. Extinction theory may be used to develop effective cue-exposure treatments that decrease food cue reactivity through inhibitory learning, although these processes are complex and require an integral understanding of the theory and of individual differences. Additionally, learning theory can be used to develop other interventions that may prove to be useful. Through an integration of learning theory and basic and translational research, it may be possible to develop interventions that decrease the urges to overeat and improve the weight status of children. PMID:25998235
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bates, Robert; McConnell, Elizabeth
Machining methods across many industries generally require multiple operations to machine and process advanced materials, features with micron precision, and complex shapes. The resulting multiple machining platforms can significantly affect manufacturing cycle time and the precision of the final parts, with a resultant increase in cost and energy consumption. Ultrafast lasers represent a transformative and disruptive technology that removes material with micron precision and in a single step manufacturing process. Such precision results from athermal ablation without modification or damage to the remaining material which is the key differentiator between ultrafast laser technologies and traditional laser technologies or mechanical processes. Athermal ablation without modification or damage to the material eliminates post-processing or multiple manufacturing steps. Combined with the appropriate technology to control the motion of the work piece, ultrafast lasers are excellent candidates to provide breakthrough machining capability for difficult-to-machine materials. At the project onset in early 2012, the project team recognized that substantial effort was necessary to improve the application of ultrafast laser and precise motion control technologies (for micromachining difficult-to-machine materials) to further the aggregate throughput and yield improvements over conventional machining methods. The project described in this report advanced these leading-edge technologies thru the development and verification of two platforms: a hybrid enhanced laser chassis and a multi-application testbed.
Online Student Learning and Earth System Processes
NASA Astrophysics Data System (ADS)
Mackay, R. M.
2002-12-01
Many students have difficulty understanding dynamical processes related to Earth's climate system. This is particularly true in Earth System Science courses designed for non-majors. It is often tempting to gloss over these conceptually difficult topics and have students spend more study time learning factual information or ideas that require rather simple linear thought processes. Even when the professor is ambitious and tackles the more difficult ideas of system dynamics in such courses, they are typically greeted with frustration and limited success. However, an understanding of generic system concepts and processes is quite arguably an essential component of any quality liberal arts education. We present online student-centered learning modules that are designed to help students explore different aspects of Earth's climate system (see http://www.cs.clark.edu/mac/physlets/GlobalPollution/maintrace.htm for a sample activity). The JAVA-based learning activities are designed to be accessible to anyone with Web access; to be self-paced, engaging, and hands-on; and to make use of past results from science education research. Professors can use module activities to supplement lecture, as controlled learning-lab activities, or as stand-alone homework assignments. Acknowledgement: This work was supported by NASA Office of Space Science contract NASW-98037, Atmospheric and Environmental Research Inc. of Lexington, MA, and Clark College.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2015-07-01
Air sealing of building enclosures is a difficult and time-consuming process. Current methods in new construction require laborers to physically locate small and sometimes large holes in multiple assemblies and then manually seal each of them. The innovation demonstrated under this research study was the automated air sealing and compartmentalization of buildings through the use of an aerosolized sealant, developed by the Western Cooling Efficiency Center at University of California Davis. CARB sought to demonstrate this new technology application in a multifamily building in Queens, NY. The effectiveness of the sealing process was evaluated by three methods: air leakage testing of the overall apartment before and after sealing, point-source testing of individual leaks, and pressure measurements in the walls of the target apartment during sealing. Aerosolized sealing was successful by several measures in this study. Many individual leaks that are labor-intensive to address separately were well sealed by the aerosol particles. In addition, many diffuse leaks that are difficult to identify and treat were also sealed. The aerosol-based sealing process resulted in an average reduction of 71% in air leakage across three apartments and an average apartment airtightness of 0.08 CFM50/SF of enclosure area.
Data Processing and Text Mining Technologies on Electronic Medical Records: A Review
Sun, Wencheng; Li, Yangyang; Liu, Fang; Fang, Shengqun; Wang, Guoyan
2018-01-01
Currently, medical institutes generally use EMR to record patients' conditions, including diagnostic information, procedures performed, and treatment results. EMR has been recognized as a valuable resource for large-scale analysis. However, EMR has the characteristics of diversity, incompleteness, redundancy, and privacy, which make it difficult to carry out data mining and analysis directly. Therefore, it is necessary to preprocess the source data in order to improve data quality and improve the data mining results. Different types of data require different processing technologies. Most structured data commonly needs classic preprocessing technologies, including data cleansing, data integration, data transformation, and data reduction. Semistructured or unstructured data, such as medical text, contain richer health information but require more complex and challenging processing methods. The task of information extraction for medical texts mainly includes NER (named-entity recognition) and RE (relation extraction). This paper focuses on the process of EMR processing and emphatically analyzes the key techniques. In addition, we make an in-depth study of the applications developed based on text mining, together with the open challenges and research issues for future work. PMID:29849998
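For the NER step, a generic pipeline looks like the sketch below; a general-purpose English model is used here as a stand-in, since clinical NER would need a domain-trained model:

    import spacy

    nlp = spacy.load("en_core_web_sm")  # stand-in; not a clinical model
    note = "Patient started on 40 mg furosemide for congestive heart failure."
    for ent in nlp(note).ents:
        print(ent.text, ent.label_)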
Singlet oxygen detection in biological systems: Uses and limitations.
Koh, Eugene; Fluhr, Robert
2016-07-02
The study of singlet oxygen in biological systems is challenging in many ways. Singlet oxygen is a relatively unstable, ephemeral molecule, and its properties make it highly reactive with many biomolecules, making it difficult to quantify accurately. Several methods have been developed to study this elusive molecule, but most studies thus far have focused on conditions that produce relatively large amounts of singlet oxygen. However, more sensitive methods are required as one begins to explore the levels of singlet oxygen involved in signaling and regulatory processes. Here we discuss the various methods used in the study of singlet oxygen and outline their uses and limitations.
All-electric control of donor nuclear spin qubits in silicon
NASA Astrophysics Data System (ADS)
Sigillito, Anthony J.; Tyryshkin, Alexei M.; Schenkel, Thomas; Houck, Andrew A.; Lyon, Stephen A.
2017-10-01
The electronic and nuclear spin degrees of freedom of donor impurities in silicon form ultra-coherent two-level systems that are potentially useful for applications in quantum information and are intrinsically compatible with industrial semiconductor processing. However, because of their smaller gyromagnetic ratios, nuclear spins are more difficult to manipulate than electron spins and are often considered too slow for quantum information processing. Moreover, although alternating current magnetic fields are the most natural choice to drive spin transitions and implement quantum gates, they are difficult to confine spatially to the level of a single donor, thus requiring alternative approaches. In recent years, schemes for all-electrical control of donor spin qubits have been proposed but no experimental demonstrations have been reported yet. Here, we demonstrate a scalable all-electric method for controlling neutral 31P and 75As donor nuclear spins in silicon. Using coplanar photonic bandgap resonators, we drive Rabi oscillations on nuclear spins exclusively using electric fields by employing the donor-bound electron as a quantum transducer, much in the spirit of recent works with single-molecule magnets. The electric field confinement leads to major advantages such as low power requirements, higher qubit densities and faster gate times. Additionally, this approach makes it possible to drive nuclear spin qubits either at their resonance frequency or at its first subharmonic, thus reducing device bandwidth requirements. Double quantum transitions can be driven as well, providing easy access to the full computational manifold of our system and making it convenient to implement nuclear spin-based qudits using 75As donors.
Simple solution to the medical instrumentation software problem
NASA Astrophysics Data System (ADS)
Leif, Robert C.; Leif, Suzanne B.; Leif, Stephanie H.; Bingue, E.
1995-04-01
Medical devices now include a substantial software component, which is both difficult and expensive to produce and maintain. Medical software must be developed according to `Good Manufacturing Practices', GMP. Good Manufacturing Practices as specified by the FDA and ISO require the definition of, and compliance with, a software process that ensures quality products by specifying a detailed method of software construction. The software process should be based on accepted standards. US Department of Defense software standards and technology can both facilitate the development and improve the quality of medical systems. We describe the advantages of employing Mil-Std-498, Software Development and Documentation, and the Ada programming language. Ada provides a very broad range of functionalities, from embedded real-time to management information systems, required by many medical devices. It also includes advanced facilities for object-oriented programming and software engineering.
Application of TRIZ Theory in Patternless Casting Manufacturing Technique
NASA Astrophysics Data System (ADS)
Yang, Weidong; Gan, Dequan; Jiang, Ping; Tian, Yumei
The ultimate goal of Patternless Casting Manufacturing (PCM) is to obtain castings by producing the sand mold directly. In previous PCM practice, the resin content of the sand mold is much higher than that required by traditional resin sand, so it is difficult for the resulting castings to be sound, qualified products, which greatly limits the application of this technique. In this paper, the TRIZ methodology is systematically introduced into the innovation process of PCM.
Design of freeze-drying processes for pharmaceuticals: practical advice.
Tang, Xiaolin; Pikal, Michael J
2004-02-01
Design of freeze-drying processes is often approached with a "trial and error" experimental plan or, worse yet, the protocol used in the first laboratory run is adopted without further attempts at optimization. Consequently, commercial freeze-drying processes are often neither robust nor efficient. It is our thesis that design of an "optimized" freeze-drying process is not particularly difficult for most products, as long as some simple rules based on well-accepted scientific principles are followed. It is the purpose of this review to discuss the scientific foundations of the freeze-drying process design and then to consolidate these principles into a set of guidelines for rational process design and optimization. General advice is given concerning common stability issues with proteins, but unusual and difficult stability issues are beyond the scope of this review. Control of ice nucleation and crystallization during the freezing step is discussed, and the impact of freezing on the rest of the process and final product quality is reviewed. Representative freezing protocols are presented. The significance of the collapse temperature and the thermal transition, denoted Tg', are discussed, and procedures for the selection of the "target product temperature" for primary drying are presented. Furthermore, guidelines are given for selection of the optimal shelf temperature and chamber pressure settings required to achieve the target product temperature without thermal and/or mass transfer overload of the freeze dryer. Finally, guidelines and "rules" for optimization of secondary drying and representative secondary drying protocols are presented.
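The target-product-temperature logic described in the review can be illustrated with the standard steady-state coupling of vial heat transfer and sublimation through the dried cake; the parameter values below are illustrative placeholders, not recommendations from the paper:

    import numpy as np
    from scipy.optimize import brentq

    def p_ice_torr(T):
        """Vapor pressure of ice (Torr), T in kelvin (common correlation)."""
        return 2.698e10 * np.exp(-6144.96 / T)

    def product_temp(shelf_T, chamber_P, Kv=20.0, C=57.0):
        """Solve Kv*(Ts - Tp) = C*(p_ice(Tp) - Pc) for the steady product
        temperature Tp; Kv lumps vial heat transfer and C lumps the heat
        of sublimation over the cake resistance (illustrative values)."""
        f = lambda Tp: Kv * (shelf_T - Tp) - C * (p_ice_torr(Tp) - chamber_P)
        return brentq(f, 200.0, shelf_T - 1e-6)

    # shelf at -10 C, chamber at 100 mTorr -> product near -14 C
    print(product_temp(263.15, 0.1))

In practice one iterates this balance the other way: pick the target product temperature from Tg' or the collapse temperature, then back out the shelf temperature and chamber pressure that achieve it.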
Porosity Estimation By Artificial Neural Networks Inversion. Application to Algerian South Field
NASA Astrophysics Data System (ADS)
Eladj, Said; Aliouane, Leila; Ouadfeul, Sid-Ali
2017-04-01
One of the main current challenges for geophysicists is the discovery and study of stratigraphic traps; the latter is a difficult task and requires a very fine analysis of the seismic data. Seismic data inversion allows lithological and stratigraphic information to be obtained for reservoir characterization. However, when solving the inverse problem we encounter difficult issues such as non-existence and non-uniqueness of the solution, compounded by the instability of the processing algorithm. Therefore, uncertainties in the data and the non-linearity of the relationship between the data and the parameters must be taken seriously. In this case, artificial intelligence techniques such as Artificial Neural Networks (ANN) are used to resolve this ambiguity, which can be done by integrating different physical property data through supervised learning methods. In this work, we invert the 3D seismic cube for acoustic impedance using the colored inversion method; then, introducing the resulting acoustic impedance volume as an input to a model-based inversion allows the porosity volume to be calculated using a Multilayer Perceptron Artificial Neural Network. Application to an Algerian South hydrocarbon field clearly demonstrates the power of the proposed processing technique to predict porosity from seismic data; the results obtained can be used for reserve estimation, permeability prediction, recovery factor estimation and reservoir monitoring. Keywords: Artificial Neural Networks, inversion, non-uniqueness, nonlinear, 3D porosity volume, reservoir characterization.
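A minimal sketch of the porosity-prediction step with a multilayer perceptron, using synthetic stand-in data (the real workflow trains on impedance and porosity pairs at well locations):

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    impedance = rng.uniform(6e3, 12e3, (400, 1))   # training inputs
    porosity = 0.4 - 2.5e-5 * impedance[:, 0] + rng.normal(0, 0.01, 400)

    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16, 16),
                                       max_iter=2000, random_state=0))
    model.fit(impedance, porosity)
    # then apply model.predict(...) to the inverted 3D impedance volume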
Process' standardization and change management in higher education. The case of TEI of Athens
NASA Astrophysics Data System (ADS)
Chalaris, Ioannis; Chalaris, Manolis; Gritzalis, Stefanos; Belsis, Petros
2015-02-01
The establishment of mature operational procedures and the effort of standardizing and certifying these procedures is a particularly arduous and demanding task which requires strong commitment from management to the stated objectives, administrative stability and continuity, availability of resources, an adequate implementation team with support from all stakeholders and, of course, great tolerance until tangible results of the investment are shown. Ensuring these conditions, particularly in times of economic crisis, is an extremely difficult task for large organizations such as TEI of Athens, where heterogeneity in personnel and changes in the administrative hierarchy give rise to a plethora of additional difficulties and require effective change management. In this work we depict the path of standardization and certification of the administrative functions of TEI of Athens, with emphasis on the difficulties encountered and how to address them, and in particular on issues of change management and the culture related to this effort. The infrastructure required to maintain the processes and the tools for process and strategic management is described, in order to evolve mechanisms for continuous improvement of processes and for storage/recovery of the resulting knowledge. The work concludes with a general design of a road map of internal audit and continuous process improvement for a large institution of higher education.
Synthesizing parallel imaging applications using the CAP (computer-aided parallelization) tool
NASA Astrophysics Data System (ADS)
Gennart, Benoit A.; Mazzariol, Marc; Messerli, Vincent; Hersch, Roger D.
1997-12-01
Imaging applications such as filtering, image transforms and compression/decompression require vast amounts of computing power when applied to large data sets. These applications would potentially benefit from the use of parallel processing. However, dedicated parallel computers are expensive and their processing power per node lags behind that of the most recent commodity components. Furthermore, developing parallel applications remains a difficult task: writing and debugging the application is difficult (deadlocks), programs may not be portable from one parallel architecture to another, and performance often falls short of expectations. In order to facilitate the development of parallel applications, we propose the CAP computer-aided parallelization tool, which enables application programmers to specify at a high level of abstraction the flow of data between pipelined-parallel operations. In addition, the CAP tool supports the programmer in developing parallel imaging and storage operations. CAP enables combining parallel storage access routines and sequential image processing operations efficiently. This paper shows how processing- and I/O-intensive imaging applications must be implemented to take advantage of parallelism and pipelining between data access and processing. This paper's contribution is (1) to show how such implementations can be compactly specified in CAP, and (2) to demonstrate that CAP-specified applications achieve the performance of custom parallel code. The paper analyzes theoretically the performance of CAP-specified applications and demonstrates the accuracy of the theoretical analysis through experimental measurements.
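The pipelining idea of overlapping storage access with processing can be sketched in a few lines; this is a generic illustration, not CAP's actual specification language:

    from concurrent.futures import ThreadPoolExecutor

    def read_tile(i):
        return bytes(1024)        # placeholder parallel-storage read

    def process_tile(buf):
        return len(buf)           # placeholder filtering operation

    # submit all reads up front so I/O proceeds while tiles are processed
    with ThreadPoolExecutor(max_workers=4) as io_pool, \
         ThreadPoolExecutor(max_workers=4) as cpu_pool:
        reads = [io_pool.submit(read_tile, i) for i in range(16)]
        results = [cpu_pool.submit(process_tile, r.result()) for r in reads]
        total = sum(f.result() for f in results)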
Dankoski, Mary E; Bickel, Janet; Gusic, Maryellen E
2014-12-01
Dialogue is essential for transforming institutions into learning organizations, yet many well-known characteristics of academic health centers (AHCs) interfere with open discussion. Rigid hierarchies, intense competition for resources, and the power of peer review in advancement processes all hamper difficult conversations, thereby contributing to organizational silence, and at great cost to the institution. Information necessary for critical decisions is not shared, individuals and the organization do not learn from mistakes, and diverse perspectives from those with less power are not entertained, or worse, are suppressed. When leaders become more skilled at inviting multiple perspectives and faculty more adept at broaching difficult conversations with those in power, differences are more effectively addressed and conflicts resolved. In this article, the authors frame why this skill is an essential competency for faculty and leaders alike and provide the following recommendations to institutions for increasing capacity in this area: (1) develop leaders to counteract organizational silence, (2) develop faculty members' skills in raising difficult issues with those in positions of power, and (3) train mentors to coach others in raising difficult conversations. The vitality of AHCs requires that faculty and institutional leaders develop relational communication skills and partner in learning through challenging conversations.
Advances in spectroscopic methods for quantifying soil carbon
Reeves, James B.; McCarty, Gregory W.; Calderon, Francisco; Hively, W. Dean
2012-01-01
The current gold standard for soil carbon (C) determination is elemental C analysis using dry combustion. However, this method requires expensive consumables, is limited by the number of samples that can be processed (~100/d), and is restricted to the determination of total carbon. With increased interest in soil C sequestration, faster methods of analysis are needed, and there is growing interest in methods based on diffuse reflectance spectroscopy in the visible, near-infrared or mid-infrared spectral ranges. These spectral methods can decrease analytical requirements and speed sample processing, be applied to large landscape areas using remote sensing imagery, and be used to predict multiple analytes simultaneously. However, the methods require localized calibrations to establish the relationship between spectral data and reference analytical data, and also have additional, specific problems. For example, remote sensing is capable of scanning entire watersheds for soil carbon content but is limited to the surface layer of tilled soils and may require difficult and extensive field sampling to obtain proper localized calibration reference values. The objective of this chapter is to discuss the present state of spectroscopic methods for determination of soil carbon.
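A calibration of the kind described, relating spectra to dry-combustion reference values, commonly uses partial least squares regression; a compact sketch with synthetic stand-in arrays:

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(2)
    spectra = rng.standard_normal((150, 700))   # n_samples x n_wavelengths
    carbon = 0.5 * spectra[:, 100] + rng.normal(0, 0.1, 150)  # reference C

    pls = PLSRegression(n_components=10)
    predicted = cross_val_predict(pls, spectra, carbon, cv=10).ravel()
    print("RMSECV:", np.sqrt(np.mean((predicted - carbon) ** 2)))

The localized-calibration caveat in the text corresponds to refitting this model whenever soils, instruments, or spectral ranges change.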
Development of sensor augmented robotic weld systems for aerospace propulsion system fabrication
NASA Technical Reports Server (NTRS)
Jones, C. S.; Gangl, K. J.
1986-01-01
In order to meet stringent performance goals for power and reusability, the Space Shuttle Main Engine was designed with many complex, difficult welded joints that provide maximum strength and minimum weight; in all, the SSME requires 370 meters of welded joints. Automation of some welds has improved welding productivity significantly over manual welding. Application has previously been limited by accessibility constraints, requirements for complex process control, low production volumes, high part variability, and stringent quality requirements. Development of robots for welding in this application requires that a unique set of constraints be addressed. This paper shows how robotic welding can enhance production of aerospace components by addressing their specific requirements. A development program at the Marshall Space Flight Center combining industrial robots with state-of-the-art sensor systems and computer simulation is providing technology for the automation of welds in Space Shuttle Main Engine production.
Wastewater Biosolid Composting Optimization Based on UV-VNIR Spectroscopy Monitoring
Temporal-Lara, Beatriz; Melendez-Pastor, Ignacio; Gómez, Ignacio; Navarro-Pedreño, Jose
2016-01-01
Conventional wastewater treatment generates large amounts of organic matter–rich sludge that requires adequate treatment to avoid public health and environmental problems. The mixture of wastewater sludge and some bulking agents produces a biosolid to be composted at adequate composting facilities. The composting process is chemically and microbiologically complex and requires an adequate aeration of the biosolid (e.g., with a turner machine) for proper maturation of the compost. Adequate (near) real-time monitoring of the compost maturity process is highly difficult and the operation of composting facilities is not as automatized as other industrial processes. Spectroscopic analysis of compost samples has been successfully employed for compost maturity assessment but the preparation of the solid compost samples is difficult and time-consuming. This manuscript presents a methodology based on a combination of a less time-consuming compost sample preparation and ultraviolet, visible and short-wave near-infrared spectroscopy. Spectroscopic measurements were performed with liquid compost extract instead of solid compost samples. Partial least square (PLS) models were developed to quantify chemical fractions commonly employed for compost maturity assessment. Effective regression models were obtained for total organic matter (residual predictive deviation—RPD = 2.68), humification ratio (RPD = 2.23), total exchangeable carbon (RPD = 2.07) and total organic carbon (RPD = 1.66) with a modular and cost-effective visible and near infrared (VNIR) spectroradiometer. This combination of a less time-consuming compost sample preparation with a versatile sensor system provides an easy-to-implement, efficient and cost-effective protocol for compost maturity assessment and near-real-time monitoring. PMID:27854280
Counting Tree Growth Rings Moderately Difficult to Distinguish
C. B. Briscoe; M. Chudnoff
1964-01-01
There is an extensive literature dealing with techniques and gadgets to facilitate counting tree growth rings. A relatively simple method is described below, satisfactory for species too difficult to count in the field but not so difficult as to require the preparation of microscope slides or staining techniques.
NASA Astrophysics Data System (ADS)
Moritzer, Elmar; Nordmeyer, Timo; Leister, Christian; Schmidt, Martin Andreas; Grishin, Artur; Knospe, Alexander
2016-03-01
The production of high-quality thermoplastic parts often requires an additional process step after the injection molding stage, such as a coating or bonding process or a 2K (two-component) injection molding process. A commonly used process to improve bond strength is atmospheric-pressure plasma treatment. A variety of applications are realized with the aid of CNC systems; although these ensure excellent reproducibility, they make it difficult to implement inline applications. This paper therefore examines the possibility of surface treatment using a stationary plasma jet. Before it is possible to integrate this technology into a production process, however, preliminary trials need to be carried out to establish which factors influence the process. Experimental tests were performed using a special test set-up, enabling geometric and plasma-specific parameters to be identified. These results can help with the practical integration of this technology into existing production processes.
PROCESS OF SECURING PLUTONIUM IN NITRIC ACID SOLUTIONS IN ITS TRIVALENT OXIDATION STATE
Thomas, J.R.
1958-08-26
Various processes for the recovery of plutonium require that the plutonium be obtained and maintained in the reduced, or trivalent, state in solution. Ferrous ions are commonly used as the reducing agent for this purpose, but it is difficult to maintain the plutonium in a reduced state in nitric acid solutions due to the oxidizing effects of the acid. It has been found that the addition of a stabilizing or holding reductant to such solutions prevents reoxidation of the plutonium. Sulfamate ions have been found to be ideally suited as such a stabilizer, even in the presence of nitric acid.
NASA Technical Reports Server (NTRS)
1992-01-01
Zeolites are crystalline aluminosilicates that have complex framework structures. However, there are several features of zeolite crystals that make unequivocal structure determinations difficult. The acquisition of reliable structural information on zeolites is greatly facilitated by the availability of high-quality specimens. For structure determinations by conventional diffraction techniques, large single-crystal specimens are essential. Alternatively, structural determinations by powder profile refinement methods relax the constraints on crystal size, but still require materials with a high degree of crystalline perfection. Studies conducted at CAMMP (Center for Advanced Microgravity Materials Processing) have demonstrated that microgravity processing can produce larger crystal sizes and fewer structural defects relative to terrestrial crystal growth. Principal Investigator: Dr. Albert Sacco
Problem of Mistakes in Databases, Processing and Interpretation of Observations of the Sun. I.
NASA Astrophysics Data System (ADS)
Lozitska, N. I.
In databases of observations, unnoticed mistakes and misprints can occur at any stage of observation and of the preparation and processing of databases. Detection of errors is complicated by the fact that the work of the observer, the database compiler, and the researcher is divided. Data acquisition from a spacecraft requires a greater number of researchers than ground-based observations do. As a result, the probability of errors increases. Keeping track of errors at each stage is very difficult, so we use cross-comparison of data from different sources. We revealed some misprints in the typographic and digital results of sunspot group area measurements.
Kernel-Based Learning for Domain-Specific Relation Extraction
NASA Astrophysics Data System (ADS)
Basili, Roberto; Giannone, Cristina; Del Vescovo, Chiara; Moschitti, Alessandro; Naggar, Paolo
In a specific process of business intelligence, i.e. investigation of organized crime, empirical language processing technologies can play a crucial role. The analysis of transcriptions of investigative activities, such as police interrogations, for the recognition and storage of complex relations among people and locations is a very difficult and time-consuming task, ultimately based on pools of experts. We discuss here an inductive relation extraction platform that opens the way to much cheaper and more consistent workflows. The empirical investigation presented shows that accurate results, comparable to those of the expert teams, can be achieved, and parametrization allows the system behavior to be fine-tuned to fit domain-specific requirements.
Alignment of an acoustic manipulation device with cepstral analysis of electronic impedance data.
Hughes, D A; Qiu, Y; Démoré, C; Weijer, C J; Cochran, S
2015-02-01
Acoustic particle manipulation is an emerging technology that uses ultrasonic standing waves to position objects with pressure gradients and acoustic radiation forces. To produce strong standing waves, the transducer and the reflector must be aligned so that they are parallel to each other. This can be a difficult process due to the need to visualise the ultrasound waves, and as higher frequencies are introduced, the alignment requires higher accuracy. In this paper, we present a method for aligning acoustic resonators with cepstral analysis. This is a simple signal processing technique that requires only the electrical impedance measurement data of the resonator, which is usually recorded during the fabrication process of the device. We first introduce the mathematical basis of cepstral analysis and then demonstrate and validate it using a computer simulation of an acoustic resonator. Finally, the technique is demonstrated experimentally by creating many parallel linear traps for 10 μm fluorescent beads inside an acoustic resonator.
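The core computation is the real cepstrum of the measured impedance spectrum; a minimal sketch (signal names assumed):

    import numpy as np

    def real_cepstrum(impedance_magnitude):
        """Inverse FFT of the log-magnitude spectrum; periodic ripples
        from cavity standing waves show up as a discrete peak."""
        log_spec = np.log(np.abs(impedance_magnitude) + 1e-12)
        return np.real(np.fft.ifft(log_spec))

    # alignment: tilt the reflector to maximize the cavity peak, which
    # indicates a strong, parallel resonance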
Budin, Francois; Hoogstoel, Marion; Reynolds, Patrick; Grauer, Michael; O'Leary-Moore, Shonagh K; Oguz, Ipek
2013-01-01
Magnetic resonance imaging (MRI) of rodent brains enables study of the development and integrity of the brain under certain conditions (alcohol, drugs, etc.). However, these images are difficult for biomedical researchers with limited image processing experience to analyze. In this paper we present an image processing pipeline running on a Midas server, a web-based data storage system. It is composed of the following steps: rigid registration, skull-stripping, average computation, average parcellation, parcellation propagation to individual subjects, and computation of region-based statistics on each image. The pipeline is easy to configure and requires very little image processing knowledge. We present results obtained by processing a data set using this pipeline and demonstrate how this pipeline can be used to find differences between populations.
Lunar oxygen and metal for use in near-earth space - Magma electrolysis
NASA Technical Reports Server (NTRS)
Colson, Russell O.; Haskin, Larry A.
1990-01-01
The unique conditions on the moon, such as vacuum, absence of many reagents common on the earth, and presence of very nontraditional 'ores', suggest that a unique and nontraditional process for extracting materials from the ores may prove the most practical. An investigation has begun into unfluxed silicate electrolysis as a method for extracting oxygen, Fe, and Si from lunar regolith. The advantages of the process include simplicity of concept, absence of need to supply reagents from the earth, and low power and mass requirements for the processing plant. Disadvantages include the need for uninterrupted high temperature and the highly corrosive nature of the high-temperature silicate melts, which has made identifying suitable electrode and container materials difficult.
How to Measure Outcomes of Peripheral Nerve Surgery
Wang, Yirong; Sunitha, Malay; Chung, Kevin C.
2013-01-01
Synopsis: Evaluation of outcomes after peripheral nerve surgery includes a number of assessment methods that reflect different aspects of recovery, including reinnervation, tactile gnosis, integrated sensory and motor function, pain and discomfort, and neurophysiological and patient-reported outcomes. This review lists measurements addressing these aspects as well as the advantages and disadvantages of each tool. Because of the complexities of neurophysiology, assessment remains a difficult process, which requires researchers to focus on the measurements most relevant to specific conditions and research questions. PMID:23895715
Photochemistry and Photophysics of Aqueous Cr(NH3)5(CN)(2+) and Trans-Cr(NH3)4(CN)(2+).
1983-06-01
...molecular sieve, and distilled at reduced pressure under a nitrogen atmosphere before use. ... Equipment and procedures. Emission lifetimes ... either process could be assigned such an activation energy from the molecular point of view. Chemical reaction from D1 could certainly be activated. In ... requiring both a change in molecular geometry and in spin. In the present case, incidentally, it is difficult to estimate the Q10-D0 energy gap because of ...
Fabrication Of Metal Chloride Cathodes By Sintering
NASA Technical Reports Server (NTRS)
Bugga, Ratnakumar V.; Di Stefano, Salvador; Bankston, C. Perry
1992-01-01
Transition-metal chloride cathodes for use in high-temperature rechargeable sodium batteries prepared by sintering transition-metal powders mixed with sodium chloride. Need for difficult and dangerous chlorination process eliminated. Proportions of transition metal and sodium chloride in mixture adjusted to suit specific requirements. Cathodes integral to sodium/metal-chloride batteries, which have advantages over sodium/sulfur batteries including higher energy densities, increased safety, reduced material and thermal-management problems, and ease of operation and assembly. Being evaluated for supplying electrical power during peak demand and for electric vehicles.
How to measure outcomes of peripheral nerve surgery.
Wang, Yirong; Sunitha, Malay; Chung, Kevin C
2013-08-01
Evaluation of outcomes after peripheral nerve surgery includes several assessment methods that reflect different aspects of recovery, including reinnervation, tactile gnosis, integrated sensory and motor function, pain and discomfort, and neurophysiologic and patient-reported outcomes. This review lists measurements addressing these aspects as well as the advantages and disadvantages of each tool. Because of the complexities of neurophysiology, assessment remains a difficult process, which requires researchers to focus on the measurements most relevant to specific conditions and research questions. Copyright © 2013 Elsevier Inc. All rights reserved.
SIproc: an open-source biomedical data processing platform for large hyperspectral images.
Berisha, Sebastian; Chang, Shengyuan; Saki, Sam; Daeinejad, Davar; He, Ziqi; Mankar, Rupali; Mayerich, David
2017-04-10
There has recently been significant interest within the vibrational spectroscopy community to apply quantitative spectroscopic imaging techniques to histology and clinical diagnosis. However, many of the proposed methods require collecting spectroscopic images that have a similar region size and resolution to the corresponding histological images. Since spectroscopic images contain significantly more spectral samples than traditional histology, the resulting data sets can approach hundreds of gigabytes to terabytes in size. This makes them difficult to store and process, and the tools available to researchers for handling large spectroscopic data sets are limited. Fundamental mathematical tools, such as MATLAB, Octave, and SciPy, are extremely powerful but require that the data be stored in fast memory. This memory limitation becomes impractical for even modestly sized histological images, which can be hundreds of gigabytes in size. In this paper, we propose an open-source toolkit designed to perform out-of-core processing of hyperspectral images. By taking advantage of graphical processing unit (GPU) computing combined with adaptive data streaming, our software alleviates common workstation memory limitations while achieving better performance than existing applications.
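The out-of-core strategy the abstract describes can be illustrated in a few lines. The sketch below is a hypothetical stand-in for SIproc's approach, using a memory-mapped file rather than GPU streaming: a summary statistic is computed over a data cube that is never loaded into RAM in full. Array sizes are shrunk drastically for the demo.

import numpy as np
import tempfile, os

# Create a hypothetical hyperspectral cube on disk: (pixels, bands).
# Real data would be hundreds of gigabytes; sizes here are tiny.
n_pixels, n_bands, chunk = 100_000, 256, 8_192
path = os.path.join(tempfile.mkdtemp(), "cube.npy")
cube = np.lib.format.open_memmap(path, mode="w+", dtype=np.float32,
                                 shape=(n_pixels, n_bands))
cube[:] = np.random.rand(n_pixels, n_bands).astype(np.float32)
cube.flush()

# Out-of-core pass: stream fixed-size chunks, never holding the cube in RAM.
data = np.load(path, mmap_mode="r")
acc = np.zeros(n_bands, dtype=np.float64)
for start in range(0, n_pixels, chunk):
    block = np.asarray(data[start:start + chunk])  # only this chunk is read
    acc += block.sum(axis=0, dtype=np.float64)
mean_spectrum = acc / n_pixels
print(mean_spectrum[:5])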
Hendriks, A T W M; van Lier, J B; de Kreuk, M K
Fermentation and anaerobic digestion of organic waste and wastewater are broadly studied and applied. Despite widely available results and data for these processes, comparison of the results reported in the literature is difficult, not only because of the variety of process conditions used, but also because of the many different growth media. Composition of growth media can influence biogas production (rates) and lead to process instability during anaerobic digestion. To be able to compare results of the different studies reported, and to ensure that nutrient limitation is not influencing observations ascribed to process dynamics and/or reaction kinetics, a standard protocol for creating a defined growth medium for anaerobic digestion and mixed culture fermentation is proposed. This paper explains the role(s) of the different macro- and micronutrients, as well as the choices for a growth medium formulation strategy. In addition, the differences in nutrient requirements between mesophilic and thermophilic systems are discussed, as well as the importance of specific trace metals for specific conversion routes and the possible supplementary requirement of vitamins. The paper also gives some insight into the bio-availability and toxicity of trace metals. A remarkable finding is that mesophilic and thermophilic enzymes are quite comparable at their optimum temperatures. This has consequences for the trace metal requirements of thermophiles under certain conditions. Under non-limiting conditions, the trace metal requirement of thermophilic systems is about 3 times higher than for mesophilic systems. Copyright © 2017 Elsevier Inc. All rights reserved.
[Leadership and change processes in hospitals].
Skogsaas, Bente P; Svendsen, Martin Veel
2006-11-30
Successful change processes in hospitals require leaders with strong competence and personal suitability, who can develop resource-efficient and creative solutions. We have investigated how division leaders handle change processes and solve problems that arise in cross-disciplinary meeting activities. Eight division leaders at two hospitals in the same region of Norway went through in-depth interviews about change leadership. Some of the division leaders were familiar with facilitating change processes and used a range of methods and tools, but the majority had limited insight into which methods would be most appropriate in the various phases of a change process. They signalled that the most difficult challenge was handling interactions dominated by suspicion, negative interpretation, assumptions and hidden agendas. Such interplays were the most limiting factor in the development of a common understanding of demands, goals and commitment to change processes across departments and units.
Advanced Q-switched DPSS lasers for ID-card marking
NASA Astrophysics Data System (ADS)
Hertwig, Michael; Paster, Martin; Terbrueggen, Ralf
2008-02-01
Increased homeland security concerns across the world have generated a strong demand for forgery-proof ID documents. Manufacturers currently employ a variety of high technology techniques to produce documents that are difficult to copy. However, production costs and lead times are still a concern when considering any possible manufacturing technology. Laser marking has already emerged as an important tool in the manufacturer's arsenal, and is currently being utilized to produce a variety of documents, such as plastic ID cards, drivers' licenses, health insurance cards and passports. The marks utilized can range from simple barcodes and text to high resolution, true grayscale images. The technical challenges posed by these marking tasks include delivering adequate mark legibility, minimizing substrate burning or charring, accurately reproducing grayscale data, and supporting the required process throughput. This article covers the advantages of and basic requirements for laser marking of cards and reviews how laser output parameters affect marking quality, speed and overall process economics.
Automatic Analysis of Critical Incident Reports: Requirements and Use Cases.
Denecke, Kerstin
2016-01-01
Increasingly, critical incident reports are used as a means to increase patient safety and quality of care. The full potential of these sources of experiential knowledge often remains unexploited, since retrieval and analysis are difficult and time-consuming and the reporting systems often do not provide support for these tasks. The objective of this paper is to identify potential use cases for automatic methods that analyse critical incident reports. In more detail, we describe how faceted search could offer intuitive retrieval of critical incident reports and how text mining could support the analysis of relations among events. To realise an automated analysis, natural language processing needs to be applied. Therefore, we analyse the language of critical incident reports and derive requirements for automatic processing methods. We learned that there is huge potential for automatic analysis of incident reports, but there are still challenges to be solved.
Multiple functions of BCL-2 family proteins.
Hardwick, J Marie; Soane, Lucian
2013-02-01
BCL-2 family proteins are the regulators of apoptosis, but also have other functions. This family of interacting partners includes inhibitors and inducers of cell death. Together they regulate and mediate the process by which mitochondria contribute to cell death known as the intrinsic apoptosis pathway. This pathway is required for normal embryonic development and for preventing cancer. However, before apoptosis is induced, BCL-2 proteins have critical roles in normal cell physiology related to neuronal activity, autophagy, calcium handling, mitochondrial dynamics and energetics, and other processes of normal healthy cells. The relative importance of these physiological functions compared to their apoptosis functions in overall organismal physiology is difficult to decipher. Apoptotic and noncanonical functions of these proteins may be intertwined to link cell growth to cell death. Disentanglement of these functions may require delineation of biochemical activities inherent to the characteristic three-dimensional shape shared by distantly related viral and cellular BCL-2 family members.
Optimally designing games for behavioural research
Rafferty, Anna N.; Zaharia, Matei; Griffiths, Thomas L.
2014-01-01
Computer games can be motivating and engaging experiences that facilitate learning, leading to their increasing use in education and behavioural experiments. For these applications, it is often important to make inferences about the knowledge and cognitive processes of players based on their behaviour. However, designing games that provide useful behavioural data is a difficult task that typically requires significant trial and error. We address this issue by creating a new formal framework that extends optimal experiment design, used in statistics, to apply to game design. In this framework, we use Markov decision processes to model players' actions within a game, and then make inferences about the parameters of a cognitive model from these actions. Using a variety of concept learning games, we show that in practice, this method can predict which games will result in better estimates of the parameters of interest. The best games require only half as many players to attain the same level of precision. PMID:25002821
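The logic of optimal experiment design for games can be miniaturized. In the hedged sketch below (an invented Bernoulli response model, not the paper's MDP machinery), two candidate "designs" differ in how strongly responses depend on a latent parameter theta; simulating experiments and measuring posterior spread shows which design yields better estimates per player.

import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the framework: each "game design" yields Bernoulli
# responses whose success probability depends on a latent parameter
# theta (e.g. a learning rate). Design A is more diagnostic than B.
# All functional forms here are invented for illustration.
def p_correct(theta, design):
    return 0.5 + (0.45 if design == "A" else 0.15) * theta

def posterior_sd(theta_true, design, n_players, grid=np.linspace(0, 1, 201)):
    """Simulate one experiment and return the posterior SD of theta."""
    k = (rng.random(n_players) < p_correct(theta_true, design)).sum()
    loglik = (k * np.log(p_correct(grid, design))
              + (n_players - k) * np.log(1 - p_correct(grid, design)))
    post = np.exp(loglik - loglik.max())
    post /= post.sum()
    mean = (grid * post).sum()
    return np.sqrt(((grid - mean) ** 2 * post).sum())

# Expected precision of each design, averaged over simulated experiments:
for design in "AB":
    sds = [posterior_sd(0.6, design, n_players=50) for _ in range(200)]
    print(design, "mean posterior SD:", np.round(np.mean(sds), 3))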
NASA Astrophysics Data System (ADS)
Bag, S.; de, A.
2010-09-01
Transport-phenomena-based heat transfer and fluid flow calculations in the weld pool require a number of input parameters. Arc efficiency, effective thermal conductivity, and viscosity in the weld pool are some of these parameters, whose values are rarely known and difficult to assign a priori on the basis of scientific principles alone. The present work reports a bi-directional three-dimensional (3-D) heat transfer and fluid flow model, which is integrated with a real-number-based genetic algorithm. The bi-directional feature of the integrated model allows the identification of the values of a required set of uncertain model input parameters and, next, the design of process parameters to achieve a target weld pool dimension. The computed values are validated against measured results in linear gas-tungsten-arc (GTA) weld samples. Furthermore, a novel methodology to estimate the overall reliability of the computed solutions is also presented.
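The forward-model-plus-genetic-algorithm loop is straightforward to sketch. Below, a toy forward model (purely invented, standing in for the 3-D heat transfer and fluid flow solver) maps two uncertain inputs, arc efficiency and an effective-conductivity scale, to weld pool width and depth; a minimal real-coded genetic algorithm then recovers the inputs that reproduce a target dimension.

import numpy as np

rng = np.random.default_rng(2)

# Toy forward model: maps (arc efficiency, conductivity scale) to weld
# pool (width, depth) in mm. Illustrative only, not weld physics.
def forward(eta, k_eff):
    width = 8.0 * eta / np.sqrt(k_eff)
    depth = 3.0 * eta * k_eff ** 0.25
    return np.array([width, depth])

target = forward(0.75, 1.8)   # pretend these are measured dimensions

def fitness(pop):
    preds = np.array([forward(eta, k) for eta, k in pop])
    return -np.linalg.norm(preds - target, axis=1)  # higher is better

# Minimal real-coded GA: tournament selection, blend crossover, mutation.
lo, hi = np.array([0.4, 0.5]), np.array([1.0, 3.0])
pop = rng.uniform(lo, hi, size=(40, 2))
for gen in range(60):
    fit = fitness(pop)
    idx = np.array([max(rng.choice(40, 3), key=lambda i: fit[i])
                    for _ in range(40)])
    parents = pop[idx]
    alpha = rng.random((40, 2))
    children = alpha * parents + (1 - alpha) * parents[::-1]
    children += rng.normal(0, 0.02, children.shape)      # mutation
    pop = np.clip(children, lo, hi)
best = pop[np.argmax(fitness(pop))]
print("recovered (eta, k_eff) ~", np.round(best, 2), "true (0.75, 1.8)")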
First-Principles Prediction of Liquid/Liquid Interfacial Tension.
Andersson, M P; Bennetzen, M V; Klamt, A; Stipp, S L S
2014-08-12
The interfacial tension between two liquids is the free energy per unit surface area required to create that interface. Interfacial tension is a determining factor for two-phase liquid behavior in a wide variety of systems, ranging from water flooding in oil recovery processes and remediation of groundwater aquifers contaminated by chlorinated solvents to drug delivery and a host of industrial processes. Here, we present a model for predicting interfacial tension from first principles using density functional theory calculations. Our model requires no experimental input and is applicable to liquid/liquid systems of arbitrary composition. The predictions agree well with experimental data for binary, ternary, and multicomponent water/organic compound systems, which offers confidence in using the model to predict behavior where no data exist. The method is fast and can be used as a screening technique, as well as to extend experimental data into conditions where measurements are technically too difficult, time consuming, or impossible.
The Systems Engineering Process for Human Support Technology Development
NASA Technical Reports Server (NTRS)
Jones, Harry
2005-01-01
Systems engineering is designing and optimizing systems. This paper reviews the systems engineering process and indicates how it can be applied in the development of advanced human support systems. Systems engineering develops the performance requirements, subsystem specifications, and detailed designs needed to construct a desired system. Systems design is difficult, requiring both art and science and balancing human and technical considerations. The essential systems engineering activity is trading off and compromising between competing objectives such as performance and cost, schedule and risk. Systems engineering is not a complete independent process. It usually supports a system development project. This review emphasizes the NASA project management process as described in NASA Procedural Requirement (NPR) 7120.5B. The process is a top down phased approach that includes the most fundamental activities of systems engineering - requirements definition, systems analysis, and design. NPR 7120.5B also requires projects to perform the engineering analyses needed to ensure that the system will operate correctly with regard to reliability, safety, risk, cost, and human factors. We review the system development project process, the standard systems engineering design methodology, and some of the specialized systems analysis techniques. We will discuss how they could apply to advanced human support systems development. The purpose of advanced systems development is not directly to supply human space flight hardware, but rather to provide superior candidate systems that will be selected for implementation by future missions. The most direct application of systems engineering is in guiding the development of prototype and flight experiment hardware. However, anticipatory systems engineering of possible future flight systems would be useful in identifying the most promising development projects.
Raj, M; Choi, S W; Platt, J
2017-02-01
Informed consent (IC) struggles to meet the ethical principles it strives to embody in the context of hematopoietic cell transplantation (HCT). Patients often participate in multiple clinical trials, making it difficult to effectively inform the participants and fulfill complex regulations. The recent Notice of Proposed Rule Making would make major changes to federal requirements, providing a timely opportunity to evaluate existing practice. Twenty health care professionals within a Midwest Academic Medical Center involved in obtaining IC in the HCT clinic or involved in patient care during or after the IC process were interviewed to understand: (1) how they approached the IC process; (2) how they described a 'successful' IC process; and (3) opportunities for innovation. Narrative and discourse analyses of the interviews indicate that providers understand IC to be a collaborative process requiring the engagement and participation of providers, patients and caregivers. 'Markers of success' were identified, including cognitive, affective and procedural markers focusing on patient understanding and comfort with the decision to participate. Opportunities for innovating the process included the use of decision aids and tablet-based technology, and better use of patient portals. Our findings suggest specific interventions for the IC process that could support providers, patients and caregivers.
Singlet oxygen detection in biological systems: Uses and limitations
Koh, Eugene; Fluhr, Robert
2016-01-01
The study of singlet oxygen in biological systems is challenging in many ways. Singlet oxygen is a relatively unstable, ephemeral molecule, and its properties make it highly reactive with many biomolecules, making it difficult to quantify accurately. Several methods have been developed to study this elusive molecule, but most studies thus far have focused on conditions that produce relatively large amounts of singlet oxygen. However, more sensitive methods are required as one begins to explore the levels of singlet oxygen required in signaling and regulatory processes. Here we discuss the various methods used in the study of singlet oxygen, and outline their uses and limitations. PMID:27231787
Brg1 coordinates multiple processes during retinogenesis and is a tumor suppressor in retinoblastoma
Aldiri, Issam; Ajioka, Itsuki; Xu, Beisi; ...
2015-12-01
Retinal development requires precise temporal and spatial coordination of cell cycle exit, cell fate specification, cell migration and differentiation. When this process is disrupted, retinoblastoma, a developmental tumor of the retina, can form. Epigenetic modulators are central to precisely coordinating developmental events, and many epigenetic processes have been implicated in cancer. Studying epigenetic mechanisms in development is challenging because they often regulate multiple cellular processes; therefore, elucidating the primary molecular mechanisms involved can be difficult. Here we explore the role of Brg1 (Smarca4) in retinal development and retinoblastoma in mice using molecular and cellular approaches. Brg1 was found to regulate retinal size by controlling cell cycle length, cell cycle exit and cell survival during development. Brg1 was not required for cell fate specification but was required for photoreceptor differentiation and cell adhesion/polarity programs that contribute to proper retinal lamination during development. The combination of defective cell differentiation and lamination led to retinal degeneration in Brg1-deficient retinae. Despite the hypocellularity, premature cell cycle exit, increased cell death and extended cell cycle length, retinal progenitor cells persisted in Brg1-deficient retinae, making them more susceptible to retinoblastoma. In conclusion, ChIP-Seq analysis suggests that Brg1 might regulate gene expression through multiple mechanisms.
Requirements to the procedure and stages of innovative fuel development
NASA Astrophysics Data System (ADS)
Troyanov, V.; Zabudko, L.; Grachyov, A.; Zhdanova, O.
2016-04-01
According to the accepted current understanding, by nuclear fuel we mean the assembled core unit (fuel assembly) with its structural elements, fuel rods, pellet column, and the structural materials of fuel rods and fuel assemblies. The licensing process includes justification of the safe application of the proposed modifications, including design-basis and experimental justification of the modified items under normal operating conditions and under violations of normal conditions, including accidents. Besides the justification of the modified units themselves, it is required to show the influence of the modifications on the performance and safety of the other reactor unit and nuclear plant elements (e.g. the burst can detection system, and transportation and processing operations during fuel handling), as well as to justify new standards of fuel storage, etc. Finally, the modified fuel should comply with the applicable regulations, which often becomes a very difficult task, if only because those regulations, such as NP-082-07, do not cover modification issues. Making amendments to the regulations can be considered the only solution, but the process is complicated and requires deep grounds for the amendments. Some aspects of licensing new nuclear fuel are considered using the example of mixed uranium-plutonium nitride fuel for the BREST reactor unit.
An analysis of the processing requirements of a complex perceptual-motor task
NASA Technical Reports Server (NTRS)
Kramer, A. F.; Wickens, C. D.; Donchin, E.
1983-01-01
Current concerns in the assessment of mental workload are discussed, and the event-related brain potential (ERP) is introduced as a promising mental-workload index. Subjects participated in a series of studies in which they were required to perform a target acquisition task while also covertly counting either auditory or visual probes. The effects of several task-difficulty manipulations on the P300 component of the ERP elicited by the counted stimulus probes were investigated. With sufficiently practiced subjects the amplitude of the P300 was found to decrease with increases in task difficulty. The second experiment also provided evidence that the P300 is selectively sensitive to task-relevant attributes. A third experiment demonstrated a convergence in the amplitude of the P300s elicited in the simple and difficult versions of the tracking task. The amplitude of the P300 was also found to covary with the measures of tracking performance. The results of the series of three experiments illustrate the sensitivity of the P300 to the processing requirements of a complex target acquisition task. The findings are discussed in terms of the multidimensional nature of processing resources.
Functional Characterization of the Cingulo-Opercular Network in the Maintenance of Tonic Alertness
Sadaghiani, Sepideh; D'Esposito, Mark
2015-01-01
The complex processing architecture underlying attentional control requires delineation of the functional role of different control-related brain networks. A key component is the cingulo-opercular (CO) network composed of anterior insula/operculum, dorsal anterior cingulate cortex, and thalamus. Its function has been particularly difficult to characterize due to the network's pervasive activity and frequent co-activation with other control-related networks. We previously suggested this network to underlie intrinsically maintained tonic alertness. Here, we tested this hypothesis by separately manipulating the demand for selective attention and for tonic alertness in a two-factorial, continuous pitch discrimination paradigm. The 2 factors had independent behavioral effects. Functional imaging revealed that activity as well as functional connectivity in the CO network increased when the task required more tonic alertness. Conversely, heightened selective attention to pitch increased activity in the dorsal attention (DAT) network but not in the CO network. Across participants, performance accuracy showed dissociable correlation patterns with activity in the CO, DAT, and fronto-parietal (FP) control networks. These results support tonic alertness as a fundamental function of the CO network. They further the characterization of this function as the effortful process of maintaining cognitive faculties available for current processing requirements. PMID:24770711
Repatriation of human remains following death in international travellers.
Connolly, Ruairi; Prendiville, Richard; Cusack, Denis; Flaherty, Gerard
2017-03-01
Death during international travel and the repatriation of human remains to one's home country is a distressing and expensive process. Much organization is required involving close liaison between various agencies. A review of the literature was conducted using the PubMed database. Search terms included: 'repatriation of remains', 'death', 'abroad', 'tourism', 'travel', 'travellers', 'travelling' and 'repatriation'. Additional articles were obtained from grey literature sources and reference lists. The local national embassy, travel insurance broker and tour operator are important sources of information to facilitate the repatriation of the deceased traveller. Formal identification of the deceased's remains is required and a funeral director must be appointed. Following this, the coroner in the country or jurisdiction receiving the repatriated remains will require a number of documents prior to providing clearance for burial. Costs involved in repatriating remains must be borne by the family of the deceased although travel insurance may help defray some of the costs. If the death is secondary to an infectious disease, cremation at the site of death is preferred. No standardized procedure is in place to deal with the remains of a migrant's body at present and these remains are often not repatriated to their country of origin. Repatriation of human remains is a difficult task which is emotionally challenging for the bereaving family and friends. As a travel medicine practitioner, it is prudent to discuss all eventualities, including the risk of death, during the pre-travel consultation. Awareness of the procedures involved in this process may ease the burden on the grieving family at a difficult time. © International Society of Travel Medicine, 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
Lock hopper valves for coal gasification plant service
NASA Technical Reports Server (NTRS)
Schoeneweis, E. F.
1977-01-01
Although the operating principle of the lock hopper system is extremely simple, valve applications in this service for coal gasification plants are extremely difficult. The difficulties center on the requirement of handling highly erosive pulverized coal or char (either in dry or slurry form) combined with the requirement of providing tight sealing against high-pressure (possibly very hot) gas. Operating pressures and temperatures in these applications typically range up to 1600 psi (110 bar) and 600°F (316°C), with certain process requirements going even higher. In addition, and of primary concern, is the need for reliable operation over long service periods with provision for practical and economical maintenance. Currently available data indicate a requirement for on the order of 20,000 to 30,000 open-close cycles per year and a desire to operate at least that long without valve failure.
Zhang, Yajun; Chai, Tianyou; Wang, Hong; Wang, Dianhui; Chen, Xinkai
2018-06-01
Complex industrial processes are multivariable and generally exhibit strong coupling among their control loops together with strongly nonlinear behavior. This makes it very difficult to obtain an accurate model, and as a result conventional and data-driven control methods are difficult to apply. Using a twin-tank level control system as an example, a novel multivariable decoupling control algorithm with adaptive neural-fuzzy inference system (ANFIS)-based unmodeled dynamics (UD) compensation is proposed in this paper for a class of complex industrial processes. First, a nonlinear multivariable decoupling controller with UD compensation is introduced. Different from existing methods, a decomposition estimation algorithm using ANFIS is employed to estimate the UD, and the desired estimation and decoupling control effects are achieved. Second, the proposed method does not require the complicated switching mechanism that has been commonly used in the literature. This significantly simplifies the obtained decoupling algorithm and its realization. Third, based on some new lemmas and theorems, the conditions on the stability and convergence of the closed-loop system are analyzed to show the uniform boundedness of all the variables. This is then followed by a summary of experimental tests on a heavily coupled nonlinear twin-tank system that demonstrate the effectiveness and practicability of the proposed method.
Implications of learning theory for developing programs to decrease overeating.
Boutelle, Kerri N; Bouton, Mark E
2015-10-01
Childhood obesity is associated with medical and psychological comorbidities, and interventions targeting overeating could be pragmatic and have a significant impact on weight. Calorically dense foods are easily available, variable, and tasty, which allows for effective opportunities to learn to associate behaviors and cues in the environment with food through fundamental conditioning processes, resulting in measurable psychological and physiological food cue reactivity in vulnerable children. Basic research suggests that initial learning is difficult to erase, and that it is vulnerable to a number of phenomena that allow the original learning to re-emerge after it is suppressed or replaced. These processes may help explain why it may be difficult to change food cue reactivity and overeating over the long term. Extinction theory may be used to develop effective cue-exposure treatments that decrease food cue reactivity through inhibitory learning, although these processes are complex and require an integral understanding of the theory and of individual differences. Additionally, learning theory can be used to develop other interventions that may prove to be useful. Through an integration of learning theory with basic and translational research, it may be possible to develop interventions that decrease the urge to overeat and improve the weight status of children. Copyright © 2015 Elsevier Ltd. All rights reserved.
Milan, Felise B; Parish, Sharon J; Reichgott, Michael J
2006-01-01
Feedback is an essential tool in medical education, and the process is often difficult for both faculty and learner. There are strong analogies between the provision of educational feedback and doctor-patient communication during the clinical encounter. Relationship-building skills used in the clinical setting (Partnership, Empathy, Apology, Respect, Legitimation, Support: PEARLS) can establish trust with the learner to better manage difficult feedback situations involving personal issues, unprofessional behavior, or a defensive learner. Using the stage-of-readiness-to-change (transtheoretical) model, the educator can "diagnose" the learner's stage of readiness and employ focused interventions to encourage desired changes. This approach has been positively received by medical educators in faculty development workshops. A model for the provision of educational feedback based on communication skills used in the clinical encounter can be useful in the medical education setting. More robust evaluation of its construct validity is required in actual training program situations.
Intuitive Logic Revisited: New Data and a Bayesian Mixed Model Meta-Analysis
Singmann, Henrik; Klauer, Karl Christoph; Kellen, David
2014-01-01
Recent research on syllogistic reasoning suggests that the logical status (valid vs. invalid) of even difficult syllogisms can be intuitively detected via differences in conceptual fluency between logically valid and invalid syllogisms when participants are asked to rate how much they like a conclusion following from a syllogism (Morsanyi & Handley, 2012). These claims of an intuitive logic are at odds with most theories of syllogistic reasoning, which posit that detecting the logical status of difficult syllogisms requires effortful and deliberate cognitive processes. We present new data replicating the effects reported by Morsanyi and Handley, but show that the effect is eliminated when controlling for a possible confound in terms of conclusion content. Additionally, we reanalyze three studies without this confound using a Bayesian mixed model meta-analysis (i.e., controlling for participant and item effects), which provides evidence for the null hypothesis and against Morsanyi and Handley's claim. PMID:24755777
An Algorithm to Automate Yeast Segmentation and Tracking
Doncic, Andreas; Eser, Umut; Atay, Oguzhan; Skotheim, Jan M.
2013-01-01
Our understanding of dynamic cellular processes has been greatly enhanced by rapid advances in quantitative fluorescence microscopy. Imaging single cells has emphasized the prevalence of phenomena that can be difficult to infer from population measurements, such as all-or-none cellular decisions, cell-to-cell variability, and oscillations. Examination of these phenomena requires segmenting and tracking individual cells over long periods of time. However, accurate segmentation and tracking of cells is difficult and is often the rate-limiting step in an experimental pipeline. Here, we present an algorithm that accomplishes fully automated segmentation and tracking of budding yeast cells within growing colonies. The algorithm incorporates prior information of yeast-specific traits, such as immobility and growth rate, to segment an image using a set of threshold values rather than one specific optimized threshold. Results from the entire set of thresholds are then used to perform a robust final segmentation. PMID:23520484
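The multi-threshold idea can be demonstrated compactly. The following sketch (synthetic image and invented threshold range, SciPy required; not the authors' algorithm) segments at a whole set of cutoffs and keeps pixels that are foreground in a majority of them, a simplified version of the robust consensus segmentation described above.

import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)

# Synthetic "microscopy" frame: two dim blobs (cells) on a noisy
# background. Real frames would come from the microscope.
yy, xx = np.mgrid[0:128, 0:128]
img = (np.exp(-(((yy - 40) ** 2 + (xx - 40) ** 2) / 150))
       + np.exp(-(((yy - 85) ** 2 + (xx - 90) ** 2) / 200)))
img += rng.normal(0, 0.05, img.shape)

# Instead of one optimized cutoff, segment at a whole set of thresholds
# and keep pixels that are foreground in most of them.
thresholds = np.linspace(0.2, 0.6, 9)
votes = sum((img > t).astype(int) for t in thresholds)
consensus = votes >= len(thresholds) // 2 + 1

labels, n_cells = ndimage.label(consensus)
print(f"found {n_cells} cell-like regions")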
Hybrid neural network and fuzzy logic approaches for rendezvous and capture in space
NASA Technical Reports Server (NTRS)
Berenji, Hamid R.; Castellano, Timothy
1991-01-01
The nonlinear behavior of many practical systems and unavailability of quantitative data regarding the input-output relations makes the analytical modeling of these systems very difficult. On the other hand, approximate reasoning-based controllers which do not require analytical models have demonstrated a number of successful applications such as the subway system in the city of Sendai. These applications have mainly concentrated on emulating the performance of a skilled human operator in the form of linguistic rules. However, the process of learning and tuning the control rules to achieve the desired performance remains a difficult task. Fuzzy Logic Control is based on fuzzy set theory. A fuzzy set is an extension of a crisp set. Crisp sets only allow full membership or no membership at all, whereas fuzzy sets allow partial membership. In other words, an element may partially belong to a set.
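The crisp-versus-fuzzy distinction in the last sentences is easy to show in code. Here is a minimal illustration with an invented "close" set over distances, using a standard triangular membership function; only the numbers are assumptions.

import numpy as np

def triangular(x, a, b, c):
    """Triangular fuzzy membership: 0 at a and c, 1 at the peak b."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0, 1)

distance = np.array([0.5, 2.0, 5.0, 9.0])     # e.g. metres to a target

# Crisp set "close" (distance < 4): full membership or none.
crisp_close = (distance < 4.0).astype(float)

# Fuzzy set "close": membership falls off gradually out to 8 m.
fuzzy_close = triangular(distance, a=-8.0, b=0.0, c=8.0)

for d, cr, fz in zip(distance, crisp_close, fuzzy_close):
    print(f"d={d:4.1f} m  crisp={cr:.0f}  fuzzy={fz:.2f}")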
Theoretical Estimation of Thermal Effects in Drilling of Woven Carbon Fiber Composite
Díaz-Álvarez, José; Olmedo, Alvaro; Santiuste, Carlos; Miguélez, María Henar
2014-01-01
Carbon Fiber Reinforced Polymer (CFRPs) composites are extensively used in structural applications due to their attractive properties. Although the components are usually made near net shape, machining processes are needed to achieve dimensional tolerance and assembly requirements. Drilling is a common operation required for further mechanical joining of the components. CFRPs are vulnerable to processing induced damage; mainly delamination, fiber pull-out, and thermal degradation, drilling induced defects being one of the main causes of component rejection during manufacturing processes. Despite the importance of analyzing thermal phenomena involved in the machining of composites, only few authors have focused their attention on this problem, most of them using an experimental approach. The temperature at the workpiece could affect surface quality of the component and its measurement during processing is difficult. The estimation of the amount of heat generated during drilling is important; however, numerical modeling of drilling processes involves a high computational cost. This paper presents a combined approach to thermal analysis of composite drilling, using both an analytical estimation of heat generated during drilling and numerical modeling for heat propagation. Promising results for indirect detection of risk of thermal damage, through the measurement of thrust force and cutting torque, are obtained. PMID:28788685
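The first step of such an analysis, estimating heat generation from the measured thrust force and cutting torque, amounts to a mechanical power balance, P = F * v_feed + M * omega. A small numeric sketch with invented but order-of-magnitude inputs shows why the torque term dominates:

import numpy as np

# Mechanical power delivered during drilling, an upper bound on heat
# generation: P = F * v_feed + M * omega. Inputs below are assumed.
thrust_force = 60.0          # N, measured thrust
torque = 0.35                # N*m, measured cutting torque
spindle_rpm = 3000.0
feed = 0.05e-3               # m per revolution

omega = 2 * np.pi * spindle_rpm / 60          # rad/s
v_feed = feed * spindle_rpm / 60              # m/s
power = thrust_force * v_feed + torque * omega

print(f"feed power  : {thrust_force * v_feed * 1e3:.2f} mW")
print(f"torque power: {torque * omega:.1f} W")   # dominates by far
print(f"total heat-generation bound: {power:.1f} W")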
An Approach to Building a Traceability Tool for Software Development
NASA Technical Reports Server (NTRS)
Delgado, Nelly; Watson, Tom
1997-01-01
It is difficult in a large, complex computer program to ensure that it meets the specified requirements. As the program evolves over time, all program constraints originally elicited during the requirements phase must be maintained. In addition, during the life cycle of the program, requirements typically change and the program must consistently reflect those changes. Imagine the following scenario. Company X wants to develop a system to automate its assembly line. With such a large system, there are many different stakeholders, e.g., managers, experts such as industrial and mechanical engineers, and end-users. Requirements would be elicited from all of the stakeholders involved in the system, with each stakeholder contributing their point of view to the requirements. For example, some of the requirements provided by an industrial engineer may concern the movement of parts through the assembly line. A point of view provided by the electrical engineer may be reflected in constraints concerning maximum power usage. End-users may be concerned with comfort and safety issues, whereas managers are concerned with the efficiency of the operation. With so many points of view affecting the requirements, it is difficult to manage them and communicate information to relevant stakeholders, and it is likely that conflicts in the requirements will arise. In the coding process, the implementors will make additional assumptions and interpretations about the design and the requirements of the system. During any stage of development, stakeholders may request that a requirement be added or changed. In such a dynamic environment, it is difficult to guarantee that the system will preserve the current set of requirements. Tracing, the mapping between objects in the artifacts of the system being developed, addresses this issue. Artifacts encompass documents such as the system definition, interview transcripts, memoranda, the software requirements specification, user's manuals, the functional specifications, design reports, and system code. Tracing helps 1) validate system features against the requirements specification, 2) identify error sources and, most importantly, 3) manage change. With so many people involved in the development of the system, it becomes necessary to identify the reasons behind the design requirements or the implementation decisions. This paper is concerned with an approach that maps documents to constraints that capture properties of, and relationships between, the objects being modeled by the program. Section 2 provides the reader with a background on traceability tools. Section 3 gives a brief description of the context monitoring system on which the approach suggested in this paper is based. Section 4 presents an overview of our approach to providing traceability. The last section presents our future direction of research.
NASA Astrophysics Data System (ADS)
Yadav, Vinod; Singh, Arbind Kumar; Dixit, Uday Shanker
2017-08-01
Flat rolling is one of the most widely used metal forming processes. For proper control and optimization of the process, modelling is essential. Modelling of the process requires input data about material properties and friction. In a batch production mode of rolling with newer materials, it may be difficult to determine the input parameters offline. In view of this, in the present work, a methodology to determine these parameters online from measurements of exit temperature and slip is verified experimentally. It is observed that the inverse prediction of input parameters could be done with reasonable accuracy. Experiments also showed a correlation between micro-hardness and the flow stress of the material; however, the correlation between surface roughness and reduction is not that obvious.
Experimental analysis of Nd-YAG laser cutting of sheet materials - A review
NASA Astrophysics Data System (ADS)
Sharma, Amit; Yadava, Vinod
2018-01-01
Cutting of sheet material is considered an important process due to its relevance to products of everyday life such as aircraft, ships, cars, furniture, etc. Among the various sheet cutting processes (ASCPs), laser beam cutting is one of the most capable, able to create complex geometries with stringent design requirements in difficult-to-cut sheet materials. Based on recent research work in the area of sheet cutting, it is found that the Nd-YAG laser is used for cutting of sheet material in general and reflective sheet material in particular. This paper reviews the experimental analysis of the Nd-YAG laser cutting process, carried out to study the influence of laser cutting parameters on the process performance index. The significance of experimental modeling and the different optimization approaches employed by various researchers are also discussed in this study.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tatur, I.R.; Prigul`skii, G.B.; Timokhin, I.A.
Enamel paints MS-17 and KhV-110 are used to protect drill bits during transportation and storage. But this requires the surface to be prepared carefully, which is often difficult under production conditions. Some of the promising anticorrosion agents are film-forming inhibited petroleum compounds (FIPC) - materials derived from high-temperature petroleum products blended with corrosion inhibitors and a solvent. Such compounds are used to protect unpainted and painted surfaces; this shortens the preservation process, and generally dispenses with depreservation. Further, they can be used to protect moist, wet, greasy, and rusted surfaces, and concealed inner areas where paint is difficult to apply. The purpose of this work was to obtain a film-forming inhibited petroleum compound that has high protective properties, can be applied to unprepared metal surfaces, and meets the following requirements: drill bit protection time during transportation and storage of at least 24 months, coat adhesion to the metal of at least force 2, drop point > 90°C, the material must be applicable by pneumatic spraying, in toxicity and inflammability the compound must be of class three, and coat drying time at 60°C not more than 12 min. The anticorrosion agents are described.
NASA Astrophysics Data System (ADS)
Rimland, Jeffrey; Ballora, Mark; Shumaker, Wade
2013-05-01
As the sheer volume of data grows exponentially, it becomes increasingly difficult for existing visualization techniques to keep pace. The sonification field attempts to address this issue by enlisting our auditory senses to detect anomalies or complex events that are difficult to detect via visualization alone. Storification attempts to improve analyst understanding by converting data streams into organized narratives describing the data at a higher level of abstraction than the input streams they are derived from. While these techniques hold a great deal of promise, each also has a unique set of challenges that must be overcome. Sonification techniques must represent a broad variety of distributed heterogeneous data and present it to the analyst/listener in a manner that does not require extended listening: visual "snapshots" can be taken in at a glance, whereas sounds exist only over time. Storification still faces many human-computer interface (HCI) challenges as well as technical hurdles related to automatically generating a logical narrative from lower-level data streams. This paper proposes a novel approach that utilizes a service oriented architecture (SOA)-based hybrid visualization/sonification/storification framework to enable distributed human-in-the-loop processing of data in a manner that makes optimized use of both visual and auditory processing pathways while also leveraging the value of narrative explication of data streams. It addresses the benefits and shortcomings of each processing modality and discusses the information infrastructure and data representation concerns that arise with their utilization in a distributed environment. We present a generalizable approach with a broad range of applications including cyber security, medical informatics, facilitation of energy savings in "smart" buildings, and detection of natural and man-made disasters.
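A minimal sonification of a data stream fits in a short script. The sketch below (invented data and pitch mapping, not the authors' framework) renders a random walk with one injected anomaly as a sequence of tones and writes a WAV file; the anomaly is heard as an abrupt pitch jump.

import numpy as np
import wave

# Map a data stream to pitch: higher values -> higher frequency. An
# anomaly that is easy to miss in a table becomes an audible jump.
rng = np.random.default_rng(4)
data = np.cumsum(rng.normal(0, 1, 60))
data[42] += 12.0                                  # injected anomaly

rate, note_s = 44100, 0.12
lo_f, hi_f = 220.0, 880.0                          # A3..A5 pitch range
norm = (data - data.min()) / np.ptp(data)
freqs = lo_f * (hi_f / lo_f) ** norm               # log-spaced pitch map

t = np.arange(int(rate * note_s)) / rate
tones = np.concatenate([np.sin(2 * np.pi * f * t) for f in freqs])
pcm = (tones * 32767 * 0.5).astype(np.int16)

with wave.open("sonified.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)
    w.setframerate(rate)
    w.writeframes(pcm.tobytes())
print("wrote sonified.wav:", len(pcm) / rate, "seconds")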
Constraints on Fluctuations in Sparsely Characterized Biological Systems.
Hilfinger, Andreas; Norman, Thomas M; Vinnicombe, Glenn; Paulsson, Johan
2016-02-05
Biochemical processes are inherently stochastic, creating molecular fluctuations in otherwise identical cells. Such "noise" is widespread but has proven difficult to analyze because most systems are sparsely characterized at the single cell level and because nonlinear stochastic models are analytically intractable. Here, we exactly relate average abundances, lifetimes, step sizes, and covariances for any pair of components in complex stochastic reaction systems even when the dynamics of other components are left unspecified. Using basic mathematical inequalities, we then establish bounds for whole classes of systems. These bounds highlight fundamental trade-offs that show how efficient assembly processes must invariably exhibit large fluctuations in subunit levels and how eliminating fluctuations in one cellular component requires creating heterogeneity in another.
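The flavor of these exact relations can be seen in the simplest stochastic reaction system. The sketch below simulates a birth-death process with the Gillespie algorithm and checks the textbook Poisson relation CV^2 = 1/<x>; the paper's bounds generalize this kind of link between averages and fluctuations to sparsely characterized networks. Rate constants are arbitrary.

import numpy as np

rng = np.random.default_rng(5)

# Birth-death process: 0 -> X at rate k, X -> 0 at rate g*x, simulated
# with the Gillespie algorithm. The stationary distribution is Poisson,
# so the squared coefficient of variation obeys CV^2 = 1/<x>.
k, g = 50.0, 1.0
x, t, t_end = 50, 0.0, 500.0
values, dwell = [], []
while t < t_end:
    birth, death = k, g * x
    total = birth + death
    dt = rng.exponential(1.0 / total)
    values.append(x)
    dwell.append(dt)
    x += 1 if rng.random() < birth / total else -1
    t += dt

v, w = np.array(values, float), np.array(dwell)
mean = np.average(v, weights=w)          # time-weighted average
var = np.average((v - mean) ** 2, weights=w)
print(f"<x> = {mean:.1f}   CV^2 = {var / mean**2:.4f}   1/<x> = {1 / mean:.4f}")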
Sequential establishment of stripe patterns in an expanding cell population.
Liu, Chenli; Fu, Xiongfei; Liu, Lizhong; Ren, Xiaojing; Chau, Carlos K L; Li, Sihong; Xiang, Lu; Zeng, Hualing; Chen, Guanhua; Tang, Lei-Han; Lenz, Peter; Cui, Xiaodong; Huang, Wei; Hwa, Terence; Huang, Jian-Dong
2011-10-14
Periodic stripe patterns are ubiquitous in living organisms, yet the underlying developmental processes are complex and difficult to disentangle. We describe a synthetic genetic circuit that couples cell density and motility. This system enabled programmed Escherichia coli cells to form periodic stripes of high and low cell densities sequentially and autonomously. Theoretical and experimental analyses reveal that the spatial structure arises from a recurrent aggregation process at the front of the continuously expanding cell population. The number of stripes formed could be tuned by modulating the basal expression of a single gene. The results establish motility control as a simple route to establishing recurrent structures without requiring an extrinsic pacemaker.
Fluid Physics and Macromolecular Crystal Growth in Microgravity
NASA Technical Reports Server (NTRS)
Pusey, M.; Snell, E.; Judge, R.; Chayen, N.; Boggon, T.; Helliwell, J.; Rose, M. Franklin (Technical Monitor)
2000-01-01
The molecular structure of biological macromolecules is important in understanding how these molecules work and has direct application to rational drug design for new medicines and to the improvement and development of industrial enzymes. In order to obtain the molecular structure, large, well-formed single macromolecule crystals are required. The growth of macromolecule crystals is a difficult task and is often hampered on the ground by fluid flows that result from the interaction of gravity with the crystal growth process. One such effect is the bulk movement of the crystal through the fluid due to sedimentation. A second is buoyancy-driven convection close to the crystal surface. On the ground the crystallization process itself induces both of these flows.
Suau, Salvador J; DeBlieux, Peter M C
2016-02-01
Acute asthma and chronic obstructive pulmonary disease (COPD) exacerbations are the most common respiratory diseases requiring emergent medical evaluation and treatment. Asthma and COPD are chronic, debilitating disease processes that have traditionally been differentiated by the presence or absence of reversible airflow obstruction. Asthma and COPD exacerbations impose an enormous economic burden on the US health care budget. In daily clinical practice, it is difficult to differentiate these 2 obstructive processes based on their symptoms and their nearly identical acute treatment strategies; the major differences lie in the anatomic sites involved, long-term prognosis, and the nature of the inflammatory markers. Copyright © 2016 Elsevier Inc. All rights reserved.
Study of issues in difficult-to-weld thick materials by hybrid laser arc welding
NASA Astrophysics Data System (ADS)
Mazar Atabaki, Mehdi
There is high interest in a high strength-to-weight ratio with good ductility for welds of advanced alloys. Concern about the welding of thick materials (advanced high strength steels (AHSS) and 5xxx and 6xxx series aluminum alloys) has stimulated the development of manufacturing processes to overcome the associated issues. The need to weld dissimilar materials (AHSS and aluminum alloys) also arises in specific applications in different industries. Hence, the development of a state-of-the-art welding procedure can help satisfy these constraints. Among welding methods, hybrid laser/arc welding (HLAW) has been shown to be an effective method for joining thick and difficult-to-weld materials. This process benefits from the advantages of both the gas metal arc welding (GMAW) and laser welding processes. The interaction of the arc and laser helps achieve sufficient weld penetration in thick plates. However, because the welding of dissimilar aluminum alloys and steels is very difficult owing to the formation of brittle intermetallics, the present work proposes a procedure to join these alloys effectively. Published reports show that explosively welded aluminum alloy-to-steel joints have the highest toughness, and such joints can be used as an "insert" (TRICLAD) for welding thick plates of AHSS to aluminum alloys. Therefore, the HLAW of the TRICLAD flange side (aluminum alloy AA 5456) to the web side (aluminum alloys AA 6061 and AA 5456), and of the TRICLAD flange side (ASTM A516) to the web side (AHSS), was studied in the present work. However, many issues related to HLAW of dissimilar steels as well as dissimilar aluminum alloys have to be resolved in order to obtain sound welds. To address the challenges, the most recent welding methods for joining aluminum alloys to steels were studied, and the microstructural development, mechanical properties, and on-line monitoring of the welding processes were discussed as well. The heat and mass transfer and the issues in joining dissimilar alloys by the HLAW process were explained in detail. A finite element model was developed to simulate the heat transfer in HLAW of the aluminum alloys. Two double-ellipsoidal heat source models were considered to describe the heat input of the gas metal arc welding and laser welding processes. An experimental procedure was also developed for joining thick advanced high strength steel plates by HLAW, taking into consideration different butt joint configurations. The geometry of the weld groove was optimized according to the requirements of the ballistic test, where the length of the softened heat affected zone should be less than 15.9 mm measured from the weld centerline. Since the main issue in HLAW of the AHSS was the formation of pores, the possible mechanisms of pore formation and methods for their mitigation during the welding process were investigated. Mitigation methods were proposed to reduce the pores in the weld area, and the influence of each method on process stability was investigated with an on-line monitoring system for the HLAW process. The groove angle was optimized for the welding process based on the allowed amount of heat input along the TRICLAD™ interface generated by explosive welding. The weld fractured in the heat affected zone of the aluminum side in the tensile test.
Microhardness measurements showed that the temperature variation caused only minor softening in the heat-affected zone, satisfying the requirement that the width of the softened heat-affected zone on the steel side fall within 15.9 mm of the weld centerline. Microstructural analysis showed the presence of tempered martensite in the vicinity of the weld area, which was a cause of softening in the heat-affected zone.
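The abstract above refers to two double-ellipsoidal heat source models for the arc and laser heat inputs. As a grounded illustration of that standard formulation (a minimal sketch of the Goldak double-ellipsoidal source, not the author's actual model; all parameter values below are illustrative):

```python
# Minimal sketch of the Goldak double-ellipsoidal heat source commonly
# used to represent arc/laser heat input in welding FEM. Not the thesis
# author's model; parameter values are illustrative only.
import numpy as np

def goldak_flux(x, y, z, Q, a_f, a_r, b, c, f_f=0.6, f_r=1.4):
    """Volumetric heat flux (W/m^3) at (x, y, z) in the moving frame.
    x > 0 is ahead of the source; f_f + f_r must equal 2."""
    a = a_f if x >= 0 else a_r   # front/rear ellipsoid semi-axis
    f = f_f if x >= 0 else f_r   # front/rear heat fraction
    coeff = 6.0 * np.sqrt(3.0) * f * Q / (a * b * c * np.pi * np.sqrt(np.pi))
    return coeff * np.exp(-3.0 * (x / a) ** 2
                          - 3.0 * (y / b) ** 2
                          - 3.0 * (z / c) ** 2)

# Example: peak flux at the source center for a 5 kW effective input.
print(goldak_flux(0.0, 0.0, 0.0, Q=5e3, a_f=4e-3, a_r=8e-3, b=3e-3, c=2e-3))
```

In practice one such source would represent the GMAW arc and a second, narrower and deeper one the laser, their fluxes summed in the thermal model.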
Minimized state complexity of quantum-encoded cryptic processes
NASA Astrophysics Data System (ADS)
Riechers, Paul M.; Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.
2016-05-01
The predictive information required for proper trajectory sampling of a stochastic process can be more efficiently transmitted via a quantum channel than a classical one. This recent discovery allows quantum information processing to drastically reduce the memory necessary to simulate complex classical stochastic processes. It also points to a new perspective on the intrinsic complexity that nature must employ in generating the processes we observe. The quantum advantage increases with codeword length: the length of process sequences used in constructing the quantum communication scheme. In analogy with the classical complexity measure, statistical complexity, we use this reduced communication cost as an entropic measure of state complexity in the quantum representation. Previously difficult to compute, the quantum advantage is expressed here in closed form using spectral decomposition. This allows for efficient numerical computation of the quantum-reduced state complexity at all encoding lengths, including infinite. Additionally, it makes clear how finite-codeword reduction in state complexity is controlled by the classical process's cryptic order, and it allows asymptotic analysis of infinite-cryptic-order processes.
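As a small illustration of the entropic quantities involved (a sketch based on standard quantum-information definitions, not the authors' code), the classical statistical complexity is the Shannon entropy of the stationary distribution over causal states, while the quantum state complexity is the von Neumann entropy of the encoded stationary state, computed by spectral decomposition; non-orthogonal codewords make the latter smaller:

```python
# Sketch only: compares classical statistical complexity with the
# von Neumann entropy of a quantum encoding. The signal states and
# stationary distribution below are illustrative placeholders.
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def von_neumann_entropy(rho):
    """Von Neumann entropy (bits) via spectral decomposition."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # discard numerical zeros
    return -np.sum(evals * np.log2(evals))

# Two non-orthogonal signal states; their overlap shrinks the quantum cost.
s0 = np.array([1.0, 0.0])
s1 = np.array([np.sqrt(0.5), np.sqrt(0.5)])
pi = np.array([0.5, 0.5])                 # stationary state probabilities

rho = sum(p * np.outer(s, s) for p, s in zip(pi, (s0, s1)))
print("classical C_mu:", shannon_entropy(pi))       # 1.0 bit
print("quantum   C_q :", von_neumann_entropy(rho))  # ~0.60 bit < 1
```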
Brunger, Axel T; Das, Debanu; Deacon, Ashley M; Grant, Joanna; Terwilliger, Thomas C; Read, Randy J; Adams, Paul D; Levitt, Michael; Schröder, Gunnar F
2012-04-01
Phasing by molecular replacement remains difficult for targets that are far from the search model or in situations where the crystal diffracts only weakly or to low resolution. Here, the process of determining and refining the structure of Cgl1109, a putative succinyl-diaminopimelate desuccinylase from Corynebacterium glutamicum, at ∼3 Å resolution is described using a combination of homology modeling with MODELLER, molecular-replacement phasing with Phaser, deformable elastic network (DEN) refinement and automated model building using AutoBuild in a semi-automated fashion, followed by final refinement cycles with phenix.refine and Coot. This difficult molecular-replacement case illustrates the power of including DEN restraints derived from a starting model to guide the movements of the model during refinement. The resulting improved model phases provide better starting points for automated model building and produce more significant difference peaks in anomalous difference Fourier maps to locate anomalous scatterers than does standard refinement. This example also illustrates a current limitation of automated procedures that require manual adjustment of local sequence misalignments between the homology model and the target sequence.
Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N A; Bin Zaheer, Kashif
2015-01-01
Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for VBS systems is highly desirable. The innovative or value-based idea is realized on the basis of stakeholder requirements. The quality of a VBS system is associated with a concrete set of valuable requirements, and valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of VBS systems; their focus on valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for VBS systems. The existing approaches are time-consuming, complex, and inconsistent, which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide low-level implementation details for SIQ initiation or stakeholder metrics for quantification. Hence, in view of these existing SIQ problems, this research contributes a new SIQ framework called 'StakeMeter'. The StakeMeter framework is verified and validated through case studies. Compared with other methods, the proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria, and an application procedure. The proposed framework addresses the issues of stakeholder quantification or prioritization, high time consumption, complexity, and process initiation, and it helps in the selection of highly critical stakeholders for VBS systems with less judgmental error.
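The abstract does not reproduce StakeMeter's metrics, but the general shape of a stakeholder-quantification score can be sketched as a weighted sum over rated attributes; the attribute names and weights below are hypothetical, invented purely for illustration:

```python
# Hypothetical illustration only: not StakeMeter's published metrics.
# Shows the general shape of a weighted stakeholder-quantification score.
from dataclasses import dataclass

@dataclass
class Stakeholder:
    name: str
    scores: dict  # attribute -> rating on a fixed 0..5 scale

# Invented attribute weights for the example; they sum to 1.0.
WEIGHTS = {"influence": 0.4, "interest": 0.3, "legitimacy": 0.2, "urgency": 0.1}

def quantify(s: Stakeholder) -> float:
    """Weighted sum of attribute ratings, normalized to 0..1."""
    raw = sum(WEIGHTS[k] * s.scores.get(k, 0) for k in WEIGHTS)
    return raw / (5 * sum(WEIGHTS.values()))

candidates = [
    Stakeholder("product owner", {"influence": 5, "interest": 5,
                                  "legitimacy": 4, "urgency": 3}),
    Stakeholder("end user",      {"influence": 2, "interest": 5,
                                  "legitimacy": 5, "urgency": 4}),
]
for s in sorted(candidates, key=quantify, reverse=True):
    print(f"{s.name}: {quantify(s):.2f}")   # rank stakeholders by score
```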
Environmental Conditions for Space Flight Hardware: A Survey
NASA Technical Reports Server (NTRS)
Plante, Jeannette; Lee, Brandon
2005-01-01
Interest in generalizing the physical environment experienced by NASA hardware, spanning the natural Earth environment (on the launch pad), the man-made environment on Earth (storage, acceptance, and qualification testing), the launch environment, and the space environment, is intended to find commonality among our hardware in an effort to reduce cost and complexity. NASA is entering a period of increase in its number of planetary missions, and it is important to understand how our qualification requirements will evolve with and track these new environments. Environmental conditions are described for NASA projects in several ways for the different periods of the mission life cycle. At the beginning, the mission manager defines survivability requirements based on the mission length, orbit, launch date, launch vehicle, and other factors, such as the use of reactor engines. Margins are then applied to these values (temperature extremes, vibration extremes, radiation tolerances, etc.) and a new set of conditions is generalized for design requirements. Mission assurance documents will then assign an additional margin for reliability, and a third set of values is provided for testing. A fourth set of environmental condition values may evolve intermittently from heritage hardware that has been tested to a level beyond the actual mission requirement. These various sets of environment figures can make it quite confusing and difficult to capture common hardware environmental requirements. Environmental requirement information can be found in a wide variety of places, most obviously with the individual projects. We can easily get answers to questions about the temperature extremes being used and the radiation tolerance goals, but it is more difficult to map the answers to the process that created these requirements: for design, for qualification, and for the actual environment with no margin applied. Not everyone assigned to a NASA project has that kind of insight; many have only the environmental requirement numbers needed to do their jobs and do not necessarily have a programmatic-level understanding of how all of the environmental requirements fit together.
[New therapeutic strategies for the treatment of difficult wounds].
Onesti, M G; Bitonti, Adriana; Fino, P; Ciotti, M; Scuderi, N
2008-05-01
The medical-surgical treatment of difficult wounds represents a continuously growing socio-sanitary problem, currently involving around 2,000,000 people in our country. A "difficult wound" is a loss of cutaneous substance, usually of multifactorial pathogenesis, that does not spontaneously progress to complete recovery. Numerous studies in the literature have shown that the use of advanced wound dressings achieves the best clinical and economic results in the healing of difficult wounds. An advanced wound dressing remains on the injury for a longer period and shortens the time of treatment; as a consequence, fewer applications are required in comparison with traditional dressings. Wound Bed Preparation (WBP) can be defined as the global and coordinated management of the cutaneous injury, removing the local barriers to recovery and promoting the effectiveness of innovative therapeutic instruments. The term advanced wound dressing indicates dressing material with biocompatibility characteristics. The purpose of advanced wound dressings is to create the ideal environment for the cicatrization process and to isolate the wound from trauma and external infection. The "Difficult Wounds" Unit of the Department of Plastic and Reconstructive Surgery of the Policlinico Umberto I in Rome treated, from January to December 2006, 570 patients (308 men and 262 women), aged between 2 days and 85 years, affected by ulcers of various nature. Among these cases, 200 patients were selected and randomly separated into two groups: group A, 100 patients treated entirely with traditional dressings; and group B, 100 patients treated with advanced dressings. Every patient was treated locally with periodic, specific dressings according to the type of difficult wound, and the systemic factors causing the ulcer were subsequently identified and treated. Dressings were changed three times a week in cases presenting signs of infection and twice a week in cases without signs of infection, for periods varying from one month up to one year for the chronic forms. The results showed a higher percentage of recovery with the advanced dressings. In group A, 53% of patients recovered from their wounds; the remaining 47% did not recover, but in 17% of cases the dressings helped prepare the vascular bed for a definitive operation (application of grafts or local flaps), while the remaining 30% showed only slight improvement of the injury and are still under treatment. In group B, 65% of patients recovered from their wounds; among the remaining 35% who did not recover, the dressings served as an auxiliary aid in preparing the vascular bed for a definitive operation (application of grafts or local flaps) in 15% of patients, while the remaining 20%, even if not completely recovered, showed notable improvement of the injury (reduction in size, disappearance of infection, and improvement of the patient's quality of life). In synthesis, advanced dressings, if correctly used, offer advantages in terms of clinical effectiveness (rapid recovery of the injury), patient quality of life, and cost.
It should also be considered that a difficult wound is often the epiphenomenon of a systemic illness; the difficult wound therefore requires multidisciplinary treatment.
NASA Astrophysics Data System (ADS)
Logsdon, M.; Richey, J.; Campbell, B.; Stoermer, M.
2004-12-01
Earth system science is challenged by the intellectual and societal requirements of quantifying the spatial patterns and temporal dynamics of changes in the atmosphere, landscape, and seascape, including human resources management. There are multiple issues in how to do this. The first is establishing the multi-disciplinary basis for systematically organizing the required geophysical elements, from the very slow geological processes forming the basic template to the very fast, event-driven processes brought on by an individual rainstorm. The second is how to mobilize, access, see, and interact with the very disparate sources of information required. The third problem, perhaps the most difficult, is how to get the disparate disciplinary and management experts to interact constructively. These requirements drove the process of establishing the PRISM "Virtual Puget Sound." The basic construct recognizes the inherent time and space attributes of the landscape, and then constructs an informatics environment that allows the respective elements to be brought together in a collaboratory. Central to the enterprise is the use of an XML-enabled DataStream to mobilize data from archives to models to visualizations. Outcomes are addressing such regional issues as daily stream flow, seasonal water supply and demand, low oxygen in Hood Canal, and sewage treatment plant siting. This model is being extended, as an Earth System Module, elsewhere in the world, from the Amazon to the Mekong.
Hussain, Hirra; Fisher, David I; Abbott, W Mark; Roth, Robert G; Dickson, Alan J
2017-10-01
Certain recombinant proteins are deemed "difficult to express" in mammalian expression systems, requiring significant cell and/or process engineering to abrogate expression bottlenecks. With increasing demand for the production of recombinant proteins in mammalian cells, low protein yields can have significant consequences for industrial processes. To investigate the molecular mechanisms that restrict expression of recombinant proteins, naturally secreted model proteins from the tissue inhibitors of metalloproteinase (TIMP) protein family were analyzed. In particular, TIMP-2 and TIMP-3 were subjected to detailed study. TIMP proteins share significant sequence homology (∼50% identity and ∼70% similarity in amino acid sequence). However, they show marked differences in secretion in mammalian expression systems despite this extensive sequence homology. Using these two proteins as models, this study characterized the molecular mechanisms responsible for poor recombinant protein production. Our results reveal that both TIMP-2 and TIMP-3 are detectable at the mRNA and protein level within the cell, but only TIMP-2 is secreted effectively into the extracellular medium. Analysis of protein localization and the nature of the intracellular protein suggests TIMP-3 is severely limited in its post-translational processing. To overcome this challenge, modification of the TIMP-3 sequence to include a furin protease-cleavable pro-sequence resulted in secretion of the modified TIMP-3 protein; however, incomplete processing was observed. Based on the TIMP-3 data, the protein engineering approach was optimized and successfully applied, in combination with cell engineering (the overexpression of furin), to another member of the TIMP protein family (the poorly expressed TIMP-4). Use of the described protein engineering strategy resulted in successful secretion of poorly secreted (TIMP-4) and non-secreted (TIMP-3) targets, and presents a novel strategy to enhance the production of "difficult" recombinant targets. Biotechnol. Bioeng. 2017;114: 2348-2359. © 2017 Wiley Periodicals, Inc.
Holmberg, Leif
2007-11-01
A health-care organization simultaneously belongs to two different institutional value patterns: a professional and an administrative value pattern. At the administrative level, medical problem-solving processes are generally perceived as the efficient application of familiar chains of activities to well-defined problems; and a low task uncertainty is therefore assumed at the work-floor level. This assumption is further reinforced through clinical pathways and other administrative guidelines. However, studies have shown that in clinical practice such administrative guidelines are often considered inadequate and difficult to implement mainly because physicians generally perceive task uncertainty to be high and that the guidelines do not cover the scope of encountered deviations. The current administrative level guidelines impose uniform structural features that meet the requirement for low task uncertainty. Within these structural constraints, physicians must organize medical problem-solving processes to meet any task uncertainty that may be encountered. Medical problem-solving processes with low task uncertainty need to be organized independently of processes with high task uncertainty. Each process must be evaluated according to different performance standards and needs to have autonomous administrative guideline models. Although clinical pathways seem appropriate when there is low task uncertainty, other kinds of guidelines are required when the task uncertainty is high.
Making Difficult Things Easy and Easy Things Difficult.
ERIC Educational Resources Information Center
Campbell, J. Arthur; Bent, Henry A.
1982-01-01
Suggestions are offered to illustrate concepts and processes by using simple materials such as paper, paper clip, rubber band (bonding, entropy, endothermic processes). Also suggests using basic terminology: elementary ratios, percent, reaction chemistry for entropy function; equilibrium constants for Gibbs energies; and chemical mechanics for…
Converging on the optimal attainment of requirements
NASA Technical Reports Server (NTRS)
Feather, M. S.; Menzies, T.
2002-01-01
Planning for the optimal attainment of requirements is an important early lifecycle activity. However, such planning is difficult when dealing with competing requirements, limited resources, and the incompleteness of information available at requirements time.
Estimating direct fatality impacts at wind farms: how far we’ve come, where we have yet to go
Huso, Manuela M.; Schwartz, Susan Savitt
2013-01-01
Measuring the potential impacts of wind farms on wildlife can be difficult and may require development of new statistical tools and models to accurately reflect the measurement process. This presentation reviews the recent history of approaches to estimating wildlife fatality under the unique conditions encountered at wind farms, their unifying themes and their potential shortcomings. Avenues of future research are suggested to continue to address the needs of resource managers and industry in understanding direct impacts of wind turbine-caused wildlife fatality.
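As an illustration of the class of estimators discussed (a minimal sketch of the general form only, not Huso's published estimator), raw carcass counts are scaled up by an overall detection probability that combines searcher efficiency, carcass persistence, and search coverage; all numerical values below are assumptions:

```python
# Sketch of the general form of a wind-farm fatality estimator: observed
# counts divided by the probability that a fatality is actually detected.
# Not a specific published estimator; all values are illustrative.

def estimated_fatalities(carcass_count: int,
                         searcher_efficiency: float,
                         persistence_prob: float,
                         area_coverage: float) -> float:
    """Scale the raw count by the overall detection probability."""
    detection_prob = searcher_efficiency * persistence_prob * area_coverage
    return carcass_count / detection_prob

# 12 carcasses found; 70% of exposed carcasses are seen by searchers,
# 60% persist (are not scavenged) until the next search, and 80% of the
# fall zone is searchable.
print(estimated_fatalities(12, 0.70, 0.60, 0.80))  # about 35.7 fatalities
```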
NASA Technical Reports Server (NTRS)
Criswell, David R.
1990-01-01
Space Grant Colleges and Universities must build the space curriculum of the future on the firm basis of deep knowledge of, and involvement with, the present operating programs of the nation, and an ongoing and extensive program of leading-edge research in the aerospace sciences and engineering, management, law, finance, and the other arts that are integral to our planetary society. The Space Grant College and Fellowship Program must create new academic fields of enquiry, which is a long and difficult process that will require deeper and broader interaction between NASA and academia than has previously existed.
The Evolution of the U.S. Navy’s Maritime Strategy, 1977-1986
2003-01-01
making. It is rare to have as authoritative an account of the difficult, complex process of strategy making as that which Hattendorf produced within a ... draws a clear distinction between capabilities and requirements, and which uses the one to build on the other; that takes into account ... high degree of versatility in the form of a wider range of military and political action at a moderate increase in cost."[39] Within the navy, Sea Plan
Non-invasive monitoring of vascularization of grafted engineered human oral mucosa
NASA Astrophysics Data System (ADS)
Wolf, D. E.; Seetamraju, M.; Gurjar, R. S.; Kuo, R. S.; Fasi, A.; Feinberg, S. E.
2012-03-01
Accident victims and victims of explosive devices often suffer from complex maxillofacial injuries. The lips are one of the most difficult areas of the face to reconstruct after an avulsion. Lip avulsion results in compromised facial esthetics and functions of speech and mastication. The process of reconstruction requires assessment of the vascularization of grafted ex vivo engineered tissue while it is buried underneath the skin. We describe the design and animal testing of a hand-held surgical probe based upon diffuse correlation spectroscopy to assess vascularization.
A fast passive and planar liquid sample micromixer.
Melin, Jessica; Gimenéz, Guillem; Roxhed, Niclas; van der Wijngaart, Wouter; Stemme, Göran
2004-06-01
A novel microdevice for passively mixing liquid samples, based on surface tension and a geometric mixing chamber, is presented. Due to the laminar flow regime on the microscale, mixing becomes difficult if not impossible. We present a micromixer in which a constantly changing, time-dependent flow pattern is created inside a two-sample liquid plug as the plug simply passes through the planar mixer chamber. The device requires no actuation during mixing and is fabricated using a single-etch process. The effective mixing of two coloured liquid samples is demonstrated.
A novel variable baseline visibility detection system and its measurement method
NASA Astrophysics Data System (ADS)
Li, Meng; Jiang, Li-hui; Xiong, Xing-long; Zhang, Guizhong; Yao, JianQuan
2017-10-01
As an important meteorological observation instrument, the visibility meter helps ensure the safety of traffic operations. However, due to optical system contamination as well as sampling error, it is difficult for the accuracy and stability of the equipment to meet requirements in low-visibility environments. To address this, a novel measurement instrument based upon multiple baselines was designed; it essentially acts as an atmospheric transmission meter with a movable optical receiver, applying the weighted least squares method to process the signal. Theoretical analysis and experiments in a real atmospheric environment support this technique.
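Under standard assumptions (Beer-Lambert extinction and the Koschmieder relation), the multi-baseline idea can be sketched as a weighted least squares fit of the extinction coefficient to transmissions measured at several receiver positions; the measurements and weights below are illustrative, not the paper's data:

```python
# Sketch only: weighted least squares recovery of extinction from
# transmission measured at several baselines, assuming Beer-Lambert
# attenuation T(L) = exp(-sigma * L). Values are illustrative.
import numpy as np

baselines = np.array([10.0, 20.0, 30.0, 40.0])          # receiver positions, m
transmission = np.array([0.905, 0.818, 0.742, 0.668])   # measured T at each L
weights = 1.0 / np.array([0.02, 0.02, 0.03, 0.05]) ** 2 # 1 / variance

# ln T = -sigma * L  ->  weighted least squares for sigma (no intercept)
y = np.log(transmission)
sigma = -np.sum(weights * baselines * y) / np.sum(weights * baselines ** 2)

visibility = 3.912 / sigma   # Koschmieder relation, 2% contrast threshold
print(f"extinction = {sigma:.4f} 1/m, visibility = {visibility:.0f} m")
```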
Holmberg, D L
1979-05-01
Pyothorax is a serious disease process which requires both medical and surgical intervention. Late recognition, management problems, and likely recurrence make successful treatment difficult and often frustrating. Aims of therapy should be to avoid undue stress to the patient, to relieve respiratory distress by thoracocentesis, to eliminate infectious agents with antimicrobials, to remove pleural exudate, and to provide supportive care. Close monitoring of the patient is necessary to prevent iatrogenic complications such as pneumothorax, hemothorax, hypothermia, or hypoproteinemia. Exploratory thoracotomy for removal of granulomatous material and fibroelastic pleural "peels" is occasionally necessary to resolve compressive cardiopulmonary lesions.
NASA Astrophysics Data System (ADS)
Gisario, Annamaria; Barletta, Massimiliano; Venettacci, Simone; Veniali, Francesco
2015-06-01
Achieving sharp bending angles with small fillet radii on stainless steel sheets by mechanical bending requires sophisticated bending devices and troublesome operational procedures, which can involve expensive molds, huge presses, and large loads. In addition, springback is always difficult to control, often leading to final parts with limited precision and accuracy. In contrast, laser-assisted bending of metals is an emerging technology, as it often allows difficult and multifaceted manufacturing tasks to be performed with relatively small effort. In the present work, laser-assisted bending of stainless steel sheets to achieve sharp angles is therefore investigated. First, bending trials were performed by combining laser irradiation with an auxiliary bending device triggered by a pneumatic actuator and based on the kinematics of deformable quadrilaterals. Second, the laser operational parameters, that is, scanning speed, power, and number of passes, were varied to identify the most suitable processing settings. Bending angles and fillet radii were measured by a coordinate measuring machine. Experimental data were analyzed by combined ANalysis Of Means (ANOM) and ANalysis Of VAriance (ANOVA). Based on the experimental findings, the best strategy to produce an aircraft prototype part from a stainless steel sheet was designed and implemented.
Handling e-waste in developed and developing countries: initiatives, practices, and consequences.
Sthiannopkao, Suthipong; Wong, Ming Hung
2013-10-01
Discarded electronic goods contain a range of toxic materials requiring special handling. Developed countries have conventions, directives, and laws to regulate their disposal, most based on extended producer responsibility. Manufacturers take back items collected by retailers and local governments for safe destruction or recovery of materials. Compliance, however, is difficult to assure, and frequently runs against economic incentives. The expense of proper disposal leads to the shipment of large amounts of e-waste to China, India, Pakistan, Nigeria, and other developing countries. Shipment is often through middlemen, and under tariff classifications that make quantities difficult to assess. There, despite the intents of national regulations and hazardous waste laws, most e-waste is treated as general refuse, or crudely processed, often by burning or acid baths, with recovery of only a few materials of value. As dioxins, furans, and heavy metals are released, harm to the environment, workers, and area residents is inevitable. The faster growth of e-waste generated in the developing than in the developed world presages continued expansion of a pervasive and inexpensive informal processing sector, efficient in its own way, but inherently hazard-ridden. Copyright © 2012 Elsevier B.V. All rights reserved.
Software Would Largely Automate Design of Kalman Filter
NASA Technical Reports Server (NTRS)
Chuang, Jason C. H.; Negast, William J.
2005-01-01
Embedded Navigation Filter Automatic Designer (ENFAD) is a computer program being developed to automate the most difficult tasks in designing embedded software to implement a Kalman filter in a navigation system. The most difficult tasks are selection of the error states of the filter and tuning of the filter parameters, which are time-consuming trial-and-error tasks that require expertise and rarely yield optimum results. An optimum selection of error states and filter parameters depends on navigation-sensor and vehicle characteristics, and on filter processing time. ENFAD would include a simulation module that would incorporate all possible error states with respect to a given set of vehicle and sensor characteristics. The first of two iterative optimization loops would vary the selection of error states until the best filter performance was achieved in Monte Carlo simulations. For a fixed selection of error states, the second loop would vary the filter parameter values until an optimal performance value was obtained. Design constraints would be satisfied in the optimization loops. Users would supply vehicle and sensor test data that would be used to refine digital models in ENFAD. Filter processing time and filter accuracy would be computed by ENFAD.
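A minimal sketch of the kind of tuning loop ENFAD is described as automating (this is not ENFAD itself): for a fixed error-state selection, sweep a filter parameter over Monte Carlo runs and keep the value giving the lowest RMS error. The constant-velocity model and noise values are illustrative assumptions:

```python
# Sketch of Monte Carlo tuning of a Kalman filter's process noise.
# Not ENFAD; the 1-D constant-velocity model and values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
F = np.array([[1.0, 1.0], [0.0, 1.0]])   # constant-velocity model, dt = 1
H = np.array([[1.0, 0.0]])               # position-only measurement
R = np.array([[4.0]])                    # measurement noise variance

def run_filter(q, steps=200):
    """One simulated run; returns RMS position error for noise scale q."""
    Q = q * np.array([[0.25, 0.5], [0.5, 1.0]])   # CV process-noise shape
    x_true = np.zeros(2); x = np.zeros(2); P = np.eye(2) * 10.0
    err = []
    for _ in range(steps):
        x_true = F @ x_true + rng.multivariate_normal(np.zeros(2), Q)
        z = H @ x_true + rng.normal(0.0, np.sqrt(R[0, 0]), 1)
        x = F @ x; P = F @ P @ F.T + Q                      # predict
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)        # Kalman gain
        x = x + K @ (z - H @ x); P = (np.eye(2) - K @ H) @ P  # update
        err.append((x_true - x)[0] ** 2)
    return np.sqrt(np.mean(err))

# Outer tuning loop: average over Monte Carlo runs, keep the best q.
candidates = [0.001, 0.01, 0.1, 1.0]
rms = {q: np.mean([run_filter(q) for _ in range(20)]) for q in candidates}
print("best process-noise scale q:", min(rms, key=rms.get))
```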
Data Relationships: Towards a Conceptual Model of Scientific Data Catalogs
NASA Astrophysics Data System (ADS)
Hourcle, J. A.
2008-12-01
As the amount of data, types of processing, and storage formats increase, the total number of record permutations increases dramatically. The result is an overwhelming number of records that makes identifying the best data object to answer a user's needs more difficult. The issue is further complicated as each archive's data catalog may be designed around different concepts: anything from individual files to be served, to series of similarly generated and processed data, to something entirely different. Catalogs may not only be flat tables, but may be structured as multiple tables, with each table being a different data series, or as a normalized structure of the individual data files. Merging federated search results from archives with different catalog designs can create situations where the data object of interest is difficult to find due to an overwhelming number of seemingly similar or entirely unwanted records. We present a reference model for discussing data catalogs and the complex relationships between similar data objects. We show how the model can be used to improve scientists' ability to quickly identify the best data object for their purposes, and discuss the technical issues required to use this model in a federated system.
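The paper's actual reference model is not reproduced here, but the core idea of separating a logical data object from its many physical renditions can be sketched as follows (all class and field names are hypothetical, invented for illustration):

```python
# Hypothetical sketch: one logical observation with several physical
# renditions, so federated search can group records that carry the same
# science content in different formats or processing levels.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Rendition:
    """One concrete, servable object: a format + processing variant."""
    format: str            # e.g. "FITS", "netCDF"
    processing_level: str  # e.g. "raw", "calibrated"
    uri: str

@dataclass
class DataObject:
    """The logical observation; all renditions share its science content."""
    series: str
    start_time: str
    renditions: List[Rendition] = field(default_factory=list)

obs = DataObject("EUV_171A", "2008-10-01T00:00:00Z", [
    Rendition("FITS", "raw", "archive://a/raw/001.fits"),
    Rendition("FITS", "calibrated", "archive://a/cal/001.fits"),
])
# A federated search can now return one logical record with two choices,
# instead of two seemingly duplicate records.
print(obs.series, "->", [r.processing_level for r in obs.renditions])
```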
Cost-effectiveness analysis: problems and promise for evaluating medical technology
NASA Astrophysics Data System (ADS)
Juday, Timothy R.
1994-12-01
Although using limited financial resources in the most beneficial way is, in principle, a laudable goal, actually developing standards for measuring the cost-effectiveness of medical technologies and incorporating them into the coverage process is a much more difficult proposition. Important methodological difficulties include determining how to compare a technology to its leading alternative, defining costs, incorporating patient preferences, and defining health outcomes. In addition, more practical questions must be addressed: Who does the analysis? Who makes the decisions? Which technologies should be evaluated? What resources are required? What is the political and legal environment? The ultimate question that must be answered is: what is a health outcome worth? Cost-effectiveness analysis cannot answer this question; it only enables comparison of cost-effectiveness ratios across technologies. In order to determine whether a technology should be covered, society or individual insurers must determine how much they are willing to pay for the health benefits. Conducting cost-effectiveness analysis will not remove the need to make difficult resource allocation decisions; however, explicitly examining the tradeoffs involved in these decisions should help to improve the process.
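The core arithmetic of comparing a technology to its leading alternative is the incremental cost-effectiveness ratio (ICER); a minimal sketch with purely illustrative figures and an assumed willingness-to-pay threshold:

```python
# Sketch of the ICER comparison at the heart of cost-effectiveness
# analysis. All costs, effects, and the threshold are illustrative.

def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost per unit of health gained (e.g. per QALY)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

ratio = icer(cost_new=48_000, effect_new=6.3,    # QALYs with new technology
             cost_old=30_000, effect_old=5.8)    # QALYs with alternative
threshold = 50_000   # assumed societal willingness to pay per QALY
print(f"ICER = ${ratio:,.0f}/QALY ->",
      "adopt" if ratio <= threshold else "reject")
```

Note that the threshold itself is exactly the "what is a health outcome worth?" judgment the abstract says the analysis cannot supply.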
ISPATOM: A Generic Real-Time Data Processing Tool Without Programming
NASA Technical Reports Server (NTRS)
Dershowitz, Adam
2007-01-01
Information Sharing Protocol Advanced Tool of Math (ISPATOM) is an application program allowing for the streamlined generation of comps, which subscribe to streams of incoming telemetry data, perform any necessary computations on the data, and then send the data to other programs for display and/or further processing in NASA mission control centers. Heretofore, the development of comps was difficult, expensive, and time-consuming: each comp was custom written manually, in a low-level computing language, by a programmer attempting to follow the requirements of flight controllers. ISPATOM enables a flight controller who is not a programmer to write a comp by simply typing one or more equations at a command line or retrieving the equations from a text file. ISPATOM then subscribes to the necessary input data, performs all the necessary computations, and sends out the results. It sends out new results whenever the input data change. Using equations in ISPATOM is no more difficult than entering equations in a spreadsheet. The time involved in developing a comp is thus limited to the time taken to decide on the necessary equations. Thus, ISPATOM is a real-time dynamic calculator.
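A hedged sketch of the comp idea (not ISPATOM's actual implementation): a user-supplied equation string is compiled once, the comp tracks its named telemetry inputs, and a new result is published whenever an input changes. The telemetry names and the equation are invented for the example:

```python
# Sketch only: an equation-driven "comp" over streaming telemetry,
# in the spirit of the tool described above. Names are hypothetical.
import math

class Comp:
    def __init__(self, output_name: str, equation: str):
        self.output_name = output_name
        self.code = compile(equation, "<comp>", "eval")  # compile once
        self.values = {}

    def on_telemetry(self, name: str, value: float):
        """Called whenever a subscribed telemetry value changes."""
        self.values[name] = value
        try:
            result = eval(self.code, {"math": math}, dict(self.values))
        except NameError:
            return None      # not all inputs have arrived yet
        print(f"{self.output_name} = {result:.2f}")      # publish result
        return result

# Flight-controller-style usage: mean tank pressure from two sensors.
comp = Comp("TANK_P_TOTAL", "(p1 + p2) / 2.0")
comp.on_telemetry("p1", 2015.0)     # waits for p2
comp.on_telemetry("p2", 2003.0)     # -> TANK_P_TOTAL = 2009.00
```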
[Evidence-based Risk and Benefit Communication for Shared Decision Making].
Nakayama, Takeo
2018-01-01
Evidence-based medicine (EBM) can be defined as "the integration of the best research evidence with clinical expertise and a patient's unique values and circumstances". However, even with the best research evidence, many uncertainties can make clinical decisions difficult. As the social requirement of respecting patient values and preferences has been increasingly recognized, shared decision making (SDM) and consensus development between patients and clinicians have attracted attention. SDM is a process by which patients and clinicians make decisions and arrive at a consensus through interactive conversations and communications. During the process of SDM, patients and clinicians share information with each other on the goals they hope to achieve and responsibilities in meeting those goals. From the clinician's standpoint, information regarding the benefits and risks of potential treatment options based on current evidence and professional experience is provided to patients. From the patient's standpoint, information on personal values, preferences, and social roles is provided to clinicians. SDM is a sort of "wisdom" in the context of making autonomous decisions in uncertain, difficult situations through interactions and cooperation between patients and clinicians. Joint development of EBM and SDM will help facilitate patient-clinician relationships and improve the quality of healthcare.
ERIC Educational Resources Information Center
Maguire, Mandy J.; White, Joshua; Brier, Matthew R.
2011-01-01
Throughout middle-childhood, inhibitory processes, which underlie many higher order cognitive tasks, are developing. Little is known about how inhibitory processes change as a task becomes conceptually more difficult during these important years. In adults, as Go/NoGo tasks become more difficult there is a systematic decrease in the P3 NoGo…
Elements of Style and an Advanced ESL Student: The Case of Jun Shan Zhang.
ERIC Educational Resources Information Center
Ryan, Patrick
Despite educators' efforts to understand the process of composition, writing remains a mercurial process difficult to see or describe, even partially. Writing is a process even more difficult to grasp when the writer is possessed of a language--Chinese, for example--and must rely on that language to take possession of and write in a second…
Targeting polyamine metabolism for cancer therapy and prevention
Murray-Stewart, Tracy R.; Woster, Patrick M.; Casero, Robert A.
2017-01-01
The chemically simple, biologically complex eukaryotic polyamines, spermidine and spermine, are positively charged alkylamines involved in many crucial cellular processes. Along with their diamine precursor putrescine, their normally high intracellular concentrations require fine attenuation by multiple regulatory mechanisms to keep these essential molecules within strict physiologic ranges. Since the metabolism of and requirement for polyamines are frequently dysregulated in neoplastic disease, the metabolic pathway and functions of polyamines provide rational drug targets; however, these targets have been difficult to exploit for chemotherapy. It is the goal of this article to review the latest findings in the field that demonstrate the potential utility of targeting the metabolism and function of polyamines as strategies for both chemotherapy and, possibly more importantly, chemoprevention. PMID:27679855
Improvement of CFD Methods for Modeling Full Scale Circulating Fluidized Bed Combustion Systems
NASA Astrophysics Data System (ADS)
Shah, Srujal; Klajny, Marcin; Myöhänen, Kari; Hyppänen, Timo
With the currently available methods of computational fluid dynamics (CFD), the task of simulating full-scale circulating fluidized bed combustors is very challenging. In order to simulate the complex fluidization process, the calculation cells should be small and the calculation should be transient with a small time step. For full-scale systems, these requirements lead to very large meshes and very long calculation times, making simulation difficult in practice. This study investigates the cell size and time step size required for accurate simulations, and the filtering effects caused by a coarser mesh and a longer time step. A modeling study of a full-scale CFB furnace is presented and the model results are compared with experimental data.
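One way to make the cell-size/time-step coupling concrete is the convective Courant (CFL) number, which explicit transient solvers typically keep near or below unity; a minimal sketch with assumed riser values:

```python
# Sketch of a CFL check linking cell size and time step in a transient
# fluidization simulation. Velocity and grid values are assumptions.
def cfl_number(velocity: float, dt: float, dx: float) -> float:
    """Convective Courant number u*dt/dx."""
    return velocity * dt / dx

u = 5.0        # m/s, assumed gas velocity in a CFB riser
dx = 0.02      # m, cell size
dt = 0.002     # s, time step
c = cfl_number(u, dt, dx)
print(f"CFL = {c:.2f}", "(ok)" if c <= 1.0 else "(refine dt or coarsen dx)")
```

Halving the cell size while holding the CFL number fixed also halves the admissible time step, which is why full-scale meshes drive calculation times up so quickly.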
[Rehabilitative measures in hearing-impaired children].
von Wedel, H; von Wedel, U C; Zorowka, P
1991-12-01
On the basis of certain fundamental data on the maturation processes of the central auditory pathways in early childhood, the importance of early intervention with hearing aids is discussed and emphasized. Pathological hearing, that is, acoustic deprivation in early childhood, will influence the maturation process. Speech development is very often delayed if diagnosis and therapy or rehabilitation come too late. Anamnesis, early diagnosis, and clinical differential diagnosis are required before a hearing aid can be fitted. Selection criteria and adjustment parameters are discussed, showing that the hearing aid fitting procedure must be embedded in a complex matrix of requirements related to the development of speech as well as to the cognitive, emotional, and social development of the child. As a rule, finding and preparing the "best" hearing aids for a child (binaural fitting is obligatory) is a long and often difficult process, which can only be performed by specialists in pedo-audiology. After the binaural fitting of hearing aids, intensive hearing and speech education in close cooperation between parents, pedo-audiologist, and teacher must support the whole development of the child.
Phase editing as a signal pre-processing step for automated bearing fault detection
NASA Astrophysics Data System (ADS)
Barbini, L.; Ompusunggu, A. P.; Hillis, A. J.; du Bois, J. L.; Bartic, A.
2017-07-01
Scheduled maintenance and inspection of bearing elements in industrial machinery contribute significantly to operating costs. Savings can be made through automatic vibration-based damage detection and prognostics, permitting condition-based maintenance. However, automation of the detection process is difficult due to the complexity of vibration signals in realistic operating environments. The sensitivity of existing methods to the choice of parameters imposes a requirement for oversight from a skilled operator. This paper presents a novel approach to the removal of unwanted vibrational components from the signal: phase editing. The approach uses a computationally efficient full-band demodulation and requires very little oversight. Its effectiveness is tested on experimental data sets from three different test rigs, and comparisons are made with two state-of-the-art processing techniques: spectral kurtosis and cepstral pre-whitening. The results from the phase-editing technique show a 10% improvement in damage detection rates compared to the state of the art while simultaneously improving on the degree of automation. This outcome represents a significant contribution in the pursuit of fully automatic fault detection.
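The paper's phase-editing algorithm is not reproduced here; as a grounded illustration of the spectral pre-processing family it is compared against, the sketch below implements cepstral pre-whitening, one of the two named baseline techniques: flattening the magnitude spectrum while keeping the phase suppresses strong periodic tones and makes impulsive bearing signatures stand out. The toy signal is illustrative:

```python
# Sketch of cepstral pre-whitening (a comparison method named above,
# not the paper's phase-editing algorithm). Toy signal is illustrative.
import numpy as np

def cepstral_prewhitening(x: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    """Return IFFT(FFT(x) / |FFT(x)|): unit magnitude, original phase."""
    X = np.fft.fft(x)
    return np.real(np.fft.ifft(X / (np.abs(X) + eps)))

# Toy signal: strong 50 Hz tone masking a weak repetitive impulse train.
fs = 10_000
t = np.arange(0, 1.0, 1.0 / fs)
tone = 5.0 * np.sin(2 * np.pi * 50 * t)
impulses = np.zeros_like(t); impulses[::123] = 1.0   # "bearing" impacts
x = tone + impulses + 0.1 * np.random.default_rng(1).standard_normal(t.size)

y = cepstral_prewhitening(x)
f50 = 50  # bin index equals frequency in Hz for a 1 s record
for name, sig in (("before", x), ("after", y)):
    spec = np.abs(np.fft.rfft(sig))
    print(name, "50 Hz tone vs. median spectrum:",
          round(float(spec[f50] / np.median(spec)), 1))
```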
Technology Infusion Challenges from a Decision Support Perspective
NASA Technical Reports Server (NTRS)
Adumitroaie, V.; Weisbin, C. R.
2009-01-01
In a restricted science budget environment with increasingly numerous required technology developments, technology investment decisions within NASA are objectively more and more difficult to make in a way that satisfies the technical objectives and all the organizational constraints. Under these conditions, it is rational to build an investment portfolio with the highest possible technology infusion rate. Arguably, the path to infusion is subject to many influencing factors, but here only the challenges associated with the very initial stages are addressed: defining the needs and the subsequent investment decision-support process. It is conceivable that decision consistency, and possibly its quality, suffer when the decision-making process has limited or no traceability. This paper presents a structured decision-support framework aiming to provide traceable, auditable, infusion-driven recommendations for a selection process in which these recommendations are used as reference points in further discussions among stakeholders. In this framework, addressing well-defined requirements, different measures of success can be defined based on traceability to specific selection criteria. As a direct result, even with simplified decision models, the likelihood of infusion can be probed and consequently improved.
NASA Astrophysics Data System (ADS)
Hibbard-Lubow, David Luke
The demands on digital memory have increased exponentially in recent history, requiring faster, smaller, and more accurate storage methods. Two promising solutions to this ever-present problem are Bit Patterned Media (BPM) and Spin-Transfer Torque Magnetic Random Access Memory (STT-MRAM). Producing these technologies requires difficult and expensive fabrication techniques, so the production processes must be optimized to allow these storage methods to compete commercially while continuing to increase their information storage density and reliability. I developed a process for the production of nanomagnetic devices (which can take the form of several types of digital memory) embedded in thin silicon nitride films. My focus was on optimizing the reactive ion etching recipe required to embed the device in the film. Ultimately, I found that recipe 37 (power: 250 W; CF4 nominal/actual flow rate: 25/25.4 sccm; O2 nominal/actual flow rate: 3.1/5.2 sccm; maximum pressure around 400 mTorr) gave the most repeatable and anisotropic results. I successfully used the processes described in this thesis to make embedded nanomagnets, which could be used as bit patterned media. Another promising application of this work is to make embedded magnetic tunneling junctions, which are the storage medium used in MRAM. Doing so will still require some tweaks to the fabrication methods. Techniques for making these changes and their potential effects are discussed.
Vacuum Brazing of Accelerator Components
NASA Astrophysics Data System (ADS)
Singh, Rajvir; Pant, K. K.; Lal, Shankar; Yadav, D. P.; Garg, S. R.; Raghuvanshi, V. K.; Mundra, G.
2012-11-01
Commonly used materials for accelerator components are those which are vacuum compatible and thermally conductive. Stainless steel, aluminum, and copper are common among them. Stainless steel is a poor heat conductor and is not commonly used where good thermal conductivity is required. Aluminum and copper and their alloys meet the above requirements and are frequently used for this purpose. Fabricating accelerator components from aluminum and its alloys by welding has become common practice nowadays. The use of copper and its various grades is mandatory in the RF devices required for accelerators. Beam line and front-end components of the accelerators are fabricated from stainless steel and OFHC copper. Fabricating copper components by welding is very difficult and in most cases impossible. Fabrication and joining in such cases is possible using a brazing process, especially under vacuum or an inert gas atmosphere. Several accelerator components have been vacuum brazed for the Indus projects at Raja Ramanna Centre for Advanced Technology (RRCAT), Indore, using the vacuum brazing facility available there. This paper presents details of the development of the above-mentioned high-value, strategic components and assemblies. It includes the basics of vacuum brazing, details of the vacuum brazing facility, joint design, fixturing of the jobs, selection of filler alloys, and optimization of brazing parameters to obtain high-quality brazed joints, along with brief descriptions of vacuum-brazed accelerator components.
Interfering with free recall of words: Detrimental effects of phonological competition.
Fernandes, Myra A; Wammes, Jeffrey D; Priselac, Sandra; Moscovitch, Morris
2016-09-01
We examined the effect of different distracting tasks, performed concurrently during memory retrieval, on recall of a list of words. By manipulating the type of material and processing (semantic, orthographic, and phonological) required in the distracting task, and comparing the magnitude of memory interference produced, we aimed to infer the kind of representation upon which retrieval of words depends. In Experiment 1, identifying odd digits concurrently during free recall disrupted memory, relative to a full-attention condition, when the numbers were presented orthographically (e.g., nineteen), but not numerically (e.g., 19). In Experiment 2, a distracting task that required phonologically based decisions about either word or picture material produced large but equivalent effects on recall of words. In Experiment 3, phonologically based decisions about pictures in a distracting task disrupted recall more than when the same pictures required semantically based size estimations. In Experiment 4, a distracting task that required syllable decisions about line drawings interfered significantly with recall, while an equally difficult semantically based color-decision task about the same line drawings did not. Together, these experiments demonstrate that the degree of memory interference experienced during recall of words depends primarily on whether the distracting task competes for phonological representations or processes, and less on competition for semantic, orthographic, or material-specific representations or processes. Copyright © 2016 Elsevier Ltd. All rights reserved.
Advancement of CMOS Doping Technology in an External Development Framework
NASA Astrophysics Data System (ADS)
Jain, Amitabh; Chambers, James J.; Shaw, Judy B.
2011-01-01
The consumer appetite for a rich multimedia experience drives technology development for mobile hand-held devices and the infrastructure to support them. Enhancements in functionality, speed, and user experience are derived from advancements in CMOS technology. The technical challenges in developing each successive CMOS technology node to support these enhancements have become increasingly difficult. These trends have motivated the CMOS business towards a collaborative approach based on strategic partnerships. This paper describes our model and experience of CMOS development, based on multi-dimensional industrial and academic partnerships. We provide to our process equipment, materials, and simulation partners, as well as to our silicon foundry partners, the detailed requirements for future integrated circuit products. This is done very early in the development cycle to ensure that these requirements can be met. In order to determine these fundamental requirements, we rely on a strategy that requires strong interaction between process and device simulation, physical and chemical analytical methods, and research at academic institutions. This learning is shared with each project partner to address integration and manufacturing issues encountered during CMOS technology development from its inception through product ramp. We utilize TI's core strengths in physical analysis, unit processes and integration, yield ramp, reliability, and product engineering to support this technological development. Finally, this paper presents examples of the advancement of CMOS doping technology for the 28 nm node and beyond through this development model.
Methods for Decontamination of a Bipropellant Propulsion System
NASA Technical Reports Server (NTRS)
McClure, Mark B.; Greene, Benjamin
2012-01-01
Most propulsion systems are designed to be filled and flown; draining can be done, but decontamination may be difficult. Transport of these systems may also be difficult because flight-weight vessels are not designed around DOT or UN shipping requirements. Repairs, failure analysis work, or post-firing inspections may be difficult or impossible to perform due to the hazards of residual propellants being present.
Experimental Replication of an Aeroengine Combustion Instability
NASA Technical Reports Server (NTRS)
Cohen, J. M.; Hibshman, J. R.; Proscia, W.; Rosfjord, T. J.; Wake, B. E.; McVey, J. B.; Lovett, J.; Ondas, M.; DeLaat, J.; Breisacher, K.
2000-01-01
Combustion instabilities in gas turbine engines are most frequently encountered during the late phases of engine development, at which point they are difficult and expensive to fix. The ability to replicate an engine-traceable combustion instability in a laboratory-scale experiment offers the opportunity to economically diagnose the problem (to determine the root cause), and to investigate solutions to the problem, such as active control. The development and validation of active combustion instability control requires that the causal dynamic processes be reproduced in experimental test facilities which can be used as a test bed for control system evaluation. This paper discusses the process through which a laboratory-scale experiment was designed to replicate an instability observed in a developmental engine. The scaling process used physically-based analyses to preserve the relevant geometric, acoustic and thermo-fluid features. The process increases the probability that results achieved in the single-nozzle experiment will be scalable to the engine.
Using AI and Semantic Web Technologies to attack Process Complexity in Open Systems
NASA Astrophysics Data System (ADS)
Thompson, Simon; Giles, Nick; Li, Yang; Gharib, Hamid; Nguyen, Thuc Duong
Recently many vendors and groups have advocated using BPEL and WS-BPEL as a workflow language to encapsulate business logic. While encapsulating workflow and process logic in one place is a sensible architectural decision the implementation of complex workflows suffers from the same problems that made managing and maintaining hierarchical procedural programs difficult. BPEL lacks constructs for logical modularity such as the requirements construct from the STL [12] or the ability to adapt constructs like pure abstract classes for the same purpose. We describe a system that uses semantic web and agent concepts to implement an abstraction layer for BPEL based on the notion of Goals and service typing. AI planning was used to enable process engineers to create and validate systems that used services and goals as first class concepts and compiled processes at run time for execution.
Schizophrenia as a human process.
Corradi, Richard B
2011-01-01
The patient with schizophrenia often appears to be living in an alien world, one of strange voices, bizarre beliefs, and disorganized speech and behavior. It is difficult to empathize with someone suffering from symptoms so remote from one's ordinary experience. However, examination of the disorder reveals not only symptoms of the psychosis itself but also an intensely human struggle against the disintegration of personality it can produce. Furthermore, examination of the individual's attempts to cope with a devastating psychotic process reveals familiar psychodynamic processes and defense mechanisms, however unsuccessful they may be. Knowing that behind the seemingly alien diagnostic features of schizophrenia is a person attempting to preserve his or her self-identity puts a human face on the illness. This article utilizes clinical material to describe some of the psychodynamic processes of schizophrenia. Its purpose is to facilitate understanding of an illness that requires comprehensive biopsychosocial treatment in which a therapeutic doctor-patient relationship is as necessary as antipsychotic medication.
Parallel processing architecture for H.264 deblocking filter on multi-core platforms
NASA Astrophysics Data System (ADS)
Prasad, Durga P.; Sonachalam, Sekar; Kunchamwar, Mangesh K.; Gunupudi, Nageswara Rao
2012-03-01
Massively parallel computing (multi-core) chips offer outstanding new solutions that satisfy the increasing demand for high-resolution, high-quality video compression technologies such as H.264. Such solutions provide not only exceptional quality but also efficiency, low power, and low latency, previously unattainable in software-based designs. While custom hardware and Application Specific Integrated Circuit (ASIC) technologies may achieve low latency, low power, and real-time performance in some consumer devices, many applications require a flexible and scalable software-defined solution. The deblocking filter in an H.264 encoder/decoder poses difficult implementation challenges because of heavy data dependencies and the conditional nature of the computations. Deblocking filter implementations tend to be fixed and difficult to reconfigure for different needs, and the ability to scale up for higher quality requirements, such as 10-bit pixel depth or a 4:2:2 chroma format, often reduces the throughput of a parallel architecture designed for a lower feature set. A scalable architecture for deblocking filtering, created with a massively parallel processor based solution, means that the same encoder or decoder can be deployed in a variety of applications, at different video resolutions, for different power requirements, and at higher bit depths and richer chroma subsampling patterns such as 4:2:2 or 4:4:4. Low-power, software-defined encoders/decoders may be implemented using a massively parallel processor array, like that found in HyperX technology, with 100 or more cores and distributed memory. The large number of processor elements allows the silicon device to operate more efficiently than conventional DSP or CPU technology, and this software programming model for massively parallel processors offers a flexible implementation and a power efficiency close to that of ASIC solutions. This work describes a scalable parallel architecture for an H.264-compliant deblocking filter for multi-core platforms such as HyperX technology. Parallel techniques, such as parallel processing at the level of independent macroblocks, sub-blocks, and pixel rows, are examined in this work. The deblocking architecture consists of a basic cell called the deblocking filter unit (DFU) and a dependent data buffer manager (DFM). The DFU can be instantiated multiple times, catering to different performance needs; the DFM serves the data required by the different numbers of DFUs and also manages all the neighboring data required for future data processing by the DFUs. This approach achieves the scalability, flexibility, and performance excellence required in deblocking filters.
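The macroblock-level parallelism mentioned above can be sketched as a wavefront schedule: macroblock (i, j) depends on its left and top neighbours, so every block on a given anti-diagonal of the frame can be filtered concurrently. This is a dependency-pattern illustration only, not the HyperX implementation:

```python
# Sketch of wavefront-parallel macroblock processing: each anti-diagonal
# only depends on earlier diagonals, so its blocks can run concurrently.
# Illustration of the dependency pattern, not a real deblocking filter.
from concurrent.futures import ThreadPoolExecutor

ROWS, COLS = 4, 6   # macroblock grid (illustrative size)

def filter_macroblock(i: int, j: int):
    # Placeholder for the per-macroblock deblocking work of one DFU.
    return (i, j)

with ThreadPoolExecutor() as pool:
    for d in range(ROWS + COLS - 1):              # anti-diagonal index
        wave = [(i, d - i) for i in range(ROWS) if 0 <= d - i < COLS]
        # Left/top neighbours of every block in `wave` lie on earlier
        # diagonals, so the whole wave can be filtered in parallel.
        done = list(pool.map(lambda ij: filter_macroblock(*ij), wave))
        print(f"wave {d}: {done}")
```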
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Shijun, E-mail: sj-xie@163.com; State Key Laboratory of Control and Simulation of Power System and Generation Equipment, Department of Electrical Engineering, Tsinghua University, Beijing 100084; Zeng, Rong
2015-08-15
Natural lightning flashes are stochastic and uncontrollable, and thus it is difficult to observe the formation process of a downward negative stepped leader (NSL) directly and in detail. This situation has led to some dispute over the actual NSL formation mechanism, and thus has hindered improvements in the lightning shielding analysis model. In this paper, on the basis of controllable long air gap discharge experiments, the formation conditions required for NSLs in negative flashes have been studied. First, a series of simulation experiments on varying scales were designed and carried out. The NSL formation processes were observed, and several of the characteristic process parameters, including the scale, the propagation velocity, and the dark period, were obtained. By comparing the acquired formation processes and the characteristic parameters with those in natural lightning flashes, the similarity between the NSLs in the simulation experiments and those in natural flashes was proved. Then, based on the local thermodynamic equation and the space charge estimation method, the required NSL formation conditions were deduced, and the space background electric field (E_b) was proposed as the primary parameter for NSL formation. Finally, the critical value of E_b required for the formation of NSLs in natural flashes was determined to be approximately 75 kV/m by extrapolation of the results of the simulation experiments.
Explicit solution techniques for impact with contact constraints
NASA Technical Reports Server (NTRS)
Mccarty, Robert E.
1993-01-01
Modern military aircraft transparency systems, windshields and canopies, are complex systems which must meet a large and rapidly growing number of requirements. Many of these transparency system requirements are conflicting, presenting difficult balances which must be achieved. One example of a challenging requirements balance or trade is shaping for stealth versus aircrew vision. The large number of requirements involved may be grouped in a variety of areas including man-machine interface; structural integration with the airframe; combat hazards; environmental exposures; and supportability. Some individual requirements by themselves pose very difficult, severely nonlinear analysis problems. One such complex problem is that associated with the dynamic structural response resulting from high energy bird impact. An improved analytical capability for soft-body impact simulation was developed.
Aspects of the Teaching of Russian.
ERIC Educational Resources Information Center
Baker, Robert L.
The process of learning Russian should be no more difficult than the process of learning other languages although it may take somewhat longer. The phonetic system should not present major difficulties with respect to individual sounds, but intonation may be difficult because Russian pitch patterns represent different intentions and emotions than…
Modeling Business Processes of the Social Insurance Fund in Information System Runa WFE
NASA Astrophysics Data System (ADS)
Kataev, M. Yu; Bulysheva, L. A.; Xu, Li D.; Loseva, N. V.
2016-08-01
Introduction: Business processes are gradually becoming a tool that allows organizations to engage employees at a new level and to make document management systems more efficient; most published work lies in these directions. However, business processes are still poorly implemented in public institutions, where the main existing processes are very difficult to formalize. We attempt to build a system of business processes for a state agency, the Russian Social Insurance Fund (SIF), where virtually all processes, given different inputs, have the same output: a public service. The parameters of state services (as a rule, time limits) are set by state laws and regulations. The article provides a brief overview of the SIF, formulates requirements for its business processes, justifies the choice of software for modeling business processes, builds working models in the Runa WFE system, and optimizes the model of one of the SIF's main business processes. The result is an optimized model of an SIF business process in Runa WFE.
Similarity-Dissimilarity Competition in Disjunctive Classification Tasks
Mathy, Fabien; Haladjian, Harry H.; Laurent, Eric; Goldstone, Robert L.
2013-01-01
Typical disjunctive artificial classification tasks require participants to sort stimuli according to rules such as “x likes cars only when black and coupe OR white and SUV.” For categories like this, increasing the salience of the diagnostic dimensions has two simultaneous effects: increasing the distance between members of the same category and increasing the distance between members of opposite categories. Potentially, these two effects respectively hinder and facilitate classification learning, leading to competing predictions: increased salience may lead members of the same category to be considered less similar, while members of separate categories may be considered more dissimilar. This implies a similarity-dissimilarity competition between two basic classification processes. When focusing on sub-category similarity, one would expect classification to be more difficult when members of the same category become less similar (disregarding the increase in between-category dissimilarity); the increase in between-category dissimilarity, however, predicts easier classification. Our categorization study suggests that participants rely more on dissimilarities between opposite categories than on similarities within sub-categories. We connect our results to rule- and exemplar-based classification models. The pattern of influences of within- and between-category similarities is challenging for simple single-process categorization systems based on rules or exemplars; instead, our results suggest that either these processes should be integrated in a hybrid model, or that category learning operates by forming clusters within each category. PMID:23403979
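A toy numerical illustration of the competition described above, under the assumption that stimuli are points in a feature space and salience acts as a multiplicative stretch of the diagnostic dimensions: within-category and between-category distances grow together, so the net effect on difficulty depends on which process dominates.

```python
# Stimuli as feature vectors; salience multiplies the diagnostic dimensions.
import itertools
import numpy as np

A = np.array([[0.0, 0.0], [1.0, 1.0]])  # e.g. black-coupe, white-SUV
B = np.array([[0.0, 1.0], [1.0, 0.0]])  # e.g. black-SUV,  white-coupe

def mean_dist(X, Y):
    # Self-pairs contribute zeros in the within case; only the scaling matters here.
    return np.mean([np.linalg.norm(x - y) for x, y in itertools.product(X, Y)])

for salience in (1.0, 2.0, 4.0):
    As, Bs = A * salience, B * salience
    within = (mean_dist(As, As) + mean_dist(Bs, Bs)) / 2
    between = mean_dist(As, Bs)
    print(f"salience {salience}: within {within:.2f}, between {between:.2f}")
```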
Random numbers certified by Bell's theorem.
Pironio, S; Acín, A; Massar, S; de la Giroday, A Boyer; Matsukevich, D N; Maunz, P; Olmschenk, S; Hayes, D; Luo, L; Manning, T A; Monroe, C
2010-04-15
Randomness is a fundamental feature of nature and a valuable resource for applications ranging from cryptography and gambling to numerical simulation of physical and biological systems. Random numbers, however, are difficult to characterize mathematically, and their generation must rely on an unpredictable physical process. Inaccuracies in the theoretical modelling of such processes or failures of the devices, possibly due to adversarial attacks, limit the reliability of random number generators in ways that are difficult to control and detect. Here, inspired by earlier work on non-locality-based and device-independent quantum information processing, we show that the non-local correlations of entangled quantum particles can be used to certify the presence of genuine randomness. It is thereby possible to design a cryptographically secure random number generator that does not require any assumption about the internal working of the device. Such a strong form of randomness generation is impossible classically and possible in quantum systems only if certified by a Bell inequality violation. We carry out a proof-of-concept demonstration of this proposal in a system of two entangled atoms separated by approximately one metre. The observed Bell inequality violation, featuring near perfect detection efficiency, guarantees that 42 new random numbers are generated with 99 per cent confidence. Our results lay the groundwork for future device-independent quantum information experiments and for addressing fundamental issues raised by the intrinsic randomness of quantum theory.
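For orientation, a small sketch of the quantity a Bell certificate rests on: the CHSH value S, which any classical (local hidden variable) model bounds by 2. The numbers below are the ideal singlet-state correlations at the standard optimal angles, not data from the ion-trap experiment described above.

```python
# CHSH value from two-setting correlations; S > 2 rules out classical models.
import math

def E(theta_a, theta_b):
    """Singlet-state correlation for analyser angles theta_a, theta_b."""
    return -math.cos(theta_a - theta_b)

a, a2 = 0.0, math.pi / 2                # Alice's two settings
b, b2 = math.pi / 4, 3 * math.pi / 4    # Bob's two settings

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(f"S = {S:.3f} (classical bound 2, Tsirelson bound {2 * math.sqrt(2):.3f})")
```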
Panayi, Efstathios; Peters, Gareth W; Kyriakides, George
2017-01-01
Quantifying the effects of environmental factors over the duration of the growing process on Agaricus Bisporus (button mushroom) yields has been difficult, as common functional data analysis approaches require fixed length functional data. The data available from commercial growers, however, is of variable duration, due to commercial considerations. We employ a recently proposed regression technique termed Variable-Domain Functional Regression in order to be able to accommodate these irregular-length datasets. In this way, we are able to quantify the contribution of covariates such as temperature, humidity and water spraying volumes across the growing process, and for different lengths of growing processes. Our results indicate that optimal oxygen and temperature levels vary across the growing cycle and we propose environmental schedules for these covariates to optimise overall yields.
A generalized framework for nucleosynthesis calculations
NASA Astrophysics Data System (ADS)
Sprouse, Trevor; Mumpower, Matthew; Aprahamian, Ani
2014-09-01
Simulating astrophysical events is a difficult process requiring a detailed pairing of knowledge from both astrophysics and nuclear physics: astrophysics guides the thermodynamic evolution of the event, while nuclear physics governs how the nuclear species transmute under those conditions. We present a nucleosynthesis framework, written in Fortran, that takes a thermodynamic evolution and nuclear data as inputs and time-evolves the abundances of nuclear species. Through our coding practices, we have emphasized the applicability of the framework to any astrophysical event, including those involving nuclear fission. Because these calculations are often very complicated, the framework dynamically optimizes itself based on the conditions at each time step, greatly reducing total computation time. To highlight the power of this approach, we demonstrate the use of the framework to simulate both Big Bang nucleosynthesis and r-process nucleosynthesis at speeds competitive with current solutions dedicated to either process alone.
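A minimal sketch of the core computation such a framework performs: integrating abundance equations dY_i/dt along a prescribed thermodynamic trajectory with a stiff ODE solver. The two-reaction chain and its temperature-dependent rates are invented stand-ins for real nuclear data.

```python
# Toy abundance network A -> B -> C evolved along a cooling trajectory.
import numpy as np
from scipy.integrate import solve_ivp

def temperature(t):
    return 5.0 * np.exp(-t / 2.0)        # prescribed cooling, arbitrary units

def rhs(t, Y):
    T = temperature(t)
    lam1, lam2 = 1.0 * T, 0.3 * T        # hypothetical T-dependent rates
    dA = -lam1 * Y[0]
    dB = lam1 * Y[0] - lam2 * Y[1]
    dC = lam2 * Y[1]
    return [dA, dB, dC]

# BDF handles the stiffness typical of reaction networks.
sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0, 0.0], method="BDF", rtol=1e-8)
print("final abundances:", sol.y[:, -1])   # conserved: components sum to 1
```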
[Clinical application of mass spectrometry in the pediatric field: current topics].
Yamaguchi, Seiji
2013-09-01
Mass spectrometry, including tandem mass spectrometry (MS/MS) and gas chromatography-mass spectrometry (GC/MS), is becoming prominent in the diagnosis of metabolic disorders in the pediatric field. It enables biochemical diagnosis of metabolic disorders from the metabolic profiles obtained by MS/MS and/or GC/MS. In neonatal mass screening for inherited metabolic disease (IMD) using MS/MS, amino acids and acylcarnitines on dried blood spots are analyzed. The target diseases include amino acidemia, urea cycle disorder, organic acidemia, and fatty acid oxidation disorder. In MS/MS screening, organic acid analysis using GC/MS is required for differential and/or definitive diagnosis of the IMDs. GC/MS data processing, however, is difficult, and metabolic diagnosis often requires considerable skill and expertise. We developed an automated system for GC/MS data processing and autodiagnosis, which made biochemical diagnosis using GC/MS markedly easier and more user-friendly. Mass spectrometric techniques will expand from research laboratories to clinical laboratories in the near future.
He, Ying; Liang, Bin; Yang, Jun; Li, Shunzhi; He, Jin
2017-08-11
The Iterative Closest Point (ICP) algorithm is the mainstream algorithm used for accurate registration of 3D point cloud data. The algorithm requires a proper initial value and an approximate registration of the two point clouds to prevent it from falling into local extrema, but in actual point cloud matching it is difficult to ensure compliance with this requirement. In this paper, we propose an ICP algorithm based on point cloud features (GF-ICP). This method uses geometric features of the point clouds to be registered, such as curvature, surface normal, and point cloud density, to search for correspondences between the two point clouds, and introduces these features into the error function to realize accurate registration. The experimental results showed that the algorithm can improve the convergence speed and widen the convergence basin without requiring a proper initial value.
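A sketch of one feature-assisted ICP iteration in the spirit of GF-ICP, assuming a simple curvature proxy (surface variation from local covariance eigenvalues) folded into the correspondence cost; the weighting, rejection threshold, and feature choice are illustrative, not the paper's exact formulation.

```python
# One feature-assisted ICP iteration: correspondences scored by distance
# plus a curvature-like feature, then a closed-form rigid fit (Kabsch).
import numpy as np
from scipy.spatial import cKDTree

def surface_variation(P, k=8):
    """Crude curvature proxy: smallest-eigenvalue share of local covariance."""
    _, idx = cKDTree(P).query(P, k=k)
    f = np.empty(len(P))
    for i, nb in enumerate(idx):
        w = np.linalg.eigvalsh(np.cov(P[nb].T))
        f[i] = w[0] / w.sum()
    return f

def icp_step(src, dst, alpha=0.5):
    tree = cKDTree(dst)
    f_src, f_dst = surface_variation(src), surface_variation(dst)
    d_geo, nn = tree.query(src)                        # nearest neighbours
    cost = d_geo + alpha * np.abs(f_src - f_dst[nn])   # combined error term
    keep = cost < np.percentile(cost, 70)              # reject poor matches
    P, Q = src[keep], dst[nn[keep]]
    Pc, Qc = P - P.mean(0), Q - Q.mean(0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)                # Kabsch alignment
    R = (U @ Vt).T
    if np.linalg.det(R) < 0:                           # avoid reflections
        Vt[-1] *= -1
        R = (U @ Vt).T
    t = Q.mean(0) - P.mean(0) @ R.T
    return src @ R.T + t

rng = np.random.default_rng(0)
dst = rng.standard_normal((200, 3))
th = 0.3
R0 = np.array([[np.cos(th), -np.sin(th), 0], [np.sin(th), np.cos(th), 0], [0, 0, 1]])
src = dst @ R0.T + np.array([0.1, -0.2, 0.05])         # misaligned copy
for _ in range(10):
    src = icp_step(src, dst)
```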
Jäger, B
1983-09-01
The technology of composting must guarantee the material-chemical, biological, and physical-technical reaction conditions essential for the rotting process; the constituents of the input material and the C/N ratio play an important role here. Maintaining optimum decomposition conditions is made difficult by the fact that the physical-technical reaction parameters partly exclude each other: optimum humidity, adequate air/oxygen supply, a large active surface, and a loose structure with sufficient decomposition volume. The processing of raw refuse required to maintain the physical-technical reaction parameters can be carried out either by the conventional method of preliminary fragmentizing, sieving, and mixing, or in conjunction with separating recycling in adapted systems. The latter procedure avoids some drawbacks that mainly result from the high expenditure required for preliminary fragmentation of the raw refuse. Moreover, presorting affords the possibility of reducing the heavy-metal content of the organic composting fraction, approaching a solution to the noxious-substance disposal problem which at present stands in the way of composting being accepted as an ecological waste disposal method.
Inter-hemispheric interaction facilitates face processing.
Compton, Rebecca J
2002-01-01
Many recent studies have revealed that interaction between the left and right cerebral hemispheres can aid in task performance, but these studies have tended to examine perception of simple stimuli such as letters, digits or simple shapes, which may have limited naturalistic validity. The present study extends these prior findings to a more naturalistic face perception task. Matching tasks required subjects to indicate when a target face matched one of two probe faces. Matches could be either across-field, requiring inter-hemispheric interaction, or within-field, not requiring inter-hemispheric interaction. Subjects indicated when faces matched in emotional expression (Experiment 1; n=32) or in character identity (Experiment 2; n=32). In both experiments, across-field performance was significantly better than within-field performance, supporting the primary hypothesis. Further, this advantage was greater for the more difficult character identity task. Results offer qualified support for the hypothesis that inter-hemispheric interaction is especially advantageous as task demands increase.
Tailoring Thin Film-Lacquer Coatings for Space Application
NASA Technical Reports Server (NTRS)
Peters, Wanda C.; Harris, George; Miller, Grace; Petro, John
1998-01-01
Thin film coatings have the capability of obtaining a wide range of thermal radiative properties, but the development of thin film coatings can sometimes be difficult and costly when trying to achieve highly specular surfaces. Given any space mission's thermal control requirements, there is often a need for a variation of solar absorptance (α_s), emittance (ε), and/or highly specular surfaces. The utilization of thin film coatings is one process of choice for meeting challenging thermal control requirements because of its ability to provide a wide variety of α_s/ε ratios. Thin film coatings' radiative properties can be tailored to meet specific thermal control requirements through the use of different metals and the variation of dielectric layer thickness. Surface coatings can be spectrally selective to enhance radiative coupling and decoupling. The application of lacquer to a surface can also provide suitable specularity for thin film application without the cost and difficulty associated with polishing.
Emotional development in adolescence: what can be learned from a high school theater program?
Larson, Reed W; Brown, Jane R
2007-01-01
Grounded-theory analyses were used to formulate propositions regarding the processes of adolescent emotional development. Progress in understanding this difficult topic requires close examination of emotional experience in context, and to do this the authors drew on qualitative data collected over the course of a high school theater production. Participants' (ages 14-17) accounts of experiences in this setting demonstrated their capacity to actively extract emotional knowledge and to develop strategies for managing emotions. These accounts suggested that youth's repeated "hot" experience of unfolding emotional episodes in the setting provided material for this active process of learning. Youth also learned by drawing on and internalizing the emotion culture of the setting, which provided concepts, strategies, and tools for managing emotional episodes.
Hospitals' strategies for orchestrating selection of physician preference items.
Montgomery, Kathleen; Schneller, Eugene S
2007-06-01
This article analyzes hospitals' strategies to shape physicians' behavior and counter suppliers' power in purchasing physician preference items. Two models of standardization are limitations on the range of manufacturers or products (the "formulary" model) and price ceilings for particular item categories (the "payment-cap" model), both requiring processes to define product equivalencies often with inadequate product comparison data. The formulary model is more difficult to implement because of physicians' resistance to top-down dictates. The payment-cap model is more feasible because it preserves physicians' choice while also restraining manufacturers' power. Hospitals may influence physicians' involvement through a process of orchestration that includes committing to improve clinical facilities, scheduling, and training and fostering a culture of mutual trust and respect.
Sustainable design for automotive products: dismantling and recycling of end-of-life vehicles.
Tian, Jin; Chen, Ming
2014-02-01
The growth in automotive production has increased the number of end-of-life vehicles (ELVs) annually. The traditional approach to ELV processing involves dismantling, shredding, and landfill disposal. The "3R" principle (reduce, reuse, and recycle) has been increasingly employed in processing ELVs, particularly ELV parts, to promote sustainable development. The first step in processing ELVs is dismantling; however, certain parts of the vehicle are difficult to disassemble and reuse in practice. The extended producer responsibility policy requires carmakers to contribute to the processing of scrap cars, whether for their own developmental needs or out of social responsibility. The design-for-dismantling approach can be an effective solution to the existing difficulties in dismantling ELVs and can also provide guidelines for the design of automotive products. This paper illustrates the difficulty with the polymers used in dashboards: their physical properties prevent easy separation and recycling by mechanical methods, so dealers have to rely on chemical methods such as pyrolysis. Car designers should therefore favor a single material to benefit dismantlers, and the use of materials that allow effective end-of-life processing without sacrificing the original performance requirements of the vehicle should be explored.
A child with a difficult airway: what do I do next?
Engelhardt, Thomas; Weiss, Markus
2012-06-01
Difficulties in pediatric airway management are common and continue to result in significant morbidity and mortality. This review reports on current concepts in approaching a child with a difficult airway. Routine airway management in healthy children with normal airways is simple in experienced hands. Mask ventilation (oxygenation) is always possible and tracheal intubation normally simple. However, transient hypoxia is common in these children usually due to unexpected anatomical and functional airway problems or failure to ventilate during rapid sequence induction. Anatomical airway problems (upper airway collapse and adenoid hypertrophy) and functional airway problems (laryngospasm, bronchospasm, insufficient depth of anesthesia and muscle rigidity, gastric hyperinflation, and alveolar collapse) require urgent recognition and treatment algorithms due to insufficient oxygen reserves. Early muscle paralysis and epinephrine administration aids resolution of these functional airway obstructions. Children with an 'impaired' normal (foreign body, allergy, and inflammation) or an expected difficult (scars, tumors, and congenital) airway require careful planning and expertise. Training in the recognition and management of these different situations as well as a suitably equipped anesthesia workstation and trained personnel are essential. The healthy child with an unexpected airway problem requires clear strategies. The 'impaired' normal pediatric airway may be handled by anesthetists experienced with children, whereas the expected difficult pediatric airway requires dedicated pediatric anesthesia specialist care and should only be managed in specialized centers.
Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N. A.; Zaheer, Kashif Bin
2015-01-01
Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for VBS systems is highly desirable. The innovative, value-based idea is realized from the stakeholder requirements, and the quality of a VBS system depends on a concrete set of valuable requirements, which can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of VBS systems, but their attention to valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for VBS systems; they are time-consuming, complex, and inconsistent, which makes the initiation process difficult. A further motivation for this research is that the existing SIQ approaches provide neither low-level implementation details for SIQ initiation nor stakeholder metrics for quantification. Hence, in view of these problems, this research contributes a new SIQ framework called 'StakeMeter', which is verified and validated through case studies. Compared with other methods, the proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria, and an application procedure. It addresses the issues of stakeholder quantification or prioritization, high time consumption, complexity, and process initiation, and helps select highly critical stakeholders for VBS systems with less judgmental error. PMID:25799490
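A purely hypothetical illustration of stakeholder quantification of the kind such a framework formalizes: score candidates on weighted attributes and keep those above a cut-off. The attribute names, weights, and threshold are invented for the sketch and are not StakeMeter's actual metrics.

```python
# Weighted attribute scoring with a selection cut-off (all values invented).
weights = {"domain_knowledge": 0.4, "influence": 0.35, "availability": 0.25}

stakeholders = {
    "regulator":  {"domain_knowledge": 9, "influence": 8, "availability": 3},
    "end_user":   {"domain_knowledge": 6, "influence": 5, "availability": 8},
    "contractor": {"domain_knowledge": 4, "influence": 3, "availability": 9},
}

def score(attrs):
    return sum(weights[a] * v for a, v in attrs.items())

cutoff = 5.5
critical = {s: round(score(a), 2) for s, a in stakeholders.items()
            if score(a) >= cutoff}
print(critical)   # stakeholders ranked as critical for elicitation
```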
NASA Astrophysics Data System (ADS)
Sahu, Anshuman Kumar; Chatterjee, Suman; Nayak, Praveen Kumar; Sankar Mahapatra, Siba
2018-03-01
Electrical discharge machining (EDM) is a non-traditional machining process widely used for difficult-to-machine materials. EDM can produce the complex, intricately shaped components of such materials used in the aerospace, biomedical, and die and mold making industries, and these applications demand high accuracy and excellent surface finish. In this work, EDM was performed with Nitinol as the workpiece material and an AlSiMg electrode prepared by selective laser sintering (SLS), alongside conventional copper and graphite electrodes. SLS is a rapid prototyping (RP) method for producing complex metallic parts by additive manufacturing (AM). Experiments were carried out varying process parameters such as open-circuit voltage (V), discharge current (Ip), duty cycle (τ), pulse-on time (Ton), and tool material. The surface roughness parameters average roughness (Ra), maximum height of the profile (Rt), and average height of the profile (Rz) were measured with a surface roughness measuring instrument (Talysurf). To reduce the number of experiments, a design of experiments (DOE) approach, Taguchi's L27 orthogonal array, was chosen. The surface properties of the EDMed specimens were optimized by the desirability function approach, and the best parametric setting for the EDM process is reported. The type of tool is the most significant parameter, followed by the interaction of tool type and duty cycle, then duty cycle, discharge current, and voltage. A better surface finish of the EDMed specimen can be obtained with low values of voltage, discharge current, duty cycle, and pulse-on time, along with the use of the AlSiMg RP electrode.
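A short sketch of the desirability-function aggregation used to rank parameter settings, assuming smaller-is-better roughness responses; the response values and acceptability bounds are illustrative, not the study's data.

```python
# Composite desirability for one run: map each response to [0, 1], then
# combine with a geometric mean; the run maximizing D wins.
import math

def desirability_smaller(y, low, high, r=1.0):
    """Derringer-Suich desirability for a smaller-the-better response."""
    if y <= low:
        return 1.0
    if y >= high:
        return 0.0
    return ((high - y) / (high - low)) ** r

# Hypothetical responses for one run: (measured value, best, worst), in um.
responses = {"Ra": (2.1, 1.0, 6.0), "Rz": (9.5, 5.0, 25.0), "Rt": (12.0, 6.0, 30.0)}

ds = [desirability_smaller(y, lo, hi) for y, lo, hi in responses.values()]
D = math.prod(ds) ** (1.0 / len(ds))
print(f"composite desirability D = {D:.3f}")
```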
Yang, Jae-Seong; Kwon, Oh Sung; Kim, Sanguk; Jang, Sung Key
2013-01-01
Successful viral infection requires intimate communication between virus and host cell, a process that absolutely requires various host proteins. However, current efforts to discover novel host proteins as therapeutic targets for viral infection are difficult. Here, we developed an integrative-genomics approach to predict human genes involved in the early steps of hepatitis C virus (HCV) infection. By integrating HCV and human protein associations, co-expression data, and tight junction-tetraspanin web specific networks, we identified host proteins required for the early steps in HCV infection. Moreover, we validated the roles of newly identified proteins in HCV infection by knocking down their expression using small interfering RNAs. Specifically, a novel host factor CD63 was shown to directly interact with HCV E2 protein. We further demonstrated that an antibody against CD63 blocked HCV infection, indicating that CD63 may serve as a new therapeutic target for HCV-related diseases. The candidate gene list provides a source for identification of new therapeutic targets. PMID:23593195
Identifying Opportunities for Vertical Integration of Biochemistry and Clinical Medicine.
Wendelberger, Karen J.; Burke, Rebecca; Haas, Arthur L.; Harenwattananon, Marisa; Simpson, Deborah
1998-01-01
Objectives: Retention of basic science knowledge, as judged by National Board of Medical Examiners' (NBME) data, suffers due to lack of apparent relevance and isolation of instruction from clinical application, especially in biochemistry. However, the literature reveals no systematic process for identifying key biochemical concepts and associated clinical conditions. This study systematically identified difficult biochemical concepts and their common clinical conditions as a critical step towards enhancing relevance and retention of biochemistry. Methods: A multi-step, multiple-stakeholder process was used to: (1) identify important biochemistry concepts; (2) determine students' perceptions of concept difficulty; (3) assess biochemistry faculty, student, and clinical teaching scholars' perceived relevance of identified concepts; and (4) identify associated common clinical conditions for relevant and difficult concepts. Surveys and a modified Delphi process were used to gather data, subsequently analyzed using SPSS for Windows. Results: Sixteen key biochemical concepts were identified. Second-year medical students rated 14/16 concepts as extremely difficult, while fourth-year students rated nine concepts as moderately to extremely difficult. On average, each teaching scholar generated common clinical conditions for 6.2 of the 16 concepts, yielding a set of seven critical concepts and associated clinical conditions. Conclusions: Key stakeholders in the instructional process struggle to identify biochemistry concepts that are critical, difficult to learn, and associated with common clinical conditions. However, through a systematic process beginning with identification of concepts and associated clinical conditions, the relevance of basic science instruction can be enhanced.
Purdon, Patrick L.; Millan, Hernan; Fuller, Peter L.; Bonmassar, Giorgio
2008-01-01
Simultaneous recording of electrophysiology and functional magnetic resonance imaging (fMRI) is a technique of growing importance in neuroscience. Rapidly evolving clinical and scientific requirements have created a need for hardware and software that can be customized for specific applications. Hardware may require customization to enable a variety of recording types (e.g., electroencephalogram, local field potentials, or multi-unit activity) while meeting the stringent and costly requirements of MRI safety and compatibility. Real-time signal processing tools are an enabling technology for studies of learning, attention, sleep, epilepsy, neurofeedback, and neuropharmacology, yet real-time signal processing tools are difficult to develop. We describe an open-source system for simultaneous electrophysiology and fMRI featuring low noise (<0.6 µV p-p input noise), electromagnetic compatibility for MRI (tested up to 7 Tesla), and user-programmable real-time signal processing. The hardware distribution provides the complete specifications required to build an MRI-compatible electrophysiological data acquisition system, including circuit schematics, printed circuit board (PCB) layouts, Gerber files for PCB fabrication and robotic assembly, a bill of materials with part numbers, data sheets, and vendor information, and test procedures. The software facilitates rapid implementation of real-time signal processing algorithms. This system has been used in human EEG/fMRI studies at 3 and 7 Tesla examining the auditory system, visual system, sleep physiology, and anesthesia, as well as in intracranial electrophysiological studies of the non-human primate visual system during 3 Tesla fMRI, and in human hyperbaric physiology studies at depths of up to 300 feet below sea level. PMID:18761038
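As a flavour of user-programmable real-time processing on such a system, a minimal sketch of block-wise causal filtering of a streamed signal, carrying filter state across blocks so the output matches an offline run; the sampling rate, band, and block size are arbitrary choices, not the system's defaults.

```python
# Block-wise causal band-pass filtering with carried state.
import numpy as np
from scipy.signal import butter, lfilter

fs = 1000.0                                    # sampling rate, Hz
b, a = butter(4, [1.0 / (fs / 2), 40.0 / (fs / 2)], btype="band")
zi = np.zeros(max(len(a), len(b)) - 1)         # filter state across blocks

def process_block(block, state):
    """Called once per acquired block of samples."""
    y, state = lfilter(b, a, block, zi=state)
    return y, state

rng = np.random.default_rng(0)
for _ in range(5):                             # stand-in for the acquisition loop
    raw = rng.standard_normal(256)             # one block of raw samples
    filtered, zi = process_block(raw, zi)
```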
N-ViroTech--a novel process for the treatment of nutrient limited wastewaters.
Slade, A H; Gapes, D J; Stuthridge, T R; Anderson, S M; Dare, P H; Pearson, H G W; Dennis, M
2004-01-01
As pulp and paper wastewaters are mostly deficient in nitrogen and phosphorus, historical practice has dictated that they cannot be effectively treated using microbiological processes without the addition of supplementary nutrients, such as urea and phosphoric acid. Supplementation is a difficult step to manage efficiently, requiring extensive post-treatment monitoring and some degree of overdosing to ensure sufficient nutrient availability under all conditions. As a result, treated wastewaters usually contain excess amounts of both nutrients, leading to potential impacts on the receiving waters such as eutrophication. N-ViroTech is a highly effective alternative treatment technology that overcomes this nutrient deficiency/excess paradox. The process relies on communities of nitrogen-fixing bacteria, which fix nitrogen directly from the atmosphere to satisfy their cellular nitrogen requirements, and on manipulation of growth conditions within the biological system to maintain a nitrogen-fixing population whilst achieving target wastewater treatment performance. The technology has significant advantages over conventional activated sludge operation, including: improved environmental performance, with nutrient loadings in the final treated effluent for selected nitrogen and phosphorus species (particularly ammonium and orthophosphate) reduced by over 90% compared to conventional systems; elimination of nitrogen supplementation and minimisation of phosphorus supplementation, achieving significant chemical savings and a 25-35% reduction in operational costs for a typical system; and self-regulation of nutrient requirements, as the bacteria only use as much nitrogen as they require, allowing for substantially less operator intervention and monitoring. This paper summarises critical performance outcomes of the N-ViroTech process using results from laboratory- and pilot-scale trials and recent full-scale alpha-adopter trials.
NASA Astrophysics Data System (ADS)
Costa, D.; Pomeroy, J. W.; Wheater, H. S.
2017-12-01
Early ionic pulses in spring snowmelt can cause the temporary acidification of streams and account for a significant portion of the total annual nutrient export, particularly in seasonally snow-covered areas where frozen ground may limit runoff-soil contact and cause the rapid delivery of these ions to streams. Ionic pulses are a consequence of snow ion exclusion, a process induced by snow metamorphism in which ions are segregated from the snow grains losing mass to the surface of the grains gaining mass. While numerous studies have provided quantitative evidence of this process, few mechanistic mathematical models have been proposed for diagnosis and prediction. A few early modelling attempts captured the process by assuming transport through porous media with variable porosity; however, their implementation is difficult because they require complex models of snow physics to resolve the evolution of in-snow properties and processes during snowmelt, such as heat conduction, metamorphism, melt, and water flow. Furthermore, initial snowpack to snow-surface ion concentration ratios are difficult to measure but are required to initialize these models, and ion exclusion processes are not represented in a physically based, transparent fashion. In this research, a standalone numerical model has been developed to capture ionic pulses in snowmelt by emulating solute leaching from snow grains during melt and its subsequent transport by the percolating meltwater. Estimating snow porosity and water content dynamics is shown to be a viable alternative to deploying complex snow physics models for this purpose. The model was applied to four study sites located in the Arctic and in the Sierra Nevada to test different climatic and hydrological conditions. The model compares very well with observations and captured both the timing and magnitude of early-melt ionic pulses accurately. This study demonstrates how physically based approaches can provide successful simulations of the spatial and temporal fluxes of snowmelt ions, which can be used to improve the prediction of nutrient export in cold regions during the spring freshet.
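A minimal sketch of the leaching idea behind such a model: each melt increment carries away an enriched fraction of the remaining solute load, which by mass balance yields high early-melt concentrations that decay toward zero. The elution factor and snowpack numbers are invented, not values from the study.

```python
# Preferential elution: early meltwater leaves enriched relative to the bulk.
swe = 300.0          # snow water equivalent, mm
mass = 60.0          # total solute load in the pack, mg m^-2
k = 4.0              # elution factor (k > 1 produces an early pulse)
melt_per_day = 10.0  # mm day^-1

day = 0
while swe > 0:
    day += 1
    melt = min(melt_per_day, swe)
    frac = melt / swe                        # fraction of the pack melted today
    released = mass * min(1.0, k * frac)     # enriched early release
    conc = released / melt                   # 1 mm over 1 m^2 = 1 L, so mg/L
    mass -= released
    swe -= melt
    print(f"day {day:2d}: meltwater concentration {conc:6.2f} mg/L")
# Bulk-average concentration is 0.2 mg/L; day 1 runs ~4x higher, then decays.
```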
Transboundary environmental assessment: lessons from OTAG. The Ozone Transport Assessment Group.
Farrell, Alexander E; Keating, Terry J
2002-06-15
The nature and role of assessments in creating policy for transboundary environmental problems is discussed. Transboundary environmental problems are particularly difficult to deal with because they typically require cooperation among independent political jurisdictions (e.g., states or nations) which face differing costs and benefits and which often have different technical capabilities and different interests. In particular, transboundary pollution issues generally involve the problem of an upstream source and a downstream receptor on opposite sides of a relevant political boundary, making it difficult for the jurisdiction containing the receptor to obtain relief from the pollution problem. The Ozone Transport Assessment Group (OTAG) addressed such a transboundary problem: the long-range transport of tropospheric ozone (i.e., photochemical smog) across the eastern United States. The evolution of the science and policy that led to OTAG, the OTAG process, and its outcomes are presented. Lessons that are available to be learned from the OTAG experience, particularly for addressing similar transboundary problems such as regional haze, are discussed.
Nijman, Rien J M
2008-09-01
The ability to maintain normal continence for urine and stools is not achievable in all children by a certain age. Gaining control of urinary and fecal continence is a complex process, and not all steps and factors involved are fully understood. While normal development of anatomy and physiology are prerequisites to becoming fully continent, anatomic abnormalities, such as bladder exstrophy, epispadias, ectopic ureters, and neurogenic disturbances that can usually be recognized at birth and cause incontinence, will require specialist treatment, not only to restore continence but also to preserve renal function. Most forms of urinary incontinence are not caused by an anatomic or physiologic abnormality and, hence, are more difficult to diagnose and their management requires a sound knowledge of bladder and bowel function.
Toward Scalable Trustworthy Computing Using the Human-Physiology-Immunity Metaphor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hively, Lee M; Sheldon, Frederick T
The cybersecurity landscape consists of an ad hoc patchwork of solutions. Optimal cybersecurity is difficult for various reasons: complexity, immense data and processing requirements, resource-agnostic cloud computing, practical time-space-energy constraints, inherent flaws in 'Maginot Line' defenses, and the growing number and sophistication of cyberattacks. This article defines the high-priority problems and examines the potential solution space. In that space, achieving scalable trustworthy computing and communications is possible through real-time knowledge-based decisions about cyber trust. This vision is based on the human-physiology-immunity metaphor and the human brain's ability to extract knowledge from data and information. The article outlines future steps toward scalable trustworthy systems, requiring a long-term commitment to solve the well-known challenges.
Analysis of Sulfidation Routes for Processing Weathered Ilmenite Concentrates Containing Impurities
NASA Astrophysics Data System (ADS)
Ahmad, Sazzad; Rhamdhani, M. Akbar; Pownceby, Mark I.; Bruckard, Warren J.
Rutile is the preferred feedstock for producing high-grade TiO2 pigment, but with resources decreasing, alternative materials such as ilmenite are now used to produce a synthetic rutile (SR) feedstock. This requires removal of impurities (e.g., Fe, Mg, Mn), which for a primary ilmenite is a straightforward process. Processing of weathered ilmenite, however, is complex, especially when chrome-bearing impurities are present, since minor chromium downgrades the SR market value by imparting color to the final TiO2 pigment. Chrome-bearing spinels are a problem in weathered ilmenites from the Murray Basin, Australia, as their physical and chemical properties overlap with those of ilmenite, making separation difficult. In this paper, different sulfidation process routes for weathered ilmenites are analyzed for their applicability to Murray Basin deposits as a means of removing chrome-spinel impurities. Thermodynamic and experimental studies indicated that selective sulfidation of chrome-bearing spinel can be achieved under controlled pO2 and pS2 processing conditions, thereby making the spinels amenable to separation.
Rani, D Amutha; Boccaccini, A R; Deegan, D; Cheeseman, C R
2008-11-01
Current disposal options for APC residues in the UK and alternative treatment technologies developed world-wide have been reviewed. APC residues are currently landfilled in the UK where they undergo in situ solidification, although the future acceptability of this option is uncertain because the EU waste acceptance criteria (WAC) introduce strict limits on leaching that are difficult to achieve. Other APC residue treatment processes have been developed which are reported to reduce leaching to below relevant regulatory limits. The Ferrox process, the VKI process, the WES-PHix process, stabilisation/solidification using cementitious binders and a range of thermal treatment processes are reviewed. Thermal treatment technologies convert APC residues combined with other wastes into inert glass or glass-ceramics that encapsulate heavy metals. The waste management industry will inevitably use the cheapest available option for treating APC residues and strict interpretation and enforcement of waste legislation is required if new, potentially more sustainable technologies are to become commercially viable.
Synthetic Biology: Tools to Design, Build, and Optimize Cellular Processes
Young, Eric; Alper, Hal
2010-01-01
The general central dogma frames the emergent properties of life, which make biology both necessary and difficult to engineer. In a process engineering paradigm, each biological process stream and process unit is heavily influenced by regulatory interactions and interactions with the surrounding environment. Synthetic biology is developing the tools and methods that will increase control over these interactions, eventually resulting in an integrative synthetic biology that will allow ground-up cellular optimization. In this review, we attempt to contextualize the areas of synthetic biology into three tiers: (1) the process units and associated streams of the central dogma, (2) the intrinsic regulatory mechanisms, and (3) the extrinsic physical and chemical environment. Efforts at each of these three tiers attempt to control cellular systems and take advantage of emerging tools and approaches. Ultimately, it will be possible to integrate these approaches and realize the vision of integrative synthetic biology when cells are completely rewired for biotechnological goals. This review will highlight progress towards this goal as well as areas requiring further research. PMID:20150964
Dynamic behavior of particles in spacecraft
NASA Technical Reports Server (NTRS)
Perrine, B. S.
1981-01-01
The behavior of particles relative to a spacecraft frame of reference was examined. Significant spatial excursions of particles can occur relative to the spacecraft as a result of drag deceleration of the vehicle, and these excursions tend to grow large as time increases. If a particle is required to remain in a specified volume, constraints may therefore be needed. In levitation experiments, for example, it may be extremely difficult to turn off the constraint forces that keep the particles in a specified region, which means experiments that are sensitive to disturbances may be very difficult to perform if perturbation forces are required to be absent.
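A worked example of the magnitude of this effect, assuming a free-floating particle keeps its velocity while the vehicle decelerates under drag at roughly one micro-g, so the relative excursion grows as x(t) = a t²/2.

```python
# Relative excursion of a free particle in a drag-decelerated vehicle frame.
g = 9.81
a = 1e-6 * g                      # assumed drag deceleration, ~1 micro-g
for t in (60.0, 600.0, 3600.0):   # 1 min, 10 min, 1 h
    x = 0.5 * a * t ** 2
    print(f"after {t:6.0f} s: excursion {x:8.3f} m")
# ~0.018 m after a minute, ~1.8 m after ten, ~64 m after an hour.
```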
Chahal, C; van den Akker, B; Young, F; Franco, C; Blackbeard, J; Monis, P
2016-01-01
Disinfection guidelines exist for pathogen inactivation in potable water and recycled water, but wastewater with high numbers of particles can be more difficult to disinfect, making compliance with the guidelines problematic. Disinfection guidelines specify that drinking water with turbidity ≥1 Nephelometric Turbidity Unit (NTU) is not suitable for disinfection and therefore not fit for purpose, whereas treated wastewater typically has higher concentrations of particles (1-10 NTU for secondary treated effluent). Two processes widely used for disinfecting wastewater are chlorination and ultraviolet radiation. In both cases, particles in wastewater can interfere with disinfection and can significantly increase treatment costs, either by increasing operational expenditure (chemical demand, power consumption) or by requiring additional treatment processes to achieve the required levels of pathogen inactivation. Many microorganisms (viruses, bacteria, protozoans) associate with particles, which can allow them to survive disinfection processes and cause a health hazard. Improved understanding of this association will enable development of cost-effective treatment, which will become increasingly important as indirect and direct potable reuse of wastewater becomes more widespread in both developed and developing countries. This review provides an overview of wastewater and associated treatment processes, the pathogens in wastewater, the nature of particles in wastewater and how they interact with pathogens, and how particles can impact disinfection processes.
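A worked example of the chlorination dose-contact trade-off referred to above, using the conventional Ct (residual concentration × contact time) credit; the required Ct figure is illustrative, not a guideline value.

```python
# Ct credit: required contact time for a given chlorine residual.
ct_required = 15.0   # mg*min/L, assumed for the target log-inactivation
residual = 0.5       # mg/L free chlorine held in the contact tank
print(f"contact time needed: {ct_required / residual:.0f} min")  # 30 min
# Particle-associated organisms raise the effective Ct, which shows up as
# higher chemical demand or a larger contact tank.
```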
Research Spotlight: New method to assess coral reef health
NASA Astrophysics Data System (ADS)
Tretkoff, Ernie
2011-03-01
Coral reefs around the world are becoming stressed due to rising temperatures, ocean acidification, overfishing, and other factors. Measuring community level rates of photosynthesis, respiration, and biogenic calcification is essential to assessing the health of coral reef ecosystems because the balance between these processes determines the potential for reef growth and the export of carbon. Measurements of biological productivity have typically been made by tracing changes in dissolved oxygen in seawater as it passes over a reef. However, this is a labor-intensive and difficult method, requiring repeated measurements. (Geophysical Research Letters, doi:10.1029/2010GL046179, 2011)
Solid organ transplantation: referral, management, and outcomes in HIV-infected patients.
Roland, Michelle E; Carlson, Laurie L; Frassetto, Lynda A; Stock, Peter G
2006-12-01
Advances in HIV management make it difficult to deny solid organ transplantation to HIV-infected patients based on futility arguments. Preliminary studies suggest that both patient and graft survival are similar in HIV-negative and HIV-positive transplant recipients. While there has been no significant HIV disease progression, substantial interactions between immunosuppressants and antiretroviral drugs necessitate careful monitoring. The evaluation and management of HIV-infected transplant candidates and recipients require excellent communication among a multidisciplinary team, the primary HIV care provider, and the patient. Timely referral for transplant evaluation will prevent unnecessary mortality during the pre-transplant evaluation process.
Semantic integration of information about orthologs and diseases: the OGO system.
Miñarro-Gimenez, Jose Antonio; Egaña Aranguren, Mikel; Martínez Béjar, Rodrigo; Fernández-Breis, Jesualdo Tomás; Madrid, Marisa
2011-12-01
Semantic Web technologies like RDF and OWL are currently applied in the life sciences to improve knowledge management by integrating disparate information. Many of the systems that perform this task, however, only offer a SPARQL query interface, which is difficult for life scientists to use. We present the OGO system, which consists of a knowledge base that integrates information about orthologous sequences and genetic diseases, with an easy-to-use, ontology-constraint-driven query interface. This interface allows users to define SPARQL queries through a graphical process, so no SPARQL expertise is required.
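A hedged sketch of the kind of query the graphical interface spares users from writing by hand, here issued directly with rdflib; the file name, namespace, and predicate names are hypothetical, not the OGO ontology's actual vocabulary.

```python
# SPARQL over an integrated gene/ortholog/disease RDF graph.
from rdflib import Graph

g = Graph()
g.parse("ogo_snapshot.ttl", format="turtle")   # assumed local RDF dump

query = """
PREFIX ex: <http://example.org/ogo#>
SELECT ?gene ?disease WHERE {
    ?gene     ex:hasOrtholog    ?ortholog .
    ?ortholog ex:associatedWith ?disease .
}
LIMIT 10
"""
for gene, disease in g.query(query):
    print(gene, disease)
```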
Apparatus for electroplating particles of small dimension
Yu, C.M.; Illige, J.D.
1980-09-19
The thickness, uniformity, and surface smoothness requirements for coatings on glass microspheres used as targets for laser fusion research are critical. Because of their minute size, the microspheres are difficult to manipulate and control in electroplating systems. The electroplating apparatus of the present invention addresses these problems by providing a cathode cell having a cell chamber, a cathode, and an anode electrically isolated from each other and connected to an electrical power source. During the plating process, the cathode is controllably vibrated, along with solution pulsing, to maintain the particles in random free motion and so attain the desired properties.
Joelsson, Daniel; Moravec, Phil; Troutman, Matthew; Pigeon, Joseph; DePhillips, Pete
2008-08-20
Transferring manual ELISAs to automated platforms requires optimizing the assays for each particular robotic platform. These optimization experiments are often time consuming and difficult to perform using a traditional one-factor-at-a-time strategy. In this manuscript we describe the development of an automated process using statistical design of experiments (DOE) to quickly optimize immunoassays for precision and robustness on the Tecan EVO liquid handler. By using fractional factorials and a split-plot design, five incubation time variables and four reagent concentration variables can be optimized in a short period of time.
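A small sketch of how such a screening design can be constructed by hand: a 2^(5-2) fractional factorial (8 runs instead of 32) built from a full 2³ base design with generators D = AB and E = AC. The generators, and hence the confounding pattern, are illustrative choices, not the study's design.

```python
# 2^(5-2) fractional factorial for five two-level factors in 8 runs.
import itertools

runs = []
for a, b, c in itertools.product((-1, 1), repeat=3):   # full 2^3 base design
    d = a * b                                          # generator D = AB
    e = a * c                                          # generator E = AC
    runs.append((a, b, c, d, e))

print("run  A  B  C  D  E")
for i, r in enumerate(runs, 1):
    print(f"{i:3d} " + " ".join(f"{v:+d}" for v in r))
```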
Ennis, William J; Lee, Claudia; Plummer, Malgorzata; Meneses, Patricio
2011-01-01
Wound healing is a complex pathway that requires cells, an appropriate biochemical environment (i.e., cytokines, chemokines), an extracellular matrix, perfusion, and the application of both macrostrain and microstrain. The process is both biochemically complex and energy dependent. Healing can be assisted in difficult cases through the use of physical modalities. In the current literature, there is much debate over which treatment modality, dosage level, and timing is optimal. The mechanism of action for both electrical stimulation and ultrasound are reviewed along with possible clinical applications for the plastic surgeon.
Fatigue Crack Growth in Peened Friction Stir Welds
NASA Technical Reports Server (NTRS)
Forth, Scott C.; Hatamleh, Omar
2008-01-01
Friction stir welding induces residual stresses that accelerate fatigue crack growth in the weld nugget. Shot peening over the weld had little effect on the growth rate. Laser peening over the weld retarded the growth rate: the final crack growth rate was comparable to that of the base, un-welded material. Crack tunneling was evident, resulting from residual compressive stresses. The 2195-T8 fracture surfaces were highly textured; this texturing makes comparisons difficult because the material system affects the data as much as the processing does. As this material becomes more common in space applications, additional work is required to develop useful datasets for damage tolerance analyses.
ERIC Educational Resources Information Center
Arnon, Inbal
2010-01-01
Children find object relative clauses difficult. They show poor comprehension that lags behind production into their fifth year. This finding has shaped models of relative clause acquisition, with appeals to processing heuristics or syntactic preferences to explain why object relatives are more difficult than subject relatives. Two studies here…
Simplified Interval Observer Scheme: A New Approach for Fault Diagnosis in Instruments
Martínez-Sibaja, Albino; Astorga-Zaragoza, Carlos M.; Alvarado-Lassman, Alejandro; Posada-Gómez, Rubén; Aguila-Rodríguez, Gerardo; Rodríguez-Jarquin, José P.; Adam-Medina, Manuel
2011-01-01
There are different observer-based schemes to detect and isolate faults in dynamic processes. For fault diagnosis in instruments (FDI), the schemes differ in the number of observers: the Simplified Observer Scheme (SOS) requires only one observer, using all the inputs and a single output to detect faults in one sensor; the Dedicated Observer Scheme (DOS) again uses all the inputs and a single output per observer, but employs a bank of observers capable of locating multiple sensor faults; and the Generalized Observer Scheme (GOS) involves a reduced bank of observers, where each observer uses all the inputs and m-1 outputs and allows the localization of single faults. This work proposes a new scheme, the Simplified Interval Observer (SIOS-FDI), which does not require the measurement of any input and, with just one output, allows the detection of single sensor faults. Because it requires no inputs, it greatly simplifies fault diagnosis in processes in which it is difficult to measure all the inputs, as in the case of biological reactors. PMID:22346593
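The abstract does not give the SIOS-FDI equations; the sketch below illustrates only the generic interval-observer idea it builds on, for a scalar positive system with an uncertain parameter and no measured inputs: propagate upper and lower bounds on the expected output and flag a sensor fault when the measurement leaves the envelope (all numbers illustrative):

```python
# Uncertain first-order positive system x[k+1] = a*x[k], with a in [a_lo, a_hi].
a_lo, a_hi = 0.80, 0.90
x_lo, x_hi = 1.0, 1.0          # interval observer states: bounds on the true state
x_true, a_true = 1.0, 0.85     # "plant" used only to generate the measurement

for k in range(60):
    x_true = a_true * x_true
    y = x_true + (0.5 if k == 40 else 0.0)  # additive sensor fault injected at k=40
    # Propagate the bounds; for a positive system the envelope is monotone in a.
    x_lo, x_hi = a_lo * x_lo, a_hi * x_hi
    if not (x_lo <= y <= x_hi):
        print(f"sensor fault flagged at step {k}")
        break
```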
43 CFR 11.83 - Damage determination phase-use value methodologies.
Code of Federal Regulations, 2010 CFR
2010-10-01
... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...
Flight dynamics facility operational orbit determination support for the ocean topography experiment
NASA Technical Reports Server (NTRS)
Bolvin, D. T.; Schanzle, A. F.; Samii, M. V.; Doll, C. E.
1991-01-01
The Ocean Topography Experiment (TOPEX/POSEIDON) mission is designed to determine the topography of the Earth's sea surface over a 3-year period, beginning with launch in June 1992. The Goddard Space Flight Center Dynamics Facility has the capability to operationally receive and process Tracking and Data Relay Satellite System (TDRSS) tracking data. Because these data will be used to support orbit determination (OD) aspects of the TOPEX mission, the Dynamics Facility was designated to perform TOPEX operational OD. The scientific data require stringent OD accuracy in navigating the TOPEX spacecraft. The OD accuracy requirements fall into two categories: (1) on-orbit free flight and (2) maneuver. The maneuver OD accuracy requirements are of two types: premaneuver planning and postmaneuver evaluation. Analysis using the Orbit Determination Error Analysis System (ODEAS) covariance software has shown that, during the first postlaunch mission phase of the TOPEX mission, some postmaneuver evaluation OD accuracy requirements cannot be met. ODEAS results also show that the most difficult requirements to meet are those on the change in the components of velocity for postmaneuver evaluation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Xueyun; Smith, Richard D.; Baker, Erin S.
Lipids are a vital class of molecules that play important and varied roles in biological processes. Fully understanding lipid roles, however, is extremely difficult since the number and diversity of lipid species is immense, with cells expressing hundreds of enzymes that synthesize tens of thousands of different lipids. While recent advances in chromatography and high resolution mass spectrometry have greatly progressed the understanding of lipid species and functions, effectively separating many lipids still remains problematic. Isomeric lipids have made lipid characterization especially difficult; they occur because subclasses can share the same chemical composition, and species can differ only in acyl chain connectivity (sn-1, sn-2, or sn-3), double bond position and orientation (cis or trans), or functional group stereochemistry (R versus S). Fully understanding the roles of lipids in biological processes therefore requires separating and evaluating how isomers change in biological and environmental samples. To address this challenge, ion mobility spectrometry separations, ion-molecule reactions and fragmentation techniques have increasingly been added to lipid analysis workflows to improve identifications. In this manuscript, we review the current state of these approaches and their capabilities for improving the identification of specific lipid species.
Design and application of the falling vertical sorting machine
NASA Astrophysics Data System (ADS)
Zuo, Ping; Peng, Tao; Yang, Hai
2018-04-01
In tobacco production, cigarettes must be packed according to the needs of different customers. Sorting machines currently in use include a launch-channel type and a percussive vertical type. With the rolling-channel machine, variations in package mass and friction make it difficult to guarantee the position and posture of items on the belt sorting line, so the manipulator can fail to grasp them; the percussive vertical machine has difficulty controlling the parallelism of the packages. This team has developed a falling-type sorting machine that controls the drop of each cigarette package onto the transmission belt so that none is misplaced; it can handle most package types without damaging the product. The dynamic characteristics, such as the angular error of the opening and closing mechanism, were analyzed with ADAMS software. The simulation results show that the maximum angular error is 0.016 rad. Testing of the device shows a sorting rate of 7031 packages/hour and a drop-position error within 2 mm, which meets the grasping-accuracy requirements of the palletizing robot.
Morgan, Steven G; Thomson, Paige A; Daw, Jamie R; Friesen, Melissa K
2013-01-31
Confidential product listing agreements (PLAs) negotiated between pharmaceutical manufacturers and individual health care payers may contribute to unwanted price disparities, high administrative costs, and unequal bargaining power within and across jurisdictions. In the context of Canada's decentralized health system, we aimed to document provincial policy makers' perceptions about collaborative PLA negotiations. We conducted semi-structured telephone interviews with a senior policy maker from nine of the ten Canadian provinces. We conducted a thematic analysis of interview transcripts to identify benefits, drawbacks, and barriers to routine collaboration on PLA negotiations. Canadian policy makers expressed support for joint negotiations of PLAs in principle, citing benefits of increased bargaining power and reduced inter-jurisdictional inequities in drug prices and formulary listings. However, established policy institutions and the politics of individual jurisdictional authority are formidable barriers to routine PLA collaboration. Achieving commitment to a joint process may be difficult to sustain among heterogeneous and autonomous partners. Though collaboration on PLA negotiation is an extension of collaboration on health technology assessment, it is a very significant next step that requires harmonization of the outcomes of decision-making processes. Views of policy makers in Canada suggest that sustaining routine collaborations on PLA negotiations may be difficult unless participating jurisdictions have similar policy institutions, capacities to implement coverage decisions, and local political priorities.
The S-Process Branching-Point at 205PB
NASA Astrophysics Data System (ADS)
Tonchev, Anton; Tsoneva, N.; Bhatia, C.; Arnold, C. W.; Goriely, S.; Hammond, S. L.; Kelley, J. H.; Kwan, E.; Lenske, H.; Piekarewicz, J.; Raut, R.; Rusev, G.; Shizuma, T.; Tornow, W.
2017-09-01
Accurate neutron-capture cross sections for radioactive nuclei near the line of beta stability are crucial for understanding s-process nucleosynthesis. However, neutron-capture cross sections for short-lived radionuclides are difficult to measure because the measurements require both highly radioactive samples and intense neutron sources. We consider photon scattering using monoenergetic and 100% linearly polarized photon beams to obtain the photoabsorption cross section of 206Pb below the neutron separation energy. This observable becomes an essential ingredient in Hauser-Feshbach statistical-model calculations of capture cross sections on 205Pb. The newly obtained photoabsorption information is also used to estimate the Maxwellian-averaged radiative cross section of 205Pb(n,γ)206Pb at 30 keV. The astrophysical impact of this measurement on s-process nucleosynthesis is discussed. This work was performed under the auspices of US DOE by LLNL under Contract DE-AC52-07NA27344.
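For reference, the Maxwellian-averaged cross section (MACS) quoted at kT = 30 keV is the standard stellar average of the energy-dependent capture cross section σ(E):

```latex
\langle \sigma \rangle_{kT}
  = \frac{2}{\sqrt{\pi}} \, \frac{1}{(kT)^{2}}
    \int_{0}^{\infty} \sigma(E) \, E \, e^{-E/kT} \, \mathrm{d}E
```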
Teaching thoughtful practice: narrative pedagogy in addictions education.
Vandermause, Roxanne K; Townsend, Ryan P
2010-07-01
Preparing practitioners for this rapidly changing and demanding health care environment is challenging. A surge in knowledge development and scientific advancement has placed a priority on technical skill and a focus on content driven educational processes that prepare students for evidence-based practice. However, the most difficult health care scenarios require thinking-in-action and thoughtfulness as well as didactic knowledge. It is our contention that interpretive educational methods, like narrative pedagogy, will promote judgment-based practice that includes use of evidence and delivery of thoughtful care. In this article, we describe and interpret a narrative approach to addictions content and teaching thoughtful practice. We present our pedagogical process, including observations and field notes, to show how interpretive pedagogies can be introduced into nursing curricula. By presenting this process, the reader is invited to consider interpretive methods as a way to inspire and habituate thoughtful practice and judgment-based care. Copyright 2009 Elsevier Ltd. All rights reserved.
The value and validation of broad spectrum biosensors for diagnosis and biodefense
Metzgar, David; Sampath, Rangarajan; Rounds, Megan A; Ecker, David J
2013-01-01
Broad spectrum biosensors capable of identifying diverse organisms are transitioning from the realm of research into the clinic. These technologies simultaneously capture signals from a wide variety of biological entities using universal processes. Specific organisms are then identified through bioinformatic signature-matching processes. This is in contrast to currently accepted molecular diagnostic technologies, which utilize unique reagents and processes to detect each organism of interest. This paradigm shift greatly increases the breadth of molecular diagnostic tools with little increase in biochemical complexity, enabling simultaneous diagnostic, epidemiologic, and biothreat surveillance capabilities at the point of care. This, in turn, offers the promise of increased biosecurity and better antimicrobial stewardship. Efficient realization of these potential gains will require novel regulatory paradigms reflective of the generalized, information-based nature of these assays, allowing extension of empirical data obtained from readily available organisms to support broader reporting of rare, difficult to culture, or extremely hazardous organisms. PMID:24128433
Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.
Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed
2015-02-01
Evaluating and selecting software packages that meet an organization's requirements is a difficult part of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed in which a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were ranked on a set of metric outcomes using the integrated Analytic Hierarchy Process (AHP) and TOPSIS. The experimental results showed that the GNUmed and OpenEMR packages ranked higher than the other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
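TOPSIS itself is compact enough to sketch. The following minimal Python version (the criteria, weights, and scores are invented placeholders, not the study's data) ranks alternatives by relative closeness to the ideal solution; in the study, the weights would come from the AHP step:

```python
import numpy as np

# Rows = candidate packages, columns = criteria scores (all benefit criteria here).
X = np.array([[7.0, 8.0, 6.0],
              [9.0, 6.0, 7.0],
              [8.0, 7.0, 9.0]])
w = np.array([0.5, 0.3, 0.2])          # criteria weights (e.g., from AHP)

V = w * X / np.linalg.norm(X, axis=0)  # vector-normalized, weighted matrix
best, worst = V.max(axis=0), V.min(axis=0)
d_best = np.linalg.norm(V - best, axis=1)
d_worst = np.linalg.norm(V - worst, axis=1)
closeness = d_worst / (d_best + d_worst)  # 1 = ideal, 0 = anti-ideal

print(np.argsort(-closeness))  # alternatives ranked best-first
```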
Real-time digital holographic microscopy using the graphic processing unit.
Shimobaba, Tomoyoshi; Sato, Yoshikuni; Miura, Junya; Takenouchi, Mai; Ito, Tomoyoshi
2008-08-04
Digital holographic microscopy (DHM) is a well-known, powerful method that allows both the amplitude and phase of a specimen to be observed simultaneously. In order to obtain a reconstructed image from a hologram, numerous calculations of the Fresnel diffraction are required. The Fresnel diffraction can be accelerated by the FFT (Fast Fourier Transform) algorithm. However, real-time reconstruction from a hologram is difficult even if a recent central processing unit (CPU) is used to calculate the Fresnel diffraction by the FFT algorithm. In this paper, we describe a real-time DHM system using a graphics processing unit (GPU) with many stream processors, which allows its use as a highly parallel processor. The computational speed of the Fresnel diffraction on the GPU is faster than that of recent CPUs. The real-time DHM system can reconstruct images from holograms of 512 x 512 grid points at 24 frames per second.
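The computation being accelerated is FFT-based Fresnel propagation. A minimal CPU sketch with NumPy (wavelength, pixel pitch, and distance are placeholder values; the constant phase factor e^{ikz} is omitted) is:

```python
import numpy as np

def fresnel_propagate(u0, wavelength, z, dx):
    """Fresnel diffraction via the FFT-based transfer-function method."""
    n = u0.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    # Fresnel-approximation transfer function H(fx, fy).
    H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(u0) * H)

hologram = np.random.rand(512, 512)            # stand-in for a captured hologram
field = fresnel_propagate(hologram, 633e-9, 0.1, 10e-6)
amplitude, phase = np.abs(field), np.angle(field)
```

On a GPU, the two FFTs and the element-wise multiply map directly onto highly parallel kernels, which is the source of the speedup reported here.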
Human Factors Checklist: Think Human Factors - Focus on the People
NASA Technical Reports Server (NTRS)
Miller, Darcy; Stelges, Katrine; Barth, Timothy; Stambolian, Damon; Henderson, Gena; Dischinger, Charles; Kanki, Barbara; Kramer, Ian
2016-01-01
A quick-look Human Factors (HF) Checklist condenses industry and NASA Agency standards consisting of thousands of requirements into 14 main categories. With support from contractor HF and Safety Practitioners, NASA developed a means to share key HF messages with Design, Engineering, Safety, Project Management, and others. It is often difficult to complete timely assessments due to the large volume of HF information. The HF Checklist evolved over time into a simple way to consider the most important concepts. A wide audience can apply the checklist early in design or through planning phases, even before hardware or processes are finalized or implemented. The checklist is a good place to start to supplement formal HF evaluation. The HF Checklist was based on many Space Shuttle processing experiences and lessons learned. It is now being applied to ground processing of new space vehicles and adjusted for new facilities and systems.
A strategy to improve priority setting in developing countries.
Kapiriri, Lydia; Martin, Douglas K
2007-09-01
Because the demand for health services outstrips the available resources, priority setting is one of the most difficult issues faced by health policy makers, particularly those in developing countries. Priority setting in developing countries is fraught with uncertainty due to lack of credible information, weak priority setting institutions, and unclear priority setting processes. Efforts to improve priority setting in these contexts have focused on providing information and tools. In this paper we argue that priority setting is a value laden and political process, and although important, the available information and tools are not sufficient to address the priority setting challenges in developing countries. Additional complementary efforts are required. Hence, a strategy to improve priority setting in developing countries should also include: (i) capturing current priority setting practices, (ii) improving the legitimacy and capacity of institutions that set priorities, and (iii) developing fair priority setting processes.
Foot fractures frequently misdiagnosed as ankle sprains.
Judd, Daniel B; Kim, David H
2002-09-01
Most ankle injuries are straightforward ligamentous injuries. However, the clinical presentation of subtle fractures can be similar to that of ankle sprains, and these fractures are frequently missed on initial examination. Fractures of the talar dome may be medial or lateral, and they are usually the result of inversion injuries, although medial injuries may be atraumatic. Lateral talar process fractures are characterized by point tenderness over the lateral process. Posterior talar process fractures are often associated with tenderness to deep palpation anterior to the Achilles tendon over the posterolateral talus, and plantar flexion may exacerbate the pain. These fractures can often be managed nonsurgically with nonweight-bearing status and a short leg cast worn for approximately four weeks. Delays in treatment can result in long-term disability and surgery. Computed tomographic scans or magnetic resonance imaging may be required because these fractures are difficult to detect on plain films.
QoS support for end users of I/O-intensive applications using shared storage systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Marion Kei; Zhang, Xuechen; Jiang, Song
2011-01-19
I/O-intensive applications are becoming increasingly common on today's high-performance computing systems. While the performance of compute-bound applications can be effectively guaranteed with techniques such as space sharing or QoS-aware process scheduling, it remains a challenge to meet QoS requirements for end users of I/O-intensive applications using shared storage systems, because it is difficult to differentiate I/O services for different applications with individual quality requirements. Furthermore, it is difficult for end users to accurately specify performance goals to the storage system using I/O-related metrics such as request latency or throughput. As access patterns, request rates, and the system workload change in time, a fixed I/O performance goal, such as a bound on throughput or latency, can be expensive to achieve and may not lead to a meaningful performance guarantee such as bounded program execution time. We propose a scheme supporting end users' QoS goals, specified in terms of program execution time, in shared storage environments. We automatically translate the user's performance goals into instantaneous I/O throughput bounds using a machine learning technique, and use dynamically determined service time windows to efficiently meet the throughput bounds. We have implemented this scheme in the PVFS2 parallel file system and have conducted an extensive evaluation. Our results show that this scheme can satisfy realistic end-user QoS requirements while making highly efficient use of the I/O resources. The scheme seeks to balance programs' attainment of QoS requirements, and saves as much of the remaining I/O capacity as possible for best-effort programs.
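The abstract does not specify the machine learning model used for the translation step; as a deliberately simple stand-in, one can profile a program at a few throughput allotments, fit execution time as compute time plus I/O time, and invert the fit to obtain the throughput bound implied by a target execution time (all numbers illustrative):

```python
import numpy as np

# Profiling observations: allotted I/O throughput (MB/s) and resulting runtime (s).
thr = np.array([20.0, 40.0, 80.0, 160.0])
t = np.array([410.0, 215.0, 118.0, 70.0])

# Model t = c0 + c1/throughput (compute time + I/O time); fit by least squares.
A = np.column_stack([np.ones_like(thr), 1.0 / thr])
c0, c1 = np.linalg.lstsq(A, t, rcond=None)[0]

t_goal = 150.0                      # user-specified execution-time goal (s)
thr_bound = c1 / (t_goal - c0)      # throughput bound implied by the model
print(f"required throughput = {thr_bound:.1f} MB/s")
```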
Advanced Simulation and Computing Fiscal Year 14 Implementation Plan, Rev. 0.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meisner, Robert; McCoy, Michel; Archer, Bill
2013-09-11
The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support them. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced set of resources, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing predictive capability in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), quantify critical margins and uncertainties, and resolve increasingly difficult analyses needed for the SSP. Moreover, ASC's business model is integrated and focused on requirements-driven products that address long-standing technical questions related to enhanced predictive capability in the simulation tools.
Cost-effective masks for deep x-ray lithography
NASA Astrophysics Data System (ADS)
Scheunemann, Heinz-Ulrich; Loechel, Bernd; Jian, Linke; Schondelmaier, Daniel; Desta, Yohannes M.; Goettert, Jost
2003-04-01
The production of X-ray masks is one of the key techniques for X-ray lithography and the LIGA process. Different ways of fabricating X-ray masks have been established. Very sophisticated, difficult, and expensive procedures are required to produce high-precision, high-quality X-ray masks. In order to minimize the cost of an X-ray mask, the mask blank must be inexpensive and readily available, and the steps involved in the fabrication process must be minimal. In the past, thin membranes made of titanium, silicon carbide, or silicon nitride (2-5 μm) or thick beryllium substrates (500 μm) have been used as mask blanks. Thin titanium and silicon compounds have very high transparency for X-rays; therefore, these materials are predestined for use as mask membrane material. However, the handling and fabrication of thin membranes are very difficult and thus expensive. Beryllium is highly transparent to X-rays, but its processing and use are risky due to potential toxicity. During the past few years, graphite-based X-ray masks have been in use at various research centers, but the sidewall quality of the generated resist patterns is in the range of 200-300 nm Ra. We used polished graphite to improve the sidewall roughness, but polished graphite causes other problems in the fabrication of X-ray masks. This paper describes the advantages associated with the use of polished graphite as a mask blank as well as the fabrication process for this low-cost X-ray mask. Alternative membrane materials are also discussed.
Herpes Simplex Virus DNA Packaging without Measurable DNA Synthesis
Church, Geoffrey A.; Dasgupta, Anindya; Wilson, Duncan W.
1998-01-01
Herpes simplex virus (HSV) type 1 DNA synthesis and packaging occur within the nuclei of infected cells; however, the extent to which the two processes are coupled remains unclear. Correct packaging is thought to be dependent upon DNA debranching or other repair processes, and such events commonly involve new DNA synthesis. Furthermore, the HSV UL15 gene product, essential for packaging, nevertheless localizes to sites of active DNA replication and may link the two events. It has previously been difficult to determine whether packaging requires concomitant DNA synthesis due to the complexity of these processes and of the viral life cycle; however, we have recently described a model system which simplifies the study of HSV assembly. Cells infected with HSV strain tsProt.A accumulate unpackaged capsids at the nonpermissive temperature of 39°C. Following release of the temperature block, these capsids proceed to package viral DNA in a single, synchronous wave. Here we report that, when DNA replication was inhibited prior to release of the temperature block, DNA packaging and later events in viral assembly nevertheless occurred at near-normal levels. We conclude that, under our conditions, HSV DNA packaging does not require detectable levels of DNA synthesis. PMID:9525593
MAPI: a software framework for distributed biomedical applications
2013-01-01
Background The amount of web-based resources (databases, tools etc.) in biomedicine has increased, but the integrated usage of those resources is complex due to differences in access protocols and data formats. However, distributed data processing is becoming inevitable in several domains, in particular in biomedicine, where researchers face rapidly increasing data sizes. This big data is difficult to process locally because of the large processing, memory and storage capacity required. Results This manuscript describes a framework, called MAPI, which provides a uniform representation of resources available over the Internet, in particular for Web Services. The framework enhances their interoperability and collaborative use by enabling a uniform and remote access. The framework functionality is organized in modules that can be combined and configured in different ways to fulfil concrete development requirements. Conclusions The framework has been tested in the biomedical application domain where it has been a base for developing several clients that are able to integrate different web resources. The MAPI binaries and documentation are freely available at http://www.bitlab-es.com/mapi under the Creative Commons Attribution-No Derivative Works 2.5 Spain License. The MAPI source code is available by request (GPL v3 license). PMID:23311574
Gsflow-py: An integrated hydrologic model development tool
NASA Astrophysics Data System (ADS)
Gardner, M.; Niswonger, R. G.; Morton, C.; Henson, W.; Huntington, J. L.
2017-12-01
Integrated hydrologic modeling encompasses a vast number of processes and specifications, variable in time and space, and development of model datasets can be arduous. Model input construction techniques have not been formalized or made easily reproducible. Creating the input files for integrated hydrologic models (IHM) requires complex GIS processing of raster and vector datasets from various sources. Developing stream network topology that is consistent with the model resolution digital elevation model is important for robust simulation of surface water and groundwater exchanges. Distribution of meteorologic parameters over the model domain is difficult in complex terrain at the model resolution scale, but is necessary to drive realistic simulations. Historically, development of input data for IHM models has required extensive GIS and computer programming expertise which has restricted the use of IHMs to research groups with available financial, human, and technical resources. Here we present a series of Python scripts that provide a formalized technique for the parameterization and development of integrated hydrologic model inputs for GSFLOW. With some modifications, this process could be applied to any regular grid hydrologic model. This Python toolkit automates many of the necessary and laborious processes of parameterization, including stream network development and cascade routing, land coverages, and meteorological distribution over the model domain.
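As one small example of the kind of GIS step such scripts formalize, meteorological station data can be distributed over grid-cell centers by inverse-distance weighting (a generic sketch; not necessarily the interpolation scheme Gsflow-py implements):

```python
import numpy as np

def idw(station_xy, station_vals, grid_xy, power=2.0):
    """Inverse-distance-weighted interpolation of station data onto grid cells."""
    d = np.linalg.norm(grid_xy[:, None, :] - station_xy[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9) ** power      # guard against zero distance
    return (w * station_vals[None, :]).sum(axis=1) / w.sum(axis=1)

stations = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])   # station coordinates
precip = np.array([12.0, 7.0, 20.0])                          # observed values (mm)
cells = np.array([[2.0, 1.0], [7.0, 6.0]])                    # grid-cell centers
print(idw(stations, precip, cells))
```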
NASA Technical Reports Server (NTRS)
Langhoff, Stephen; Bauschlicher, Charles; Jaffe, Richard
1992-01-01
One of the primary goals of NASA's high-speed research program is to determine the feasibility of designing an environmentally safe commercial supersonic transport airplane. The largest environmental concern is focused on the amount of ozone destroying nitrogen oxides (NO(x)) that would be injected into the lower stratosphere during the cruise portion of the flight. The limitations placed on NO(x) emission require more than an order of magnitude reduction over current engine designs. To develop strategies to meet this goal requires first gaining a fundamental understanding of the combustion chemistry. To accurately model the combustor requires a computational fluid dynamics approach that includes both turbulence and chemistry. Since many of the important chemical processes in this regime involve highly reactive radicals, an experimental determination of the required thermodynamic data and rate constants is often very difficult. Unlike experimental approaches, theoretical methods are as applicable to highly reactive species as stable ones. Also our approximation of treating the dynamics classically becomes more accurate with increasing temperature. In this article we review recent progress in generating thermodynamic properties and rate constants that are required to understand NO(x) formation in the combustion process. We also describe our one-dimensional modeling efforts to validate an NH3 combustion reaction mechanism. We have been working in collaboration with researchers at LeRC, to ensure that our theoretical work is focused on the most important thermodynamic quantities and rate constants required in the chemical data base.
Addressing diversity and moving toward equity in hospital care.
Cordova, Richard D; Beaudin, Christy L; Iwanabe, Kelly E
2010-01-01
Healthcare disparities are a major challenge for hospital and healthcare system leadership. Leaders must possess vision, visibility, and ability to drive organizational change toward an environment that fosters diversity and cultural competence. As challenging economic conditions force management to make difficult budgetary decisions, the integration of equity into the organization's core mission and strategic process is essential for sustainability. Building organizational capacity requires systematic actions including workforce composition, training and development, and policy advocacy. This article offers perspectives on the current state of diversity in hospitals, exemplars from pediatric hospitals, and considerations for the future. Healthcare leaders are influential in shaping the future of the organization through strategic planning and resource allocation to those efforts that enhance services, programs, and processes that support a culture of diversity and equity.
RAVE: Rapid Visualization Environment
NASA Technical Reports Server (NTRS)
Klumpar, D. M.; Anderson, Kevin; Simoudis, Avangelos
1994-01-01
Visualization is used in the process of analyzing large, multidimensional data sets. However, the selection and creation of visualizations that are appropriate for the characteristics of a particular data set and the satisfaction of the analyst's goals is difficult. The process consists of three tasks that are performed iteratively: generate, test, and refine. The performance of these tasks requires the utilization of several types of domain knowledge that data analysts do not often have. Existing visualization systems and frameworks do not adequately support the performance of these tasks. In this paper we present the RApid Visualization Environment (RAVE), a knowledge-based system that interfaces with commercial visualization frameworks and assists a data analyst in quickly and easily generating, testing, and refining visualizations. RAVE was used for the visualization of in situ measurement data captured by spacecraft.
Sensitivity analysis of limit state functions for probability-based plastic design
NASA Technical Reports Server (NTRS)
Frangopol, D. M.
1984-01-01
The evaluation of the total probability of plastic collapse failure, P_f, for a highly redundant structure with random interdependent plastic moments acted on by random interdependent loads is a difficult and computationally very costly process. The evaluation of reasonable bounds on this probability requires the use of second-moment algebra, which involves many statistical parameters. A computer program that selects the best strategy for minimizing the interval between the upper and lower bounds of P_f is now in its final stage of development. The relative importance of the various uncertainties involved in the computational process on the resulting bounds of P_f is analyzed. Response sensitivities for both mode and system reliability of an ideal plastic portal frame are shown.
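For orientation, a single collapse mode with normal resistance R and load effect S reduces to the familiar first-order second-moment estimate sketched below (illustrative moments; the paper's contribution concerns bounds on P_f across many interdependent modes):

```python
import math

# First-order second-moment (FOSM) estimate for one plastic collapse mode:
# failure when load effect S exceeds plastic resistance R (both ~ normal).
mu_R, sigma_R = 500.0, 50.0    # resistance moments (illustrative units)
mu_S, sigma_S = 300.0, 60.0    # load-effect moments

beta = (mu_R - mu_S) / math.sqrt(sigma_R**2 + sigma_S**2)  # reliability index
p_f = 0.5 * math.erfc(beta / math.sqrt(2.0))               # Phi(-beta)
print(f"beta = {beta:.2f}, P_f = {p_f:.2e}")
```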
Replacement/Refurbishment of JSC/NASA POD Specimens
NASA Technical Reports Server (NTRS)
Castner, Willard L.
2010-01-01
The NASA Special NDE certification process requires demonstration of NDE capability by test per NASA-STD-5009. This test is performed with fatigue cracked specimens containing very small cracks. The certification test results are usually based on binomial statistics and must meet a 90/95 Probability of Detection (POD). The assumption is that fatigue cracks are tightly closed, difficult to detect, and inspectors and processes passing such a test are well qualified for inspecting NASA fracture critical hardware. The JSC NDE laboratory has what may be the largest inventory that exists of such fatigue cracked NDE demonstration specimens. These specimens were produced by the hundreds in the late 1980s and early 1990s. None have been produced since that time and the condition and usability of the specimens are questionable.
Environmental DNA for wildlife biology and biodiversity monitoring.
Bohmann, Kristine; Evans, Alice; Gilbert, M Thomas P; Carvalho, Gary R; Creer, Simon; Knapp, Michael; Yu, Douglas W; de Bruyn, Mark
2014-06-01
Extraction and identification of DNA from an environmental sample has proven noteworthy recently in detecting and monitoring not only common species, but also those that are endangered, invasive, or elusive. Particular attributes of so-called environmental DNA (eDNA) analysis render it a potent tool for elucidating mechanistic insights in ecological and evolutionary processes. Foremost among these is an improved ability to explore ecosystem-level processes, the generation of quantitative indices for analyses of species, community diversity, and dynamics, and novel opportunities through the use of time-serial samples and unprecedented sensitivity for detecting rare or difficult-to-sample taxa. Although technical challenges remain, here we examine the current frontiers of eDNA, outline key aspects requiring improvement, and suggest future developments and innovations for research. Copyright © 2014 Elsevier Ltd. All rights reserved.
Modeling of dislocation dynamics in germanium Czochralski growth
NASA Astrophysics Data System (ADS)
Artemyev, V. V.; Smirnov, A. D.; Kalaev, V. V.; Mamedov, V. M.; Sidko, A. P.; Podkopaev, O. I.; Kravtsova, E. D.; Shimansky, A. F.
2017-06-01
Obtaining very high-purity germanium crystals with low dislocation density is a difficult practical problem that requires knowledge and experience of growth processes. Dislocation density is one of the most important parameters defining the quality of a germanium crystal. In this paper, we present an experimental study of dislocation density during 4-in. germanium crystal growth by the Czochralski method, together with comprehensive unsteady modeling of the same growth processes, taking into account global heat transfer, melt flow, and the evolution of the melt/crystal interface shape. Thermal stresses in the crystal and their relaxation through the generation of dislocations, within the Alexander-Haasen model, were calculated simultaneously with the crystallization dynamics. Comparison between calculation and experiment showed reasonable agreement for the temperature, interface shape, and dislocation density in the crystal.
Salmikangas, Paula; Menezes-Ferreira, Margarida; Reischl, Ilona; Tsiftsoglou, Asterios; Kyselovic, Jan; Borg, John Joseph; Ruiz, Sol; Flory, Egbert; Trouvin, Jean-Hugues; Celis, Patrick; Ancans, Janis; Timon, Marcos; Pante, Guido; Sladowski, Dariusz; Lipnik-Stangelj, Metoda; Schneider, Christian K
2015-01-01
During the past decade, a large number of cell-based medicinal products have been tested in clinical trials for the treatment of various diseases and tissue defects. However, licensed products and those approaching marketing authorization are still few. One major area of challenge is the manufacturing and quality development of these complex products, for which significant manipulation of cells might be required. While the paradigms of quality, safety and efficacy must apply also to these innovative products, their demonstration may be demanding. Demonstration of comparability between production processes and batches may be difficult for cell-based medicinal products. Thus, the development should be built around a well-controlled manufacturing process and a qualified product to guarantee reproducible data from nonclinical and clinical studies.
Off-line real-time FTIR analysis of a process step in imipenem production
NASA Astrophysics Data System (ADS)
Boaz, Jhansi R.; Thomas, Scott M.; Meyerhoffer, Steven M.; Staskiewicz, Steven J.; Lynch, Joseph E.; Egan, Richard S.; Ellison, Dean K.
1992-08-01
We have developed an FT-IR method, using a Spectra-Tech Monit-IR 400 system, to monitor the completion of a reaction off-line in real time. The reaction is moisture-sensitive, and analysis by more conventional methods (normal-phase HPLC) is difficult to reproduce. The FT-IR method is based on the shift of a diazo band when a conjugated beta-diketone is transformed into a silyl enol ether during the reaction. The reaction mixture is examined directly by IR and does not require sample workup. Data acquisition time is less than one minute. The method has been validated for specificity, precision, and accuracy. The results obtained by the FT-IR method for known mixtures and in-process samples compare favorably with those from a normal-phase HPLC method.
NASA Astrophysics Data System (ADS)
Comǎneci, Radu Ioachim; Nedelcu, Dumitru; Bujoreanu, Leandru Gheorghe
2017-10-01
Equal channel angular pressing (ECAP) is a well-established method for grain refinement in metallic materials by large shear plastic deformation, and it is the most promising and effective severe plastic deformation (SPD) technique. ECAP is a discontinuous process: billet removal requires a further step in which the new sample pushes out the previous one. When the process is resumed, the head and tail ends of the workpiece, which become strongly distorted and receive different amounts of strain, have to be removed. Because of the difference in material flow paths between the upper and lower regions of the outlet channel, a non-uniform strain and stress distribution arises across the width of the workpiece leaving the plastic deformation zone (PDZ). A successful ECAP requires meeting two challenges: keeping the load at a level the tools can sustain, and achieving a favorable stress distribution so that the material withstands the accumulated strain of repeated deformation. Under back pressure (BP), materials have been shown to withstand more passes. As soon as the billet passes the PDZ along the bisector plane of the two channels, the compressive mean stress changes to tensile (leading to crack initiation), whereas in the presence of BP a negative (compressive) stress is applied during the process. In this paper, a comparative three-dimensional finite element analysis (FEA) is performed to evaluate the behavior of a difficult-to-work Al-Mg alloy as a function of tool geometry and process parameters. The results, in terms of load level and strain distribution, show the influence of punch geometry and BP on the material behavior.
Automated Generation of Technical Documentation and Provenance for Reproducible Research
NASA Astrophysics Data System (ADS)
Jolly, B.; Medyckyj-Scott, D.; Spiekermann, R.; Ausseil, A. G.
2017-12-01
Data provenance and detailed technical documentation are essential components of high-quality reproducible research; however, they are often only partially addressed during a research project. Recording and maintaining this information during the course of a project can be a difficult task to get right, as it is a time-consuming and often tedious process for the researchers involved. As a result, provenance records and technical documentation provided alongside research results can be incomplete or may not be completely consistent with the actual processes followed. While providing access to the data and code used by the original researchers goes some way toward enabling reproducibility, this does not count as, or replace, data provenance. Additionally, it can be a poor substitute for good technical documentation and is often more difficult for a third party to understand, particularly if they do not understand the programming language(s) used. We present and discuss a tool built from the ground up for the production of well-documented and reproducible spatial datasets that are created by applying a series of classification rules to a number of input layers. The internal model of the classification rules, which the tool requires in order to process the input data, is exploited to also produce technical documentation and provenance records with minimal additional user input. Available provenance records that accompany input datasets are incorporated into those that describe the current process. As a result, each time a new iteration of the analysis is performed, the documentation and provenance records are re-generated to provide an accurate description of the exact process followed. The generic nature of this tool, and the lessons learned during its creation, have wider application to other fields where the production of derivative datasets must be done in an open, defensible, and reproducible way.
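A minimal sketch of the pattern described, in which the same rule model drives both the classification and the generated provenance record (the data, rule names, and record structure are invented for illustration, not the tool's actual format):

```python
import hashlib
import json
from datetime import datetime, timezone
import numpy as np

slope = np.array([[2.0, 12.0], [25.0, 5.0]])             # input layer 1 (degrees)
rainfall = np.array([[900.0, 400.0], [800.0, 1200.0]])   # input layer 2 (mm/yr)

# Ordered rules: first match wins; the same objects drive both the
# classification and the generated documentation/provenance.
rules = [
    ("steep",    lambda s, r: s > 20),
    ("dry-flat", lambda s, r: (s <= 20) & (r < 500)),
    ("arable",   lambda s, r: (s <= 20) & (r >= 500)),
]

out = np.full(slope.shape, "", dtype=object)
for name, cond in rules:
    mask = cond(slope, rainfall) & (out == "")
    out[mask] = name

# Provenance is regenerated on every run from the rule model and input hashes.
provenance = {
    "generated": datetime.now(timezone.utc).isoformat(),
    "inputs": {k: hashlib.sha256(v.tobytes()).hexdigest()[:12]
               for k, v in {"slope": slope, "rainfall": rainfall}.items()},
    "rules": [name for name, _ in rules],
}
print(out)
print(json.dumps(provenance, indent=2))
```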
Translation of proteomic biomarkers into FDA approved cancer diagnostics: issues and challenges
2013-01-01
Tremendous efforts have been made over the past few decades to discover novel cancer biomarkers for use in clinical practice. However, a striking discrepancy exists between the effort directed toward biomarker discovery and the number of markers that make it into clinical practice. One of the confounding issues in translating a novel discovery into clinical practice is that quite often the scientists working on biomarker discovery have limited knowledge of the analytical, diagnostic, and regulatory requirements for a clinical assay. This review provides an introduction to such considerations with the aim of generating more extensive discussion for study design, assay performance, and regulatory approval in the process of translating new proteomic biomarkers from discovery into cancer diagnostics. We first describe the analytical requirements for a robust clinical biomarker assay, including concepts of precision, trueness, specificity and analytical interference, and carryover. We next introduce the clinical considerations of diagnostic accuracy, receiver operating characteristic analysis, positive and negative predictive values, and clinical utility. We finish the review by describing components of the FDA approval process for protein-based biomarkers, including classification of biomarker assays as medical devices, analytical and clinical performance requirements, and the approval process workflow. While we recognize that the road from biomarker discovery, validation, and regulatory approval to the translation into the clinical setting could be long and difficult, the reward for patients, clinicians and scientists could be rather significant. PMID:24088261
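The clinical-performance quantities introduced in the review follow directly from a validation study's confusion matrix and the intended-use prevalence; a short illustration with invented numbers:

```python
def diagnostic_metrics(tp, fp, tn, fn, prevalence):
    sens = tp / (tp + fn)                 # sensitivity (true positive rate)
    spec = tn / (tn + fp)                 # specificity (true negative rate)
    # Predictive values at a given prevalence, via Bayes' rule.
    ppv = sens * prevalence / (sens * prevalence + (1 - spec) * (1 - prevalence))
    npv = spec * (1 - prevalence) / (spec * (1 - prevalence) + (1 - sens) * prevalence)
    return sens, spec, ppv, npv

# Example: 90/100 diseased and 950/1000 healthy classified correctly,
# applied at a 2% population prevalence.
print(diagnostic_metrics(tp=90, fp=50, tn=950, fn=10, prevalence=0.02))
```

Note how a test that looks strong in a case-control study can still yield a modest positive predictive value at low prevalence, which is one reason the clinical-utility assessment differs from the analytical one.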
Recent developments in turning hardened steels - A review
NASA Astrophysics Data System (ADS)
Sivaraman, V.; Prakash, S.
2017-05-01
Hard materials in the range HRC 45-68, such as hardened AISI H13, AISI 4340, AISI 52100, D2 steel, and D3 steel, need superhard tool materials for machining. Turning of these hard materials is termed hard turning. Hard turning makes direct machining of hard materials possible, eliminates the lubricant requirement, and thus favors dry machining. Hard turning is a finish turning process, so conventional grinding is not required. The development of new, advanced superhard tool materials such as ceramic inserts, cubic boron nitride (CBN), and polycrystalline cubic boron nitride (PCBN) has enabled the turning of these materials. PVD and CVD coating methods have eased the production of single- and multi-layered coated tool inserts. Coatings of TiN, TiAlN, TiC, Al2O3, and AlCrN over cemented carbide inserts have led to the machining of difficult-to-machine materials. Advances in hard machining have paved the way for better surface finish, longer tool life, and reduced tool wear, cutting forces, and cutting temperatures. Micro- and nano-coated carbide inserts, nanocomposite-coated PCBN inserts, micro- and nano-CBN-coated carbide inserts, and similar developments have made the machining of hardened steels much easier and more economical. In this paper, a broad literature review of turning of hardened steels is presented, covering the optimization of process parameters, cooling requirements, and different tool materials.
Structural and temporal requirements for geomagnetic field reversal deduced from lava flows.
Singer, Brad S; Hoffman, Kenneth A; Coe, Robert S; Brown, Laurie L; Jicha, Brian R; Pringle, Malcolm S; Chauvin, Annick
2005-03-31
Reversals of the Earth's magnetic field reflect changes in the geodynamo--flow within the outer core--that generates the field. Constraining core processes or mantle properties that induce or modulate reversals requires knowing the timing and morphology of field changes that precede and accompany these reversals. But the short duration of transitional field states and fragmentary nature of even the best palaeomagnetic records make it difficult to provide a timeline for the reversal process. 40Ar/39Ar dating of lavas on Tahiti, long thought to record the primary part of the most recent 'Matuyama-Brunhes' reversal, gives an age of 795 +/- 7 kyr, indistinguishable from that of lavas in Chile and La Palma that record a transition in the Earth's magnetic field, but older than the accepted age for the reversal. Only the 'transitional' lavas on Maui and one from La Palma (dated at 776 +/- 2 kyr), agree with the astronomical age for the reversal. Here we propose that the older lavas record the onset of a geodynamo process, which only on occasion would result in polarity change. This initial instability, associated with the first of two decreases in field intensity, began approximately 18 kyr before the actual polarity switch. These data support the claim that complete reversals require a significant period for magnetic flux to escape from the solid inner core and sufficiently weaken its stabilizing effect.
Beyond Coordination: Joint Planning and Program Execution. The IHPRPT Materials Working Group
NASA Technical Reports Server (NTRS)
Stropki, Michael A.; Cleyrat, Danial A.; Clinton, Raymond G., Jr.; Rogacki, John R. (Technical Monitor)
2000-01-01
"Partnership is more than just coordination," stated then-Commander of the Air Force Research Laboratory (AFRL), Major General Dick Paul (USAF-Ret), at this year's National Space and Missile Materials Symposium. His comment referred to the example of the joint planning and program execution provided by the Integrated High Payoff Rocket Propulsion Technology (IHPRPT) Materials Working Group (IMWG). Most people agree that fiscal pressures imposed by shrinking budgets have made it extremely difficult to build upon our existing technical capabilities. In times of sufficient budgets, building advanced systems poses no major difficulties. However, with today's budgets, realizing enhanced capabilities and developing advanced systems often comes at an unaffordable cost. Overcoming this problem represents both a challenge and an opportunity to develop new business practices that allow us to develop advanced technologies within the restrictions imposed by current funding levels. Coordination of technology developments between different government agencies and organizations is a valuable tool for technology transfer. However, rarely do the newly developed technologies have direct applicability to other ongoing programs. Technology requirements are typically determined up-front during the program planning stage so that schedule risk can be minimized. The problem with this process is that the costs associated with the technology development are often borne by a single program. Additionally, the potential exists for duplication of technical effort. Changing this paradigm is a difficult process but one that can be extremely worthwhile should the right opportunity arise. The IMWG is one such example where NASA, the DoD, and industry have developed joint requirements that are intended to satisfy multiple program needs. More than mere coordination, the organizations comprising the group come together as partners, sharing information and resources, proceeding from a joint roadmap.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elhadj, S.; Steele, W. A.; VanBlarcom, D. S.
2017-03-07
Here, we investigate an approach for the recycling of laser-damaged large-aperture deuterated potassium dihydrogen phosphate (DKDP) crystals used for optical switching (KDP) and for frequency conversion (DKDP) in megajoule-class high-power laser systems. The approach consists of micromachining the surface laser damage sites (mitigation), combined with multiple soaks and ultrasonication steps in a coating solvent to remove, synergistically, both the highly adherent machining debris and the laser-damage-affected antireflection coating. We then identify features of the laser-damage-affected coating, such as the “solvent-persistent” coating and the “burned-in” coating, that are difficult to remove by conventional approaches without damaging the surface. We also provide a solution to the erosion problem identified in this work when colloidal coatings are processed during ultrasonication. Finally, we provide a proof of principle of the approach by testing the full process, which includes laser damage mitigation of DKDP test parts, coat stripping, reapplication of a new antireflective coat, and a laser damage test demonstrating performance up to at least 12 J/cm² at UV wavelengths, which is well above current requirements. Our approach ultimately provides a potential path to a scalable recycling loop for the management of optics in large, high-power laser systems that can reduce cost and extend the lifetime of highly valuable and difficult-to-grow large DKDP crystals.
Heat transfer phenomena during thermal processing of liquid particulate mixtures-A review.
Singh, Anubhav Pratap; Singh, Anika; Ramaswamy, Hosahalli S
2017-05-03
During the past few decades, the food industry has explored various novel thermal and non-thermal processing technologies to minimize the high quality losses associated with conventional thermal processing. Among these are novel agitation systems that permit forced convection in canned particulate fluids to improve heat transfer, reduce process time, and minimize heat damage to processed products. These include traditional rotary agitation systems involving end-over-end, axial, or biaxial rotation of cans and the more recent reciprocating (lateral) agitation. The invention of thermal processing systems with induced container agitation has made heat transfer studies more difficult, owing to problems in tracking particle temperatures during their dynamic motion and to complexities resulting from forced convection currents within the container. This has prompted active research on modeling and characterization of heat transfer phenomena in such systems. This review brings into perspective the current status of thermal processing of particulate foods, within the constraints of lethality requirements from a safety viewpoint, and discusses available techniques for data collection and heat transfer coefficient evaluation, as well as the critical processing parameters that affect these coefficients, especially under agitation processing conditions.
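For orientation, in the simplest (lumped-capacitance) limit a heat transfer coefficient can be fitted from a measured particle temperature history. The sketch below uses synthetic data and illustrative property values; real agitated particulate systems violate the lumped assumption, which is why the specialized evaluation techniques reviewed here are needed:

```python
import numpy as np

# Synthetic temperature history of a particle heating in a retort.
t = np.linspace(0.0, 600.0, 13)            # time (s)
T_inf, T0 = 121.0, 20.0                    # medium and initial temperatures (C)
m, c, A = 0.01, 3900.0, 0.002              # mass (kg), specific heat, area (m^2)
h_true = 250.0                             # W/m^2K, used only to make the data
T = T_inf + (T0 - T_inf) * np.exp(-h_true * A * t / (m * c))

# Lumped model: ln((T - T_inf)/(T0 - T_inf)) = -(h A / (m c)) t; fit the slope.
y = np.log((T - T_inf) / (T0 - T_inf))
slope = np.polyfit(t, y, 1)[0]
h_est = -slope * m * c / A
print(f"estimated h = {h_est:.0f} W/m^2K")
```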
Surface processing for bulk niobium superconducting radio frequency cavities
NASA Astrophysics Data System (ADS)
Kelly, M. P.; Reid, T.
2017-04-01
The majority of niobium cavities for superconducting particle accelerators continue to be fabricated from thin-walled (2-4 mm) polycrystalline niobium sheet and, as a final step, require material removal from the radio frequency (RF) surface in order to achieve the performance needed for practical accelerator devices. More recently, bulk niobium in the form of single- or large-grain slices cut from an ingot has become a viable alternative for some cavity types. In both cases the so-called damaged layer must be chemically etched or electrochemically polished away. The methods for doing this date back at least four decades; however, vigorous empirical studies on real cavities and more fundamental studies on niobium samples at laboratories worldwide have led to seemingly modest improvements that, when taken together, constitute a substantial advance in the reproducibility of surface processing techniques and overall cavity performance. This article reviews the development of niobium cavity surface processing and summarizes results of recent studies. We place some emphasis on practical details for real cavity processing systems which are difficult to find in the literature but are, nonetheless, crucial for achieving good and reproducible cavity performance. New approaches for bulk niobium surface treatment which aim to reduce cost or increase performance, including alternate chemical recipes, barrel polishing and ‘nitrogen doping’ of the RF surface, continue to be pursued and are closely linked to the requirements for surface processing.
Evaluation Of Sludge Heel Dissolution Efficiency With Oxalic Acid Cleaning At Savannah River Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sudduth, Christie; Vitali, Jason; Keefer, Mark
The chemical cleaning process baseline strategy at the Savannah River Site was revised to improve efficiency during future execution of the process, based on lessons learned during previous bulk oxalic acid cleaning activities, and to account for operational constraints imposed by safety basis requirements. These improvements were also intended to overcome the difficulties that arise from waste removal in sludge tanks with higher rheological yield stress. The improved strategy was implemented in Tank 12, where bulk oxalic acid cleaning efforts concluded in July 2013. The Tank 12 radiological removal results were similar to those of previous bulk oxalic acid cleaning campaigns, despite the fact that Tank 12 contained higher rheological yield stress sludge that would make removal more difficult than the sludge treated in previous campaigns. No appreciable oxalate precipitation occurred during the cleaning process in Tank 12 compared to previous campaigns, which aided in the net volume reduction of 75-80%. Overall, the controls established for Tank 12 provide a template for an improved cleaning process.
Intelligence related upper alpha desynchronization in a semantic memory task.
Doppelmayr, M; Klimesch, W; Hödlmoser, K; Sauseng, P; Gruber, W
2005-07-30
Recent evidence shows that event-related (upper) alpha desynchronization (ERD) is related to cognitive performance. Several studies have observed a positive relationship, some a negative one. The latter finding, interpreted in terms of the neural efficiency hypothesis, suggests that good performance is associated with a more 'efficient', smaller extent of cortical activation. Other studies found that ERD increases with semantic processing demands and that this increase is larger for good performers. Studies supporting the neural efficiency hypothesis used tasks that do not specifically require semantic processing. Thus, we assume that the lack of semantic processing demands may at least in part be responsible for the reduced ERD. In the present study we measured ERD during a difficult verbal-semantic task. The findings demonstrate that during semantic processing, more intelligent (as compared with less intelligent) subjects exhibited a significantly larger upper alpha ERD over the left hemisphere. We conclude that more intelligent subjects exhibit a more extensive activation in a semantic processing system, and suggest that divergent findings regarding the neural efficiency hypothesis are due to task-specific differences in semantic processing demands.
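Upper alpha ERD is conventionally quantified as the percentage change in band power between a pre-stimulus reference interval and the task interval, with negative values indicating desynchronization. A minimal sketch, assuming a fixed 10-12 Hz band (such studies typically use individually adjusted bands) and synthetic signals in place of real EEG epochs:

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, lo, hi):
    f, pxx = welch(x, fs=fs, nperseg=fs)          # 1-s segments
    sel = (f >= lo) & (f <= hi)
    return np.trapz(pxx[sel], f[sel])

def erd_percent(ref, act, fs, lo=10.0, hi=12.0):
    """Band-power change from reference to activation interval;
    negative values = event-related desynchronization (ERD)."""
    r, a = band_power(ref, fs, lo, hi), band_power(act, fs, lo, hi)
    return 100.0 * (a - r) / r

fs = 250
rng = np.random.default_rng(0)
ref = rng.standard_normal(fs * 4)    # stand-in pre-stimulus epoch
act = rng.standard_normal(fs * 4)    # stand-in task epoch
print(f"upper-alpha ERD = {erd_percent(ref, act, fs):.1f} %")
```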
An Example of Economic Value in Rapid Prototyping
NASA Technical Reports Server (NTRS)
Hauer, R. L.; Braunscheidel, E. P.
2001-01-01
Today's machining projects increasingly involve complicated and intricate structures, due in part to the ability to computer-model complex surfaces and forms. The cost of producing these forms can be extremely high, not only in dollars but in time to complete, and changes are even more difficult to incorporate. The subject blade is an excellent example: its complex form would have required hundreds of hours of fabrication for just a simple prototype. The procurement would have taken roughly six weeks to complete, and the actual fabrication an equal amount of time. An alternative would have been a wood model; although cheaper than a metal fabrication, it would be extremely time intensive and would require roughly a month to produce in-house.
Tsukasaki, Wakako; Maruyama, Jun-Ichi; Kitamoto, Katsuhiko
2014-01-01
Hyphal fusion is involved in the formation of an interconnected colony in filamentous fungi, and it is the first step in sexual/parasexual reproduction. However, hyphal fusion efficiency has been difficult to evaluate in Aspergillus oryzae, despite the industrial significance of this species, because fusion occurs at low frequency. Here, we established a method to quantitatively evaluate the hyphal fusion ability of A. oryzae using mixed culture of two different auxotrophic strains, in which the ratio of heterokaryotic conidia growing without the auxotrophic requirements reflects the hyphal fusion efficiency. Using this method, we demonstrated that AoSO and AoFus3 are required for hyphal fusion, and that the hyphal fusion efficiency of A. oryzae was increased by depleting the nitrogen source, supplying large amounts of carbon source, and adjusting the pH to 7.0.
Key ingredients needed when building large data processing systems for scientists
NASA Technical Reports Server (NTRS)
Miller, K. C.
2002-01-01
Why is building a large science software system so painful? Weren't teams of software engineers supposed to make life easier for scientists? Does it sometimes feel as if it would be easier to write the million lines of code in Fortran 77 yourself? The cause of this dissatisfaction is that many of the needs of the science customer remain hidden in discussions with software engineers until after a system has already been built. In fact, many of the hidden needs of the science customer conflict with stated needs and are therefore very difficult to meet unless they are addressed from the outset in a system's architectural requirements. What's missing is the consideration of a small set of key software properties in initial agreements about the requirements, the design and the cost of the system.
Lab-on-a-chip based total-phosphorus analysis device utilizing a photocatalytic reaction
NASA Astrophysics Data System (ADS)
Jung, Dong Geon; Jung, Daewoong; Kong, Seong Ho
2018-02-01
A lab-on-a-chip (LOC) device for total phosphorus (TP) analysis was fabricated for water quality monitoring. Many commercially available TP analysis systems used to estimate water quality have good sensitivity and accuracy. However, these systems also have many disadvantages, such as bulky size, complex pretreatment processes, and high cost, which limit their application. In particular, conventional TP analysis systems require an indispensable pretreatment step, in which the fluidic analyte is heated to 120 °C for 30 min to release the dissolved phosphate, because many phosphates are soluble in water at standard temperature and pressure. In addition, this pretreatment process requires elevated pressures of up to 1.1 kg cm⁻² in order to prevent the evaporation of the heated analyte. Because of these limiting conditions required by the pretreatment processes used in conventional systems, it is difficult to miniaturize TP analysis systems. In this study, we employed a photocatalytic reaction in the pretreatment process. The reaction was carried out by illuminating a photocatalytic titanium dioxide (TiO2) surface formed in a microfluidic channel with ultraviolet (UV) light. This pretreatment process does not require elevated temperatures and pressures. By applying this simplified, photocatalytic-reaction-based pretreatment process to a TP analysis system, greater degrees of freedom are conferred to the design and fabrication of LOC devices for TP monitoring. The fabricated LOC device presented in this paper was characterized by measuring the TP concentration of an unknown sample and comparing the results with those measured by a conventional TP analysis system. The TP concentrations of the unknown sample measured by the proposed LOC device and the conventional TP analysis system were 0.018 mgP/25 mL and 0.019 mgP/25 mL, respectively. The experimental results revealed that the proposed LOC device had a performance comparable to that of the conventional bulky TP analysis system. Therefore, our device could be directly employed in water quality monitoring as an alternative to conventional TP analysis systems.
Bowie, Dennis M.
1991-01-01
The difficult asthmatic patient should first be managed by confirming the diagnosis and eliminating any aggravating environmental or occupational factors, including medication use. Proper treatment requires rational addition of drugs in a logical sequence. It is most important to ensure proper inhaler technique, patient compliance, effective doctor-patient communication, and proper patient monitoring. PMID:21229079
Equivalence of Students' Scores on Timed and Untimed Anatomy Practical Examinations
ERIC Educational Resources Information Center
Zhang, Guiyun; Fenderson, Bruce A.; Schmidt, Richard R.; Veloski, J. Jon
2013-01-01
Untimed examinations are popular with students because there is a perception that first impressions may be incorrect, and that difficult questions require more time for reflection. In this report, we tested the hypothesis that timed anatomy practical examinations are inherently more difficult than untimed examinations. Students in the Doctor of…
Advanced Simulation & Computing FY15 Implementation Plan Volume 2, Rev. 0.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, Michel; Archer, Bill; Matzen, M. Keith
2014-09-16
The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. As the program approaches the end of its second decade, ASC is intently focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), to quantify critical margins and uncertainties, and to resolve increasingly difficult analyses needed for the SSP. Where possible, the program also enables the use of high-performance simulation and computing tools to address broader national security needs, such as foreign nuclear weapon assessments and counternuclear terrorism.
Resource Management Scheme Based on Ubiquitous Data Analysis
Lee, Heung Ki; Jung, Jaehee
2014-01-01
Resource management of the main memory and process handler is critical to enhancing the performance of a web server. Owing to the transaction delay affecting incoming requests from web clients, web server systems maintain several pregenerated web processes in anticipation of future requests. This can decrease page generation time, because enough processes are available to handle the incoming requests from web browsers. However, inefficient process management results in low service quality for the web server system. Proper pregenerated-process mechanisms are required for dealing with clients' requests, yet it is difficult to predict how many requests a web server system is going to receive. If a web server system builds too many web processes, it wastes a considerable amount of memory, and performance is reduced. We propose an adaptive web process manager scheme based on the analysis of web log mining. In the proposed scheme, the number of web processes is controlled through prediction of incoming requests, so that the web process management scheme consumes the fewest possible web transaction resources. In experiments, real web trace data were used to prove the improved performance of the proposed scheme. PMID:25197692
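A minimal sketch of the underlying idea, sizing a pregenerated process pool from a forecast of incoming requests; the smoothing constant, per-worker capacity, and pool bounds below are illustrative assumptions, not the paper's log-mining model:

```python
class AdaptivePoolSizer:
    """Forecast the next interval's request count with exponential
    smoothing and size the pregenerated worker pool accordingly."""
    def __init__(self, alpha=0.3, per_worker=50, lo=4, hi=256):
        self.alpha, self.per_worker = alpha, per_worker
        self.lo, self.hi, self.level = lo, hi, None

    def observe(self, requests_in_interval):
        # Update the smoothed demand level from the latest access-log count.
        if self.level is None:
            self.level = float(requests_in_interval)
        else:
            self.level = (self.alpha * requests_in_interval
                          + (1 - self.alpha) * self.level)

    def workers_needed(self):
        if self.level is None:
            return self.lo
        return max(self.lo, min(self.hi, int(self.level / self.per_worker) + 1))

sizer = AdaptivePoolSizer()
for count in [120, 300, 800, 650]:            # hypothetical requests per minute
    sizer.observe(count)
    print(count, "->", sizer.workers_needed(), "workers")
```

Keeping the pool close to the forecast rather than at a fixed maximum is what saves memory while still absorbing bursts.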
Honkoop, Persijn J; Pinnock, Hilary; Kievits-Smeets, Regien M M; Sterk, Peter J; Dekhuijzen, P N Richard; In 't Veen, Johannes C C M
2017-02-09
Patients with difficult-to-manage asthma represent a heterogeneous subgroup of asthma patients who require extensive assessment and tailored management. The International Primary Care Respiratory Group approach emphasises the importance of differentiating patients with asthma that is difficult to manage from those with severe disease. Local adaptation of this approach, however, is required to ensure an appropriate strategy for implementation in the Dutch context. We used a modified three-round e-Delphi approach to assess the opinion of all relevant stakeholders (general practitioners, pulmonologists, practice nurses, pulmonary nurses and people with asthma). In the first round, the participants were asked to provide potentially relevant items for a difficult-to-manage asthma programme, which resulted in 67 items. In the second round, we asked participants to rate the relevance of specific items on a seven-point Likert scale, and 46 items were selected as relevant. In the third round, the selected items were categorised and items were ranked within the categories according to relevance. Finally, we created the alphabet acronym for the categories 'the A-I of difficult-to-manage asthma' to resonate with an established Dutch 'A-E acronym for determining asthma control'. This should facilitate implementation of this programme within the existing structure of educational material on asthma and chronic obstructive pulmonary disease (COPD) in primary care, with potential for improving management of difficult-to-manage asthma. Other countries could use a similar approach to create a locally adapted version of such a programme.
Combining fluorescence imaging with Hi-C to study 3D genome architecture of the same single cell.
Lando, David; Basu, Srinjan; Stevens, Tim J; Riddell, Andy; Wohlfahrt, Kai J; Cao, Yang; Boucher, Wayne; Leeb, Martin; Atkinson, Liam P; Lee, Steven F; Hendrich, Brian; Klenerman, Dave; Laue, Ernest D
2018-05-01
Fluorescence imaging and chromosome conformation capture assays such as Hi-C are key tools for studying genome organization. Traditionally, however, they have been carried out independently, making integration of the two types of data difficult. By trapping individual cell nuclei inside a well of a 384-well glass-bottom plate with an agarose pad, we have established a protocol that allows both fluorescence imaging and Hi-C processing to be carried out on the same single cell. The protocol identifies 30,000-100,000 chromosome contacts per single haploid genome in parallel with fluorescence images. Contacts can be used to calculate intact genome structures to better than 100-kb resolution, which can then be directly compared with the images. Preparation of 20 single-cell Hi-C libraries using this protocol takes 5 d of bench work by researchers experienced in molecular biology techniques. Image acquisition and analysis require a basic understanding of fluorescence microscopy, and some bioinformatics knowledge is required to run the sequence-processing tools described here.
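As an illustration of the sequence-processing end of the protocol, a minimal sketch of binning intra-chromosomal contact pairs into a matrix at 100-kb resolution; the coordinates and chromosome length are hypothetical:

```python
import numpy as np

def contact_matrix(contacts, chrom_len_bp, bin_size=100_000):
    """Bin intra-chromosomal contact pairs (bp, bp) into a symmetric
    single-cell contact matrix at the given resolution."""
    n = chrom_len_bp // bin_size + 1
    m = np.zeros((n, n), dtype=np.int32)
    for a, b in contacts:
        i, j = a // bin_size, b // bin_size
        m[i, j] += 1
        if i != j:
            m[j, i] += 1          # keep the matrix symmetric
    return m

pairs = [(120_000, 480_000), (1_900_000, 2_050_000), (140_000, 460_000)]
print(contact_matrix(pairs, 5_000_000).shape)   # (51, 51)
```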
A Capacity Forecast Model for Volatile Data in Maintenance Logistics
NASA Astrophysics Data System (ADS)
Berkholz, Daniel
2009-05-01
Maintenance, repair and overhaul (MRO) processes are elaborate and complex. Rising demands on these after-sales services require reliable production planning and control methods, particularly for maintaining valuable capital goods. Downtimes lead to high costs, and an inability to meet delivery due dates results in severe contract penalties. Predicting the capacities required for maintenance orders in advance is often difficult, because part conditions remain unknown until the goods are actually inspected. This planning uncertainty results in extensive capital tie-up through rising stock levels within the whole MRO network. The article outlines an approach to planning capacities when the maintenance data to be forecast are volatile. It focuses on developing the prerequisites for a reliable capacity planning model, which enables a quick response to maintenance orders by employing appropriate measures. The information gained through the model is then systematically applied to forecast both personnel capacities and the demand for spare parts. The improved planning reliability can support MRO service providers in shortening delivery times and reducing stock levels in order to enhance the performance of their maintenance logistics.
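One concrete prerequisite of such a model is a capacity buffer scaled to workload volatility. A minimal sketch, assuming a normal approximation and a 97.5% service level (both illustrative choices, not the article's method):

```python
from statistics import mean, stdev

Z_97_5 = 1.96   # standard-normal quantile for a 97.5% service level

def capacity_with_buffer(weekly_hours, z=Z_97_5):
    """Plan capacity as the mean forecast workload plus a safety buffer
    proportional to the historical volatility of that workload."""
    return mean(weekly_hours) + z * stdev(weekly_hours)

history = [410, 520, 380, 660, 495, 430, 705, 390]   # hypothetical shop hours/week
print(f"planned capacity: {capacity_with_buffer(history):.0f} h/week")
```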
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia-Montero, Luis G., E-mail: luisgonzaga.garcia@upm.e; Lopez, Elena, E-mail: elopez@caminos.upm.e; Monzon, Andres, E-mail: amonzon@caminos.upm.e
Most Strategic Environmental Assessment (SEA) research has been concerned with SEA as a procedure, and there have been relatively few developments and tests of analytical methodologies. The first stage of the SEA is the 'screening', which is the process whereby a decision is taken on whether or not SEA is required for a particular programme or plan. The effectiveness of screening and SEA procedures will depend on how well the assessment fits into the planning from the early stages of the decision-making process. However, it is difficult to prepare the environmental screening for an infrastructure plan involving a whole country. To be useful, such methodologies must be fast and simple. We have developed two screening tools which would make it possible to estimate promptly the overall impact an infrastructure plan might have on biodiversity and global warming for a whole country, in order to generate planning alternatives, and to determine whether or not SEA is required for a particular infrastructure plan.
NASA Astrophysics Data System (ADS)
Fetita, Catalin; Tarando, Sebastian; Brillet, Pierre-Yves; Grenier, Philippe A.
2016-03-01
Correct segmentation and labeling of the lungs in thorax MSCT is a requirement in pulmonary/respiratory disease analysis, as a basis for further processing or for direct quantitative measures: lung texture classification, respiratory functional simulations, intrapulmonary vascular remodeling evaluation, and detection of pleural effusion or subpleural opacities are only a few of the clinical applications related to this requirement. Whereas lung segmentation appears trivial under normal anatomo-pathological conditions, the presence of disease may complicate this task for fully automated algorithms. The challenges come either from regional changes in lung texture opacity or from complex anatomic configurations (e.g., a thin septum between the lungs that hinders proper lung separation). They can make the use of classic algorithms based on adaptive thresholding, 3-D connected component analysis, and shape regularization difficult or even impossible. The objective of this work is to provide a robust segmentation approach for the pulmonary field, with individualized labeling of the lungs, able to overcome the mentioned limitations. The proposed approach relies on 3-D mathematical morphology and exploits the concept of controlled relief flooding (to identify contrasted lung areas) together with patient-specific shape properties for peripheral dense tissue detection. Tested on a database of 40 MSCT scans of pathological lungs, the proposed approach showed correct identification of lung areas, with high sensitivity and specificity in locating peripheral dense opacities.
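The paper's pipeline itself is not reproduced here; the sketch below only illustrates generic ingredients it builds on (grayscale reconstruction as a stand-in for controlled flooding, air thresholding, and largest-component selection), with all parameter values assumed:

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.morphology import reconstruction

def lung_candidates(ct_slice_hu, air_thresh=-400, h=150):
    """Suppress shallow minima by grayscale reconstruction (h-minima style),
    threshold air-like voxels, then keep the two largest components as
    left/right lung candidates. Illustrative baseline only."""
    img = ct_slice_hu.astype(float)
    flooded = reconstruction(img + h, img, method='erosion')  # fill pits < h deep
    mask = flooded < air_thresh
    labels, n = ndi.label(mask)
    if n < 2:
        return mask
    sizes = ndi.sum(mask, labels, index=range(1, n + 1))
    keep = np.argsort(sizes)[-2:] + 1            # two largest labels
    return np.isin(labels, keep)
```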
On the difficulty to delimit disease risk hot spots
NASA Astrophysics Data System (ADS)
Charras-Garrido, M.; Azizi, L.; Forbes, F.; Doyle, S.; Peyrard, N.; Abrial, D.
2013-06-01
Representing the health state of a region is a helpful way to highlight spatial heterogeneity and localize high-risk areas. For ease of interpretation, and to determine where to apply control procedures, we need to clearly identify and delineate regions that are homogeneous in terms of disease risk, in particular disease risk hot spots. However, even if practical purposes require the delineation of distinct risk classes, such a classification does not correspond to a physical reality and is thus difficult to estimate. Working with grouped data, a first natural choice is to apply disease mapping models. We apply a standard disease mapping model, which produces continuous risk estimates and therefore requires a post-processing classification step to obtain clearly delimited risk zones. We also apply a risk partition model that builds a classification of the risk levels in a one-step procedure. Working with point data, we focus on the scan statistic clustering method. We illustrate the article with a real example concerning bovine spongiform encephalopathy (BSE), an animal disease whose at-risk zones are well known to epidemiologists. We show that, in this difficult case of a rare disease and a very heterogeneous population, the different methods provide risk zones that are globally coherent. But, reflecting the dichotomy between the need and the reality, the exact delimitations of the risk zones, as well as the corresponding estimated risks, are quite different.
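For the grouped-data route, the post-processing classification step can be as simple as cutting the continuous risk surface at quantiles. A minimal sketch (three classes and the quantile rule are illustrative choices, not the models compared in the paper):

```python
import numpy as np

def risk_classes(relative_risks, k=3):
    """Post-classify continuous disease-mapping risk estimates into k
    ordered classes by quantile cut-offs; the top class delimits the
    candidate hot spots."""
    rr = np.asarray(relative_risks, float)
    edges = np.quantile(rr, np.linspace(0, 1, k + 1)[1:-1])
    return np.digitize(rr, edges)     # 0 = low risk ... k-1 = hot spot

rr = [0.4, 0.9, 1.1, 1.0, 2.6, 3.1, 0.7, 1.8]
print(risk_classes(rr))               # [0 0 1 1 2 2 0 2]
```

The one-step partition model avoids exactly this arbitrariness in the choice of cut-offs.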
Control of a Supernumerary Robotic Hand by Foot: An Experimental Study in Virtual Reality
Abdi, Elahe; Burdet, Etienne; Bouri, Mohamed; Bleuler, Hannes
2015-01-01
In the operational theater, the surgical team could benefit greatly from a robotic supplementary hand under the surgeon's full control. The surgeon might thus become more autonomous; this could reduce communication errors with the assistants and allow the robot to take over difficult tasks such as holding tools without tremor. In this paper, we therefore examine the possibility of controlling a third robotic hand with one foot's movements. Three experiments in virtual reality were designed to assess the feasibility of this control strategy, the learning curve of the subjects in different tasks, and the coordination of foot movements with the two natural hands. Results show that the limbs are moved simultaneously, in parallel rather than serially. Participants' performance improved within a few minutes of practice, without any specific difficulty in completing the tasks. Subjective assessment by the subjects indicated that controlling a third hand by foot was easy and required only negligible physical and mental effort. The sense of ownership was reported to improve through the experiments. The mental burden was not directly related to the level of motion required by a task, but depended on the type of activity and practice. The most difficult task was moving the two hands and the foot in opposite directions. These results suggest that a combination of practice and appropriate tasks can enhance the learning process for controlling a robotic hand by foot. PMID:26225938
Automatic Fault Characterization via Abnormality-Enhanced Classification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bronevetsky, G; Laguna, I; de Supinski, B R
Enterprise and high-performance computing systems are growing extremely large and complex, employing hundreds to hundreds of thousands of processors and software/hardware stacks built by many people across many organizations. As the growing scale of these machines increases the frequency of faults, system complexity makes these faults difficult to detect and to diagnose. Current system management techniques, which focus primarily on efficient data access and query mechanisms, require system administrators to examine the behavior of various system services manually. Growing system complexity is making this manual process unmanageable: administrators require more effective management tools that can detect faults and help to identify their root causes. System administrators need timely notification when a fault is manifested that includes the type of fault, the time period in which it occurred and the processor on which it originated. Statistical modeling approaches can accurately characterize system behavior. However, the complex effects of system faults make these tools difficult to apply effectively. This paper investigates the application of classification and clustering algorithms to fault detection and characterization. We show experimentally that naively applying these methods achieves poor accuracy. Further, we design novel techniques that combine classification algorithms with information on the abnormality of application behavior to improve detection and characterization accuracy. Our experiments demonstrate that these techniques can detect and characterize faults with 65% accuracy, compared to just 5% accuracy for naive approaches.
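In the spirit of that combination, a minimal sketch of augmenting per-process metrics with abnormality scores (absolute z-scores against a fleet-wide baseline) before training a classifier; the feature layout, classifier choice, and data are all illustrative assumptions:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def abnormality_features(X, mu, sigma):
    """Append |z|-scores relative to the baseline so the classifier sees
    both raw metrics and how abnormal each metric is."""
    z = np.abs((X - mu) / sigma)
    return np.hstack([X, z])

rng = np.random.default_rng(1)
X_train = rng.normal(size=(200, 8))        # stand-in per-process metric windows
y_train = rng.integers(0, 3, size=200)     # stand-in fault-class labels
mu = X_train.mean(axis=0)
sigma = X_train.std(axis=0) + 1e-9         # avoid division by zero

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(abnormality_features(X_train, mu, sigma), y_train)
```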
NASA Technical Reports Server (NTRS)
Chakrabarti, S.; Schmidt, G. R.; Thio, Y. C.; Hurst, C. M.
1999-01-01
Rapid transportation of human crews to destinations throughout the solar system will require propulsion systems having not only very high exhaust velocities (i.e., Isp ≥ 10^4 to 10^5 s) but also extremely low mass-power ratios (i.e., α ≤ 10^-2 kg/kW). These criteria are difficult to meet with electric propulsion and other power-limited systems, but may be achievable with propulsion concepts that use onboard power to produce a net gain in energy via fusion or some other nuclear process. This paper compares the fundamental performance of these gain-limited systems with that of power-limited systems, and determines from a generic power balance the gains required for ambitious planetary missions ranging up to 100 AU. Results show that energy gain reduces the required effective mass-power ratio of the system, thus enabling shorter trip times than those of power-limited concepts.
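To make the two figures of merit concrete, the standard textbook relations below connect them (a hedged sketch, not the paper's power balance):

```latex
% Standard relations, with g_0 = 9.81 m/s^2; numbers follow from the
% ranges quoted in the abstract.
\begin{align*}
  v_e &= g_0 \, I_{sp} \approx 10^{5}\text{--}10^{6}\ \mathrm{m/s}
      \quad (I_{sp} = 10^{4}\text{--}10^{5}\ \mathrm{s}),\\
  P_{\mathrm{jet}} &= \tfrac{1}{2}\,\dot{m}\,v_e^{2} = \tfrac{1}{2}\,F\,v_e
      \;\Rightarrow\; P/F = v_e/2 \approx 50\text{--}500\ \mathrm{kW/N},\\
  m_{\mathrm{pp}} &= \alpha P
      \;\Rightarrow\; \alpha \le 10^{-2}\ \mathrm{kg/kW}
      \ \text{keeps a 1-GW powerplant below}\ 10^{4}\ \mathrm{kg}.
\end{align*}
```

At tens to hundreds of kilowatts of jet power per newton of thrust, onboard energy gain is what closes the mass budget that power-limited systems cannot.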
Issues in Quantitative Analysis of Ultraviolet Imager (UVI) Data: Airglow
NASA Technical Reports Server (NTRS)
Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.
1999-01-01
The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.
Design and Verification of a Distributed Communication Protocol
NASA Technical Reports Server (NTRS)
Munoz, Cesar A.; Goodloe, Alwyn E.
2009-01-01
The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.
Courage and Compassion: Virtues in Caring for So-Called "Difficult" Patients.
Hawking, Michael; Curlin, Farr A; Yoon, John D
2017-04-01
What, if anything, can medical ethics offer to assist in the care of the "difficult" patient? We begin with a discussion of virtue theory and its application to medical ethics. We conceptualize the "difficult" patient as an example of a "moral stress test" that especially challenges the physician's character, requiring the good physician to display the virtues of courage and compassion. We then consider two clinical vignettes to flesh out how these virtues might come into play in the care of "difficult" patients, and we conclude with a brief proposal for how medical educators might cultivate these essential character traits in physicians-in-training. © 2017 American Medical Association. All Rights Reserved.
Tannier, C; Crozier, S; Zuber, M; Constantinides, Y; Delezie, E; Gisquet, E; Grignoli, N; Lamy, C; Louvet, F; Pinel, J-F
2015-02-01
In the majority of cases, severe stroke is accompanied by difficulty in swallowing and an altered state of consciousness requiring artificial nutrition and hydration. Because of their artificial nature, nutrition and hydration are considered by law as treatment rather than basic care. Withdrawal of these treatments is dictated by the refusal of unreasonable obstinacy enshrined in law and is justified by the risk of severe disability and very poor quality of life. It is usually the last of several withholding and withdrawal decisions made during the long course of the disease. Reaching a collegial consensus on a controversial decision such as artificial nutrition and hydration withdrawal is a difficult and complex process. The reluctance to make such decisions is mainly due to the symbolic value of food and hydration, to the fear of "dying badly" while suffering from hunger and thirst, and to the difficult distinction between this medical act and euthanasia. The only way to overcome such reluctance is to ensure flawless accompaniment, associating sedation and appropriate comfort care with a clear explanation (to relatives but also caregivers) of the rationale and implications of this type of decision. All teams dealing with this type of situation must have thoroughly thought through the medical, legal and ethical considerations involved in making this difficult decision. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
Apparatus for electroplating particles of small dimension
Yu, Conrad M.; Illige, John D.
1982-01-01
The thickness, uniformity, and surface smoothness requirements for coatings on glass microspheres used as targets for laser fusion research are critical. Because of their minute size, the microspheres are difficult to manipulate and control in electroplating systems. The electroplating apparatus (10) of the present invention addresses these problems by providing a cathode cell (20) having a cell chamber (22), a cathode (23) and an anode (26) electrically isolated from each other and connected to an electrical power source (24). During the plating process, the cathode (23) is controllably vibrated, in combination with solution pulsing, to maintain the particles in random free motion and so attain the desired properties.
Listeriosis in Human Pregnancy: a systematic review
Lamont, Ronald F.; Sobel, Jack; Mazaki-Tovi, Shali; Kusanovic, Juan Pedro; Vaisbuch, Edi; Kim, Sun Kwon; Uldbjerg, Niels; Romero, Roberto
2013-01-01
Listeria is commonly found in processed and prepared foods, and listeriosis is associated with high morbidity and mortality. Preventative measures are well prescribed, and monitoring and voluntary recall of contaminated products have resulted in a 44% reduction in the prevalence of perinatal listeriosis in the USA. Pregnant women are at high risk for listeriosis, but symptoms are non-specific and diagnosis is difficult. The intracellular life-cycle of Listeria protects the bacterium from host innate and adaptive immune responses. Antibiotic treatment requires agents able to penetrate, distribute, and remain stable within host cells. Prolonged use of high-dose ampicillin can significantly improve neonatal outcome. PMID:21517700
Exploiting replication in distributed systems
NASA Technical Reports Server (NTRS)
Birman, Kenneth P.; Joseph, T. A.
1989-01-01
Techniques are examined for replicating data and execution in directly distributed systems: systems in which multiple processes interact directly with one another while continuously respecting constraints on their joint behavior. Directly distributed systems are often required to solve difficult problems, ranging from management of replicated data to dynamic reconfiguration in response to failures. It is shown that these problems reduce to more primitive, order-based consistency problems, which can be solved using primitives such as the reliable broadcast protocols. Moreover, given a system that implements reliable broadcast primitives, a flexible set of high-level tools can be provided for building a wide variety of directly distributed application programs.
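As a toy illustration of one such order-based primitive, the sketch below stamps every broadcast with a global sequence number through a fixed sequencer, so all replicas apply updates in the same order. It is a deliberately simplified stand-in for the reliable broadcast protocols discussed in the paper and assumes reliable FIFO links and no failures:

```python
import queue
import threading

class Sequencer:
    """Fixed-sequencer total-order broadcast (toy version): a single
    lock-protected counter defines one global delivery order."""
    def __init__(self, members):
        self.members = members
        self.seq = 0
        self.lock = threading.Lock()

    def broadcast(self, msg):
        with self.lock:
            self.seq += 1
            stamped = (self.seq, msg)
        for m in self.members:          # assume reliable FIFO links
            m.put(stamped)

replicas = [queue.Queue() for _ in range(3)]
seq = Sequencer(replicas)
seq.broadcast("x := 1")
seq.broadcast("x := 2")
print([r.get() for r in replicas])      # every replica delivers (1, 'x := 1') first
```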
NASA Astrophysics Data System (ADS)
Baillard, C.; Dissard, O.; Jamet, O.; Maître, H.
Above-ground analysis is a key step in the reconstruction of urban scenes, but it is a difficult task because of the diversity of the objects involved. We propose a new method for above-ground extraction from an aerial stereo pair that does not require any assumption about object shape or nature. A Digital Surface Model is first produced by a stereoscopic matching stage that preserves discontinuities, and is then processed by a region-based Markovian classification algorithm. The extracted above-ground areas are finally characterized as man-made or natural according to the grey-level information. The quality of the results is assessed and discussed.
NASA Technical Reports Server (NTRS)
1979-01-01
Education of medical students in cardiology requires access to patients having a variety of different forms of heart disease. But bringing together student, instructor and patient is a difficult and expensive process that does not benefit the patient. An alternate approach is substitution of a lifelike mannequin capable of simulating many conditions of heart disease. The mannequin, together with a related information display, is an advanced medical training system whose development benefited from NASA visual display technology and consultative input from NASA's Kennedy Space Center. The mannequin system represents more than 10 years of development effort by Dr. Michael S. Gordon, professor of cardiology at the University of Miami (Florida) School of Medicine.
Object recognition based on Google's reverse image search and image similarity
NASA Astrophysics Data System (ADS)
Horváth, András.
2015-12-01
Image classification is one of the most challenging tasks in computer vision, and a general multiclass classifier could solve many different tasks in image processing. Classification is usually done by shallow learning over predefined objects, which is a difficult task and very different from human vision: human vision is based on continuous learning of object classes, and it takes years to learn a large taxonomy of objects, which are neither disjoint nor independent. In this paper I present a system based on Google's image similarity algorithm and Google's image database, which can classify a large set of different objects in a human-like manner, identifying related classes and taxonomies.
NASA Astrophysics Data System (ADS)
Paladini, D.; Mello, A. B.
2016-07-01
Inmetro's data on the conformity of certified products, processes and services are usually spread across fragmented databases that are difficult to access, for several reasons, among them the lack of computational solutions that would give users this kind of access. We present a discussion of some technological solutions to support supervisory activities by the appropriate regulatory bodies and to provide information access to society in general, along with a theoretical explanation of the pros and cons of such technologies, leading to the conclusion that a mobile platform seems to be the best tool for Inmetro's requirements.
Acoustic emission of rock mass under the constant-rate fluid injection
NASA Astrophysics Data System (ADS)
Shadrin, A. V.; Klishin, V. I.
2018-03-01
The authors study acoustic emission (AE) in a coal bed and a difficult-to-cave roof under fluid injection by pumps at a constant rate. A functional connection between the roof hydrofracture length and the total number of AE pulses is validated. It is also found that the coal bed hydro-loosening time, the injection rate, and the time behavior of AE activity depend on the fluid volume that must be injected before the fluid breaks out into a roadway through growing fractures. The formulas offered for practical application use integral parameters, determined during test injection, that characterize the permeability and porosity of the rock mass and the process parameters of the technology.
Numerical simulation of crystal fractionation in shergottite meteorites
NASA Technical Reports Server (NTRS)
Grimm, R. E.; Mcsween, H. Y., Jr.
1982-01-01
Cumulus clinopyroxenes in the Shergotty and Zagami meteorites suggest crystal fractionation occurred, possibly by gravitative settling. Numerical models of this process in a nonconvecting environment argue that the small phenocrysts can segregate only under extreme conditions of cooling time or gravitational field strength. Since textures indicate that cooling time was not excessive, a large (planetary) g is required by these models, in agreement with other suggestions that the shergottite parent body may be Mars. Other calculations indicate that it is extremely difficult to produce the observed textures in a convecting environment, unless crystal settling occurred in a quiescent zone at the bottom of the magma chamber.
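The gravity dependence in such models enters through the settling speed. As a first-order illustration, a Stokes-law sketch; every material value below is assumed for illustration and not taken from the paper:

```python
def stokes_velocity(radius_m, rho_crystal, rho_melt, viscosity_pa_s, g):
    """Stokes terminal velocity of a sphere in laminar flow: a first-order
    way to compare crystal settling under small-body vs planetary gravity."""
    return 2.0 * (rho_crystal - rho_melt) * g * radius_m**2 / (9.0 * viscosity_pa_s)

# Assumed values: 0.5-mm-radius pyroxene in basaltic melt (100 Pa s).
SECONDS_PER_YEAR = 3.15e7
for label, g in [("small body", 0.05), ("Mars", 3.7)]:
    v = stokes_velocity(5e-4, 3300.0, 2700.0, 100.0, g)
    print(f"{label}: {v * SECONDS_PER_YEAR:.1f} m/yr")
```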
An improvement of vehicle detection under shadow regions in satellite imagery
NASA Astrophysics Data System (ADS)
Karim, Shahid; Zhang, Ye; Ali, Saad; Asif, Muhammad Rizwan
2018-04-01
The processing of satellite imagery depends on the quality of the imagery: at low resolution, it is difficult to extract information accurate enough for the requirements of an application. For the purpose of vehicle detection under shadow regions, we use HOG for feature extraction and an SVM for classification; HOG has proven a worthwhile tool for complex environments. Shadow images were scrutinized and found to be very challenging for detection, with very low detection rates observed; we therefore concentrate on enhancing the detection rate under shadow regions by implementing appropriate preprocessing. Vehicles in non-shadow regions are detected precisely, with a higher detection rate than in shadow regions.
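A minimal sketch of the HOG-plus-SVM pipeline with an optional local contrast-enhancement step for shadowed patches; CLAHE is used here as a plausible stand-in for the paper's preprocessing, and the patches and labels are synthetic:

```python
import numpy as np
from skimage.feature import hog
from skimage.exposure import equalize_adapthist
from sklearn.svm import LinearSVC

def features(patch, deshadow=False):
    """HOG descriptor of a fixed-size grayscale patch in [0, 1]; the
    optional CLAHE step brightens locally dark (shadowed) regions."""
    if deshadow:
        patch = equalize_adapthist(patch)
    return hog(patch, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2))

rng = np.random.default_rng(0)
patches = rng.random((40, 64, 64))            # stand-in 64x64 training patches
labels = rng.integers(0, 2, size=40)          # 1 = vehicle, 0 = background
X = np.array([features(p, deshadow=True) for p in patches])
clf = LinearSVC().fit(X, labels)
```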
Patients with difficult intubation may need referral to sleep clinics.
Chung, Frances; Yegneswaran, Balaji; Herrera, Francisco; Shenderey, Alex; Shapiro, Colin M
2008-09-01
Upper airway abnormalities carry the risk of obstructive sleep apnea (OSA) and difficult tracheal intubation. Both conditions contribute to significant clinical problems and to increased perioperative morbidity and mortality. We hypothesized that patients who present with difficult intubation have a very high prevalence of OSA and that those with unexpected difficult intubation may require referral to sleep clinics for polysomnography (PSG). Patients classified as grade 4 on the Cormack and Lehane direct laryngoscopic view, and who required more than two attempts for successful endotracheal intubation, were referred to the study by consultant anesthesiologists at four hospitals. Apnea-hypopnea index (AHI) data and postoperative events were collected. Patients with AHI >5/h were considered positive for OSA. Clinical and PSG variables were compared using t-tests and the chi-squared test. Over a 20-mo period, 84 patients with a difficult intubation were referred into the study; thirty-three agreed to participate. Sixty-six percent (22 of 33) had OSA (AHI >5/h). Of the 22 OSA patients, 10 (45%) had mild OSA (AHI 5-15/h), 6 (27%) had moderate OSA (AHI 15-30/h), and 6 (27%) had severe OSA (AHI >30/h). Of the 33 patients, 11 (33%) were recommended for continuous positive airway pressure treatment. Between the OSA group and the non-OSA group there were significant differences in gender, neck size, and quality of sleep, but no significant differences in age and body mass index. In summary, 66% of patients with unexpected difficult intubation who consented to undergo a sleep study were diagnosed with OSA by PSG. Patients with difficult intubation are therefore at high risk for OSA and should be screened for signs and symptoms of sleep apnea; screening should be considered by referral to a sleep clinic for PSG.
Take a Multidisciplinary, Team-based Approach on Elder Abuse.
2016-07-01
While EDs are well positioned to identify incidents of elder abuse, providers often miss the opportunity. Experts say providers find only one in every 24 cases, and that the pendulum must swing toward over-detection. Investigators acknowledge elder abuse is difficult to confirm, given that disease processes can explain some of the signs. Further, older adults are often reluctant to report abuse because they fear they will be removed from their homes or separated from their caregivers. Given the complexity involved with addressing the issue, investigators recommend EDs establish a multidisciplinary approach to the problem. Providing great care to a victim of elder abuse requires time and setting up a circumstance whereby one can actually communicate with the patient reliably and alone. While most states require providers to report suspected cases of elder abuse to Adult Protective Services, there is little evidence this requirement has incentivized more reports in the same way a similar requirement has prompted providers to report cases of suspected child abuse. Investigators advise ED leaders to train and empower every member of their team to identify potential signs of elder abuse.
ERIC Educational Resources Information Center
Parker, Judith A.
Although a world of uncertainty and continual change is difficult to explain to children, by exploring and discussing the process of grief, parents can begin to understand the significance and necessity of the grief process and help their children to cope with difficult events. This booklet offers parents advice on how to talk with children about…
NASA Astrophysics Data System (ADS)
Xu, Boyi; Xu, Li Da; Fei, Xiang; Jiang, Lihong; Cai, Hongming; Wang, Shuai
2017-08-01
In the face of rapidly changing business environments, the implementation of flexible business processes is crucial but difficult, especially in data-intensive application areas. This study aims to provide scalable and easily accessible information resources to leverage business process management. In this article, with a resource-oriented approach, enterprise data resources are represented as data-centric Web services, grouped on demand according to business requirements, and configured dynamically to adapt to changing business processes. First, a configurable architecture, CIRPA, involving an information resource pool is proposed as a scalable and dynamic platform that virtualises enterprise information resources as data-centric Web services. By exposing data-centric resources as REST services at larger granularities, tenant-isolated information resources can be accessed during business process execution. Second, a dynamic information resource pool is designed to support configurable, on-demand data access in business process execution. CIRPA also isolates transaction data from the business process while supporting the composition of diverse business processes. Finally, a case study applying our method in a logistics application shows that CIRPA provides enhanced performance in both static service encapsulation and dynamic service execution in a cloud computing environment.
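A minimal sketch, assuming Flask, of exposing a tenant-isolated data resource as a coarse-grained REST service in the spirit of the data-centric services described above; the resource names and in-memory pool are illustrative, not CIRPA's implementation:

```python
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Hypothetical pooled, tenant-scoped data resources.
RESOURCE_POOL = {
    ("tenant-a", "shipments"): [{"id": 1, "status": "in-transit"}],
    ("tenant-b", "shipments"): [{"id": 7, "status": "delivered"}],
}

@app.route("/tenants/<tenant>/resources/<name>", methods=["GET"])
def get_resource(tenant, name):
    # Tenant isolation: a tenant can only reach resources keyed to it.
    data = RESOURCE_POOL.get((tenant, name))
    if data is None:
        abort(404)
    return jsonify(data)

if __name__ == "__main__":
    app.run(port=8080)
```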
Evaluating methods for monitoring populations of Mexican spotted owls: A case study
Joseph L. Ganey; Gary C. White; David C. Bowden; Alan B. Franklin
2004-01-01
Monitoring population status of rare or elusive species presents special challenges. Understanding population trends requires separating signal (true and important changes in abundance) from noise (normal temporal and sampling variation; e.g., Block et al. 2001). This is particularly difficult when small numbers or elusive habits make it difficult to obtain precise...
ERIC Educational Resources Information Center
Ocker, Rosalie J.; Yaverbaum, Gayle J.
2004-01-01
Although collaborative learning techniques have been shown to enhance the learning experience, it is difficult to incorporate these concepts into courses without requiring students to collaborate outside of class. There is an ever increasing number of nontraditional university students who find it difficult to schedule the necessary meetings with…
Difficult intubation: are you prepared for it?
Balcom, C
1994-01-01
The endotracheal intubation of a patient for surgery requires an anaesthetist who is aided by a skilled and experienced helper. This paper explores reasons why some patients are difficult to intubate. Some are predictable on pre-operative assessment and others are not. Suggestions are given on how the helper is useful to the anaesthetist in this potentially critical situation.
An auditory attention task: a note on the processing of verbal information.
Linde, L
1994-04-01
On an auditory attention task, subjects were required to reproduce spatial relationships between letters from auditorily presented verbal information containing the prepositions "before" or "after." It was assumed that propositions containing "after" induce a conflict between the temporal order and the semantically implied spatial order of the letters. Data from 36 subjects are presented showing that propositions with "after" are more difficult to process. A significant, general training effect appeared. 200 mg of caffeine had some beneficial effect on the performance of 18 subjects who had been awake for about 22 hours and were tested at 6 a.m.; however, the benefit was not related to the amount of conflict but applied to items both without and with conflict. The effect of caffeine for the 18 subjects tested at 4 p.m. after normal sleep was slightly negative.
Prioritization of Stockpile Maintenance with Layered Pareto Fronts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burke, Sarah E.; Anderson-Cook, Christine M.; Lu, Lu
Difficult choices are required for a decision-making process where resources and budgets are increasingly constrained. This study demonstrates a structured decision-making approach using layered Pareto fronts to identify priorities for allocating funds between munitions stockpiles, based on their estimated reliability, the urgency of needing available units, and the consequences if adequate numbers of units are not available. This case study, while specific to the characteristics of a group of munitions stockpiles, illustrates the general process of structured decision-making based on first identifying appropriate metrics that summarize the important dimensions of the decision, and then objectively eliminating non-contenders from further consideration. The final, subjective stage then incorporates user priorities to select the four stockpiles that receive additional maintenance and surveillance funds, based on an understanding of the trade-offs and of robustness to various user priorities.
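A minimal sketch of the layering (non-dominated sorting) step, assuming all three criteria have been rescaled so that larger is better; the stockpile scores are hypothetical:

```python
def pareto_layers(points):
    """Peel successive Pareto fronts under maximization: layer 0 holds the
    non-dominated options, layer 1 those dominated only by layer 0, and so
    on; deep layers are the objectively eliminated non-contenders."""
    def dominates(a, b):
        return a != b and all(x >= y for x, y in zip(a, b))

    remaining, layers = list(points), []
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining)]
        layers.append(front)
        remaining = [p for p in remaining if p not in front]
    return layers

# (reliability, urgency, consequence) scores for hypothetical stockpiles.
stockpiles = [(0.9, 0.2, 0.5), (0.6, 0.8, 0.7), (0.5, 0.5, 0.4), (0.9, 0.8, 0.7)]
print(pareto_layers(stockpiles))
```

User priorities only enter after this objective elimination, which is what keeps the final subjective choice tractable.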
Management of change through force field analysis.
Baulcomb, Jean Sandra
2003-07-01
Today's NHS is rapidly changing, placing more emphasis on the managerial responsibilities of ward managers. Managing change means being skilled at creating, acquiring and transferring knowledge so that practice reflects new knowledge and insights. Defining core concepts is often difficult and requires drawing on models and theories of change for guidance. Lewin's (1951) force field analysis demonstrates the complexities of the change process and shows how driving and resisting forces were incorporated within the planning and implementation phases. Findings outline the benefits of a small-scale change for staff, patients and the organization when successfully used to introduce a change of shift pattern within a progressively busy haematology day unit, in order to meet service demands without additional funding. Conclusions are drawn about the process, and recommendations for practice are made to further enhance care delivery within the unit.
Evaluating models of climate and forest vegetation
NASA Technical Reports Server (NTRS)
Clark, James S.
1992-01-01
Understanding how the biosphere may respond to increasing trace gas concentrations in the atmosphere requires models that contain vegetation responses to regional climate. Most of the processes ecologists study in forests, including trophic interactions, nutrient cycling, and disturbance regimes, and vital components of the world economy, such as forest products and agriculture, will be influenced in potentially unexpected ways by changing climate. These vegetation changes affect climate in the following ways: changing C, N, and S pools; trace gases; albedo; and water balance. The complexity of the indirect interactions among variables that depend on climate, together with the range of different space/time scales that best describe these processes, make the problems of modeling and prediction enormously difficult. These problems of predicting vegetation response to climate warming and potential ways of testing model predictions are the subjects of this chapter.
Uncertainty propagation from raw data to final results. [ALEX
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larson, N.M.
1985-01-01
Reduction of data from raw numbers (counts per channel) to physically meaningful quantities (such as cross sections) is in itself a complicated procedure. Propagation of experimental uncertainties through that reduction process has sometimes been perceived as even more difficult, if not impossible. At the Oak Ridge Electron Linear Accelerator, a computer code ALEX has been developed to assist in the propagation process. The purpose of ALEX is to carefully and correctly propagate all experimental uncertainties through the entire reduction procedure, yielding the complete covariance matrix for the reduced data, while requiring little additional input from the experimentalist beyond that which is needed for the data reduction itself. The theoretical method used in ALEX is described, with emphasis on transmission measurements. Application to the natural iron and natural nickel measurements of D.C. Larson is shown.
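The core of such propagation is the first-order sandwich rule cov(y) = J cov(x) J^T applied through every reduction step. A minimal numerical sketch; the transmission formula, areal density, and counts are illustrative assumptions, not ALEX's actual reduction:

```python
import numpy as np

def propagate(f, x, cov_x, eps=1e-6):
    """First-order covariance propagation cov_y = J cov_x J^T with a
    finite-difference Jacobian of the reduction function f."""
    y0 = np.atleast_1d(f(x))
    J = np.empty((y0.size, x.size))
    for k in range(x.size):
        dx = np.zeros_like(x)
        dx[k] = eps * max(1.0, abs(x[k]))
        J[:, k] = (np.atleast_1d(f(x + dx)) - y0) / dx[k]
    return J @ cov_x @ J.T

def reduce_transmission(v):
    # Toy reduction: T = (S - B) / (O - B), then sigma_tot = -ln(T) / (n*l).
    S, B, O = v                        # sample, background, open-beam counts
    T = (S - B) / (O - B)
    return np.array([-np.log(T) / 0.05])   # n*l = 0.05 atoms/barn (assumed)

x = np.array([8.0e4, 5.0e3, 1.2e5])    # raw counts
cov_x = np.diag(x)                      # Poisson counting variances
print(propagate(reduce_transmission, x, cov_x))  # variance of the cross section
```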
Students Fail to Transfer Knowledge of Chromosome Structure to Topics Pertaining to Cell Division
Newman, Dina L.; Catavero, Christina M.; Wright, L. Kate
2012-01-01
Cellular processes that rely on knowledge of molecular behavior are difficult for students to comprehend. For example, thorough understanding of meiosis requires students to integrate several complex concepts related to chromosome structure and function. Using a grounded theory approach, we have unified classroom observations, assessment data, and in-depth interviews under the theory of knowledge transfer to explain student difficulties with concepts related to chromosomal behavior. In this paper, we show that students typically understand basic chromosome structure but do not activate cognitive resources that would allow them to explain macromolecular phenomena (e.g., homologous pairing during meiosis). To improve understanding of topics related to genetic information flow, we suggest that instructors use pedagogies and activities that prime students for making connections between chromosome structure and cellular processes. PMID:23222838
Williams, David J; Archer, Richard; Archibald, Peter; Bantounas, Ioannis; Baptista, Ricardo; Barker, Roger; Barry, Jacqueline; Bietrix, Florence; Blair, Nicholas; Braybrook, Julian; Campbell, Jonathan; Canham, Maurice; Chandra, Amit; Foldes, Gabor; Gilmanshin, Rudy; Girard, Mathilde; Gorjup, Erwin; Hewitt, Zöe; Hourd, Paul; Hyllner, Johan; Jesson, Helen; Kee, Jasmin; Kerby, Julie; Kotsopoulou, Nina; Kowalski, Stanley; Leidel, Chris; Marshall, Damian; Masi, Louis; McCall, Mark; McCann, Conor; Medcalf, Nicholas; Moore, Harry; Ozawa, Hiroki; Pan, David; Parmar, Malin; Plant, Anne L; Reinwald, Yvonne; Sebastian, Sujith; Stacey, Glyn; Thomas, Robert J; Thomas, Dave; Thurman-Newell, Jamie; Turner, Marc; Vitillo, Loriana; Wall, Ivan; Wilson, Alison; Wolfrum, Jacqueline; Yang, Ying; Zimmerman, Heiko
2016-01-01
This paper summarizes the proceedings of a workshop held at Trinity Hall, Cambridge to discuss comparability and includes additional information and references to related information added subsequently to the workshop. Comparability is the need to demonstrate equivalence of product after a process change; a recent publication states that this ‘may be difficult for cell-based medicinal products’. Therefore a well-managed change process is required which needs access to good science and regulatory advice and developers are encouraged to seek help early. The workshop shared current thinking and best practice and allowed the definition of key research questions. The intent of this report is to summarize the key issues and the consensus reached on each of these by the expert delegates. PMID:27404768
Prioritization of Stockpile Maintenance with Layered Pareto Fronts
Burke, Sarah E.; Anderson-Cook, Christine M.; Lu, Lu; ...
2017-10-11
Difficult choices are required for a decision-making process where resources and budgets are increasingly constrained. This study demonstrates a structured decision-making approach using layered Pareto fronts to identify priorities about how to allocate funds between munitions stockpiles based on their estimated reliability, the urgency of needing available units, and the consequences if adequate numbers of units are not available. This case study, while specific to the characteristics of a group of munitions stockpiles, illustrates the general process of structured decision-making based on first identifying appropriate metrics that summarize the important dimensions of the decision, and then objectively eliminating non-contenders from further consideration. The final, subjective stage incorporates user priorities to select the four stockpiles to receive additional maintenance and surveillance funds based on understanding the trade-offs and robustness to various user priorities.
PURPA 210 avoided cost rates: Economic and implementation issues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Devine, M.D.; Chartock, M.A.; Gunn, E.M.
The purpose of Section 210 of the Public Utilities Regulatory Policies Act (PURPA) was to promote the utilization of waste and renewable fuels and cogeneration processes for increasing electric power supplies. It represents a radical change in policy by allowing financially unregulated parties to generate power in "qualifying facilities" and by requiring utilities to purchase this power at the utilities' marginal (or "avoided") cost. PURPA 210 has clearly had a major impact as measured by the actual and proposed number of new qualifying facilities; however, implementation has been difficult due to the adversarial nature of the process for negotiating or setting the avoided cost rates. This paper reviews the pertinent PURPA rules and regulations, analyzes the status of current avoided cost rates that have been established, and discusses implementation issues and options for resolving those issues.
Multi-ball and one-ball geolocation
NASA Astrophysics Data System (ADS)
Nelson, D. J.; Townsend, J. L.
2017-05-01
We present analysis methods that may be used to geolocate emitters using one or more moving receivers. While some of the methods we present may apply to a broader class of signals, our primary interest is locating and tracking ships from short pulsed transmissions, such as the maritime Automatic Identification System (AIS). The AIS signal is difficult to process and track since the pulse duration is only 25 milliseconds, and the pulses may only be transmitted every six to ten seconds. In this article, we address several problems, including accurate TDOA and FDOA estimation methods that do not require searching a two-dimensional surface such as the cross-ambiguity surface. As an example, we apply these methods to identify and process AIS pulses from a single emitter, making it possible to geolocate the AIS signal using a single moving receiver.
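As a conventional baseline (the point of the paper is precisely to avoid the brute-force search, so this is only the standard starting point, with hypothetical signal arrays), the TDOA between two receivers can be estimated from the peak of an FFT-based cross-correlation:

```python
import numpy as np

def estimate_tdoa(x, y, fs):
    """Delay of y relative to x, in seconds, from the cross-correlation peak.

    x, y : real-valued sample arrays from the two receivers
    fs   : sampling rate in Hz; assumes the true delay is well inside the record
    """
    nfft = 1 << (len(x) + len(y) - 1).bit_length()   # zero-pad to a power of two
    X = np.fft.rfft(x, nfft)
    Y = np.fft.rfft(y, nfft)
    cc = np.fft.irfft(np.conj(X) * Y, nfft)          # circular cross-correlation
    lags = np.arange(nfft)
    lags[lags > nfft // 2] -= nfft                   # map indices to signed lags
    return lags[np.argmax(np.abs(cc))] / fs
```

FDOA could be estimated analogously by correlating against frequency-shifted replicas, which is exactly the two-dimensional cross-ambiguity search the authors' methods are designed to sidestep.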
Simulation on turning aspheric surface method via oscillating feed
NASA Astrophysics Data System (ADS)
Kong, Fanxing; Li, Zengqiang; Sun, Tao
2014-08-01
It is quite difficult to manufacture optical components that combine a high-gradient ellipsoid and a hyperboloid with high machined-surface quality requirements. To solve the problem, in this paper we present a turning and forming method via oscillating feed on an R-θ layout lathe, and analyze the machining of the ellipsoid segment and the hyperboloid segment separately through oscillating feed. We also calculate the parameters of each trajectory during processing and obtain displacement, velocity, acceleration and other parameters. The simulation results show that this rotary turning method is capable of ensuring that the cutter remains on the equidistance line of the meridian cross-section curve of the workpiece while machining a high-gradient aspheric surface, which helps achieve a high-quality surface. The method also provides a new approach and a theoretical basis for manufacturing high-quality aspheric surfaces and for extending the functions of the available twin-spindle lathe.
Automated MAD and MIR structure solution
Terwilliger, Thomas C.; Berendzen, Joel
1999-01-01
Obtaining an electron-density map from X-ray diffraction data can be difficult and time-consuming even after the data have been collected, largely because MIR and MAD structure determinations currently require many subjective evaluations of the qualities of trial heavy-atom partial structures before a correct heavy-atom solution is obtained. A set of criteria for evaluating the quality of heavy-atom partial solutions in macromolecular crystallography have been developed. These have allowed the conversion of the crystal structure-solution process into an optimization problem and have allowed its automation. The SOLVE software has been used to solve MAD data sets with as many as 52 selenium sites in the asymmetric unit. The automated structure-solution process developed is a major step towards the fully automated structure-determination, model-building and refinement procedure which is needed for genomic scale structure determinations. PMID:10089316
Prebiotic Chemistry: Geochemical Context and Reaction Screening
Cleaves, Henderson James
2013-01-01
The origin of life on Earth is widely believed to have required the reactions of organic compounds and their self- and/or environmental organization. What those compounds were remains open to debate, as do the environment in, and process or processes by, which they became organized. Prebiotic chemistry is the systematic organized study of these phenomena. It is difficult to study poorly defined phenomena, and research has focused on producing compounds and structures familiar to contemporary biochemistry, which may or may not have been crucial for the origin of life. Given our ignorance, it may be instructive to explore the extreme regions of known and future investigations of prebiotic chemistry, where reactions fail, in ways that relate them to, or exclude them from, plausible environments where they could occur. Some critical parameters which most deserve investigation are discussed. PMID:25369745
Controlled assembly of In2O3 nanowires on electronic circuits using scanning optical tweezers.
Lee, Song-Woo; Jo, Gunho; Lee, Takhee; Lee, Yong-Gu
2009-09-28
In2O3 nanowires can be used effectively as building blocks in the production of electronic circuits used in transparent and flexible electronic devices. The fabrication of these devices requires a controlled assembly of nanowires at crucial places and times. However, this kind of controlled assembly, which results in the fusion of nanowires to circuits, is still very difficult to execute. In this study, we demonstrate the benefits of using various lengths of In2O3 nanowires by using non-contact mechanisms, such as scanning optical tweezers, to place them on designated targets during the fabrication process. Furthermore, these nanowires can be stabilized at both ends of the conducting wires using a focused laser and, later in the process, an annealing technique, so that proper flow of electrons is achieved.
NASA Astrophysics Data System (ADS)
Srinivas, G.; Raghunandana, K.; Satish Shenoy, B.
2018-02-01
In recent years, enhancing the performance of turbomachinery materials has played a vital role, especially in aircraft air-breathing engines such as turbojet, turboprop, turboshaft and turbofan engines. Transonic-flow engines in particular require highly sophisticated materials that can sustain the entire thrust created by the engine. The main objective of this paper is to give an overview of present cost-effective and technologically capable processes for turbomachinery component materials. The main focus is on electro-physical and photonic additive/removal processes and electro-chemical processes for manufacturing turbomachinery parts. Aeronautical propulsion-based technologies are reviewed thoroughly with respect to surface reliability, geometric precision, material removal, and deposition rates for highly strengthened composite materials and difficult-to-cut dedicated steels, titanium and nickel-based alloys. The paper covers past aeronautical and propulsion manufacturing technologies, current sophisticated technologies, and future challenging material-processing techniques. It also briefly describes the shaping and coating processes for turbomachinery components in aeromechanical applications.
Target electron ionization in Li2+-Li collisions: A multi-electron perspective
NASA Astrophysics Data System (ADS)
Śpiewanowski, M. D.; Gulyás, L.; Horbatsch, M.; Kirchner, T.
2015-05-01
The recent development of the magneto-optical trap reaction-microscope has opened a new chapter for detailed investigations of charged-particle collisions from alkali atoms. It was shown that energy-differential cross sections for ionization from the outer shell in O8+-Li collisions at 1500 keV/amu can be readily explained with the single-active-electron approximation. Understanding of K-shell ionization, however, requires incorporating many-electron effects. An ionization-excitation process was found to play an important role. We present a theoretical study of target electron removal in Li2+-Li collisions at 2290 keV/amu. The results indicate that in outer-shell ionization a single-electron process plays the dominant part. However, the K-shell ionization results are more difficult to interpret. On one hand, we find only weak contributions from multi-electron processes. On the other hand, a large discrepancy between experimental and single-particle theoretical results indicates that multi-electron processes involving ionization from the outer shell may be important for a complete understanding of the process. Work supported by NSERC, Canada and the Hungarian Scientific Research Fund.
Chen, Song; Wang, Chenran; Yeo, Syn; Liang, Chun-Chi; Okamoto, Takako; Sun, Shaogang; Wen, Jian; Guan, Jun-Lin
2016-01-01
Autophagy is an evolutionarily conserved cellular process controlled through a set of essential autophagy genes (Atgs). However, there is increasing evidence that most, if not all, Atgs also possess functions independent of their requirement in canonical autophagy, making it difficult to distinguish the contributions of autophagy-dependent or -independent functions of a particular Atg to various biological processes. To distinguish these functions for FIP200 (FAK family-interacting protein of 200 kDa), an Atg in autophagy induction, we examined FIP200 interaction with its autophagy partner, Atg13. We found that residues 582–585 (LQFL) in FIP200 are required for interaction with Atg13, and mutation of these residues to AAAA (designated the FIP200-4A mutant) abolished its canonical autophagy function in vitro. Furthermore, we created a FIP200-4A mutant knock-in mouse model and found that specifically blocking FIP200 interaction with Atg13 abolishes autophagy in vivo, providing direct support for the essential role of the ULK1/Atg13/FIP200/Atg101 complex in the process beyond previous studies relying on the complete knockout of individual components. Analysis of the new mouse model showed that nonautophagic functions of FIP200 are sufficient to fully support embryogenesis by maintaining a protective role in TNFα-induced apoptosis. However, FIP200-mediated canonical autophagy is required to support neonatal survival and tumor cell growth. These studies provide the first genetic evidence linking an Atg's autophagy and nonautophagic functions to different biological processes in vivo. PMID:27013233
Visual search and emotion: how children with autism spectrum disorders scan emotional scenes.
Maccari, Lisa; Pasini, Augusto; Caroli, Emanuela; Rosa, Caterina; Marotta, Andrea; Martella, Diana; Fuentes, Luis J; Casagrande, Maria
2014-11-01
This study assessed visual search abilities, tested through the flicker task, in children diagnosed with autism spectrum disorders (ASDs). Twenty-two children diagnosed with ASD and 22 matched typically developing (TD) children were told to detect changes in objects of central interest or objects of marginal interest (MI) embedded in either emotion-laden (positive or negative) or neutral real-world pictures. The results showed that emotion-laden pictures equally interfered with performance of both ASD and TD children, slowing down reaction times compared with neutral pictures. Children with ASD were faster than TD children, particularly in detecting changes in MI objects, the most difficult condition. However, their performance was less accurate than performance of TD children just when the pictures were negative. These findings suggest that children with ASD have better visual search abilities than TD children only when the search is particularly difficult and requires strong serial search strategies. The emotional-social impairment that is usually considered as a typical feature of ASD seems to be limited to processing of negative emotional information.
Research of thread rolling on difficult-to-cut material workpieces
NASA Astrophysics Data System (ADS)
Popov, A. Yu; Bugay, I. A.; Nazarov, P. V.; Evdokimova, O. P.; Popov, P. E.; Vasilyev, E. V.
2018-01-01
In medicine production, Ti-6Al-4V Grade 5 alloys are used. One of the most important tasks is to increase the strength of the products while decreasing their cost. The possibility of rolling a special thread on a Ti-6Al-4V Grade 5 alloy workpiece on a 2-roller thread rolling machine has been studied. This is a wrought alloy whose treatment in the cold condition causes difficulties due to low plasticity. Obtaining a threaded Ti-6Al-4V Grade 5 alloy product by rolling is rather difficult. This is due to large axial workpiece displacements resulting from the alloy's large resistance to cold plastic deformation. The provision of adequate kinematics requires experimental research and the selection of modes: the rolling speed and the pressure on the movable roller. The purpose of the work is to determine the optimal modes for rolling thread on a titanium alloy workpiece. It has been found that, after rolling, the product strength increased by up to 30%. As a result of the work, the unit has been built and recommendations for choosing the optimal rolling process modes have been offered.
NASA Astrophysics Data System (ADS)
Isern-Fontanet, Jordi; Ballabrera-Poy, Joaquim; Turiel, Antonio; García-Ladona, Emilio
2017-10-01
Ocean currents play a key role in Earth's climate - they impact almost any process taking place in the ocean and are of major importance for navigation and human activities at sea. Nevertheless, their observation and forecasting are still difficult. First, no observing system is able to provide direct measurements of global ocean currents on synoptic scales. Consequently, it has been necessary to use sea surface height and sea surface temperature measurements and refer to dynamical frameworks to derive the velocity field. Second, the assimilation of the velocity field into numerical models of ocean circulation is difficult mainly due to lack of data. Recent experiments that assimilate coastal-based radar data have shown that ocean currents will contribute to increasing the forecast skill of surface currents, but require application in multidata assimilation approaches to better identify the thermohaline structure of the ocean. In this paper we review the current knowledge in these fields and provide a global and systematic view of the technologies to retrieve ocean velocities in the upper ocean and the available approaches to assimilate this information into ocean models.
Testing and validation of computerized decision support systems.
Sailors, R M; East, T D; Wallace, C J; Carlson, D A; Franklin, M A; Heermann, L K; Kinder, A T; Bradshaw, R L; Randolph, A G; Morris, A H
1996-01-01
Systematic, thorough testing of decision support systems (DSSs) prior to release to general users is a critical aspect of high-quality software design. Omission of this step may lead to the dangerous, and potentially fatal, condition of relying on a system with outputs of uncertain quality. Thorough testing requires a great deal of effort and is a difficult job because the tools necessary to facilitate testing are not well developed. Testing is a job ill-suited to humans because it requires tireless attention to a large number of details. For these reasons, the majority of DSSs available are probably not well tested prior to release. We have successfully implemented a software design and testing plan which has helped us meet our goal of continuously improving the quality of our DSS software prior to release. While requiring large amounts of effort, we feel that documenting and standardizing our testing methods are important steps toward meeting recognized national and international quality standards. Our testing methodology includes both functional and structural testing and requires input from all levels of development. Our system does not focus solely on meeting design requirements but also addresses the robustness of the system and the completeness of testing.
An intelligent CNC machine control system architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, D.J.; Loucks, C.S.
1996-10-01
Intelligent, agile manufacturing relies on automated programming of digitally controlled processes. Currently, processes such as Computer Numerically Controlled (CNC) machining are difficult to automate because of highly restrictive controllers and poor software environments. It is also difficult to utilize sensors and process models for adaptive control, or to integrate machining processes with other tasks within a factory floor setting. As part of a Laboratory Directed Research and Development (LDRD) program, a CNC machine control system architecture based on object-oriented design and graphical programming has been developed to address some of these problems and to demonstrate automated agile machining applications using platform-independent software.
2013-01-01
Background Confidential product listing agreements (PLAs) negotiated between pharmaceutical manufacturers and individual health care payers may contribute to unwanted price disparities, high administrative costs, and unequal bargaining power within and across jurisdictions. In the context of Canada’s decentralized health system, we aimed to document provincial policy makers’ perceptions about collaborative PLA negotiations. Methods We conducted semi-structured telephone interviews with a senior policy maker from nine of the ten Canadian provinces. We conducted a thematic analysis of interview transcripts to identify benefits, drawbacks, and barriers to routine collaboration on PLA negotiations. Results Canadian policy makers expressed support for joint negotiations of PLAs in principle, citing benefits of increased bargaining power and reduced inter-jurisdictional inequities in drug prices and formulary listings. However, established policy institutions and the politics of individual jurisdictional authority are formidable barriers to routine PLA collaboration. Achieving commitment to a joint process may be difficult to sustain among heterogeneous and autonomous partners. Conclusions Though collaboration on PLA negotiation is an extension of collaboration on health technology assessment, it is a very significant next step that requires harmonization of the outcomes of decision-making processes. Views of policy makers in Canada suggest that sustaining routine collaborations on PLA negotiations may be difficult unless participating jurisdictions have similar policy institutions, capacities to implement coverage decisions, and local political priorities. PMID:23363626
NASA Astrophysics Data System (ADS)
Christanto, N.; Sartohadi, J.; Setiawan, M. A.; Shrestha, D. B. P.; Jetten, V. G.
2018-04-01
Land use change influences hydrological as well as landscape processes such as runoff and sediment yield. The main objectives of this study are to assess land use change and its impact on the runoff and sediment yield of the upper Serayu Catchment. Land use changes from 1991 to 2014 have been analyzed. Spectral similarity and vegetation indices were used to classify the older image, so that the present and past images are comparable. The influence of past and present land use on runoff and sediment yield has been compared with field measurements. The effect of land use change shows increased surface runoff, which results from changes in the curve number (CN) values. The study shows that it is possible to classify a previously obtained image based on the spectral characteristics and indices of major land cover types derived from a recently obtained image. This avoids the need for training samples, which would be difficult to obtain. It also demonstrates that it is possible to link land cover changes with land degradation processes and, finally, with sedimentation in the reservoir. The only condition is a comparable dataset, which should not be difficult to generate. Any variation inherent in the data other than surface reflectance has to be corrected.
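The curve number enters runoff estimation through the standard SCS-CN relation; whether the study used exactly this metric form is an assumption on our part. A sketch:

```python
def scs_runoff_mm(p_mm, cn):
    """SCS-CN direct runoff (mm) from storm rainfall p_mm and curve number cn.

    Uses the metric form S = 25400/CN - 254 and the usual initial
    abstraction Ia = 0.2 S; runoff is zero until rainfall exceeds Ia.
    """
    s = 25400.0 / cn - 254.0          # potential maximum retention, mm
    ia = 0.2 * s                      # initial abstraction, mm
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Land use change that raises CN from 70 to 85 sharply increases runoff:
print(scs_runoff_mm(50.0, 70), scs_runoff_mm(50.0, 85))   # ~5.8 mm vs ~19.6 mm
```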
Bio-inspired color image enhancement
NASA Astrophysics Data System (ADS)
Meylan, Laurence; Susstrunk, Sabine
2004-06-01
Capturing and rendering an image that fulfills the observer's expectations is a difficult task. This is due to the fact that the signal reaching the eye is processed by a complex mechanism before forming a percept, whereas a capturing device only retains the physical value of light intensities. It is especially difficult to render complex scenes with highly varying luminances. For example, a picture taken inside a room where objects are visible through the windows will not be rendered correctly by a global technique. Either details in the dim room will be hidden in shadow or the objects viewed through the window will be too bright. The image has to be treated locally to resemble more closely what the observer remembers. The purpose of this work is to develop a technique for rendering images based on human local adaptation. We take inspiration from a model of color vision called Retinex. This model determines the perceived color given spatial relationships of the captured signals. Retinex has been used as a computational model for image rendering. In this article, we propose a new solution inspired by Retinex that is based on a single filter applied to the luminance channel. All parameters are image-dependent so that the process requires no parameter tuning. That makes the method more flexible than other existing ones. The presented results show that our method suitably enhances high dynamic range images.
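A single-scale, Retinex-inspired enhancement of the luminance channel can be sketched as the log-ratio of each pixel to a Gaussian-blurred surround. This is an illustrative reduction, not the paper's filter, and the fixed sigma below replaces the image-dependent parameters the authors derive:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def retinex_like(rgb, sigma=30.0, eps=1e-6):
    """Local-adaptation tone mapping on luminance; rgb is float in [0, 1]."""
    lum = rgb @ np.array([0.2126, 0.7152, 0.0722])    # Rec. 709 luminance
    surround = gaussian_filter(lum, sigma)            # local adaptation level
    out = np.log(lum + eps) - np.log(surround + eps)  # centre/surround log-ratio
    out = (out - out.min()) / (np.ptp(out) + eps)     # normalise to [0, 1]
    # rescale the colour channels so chromaticity is preserved
    return np.clip(rgb * (out / (lum + eps))[..., None], 0.0, 1.0)
```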
Neutron capture reactions in astrophysics
NASA Astrophysics Data System (ADS)
Käppeler, F.
1985-01-01
About 2/3 of the chemical elements in nature were formed in neutron capture reactions. During the life of a star there are certain evolutionary stages where neutrons are available to build up the elements beyond iron, which cannot be synthesized by charged-particle reactions. The observed abundance pattern allows one to distinguish a rapid and a slow neutron capture process (r- and s-process). The r-process, taking place far from the valley of stability, is difficult to investigate because of the required extrapolation of nuclear properties to extremely neutron-rich nuclei. The s-process, on the other hand, proceeds along the valley of stability. Therefore, the involved isotopes are accessible to laboratory measurements. This information allows quantitative calculation of s-process abundances and other parameters which represent constraints for stellar models. Two examples are outlined: (i) the s-process branching at A=147, 148 yields a rather accurate value for the neutron density; (ii) comparison of s-process abundances with observations of stellar atmospheres is particularly interesting for the unstable isotopes 93Zr, 99Tc and 147Pm. Their deficiency with respect to stable neighbors may yield estimates for the transport time from the stellar interior to the surface.
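The branching analysis rests on the competition between neutron capture and β decay at the branch-point nucleus; schematically (a textbook form, not the paper's full network calculation):

```latex
f_n \;=\; \frac{\lambda_n}{\lambda_n + \lambda_\beta},
\qquad
\lambda_n \;=\; n_n \,\langle \sigma v \rangle \;\approx\; n_n\,\sigma\,v_T ,
```

so a branching factor $f_n$ inferred from the abundance pattern, together with a measured capture cross section $\sigma$ and the thermal velocity $v_T$, can be solved for the neutron density $n_n$.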
Honkoop, Persijn J; Pinnock, Hilary; Kievits-Smeets, Regien M M; Sterk, Peter J; Dekhuijzen, P N Richard; in ’t Veen, Johannes C C M
2017-01-01
Patients with difficult-to-manage asthma represent a heterogeneous subgroup of asthma patients who require extensive assessment and tailored management. The International Primary Care Respiratory Group approach emphasises the importance of differentiating patients with asthma that is difficult to manage from those with severe disease. Local adaptation of this approach, however, is required to ensure an appropriate strategy for implementation in the Dutch context. We used a modified three-round e-Delphi approach to assess the opinion of all relevant stakeholders (general practitioners, pulmonologists, practice nurses, pulmonary nurses and people with asthma). In the first round, the participants were asked to provide potentially relevant items for a difficult-to-manage asthma programme, which resulted in 67 items. In the second round, we asked participants to rate the relevance of specific items on a seven-point Likert scale, and 46 items were selected as relevant. In the third round, the selected items were categorised and items were ranked within the categories according to relevance. Finally, we created the alphabet acronym for the categories ‘the A–I of difficult-to-manage asthma’ to resonate with an established Dutch ‘A–E acronym for determining asthma control’. This should facilitate implementation of this programme within the existing structure of educational material on asthma and chronic obstructive pulmonary disease (COPD) in primary care, with potential for improving management of difficult-to-manage asthma. Other countries could use a similar approach to create a locally adapted version of such a programme. PMID:28184039
Toward the Atomic-Level Mass Analysis of Biomolecules by the Scanning Atom Probe.
Nishikawa, Osamu; Taniguchi, Masahiro
2017-04-01
In 1994, a new type of atom probe instrument, named the scanning atom probe (SAP), was proposed. The unique feature of the SAP is the introduction of a small extraction electrode, which scans over a specimen surface and confines the high field required for field evaporation of surface atoms to a small space between the specimen and the electrode. Thus, the SAP does not require a sharp specimen tip. This indicates that the SAP can mass analyze specimens which are difficult to form into a sharp tip, such as organic materials and biomolecules. Clean single-wall carbon nanotubes (CNT), made by the high-pressure carbon monoxide process, are found to be the best substrates for biomolecules. Various amino acids and dipeptide biomolecules were successfully mass analyzed, revealing characteristic clusters formed by strongly bound atoms in the specimens. The mass analysis indicates that SAP analysis of biomolecules is not only qualitative, but also quantitative.
Dual fuel gradients in uranium silicide plates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pace, B.W.
1997-08-01
Babcock & Wilcox has been able to achieve dual gradient plates with good repeatability in small lots of U₃Si₂ plates. Improvements in homogeneity and other processing parameters and techniques have allowed the development of contoured fuel within the cladding. The most difficult obstacles to overcome have been the ability to evaluate the bidirectional fuel loadings in comparison to the perfect loading model and the different methods of instilling the gradients in the early compact stage. The overriding conclusion is that to control the contour of the fuel, a known relationship between the compact, the frames and the final core gradient must exist. Therefore, further development in the creation and control of dual gradients in fuel plates will involve arriving at a plausible gradient requirement and building the correct model between the compact configuration and the final contoured loading requirements.
Muhammad Sarfraz, Rai; Bashir, Sajid; Mahmood, Asif; Ahsan, Haseeb; Riaz, Humayun; Raza, Hina; Rashid, Zermina; Atif Raza, Syed; Asad Abrar, Muhammad; Abbas, Khawar; Yasmeen, Tahira
2017-03-01
Solubility concerns the ability of a solute and solvent to form a homogeneous mixture. If the solubility of a drug is low, it is usually difficult to achieve the desired therapeutic level of the drug. Most newly developed entities have solubility problems and encounter difficulty in dissolution. The basic aim of solubility enhancement is to achieve the desired therapeutic level of drug to produce the required pharmacological response. Different techniques are being used to enhance the solubility of water-insoluble drugs. These techniques include particle size reduction, spray drying, the kneading method, the solvent evaporation method, salt formation, microemulsions, co-solvency, hydrosols, the prodrug approach, the supercritical fluid process, hydrogel microparticles etc. Selection of a solubility-improving method depends on drug properties, site of absorption, and required dosage form characteristics. A variety of polymers are also used to enhance the solubility of these drugs, like polyethylene glycol 300, polyvinyl pyrrolidone, chitosan, β-cyclodextrins etc.
Non-completion and informed consent.
Wertheimer, Alan
2014-02-01
There is a good deal of biomedical research that does not produce scientifically useful data because it fails to recruit a sufficient number of subjects. This fact is typically not disclosed to prospective subjects. In general, the guidance about consent concerns the information required to make intelligent self-interested decisions and ignores some of the information required for intelligent altruistic decisions. Bioethics has worried about the 'therapeutic misconception', but has ignored the 'completion misconception'. This article argues that, other things being equal, prospective subjects should be informed about the possibility of non-completion as part of the standard consent process if (1) it is or should be anticipatable that there is a non-trivial possibility of non-completion and (2) that information is likely to be relevant to a prospective subject's decision to consent. The article then considers several objections to the argument, including the objection that disclosing non-completion information would make recruitment even more difficult.
Generalizing Landauer's principle
NASA Astrophysics Data System (ADS)
Maroney, O. J. E.
2009-03-01
In a recent paper [Stud. Hist. Philos. Mod. Phys. 36, 355 (2005)] it is argued that to properly understand the thermodynamics of Landauer’s principle it is necessary to extend the concept of logical operations to include indeterministic operations. Here we examine the thermodynamics of such operations in more detail, extending the work of Landauer to include indeterministic operations and to include logical states with variable entropies, temperatures, and mean energies. We derive the most general statement of Landauer’s principle and prove its universality, extending considerably the validity of previous proofs. This confirms conjectures made that all logical operations may, in principle, be performed in a thermodynamically reversible fashion, although logically irreversible operations would require special, practically rather difficult, conditions to do so. We demonstrate a physical process that can perform any computation without work requirements or heat exchange with the environment. Many widespread statements of Landauer’s principle are shown to be special cases of our generalized principle.
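For reference, the canonical statement of the principle, together with the shape its generalization takes when logical states are described by Shannon entropies (in bits; the paper's precise statement also tracks state-dependent temperatures and mean energies):

```latex
% Erasing one bit at temperature T dissipates at least
Q_{\text{erase}} \;\ge\; k_B T \ln 2 .
% For a logical operation taking inputs of entropy H_{\mathrm{in}} to
% outputs of entropy H_{\mathrm{out}} (both in bits), bounds of the form
\langle Q \rangle \;\ge\; k_B T \ln 2 \,\bigl(H_{\mathrm{in}} - H_{\mathrm{out}}\bigr)
% follow, with equality approached only by thermodynamically
% reversible implementations.
```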
Epigenetic Treatment of Neurodegenerative Ophthalmic Disorders: An Eye Toward the Future
Moos, Walter H.; Faller, Douglas V.; Glavas, Ioannis P.; Harpp, David N.; Irwin, Michael H.; Kanara, Iphigenia; Pinkert, Carl A.; Powers, Whitney R.; Steliou, Kosta; Vavvas, Demetrios G.; Kodukula, Krishna
2017-01-01
Eye disease is one of the primary medical conditions that requires attention and therapeutic intervention in ageing populations worldwide. Further, the global burden of diabetes and obesity, along with heart disease, all lead to secondary manifestations of ophthalmic distress. Therefore, there is increased interest in developing innovative new approaches that target various mechanisms and sequelae driving conditions that result in adverse vision. The research challenge is even greater given that the terrain of eye diseases is difficult to landscape into a single therapeutic theme. This report addresses the burden of eye disease due to mitochondrial dysfunction, including antioxidant, autophagic, epigenetic, mitophagic, and other cellular processes that modulate the biomedical end result. In this light, we single out lipoic acid as a potent known natural activator of these pathways, along with alternative and potentially more effective conjugates, which together harness the necessary potency, specificity, and biodistribution parameters required for improved therapeutic outcomes. PMID:29291141
Requirements Document for Development of a Livermore Tomography Tools Interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seetho, I. M.
In this document, we outline an exercise performed at LLNL to evaluate the user interface deficits of a LLNL-developed CT reconstruction software package, Livermore Tomography Tools (LTT). We observe that a difficult-to-use command line interface and the lack of support functions compound to generate a bottleneck in the CT reconstruction process when input parameters to key functions are not well known. Through the exercise of systems engineering best practices, we generate key performance parameters for a LTT interface refresh, and specify a combination of back-end (“test-mode” functions) and front-end (graphical user interface visualization and command scripting tools) solutions to LTT's poor user interface that aim to mitigate issues and lower costs associated with CT reconstruction using LTT. Key functional and non-functional requirements and risk mitigation strategies for the solution are outlined and discussed.
Training Requirements in OSHA Standards. Revised.
ERIC Educational Resources Information Center
Occupational Safety and Health Administration, Washington, DC.
This booklet contains excerpts of the training-related requirements of the standards promulgated by the Occupational Safety and Health Administration (OSHA). It is designed as an aid for employers, safety and health professionals, and others who need to know training requirements. (References to training may be difficult to locate in the long and…
NASA Astrophysics Data System (ADS)
Prakash, Shashi; Kumar, Subrata
2017-09-01
CO2 lasers are commonly used for fabricating polymer-based microfluidic devices. Despite several key advantages, such as low cost, time effectiveness, ease of operation and no requirement for a clean room facility, CO2 lasers suffer from a few disadvantages, such as thermal bulging, poor dimensional control, difficulty producing microchannels with other than Gaussian cross-sectional shapes, and inclined surface walls. Many microfluidic devices require square or rectangular cross-sections, which are difficult to produce using normal CO2 laser procedures. In this work, a thin copper sheet of 40 μm was used as a mask above the PMMA (polymethyl methacrylate) substrate while fabricating the microchannels utilizing the raster scanning feature of the CO2 laser. Microchannels with different widths were fabricated under with-mask and without-mask conditions, and the two fabrication processes were compared. It was found that microchannels with U-shaped and rectangular cross-sections can be produced efficiently using the with-mask technique. In addition, this technique can provide excellent dimensional control and better surface quality of the microchannel walls. Such a microchannel fabrication process does not require any post-processing. The fabrication of the mask using a nanosecond fiber laser is discussed in detail. An underwater laser fabrication method was adopted to overcome heat-related defects in mask preparation. Overall, the technique was found to be easy to adopt, and significant improvements were observed in microchannel fabrication.
Face-to-face interference in typical and atypical development
Riby, Deborah M; Doherty-Sneddon, Gwyneth; Whittle, Lisa
2012-01-01
Visual communication cues facilitate interpersonal communication. It is important that we look at faces to retrieve and subsequently process such cues. It is also important that we sometimes look away from faces as they increase cognitive load that may interfere with online processing. Indeed, when typically developing individuals hold face gaze it interferes with task completion. In this novel study we quantify face interference for the first time in Williams syndrome (WS) and Autism Spectrum Disorder (ASD). These disorders of development impact on cognition and social attention, but how do faces interfere with cognitive processing? Individuals developing typically as well as those with ASD (n = 19) and WS (n = 16) were recorded during a question and answer session that involved mathematics questions. In phase 1 gaze behaviour was not manipulated, but in phase 2 participants were required to maintain eye contact with the experimenter at all times. Looking at faces decreased task accuracy for individuals who were developing typically. Critically, the same pattern was seen in WS and ASD, whereby task performance decreased when participants were required to hold face gaze. The results show that looking at faces interferes with task performance in all groups. This finding requires the caveat that individuals with WS and ASD found it harder than individuals who were developing typically to maintain eye contact throughout the interaction. Individuals with ASD struggled to hold eye contact at all points of the interaction while those with WS found it especially difficult when thinking. PMID:22356183
Revisiting software specification and design for large astronomy projects
NASA Astrophysics Data System (ADS)
Wiant, Scott; Berukoff, Steven
2016-07-01
The separation of science and engineering in the delivery of software systems overlooks the true nature of the problem being solved and the organization that will solve it. A systems engineering approach to managing the requirements flow between these two groups, as between a customer and contractor, has been used with varying degrees of success by well-known entities such as the U.S. Department of Defense. However, treating science as the customer and engineering as the contractor fosters unfavorable consequences that could be avoided and causes opportunities to be missed. For example, the "problem" being solved is only partially specified through the requirements generation process, since that process focuses on the detailed specification guiding the parties to a technical solution. Equally important is the portion of the problem that will be solved through the definition of processes and the staff interacting through them. This interchange between people and processes is often underrepresented and underappreciated. By concentrating on the full problem and collaborating on a strategy for its solution, a science-implementing organization can realize the benefits of driving towards common goals (not just requirements) and a cohesive solution to the entire problem. The initial phase of any project, when well executed, is often the most difficult yet most critical, and thus it is essential to employ a methodology that reinforces collaboration and leverages the full suite of capabilities within the team. This paper describes an integrated approach to specifying the needs induced by a problem and the design of its solution.
Kfits: a software framework for fitting and cleaning outliers in kinetic measurements.
Rimon, Oded; Reichmann, Dana
2018-01-01
Kinetic measurements have played an important role in elucidating biochemical and biophysical phenomena for over a century. While many tools for analysing kinetic measurements exist, most require low noise levels in the data, leaving outlier measurements to be cleaned manually. This is particularly true for protein misfolding and aggregation processes, which are extremely noisy and hence difficult to model. Understanding these processes is paramount, as they are associated with diverse physiological processes and disorders, most notably neurodegenerative diseases. Therefore, a better tool for analysing and cleaning protein aggregation traces is required. Here we introduce Kfits, an intuitive graphical tool for detecting and removing noise caused by outliers in protein aggregation kinetics data. Following its workflow allows the user to quickly and easily clean large quantities of data and receive kinetic parameters for assessment of the results. With minor adjustments, the software can be applied to any type of kinetic measurements, not restricted to protein aggregation. Kfits is implemented in Python and available online at http://kfits.reichmannlab.com, in source at https://github.com/odedrim/kfits/, or by direct installation from PyPI (`pip install kfits`).
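Kfits itself is an interactive graphical tool, so the following generic Python sketch only illustrates the kind of fit-then-drop-outliers loop such software implements; the sigmoidal aggregation model and the 3-sigma rejection threshold are assumptions, not the Kfits API:

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(t, a, k, t50):
    """Simple sigmoidal growth curve, a stand-in aggregation model."""
    return a / (1.0 + np.exp(-k * (t - t50)))

def fit_and_clean(t, y, n_sigma=3.0, rounds=3):
    """Iteratively fit the model, then flag points far from the fit as outliers."""
    t, y = np.asarray(t, float), np.asarray(y, float)
    keep = np.ones(y.size, dtype=bool)
    for _ in range(rounds):
        popt, _ = curve_fit(sigmoid, t[keep], y[keep],
                            p0=(y.max(), 1.0, np.median(t)), maxfev=10000)
        resid = y - sigmoid(t, *popt)
        keep = np.abs(resid) < n_sigma * np.std(resid[keep])
    return popt, keep   # kinetic parameters and the mask of retained points
```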
Jupp, Simon; Burdett, Tony; Welter, Danielle; Sarntivijai, Sirarat; Parkinson, Helen; Malone, James
2016-01-01
Authoring bio-ontologies is a task that has traditionally been undertaken by skilled experts trained in understanding complex languages such as the Web Ontology Language (OWL), in tools designed for such experts. As requests for new terms are made, the need for expert ontologists represents a bottleneck in the development process. Furthermore, the ability to rigorously enforce ontology design patterns in large, collaboratively developed ontologies is difficult with existing ontology authoring software. We present Webulous, an application suite for supporting ontology creation by design patterns. Webulous provides infrastructure to specify templates for populating ontology design patterns that get transformed into OWL assertions in a target ontology. Webulous provides programmatic access to the template server and a client application has been developed for Google Sheets that allows templates to be loaded, populated and resubmitted to the Webulous server for processing. The development and delivery of ontologies to the community requires software support that goes beyond the ontology editor. Building ontologies by design patterns and providing simple mechanisms for the addition of new content helps reduce the overall cost and effort required to develop an ontology. The Webulous system provides support for this process and is used as part of the development of several ontologies at the European Bioinformatics Institute.
Image processing for IMRT QA dosimetry.
Zaini, Mehran R; Forest, Gary J; Loshek, David D
2005-01-01
We have automated the determination of the placement location of the dosimetry ion chamber within intensity-modulated radiotherapy (IMRT) fields, as part of streamlining the entire IMRT quality assurance process. This paper describes the mathematical image-processing techniques used to arrive at the appropriate measurement locations within the planar dose maps of the IMRT fields. A specific spot within the found region is identified based on its flatness, radiation magnitude, location, area, and the avoidance of the interleaf spaces. The techniques used include applying a Laplacian, dilation, erosion, region identification, and measurement point selection based on three parameters: the size of the erosion operator, the gradient, and the importance of the area of a region versus its magnitude. These three parameters are adjustable by the user. However, the first requires tweaking on extremely rare occasions, the gradient requires rare adjustments, and the last parameter needs occasional fine-tuning. This algorithm has been tested in over 50 cases. In about 5% of cases, the algorithm does not find a measurement point, due to extremely steep and narrow regions within the fluence maps. In such cases, manual selection of a point is allowed by our code, although this too is difficult, since the fluence map does not lend itself to an appropriate measurement point selection.
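The pipeline described (derivative filtering, morphological erosion, region identification, then scoring regions by area versus magnitude) can be sketched with scipy.ndimage. The numeric thresholds below are hypothetical stand-ins for the paper's three user-adjustable parameters:

```python
import numpy as np
from scipy import ndimage

def candidate_region(dose, grad_max=0.02, erosion_iters=2, min_area=25):
    """Pick a flat, sizeable, high-dose region of a planar dose map (sketch).

    dose: 2-D array normalised to its maximum. Returns a boolean mask of
    the best-scoring region, or None if no region qualifies.
    """
    gy, gx = np.gradient(dose)
    flat = np.hypot(gx, gy) < grad_max                  # low local gradient
    flat = ndimage.binary_erosion(flat, iterations=erosion_iters)
    labels, n = ndimage.label(flat)                     # connected regions
    best, best_score = None, -np.inf
    for r in range(1, n + 1):
        mask = labels == r
        area = int(mask.sum())
        if area < min_area:
            continue
        score = area * dose[mask].mean()   # trade area against dose magnitude
        if score > best_score:
            best, best_score = mask, score
    return best
```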
Engineering model for ultrafast laser microprocessing
NASA Astrophysics Data System (ADS)
Audouard, E.; Mottay, E.
2016-03-01
Ultrafast laser micro-machining relies on complex laser-matter interaction processes, leading to a virtually athermal laser ablation. The development of industrial ultrafast laser applications benefits from a better understanding of these processes. To this end, a number of sophisticated scientific models have been developed, providing valuable insights into the physics of the interaction. Yet, from an engineering point of view, they are often difficult to use, and require a number of adjustable parameters. We present a simple engineering model for ultrafast laser processing, applied in various real-life applications: percussion drilling, line engraving, and non-normal incidence trepanning. The model requires only two global parameters. Analytical results are derived for single-pulse percussion drilling or single-pass engraving. Simple assumptions allow prediction of the effect of non-normally incident beams to obtain key parameters for trepanning drilling. The model is compared to experimental data on stainless steel with a wide range of laser characteristics (pulse duration, repetition rate, pulse energy) and machining conditions (sample or beam speed). Ablation depth and volume ablation rate are modeled for pulse durations from 100 fs to 1 ps. A trepanning time of 5.4 s with a conicity of 0.15° is obtained for a hole of 900 μm depth and 100 μm diameter.
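A common two-parameter description of ultrafast ablation, and plausibly the kind of model meant here (an assumption on our part), is the logarithmic law with a threshold fluence F_th and an effective penetration depth delta:

```python
import numpy as np

def depth_per_pulse(fluence, f_th, delta):
    """Logarithmic ablation law: depth = delta * ln(F / F_th) for F > F_th.

    fluence : peak fluence per pulse (same units as f_th)
    f_th    : ablation threshold fluence (hypothetical material parameter)
    delta   : effective energy penetration depth
    """
    f = np.asarray(fluence, dtype=float)
    d = np.zeros_like(f)
    above = f > f_th
    d[above] = delta * np.log(f[above] / f_th)
    return d
```

Summing this per-pulse depth over the pulse overlap pattern of a scanned beam is the usual route from such a law to engraving depths and drilling rates.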
The Difficult Gallbladder: A Safe Approach to a Dangerous Problem.
Santos, B Fernando; Brunt, L Michael; Pucci, Michael J
2017-06-01
Laparoscopic cholecystectomy is a common surgical procedure, and remains the gold standard for the management of benign gallbladder and biliary disease. While this procedure can be technically straightforward, it can also represent one of the most challenging operations facing surgeons. This dichotomy of a routine operation performed so commonly that poses such a hidden risk of severe complications, such as bile duct injury, must keep surgeons steadfast in the pursuit of safety. The "difficult gallbladder" requires strict adherence to the Culture of Safety in Cholecystectomy, which promotes safety first and assists surgeons in managing or avoiding difficult operative situations. This review will discuss the management of the difficult gallbladder and propose the use of subtotal fenestrating cholecystectomy as a definitive option during this dangerous situation.
Using literature to help physician-learners understand and manage "difficult" patients.
Shapiro, J; Lie, D
2000-07-01
Despite significant clinical and research efforts aimed at recognizing and managing "difficult" patients, such patients remain a frustrating experience for many clinicians. This is especially true for primary care residents, who are required to see a significant volume of patients with diverse and complex problems, but who may not have adequate training and life experience to enable them to deal with problematic doctor-patient situations. Literature--short stories, poems, and patient narratives--is a little-explored educational tool to help residents in understanding and working with difficult patients. In this report, the authors examine the mechanics of using literature to teach about difficult patients, including structuring the learning environment, establishing learning objectives, identifying teaching resources and appropriate pedagogic methods, and incorporating creative writing assignments. They also present an illustrative progression of a typical literature-based teaching session about a difficult patient.
A Phenomenological Study of Urban Search and Rescue Members Who Responded to a Disaster
ERIC Educational Resources Information Center
Kerns, Terry L.
2012-01-01
The complicated world of disaster management requires urban search and rescue (US&R) members to be well trained to respond to complex, unpredictable, and difficult to manage disasters anywhere in the world on short notice. Disasters are becoming more complex and difficult to manage as was witnessed by the multi-faceted disaster in Japan in…
ERIC Educational Resources Information Center
Fitzmaurice, Olivia; Leavy, Aisling; Hannigan, Ailish
2014-01-01
An investigation into prospective mathematics/statistics teachers' (n = 134) conceptual understanding of statistics and attitudes to statistics carried out at the University of Limerick revealed an overall positive attitude to statistics but a perception that it can be a difficult subject, in particular that it requires a great deal of discipline…
Bulgeless Galaxies Hosting 10⁷ M⊙ AGN in Galaxy Zoo: The Growth of Black Holes via Secular Processes
NASA Astrophysics Data System (ADS)
Simmons, Brooke; Lintott, C. J.; Schawinski, K.; Moran, E. C.; Han, A.; Kaviraj, S.; Masters, K. L.; Urry, C. M.; Willett, K.; Bamford, S. P.; Nichol, R.
2013-01-01
The growth of supermassive black holes (SMBHs) appears to proceed via multiple pathways including mergers and secular processes, but these are difficult to disentangle for most galaxies given their complex evolutionary histories. In order to understand the effects of secular galaxy evolution on black hole growth, we require a sample of active galactic nuclei (AGN) in galaxies with a calm formation history free of significant mergers, a population that heretofore has been difficult to locate. Here we present a sample of 13 AGN in massive galaxies lacking the classical bulges believed inevitably to result from mergers; they also either lack or have extremely small pseudobulges, meaning they have had very calm accretion histories. This is the largest sample to date of massive, bulgeless AGN host galaxies selected without any direct restriction on the SMBH mass. The broad-line objects in the sample have black hole masses of 10⁶-10⁷ M⊙; Eddington arguments imply similar masses for the rest of the sample, meaning these black holes have grown substantially in the absence of mergers or other bulge-building processes such as violent disk instabilities. The black hole masses are systematically higher than expected from established bulge-black hole relations. However, these systems may be consistent with the correlation between black hole mass and total stellar mass. We discuss these results in the context of other studies and consider the implication that the details of stellar galaxy evolution and dynamics may not be fundamental to the co-evolution of galaxies and black holes.
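The Eddington argument referred to bounds the black hole mass from the bolometric luminosity, assuming the AGN radiates at or below the Eddington limit:

```latex
L_{\mathrm{Edd}} \;=\; \frac{4\pi G M m_p c}{\sigma_T}
\;\approx\; 1.26\times10^{38}\,\left(\frac{M}{M_\odot}\right)\ \mathrm{erg\,s^{-1}},
\qquad
M_{\mathrm{BH}} \;\gtrsim\; \frac{L_{\mathrm{bol}}}{1.26\times10^{38}\ \mathrm{erg\,s^{-1}}}\,M_\odot .
```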
NASA Astrophysics Data System (ADS)
Audebert, M.; Clément, R.; Touze-Foltz, N.; Günther, T.; Moreau, S.; Duquennoi, C.
2014-12-01
Leachate recirculation is a key process in municipal waste landfills functioning as bioreactors. To quantify the water content and to assess the leachate injection system, in-situ methods, usually electrical resistivity tomography (ERT), are required to obtain spatially distributed information. This geophysical method is based on an inversion process, which presents two major problems in terms of delimiting the infiltration area. First, it is difficult for ERT users to choose an appropriate inversion parameter set. Indeed, it might not be sufficient to interpret only the optimum model (i.e. the model with the chosen regularisation strength) because it is not necessarily the model which best represents the physical process studied. Second, it is difficult to delineate the infiltration front based on resistivity models because of the smoothness of the inversion results. This paper proposes a new methodology called MICS (multiple inversions and clustering strategy), which allows ERT users to improve the delimitation of the infiltration area in leachate injection monitoring. The MICS methodology is based on (i) a multiple inversion step, varying the inversion parameter values to take a wide range of resistivity models into account, and (ii) a clustering strategy to improve the delineation of the infiltration front. In this paper, MICS was assessed on two types of data. First, a numerical assessment allowed us to optimise and test MICS for different infiltration area sizes, contrasts and shapes. Second, MICS was applied to a field data set gathered during leachate recirculation on a bioreactor.
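To make the clustering step concrete, here is a minimal Python sketch (not the authors' code; the model stack, mesh size and infiltration zone are synthetic assumptions). Each mesh cell is described by its log-resistivity across all inversions, and k-means separates cells that respond consistently to the infiltration from the background:

    # Hedged sketch of a MICS-style clustering step on a stack of inversions.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    n_models, n_cells = 20, 500          # e.g. 20 inversions of a 500-cell mesh
    models = rng.lognormal(mean=3.0, sigma=0.3, size=(n_models, n_cells))
    models[:, 200:260] *= 0.3            # hypothetical conductive infiltration zone

    features = np.log10(models).T        # shape (n_cells, n_models)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

    # Flag the cluster with the lower median resistivity as the infiltration area.
    infiltrated = labels == np.argmin(
        [np.median(features[labels == k]) for k in (0, 1)])
    print(f"cells flagged as infiltrated: {infiltrated.sum()}")

Clustering over the whole stack, rather than thresholding a single smooth model, is what sharpens the front.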
NASA Technical Reports Server (NTRS)
Scola, Salvatore; Stavely, Rebecca; Jackson, Trevor; Boyer, Charlie; Osmundsen, Jim; Turczynski, Craig; Stimson, Chad
2016-01-01
Performance-related effects of system level temperature changes can be a key consideration in the design of many types of optical instruments. This is especially true for space-based imagers, which may require complex thermal control systems to maintain alignment of the optical components. Structural-Thermal-Optical-Performance (STOP) analysis is a multi-disciplinary process that can be used to assess the performance of these optical systems when subjected to the expected design environment. This type of analysis can be very time consuming, which makes it difficult to use as a trade study tool early in the project life cycle. In many cases, only one or two iterations can be performed over the course of a project. This limits the design space to best practices since it may be too difficult, or take too long, to test new concepts analytically. In order to overcome this challenge, automation, and a standard procedure for performing these studies is essential. A methodology was developed within the framework of the Comet software tool that captures the basic inputs, outputs, and processes used in most STOP analyses. This resulted in a generic, reusable analysis template that can be used for design trades for a variety of optical systems. The template captures much of the upfront setup such as meshing, boundary conditions, data transfer, naming conventions, and post-processing, and therefore saves time for each subsequent project. A description of the methodology and the analysis template is presented, and results are described for a simple telescope optical system.
Goold, S D
1996-01-01
Assuming that rationing health care is unavoidable, and that it requires moral reasoning, how should we allocate limited health care resources? This question is difficult because our pluralistic, liberal society has no consensus on a conception of distributive justice. In this article I focus on an alternative: Who shall decide how to ration health care, and how shall this be done to respect autonomy, pluralism, liberalism, and fairness? I explore three processes for making rationing decisions: cost-utility analysis, informed democratic decision making, and applications of the veil of ignorance. I evaluate these processes as examples of procedural justice, assuming that there is no outcome considered the most just. I use consent as a criterion to judge competing processes so that rationing decisions are, to some extent, self-imposed. I also examine the processes' feasibility in our current health care system. Cost-utility analysis does not meet criteria for actual or presumed consent, even if costs and health-related utility could be measured perfectly. Existing structures of government cannot creditably assimilate the information required for sound rationing decisions, and grassroots efforts are not representative. Applications of the veil of ignorance are more useful for identifying principles relevant to health care rationing than for making concrete rationing decisions. I outline a process of decision making, specifically for health care, that relies on substantive, selected representation, respects pluralism, liberalism, and deliberative democracy, and could be implemented at the community or organizational level.
Weight and the Future of Space Flight Hardware Cost Modeling
NASA Technical Reports Server (NTRS)
Prince, Frank A.
2003-01-01
Weight has been used as the primary input variable for cost estimating almost as long as there have been parametric cost models. While there are good reasons for using weight, serious limitations exist. These limitations have been addressed by multi-variable equations and trend analysis in models such as NAFCOM, PRICE, and SEER; however, these models have not been able to address the significant time lags that can occur between the development of similar space flight hardware systems. These time lags make the cost analyst's job difficult because insufficient data exists to perform trend analysis, and the current set of parametric models is not well suited to accommodating process improvements in space flight hardware design, development, build and test. As a result, people of good faith can have serious disagreements over the cost of new systems. To address these shortcomings, new cost modeling approaches are needed. The most promising approach is process-based (sometimes called activity) costing. Developing process-based models will require a detailed understanding of the functions required to produce space flight hardware combined with innovative approaches to estimating the necessary resources. Particularly challenging will be the lack of data at the process level. One method for developing a model is to combine notional algorithms with a discrete event simulation and model changes to the total cost as perturbations to the program are introduced. Despite these challenges, the potential benefits are such that efforts should be focused on developing process-based cost models.
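The perturbation idea lends itself to a small illustration. The sketch below uses entirely hypothetical activities, durations and rates (it is not NAFCOM, PRICE or SEER); it treats total cost as the sum of notional process activities and propagates a schedule perturbation by Monte Carlo:

    # Minimal process-based costing sketch with Monte Carlo perturbation.
    import numpy as np

    rng = np.random.default_rng(1)
    # (mean duration in months, monthly burn rate in $M) per notional activity
    activities = {"design": (18, 2.0), "build": (12, 3.5), "test": (9, 1.5)}

    def total_cost(schedule_slip=0.0, n=10_000):
        """Simulate total cost; schedule_slip perturbs durations, e.g. 0.1 = +10%."""
        cost = np.zeros(n)
        for mean_dur, rate in activities.values():
            dur = rng.normal(mean_dur * (1 + schedule_slip), 0.1 * mean_dur, n)
            cost += np.clip(dur, 0, None) * rate
        return cost

    base = total_cost()
    slipped = total_cost(schedule_slip=0.15)
    print(f"baseline: {base.mean():.1f} +/- {base.std():.1f} $M")
    print(f"15% slip: {slipped.mean():.1f} +/- {slipped.std():.1f} $M")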
In situ ohmic contact formation for n-type Ge via non-equilibrium processing
NASA Astrophysics Data System (ADS)
Prucnal, S.; Frigerio, J.; Napolitani, E.; Ballabio, A.; Berencén, Y.; Rebohle, L.; Wang, M.; Böttger, R.; Voelskow, M.; Isella, G.; Hübner, R.; Helm, M.; Zhou, S.; Skorupa, W.
2017-11-01
Highly scaled nanoelectronics requires effective channel doping above 5 × 10^19 cm^-3 together with ohmic contacts with extremely low specific contact resistivity. Nowadays, Ge is very attractive for modern optoelectronics due to its high carrier mobility and quasi-direct bandgap, but n-type Ge doped above 5 × 10^19 cm^-3 is metastable and thus difficult to achieve. In this letter, we report on the formation of low-resistivity ohmic contacts in highly n-type doped Ge via non-equilibrium thermal processing consisting of millisecond-range flash lamp annealing. This is a single-step process that allows for the formation of a 90 nm thick NiGe layer with a very sharp interface between NiGe and Ge. The measured carrier concentration in Ge is above 9 × 10^19 cm^-3 with a specific contact resistivity of 1.2 × 10^-6 Ω cm^2. Simultaneously, both the diffusion and the electrical deactivation of P are fully suppressed.
Gauvry, Emilie; Mathot, Anne-Gabrielle; Leguérinel, Ivan; Couvert, Olivier; Postollec, Florence; Broussolle, Véronique; Coroller, Louis
2017-05-01
Spore-forming bacteria are able to grow under a wide range of environmental conditions, to form biofilms and to differentiate into resistant forms: spores. This resistant form allows their dissemination in the environment; consequently, they may contaminate raw materials. Sporulation can occur all along the food chain, in raw materials, but also in food processes, leading to an increase in food contamination. However, the problem of sporulation during food processing is poorly addressed and sporulation niches are difficult to identify from the farm to the fork. Sporulation is a survival strategy. Some environmental factors are required to trigger this differentiation process and others act by modulating it. The efficiency of sporulation is the result of the combined effects of these two types of factors on vegetative cell metabolism. This paper aims to explain and help identify sporulation niches in the food chain, based on features of spore-former physiology. Copyright © 2016 Institut Pasteur. Published by Elsevier Masson SAS. All rights reserved.
[Inflammation and obesity (lipoinflammation)].
Izaola, Olatz; de Luis, Daniel; Sajoux, Ignacio; Domingo, Joan Carles; Vidal, Montse
2015-06-01
Obesity is a chronic disease with multiple origins. It is a widespread global phenomenon carrying potentially serious complications which requires a multidisciplinary approach due to the significant clinical repercussions and elevated health costs associated with the disease. The most recent evidence indicates that it shares a common characteristic with other prevalent, difficult-to-treat pathologies: chronic, low-grade inflammation which perpetuates the disease and is associated with multiple complications. The current interest in lipoinflammation or chronic inflammation associated with obesity derives from an understanding of the alterations and remodelling that occurs in the adipose tissue, with the participation of multiple factors and elements throughout the process. Recent research highlights the importance of some of these molecules, called pro-resolving mediators, as possible therapeutic targets in the treatment of obesity. This article reviews the evidence published on the mechanisms that regulate the adipose tissue remodelling process and lipoinflammation both in obesity and in the mediators that are directly involved in the appearance and resolution of the inflammatory process. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
An applications-oriented approach to the development of virtual environments
NASA Technical Reports Server (NTRS)
Crowe, Michael X.
1994-01-01
The field of Virtual Reality (VR) is diverse, ranging in scope from research into fundamental enabling technologies to the building of full-scale entertainment facilities. However, the concept of virtual reality means many things to many people. Ideally, a definition of VR should derive from how it can provide solutions to existing challenges in building advanced human-computer interfaces. The measure of success for VR lies in its ability to enhance the assimilation of complex information, whether to aid in difficult decision-making processes or to recreate real experiences in a compelling way. This philosophy is described using an example from a VR-based advertising project. The common and unique elements of this example are explained, though the fundamental development process is the same for all virtual environments that support information transfer. In short, this is an applications-oriented approach that begins by establishing and prioritizing user requirements and seeks to add value to the information transfer process through the appropriate use of VR technology.
Silicon cells made by self-aligned selective-emitter plasma-etchback process
Ruby, Douglas S.; Schubert, William K.; Gee, James M.; Zaidi, Saleem H.
2000-01-01
Photovoltaic cells and methods for making them are disclosed wherein the metallized grids of the cells are used to mask portions of cell emitter regions to allow selective etching of phosphorus-doped emitter regions. The preferred etchant is SF6 or a combination of SF6 and O2. This self-aligned selective etching allows for enhanced blue response (versus cells with uniform heavy doping of the emitter) while preserving heavier doping in the region beneath the gridlines needed for low contact resistance. Embodiments are disclosed for making cells with or without textured surfaces. Optional steps include plasma hydrogenation and PECVD nitride deposition, each of which is suited to customized applications for the requirements of given cells to be manufactured. The techniques disclosed could replace expensive and difficult alignment methodologies used to obtain selectively etched emitters, and they may be easily integrated with existing plasma processing methods; the techniques of the invention may be accomplished in a single plasma-processing chamber.
Lunar oxygen and metal for use in near-Earth space: Magma electrolysis
NASA Technical Reports Server (NTRS)
Colson, Russell O.; Haskin, Larry A.
1990-01-01
Because it is energetically easier to get material from the Moon to Earth orbit than from the Earth itself, the Moon is a potentially valuable source of materials for use in space. The unique conditions on the Moon, such as vacuum, absence of many reagents common on the Earth, and the presence of very nontraditional ores suggest that a unique and nontraditional process for extracting materials from the ores may prove the most practical. With this in mind, an investigation of unfluxed silicate electrolysis as a method for extracting oxygen, iron, and silicon from lunar regolith was initiated and is discussed. The advantages of the process include simplicity of concept, absence of need to supply reagents from Earth, and low power and mass requirements for the processing plant. Disadvantages include the need for uninterrupted high temperature and the highly corrosive nature of the high-temperature silicate melts which has made identifying suitable electrode and container materials difficult.
NASA Astrophysics Data System (ADS)
He, Li; Song, Xuan
2018-03-01
In recent years, ceramic fabrication using stereolithography (SLA) has gained in popularity because of its high accuracy and density that can be achieved in the final part of production. One of the key challenges in ceramic SLA is that support structures are required for building overhanging features, whereas removing these support structures without damaging the components is difficult. In this research, a suspension-enclosing projection-stereolithography process is developed to overcome this challenge. This process uses a high-yield-stress ceramic slurry as the feedstock material and exploits the elastic force of the material to support overhanging features without the need for building additional support structures. Ceramic slurries with different solid loadings are studied to identify the rheological properties most suitable for supporting overhanging features. An analytical model of a double doctor-blade module is established to obtain uniform and thin recoating layers from a high-yield-stress slurry. Several test cases highlight the feasibility of using a high-yield-stress slurry to support overhanging features in SLA.
Interferometry meets the third and fourth dimensions in galaxies
NASA Astrophysics Data System (ADS)
Trimble, Virginia
2015-02-01
Radio astronomy began with one array (Jansky's) and one paraboloid of revolution (Reber's) as collecting areas and has now reached the point where a large number of facilities are arrays of paraboloids, each of which would have looked enormous to Reber in 1932. In the process, interferometry has contributed to the counting of radio sources, establishing superluminal velocities in AGN jets, mapping of sources from the bipolar cow shape on up to full grey-scale and colored images, determining spectral energy distributions requiring non-thermal emission processes, and much else. The process has not been free of competition and controversy, at least partly because it is just a little difficult to understand how earth-rotation, aperture-synthesis interferometry works. Some very important results, for instance the mapping of HI in the Milky Way to reveal spiral arms, warping, and flaring, actually came from single moderate-sized paraboloids. The entry of China into the radio astronomy community has given large (40-110 meter) paraboloids a new lease on life.
Metallization for Yb14MnSb11-Based Thermoelectric Materials
NASA Technical Reports Server (NTRS)
Firdosy, Samad; Li, Billy Chun-Yip; Ravi, Vilupanur; Sakamoto, Jeffrey; Caillat, Thierry; Ewell, Richard C.; Brandon, Erik J.
2011-01-01
Thermoelectric materials provide a means for converting heat into electrical power using a fully solid-state device. Power-generating devices (which include individual couples as well as multicouple modules) require the use of n-type and p-type thermoelectric materials, typically comprising highly doped narrow band-gap semiconductors which are connected to a heat collector and electrodes. To achieve greater device efficiency and greater specific power will require using new thermoelectric materials, in more complex combinations. One such material is the p-type compound semiconductor Yb14MnSb11 (YMS), which has been demonstrated to have one of the highest ZT values at 1,000 °C, the desired operational temperature of many space-based radioisotope thermoelectric generators (RTGs). Despite the favorable attributes of the bulk YMS material, it must ultimately be incorporated into a power-generating device using a suitable joining technology. Typically, processes such as diffusion bonding and/or brazing are used to join thermoelectric materials to the heat collector and electrodes, with the goal of providing a stable, ohmic contact with high thermal conductivity at the required operating temperature. Since YMS is an inorganic compound featuring chemical bonds with a mixture of covalent and ionic character, simple metallurgical diffusion bonding is difficult to implement. Furthermore, the Sb within YMS readily reacts with most metals to form antimonide compounds with a wide range of stoichiometries. Although choosing metals that react to form high-melting-point antimonides could be employed to form a stable reaction bond, it is difficult to limit the reactivity of Sb in YMS such that the electrode is not completely consumed at an operating temperature of 1,000 °C. Previous attempts to form suitable metallization layers resulted in poor bonding, complete consumption of the metallization layer, or fracture within the YMS thermoelement (or leg).
Adaptive grid methods for RLV environment assessment and nozzle analysis
NASA Technical Reports Server (NTRS)
Thornburg, Hugh J.
1996-01-01
Rapid access to highly accurate data about complex configurations is needed for multi-disciplinary optimization and design. In order to efficiently meet these requirements a closer coupling between the analysis algorithms and the discretization process is needed. In some cases, such as free surface, temporally varying geometries, and fluid structure interaction, the need is unavoidable. In other cases the need is to rapidly generate and modify high quality grids. Techniques such as unstructured and/or solution-adaptive methods can be used to speed the grid generation process and to automatically cluster mesh points in regions of interest. Global features of the flow can be significantly affected by isolated regions of inadequately resolved flow. These regions may not exhibit high gradients and can be difficult to detect. Thus excessive resolution in certain regions does not necessarily increase the accuracy of the overall solution. Several approaches have been employed for both structured and unstructured grid adaption. The most widely used involve grid point redistribution, local grid point enrichment/derefinement or local modification of the actual flow solver. However, the success of any one of these methods ultimately depends on the feature detection algorithm used to determine solution domain regions which require a fine mesh for their accurate representation. Typically, weight functions are constructed to mimic the local truncation error and may require substantial user input. Most problems of engineering interest involve multi-block grids and widely disparate length scales. Hence, it is desirable that the adaptive grid feature detection algorithm be developed to recognize flow structures of different type as well as differing intensity, and adequately address scaling and normalization across blocks. These weight functions can then be used to construct blending functions for algebraic redistribution, interpolation functions for unstructured grid generation, forcing functions to attract/repel points in an elliptic system, or to trigger local refinement, based upon application of an equidistribution principle. The popularity of solution-adaptive techniques is growing in tandem with unstructured methods. The difficulty of precisely controlling mesh densities and orientations with current unstructured grid generation systems has driven the use of solution-adaptive meshing. Derivatives of density or pressure are widely used for construction of such weight functions, and have proven very successful for inviscid flows with shocks. However, less success has been realized for flowfields with viscous layers, vortices or shocks of disparate strength. It is difficult to maintain the appropriate mesh point spacing in the various regions which require a fine spacing for adequate resolution. Mesh points often migrate from important regions due to refinement of dominant features. An example of this is the well-known tendency of adaptive methods to increase the resolution of shocks in the flowfield around airfoils, but in the incorrect location due to inadequate resolution of the stagnation region. This problem has been the motivation for this research.
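For readers unfamiliar with the equidistribution principle invoked above, the following sketch (a synthetic 1D solution and weight function, not the author's code) redistributes mesh points so that each interval carries an equal share of a gradient-based weight:

    # 1D equidistribution sketch: cluster points where the weight is large.
    import numpy as np

    x = np.linspace(0.0, 1.0, 101)                 # initial uniform grid
    u = np.tanh(50 * (x - 0.5))                    # solution with a sharp front
    w = 1.0 + np.abs(np.gradient(u, x))            # weight ~ 1 + |du/dx|

    # The cumulative weight defines a monotone map; the new points sit at
    # equal increments of that cumulative integral.
    W = np.concatenate(([0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))))
    levels = np.linspace(0.0, W[-1], x.size)
    x_new = np.interp(levels, W, x)
    print(x_new[45:55])                            # points cluster near x = 0.5

In practice the weight function is the hard part, as the abstract argues: it must detect viscous layers and weak shocks as reliably as strong ones, and be normalized consistently across blocks.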
[Diagnosis and treatment guidelines for difficult-to-control asthma in children].
Navarro Merino, M; Andrés Martín, A; Asensio de la Cruz, O; García García, M L; Liñán Cortes, S; Villa Asensi, J R
2009-12-01
Children suffering from difficult-to-control asthma (DCA) require frequent appointments with their physician, complex treatment regimes and often admissions to hospital. Less than 5% of the asthmatic population suffer this condition. DCA must be correctly characterised to rule out false causes of DCA and requires making a differential diagnosis from pathologies that mimic asthma, comorbidity, environmental and psychological factors, and analysing the factors to determine poor treatment compliance. In true DCA cases, inflammation studies (exhaled nitric oxide, induced sputum, broncho-alveolar lavage and bronchial biopsy), pulmonary function and other clinical aspects can classify DCA into different phenotypes which could make therapeutic decision-making easier.
Bubenheim, D L; Wignarajah, K
1995-01-01
Resource recovery from waste streams in a space habitat is essential to minimize the resupply burden and achieve self-sufficiency. In a Controlled Ecological Life Support System (CELSS) human wastes and inedible biomass will represent significant sources of secondary raw materials necessary for support of crop plant production (carbon, water, and inorganic plant nutrients). Incineration, pyrolysis, and water extraction have been investigated as candidate processes for recovery of these important resources from inedible biomass in a CELSS. During incineration CO2 is produced by oxidation of the organic components and this product can be directly utilized by plants. Water is concomitantly produced, requiring only a phase change for recovery. Recovery of inorganics is more difficult, requiring solubilization of the incinerator ash. The process of incineration followed by water solubilization of ash resulted in loss of 35% of the inorganics originally present in the biomass. Losses were attributed to volatilization (8%) and non-water-soluble ash (27%). All of the ash remaining following incineration could be solubilized with acid, with losses resulting from volatilization only. The recovery for individual elements varied. Elemental retention in the ash ranged from 100% of that present in the biomass for Ca, P, Mg, Na, and Si to 10% for Zn. The greatest water solubility was observed for potassium with recovery of approximately 77% of that present in the straw. Potassium represented 80% of the inorganic constituents in the wheat straw, and because of slightly greater solubility made up 86% of the water-soluble ash. Following incineration of inedible biomass from wheat, 65% of the inorganics originally present in the straw were recovered by water solubilization and 92% recovered by acid solubilization. Recovery of resources is more complex for pyrolysis and water extraction. Recovery of carbon, a resource of greater mass than the inorganic component of biomass, is more difficult following pyrolysis and water extraction of biomass. In both cases, additional processors would be required to provide products equivalent to those resulting from incineration alone. The carbon, water, and inorganic resources of inedible biomass are effectively separated and output in usable forms through incineration.
In-vivo quantification of primary microRNA processing by Drosha with a luciferase based system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allegra, Danilo; Cooperation Unit 'Mechanisms of Leukemogenesis', B061, DKFZ, Im Neuenheimer Feld 280, 69120 Heidelberg; Mertens, Daniel, E-mail: daniel.mertens@uniklinik-ulm.de
2011-03-25
Research highlights: • Posttranscriptional regulation of miRNA processing is difficult to quantify. • Our in-vivo processing assay can quantify Drosha cleavage in live cells. • It is based on luciferase reporters fused with pri-miRNAs. • The assay validates the processing defect caused by a mutation in pri-16-1. • It is a sensitive method to quantify pri-miRNA cleavage by Drosha in live cells. -- Abstract: The RNAse III Drosha is responsible for the first step of microRNA maturation, the cleavage of primary miRNA to produce the precursor miRNA. Processing by Drosha is finely regulated and influences the amount of mature microRNA in a cell. We describe in the present work a method to quantify Drosha processing activity in-vivo, which is applicable to any microRNA. With respect to other methods for measuring Drosha activity, our system is faster and scalable, can be used with any cellular system and does not require cell sorting or use of radioactive isotopes. This system is useful to study regulation of Drosha activity in physiological and pathological conditions.
Nieke, Jens; Reusen, Ils
2007-01-01
User-driven requirements for remote sensing data are difficult to define, especially details on geometric, spectral and radiometric parameters. Even more difficult is a decent assessment of the required degrees of processing and corresponding data quality. It is therefore a real challenge to appropriately assess data costs and services to be provided. In 2006, the HYRESSA project was initiated within the framework 6 programme of the European Commission to analyze the user needs of the hyperspectral remote sensing community. Special focus was given to finding an answer to the key question, “What are the individual user requirements for hyperspectral imagery and its related data products?”. A Value-Benefit Analysis (VBA) was performed to retrieve user needs and address open items accordingly. The VBA is an established tool for systematic problem solving by supporting the possibility of comparing competing projects or solutions. It enables evaluation on the basis of a multidimensional objective model and can be augmented with expert's preferences. After undergoing a VBA, the scaling method (e.g., Law of Comparative Judgment) was applied for achieving the desired ranking judgments. The result, which is the relative value of projects with respect to a well-defined main objective, can therefore be produced analytically using a VBA. A multidimensional objective model adhering to VBA methodology was established. Thereafter, end users and experts were requested to fill out a Questionnaire of User Needs (QUN) at the highest level of detail - the value indicator level. The end user was additionally requested to report personal preferences for his particular research field. In the end, results from the experts' evaluation and results from a sensor data survey can be compared in order to understand user needs and the drawbacks of existing data products. The investigation – focusing on the needs of the hyperspectral user community in Europe – showed that a VBA is a suitable method for analyzing the needs of hyperspectral data users and supporting the sensor/data specification-building process. The VBA has the advantage of being easy to handle, resulting in a comprehensive evaluation. The primary disadvantage is the large effort in realizing such an analysis because the level of detail is extremely high.
NASA Astrophysics Data System (ADS)
Warren, M. A.; Goult, S.; Clewley, D.
2018-06-01
Advances in technology allow remotely sensed data to be acquired with increasingly higher spatial and spectral resolutions. These data may then be used to influence government decision making and solve a number of research and application driven questions. However, such large volumes of data can be difficult to handle on a single personal computer or on older machines with slower components. Often the software required to process data is varied and can be highly technical and too advanced for the novice user to fully understand. This paper describes an open-source tool, the Simple Concurrent Online Processing System (SCOPS), which forms part of an airborne hyperspectral data processing chain that allows users accessing the tool over a web interface to submit jobs and process data remotely. It is demonstrated using Natural Environment Research Council Airborne Research Facility (NERC-ARF) instruments together with other free- and open-source tools to take radiometrically corrected data from sensor geometry into geocorrected form and to generate simple or complex band ratio products. The final processed data products are acquired via an HTTP download. SCOPS can cut data processing times and introduce complex processing software to novice users by distributing jobs across a network using a simple to use web interface.
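As a concrete illustration of the simpler end of that processing chain, a band ratio product reduces to a few array operations. The cube shape and band indices below are hypothetical placeholders, not NERC-ARF sensor values, which depend on the instrument's wavelength calibration:

    # Minimal band-ratio sketch (NDVI-style) on a synthetic reflectance cube.
    import numpy as np

    cube = np.random.rand(100, 100, 220)     # (rows, cols, bands)
    red, nir = cube[:, :, 60], cube[:, :, 120]

    # Guard against division by zero in dark pixels.
    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)
    print(ndvi.min(), ndvi.max())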
Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0
Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...
2008-01-01
The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments presents a challenge to manage and process the information. Simply to characterize the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we will discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We will give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.
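The kind of analysis such scripts automate can be illustrated generically. The sketch below is not PerfExplorer's API; it uses scikit-learn on synthetic per-thread profiles to show dimension reduction followed by clustering of threads with similar behaviour:

    # Generic dimension-reduction + clustering sketch on synthetic profiles.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(2)
    # rows = threads, columns = per-event metrics (e.g. exclusive time per region)
    profiles = np.vstack([rng.normal(mu, 1.0, size=(512, 40)) for mu in (0, 3)])

    reduced = PCA(n_components=2).fit_transform(profiles)
    clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(reduced)
    print(np.bincount(clusters))   # two groups of threads with distinct behaviour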
Machining of Fibre Reinforced Plastic Composite Materials.
Caggiano, Alessandra
2018-03-18
Fibre reinforced plastic composite materials are difficult to machine because of the anisotropy and inhomogeneity characterizing their microstructure and the abrasiveness of their reinforcement components. During machining, very rapid cutting tool wear development is experienced, and surface integrity damage is often produced in the machined parts. An accurate selection of the proper tool and machining conditions is therefore required, taking into account that the phenomena responsible for material removal in cutting of fibre reinforced plastic composite materials are fundamentally different from those of conventional metals and their alloys. To date, composite materials are increasingly used in several manufacturing sectors, such as the aerospace and automotive industry, and several research efforts have been spent to improve their machining processes. In the present review, the key issues that are concerning the machining of fibre reinforced plastic composite materials are discussed with reference to the main recent research works in the field, while considering both conventional and unconventional machining processes and reporting the more recent research achievements. For the different machining processes, the main results characterizing the recent research works and the trends for process developments are presented.
Advantages offered by high average power picosecond lasers
NASA Astrophysics Data System (ADS)
Moorhouse, C.
2011-03-01
As electronic devices shrink in size to reduce material costs, device size and weight, thinner material thicknesses are also utilized. Feature sizes are also decreasing, which is pushing manufacturers towards single-step laser direct-write processes as an attractive alternative to conventional, multiple-step photolithography processes by eliminating process steps and the cost of chemicals. The fragile nature of these thin materials makes them difficult to machine either mechanically or with conventional nanosecond-pulsewidth, Diode Pumped Solid State (DPSS) lasers. Picosecond laser pulses can cut materials with reduced damage regions and selectively remove thin films due to the reduced thermal effects of the shorter pulsewidth. Also, the high repetition rate allows high-speed processing for industrial applications. Selective removal of thin films for OLED patterning, silicon solar cells and flat panel displays is discussed, as well as laser cutting of transparent materials with low melting point such as polyethylene terephthalate (PET). For many of these thin-film applications, where low pulse energy and high repetition rate are required, throughput can be increased by using multiple beams from a single laser source; a novel technique for doing so is outlined.
Chronic Tuberculous Otomastoiditis: A Case Report.
Bruschini, Luca; Ciabotti, Annalisa; Berrettini, Stefano
2016-08-01
Worldwide, tuberculosis is a widespread disease, with 8.7 million new cases occurring annually. Its etiologic agent, Mycobacterium tuberculosis, essentially causes pneumonia. However, this organism affects the middle ear in rare cases, accounting for 0.04-0.09% of all chronic middle ear otitis cases in Western countries. In this report, we describe the case of a young woman affected by tuberculosis of the middle ear. In our experience, empiric therapy was not beneficial. Adequate treatment was possible only after obtaining a specific diagnosis through a difficult process requiring surgical sampling for culture examination. We consider surgical sampling to be mandatory in all cases of chronic otitis media that do not respond to prolonged systemic and local therapies.
Accurate and reproducible measurements of RhoA activation in small samples of primary cells.
Nini, Lylia; Dagnino, Lina
2010-03-01
Rho GTPase activation is essential in a wide variety of cellular processes. Measurement of Rho GTPase activation is difficult with limited material, such as tissues or primary cells that exhibit stringent culture requirements for growth and survival. We defined parameters to accurately and reproducibly measure RhoA activation (i.e., RhoA-GTP) in cultured primary keratinocytes in response to serum and growth factor stimulation using enzyme-linked immunosorbent assay (ELISA)-based G-LISA assays. We also established conditions that minimize RhoA-GTP in unstimulated cells without affecting viability, allowing accurate measurements of RhoA activation on stimulation or induction of exogenous GTPase expression. Copyright 2009 Elsevier Inc. All rights reserved.
Physics-based approach to color image enhancement in poor visibility conditions.
Tan, K K; Oakley, J P
2001-10-01
Degradation of images by the atmosphere is a familiar problem. For example, when terrain is imaged from a forward-looking airborne camera, atmospheric degradation causes a loss of both contrast and color information. Enhancement of such images is a difficult task because of the complexity of restoring both the luminance and the chrominance while maintaining good color fidelity. One particular problem is the fact that the level of contrast loss depends strongly on wavelength. A novel method is presented for the enhancement of color images. This method is based on the underlying physics of the degradation process, and the parameters required for enhancement are estimated from the image itself.
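The wavelength-dependent degradation model underlying such methods can be sketched as follows. The coefficients, flat depth map, and direct inversion here are illustrative assumptions; the paper's contribution is estimating such parameters from the image itself rather than assuming them:

    # Hedged sketch of a per-channel atmospheric degradation model and its
    # inversion: observed = direct transmission + airlight, with extinction
    # (beta) larger at shorter, bluer wavelengths.
    import numpy as np

    def dehaze(img, airlight, beta, depth):
        """img: HxWx3 in [0,1]; beta: per-channel extinction; depth: HxW (km)."""
        t = np.exp(-beta[None, None, :] * depth[..., None])   # transmission map
        restored = (img - airlight * (1.0 - t)) / np.clip(t, 1e-3, None)
        return np.clip(restored, 0.0, 1.0)

    h = w = 64
    img = np.random.rand(h, w, 3)
    depth = np.full((h, w), 2.0)                  # hypothetical 2 km range
    beta = np.array([0.3, 0.45, 0.7])             # stronger scattering in blue
    out = dehaze(img, airlight=np.array([0.8, 0.85, 0.9]), beta=beta, depth=depth)
    print(out.shape)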
3D printed optical phantoms and deep tissue imaging for in vivo applications including oral surgery
NASA Astrophysics Data System (ADS)
Bentz, Brian Z.; Costas, Alfonso; Gaind, Vaibhav; Garcia, Jose M.; Webb, Kevin J.
2017-03-01
Progress in developing optical imaging for biomedical applications requires customizable and often complex objects known as "phantoms" for testing, evaluation, and calibration. This work demonstrates that 3D printing is an ideal method for fabricating such objects, allowing intricate inhomogeneities to be placed at exact locations in complex or anatomically realistic geometries, a process that is difficult or impossible using molds. We show printed mouse phantoms we have fabricated for developing deep tissue fluorescence imaging methods, and measurements of both their optical and mechanical properties. Additionally, we present a printed phantom of the human mouth that we use to develop an artery localization method to assist in oral surgery.
Why Is Rainfall Error Analysis Requisite for Data Assimilation and Climate Modeling?
NASA Technical Reports Server (NTRS)
Hou, Arthur Y.; Zhang, Sara Q.
2004-01-01
Given the large temporal and spatial variability of precipitation processes, errors in rainfall observations are difficult to quantify yet crucial to making effective use of rainfall data for improving atmospheric analysis, weather forecasting, and climate modeling. We highlight the need for developing a quantitative understanding of systematic and random errors in precipitation observations by examining explicit examples of how each type of errors can affect forecasts and analyses in global data assimilation. We characterize the error information needed from the precipitation measurement community and how it may be used to improve data usage within the general framework of analysis techniques, as well as accuracy requirements from the perspective of climate modeling and global data assimilation.
Dynamic Control of Plans with Temporal Uncertainty
NASA Technical Reports Server (NTRS)
Morris, Paul; Muscettola, Nicola; Vidal, Thierry
2001-01-01
Certain planning systems that deal with quantitative time constraints have used an underlying Simple Temporal Problem solver to ensure temporal consistency of plans. However, many applications involve processes of uncertain duration whose timing cannot be controlled by the execution agent. These cases require more complex notions of temporal feasibility. In previous work, various "controllability" properties such as Weak, Strong, and Dynamic Controllability have been defined. The most interesting and useful Controllability property, the Dynamic one, has ironically proved to be the most difficult to analyze. In this paper, we resolve the complexity issue for Dynamic Controllability. Unexpectedly, the problem turns out to be tractable. We also show how to efficiently execute networks whose status has been verified.
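As background, consistency of a Simple Temporal Problem reduces to checking the distance graph for negative cycles, for example with Floyd-Warshall, as the sketch below shows. Dynamic Controllability of networks with uncontrollable durations, the subject of the paper, requires more machinery than this:

    # STP consistency check: encode each constraint t_j - t_i <= b as an edge
    # (i, j, b); the network is consistent iff no negative cycle exists.
    import numpy as np

    INF = float("inf")

    def stp_consistent(n, edges):
        d = np.full((n, n), INF)
        np.fill_diagonal(d, 0.0)
        for i, j, b in edges:
            d[i, j] = min(d[i, j], b)
        for k in range(n):                         # Floyd-Warshall relaxation
            d = np.minimum(d, d[:, k:k+1] + d[k:k+1, :])
        return bool((np.diag(d) >= 0).all())       # negative diagonal = cycle

    # Two time points: 5 <= t1 - t0 <= 10 is satisfiable...
    print(stp_consistent(2, [(0, 1, 10), (1, 0, -5)]))   # True
    # ...but t1 - t0 <= 4 together with t1 - t0 >= 5 is not.
    print(stp_consistent(2, [(0, 1, 4), (1, 0, -5)]))    # False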
FPGA Based Reconfigurable ATM Switch Test Bed
NASA Technical Reports Server (NTRS)
Chu, Pong P.; Jones, Robert E.
1998-01-01
Various issues associated with "FPGA Based Reconfigurable ATM Switch Test Bed" are presented in viewgraph form. Specific topics include: 1) Network performance evaluation; 2) traditional approaches; 3) software simulation; 4) hardware emulation; 5) test bed highlights; 6) design environment; 7) test bed architecture; 8) abstract shared-memory switch; 9) detailed switch diagram; 10) traffic generator; 11) data collection circuit and user interface; 12) initial results; and 13) the following conclusions: advances in FPGAs make hardware emulation feasible for performance evaluation; hardware emulation can provide several orders of magnitude speed-up over software simulation; due to the complexity of the hardware synthesis process, development in emulation is much more difficult than simulation and requires knowledge in both networks and digital design.
Is it possible to assess the "ethics" of medical school applicants?
Lowe, M.; Kerridge, I.; Bore, M.; Munro, D.; Powis, D.
2001-01-01
Questions surrounding the assessment of medical school applicants' morality are difficult but they are nevertheless important for medical schools to consider. It is probably inappropriate to attempt to assess medical school applicants' ethical knowledge, moral reasoning, or beliefs about ethical issues as these all may be developed during the process of education. Attitudes towards ethical issues and ethical sensitivity, however, might be tested in the context of testing for personality attributes. Before any "ethics" testing is introduced as part of screening for admission to medical school it would require validation. We suggest a number of ways in which this might be achieved.
Comparison of Available Technologies for Fire Spots Detection via Linear Heat Detector
NASA Astrophysics Data System (ADS)
Miksa, František; Nemlaha, Eduard
2016-12-01
It is very demanding to detect fire spots under difficult conditions with a high occurrence of interfering external factors such as large distances, difficult airflow, high dustiness, high humidity, etc. Spot fire sensors do not meet the requirements under the aforementioned conditions and at large distances. Therefore, detection of fire spots via linear heat sensing cables is utilized.
Validating the Use of Deep Learning Neural Networks for Correction of Large Hydrometric Datasets
NASA Astrophysics Data System (ADS)
Frazier, N.; Ogden, F. L.; Regina, J. A.; Cheng, Y.
2017-12-01
Collection and validation of Earth systems data can be time consuming and labor intensive. In particular, high resolution hydrometric data, including rainfall and streamflow measurements, are difficult to obtain due to a multitude of complicating factors. Measurement equipment is subject to clogs, environmental disturbances, and sensor drift. Manual intervention is typically required to identify, correct, and validate these data. Weirs can become clogged and the pressure transducer may float or drift over time. We typically employ a graphical tool called Time Series Editor to manually remove clogs and sensor drift from the data. However, this process is highly subjective and requires hydrological expertise. Two different people may produce two different data sets. To use these data for scientific discovery and model validation, a more consistent method is needed to process this field data. Deep learning neural networks have proved to be excellent mechanisms for recognizing patterns in data. We explore the use of Recurrent Neural Networks (RNN) to capture the patterns in the data over time using various gating mechanisms (LSTM and GRU), network architectures, and hyper-parameters to build an automated data correction model. We also explore the amount of manually corrected training data required to train the network for reasonable accuracy. The benefits of this approach are that the time to process a data set is significantly reduced, and the results are 100% reproducible after training is complete. Additionally, we train the RNN and calibrate a physically-based hydrological model against the same portion of data. Both the RNN and the model are applied to the remaining data using a split-sample methodology. Performance of the machine learning is evaluated for plausibility by comparing with the output of the hydrological model, and this analysis identifies potential periods where additional investigation is warranted.
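A minimal sketch of the kind of network described, assuming a two-layer GRU with a linear head (not the authors' exact architecture or data), might look like this in PyTorch:

    # Hedged sketch: a GRU sequence model mapping raw sensor traces to
    # corrected traces; in practice it would be trained against manually
    # corrected stage records, not the placeholder target used here.
    import torch
    import torch.nn as nn

    class StageCorrector(nn.Module):
        def __init__(self, hidden=64):
            super().__init__()
            self.gru = nn.GRU(input_size=1, hidden_size=hidden, num_layers=2,
                              batch_first=True)
            self.head = nn.Linear(hidden, 1)

        def forward(self, x):                 # x: (batch, time, 1) raw stage
            out, _ = self.gru(x)
            return self.head(out)             # (batch, time, 1) corrected stage

    model = StageCorrector()
    raw = torch.randn(8, 288, 1)              # 8 one-day windows of 5-min samples
    loss = nn.functional.mse_loss(model(raw), raw)   # placeholder target
    loss.backward()
    print(loss.item())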
NASA Astrophysics Data System (ADS)
McLaughlin, B. D.; Pawloski, A. W.
2015-12-01
Modern development practices require the ability to quickly and easily host an application. Small projects cannot afford to maintain a large staff for infrastructure maintenance. Rapid prototyping fosters innovation. However, maintaining the integrity of data and systems demands care, particularly in a government context. The extensive data holdings that make up much of the value of NASA's EOSDIS (Earth Observing System Data and Information System) are stored in a number of locations, across a wide variety of applications, ranging from small prototypes to large computationally-intensive operational processes. However, it is increasingly difficult for an application to implement the required security controls, perform required registrations and inventory entries, ensure logging, monitoring, patching, and then ensure that all these activities continue for the life of that application, let alone five, or ten, or fifty applications. This process often takes weeks or months to complete and requires expertise in a variety of different domains such as security, systems administration, development, etc. NGAP, the Next Generation Application Platform, is tackling this problem by investigating, automating, and resolving many of the repeatable policy hurdles that a typical application must overcome. This platform provides a relatively simple and straightforward process by which applications can commit source code to a repository and then deploy that source code to a cloud-based infrastructure, all while meeting NASA's policies for security, governance, inventory, reliability, and availability. While there is still work for the application owner for any application hosting, NGAP handles a significant portion of that work. This talk will discuss areas where we have made significant progress, areas that are complex or must remain human-intensive, and areas where we are still striving to improve this application deployment and hosting pipeline.
NASA Astrophysics Data System (ADS)
Kaula, William M.; Stevenson, David J.
David J. Stevenson was awarded the Harry H. Hess Medal at the AGU Spring Meeting Honors Ceremony, which was held on May 27, 1998, in Boston, Massachusetts. The Harry H. Hess Medal recognizes outstanding achievements in the research of the constitution and evolution of Earth and its sister planets. A meaningful understanding of the Earth and planets requires explaining their differences. This explanation of planetary processes is difficult partly because it entails a wide range of scales, from the microscale, operating at the atomic level, to the macroscale, determined by boundaries thousands of kilometers apart. David Stevenson's graduate study was mainly in theoretical condensed-matter physics, but he is remarkable in his grasp of large-scale planetary processes such as mantle convection and dynamos. He is also remarkable in his 'instinct to attack the jugular,' that is, to go for the most important problems, and for the versatility of his approaches thereto.
NASA Astrophysics Data System (ADS)
Nair, B. G.; Winter, N.; Daniel, B.; Ward, R. M.
2016-07-01
Direct measurement of the flow of electric current during VAR is extremely difficult due to the aggressive environment, as the arc process itself controls the distribution of current. In previous studies the technique of "magnetic source tomography" was presented; this was shown to be effective, but it used a computationally intensive iterative method to analyse the distribution of arc centre position. In this paper we present faster computational methods requiring less numerical optimisation to determine the centre position of a single distributed arc, both numerically and experimentally. Numerical validation of the algorithms was done on models, and experimental validation on measurements of titanium and nickel alloys (Ti6Al4V and INCONEL 718). The results are used to comment on the effects of process parameters on arc behaviour during VAR.
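A much-simplified sketch of the underlying idea follows: model the arc as a single vertical line current and fit its (x, y) position to field magnitudes at known sensor positions by nonlinear least squares. The geometry, current, and noise level are hypothetical, and the real method handles distributed arcs rather than a point model:

    # Toy arc-centre localisation from external magnetic field magnitudes.
    import numpy as np
    from scipy.optimize import least_squares

    MU0 = 4e-7 * np.pi
    I = 30e3                                    # 30 kA, a VAR-scale current
    sensors = np.array([[0.6, 0.0], [0.0, 0.6], [-0.6, 0.0], [0.0, -0.6]])

    def field(center):
        r = np.linalg.norm(sensors - center, axis=1)
        return MU0 * I / (2 * np.pi * r)        # |B| of an infinite line current

    true_center = np.array([0.05, -0.08])
    noise = 1 + 0.01 * np.random.default_rng(3).normal(size=4)
    measured = field(true_center) * noise

    fit = least_squares(lambda c: field(c) - measured, x0=[0.0, 0.0])
    print(fit.x)                                # close to (0.05, -0.08)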
Feng, Lan; Zhu, Xiaodong; Sun, Xiang
2014-12-15
Coastal reclamation suitability evaluation (CRSE) is a difficult, complex and protracted process requiring the evaluation of many different criteria. In this paper, an integrated framework employing a fuzzy comprehensive evaluation method and analytic hierarchy process (AHP) was applied to the suitability evaluation for coastal reclamation for future sustainable development in the coastal area of Lianyungang, China. The evaluation results classified 6.63%, 22.99%, 31.59% and 38.79% of the coastline as suitable, weakly suitable, unsuitable and forbidden, respectively. The evaluation results were verified by the marine pollution data and highly consistent with the water quality status. The fuzzy-AHP comprehensive evaluation method (FACEM) was found to be suitable for the CRSE. This CRSE can also be applied to other coastal areas in China and thereby be used for the better management of coastal reclamation and coastline protection projects. Copyright © 2014 Elsevier Ltd. All rights reserved.
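The AHP half of such a framework can be illustrated compactly: criterion weights come from the principal eigenvector of a pairwise comparison matrix, followed by a consistency check. The matrix below is illustrative only, not the study's actual judgements:

    # AHP weighting sketch: principal eigenvector + consistency ratio.
    import numpy as np

    A = np.array([[1,   3,   5],
                  [1/3, 1,   2],
                  [1/5, 1/2, 1]], dtype=float)   # e.g. ecology vs economy vs society

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()

    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)          # consistency index
    cr = ci / 0.58                                # random index RI = 0.58 for n = 3
    print(weights, f"CR = {cr:.3f}")              # CR < 0.1 => acceptably consistent

The fuzzy step then scores each coastline segment against the weighted criteria through membership functions rather than crisp thresholds.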
Boeing Displays Process Action Team
NASA Astrophysics Data System (ADS)
Wright, R. Nick; Jacobsen, Alan R.
2000-08-01
Boeing uses Active Matrix Liquid Crystal Display (AMLCD) technology in a wide variety of its aerospace products, including military, commercial, and space applications. With the demise of Optical Imaging Systems (OIS) in September 1998, the source of on-shore custom AMLCD products has become very tenuous. Reliance on off-shore sources of AMLCD glass for aerospace products is also difficult when the average life of a display product is often less than one-tenth the 30 or more years expected from aerospace platforms. Boeing is addressing this problem through the development of a Displays Process Action Team that gathers input from all display users across the spectrum of our aircraft products. By consolidating requirements, developing common interface standards, working with our suppliers and constantly monitoring custom sources as well as commercially available products, Boeing is minimizing the impact (current and future) of the uncertain AMLCD avionics supply picture.
Mortimer, Duncan; Li, Jing Jing; Watts, Jennifer; Harris, Anthony
2011-10-01
Funding contingent upon evidence development (FED) has recently been the subject of considerable debate in the literature, but relatively little attention has been paid to its economic impact. We argue that FED has the potential to shorten the lag between innovation and access but may also (i) crowd out more valuable interventions in situations where there is a fixed dedicated budget; or (ii) lead to a de facto increase in the funding threshold and increased expenditure growth in situations where the programme budget is open-ended. Although FED would typically entail periodic review of provisional or interim listings, it may prove difficult to withdraw funding even at cost/QALY ratios well in excess of current listing thresholds. Further consideration of the design and implementation of FED processes is therefore required to ensure that its introduction yields net benefits over existing processes.
Chemically etched ultrahigh-Q wedge-resonator on a silicon chip
NASA Astrophysics Data System (ADS)
Lee, Hansuek; Chen, Tong; Li, Jiang; Yang, Ki Youl; Jeon, Seokmin; Painter, Oskar; Vahala, Kerry J.
2012-06-01
Ultrahigh-Q optical resonators are being studied across a wide range of fields, including quantum information, nonlinear optics, cavity optomechanics and telecommunications. Here, we demonstrate a new resonator with a record Q-factor of 875 million for on-chip devices. The fabrication of our device avoids the requirement for a specialized processing step, which in microtoroid resonators has made it difficult to control their size and achieve millimetre- and centimetre-scale diameters. Attaining these sizes is important in applications such as microcombs and potentially also in rotation sensing. As an application of size control, stimulated Brillouin lasers incorporating our device are demonstrated. The resonators not only set a new benchmark for the Q-factor on a chip, but also provide, for the first time, full compatibility of this important device class with conventional semiconductor processing. This feature will greatly expand the range of possible `system on a chip' functions enabled by ultrahigh-Q devices.
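As a point of reference, the arithmetic below is ours rather than the abstract's: a quality factor maps to a photon storage time via τ = Q/ω. Assuming a telecom-band wavelength of 1550 nm,

```latex
\[
  \tau \;=\; \frac{Q}{\omega} \;=\; \frac{Q\,\lambda}{2\pi c}
  \;=\; \frac{8.75\times10^{8}\times 1.55\times10^{-6}\,\mathrm{m}}
             {2\pi \times 3\times10^{8}\,\mathrm{m/s}}
  \;\approx\; 0.72\;\mu\mathrm{s},
\]
```

i.e., light circulates in such a resonator for nearly a microsecond before decaying.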
Submillisecond Dynamics of Mastoparan X Insertion into Lipid Membranes.
Schuler, Erin E; Nagarajan, Sureshbabu; Dyer, R Brian
2016-09-01
The mechanism of protein insertion into a lipid bilayer is poorly understood because the kinetics of this process is difficult to measure. We developed a new approach to study insertion of the antimicrobial peptide Mastoparan X into zwitterionic lipid vesicles, using a laser-induced temperature-jump to initiate insertion on the microsecond time scale and infrared and fluorescence spectroscopies to follow the kinetics. Infrared probes the desolvation of the peptide backbone and yields biphasic kinetics with relaxation lifetimes of 12 and 117 μs, whereas fluorescence probes the intrinsic tryptophan residue located near the N-terminus and yields a single exponential phase with a lifetime of 440 μs. Arrhenius analysis of the temperature-dependent rates yields an activation energy for insertion of 96 kJ/mol. These results demonstrate the complexity of the insertion process and provide mechanistic insight into the interplay between peptides and the lipid bilayer required for peptide transport across cellular membranes.
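The quoted activation energy follows from the standard Arrhenius relation: plotting the logarithm of the measured rates against inverse temperature and taking the slope yields the 96 kJ/mol figure.

```latex
\[
  k(T) \;=\; A\,e^{-E_a/RT}
  \quad\Longrightarrow\quad
  \ln k \;=\; \ln A - \frac{E_a}{R}\cdot\frac{1}{T},
  \qquad
  E_a \;=\; -R\,\frac{\mathrm{d}\,\ln k}{\mathrm{d}\,(1/T)}.
\]
```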
Resolution of structural heterogeneity in dynamic crystallography
Ren, Zhong; Chan, Peter W. Y.; Moffat, Keith; Pai, Emil F.; Royer, William E.; Šrajer, Vukica; Yang, Xiaojing
2013-01-01
Dynamic behavior of proteins is critical to their function. X-ray crystallography, a powerful yet mostly static technique, faces inherent challenges in acquiring dynamic information despite decades of effort. Dynamic ‘structural changes’ are often indirectly inferred from ‘structural differences’ by comparing related static structures. In contrast, the direct observation of dynamic structural changes requires the initiation of a biochemical reaction or process in a crystal. Both the direct and the indirect approaches share a common challenge in analysis: how to interpret the structural heterogeneity intrinsic to all dynamic processes. This paper presents a real-space approach to this challenge, in which a suite of analytical methods and tools to identify and refine the mixed structural species present in multiple crystallographic data sets have been developed. These methods have been applied to representative scenarios in dynamic crystallography, and reveal structural information that is otherwise difficult to interpret or inaccessible using conventional methods. PMID:23695239
Investigation of sample preparation on the moldability of ceramic injection molding feedstocks
NASA Astrophysics Data System (ADS)
Ide, Jared
Ceramic injection molding (CIM) is a desirable option for those looking to make ceramic parts with complex geometries, but formulating the feedstock needed to produce ideal parts is a difficult process. In this research, a series of feedstock blends was evaluated for moldability by investigating their viscosity and how certain components affect the overall ability to flow. The feedstocks varied in waxes, surfactants, and solids loading. A capillary rheometer was used to characterize some of the materials, which led to one batch being selected for molding trials. The parts were sintered and further refinements were made to the feedstock. Solids loading was increased from 77.5% to 82%, which required different ratios of organics to flow. Finally, the ceramic powders were treated to lower their specific surface area before being compounded, which resulted in materials that processed easily through an extruder and exhibited properties suitable for CIM.
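For reference, capillary rheometer data of this kind are conventionally reduced with the standard (uncorrected) relations below, where ΔP is the pressure drop across a die of radius R and length L and Q is the volumetric flow rate; these are textbook formulas, not parameters from this study:

```latex
\[
  \tau_w \;=\; \frac{\Delta P\,R}{2L},
  \qquad
  \dot{\gamma}_{\mathrm{app}} \;=\; \frac{4Q}{\pi R^{3}},
  \qquad
  \eta_{\mathrm{app}} \;=\; \frac{\tau_w}{\dot{\gamma}_{\mathrm{app}}}.
\]
```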
Design of a Low Power, Fast-Spectrum, Liquid-Metal Cooled Surface Reactor System
NASA Astrophysics Data System (ADS)
Marcille, T. F.; Dixon, D. D.; Fischer, G. A.; Doherty, S. P.; Poston, D. I.; Kapernick, R. J.
2006-01-01
In the 2005 US budget environment, competition for fiscal resources makes funding for comprehensive space reactor development programs difficult to justify and accommodate. Simultaneously, the need to develop these systems to provide planetary and deep-space-enabling power systems is increasing. Given that environment, designs intended to satisfy reasonable near-term surface missions, using affordable, technology-ready materials and processes, warrant serious consideration. An initial lunar application design incorporating a stainless steel structure, an 880 K pumped NaK coolant system, and a stainless/UO2 fuel system can be designed, fabricated, and tested for a fraction of the cost of recent high-profile reactor programs (JIMO, SP-100). Along with the cost reductions associated with the use of qualified materials and processes, this design offers a low-risk, high-reliability implementation associated with the mission-specific low-temperature, low-burnup, five-year operating lifetime requirements.
Keane, Robert E.; Burgan, Robert E.; Van Wagtendonk, Jan W.
2001-01-01
Fuel maps are essential for computing spatial fire hazard and risk and simulating fire growth and intensity across a landscape. However, fuel mapping is an extremely difficult and complex process requiring expertise in remotely sensed image classification, fire behavior, fuels modeling, ecology, and geographical information systems (GIS). This paper first presents the challenges of mapping fuels: canopy concealment, fuelbed complexity, fuel type diversity, fuel variability, and fuel model generalization. Then, four approaches to mapping fuels are discussed with examples provided from the literature: (1) field reconnaissance; (2) direct mapping methods; (3) indirect mapping methods; and (4) gradient modeling. A fuel mapping method is proposed that uses current remote sensing and image processing technology. Future fuel mapping needs are also discussed which include better field data and fuel models, accurate GIS reference layers, improved satellite imagery, and comprehensive ecosystem models.
Human Rating the Orion Parachute System
NASA Technical Reports Server (NTRS)
Machin, Ricardo A.; Fisher, Timothy E.; Evans, Carol T.; Stewart, Christine E.
2011-01-01
Human rating begins with design. Converging on the requirements and identifying the risks as early as possible in the design process is essential. Understanding of the interaction between the recovery system and the spacecraft will in large part dictate the achievable reliability of the final design. Component and complete system full-scale flight testing is critical to assure a realistic evaluation of the performance and reliability of the parachute system. However, because testing is so often difficult and expensive, comprehensive analysis of test results and correlation to accurate modeling completes the human rating process. The National Aeronautics and Space Administration (NASA) Orion program uses parachutes to stabilize and decelerate the Crew Exploration Vehicle (CEV) spacecraft during subsonic flight in order to deliver a safe water landing. This paper describes the approach that CEV Parachute Assembly System (CPAS) will take to human rate the parachute recovery system for the CEV.
Two-step simulation of velocity and passive scalar mixing at high Schmidt number in turbulent jets
NASA Astrophysics Data System (ADS)
Rah, K. Jeff; Blanquart, Guillaume
2016-11-01
Simulation of a passive scalar in a high-Schmidt-number turbulent mixing process requires higher computational cost than that of the velocity fields, because the scalar is associated with smaller length scales than the velocity. Thus, full simulation of both velocity and a passive scalar at high Sc for a practical configuration is difficult to perform. In this work, a new approach to simulating velocity and passive scalar mixing at high Sc is suggested to reduce the computational cost. First, the velocity fields are resolved by Large Eddy Simulation (LES). Then, using the velocity information extracted from the LES, the scalar inside a moving fluid blob is simulated by Direct Numerical Simulation (DNS). This two-step simulation method is applied to a turbulent jet and provides a new way to examine a scalar mixing process in a practical application at lower computational cost. Funding: NSF and the Samsung Scholarship.
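The cost gap can be quantified with standard scaling arguments (ours, not the abstract's): the smallest scalar length scale is the Batchelor scale, so the grid count for a scalar DNS grows with Sc while the velocity requirement does not:

```latex
\[
  \lambda_B \;=\; \frac{\eta_K}{\sqrt{Sc}},
  \qquad
  N_{\mathrm{scalar}} \;\sim\; \left(\frac{L}{\lambda_B}\right)^{3}
  \;\sim\; Sc^{3/2}\, Re^{9/4},
\]
```

so at, say, Sc = 1000 the scalar alone demands roughly 3×10⁴ times more grid points than the velocity field, which motivates the LES/DNS split.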
Top-gate pentacene-based organic field-effect transistor with amorphous rubrene gate insulator
NASA Astrophysics Data System (ADS)
Hiroki, Mizuha; Maeda, Yasutaka; Ohmi, Shun-ichiro
2018-02-01
The scaling of organic field-effect transistors (OFETs) is necessary for high-density integration, and for this, OFETs with a top-gate configuration are required. There have been several reports of damage-free lithography processes for organic semiconductor or insulator layers; however, it is still difficult to fabricate scaled OFETs with a top-gate configuration. In this study, the lift-off process and the device characteristics of OFETs with a top-gate configuration utilizing an amorphous (α-) rubrene gate insulator were investigated. We confirmed that α-rubrene shows an insulating property, and the extracted linear mobility was 2.5 × 10⁻² cm²/(V·s). The gate length and width were 10 and 60 µm, respectively. From these results, the OFET with a top-gate configuration utilizing an α-rubrene gate insulator is promising for the high-density integration of scaled OFETs.
Measuring the success of electronic medical record implementation using electronic and survey data.
Keshavjee, K.; Troyan, S.; Holbrook, A. M.; VanderMolen, D.
2001-01-01
Computerization of physician practices is increasing. Stakeholders are demanding demonstrated value for their Electronic Medical Record (EMR) implementations. We developed survey tools to measure medical office processes, including administrative and physician tasks pre- and post-EMR implementation. We included variables that were expected to improve with EMR implementation and those that were not expected to improve, as controls. We measured the same processes pre-EMR, at six months and 18 months post-EMR. Time required for most administrative tasks decreased within six months of EMR implementation. Staff time spent on charting increased with time, in keeping with our anecdotal observations that nurses were given more responsibility for charting in many offices. Physician time to chart increased initially by 50%, but went down to original levels by 18 months. However, this may be due to the drop-out of those physicians who had a difficult time charting electronically. PMID:11825201
Synthesis and Biological Response of Size-Specific, Monodisperse Drug-Silica Nanoconjugates
Tang, Li; Fan, Timothy M.; Borst, Luke B.; Cheng, Jianjun
2012-01-01
Drug-containing nanoparticles (NPs) with monodisperse, controlled particle sizes are highly desirable for drug delivery. Accumulating evidence suggests that NPs smaller than 50 nm demonstrate superior performance in vitro and in vivo. However, it is difficult to fabricate monodisperse, drug-containing NPs with the discrete, incremental differences in size required for studying and characterizing the relationships among particle size, biological processing, and therapeutic functionality. Here, we report a scalable process for fabricating drug-silica conjugated nanoparticles, termed drug-silica nanoconjugates (drug-NCs), which possess monodisperse size distributions and desirable particle sizes as small as 20 nm. We found that 20-nm NCs are superior to their 50-nm and 200-nm NC analogues by 2- to 5-fold and 10- to 20-fold, respectively, with regard to tumor accumulation, penetration, and cellular internalization. These fundamental findings underscore the importance and necessity of further miniaturizing nanomedicine size for optimized drug delivery applications. PMID:22494403
Gundogdu, Ibrahim; Ozturk, Erhan Arif; Umay, Ebru; Karaahmet, Ozgur Zeliha; Unlu, Ece; Cakci, Aytul
2017-06-01
Following repeated weaning failures in acute care services, spinal cord injury (SCI) patients who require prolonged mechanical ventilation and tracheostomy are discharged to their homes or to skilled nursing facilities with a portable mechanical ventilator (MV) and/or tracheostomy tube (TT), at excess risk of complications, high cost, and low quality of life. We hypothesized that many difficult-to-wean patients with cervical SCI can be successfully managed in a rehabilitation clinic. The aim of our study was to develop a respiratory rehabilitation, MV weaning, and TT decannulation protocol and to evaluate its effectiveness in tetraplegic patients. A multidisciplinary and multifaceted protocol, including respiratory assessment and management themes, was developed and performed based on findings from other studies in the literature. Tetraplegic patients diagnosed as difficult-to-wean, who were admitted to the rehabilitation clinic after having been discharged from the intensive care unit to their homes with a home-type MV and/or TT, were included in this prospective observational study. The respiratory rehabilitation protocol was applied to 35 tetraplegic patients (10 home-type MV- and tracheostomy-dependent, and 25 tracheostomized) with C1-C7 ASIA impairment scale grade A, B, and C injuries. Seven of the 10 patients were successfully weaned from MV, and 30 of the 35 were decannulated. Four patients were referred for diaphragm pacing stimulation and tracheal stenosis surgery. The mean durations of MV weaning and decannulation were 37 and 31 days, respectively. A multifaceted, multidisciplinary respiratory management program can change the process of care used for difficult-to-wean patients with SCI. Implications for rehabilitation: Findings from this study indicate the significance of a multidimensional evaluation of any reversible factors for prolonged MV- and/or TT-dependent SCI patients; rehabilitation specialists should take this into consideration and provide an appropriate amount of time to these patients. The proposed respiratory rehabilitation protocol for MV- and/or TT-dependent SCI patients shows promising results in terms of changing the care provided to these patients. Successful implementation of a respiratory rehabilitation and weaning protocol depends on careful planning and detailed communication between the rehabilitation specialist and the intensivist throughout the respiratory rehabilitation process. Because many so-called difficult- or impossible-to-wean patients were successfully weaned from MV and TT in the physical medicine and rehabilitation (PMR) clinic, the need for such an outlet in countries without specialized centers is supported.
Probing the Boundaries of Orthology: The Unanticipated Rapid Evolution of Drosophila centrosomin
Eisman, Robert C.; Kaufman, Thomas C.
2013-01-01
The rapid evolution of essential developmental genes and their protein products is both intriguing and problematic. The rapid evolution of gene products with simple protein folds and a lack of well-characterized functional domains typically result in a low discovery rate of orthologous genes. Additionally, in the absence of orthologs it is difficult to study the processes and mechanisms underlying rapid evolution. In this study, we have investigated the rapid evolution of centrosomin (cnn), an essential gene encoding centrosomal protein isoforms required during syncytial development in Drosophila melanogaster. Until recently the rapid divergence of cnn made identification of orthologs difficult and questionable because Cnn violates many of the assumptions underlying models for protein evolution. To overcome these limitations, we have identified a group of insect orthologs and present conserved features likely to be required for the functions attributed to cnn in D. melanogaster. We also show that the rapid divergence of Cnn isoforms is apparently due to frequent coding sequence indels and an accelerated rate of intronic additions and eliminations. These changes appear to be buffered by multi-exon and multi-reading frame maximum potential ORFs, simple protein folds, and the splicing machinery. These buffering features also occur in other genes in Drosophila and may help prevent potentially deleterious mutations due to indels in genes with large coding exons and exon-dense regions separated by small introns. This work promises to be useful for future investigations of cnn and potentially other rapidly evolving genes and proteins. PMID:23749319
NASA Technical Reports Server (NTRS)
Calle, Carlos I.; Clements, Judson S.; Thompson, Samuel M.; Cox, Nathan D.; Hogue, Michael D.; Johansen, Michael R.; Williams, Blakeley S.
2011-01-01
Future human missions to Mars will require the utilization of local resources for oxygen, fuel, and water. The In Situ Resource Utilization (ISRU) project is an active research endeavor at NASA to develop technologies that can enable cost-effective ways to live off the land. The extraction of oxygen from the Martian atmosphere, composed primarily of carbon dioxide, is one of the most important goals of the Mars ISRU project. The main obstacle is the relatively large amount of dust present in the Martian atmosphere. This dust must be efficiently removed from atmospheric gas intakes for ISRU processing chambers. A common technique to achieve this removal on Earth is electrostatic precipitation, where large electrostatic fields are established in a localized region to precipitate and collect previously charged dust particles. This technique is difficult to adapt to the Martian environment, with an atmospheric pressure of about one-hundredth of the terrestrial atmosphere. At these low pressures, the corona discharges required to implant an electrostatic charge on the particles to be collected are extremely difficult to sustain, and the corona easily becomes bipolar, which is unsuitable for particle charging. In this paper, we report on our successful efforts to establish a stable corona under simulated Martian conditions. We also present results on dust collection efficiencies with an electrostatic precipitator prototype that could be effectively used on a future mission to the Red Planet.
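To see why low pressure is the crux, consider the classical Paschen law for gas breakdown: for centimetre-scale gaps, Mars-like pressures sit near the Paschen minimum, where the margin between corona onset and outright spark-over is narrow. This is a hedged, generic sketch; the CO2 coefficients are assumed textbook-order values, not measurements from the paper.

```python
# Generic Paschen-law illustration for Mars-like vs Earth-like pressures.
# The CO2 coefficients and secondary-emission value are assumed, not from
# the paper.
import numpy as np

A_CO2 = 20.0    # ionisation coefficient, 1/(cm*Torr)  (assumed)
B_CO2 = 466.0   # V/(cm*Torr)                           (assumed)
GAMMA = 0.01    # secondary-emission coefficient        (assumed)

def paschen_voltage(p_torr, d_cm):
    """Breakdown voltage V(pd) from Paschen's law for a parallel-plate gap."""
    pd = p_torr * d_cm
    return B_CO2 * pd / np.log(A_CO2 * pd / np.log(1.0 + 1.0 / GAMMA))

# Mars surface pressure ~6 mbar (about 4.5 Torr) vs Earth ~760 Torr, 1 cm gap:
print(paschen_voltage(4.5, 1.0))    # ~700 V: little headroom above corona onset
print(paschen_voltage(760.0, 1.0))  # ~44 kV: wide stable-corona window
```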
NASA Astrophysics Data System (ADS)
Wild, Walter James
1988-12-01
External nuclear medicine diagnostic imaging of early primary and metastatic lung cancer tumors is difficult due to the poor sensitivity and resolution of existing gamma cameras. Nonimaging counting detectors used for internal tumor detection give ambiguous results because distant background variations are difficult to discriminate from neighboring tumor sites. This suggests that an internal imaging nuclear medicine probe, particularly an esophageal probe, may be advantageously used to detect small tumors, because of its ability to discriminate against background variations and its capability to get close to sites neighboring the esophagus. The design, theory of operation, preliminary bench tests, characterization of noise behavior, and optimization of such an imaging probe are the central theme of this work. The central concept lies in the representation of the aperture shell by a sequence of binary digits. This, coupled with the mode of operation (data encoding within an axial slice of space), leads to the fundamental imaging equation, in which the coding operation is conveniently described by a circulant matrix operator. The coding/decoding process is a classic coded-aperture problem, and various estimators to achieve decoding are discussed. Some estimators require a priori information about the object (or object class) being imaged; the only unbiased estimator that does not impose this requirement is the simple inverse-matrix operator. The effects of noise on the estimate (or reconstruction) are discussed for general noise models and various codes/decoding operators. The choice of an optimal aperture for detector count times of clinical relevance is examined using a statistical class-separability formalism.
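A toy version of the encode/decode chain just described, assuming a short binary aperture code, Poisson counting noise, and the unbiased inverse-matrix decoder (the code length and pattern are illustrative, not the probe's actual aperture):

```python
# Toy coded-aperture model: encoding is multiplication by a circulant matrix
# built from a binary aperture code; decoding uses the plain matrix inverse.
import numpy as np
from scipy.linalg import circulant

code = np.array([1, 0, 1, 1, 0, 0, 1], dtype=float)  # binary aperture shell
H = circulant(code)                  # coding operator for one axial slice

rng = np.random.default_rng(0)
obj = rng.uniform(0.0, 10.0, size=7)           # activity in 7 angular bins
counts = rng.poisson(H @ obj * 100) / 100.0    # noisy detector data

estimate = np.linalg.inv(H) @ counts           # unbiased inverse-matrix decode
print(np.round(estimate, 2))
print(np.round(obj, 2))
```

The inverse-matrix decode is unbiased but amplifies noise when the code's circulant eigenvalues are small, which is why estimator choice and aperture optimization are treated together above.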
Practitioner and scientist perceptions of successful amphibian conservation.
Meredith, Helen M R; St John, Freya A V; Collen, Ben; Black, Simon A; Griffiths, Richard A
2018-04-01
Conservation requires successful outcomes. However, success is perceived in many different ways depending on the desired outcome. Through a questionnaire survey, we examined perceptions of success among 355 scientists and practitioners working on amphibian conservation from over 150 organizations in more than 50 countries. We also sought to identify how different types of conservation actions and respondent experience and background influenced perceptions. Respondents identified 4 types of success: species and habitat improvements (84% of respondents); effective program management (36%); outreach initiatives such as education and public engagement (25%); and the application of science-based conservation (15%). The most significant factor influencing overall perceived success was reducing threats. Capacity building was rated least important. Perceptions were influenced by experience, professional affiliation, involvement in conservation practice, and country of residence. More experienced practitioners associated success with improvements to species and habitats and less so with education and engagement initiatives. Although science-based conservation was rated as important, this factor declined in importance as the number of programs a respondent participated in increased, particularly among those from less economically developed countries. The ultimate measure of conservation success, population recovery, may be difficult to measure in many amphibians; difficult to relate to the conservation actions intended to drive it; and difficult to achieve within conventional funding time frames. The relaunched Amphibian Conservation Action Plan provides a framework for capturing lower-level processes and outcomes, identifying gaps, and measuring progress.
High-Performance Modeling and Simulation of Anchoring in Granular Media for NEO Applications
NASA Technical Reports Server (NTRS)
Quadrelli, Marco B.; Jain, Abhinandan; Negrut, Dan; Mazhar, Hammad
2012-01-01
NASA is interested in designing a spacecraft capable of visiting a near-Earth object (NEO), performing experiments, and then returning safely. Certain periods of this mission would require the spacecraft to remain stationary relative to the NEO, in an environment characterized by very low gravity levels; such situations require an anchoring mechanism that is compact, easy to deploy, and, upon mission completion, easy to remove. The design philosophy used in this task relies on the simulation capability of a high-performance multibody dynamics physics engine. On Earth, it is difficult to create low-gravity conditions, and testing in low-gravity environments, whether artificial or in space, can be costly and very difficult to achieve. Through simulation, the effect of gravity can be controlled with great accuracy, making simulation ideally suited to analyze the problem at hand. Using Chrono::Engine, a simulation package capable of utilizing massively parallel Graphics Processing Unit (GPU) hardware, several validation experiments were performed. Modeling of the regolith interaction was carried out, after which the anchor penetration tests were performed and analyzed. The regolith was modeled by a granular medium composed of very large numbers of convex three-dimensional rigid bodies, subject to microgravity levels and interacting with each other through contact, friction, and cohesive forces. The multibody dynamics simulation approach used for simulating anchors penetrating a soil uses a differential variational inequality (DVI) methodology to solve the contact problem posed as a linear complementarity problem (LCP). Implemented within a GPU processing environment, collision detection is greatly accelerated compared to traditional CPU-based collision detection. Hence, systems of millions of particles interacting with complex dynamic systems can be efficiently analyzed, and design recommendations can be made in a much shorter time. The figure shows an example of this capability, where the Brazil-nut problem is simulated: as the container full of granular material is vibrated, the large ball slowly moves upwards. This capability was expanded to account for anchors of different shapes and penetration velocities, interacting with granular soils.
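For a flavour of the contact formulation mentioned above, the sketch below solves a two-contact frictionless LCP by projected Gauss-Seidel, a common kernel in DVI-based granular solvers; it is a generic illustration, not Chrono::Engine code:

```python
# Minimal sketch of a frictionless normal-contact LCP,
#   0 <= x  complementary to  (N x + b) >= 0,
# solved by projected Gauss-Seidel sweeps with clamping at zero.
import numpy as np

def projected_gauss_seidel(N, b, iterations=200):
    """Solve the LCP by repeated per-contact updates clamped to x >= 0."""
    x = np.zeros_like(b)
    for _ in range(iterations):
        for i in range(len(b)):
            residual = b[i] + N[i] @ x - N[i, i] * x[i]
            x[i] = max(0.0, -residual / N[i, i])
    return x

# Tiny example with a symmetric positive-definite Delassus-style matrix.
N = np.array([[2.0, 0.5], [0.5, 1.0]])
b = np.array([-1.0, 0.3])
x = projected_gauss_seidel(N, b)
print(x, N @ x + b)  # x >= 0 and complementarity holds (x = [0.5, 0])
```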
Ground robotic measurement of aeolian processes
NASA Astrophysics Data System (ADS)
Qian, Feifei; Jerolmack, Douglas; Lancaster, Nicholas; Nikolich, George; Reverdy, Paul; Roberts, Sonia; Shipley, Thomas; Van Pelt, R. Scott; Zobeck, Ted M.; Koditschek, Daniel E.
2017-08-01
Models of aeolian processes rely on accurate measurements of the rates of sediment transport by wind, and careful evaluation of the environmental controls of these processes. Existing field approaches typically require intensive, event-based experiments involving dense arrays of instruments. These devices are often cumbersome and logistically difficult to set up and maintain, especially near steep or vegetated dune surfaces. Significant advances in instrumentation are needed to provide the datasets that are required to validate and improve mechanistic models of aeolian sediment transport. Recent advances in robotics show great promise for assisting and amplifying scientists' efforts to increase the spatial and temporal resolution of many environmental measurements governing sediment transport. The emergence of cheap, agile, human-scale robotic platforms endowed with increasingly sophisticated sensor and motor suites opens up the prospect of deploying programmable, reactive sensor payloads across complex terrain in the service of aeolian science. This paper surveys the need and assesses the opportunities and challenges for amassing novel, highly resolved spatiotemporal datasets for aeolian research using partially-automated ground mobility. We review the limitations of existing measurement approaches for aeolian processes, and discuss how they may be transformed by ground-based robotic platforms, using examples from our initial field experiments. We then review how the need to traverse challenging aeolian terrains and simultaneously make high-resolution measurements of critical variables requires enhanced robotic capability. Finally, we conclude with a look to the future, in which robotic platforms may operate with increasing autonomy in harsh conditions. Besides expanding the completeness of terrestrial datasets, bringing ground-based robots to the aeolian research community may lead to unexpected discoveries that generate new hypotheses to expand the science itself.
NASA Astrophysics Data System (ADS)
Mencin, David; Hodgkinson, Kathleen; Sievers, Charlie; Phillips, David; Meertens, Charles; Mattioli, Glen
2017-04-01
UNAVCO has been providing infrastructure and support for solid-earth science and earthquake natural hazards for the past two decades. Recent advances in GNSS technology and data processing now provide position solutions with centimeter-level precision at high rate (>1 Hz) and low latency (i.e., the time required for data to arrive for analysis, in this case less than 1 second). These data have the potential to improve our understanding in diverse areas of geophysics, including the properties of seismic, volcanic, magmatic, and tsunami sources, and thus profoundly transform rapid event characterization and warning. Scientific and operational applications also include glacier and ice sheet motions, tropospheric modeling, and space weather. These areas represent a spectrum of research fields, including geodesy, seismology, tropospheric weather, space weather, and natural hazards. Processed real-time GNSS (RT-GNSS) data will require formats and standards that allow this broad and diverse community to use the data and associated metadata in existing research infrastructure. These advances have critically highlighted the difficulties associated with merging data and metadata between scientific disciplines. Even seemingly closely related fields such as geodesy and seismology, which both have rich histories of handling large volumes of data and metadata, do not mesh in any automated way. Community analysis strategies, or the lack thereof, such as the treatment of error, prove difficult to address and are reflected in the data and metadata. In addition, these communities have differing security, accessibility, and reliability requirements. We propose some solutions to the particular problem of making RT-GNSS processed solution data and metadata accessible to multiple scientific and natural hazard communities. Importantly, we discuss the roadblocks encountered and solved, and those that remain to be addressed.
Estimating Soil Hydraulic Parameters using Gradient Based Approach
NASA Astrophysics Data System (ADS)
Rai, P. K.; Tripathi, S.
2017-12-01
The conventional way of estimating the parameters of a differential equation is to minimize the error between observations and their estimates, where the estimates are produced from a forward solution (numerical or analytical) of the differential equation under an assumed set of parameters. Parameter estimation using the conventional approach requires high computational cost, the setting-up of initial and boundary conditions, and the formation of difference equations when the forward solution is obtained numerically. Gaussian-process-based approaches such as Gaussian Process Ordinary Differential Equation (GPODE) and Adaptive Gradient Matching (AGM) have been developed to estimate the parameters of ordinary differential equations without explicitly solving them. Claims have been made that these approaches can straightforwardly be extended to partial differential equations; however, this has never been demonstrated. This study extends the AGM approach to PDEs and applies it to estimating the parameters of the Richards equation. Unlike the conventional approach, the AGM approach does not require setting up initial and boundary conditions explicitly, which is often difficult in real-world applications of the Richards equation. The developed methodology was applied to synthetic soil moisture data and was found to estimate the soil hydraulic parameters correctly, making it a potential alternative to the conventional method.
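The gradient-matching idea can be caricatured in a few lines. The sketch below is a heavily simplified stand-in (an ODE, dy/dt = -θy, rather than the Richards equation): fit a GP to the noisy series, differentiate the GP mean analytically, and choose the parameter that best matches the modelled derivative, with no forward solve and no initial or boundary conditions:

```python
# Conceptual gradient-matching sketch (in the spirit of AGM, much simplified).
import numpy as np

t = np.linspace(0.0, 4.0, 25)
rng = np.random.default_rng(1)
y = np.exp(-0.7 * t) + 0.01 * rng.standard_normal(t.size)  # truth: theta = 0.7

ell, sig2, noise = 0.8, 1.0, 1e-4                 # assumed GP hyperparameters
K = sig2 * np.exp(-0.5 * (t[:, None] - t[None, :])**2 / ell**2)
alpha = np.linalg.solve(K + noise * np.eye(t.size), y)

# GP posterior mean and its analytic time derivative at the observations.
mean = K @ alpha
dK = -(t[:, None] - t[None, :]) / ell**2 * K      # dk(t, t')/dt for RBF kernel
dmean = dK @ alpha

# Gradient matching: least-squares theta for dmean = -theta * mean.
theta = -(mean @ dmean) / (mean @ mean)
print(theta)  # roughly recovers 0.7 with no ODE solve
```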
Designing an evaluation framework for WFME basic standards for medical education.
Tackett, Sean; Grant, Janet; Mmari, Kristin
2016-01-01
To create an evaluation plan for the World Federation for Medical Education (WFME) accreditation standards for basic medical education, we conceptualized the 100 basic standards from "Basic Medical Education: WFME Global Standards for Quality Improvement: The 2012 Revision" as medical education program objectives. Standards were simplified into evaluable items, which were then categorized as inputs, processes, outputs, and/or outcomes to generate a logic model and a corresponding plan for data collection. WFME standards posed significant challenges to evaluation due to complex wording, inconsistent formatting, and a lack of existing assessment tools. Our resulting logic model contained 244 items. Standard B 5.1.1 alone separated into 24 items, the most for any single standard. A large proportion of items (40%) required evaluation of more than one input, process, output, and/or outcome. Only one standard (B 3.2.2) was interpreted as requiring evaluation of a program outcome. Current WFME standards are difficult to use for evaluation planning. Our analysis may guide the adaptation and revision of standards to make them more evaluable. Our logic model and data collection plan may be useful to medical schools planning an institutional self-review and to accrediting authorities wanting to provide guidance to schools under their purview.
NASA Astrophysics Data System (ADS)
Liu, Xia; Peyton, Liam; Kuziemsky, Craig
Health care is increasingly provided to citizens by a network of collaboration that includes multiple providers and locations. Typically, that collaboration occurs on an ad hoc basis via phone calls, faxes, and paper-based documentation. Internet and wireless technologies provide an opportunity to improve this situation via electronic data sharing. These new technologies make possible new ways of working and collaborating, but it can be difficult for health care organizations to understand how to use the new technologies while still ensuring that their policies and objectives are being met. It is also important to have a systematic approach to validate that e-health processes deliver the performance improvements that are expected. Using a case study of a palliative care patient receiving home care from a team of collaborating health organizations, we introduce a framework based on requirements engineering. Key concerns and objectives are identified and modeled (privacy, security, quality of care, and timeliness of service); then, proposed business processes that use the new technologies are modeled in terms of these concerns and objectives to assess their impact and to ensure that electronic data sharing is well regulated.
NASA Astrophysics Data System (ADS)
Pelizardi, Flavia; Bea, Sergio A.; Carrera, Jesús; Vives, Luis
2017-07-01
Mixing calculations (i.e., the calculation of the proportions in which end-members are mixed in a sample) are essential for hydrological research and water management. However, they typically require the use of conservative species, a condition that may be difficult to meet due to chemical reactions. Mixing calculations also require identifying end-member waters, which is usually achieved through End Member Mixing Analysis (EMMA). We present a methodology to help in the identification of both end-members and such reactions, so as to improve mixing ratio calculations. The proposed approach consists of: (1) identifying the potential chemical reactions with the help of EMMA; (2) defining decoupled conservative chemical components consistent with those reactions; (3) repeating EMMA with the decoupled (i.e., conservative) components, so as to identify end-member waters; and (4) computing mixing ratios using the new set of components and end-members. The approach is illustrated by application to two synthetic mixing examples involving mineral dissolution and cation exchange reactions. Results confirm that the methodology can successfully identify the geochemical processes affecting the mixtures, thus improving the accuracy of mixing ratio calculations and relaxing the need for conservative species.
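Step (4) can be sketched as a small constrained least-squares problem. The sketch below assumes two end-members and two decoupled conservative components, with made-up concentrations; the sum-to-one constraint is enforced by a heavily weighted extra row:

```python
# Hedged sketch of computing mixing ratios from end-member compositions by
# non-negative least squares; all concentrations are invented for illustration.
import numpy as np
from scipy.optimize import nnls

# Columns: end-members; rows: decoupled conservative components.
E = np.array([[1.0, 8.0],    # first conservative component (e.g. mmol/L)
              [0.2, 3.5]])   # second conservative component

sample = np.array([4.5, 1.85])  # observed mixture

# Enforce ratios summing to one via a heavily weighted constraint row.
w = 1e3
A = np.vstack([E, w * np.ones(E.shape[1])])
b = np.append(sample, w * 1.0)
ratios, _ = nnls(A, b)
print(ratios)  # mixing proportions of the two end-members (here ~[0.5, 0.5])
```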
Chain of evidence generation for contrast enhancement in digital image forensics
NASA Astrophysics Data System (ADS)
Battiato, Sebastiano; Messina, Giuseppe; Strano, Daniela
2010-01-01
The quality of the images obtained by digital cameras has improved greatly since the early days of digital photography. Unfortunately, it is not unusual in image forensics to encounter wrongly exposed pictures, mainly due to obsolete techniques or old technologies, but also due to backlight conditions. To extract otherwise invisible details, a stretching of the image contrast is obviously required. Forensic rules for producing evidence require complete documentation of the processing steps, enabling replication of the entire process; the automation of enhancement techniques is thus quite difficult and must be carefully documented. This work presents an automatic procedure for finding contrast enhancement settings, allowing both image correction and automatic script generation. The technique is based on a preprocessing step that extracts the features of the image and selects correction parameters. The parameters are then saved in JavaScript code that is used in the second step of the approach to correct the image. The generated script is Adobe Photoshop compliant (Photoshop being widely used in image forensics analysis), thus permitting the replication of the enhancement steps. Experiments on a dataset of images are also reported, showing the effectiveness of the proposed methodology.
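A minimal sketch of the two-stage pattern described above, with JSON standing in for the Adobe Photoshop-compliant JavaScript the authors actually generate; the function names and percentile thresholds are assumptions:

```python
# Two-stage contrast enhancement with a reproducible parameter log:
# stage 1 derives stretch parameters from the histogram and records them,
# stage 2 applies the logged parameters to correct the image.
import json
import numpy as np

def find_stretch_params(img, low_pct=1.0, high_pct=99.0):
    """Pick black/white points from percentiles of the intensity histogram."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    return {"black_point": float(lo), "white_point": float(hi)}

def apply_stretch(img, params):
    """Linear contrast stretch to [0, 255] using the logged parameters."""
    lo, hi = params["black_point"], params["white_point"]
    out = (img.astype(float) - lo) * 255.0 / (hi - lo)
    return np.clip(out, 0, 255).astype(np.uint8)

img = np.random.default_rng(2).uniform(40, 90, (64, 64)).astype(np.uint8)
params = find_stretch_params(img)
with open("enhancement_log.json", "w") as f:  # chain-of-evidence record
    json.dump(params, f)
print(apply_stretch(img, params).min(), apply_stretch(img, params).max())
```

Persisting the parameters, rather than only the corrected image, is what lets another examiner replicate the enhancement step for step.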
V&V Plan for FPGA-based ESF-CCS Using System Engineering Approach.
NASA Astrophysics Data System (ADS)
Maerani, Restu; Mayaka, Joyce; El Akrat, Mohamed; Cheon, Jung Jae
2018-02-01
Instrumentation and Control (I&C) systems play an important role in maintaining the safety of Nuclear Power Plant (NPP) operation. However, most current I&C safety systems are based on Programmable Logic Controller (PLC) hardware, which is difficult to verify and validate and is susceptible to software common cause failure. Therefore, a plan is needed for the replacement of PLC-based safety systems, such as the Engineered Safety Feature - Component Control System (ESF-CCS), with Field Programmable Gate Arrays (FPGAs). By using a systems engineering approach, which ensures traceability in every phase of the life cycle, from system requirements and design implementation to verification and validation, the system development is guaranteed to be in line with regulatory requirements. The verification process ensures that customer and stakeholder needs are satisfied in a high-quality, trustworthy, cost-efficient, and schedule-compliant manner throughout the system's entire life cycle. The benefit of the V&V plan is to ensure that the FPGA-based ESF-CCS is correctly built, and that the measurement of performance indicators provides positive feedback on the question "are we doing the right thing" during the re-engineering of the FPGA-based ESF-CCS.
NASA Astrophysics Data System (ADS)
Luo, Yao; Wu, Mei-Ping; Wang, Ping; Duan, Shu-Ling; Liu, Hao-Jun; Wang, Jin-Long; An, Zhan-Feng
2015-09-01
The full magnetic gradient tensor (MGT) refers to the spatial rate of change of the three field components of the geomagnetic field vector along three mutually orthogonal axes. The tensor is useful for geological mapping, resource exploration, magnetic navigation, and other applications. However, it is very difficult to measure the full MGT using existing engineering technology. We present a method that uses triaxial aeromagnetic gradient measurements to derive the full MGT. The method uses the triaxial gradient data and makes full use of the variation of the magnetic anomaly modulus in three dimensions to obtain a self-consistent magnetic gradient tensor. Numerical simulations show that the full MGT data obtained with the proposed method are of high precision and satisfy the requirements of data processing. We selected triaxial aeromagnetic gradient data from Hebei Province for calculating the full MGT. Data processing shows that using triaxial gradient data takes advantage of the spatial rate of change of the total field in three dimensions and suppresses part of the independent noise in the aeromagnetic gradient. The calculated tensor components have improved resolution, and the transformed full tensor gradient satisfies the requirements of geological mapping and interpretation.
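For background (standard potential-field theory, not a result of the paper): in source-free air, Maxwell's equations make the gradient tensor symmetric and trace-free, so only five of its nine components are independent, and this redundancy is what such derivations exploit:

```latex
\[
  G_{ij} \;=\; \frac{\partial B_i}{\partial x_j},
  \qquad
  G_{ij} = G_{ji} \;\;(\nabla\times\mathbf{B}=\mathbf{0}),
  \qquad
  G_{xx} + G_{yy} + G_{zz} = 0 \;\;(\nabla\cdot\mathbf{B}=0).
\]
```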
Assuring Ground-Based Detect and Avoid for UAS Operations
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Pai, Ganeshmadhav Jagadeesh; Berthold, Randall; Fladeland, Matthew; Storms, Bruce; Sumich, Mark
2014-01-01
One of the goals of the Marginal Ice Zones Observations and Processes Experiment (MIZOPEX), a NASA Earth science mission, was to demonstrate the operational capabilities of Unmanned Aircraft Systems (UAS) when deployed on challenging missions in difficult environments. Given the extreme conditions of the Arctic environment where MIZOPEX measurements were required, the mission opted to use a radar to provide a ground-based detect-and-avoid (GBDAA) capability as an alternate means of compliance (AMOC) with the see-and-avoid federal aviation regulation. This paper describes how GBDAA safety assurance was provided by interpreting and applying the guidelines in the national policy for UAS operational approval. In particular, we describe how we formulated the appropriate safety goals, defined the processes and procedures for system safety, identified and assembled the relevant safety verification evidence, and created an operational safety case in compliance with Federal Aviation Administration (FAA) requirements. To the best of our knowledge, the safety case, which was ultimately approved by the FAA, is the first successful example of non-military UAS operations using GBDAA in the U.S. National Airspace System (NAS) and, therefore, the first non-military application of the safety case concept in this context.