Examining the Effect of the Die Angle on Tool Load and Wear in the Extrusion Process
NASA Astrophysics Data System (ADS)
Nowotyńska, Irena; Kut, Stanisław
2014-04-01
Tool durability is a crucial factor in every manufacturing process, including extrusion. Striving for higher product quality should be accompanied by long tool life and reduced production costs. This article presents a comparative study of die load and wear at various working-cone angles during concurrent extrusion. The numerical calculations of tool load during concurrent extrusion were performed with the MSC MARC finite element method (FEM) software. The Archard model, implemented in the software via FEM, was used to determine and compare die wear. Tool deformations and stress distributions were determined from the analyses, and the die wear depth at various working-cone angles was calculated. A properly shaped die not only affects the properties of the extruded material but also controls loads, elastic deformation, and tool life.
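The Archard relation the study applies is simple enough to state directly. Below is a minimal sketch of how it can compare wear between die geometries; the wear coefficient, loads, sliding distance, and hardness are invented placeholders, not values from the paper, in which the contact loads come from the FEM solution.

```python
# Hedged sketch of the Archard wear law, V = K * F * s / H.
# All numbers are hypothetical; in the study, contact loads come from
# the MSC MARC FEM solution rather than being assumed.

def archard_wear_volume(K, normal_load_N, sliding_distance_m, hardness_Pa):
    """Worn volume in m^3: V = K * F * s / H."""
    return K * normal_load_N * sliding_distance_m / hardness_Pa

# Compare two working-cone angles under assumed contact conditions.
for angle_deg, load_N in [(45, 1.2e5), (60, 0.9e5)]:
    V = archard_wear_volume(K=1e-4, normal_load_N=load_N,
                            sliding_distance_m=50.0, hardness_Pa=6.0e9)
    print(f"cone angle {angle_deg} deg: wear volume {V * 1e9:.0f} mm^3")
```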
Development of materials for the rapid manufacture of die cast tooling
NASA Astrophysics Data System (ADS)
Hardro, Peter Jason
The focus of this research is to develop a material composition that can be processed by rapid prototyping (RP) in order to produce tooling for the die casting process, with the goal that these rapidly produced tools be superior to traditionally produced tooling by offering one or more of the following advantages: reduced tooling cost, shortened tooling creation time, reduced man-hours for tool creation, increased tool life, and shortened die casting cycle time. By utilizing RP's additive build process and vast material selection, there was a prospect that die cast tooling could be produced more quickly and with superior material properties. To this end, the material properties that influence die life and cycle time were determined, and a list of materials that fulfill these "optimal" properties was compiled. Physical testing was conducted to grade the processability of each material system and to optimize the manufacturing process for the downselected material system. Sample specimens were produced, and microscopy techniques were utilized to determine a number of physical properties of the material system. Additionally, a benchmark geometry was selected, and die casting dies were produced both from traditional tool materials (H13 steel) and techniques (machining) and from the newly developed materials and RP techniques (selective laser sintering (SLS) and laser engineered net shaping (LENS)). Once the tools were created, a die cast alloy was selected and a preset number of parts were shot into each tool. During tool creation, the manufacturing time and cost were closely monitored, and an economic model was developed to compare traditional tooling to RP tooling. This model allows one to determine, in the early design stages, when it is advantageous to implement RP tooling and when traditional tooling would be best. The results of the physical testing and economic analysis have shown that RP tooling is able to achieve a number of the research objectives, namely, reduced tooling cost, shortened tooling creation time, and reduced man-hours for tool creation. However, identifying the appropriate time to use RP tooling appears to be the most important factor in achieving successful implementation.
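The economic model itself is not reproduced in the abstract, but the core trade-off it captures can be illustrated with a toy breakeven calculation. All figures below are invented placeholders, assuming RP tooling has a lower up-front cost but a higher amortized per-part cost (for example, from shorter tool life).

```python
# Toy breakeven sketch for traditional vs. RP tooling. Costs are invented
# placeholders, not the study's data or its actual economic model.

def breakeven_volume(fixed_trad, fixed_rp, per_part_trad, per_part_rp):
    """Production volume at which the two total tooling costs are equal."""
    return (fixed_trad - fixed_rp) / (per_part_rp - per_part_trad)

n = breakeven_volume(fixed_trad=40_000.0, fixed_rp=15_000.0,
                     per_part_trad=1.00, per_part_rp=1.60)
print(f"RP tooling is cheaper below roughly {n:,.0f} parts")  # ~41,667 here
```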
NASA Astrophysics Data System (ADS)
Farina, Simone; Thepsonti, Thanongsak; Ceretti, Elisabetta; Özel, Tugrul
2011-05-01
Titanium alloys offer superb strength, corrosion resistance and biocompatibility and are commonly utilized in medical devices and implants. Micro-end milling is a direct and rapid fabrication method for manufacturing medical devices and implants in titanium alloys. Process performance and quality depend upon an understanding of the relationship between cutting parameters, forces, and resultant tool deflections in order to avoid tool breakage. For this purpose, FE simulations of chip formation during micro-end milling of Ti-6Al-4V alloy with an ultra-fine-grain solid carbide two-flute micro-end mill are investigated using DEFORM software. First, specific forces in the tangential and radial cutting directions during micro-end milling for varying feed rates and rotational speeds were determined using designed FE simulations of the chip formation process. These forces were then applied to the micro-end mill geometry along the axial depth of cut in a 3D analysis in ABAQUS. Consequently, 3D distributions of tool deflection and von Mises stress are determined. These analyses will aid in establishing integrated multi-physics process models for high-performance micro-end milling and represent a step toward process improvements.
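As a rough companion to the 3D FE deflection analysis, a micro-end mill is often first approximated as a cantilevered round beam. The sketch below uses that textbook approximation; the tool dimensions, resultant force, and elastic modulus are assumptions for illustration, not outputs of the paper's DEFORM/ABAQUS simulations.

```python
import math

# First-order cantilever check of micro-end mill tip deflection, of the
# kind the 3D FE analysis refines. All values are hypothetical.

def cantilever_tip_deflection(force_N, length_m, diameter_m, E_Pa):
    """delta = F L^3 / (3 E I), with I = pi d^4 / 64 for a solid round shank."""
    I = math.pi * diameter_m**4 / 64.0
    return force_N * length_m**3 / (3.0 * E_Pa * I)

# Assumed: 0.5 mm carbide micro-end mill, 3 mm stick-out, 2 N resultant force.
delta = cantilever_tip_deflection(force_N=2.0, length_m=3e-3,
                                  diameter_m=0.5e-3, E_Pa=600e9)
print(f"estimated tip deflection: {delta * 1e6:.1f} um")  # ~10 um here
```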
ERIC Educational Resources Information Center
Ayala, Gabriela Cota; Real, Francia Angélica Karlos; Ivan, Ramirez Alvarado Edqar
2016-01-01
The research was conducted to determine whether the industrial processes program at the Technological University of Chihuahua, one year after being certified by CACEI, continues to achieve the established indicators and ISO 9001:2008. Quality tools are implemented, the monitoring of essential indicators is determined, flow charts are…
Variables Control Charts: A Measurement Tool to Detect Process Problems within Housing
ERIC Educational Resources Information Center
Luna, Andrew
1999-01-01
The purpose of this study was to use quality improvement tools to determine if the current process of supplying hot water to a high-rise residence hall for women at a southeastern Doctoral I granting institution was in control. After a series of focus groups among the residents in the hall, it was determined that they were mostly concerned about…
Porous tooling process for manufacture of graphite/polyimide composites
NASA Technical Reports Server (NTRS)
Smiser, L. W.; Orr, K. K.; Araujo, S. M.
1981-01-01
A porous tooling system was selected for processing graphite/PMR-15 polyimide laminates in thicknesses up to 3.2 mm (0.125 inch). This tool system must have reasonable strength, permeability, dimensional stability, and thermal conductivity to accomplish curing at 600 F and 200 psi autoclave temperature and pressure. A permeability measuring apparatus was constructed, and permeability vs. casting water level was determined in order to produce tools at three different permeability levels. On these tools, laminates of 5, 11, and 22 plies (0.027, 0.060, and 0.121 inch) were produced and evaluated by ultrasonic, mechanical, and thermal tests to determine the effect of tool permeability on the cured laminates. All tools produced acceptable laminates at 5 and 11 plies, but only the highest-permeability tool produced acceptably clear ultrasonic C-scans. Recommendations are made for future investigations of design geometry and strengthening techniques for porous ceramic tooling.
Colossal Tooling Design: 3D Simulation for Ergonomic Analysis
NASA Technical Reports Server (NTRS)
Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid
2003-01-01
High-level 3D simulation software was applied to the design phase of colossal mandrel tooling for composite aerospace fuel tanks in order to discover and resolve safety and human engineering problems. The analyses were conducted to determine the safety, ergonomic, and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. High-level three-dimensional graphics software, incorporating various ergonomic analysis algorithms, was utilized to determine whether the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in identifying material handling equipment and devices for the mandrel tooling assembly/disassembly process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lherbier, Louis W.; Novotnak, David J.; Herling, Darrell R.
Hot forming processes such as forging, die casting and glass forming require tooling that is subjected to high temperatures during the manufacturing of components. Current tooling is adversely affected by prolonged exposure at high temperatures. Initial studies were conducted to determine the root cause of tool failures in a number of applications. Results show that tool failures vary and depend on the operating environment under which they are used. Major root-cause failures include (1) thermal softening, (2) fatigue and (3) tool erosion, all of which are affected by process boundary conditions such as lubrication, cooling, process speed, etc. While thermal management is a key to addressing tooling failures, it was clear that new tooling materials with superior high temperature strength could provide improved manufacturing efficiencies. These efficiencies are based on the use of functionally graded materials (FGM), a new subset of hybrid tools with customizable properties that can be fabricated using advanced powder metallurgy manufacturing technologies. Modeling studies of the various hot forming processes helped identify the effect of key variables such as stress, temperature and cooling rate, and aided in the selection of tooling materials for specific applications. To address the problem of high temperature strength, several advanced powder metallurgy nickel and cobalt based alloys were selected for evaluation. These materials were manufactured into tooling using two relatively new consolidation processes: laser powder deposition (LPD) and solid state dynamic powder consolidation (SSDPC). These processes made possible functionally graded materials that resulted in shaped tooling that was monolithic, bi-metallic or substrate coated. Manufacturing of tooling with these processes was determined to be robust and consistent for a variety of materials. Prototype and production testing of FGM tooling showed the benefits of the nickel and cobalt based powder metallurgy alloys in a number of applications evaluated. Improvements in tool life ranged from three (3) to twenty (20) or more times that of currently used tooling. Improvements were most dramatic where tool softening and deformation were the major causes of tool failure in hot/warm forging applications. Significant improvement was also noted in erosion of aluminum die casting tooling. Cost and energy savings can be realized as a result of increased tooling life, increased productivity and a reduction in scrap because of improved dimensional controls. Although LPD and SSDPC tooling usually have higher acquisition costs, net tooling cost per component produced drops dramatically with superior tool performance. Less energy is used to manufacture the tooling because fewer tools are required and less recycling of used tools is needed for the hot forming process. Energy is saved during the component manufacturing cycle because more parts can be produced in shorter periods of time. Energy is also saved by minimizing heating furnace idling time because of less downtime for tooling changes.
DDP - a tool for life-cycle risk management
NASA Technical Reports Server (NTRS)
Cornford, S. L.; Feather, M. S.; Hicks, K. A.
2001-01-01
At JPL we have developed and implemented a process for achieving life-cycle risk management. This process has been embodied in a software tool called Defect Detection and Prevention (DDP). The DDP process can be succinctly stated as: determine where we want to be, what could get in the way, and how we will get there.
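A schematic reading of that statement is a risk score per failure mode that selected mitigations reduce. The sketch below is an illustrative interpretation only; the failure modes, scores, and effectiveness values are invented, and the actual DDP tool is considerably richer.

```python
# Schematic of the DDP idea: risks scored as likelihood x impact, reduced
# by the effectiveness of the mitigations chosen to "get there".
# All numbers are illustrative placeholders.

failure_modes = {
    "sensor drift":   {"likelihood": 0.30, "impact": 8.0},
    "software fault": {"likelihood": 0.10, "impact": 9.0},
}
mitigations = {
    "sensor drift":   [0.5, 0.4],   # effectiveness of each applied control
    "software fault": [0.7],
}

for name, fm in failure_modes.items():
    residual = fm["likelihood"] * fm["impact"]
    for eff in mitigations.get(name, []):
        residual *= (1.0 - eff)     # each mitigation removes a fraction of risk
    print(f"{name}: residual risk {residual:.2f}")
```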
ERIC Educational Resources Information Center
Derry, Sharon; And Others
This study examined ways in which two independent variables, peer collaboration and the use of a specific tool (the TAPS interface), work together and individually to shape students' problem-solving processes. More specifically, the researchers were interested in determining how collaboration and TAPS use cause metacognitive processes to differ…
Improving overlay control through proper use of multilevel query APC
NASA Astrophysics Data System (ADS)
Conway, Timothy H.; Carlson, Alan; Crow, David A.
2003-06-01
Many state-of-the-art fabs operate with increasingly diversified product mixes. For example, at Cypress Semiconductor, it is not unusual to be concurrently running multiple technologies and many devices within each technology. This diverse product mix significantly increases the difficulty of manually controlling overlay process corrections. As a result, automated run-to-run feedforward-feedback control has become a necessary and vital component of manufacturing. However, traditional run-to-run controllers rely on highly correlated historical events to forecast process corrections. For example, the historical process events are typically constrained to match the current event for exposure tool, device, process level, and reticle ID. This narrowly defined process stream can result in insufficient data when applied to low-volume or newly released devices. The run-to-run controller implemented at Cypress utilizes a multi-level query (Level-N) correlation algorithm, where each subsequent level widens the search criteria for available historical data. The paper discusses how best to widen the search criteria and how to determine and apply a known bias to account for tool-to-tool and device-to-device differences. Specific applications include offloading lots from one tool to another when the first tool is down for preventive maintenance, utilizing related devices to determine a default feedback vector for newly released devices, and applying bias values to account for known reticle-to-reticle differences. In this study, we show how historical data can be leveraged from related devices or tools to overcome the limitations of narrow process streams. In particular, this paper discusses how effectively handling narrow process streams allows Cypress to offload lots from a baseline tool to an alternate tool.
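The Level-N idea can be sketched as a staged lookup that relaxes the match criteria, one key per level, until enough history is found. The field names, level ordering, and minimum-event threshold below are illustrative assumptions, not Cypress's implementation.

```python
# Sketch of a multi-level (Level-N) history lookup: each level widens the
# search criteria until enough historical events are found.

LEVELS = [
    ("tool", "device", "layer", "reticle"),  # level 0: fully matched stream
    ("tool", "device", "layer"),
    ("tool", "layer"),
    ("layer",),                              # level 3: widest search
]

def find_history(history, lot, min_events=5):
    for level, keys in enumerate(LEVELS):
        matches = [h for h in history if all(h[k] == lot[k] for k in keys)]
        if len(matches) >= min_events:
            return level, matches
    return len(LEVELS), []

history = [
    {"tool": "T1", "device": "D1", "layer": "M1", "reticle": "R1"},
    {"tool": "T1", "device": "D2", "layer": "M1", "reticle": "R4"},
]
lot = {"tool": "T1", "device": "D9", "layer": "M1", "reticle": "R7"}
level, events = find_history(history, lot, min_events=1)
print(f"matched at level {level} with {len(events)} events")
# A known tool-to-tool or reticle-to-reticle bias would then be applied to
# the feedback vector whenever the match came from a widened stream.
```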
Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona
2012-01-01
Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory authorities such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005 provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to areas where the application of quality improvement and quality risk assessment principles can achieve six-sigma-capable processes. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process, where the quality control parameters act as quality assessment parameters. Application of risk assessment enables selection of critical quality attributes from among the quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing trends, and quantification of process capability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable process, which is then the candidate for improvement. Statistical process control thus proves to be an important tool for six-sigma-capable process development and continuous quality improvement.
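The capability arithmetic such a study relies on is standard. Below is a minimal sketch of a Cpk and approximate short-term sigma-level calculation; the tablet weights and specification limits are invented placeholders, not the study's data.

```python
import statistics

# Minimal process-capability sketch: Cpk and an approximate short-term
# sigma level from tablet weights. All values are hypothetical.

weights = [249.8, 250.3, 250.1, 249.6, 250.4, 249.9, 250.2, 250.0]  # mg
LSL, USL = 249.0, 251.0  # hypothetical specification limits, mg

mu = statistics.mean(weights)
sigma = statistics.stdev(weights)
cpk = min(USL - mu, mu - LSL) / (3 * sigma)
print(f"Cpk = {cpk:.2f}, short-term sigma level ~ {3 * cpk:.1f}")
```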
Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin
2014-06-01
The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time have caused some resistance to QbD implementation. To show a possible solution, this work proposes a rapid process development method that uses direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool, taking the chromatographic process of Ginkgo biloba L. as an example. The breakthrough curves were rapidly determined by DART-MS at-line. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and by HPLC. Based on the PAT tool, the impacts of process parameters on the adsorption capacity were discovered rapidly; the adsorption capacity decreased as the flow rate increased. This work demonstrates the feasibility and advantages of integrating PAT into QbD implementation for rapid process development.
NASA Technical Reports Server (NTRS)
Nunes, Arthur C., Jr.
2008-01-01
Friction stir welding (FSW) is a solid state welding process invented in 1991 at The Welding Institute in the United Kingdom. A weld is made in the FSW process by translating a rotating pin along a weld seam so as to stir the sides of the seam together. FSW avoids deleterious effects inherent in melting and promises to be an important welding process for any industry where welds of optimal quality are demanded. This article provides an introduction to the FSW process. The chief concern is the physical effect of the tool on the weld metal: how weld seam bonding takes place, what kind of weld structure is generated, potential problems (possible defects, for example), and implications for process parameters and tool design. Weld properties are determined by structure, and the structure of friction stir welds is determined by the weld metal flow field in the vicinity of the weld tool. Metal flow in the vicinity of the weld tool is explained through a simple kinematic flow model that decomposes the flow field into three basic component flows: a uniform translation, a rotating solid cylinder, and a ring vortex encircling the tool. The flow components, superposed to construct the flow model, can be related to particular aspects of weld process parameters and tool design; they provide a bridge to an understanding of a weld structure that is complex at first glance. Torques and forces are also discussed, and some simple mathematical models of structural aspects, torques, and forces are included.
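A 2D slice of the kinematic decomposition can be written down directly: uniform translation everywhere, plus rigid rotation inside the pin radius. The sketch below omits the ring-vortex component for brevity, and the pin radius, travel speed, and spindle speed are assumed values, not the article's parameters.

```python
import numpy as np

# Simplified 2D slice of the FSW kinematic flow model: uniform translation
# superposed with a rotating solid cylinder (ring vortex omitted here).
# Parameter values are illustrative assumptions.

def fsw_velocity(x, y, V=2e-3, omega=2 * np.pi * 400 / 60, R=3e-3):
    """Velocity (m/s) at (x, y) near a pin of radius R spinning at omega."""
    r = np.hypot(x, y)
    if r < R:                       # material rotating with the tool
        return np.array([-omega * y + V, omega * x])
    return np.array([V, 0.0])       # far field: uniform translation only

print(fsw_velocity(1e-3, 1e-3))    # sample point inside the stirred zone
```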
Dependency between removal characteristics and defined measurement categories of pellets
NASA Astrophysics Data System (ADS)
Vogt, C.; Rohrbacher, M.; Rascher, R.; Sinzinger, S.
2015-09-01
Optical surfaces are usually machined by grinding and polishing. To achieve short polishing times it is necessary to grind with the best possible form accuracy and with low subsurface damage. This is possible by using very fine-grained grinding tools for the finishing process. These, however, often show time-dependent cutting ability in conjunction with tool wear. Fine grinding tools in optics are often pellet tools. For a successful grinding process the tools must show constant self-sharpening performance; a constant, or at least predictable, wear and cutting behavior is crucial for deterministic machining. This work describes a method to determine the characteristics of pellet grinding tools by tests conducted with a single pellet. We investigate the determination of the effective material removal rate and the derivation of the G-ratio. In particular, the change of the tool from the newly dressed, through the quasi-stationary, to the worn state is described. By recording the roughness achieved with the single pellet it is possible to derive the roughness expected from a serial pellet tool made of pellets with the same specification. From the results of these tests the usability of a pellet grinding tool for a specific grinding task can be determined without testing a comparably expensive serial tool. The results are verified by a production test with a serial tool under series conditions. The collected data can be stored in an appropriate database of tool characteristics and combined with useful applications.
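The G-ratio mentioned above is the standard grinding ratio. A one-line computation makes the definition concrete; the volumes are placeholders, not the paper's single-pellet measurements.

```python
# Grinding ratio (G-ratio): workpiece volume removed per volume of tool
# wear. The values below are hypothetical placeholders.

def g_ratio(removed_volume_mm3, pellet_wear_volume_mm3):
    return removed_volume_mm3 / pellet_wear_volume_mm3

print(f"G = {g_ratio(120.0, 2.5):.0f}")  # 48 mm^3 removed per mm^3 of wear
```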
Accelerated bridge construction (ABC) decision making and economic modeling tool.
DOT National Transportation Integrated Search
2011-12-01
In this FHWA-sponsored pooled-fund study, a set of decision-making tools based on the Analytic Hierarchy Process (AHP) was developed. This tool set is prepared for transportation specialists and decision-makers to determine if ABC is more effective ...
Immediate tool incorporation processes determine human motor planning with tools
Ganesh, G.; Yoshioka, T.; Osu, R.; Ikegami, T.
2014-01-01
Human dexterity with tools is believed to stem from our ability to incorporate and use tools as parts of our body. However, tool incorporation, evident as extensions in our body representation and peri-personal space, has been observed predominantly after extended tool exposure and does not explain our immediate motor behaviours when we change tools. Here we utilize two novel experiments to elucidate the presence of additional immediate tool incorporation effects that determine motor planning with tools. Interestingly, tools were observed to immediately induce a trial-by-trial, tool-length-dependent shortening of the perceived limb lengths, opposite to the elongations observed after extended tool use. Our results thus show that tools induce a dual effect on our body representation: an immediate shortening that critically affects motor planning with a new tool, and a slow elongation, probably a consequence of skill-related changes in sensory-motor mappings with repeated use of the tool. PMID:25077612
ERIC Educational Resources Information Center
Burke, Victoria; Greenberg, Daphne
2010-01-01
There are many readability tools that instructors can use to help adult learners select reading materials. We describe and compare different types of readability tools: formulas calculated by hand, tools found on the Web, tools embedded in a word processing program, and readability tools found in a commercial software program. Practitioners do not…
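As an example of the "formulas calculated by hand" category, the sketch below implements the Flesch-Kincaid grade level, one widely used readability formula; the article does not say which formulas the surveyed tools use, and the syllable counter here is only a rough heuristic.

```python
import re

# Flesch-Kincaid grade level:
#   0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
# Syllables are approximated by counting vowel groups.

def syllables(word):
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def fk_grade(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syl = sum(syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syl / len(words) - 15.59

print(f"grade level: {fk_grade('The cat sat on the mat. It was warm.'):.1f}")
```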
Determination of high-strength materials diamond grinding rational modes
NASA Astrophysics Data System (ADS)
Arkhipov, P. V.; Lobanov, D. V.; Rychkov, D. A.; Yanyushkin, A. S.
2018-03-01
An analysis of methods for abrasive processing of high-strength materials is carried out. This analysis made it possible to determine the necessary directions and prospects for the development of combined shaping methods. The need to use metal-bonded diamond abrasive tools in combination with a different kind of energy is noted as a way to improve processing efficiency and reduce the complexity of operations. A complex of experimental research was performed to reveal the importance of the mechanical and electrical components of the cutting regimes for the cutting ability of diamond tools, as well as the need to reduce the specific consumption of the abrasive wheel as one of the important economic indicators of the process. It is established that combined diamond grinding with simultaneous continuous dressing of the abrasive wheel increases the cutting ability of metal-bonded diamond abrasive tools when processing high-strength materials by an average of 30% compared to conventional diamond grinding. Particular recommendations on the choice of technological factors are developed depending on specific production problems.
Enhancement of LEEDS Decision Tools for E-Craft
2012-03-13
software tool was conducted to determine how best to incorporate RCM into the FMEA process already a part of LEEDS. Plans were made to enhance the...as opposed to implementing a strictly scheduled and costly equipment maintenance program. Utilizing the FMEA process already a part of LEEDS
Abstract: This case study application discusses a selected application of advanced concepts included in the End of Asset Life Reinvestment decision-making process tool, using a utility practitioner's data set. The tool provides step-by-step process guidance to the as...
UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.
Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel
2013-09-01
In this short communication, UV-Vis was used as an analytical tool for the quantification of lignin concentrations in aqueous media. A significant correlation was determined between absorbance and the concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280 and 287 nm. UV-Vis quantification of lignin was found suitable for different types of biomass, making this a time-saving analytical method that could be used as a Process Analytical Tool (PAT) in biorefineries utilizing steam processes or comparable approaches.
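The quantification rests on a linear absorbance-concentration calibration (Beer-Lambert behaviour) at the reported wavelengths. The sketch below shows such a calibration at 280 nm with invented data points; the paper's actual calibration values are not reproduced here.

```python
import numpy as np

# Linear calibration of absorbance vs. lignin concentration at 280 nm.
# The standards and readings below are invented placeholders.

conc = np.array([0.05, 0.10, 0.20, 0.40, 0.80])        # g/L lignin standards
absorbance = np.array([0.11, 0.21, 0.44, 0.86, 1.70])  # A at 280 nm

slope, intercept = np.polyfit(conc, absorbance, 1)
r = np.corrcoef(conc, absorbance)[0, 1]
print(f"A = {slope:.2f}*C + {intercept:.3f}, r = {r:.4f}")

# An unknown sample is then quantified by inverting the fit:
A_sample = 0.65
print(f"estimated lignin: {(A_sample - intercept) / slope:.3f} g/L")
```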
NASA Astrophysics Data System (ADS)
Filippov, A. V.; Tarasov, S. Yu; Podgornyh, O. A.; Shamarin, N. N.; Filippova, E. O.
2017-01-01
Automation of engineering processes requires developing the relevant mathematical support and computer software. Analysis of metal cutting kinematics and tool geometry is a necessary key task at the preproduction stage. This paper is focused on developing a procedure for determining the geometry of oblique lathe machining with a peakless round-nose tool using vector/matrix transformations. Such an approach allows integration into modern mathematical software packages, in contrast to the traditional analytic description, and is therefore very promising for developing automated control of the preproduction process. A kinematic criterion for applicable tool geometry has been developed from the results of this study. The effect of tool blade inclination and curvature on the geometry-dependent process parameters was evaluated.
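The flavor of the vector/matrix approach can be shown by composing rotation matrices to carry a cutting-edge direction between coordinate systems. The rotation order and angle values below are assumptions for illustration, not the paper's derivation.

```python
import numpy as np

# Illustrative vector/matrix transformation of a cutting-edge direction.
# The chosen rotation order and angles are hypothetical.

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

inclination = np.radians(10.0)   # blade inclination angle (assumed)
rotation = np.radians(-5.0)      # setting rotation about the tool axis

edge = np.array([0.0, 1.0, 0.0])                 # edge direction, tool system
edge_in_use = rot_z(rotation) @ rot_x(inclination) @ edge
print(edge_in_use)                               # edge direction, in-use system
```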
GREENSCOPE: Sustainable Process Modeling
EPA researchers are responding to environmental problems by incorporating sustainability into process design and evaluation. EPA researchers are also developing a tool that allows users to assess modifications to existing and new chemical processes to determine whether changes in...
Operations management tools to be applied for textile
NASA Astrophysics Data System (ADS)
Maralcan, A.; Ilhan, I.
2017-10-01
In this paper, basic concepts of process analysis such as flow time, inventory, bottleneck, labour cost and utilization are illustrated first. The effect of the bottleneck on the results of a business is especially emphasized. In the next section, tools for productivity measurement are introduced and exemplified: the KPI (Key Performance Indicators) tree, OEE (Overall Equipment Effectiveness) and takt time. A KPI tree is a diagram on which we can visualize all the variables of an operation that drive financial results through cost and profit. OEE is a tool to measure the potential extra capacity of a piece of equipment or an employee. Takt time is a tool to determine the process flow rate according to customer demand. The KPI tree is studied through the whole process, while OEE is exemplified for a stenter frame machine, which is the most important machine (and usually the bottleneck) and the most expensive investment in a finishing plant. Takt time is exemplified for the quality control department. Finally, quality tools are introduced: six sigma, control charts and jidoka. Six sigma is a tool to measure process capability and thereby the probability of a defect. A control chart is a powerful tool to monitor the process. The idea of jidoka (detect, stop and alert) is about alerting people that there is a problem in the process.
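The two productivity formulas the paper exemplifies are easy to state concretely. The shift times, demand, and machine counts below are invented placeholders, not the paper's textile data.

```python
# Takt time and OEE with hypothetical shift numbers.

# Takt time: available working time divided by customer demand.
available_s = 8 * 3600 - 45 * 60        # one shift minus breaks
demand = 420                            # pieces required per shift
print(f"takt time = {available_s / demand:.0f} s/piece")

# OEE (e.g., for a stenter frame): availability x performance x quality.
availability = 420 / 480                # run time / planned time (min)
performance = 95 / 105                  # actual / theoretical output
quality = 93 / 95                       # good pieces / total pieces
print(f"OEE = {availability * performance * quality:.1%}")
```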
High Thermal Conductivity and High Wear Resistance Tool Steels for cost-effective Hot Stamping Tools
NASA Astrophysics Data System (ADS)
Valls, I.; Hamasaiid, A.; Padré, A.
2017-09-01
In hot stamping/press hardening, in addition to its shaping function, the tool controls the cycle time, the quality of the stamped components (through determining the cooling rate of the stamped blank), the production costs and the feasibility frontier for stamping a given component. During stamping, heat is extracted from the stamped blank and transported through the tool to the cooling medium in the cooling lines. Hence, the tool's thermal properties determine the cooling rate of the blank, the heat transport mechanism, stamping times and temperature distribution. The tool surface's resistance to adhesive and abrasive wear is also an important cost factor, as it determines tool durability and maintenance costs. Wear is influenced by many tool material parameters, such as the microstructure, composition, hardness level and distribution of strengthening phases, as well as the tool's working temperature. A decade ago, Rovalma developed a hot work tool steel for hot stamping that features a thermal conductivity more than double that of any conventional hot work tool steel. Since then, many complementary grades have been developed to provide tailored material solutions as a function of the production volume, degree of blank cooling and wear resistance requirements, tool geometries, tool manufacturing method, type and thickness of the blank material, etc. Recently, Rovalma has developed a new generation of high thermal conductivity, high wear resistance tool steel grades that enable the manufacture of cost-effective tools for hot stamping, increasing process productivity and reducing tool manufacturing costs and lead times. Both of these novel grades feature high wear resistance and high thermal conductivity to enhance tool durability and cut cycle times in the production of hot stamped components. Furthermore, one of the new grades reduces tool manufacturing costs through low tool material cost and hardening via readily available gas quenching, whereas the other enables faster manufacture of the tool at reduced cost by eliminating time- and money-consuming high temperature hardening altogether. The latter grade can be hardened from a soft delivery state, for easy machining, to 52 HRC by a simple low temperature precipitation hardening. In this work, these new grades and the role of the tool material's thermal, mechanical and tribological properties, as well as their processing features, are discussed in light of enabling the manufacture of intelligent hot stamping tools.
Implementation Analysis of Cutting Tool Carbide with Cast Iron Material S45 C on Universal Lathe
NASA Astrophysics Data System (ADS)
Junaidi; hestukoro, Soni; yanie, Ahmad; Jumadi; Eddy
2017-12-01
A cutting tool is the working element of a lathe. This study analyzes the cutting process of a carbide tool machining S45C cast iron material on a universal lathe with respect to several aspects, namely cutting force, cutting speed, cutting power, indicated cutting power, and the temperatures in zone 1 and zone 2. The purpose of this study was to determine the cutting speed, cutting power, electromotor power, and zone 1 and zone 2 temperatures for the carbide cutting tool in the process of turning cast iron material. The cutting force was obtained from analysis of the relationship between the recommended cutting force component and the plane of cut, and the cutting speed was obtained from analysis of the relationship between the recommended cutting speed and the feed rate.
Traceability of On-Machine Tool Measurement: A Review.
Mutilba, Unai; Gomez-Acedo, Eneko; Kortaberria, Gorka; Olarra, Aitor; Yagüe-Fabra, Jose A
2017-07-11
Nowadays, errors during the manufacturing process of high-value components are not acceptable in driving industries such as energy and transportation. Sectors such as aerospace, automotive, shipbuilding, nuclear power, large science facilities and wind power need complex and accurate components that demand close measurement and fast feedback into their manufacturing processes. New measuring technologies are already available in machine tools, including integrated touch probes and fast interface capabilities. They provide the possibility to measure the workpiece in-machine during or after its manufacture, maintaining the original setup of the workpiece and avoiding interruption of the manufacturing process to transport the workpiece to a measuring position. However, the traceability of the measurement process on a machine tool is not yet ensured, and measurement data are still not reliable enough for process control or product validation. The scientific objective is to determine the uncertainty of a machine tool measurement and, thereby, convert it into a machine-integrated traceable measuring process. For that purpose, an error budget should consider error sources such as the machine tool, the components under measurement and the interactions between them. This paper reviews all of those uncertainty sources, focusing mainly on those related to the machine tool, whether in the process of geometric error assessment of the machine or in the technology employed to probe the measurand.
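The first step toward such an error budget is usually a root-sum-of-squares combination of uncorrelated contributors. The contributor list and magnitudes below are illustrative assumptions, not figures from the review.

```python
import math

# Root-sum-of-squares combination of an on-machine measurement error
# budget. Contributors and magnitudes are hypothetical placeholders.

budget_um = {
    "machine geometric errors": 4.0,
    "thermal drift":            3.0,
    "probing":                  1.5,
    "workpiece clamping":       1.0,
}

u_c = math.sqrt(sum(u**2 for u in budget_um.values()))
print(f"combined standard uncertainty: {u_c:.1f} um")
print(f"expanded uncertainty (k=2):    {2 * u_c:.1f} um")
```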
A Brief Introduction to the Theory of Friction Stir Welding
NASA Technical Reports Server (NTRS)
Nunes, Arthur C., Jr.
2008-01-01
Friction stir welding (FSW) is a solid state welding process invented in 1991 at The Welding Institute in the United Kingdom. A weld is made in the FSW process by translating a rotating pin along a weld seam so as to stir the sides of the seam together. FSW avoids deleterious effects inherent in melting and is already an important welding process for the aerospace industry, where welds of optimal quality are demanded. The structure of welds determines weld properties. The structure of friction stir welds is determined by the flow field in the weld metal in the vicinity of the weld tool. A simple kinematic model of the FSW flow field developed at Marshall Space Flight Center, which enables the basic features of FSW microstructure to be understood and related to weld process parameters and tool design, is explained.
A software tool for determination of breast cancer treatment methods using data mining approach.
Cakır, Abdülkadir; Demirel, Burçin
2011-12-01
In this work, breast cancer treatment methods are determined using data mining. For this purpose, software was developed to help oncology doctors choose the treatment methods to apply to breast cancer patients. Data from 462 breast cancer patients, obtained from Ankara Oncology Hospital, were used to determine treatment methods for new patients. This dataset was processed with the Weka data mining tool. Classification algorithms were applied one by one to this dataset and the results were compared to find the proper treatment method. The developed software program, called "Treatment Assistant", uses different algorithms (IB1, Multilayer Perceptron and Decision Table) to find out which one gives the better result for each attribute to predict, using a Java NetBeans interface. Treatment methods are determined for the post-surgical treatment of breast cancer patients using this developed software tool. At the modeling step of the data mining process, different Weka algorithms were used for the output attributes: for the hormonotherapy output IB1, for the tamoxifen and radiotherapy outputs Multilayer Perceptron, and for the chemotherapy output the Decision Table algorithm showed the best accuracy performance. In conclusion, this work shows that the data mining approach can be a useful tool for medical applications, particularly at the treatment decision step. Data mining helps the doctor to decide in a short time.
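The study's comparison loop used Weka's IB1, Multilayer Perceptron, and Decision Table. A rough scikit-learn analogue on synthetic placeholder data is sketched below, purely to illustrate the per-algorithm accuracy comparison; scikit-learn has no Decision Table, so a decision tree stands in.

```python
# Hypothetical analogue of the per-output classifier comparison; the data
# are synthetic, not the 462-patient hospital dataset.

from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=462, n_features=12, random_state=0)

models = {
    "IB1-like (1-NN)": KNeighborsClassifier(n_neighbors=1),
    "Multilayer Perceptron": MLPClassifier(max_iter=2000, random_state=0),
    "Decision tree (stand-in)": DecisionTreeClassifier(random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=10).mean()
    print(f"{name}: mean CV accuracy {acc:.3f}")
```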
Requirements management for Gemini Observatory: a small organization with big development projects
NASA Astrophysics Data System (ADS)
Close, Madeline; Serio, Andrew; Cordova, Martin; Hardie, Kayla
2016-08-01
Gemini Observatory is an astronomical observatory operating two premier 8m-class telescopes, one in each hemisphere. As an operational facility, Gemini spends a majority of its resources on operations; however, the observatory undertakes major development projects as well. Current projects include new facility science instruments, an operational paradigm shift to full remote operations, and new operations tools for planning, configuration and change control. Three years ago, Gemini determined that a specialized requirements management tool was needed. Over the next year, the Gemini Systems Engineering Group investigated several tools, selected one for a trial period and configured it for use. Configuration activities included definition of systems engineering processes, development of a requirements framework, and assignment of project roles to tool roles. Test projects were implemented in the tool. At the conclusion of the trial, the group determined that Gemini could meet its requirements management needs without a specialized requirements management tool, and it identified a number of lessons learned, which are described in the last major section of this paper. These lessons learned include how to conduct an organizational needs analysis prior to pursuing a tool; caveats concerning tool criteria and the selection process; the prerequisites and sequence of activities necessary to achieve an optimum configuration of the tool; the need for adequate staff resources and staff training; and a special note regarding organizations in transition and archiving of requirements.
Study on electroplating technology of diamond tools for machining hard and brittle materials
NASA Astrophysics Data System (ADS)
Cui, Ying; Chen, Jian Hua; Sun, Li Peng; Wang, Yue
2016-10-01
With the development of high speed cutting, ultra-precision machining and ultrasonic vibration techniques for processing hard and brittle materials, the requirements on cutting tools are becoming higher and higher. As electroplated diamond tools have distinct advantages, such as high adaptability, high durability, long service life and good dimensional stability, these cutting tools are effectively and extensively used in grinding hard and brittle materials. In this paper, the coating structure of an electroplated diamond tool is described. The electroplating process flow is presented, and the influence of pretreatment on the machining quality is analyzed. Through experimental research, the reasonable formula of the electrolyte, the electroplating process parameters and a suitable sanding method were determined. Meanwhile, a drilling experiment on glass-ceramic shows that the electroplating process can effectively improve the cutting performance of diamond tools. This work lays a good foundation for further improving the quality and efficiency of the machining of hard and brittle materials.
Process development and tooling design for intrinsic hybrid composites
NASA Astrophysics Data System (ADS)
Riemer, M.; Müller, R.; Drossel, W. G.; Landgrebe, D.
2017-09-01
Hybrid parts, which combine the advantages of different material classes, are moving into the focus of lightweight applications. This development is amplified by their high potential for use in crash-relevant structures. In the current state of the art, hybrid parts are mainly made in separate, subsequent forming and joining processes. By using the concept of an intrinsic hybrid, the shaping of the part and the joining of the different materials are performed in a single process step, shortening the overall processing time and thereby the manufacturing costs. The investigated hybrid part is made from continuous fibre reinforced plastic (FRP), in which a metallic reinforcement structure is integrated. The connection between these layered components is realized by a combination of adhesive bonding and a geometrical form fit. The form fit elements are intrinsically generated during the forming process. This contribution addresses the development of the forming process and the design of the forming tool for the single-step production of a hybrid part. To this end a forming tool that combines the thermoforming and the metal forming process is developed. The main challenge in designing the tool is the temperature management of the tool elements for the variothermal forming process. The process parameters are determined in basic tests and finite element (FE) simulation studies. On the basis of these investigations, a control concept for steering the motion axes and the tool temperature is developed. Forming tests are carried out with the developed tool and the manufactured parts are analysed by computer-assisted tomography (CT) scans.
Electronic and software systems of an automated portable static mass spectrometer
NASA Astrophysics Data System (ADS)
Chichagov, Yu. V.; Bogdanov, A. A.; Lebedev, D. S.; Kogan, V. T.; Tubol'tsev, Yu. V.; Kozlenok, A. V.; Moroshkin, V. S.; Berezina, A. V.
2017-01-01
The electronic systems of a small high-sensitivity static mass spectrometer and software and hardware tools, which allow one to determine trace concentrations of gases and volatile compounds in air and water samples in real time, have been characterized. These systems and tools have been used to set up the device, control the process of measurement, synchronize this process with accompanying measurements, maintain reliable operation of the device, process the obtained results automatically, and visualize and store them. The developed software and hardware tools allow one to conduct continuous measurements for up to 100 h and provide an opportunity for personnel with no special training to perform maintenance on the device. The test results showed that mobile mass spectrometers for geophysical and medical research, which were fitted with these systems, had a determination limit for target compounds as low as several ppb(m) and a mass resolving power (depending on the current task) as high as 250.
PCC properties to support w/c determination for durability.
DOT National Transportation Integrated Search
2012-10-01
The fresh concrete water-cement ratio (w/c) determination tool is urgently needed for use in the QC/QA process at the job site. Various techniques have been used in the past to determine this parameter. However, many of these techniques can be co...
Developing and Validating a New Classroom Climate Observation Assessment Tool
Leff, Stephen S.; Thomas, Duane E.; Shapiro, Edward S.; Paskewich, Brooke; Wilson, Kim; Necowitz-Hoffman, Beth; Jawad, Abbas F.
2011-01-01
The climate of school classrooms, shaped by a combination of teacher practices and peer processes, is an important determinant for children’s psychosocial functioning and is a primary factor affecting bullying and victimization. Given that there are relatively few theoretically-grounded and validated assessment tools designed to measure the social climate of classrooms, our research team developed an observation tool through participatory action research (PAR). This article details how the assessment tool was designed and preliminarily validated in 18 third-, fourth-, and fifth-grade classrooms in a large urban public school district. The goals of this study are to illustrate the feasibility of a PAR paradigm in measurement development, ascertain the psychometric properties of the assessment tool, and determine associations with different indices of classroom levels of relational and physical aggression. PMID:21643447
The Tool Life of Ball Nose end Mill Depending on the Different Types of Ramping
NASA Astrophysics Data System (ADS)
Vopát, Tomáš; Peterka, Jozef; Kováč, Martin
2014-12-01
The article deals with the cutting tool wear measurement process and the tool life of a ball nose end mill depending on upward ramping and downward ramping. The aim was to determine and compare the wear (tool life) of a ball nose end mill for different types of copy milling operations, as well as to specify the particular steps of the measurement process. In addition, we examined and observed the cutter contact areas of the ball nose end mill with the machined material. For the tool life test, a DMG DMU 85 monoBLOCK 5-axis CNC milling machine was used. In the experiment, the cutting speed, feed rate, axial depth of cut and radial depth of cut were held constant. The cutting tool wear was measured on a Zoller Genius 3s universal measuring machine. The results show different tool life of ball nose end mills depending on the copy milling strategy.
Prioritizing Health: A Systematic Approach to Scoping Determinants in Health Impact Assessment.
McCallum, Lindsay C; Ollson, Christopher A; Stefanovic, Ingrid L
2016-01-01
The determinants of health are those factors that have the potential to affect health, either positively or negatively, and include a range of personal, social, economic, and environmental factors. In the practice of health impact assessment (HIA), the stage at which the determinants of health are considered for inclusion is the scoping step. The scoping step is intended to identify how the HIA will be carried out and to set the boundaries (e.g., temporal and geographical) for the assessment. There are several factors that can help to inform the scoping process, many of which are considered in existing HIA tools and guidance; however, a systematic method of prioritizing determinants was found to be lacking. In order to analyze existing HIA scoping tools, a systematic literature review was conducted, including both primary and gray literature. A total of 10 HIA scoping tools met the inclusion/exclusion criteria and were carried forward for comparative analysis. The analysis focused on minimum elements and practice standards of HIA scoping that have been established in the field. The analysis determined that existing approaches lack a clear, systematic method of prioritizing health determinants for inclusion in HIA. This finding led to the development of a Systematic HIA Scoping tool that addresses this gap. The decision matrix tool uses factors such as impact, public concern, and data availability to prioritize health determinants. Additionally, the tool allows for identification of data gaps and provides a transparent method for budget allocation and assessment planning. In order to increase efficiency and improve utility, the tool was programmed into Microsoft Excel. Future work in the area of HIA methodology development is vital to the ongoing success of the practice and the utilization of HIA as a reliable decision-making tool.
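The decision-matrix idea reduces to a weighted scoring of determinants. The sketch below is a minimal illustration; the factor weights, determinant list, and scores are invented, not the published tool's values.

```python
# Toy decision matrix: rank health determinants by weighted score.
# All weights and scores are illustrative placeholders.

weights = {"impact": 0.5, "public_concern": 0.3, "data_availability": 0.2}

determinants = {
    "air quality": {"impact": 5, "public_concern": 4, "data_availability": 4},
    "employment":  {"impact": 4, "public_concern": 3, "data_availability": 3},
    "noise":       {"impact": 3, "public_concern": 5, "data_availability": 2},
}

scores = {d: sum(weights[f] * v for f, v in s.items())
          for d, s in determinants.items()}
for d, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{d}: {score:.1f}")
```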
NASA Technical Reports Server (NTRS)
Schneider, Judy; Nunes, Arthur C., Jr.; Brendel, Michael S.
2010-01-01
Although friction stir welding (FSW) was patented in 1991, process development has been based upon trial and error and the literature still exhibits little understanding of the mechanisms determining weld structure and properties. New concepts emerging from a better understanding of these mechanisms enhance the ability of FSW engineers to think about the FSW process in new ways, inevitably leading to advances in the technology. A kinematic approach in which the FSW flow process is decomposed into several simple flow components has been found to explain the basic structural features of FSW welds and to relate them to tool geometry and process parameters. Using this modelling approach, this study reports on a correlation between the features of the weld nugget, process parameters, weld tool geometry, and weld strength. This correlation presents a way to select process parameters for a given tool geometry so as to optimize weld strength. It also provides clues that may ultimately explain why the weld strength varies within the sample population.
AFM surface imaging of AISI D2 tool steel machined by the EDM process
NASA Astrophysics Data System (ADS)
Guu, Y. H.
2005-04-01
The surface morphology, surface roughness and micro-cracks of AISI D2 tool steel machined by the electrical discharge machining (EDM) process were analyzed by means of the atomic force microscopy (AFM) technique. Experimental results indicate that the surface texture after EDM is determined by the discharge energy during processing. An excellent machined finish can be obtained by setting the machine parameters at a low pulse energy. The surface roughness and the depth of the micro-cracks were proportional to the power input. Furthermore, the AFM application yielded information about the depth of the micro-cracks, which is particularly important for the post-treatment of AISI D2 tool steel machined by EDM.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ragan, Eric D; Goodall, John R
2014-01-01
Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.
ERIC Educational Resources Information Center
Mattis, Ted B.
2011-01-01
The purpose of this study was to determine whether community college administrators in the state of Michigan believe that commonly known quality and continuous improvement tools, prevalent in a manufacturing environment, can be adapted to a community college model. The tools, specifically Six Sigma, benchmarking, and process mapping, have played a…
Design tool for multiprocessor scheduling and evaluation of iterative dataflow algorithms
NASA Technical Reports Server (NTRS)
Jones, Robert L., III
1995-01-01
A graph-theoretic design process and software tool is defined for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described with a dataflow graph and are intended to be executed repetitively on a set of identical processors. Typical applications include signal processing and control law problems. Graph-search algorithms and analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool applies the design process to a given problem and includes performance optimization through the inclusion of additional precedence constraints among the schedulable tasks.
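One of the bounds such graph analysis yields is the critical-path length of the dataflow graph, which lower-bounds the achievable iteration period regardless of processor count. The sketch below computes it for a small assumed graph; the tasks, times, and edges are invented examples.

```python
from functools import lru_cache

# Critical-path (longest-path) bound for a small dataflow graph.
# Task times and precedence edges are hypothetical placeholders.

task_time = {"A": 2, "B": 3, "C": 4, "D": 1}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}  # D waits on B, C

@lru_cache(maxsize=None)
def finish(task):
    """Earliest finish time of a task given its predecessors."""
    return task_time[task] + max((finish(p) for p in preds[task]), default=0)

critical_path = max(finish(t) for t in task_time)
print(f"critical-path bound: {critical_path} time units")  # 2 + 4 + 1 = 7
```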
Engineering Property Prediction Tools for Tailored Polymer Composite Structures (49465)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Kunc, Vlastimil
2009-12-29
Process and constitutive models as well as characterization tools and testing methods were developed to determine stress-strain responses, damage development, strengths and creep of long-fiber thermoplastics (LFTs). The developed models were implemented in Moldflow and ABAQUS and have been validated against LFT data obtained experimentally.
Han, Tae Hee; Kim, Moon Jung; Kim, Shinyoung; Kim, Hyun Ok; Lee, Mi Ae; Choi, Ji Seon; Hur, Mina; St John, Andrew
2013-05-01
Failure modes and effects analysis (FMEA) is a risk management tool used by the manufacturing industry that is now being applied in laboratories. Teams from six South Korean blood banks used this tool to map their manual and automated blood grouping processes and to determine the risk priority numbers (RPNs) as a total measure of error risk. The RPNs determined by each of the teams consistently showed that the use of automation dramatically reduced the RPN compared to manual processes. In addition, FMEA showed where the major risks occur in each of the manual processes and where attention should be prioritized to improve the process. Despite no previous experience with FMEA, the teams found the technique relatively easy to use, and the subjectivity associated with assigning risk numbers did not affect the validity of the data. FMEA should become a routine technique for improving processes in laboratories.
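The core FMEA arithmetic is the risk priority number, RPN = severity x occurrence x detection, scored per process step and summed. The steps and ratings below are invented placeholders, not the blood banks' data.

```python
# RPN = severity * occurrence * detection, per step, then summed.
# All process steps and ratings are hypothetical.

steps_manual = {
    "tube labeling":        (9, 4, 5),   # (severity, occurrence, detection)
    "cell/serum grouping":  (10, 3, 6),
    "result transcription": (8, 5, 7),
}

rpn = {step: s * o * d for step, (s, o, d) in steps_manual.items()}
print(rpn, "total RPN:", sum(rpn.values()))
# Automated steps would be rescored (typically lower occurrence and better
# detection) and the totals compared, as the teams did.
```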
The Methodology of Calculation of Cutting Forces When Machining Composite Materials
NASA Astrophysics Data System (ADS)
Rychkov, D. A.; Yanyushkin, A. S.
2016-08-01
Cutting of composite materials has specific features and differs from the processing of metals; it is characterized by intense wear of the cutting tool. An important criterion in selecting process parameters for composite machining is the magnitude of the cutting forces, which depends on many factors and is usually determined experimentally, which is not always practical. In this study, a method of determining the cutting forces when machining composite materials was developed, together with a comparative evaluation of the calculated and actual values of the cutting forces. The methodology for calculating cutting forces takes into account the specific features of the cutting tool and its extent of wear, the strength properties of the processed material, and the cutting conditions. Experimental studies were conducted by milling fiberglass with a cutter equipped with VK3M hard-metal inserts. The discrepancy between the calculated and actual values of the cutting force is not more than 10%.
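A common starting point that such methodologies refine is the specific-cutting-force estimate Fc = kc * ap * f, optionally corrected for tool wear. The coefficient, wear factor, and cutting data below are placeholders, not the paper's model, which additionally accounts for tool features and material strength.

```python
# Generic specific-cutting-force sketch: Fc = kc * ap * f, with a wear
# correction factor. All values are hypothetical placeholders.

def cutting_force_N(kc_N_mm2, depth_mm, feed_mm_rev, wear_factor=1.0):
    """Main cutting force from specific force and undeformed chip area."""
    return kc_N_mm2 * depth_mm * feed_mm_rev * wear_factor

F_sharp = cutting_force_N(kc_N_mm2=800, depth_mm=1.5, feed_mm_rev=0.1)
F_worn = cutting_force_N(kc_N_mm2=800, depth_mm=1.5, feed_mm_rev=0.1,
                         wear_factor=1.3)  # worn edge raises force ~30%
print(f"sharp: {F_sharp:.0f} N, worn: {F_worn:.0f} N")
```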
Numerical studies of the polymer melt flow in the extruder screw channel and the forming tool
NASA Astrophysics Data System (ADS)
Ershov, S. V.; Trufanova, N. M.
2017-06-01
To date, polymer compositions based on polyethylene or PVC are widely used as insulating materials. Processing these materials involves a number of problems when selecting rational extrusion regimes. To minimize the time and cost of determining the technological regime, mathematical modeling techniques are used. The paper discusses heat and mass transfer processes in the extruder screw channel, output adapter, and cable head. During the study, coefficients for three rheological models were determined from experimental viscosity vs. shear rate data. A comparative analysis of the applicability of these viscometric laws for studying polymer melt flow during processing on extrusion equipment was also carried out. As a result of the numerical study, the temperature, viscosity, and shear rate fields in the extruder screw channel and forming tool were obtained.
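The three rheological models fitted in the study are not named in the abstract; a minimal sketch, assuming a power-law (Ostwald-de Waele) and a Carreau model as two illustrative candidates, of determining model coefficients from viscosity vs. shear rate data:

import numpy as np
from scipy.optimize import curve_fit

# Hypothetical measurements: shear rate (1/s) vs. viscosity (Pa*s)
gamma = np.array([0.1, 1.0, 10.0, 100.0, 1000.0])
eta = np.array([950.0, 800.0, 420.0, 150.0, 45.0])

def power_law(g, K, n):
    # Ostwald-de Waele: eta = K * gamma^(n - 1)
    return K * g**(n - 1.0)

def carreau(g, eta0, lam, n):
    # Carreau: eta = eta0 * (1 + (lam*gamma)^2)^((n - 1)/2)
    return eta0 * (1.0 + (lam * g)**2) ** ((n - 1.0) / 2.0)

(K, n_pl), _ = curve_fit(power_law, gamma, eta, p0=[800.0, 0.5])
(eta0, lam, n_c), _ = curve_fit(carreau, gamma, eta, p0=[950.0, 1.0, 0.5])
print(f"power law: K={K:.1f}, n={n_pl:.3f}")
print(f"Carreau:   eta0={eta0:.1f}, lambda={lam:.3f}, n={n_c:.3f}")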
Gold, Rachel; Cottrell, Erika; Bunce, Arwen; Middendorf, Mary; Hollombe, Celine; Cowburn, Stuart; Mahr, Peter; Melgar, Gerardo
2017-01-01
"Social determinants of heath" (SDHs) are nonclinical factors that profoundly affect health. Helping community health centers (CHCs) document patients' SDH data in electronic health records (EHRs) could yield substantial health benefits, but little has been reported about CHCs' development of EHR-based tools for SDH data collection and presentation. We worked with 27 diverse CHC stakeholders to develop strategies for optimizing SDH data collection and presentation in their EHR, and approaches for integrating SDH data collection and the use of those data (eg, through referrals to community resources) into CHC workflows. We iteratively developed a set of EHR-based SDH data collection, summary, and referral tools for CHCs. We describe considerations that arose while developing the tools and present some preliminary lessons learned. Standardizing SDH data collection and presentation in EHRs could lead to improved patient and population health outcomes in CHCs and other care settings. We know of no previous reports of processes used to develop similar tools. This article provides an example of 1 such process. Lessons from our process may be useful to health care organizations interested in using EHRs to collect and act on SDH data. Research is needed to empirically test the generalizability of these lessons. © Copyright 2017 by the American Board of Family Medicine.
RFI and SCRIMP Model Development and Verification
NASA Technical Reports Server (NTRS)
Loos, Alfred C.; Sayre, Jay
2000-01-01
Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies in the manufacturing of primary composite structures in the aircraft industry as well as in infrastructure. A great deal of work still needs to be done to reduce the costly trial-and-error methods of VARTM processing that are currently in practice today. A computer simulation model of the VARTM process would provide a cost-effective tool in the manufacturing of composites utilizing this technique. Therefore, the objective of this research was to modify an existing three-dimensional Resin Film Infusion (RFI)/Resin Transfer Molding (RTM) model to include VARTM simulation capabilities and to verify this model with the fabrication of aircraft structural composites. An additional objective was to use the VARTM model as a process analysis tool that would enable the user to configure the best process for manufacturing quality composites. Experimental verification of the model was performed by processing several flat composite panels. The parameters verified included flow front patterns and infiltration times. The flow front patterns were determined to be qualitatively accurate, while the simulated infiltration times overpredicted experimental times by 8 to 10%. Capillary and gravitational forces were incorporated into the existing RFI/RTM model in order to simulate VARTM processing physics more accurately. The theoretical capillary pressure showed the capability to reduce the simulated infiltration times by as much as 6%. Gravity, on the other hand, was found to be negligible for all cases. Finally, the VARTM model was used as a process analysis tool, enabling the user to determine such important process constraints as the location and type of injection ports and the permeability and location of the high-permeability media. A process for a three-stiffener composite panel was proposed. This configuration evolved from the variation of the process constraints in the modeling of several different composite panels, considering such factors as infiltration time, the number of vacuum ports, and possible areas of void entrapment.
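A minimal sketch, assuming one-dimensional Darcy flow through the preform (a standard simplification, not the study's full three-dimensional model), of how adding capillary pressure to the driving pressure shortens the computed infiltration time:

# 1-D Darcy infiltration: t_fill = (phi * mu * L^2) / (2 * K * dP),
# where dP includes the applied vacuum and (optionally) capillary pressure.
phi = 0.5          # preform porosity (-)           (assumed)
mu = 0.2           # resin viscosity (Pa*s)         (assumed)
K = 1e-10          # preform permeability (m^2)     (assumed)
L = 0.5            # flow length (m)                (assumed)
dP_vac = 1.0e5     # vacuum-driven pressure (Pa)
P_cap = 6.0e3      # capillary pressure (Pa)        (assumed)

t_no_cap = phi * mu * L**2 / (2 * K * dP_vac)
t_cap = phi * mu * L**2 / (2 * K * (dP_vac + P_cap))
print(f"infiltration time reduction: {100 * (1 - t_cap / t_no_cap):.1f} %")

With these assumed values the capillary term shortens the fill time by about 6%, the same order as the effect reported in the abstract.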
Abery, Philip; Kuys, Suzanne; Lynch, Mary; Low Choy, Nancy
2018-05-23
To design and establish the reliability of a local stroke audit tool by engaging allied health clinicians within a privately funded hospital. Design: two-stage study involving a modified Delphi process to inform stroke audit tool development, followed by an inter-tester reliability study. Participants: allied health clinicians. Methods: a modified Delphi process to select stroke guideline recommendations for inclusion in the audit tool; in the reliability study, one allied health representative from each discipline audited 10 clinical records with sequential admissions to acute and rehabilitation services. Recommendations were admitted to the audit tool when 70% agreement was reached, with 50% set as the reserve agreement. Inter-tester reliability was determined using intra-class correlation coefficients (ICCs) across the 10 clinical records. Twenty-two participants (92% female, 50% physiotherapists, 17% occupational therapists) completed the modified Delphi process. Across 6 voting rounds, 8 recommendations reached 70% agreement and 2 reached 50% agreement. Two recommendations (nutrition/hydration; goal setting) were added to ensure representation for all disciplines. Substantial consistency across raters was established for the audit tool applied in acute stroke (ICC = 0.71; range 0.48 to 0.90) and rehabilitation (ICC = 0.78; range 0.60 to 0.93) services. Allied health clinicians within a privately funded hospital generally agreed in an audit process to develop a reliable stroke audit tool. Allied health clinicians agreed on stroke guideline recommendations to inform a stroke audit tool. The stroke audit tool demonstrated substantial consistency, supporting future use for service development. This process, which engages local clinicians, could be adopted by other facilities to design reliable audit tools to identify local service gaps and inform changes to clinical practice. © 2018 John Wiley & Sons, Ltd.
Klump, Barbara C; Sugasawa, Shoko; St Clair, James J H; Rutz, Christian
2015-11-18
New Caledonian crows use a range of foraging tools, and are the only non-human species known to craft hooks. Based on a small number of observations, their manufacture of hooked stick tools has previously been described as a complex, multi-stage process. Tool behaviour is shaped by genetic predispositions, individual and social learning, and/or ecological influences, but disentangling the relative contributions of these factors remains a major research challenge. The properties of raw materials are an obvious, but largely overlooked, source of variation in tool-manufacture behaviour. We conducted experiments with wild-caught New Caledonian crows, to assess variation in their hooked stick tool making, and to investigate how raw-material properties affect the manufacture process. In Experiment 1, we showed that New Caledonian crows' manufacture of hooked stick tools can be much more variable than previously thought (85 tools by 18 subjects), and can involve two newly-discovered behaviours: 'pulling' for detaching stems and bending of the tool shaft. Crows' tool manufactures varied significantly: in the number of different action types employed; in the time spent processing the hook and bending the tool shaft; and in the structure of processing sequences. In Experiment 2, we examined the interaction of crows with raw materials of different properties, using a novel paradigm that enabled us to determine subjects' rank-ordered preferences (42 tools by 7 subjects). Plant properties influenced: the order in which crows selected stems; whether a hooked tool was manufactured; the time required to release a basic tool; and, possibly, the release technique, the number of behavioural actions, and aspects of processing behaviour. Results from Experiment 2 suggested that at least part of the natural behavioural variation observed in Experiment 1 is due to the effect of raw-material properties. Our discovery of novel manufacture behaviours indicates a plausible scenario for the evolutionary origins, and gradual refinement, of New Caledonian crows' hooked stick tool making. Furthermore, our experimental demonstration of a link between raw-material properties and aspects of tool manufacture provides an alternative hypothesis for explaining regional differences in tool behaviours observed in New Caledonian crows, and some primate species.
NASA Astrophysics Data System (ADS)
Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.
2016-11-01
The modelling of a curl (family) of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measurement as a coordinate matrix, aims to determine the generation quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the geometrical generation error as a component of the total error. The generation modelling allows highlighting potential errors of the generating tool, so that its profile can be corrected before the tool is used in the machining process. A method developed in CATIA is proposed, based on a new approach, namely the method of "relative generating trajectories". The analytical foundations are presented, as well as applications for known models of rack-gear type tools used on Maag gear-cutting machines.
Model-based high-throughput design of ion exchange protein chromatography.
Khalaf, Rushd; Heymann, Julia; LeSaout, Xavier; Monard, Florence; Costioli, Matteo; Morbidelli, Massimo
2016-08-12
This work describes the development of a model-based high-throughput design (MHD) tool for the operating space determination of a chromatographic cation-exchange protein purification process. Based on a previously developed thermodynamic mechanistic model, the MHD tool generates a large amount of system knowledge and thereby permits minimizing the required experimental workload. In particular, each new experiment is designed to generate information needed to help refine and improve the model. Unnecessary experiments that do not increase system knowledge are avoided. Instead of aspiring to a perfectly parameterized model, the goal of this design tool is to use early model parameter estimates to find interesting experimental spaces, and to refine the model parameter estimates with each new experiment until a satisfactory set of process parameters is found. The MHD tool is split into four sections: (1) prediction, high throughput experimentation using experiments in (2) diluted conditions and (3) robotic automated liquid handling workstations (robotic workstation), and (4) operating space determination and validation. (1) Protein and resin information, in conjunction with the thermodynamic model, is used to predict protein resin capacity. (2) The predicted model parameters are refined based on gradient experiments in diluted conditions. (3) Experiments on the robotic workstation are used to further refine the model parameters. (4) The refined model is used to determine operating parameter space that allows for satisfactory purification of the protein of interest on the HPLC scale. Each section of the MHD tool is used to define the adequate experimental procedures for the next section, thus avoiding any unnecessary experimental work. We used the MHD tool to design a polishing step for two proteins, a monoclonal antibody and a fusion protein, on two chromatographic resins, in order to demonstrate it has the ability to strongly accelerate the early phases of process development. Copyright © 2016 Elsevier B.V. All rights reserved.
Abstract: This case study application provides discussion on a selected application of advanced concepts, included in the End of Asset Life Reinvestment decision-making process tool, using Milwaukee Metropolitan Sewer District (MMSD) pump and motor data sets. The tool provides s...
Integrated landscape/hydrologic modeling tool for semiarid watersheds
Mariano Hernandez; Scott N. Miller
2000-01-01
An integrated hydrologic modeling/watershed assessment tool is being developed to aid in determining the susceptibility of semiarid landscapes to natural and human-induced changes across a range of scales. Watershed processes are by definition spatially distributed and are highly variable through time, and this approach is designed to account for their spatial and...
Human Behavior Based Exploratory Model for Successful Implementation of Lean Enterprise in Industry
ERIC Educational Resources Information Center
Sawhney, Rupy; Chason, Stewart
2005-01-01
Currently available Lean tools such as Lean Assessments, Value Stream Mapping, and Process Flow Charting focus on system requirements and overlook human behavior. A need is felt for a tool that allows one to baseline personnel, determine personnel requirements and align system requirements with personnel requirements. Our exploratory model--The…
cryoem-cloud-tools: A software platform to deploy and manage cryo-EM jobs in the cloud.
Cianfrocco, Michael A; Lahiri, Indrajit; DiMaio, Frank; Leschziner, Andres E
2018-06-01
Access to streamlined computational resources remains a significant bottleneck for new users of cryo-electron microscopy (cryo-EM). To address this, we have developed tools that will submit cryo-EM analysis routines and atomic model building jobs directly to Amazon Web Services (AWS) from a local computer or laptop. These new software tools ("cryoem-cloud-tools") have incorporated optimal data movement, security, and cost-saving strategies, giving novice users access to complex cryo-EM data processing pipelines. Integrating these tools into the RELION processing pipeline and graphical user interface, we determined a 2.2 Å structure of β-galactosidase in ∼55 hours on AWS. We implemented a similar strategy to submit Rosetta atomic model building and refinement to AWS. These software tools dramatically reduce the barrier for entry of new users to cloud computing for cryo-EM and are freely available at cryoem-tools.cloud. Copyright © 2018. Published by Elsevier Inc.
Using the Leitz LMS 2000 for monitoring and improvement of an e-beam
NASA Astrophysics Data System (ADS)
Blaesing-Bangert, Carola; Roeth, Klaus-Dieter; Ogawa, Yoichi
1994-11-01
Kaizen, or continuous improvement, is a philosophy lived in Japan which is also becoming more and more important in Western companies. To implement this philosophy in the semiconductor industry, a high-performance metrology tool is essential to determine the status of production quality periodically. An important prerequisite for statistical process control is the high stability of the metrology tool over several months or years; the tool-induced shift should be as small as possible. The pattern placement metrology tool Leitz LMS 2000 has been used in a major European mask house for several years now to qualify masks within the tightest specifications and to monitor the MEBES III and its cassettes. The mask shop's internal specification for the long-term repeatability of the pattern placement metrology tool is 19 nm instead of the 42 nm specified by the supplier of the tool. The process capability of the LMS 2000 over 18 months is represented by an average cpk value of 2.8 for orthogonality, 5.2 for x-scaling, and 3.0 for y-scaling. The process capability of the MEBES III and its cassettes has been improved in recent years; for instance, 100% of the masks produced with a process tolerance of +/- 200 nm are now within this limit.
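The cpk figures quoted are standard process capability indices; a minimal sketch, assuming normally distributed placement errors judged against the ±200 nm tolerance (data simulated, not the mask shop's):

import numpy as np

# Process capability index: Cpk = min(USL - mean, mean - LSL) / (3 * sigma).
rng = np.random.default_rng(0)
placement_error = rng.normal(5.0, 20.0, 500)   # simulated errors (nm)

LSL, USL = -200.0, 200.0                       # +/- 200 nm process tolerance
mu, sigma = placement_error.mean(), placement_error.std(ddof=1)
cpk = min(USL - mu, mu - LSL) / (3.0 * sigma)
print(f"Cpk = {cpk:.2f}")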
Proposed algorithm to improve job shop production scheduling using ant colony optimization method
NASA Astrophysics Data System (ADS)
Pakpahan, Eka KA; Kristina, Sonna; Setiawan, Ari
2017-12-01
This paper deals with the determination of job shop production schedules in an automated environment. In this particular environment, machines and the material handling system are integrated and controlled by a computer center, where schedules are created and then used to dictate the movement of parts and the operations at each machine. This setting is usually designed for an unmanned production process over a specified time interval. We consider parts with various operation requirements, each operation requiring specific cutting tools. These parts are to be scheduled on machines of identical capability, meaning that each machine is equipped with a similar set of cutting tools and is therefore capable of processing any operation. The availability of a particular machine to process a particular operation is determined by the remaining life of its cutting tools. We propose an algorithm based on the ant colony optimization method, implemented in MATLAB, to generate production schedules which minimize the total processing time of the parts (makespan). We tested the algorithm on data provided by a real industry, and the process shows a very short computation time. This contributes greatly to the flexibility and timeliness targeted in an automated environment.
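The paper's algorithm is not reproduced in the abstract; a minimal sketch, assuming a simplified setting (identical parallel machines, cutting-tool life constraints omitted) and written in Python rather than the authors' MATLAB, of an ant colony scheme that builds job sequences from pheromone trails and reinforces the sequence with the best makespan:

import random

proc_time = [4, 7, 3, 9, 5, 6]        # processing time per job (assumed data)
n, machines, ants, iters, rho = len(proc_time), 2, 20, 50, 0.1
tau = [[1.0] * n for _ in range(n)]   # pheromone[position][job]

def makespan(seq):
    loads = [0] * machines
    for job in seq:                   # greedy: earliest-available machine
        loads[loads.index(min(loads))] += proc_time[job]
    return max(loads)

best_seq, best_val = None, float("inf")
for _ in range(iters):
    for _ in range(ants):
        remaining, seq = set(range(n)), []
        for pos in range(n):
            jobs = list(remaining)
            weights = [tau[pos][j] for j in jobs]
            job = random.choices(jobs, weights=weights)[0]
            seq.append(job)
            remaining.remove(job)
        val = makespan(seq)
        if val < best_val:
            best_seq, best_val = seq, val
    # evaporate, then reinforce the best-so-far sequence
    tau = [[(1 - rho) * t for t in row] for row in tau]
    for pos, job in enumerate(best_seq):
        tau[pos][job] += 1.0 / best_val
print(best_seq, best_val)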
Bridging the gap between finance and clinical operations with activity-based cost management.
Storfjell, J L; Jessup, S
1996-12-01
Activity-based cost management (ABCM) is an exciting management tool that links financial information with operations. By determining the costs of specific activities and processes, nurse managers can determine the true costs of services more accurately than with traditional cost accounting methods, and can then target processes for improvement and monitor them for change and improvement. The authors describe the ABCM process applied to nursing management situations.
Evaluation of Friction Stir Processing of HY-80 Steel Under Wet and Dry Conditions
2012-03-01
The tool design included a convex scroll shoulder with a step-spiral protruding pin (CS4), and a PCBN FSW/P threaded tool was used. Cooling water was pumped through during the FSW/P process; sea salt was added to distilled water to create a 3.5% salt content. Vacuum hot extraction was used to determine the hydrogen concentration as specified by ASTM E 146-83, in addition to combustion infrared detection.
NASA Technical Reports Server (NTRS)
Tahmasebi, Farhad; Pearce, Robert
2016-01-01
A description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. For efficiency and speed, the tool takes advantage of a function developed in Excel's Visual Basic for Applications. The strategic planning process for determining the community Outcomes is also briefly discussed. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples of using the tool are also presented.
In situ monitoring of cocrystals in formulation development using low-frequency Raman spectroscopy.
Otaki, Takashi; Tanabe, Yuta; Kojima, Takashi; Miura, Masaru; Ikeda, Yukihiro; Koide, Tatsuo; Fukami, Toshiro
2018-05-05
In recent years, to guarantee a quality-by-design approach to the development of pharmaceutical products, it is important to identify properties of raw materials and excipients in order to determine critical process parameters and critical quality attributes. Feedback obtained from real-time analyses using various process analytical technology (PAT) tools has been actively investigated. In this study, in situ monitoring using low-frequency (LF) Raman spectroscopy (10-200 cm⁻¹), which may have higher discriminative ability among polymorphs than near-infrared spectroscopy and conventional Raman spectroscopy (200-1800 cm⁻¹), was investigated as a possible application to PAT. This is because LF-Raman spectroscopy obtains information about intermolecular and/or lattice vibrations in the solid state. The monitoring results obtained from Furosemide/Nicotinamide cocrystal indicate that LF-Raman spectroscopy is applicable to in situ monitoring of suspension and fluidized bed granulation processes, and is an effective technique as a PAT tool to detect the conversion risk of cocrystals. LF-Raman spectroscopy is also used as a PAT tool to monitor reactions, crystallizations, and manufacturing processes of drug substances and products. In addition, a sequence of conversion behaviors of Furosemide/Nicotinamide cocrystals was determined by performing in situ monitoring for the first time. Copyright © 2018 Elsevier B.V. All rights reserved.
What puts the how in where? Tool use and the divided visual streams hypothesis.
Frey, Scott H
2007-04-01
An influential theory suggests that the dorsal (occipito-parietal) visual stream computes representations of objects for purposes of guiding actions (determining 'how') independently of ventral (occipito-temporal) stream processes supporting object recognition and semantic processing (determining 'what'). Yet, the ability of the dorsal stream alone to account for one of the most common forms of human action, tool use, is limited. While experience-dependent modifications to existing dorsal stream representations may explain simple tool use behaviors (e.g., using sticks to extend reach) found among a variety of species, skillful use of manipulable artifacts (e.g., cups, hammers, pencils) requires in addition access to semantic representations of objects' functions and uses. Functional neuroimaging suggests that this latter information is represented in a left-lateralized network of temporal, frontal and parietal areas. I submit that the well-established dominance of the human left hemisphere in the representation of familiar skills stems from the ability for this acquired knowledge to influence the organization of actions within the dorsal pathway.
Managing complex research datasets using electronic tools: A meta-analysis exemplar
Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.
2013-01-01
Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256
Evaluation of interaction dynamics of concurrent processes
NASA Astrophysics Data System (ADS)
Sobecki, Piotr; Białasiewicz, Jan T.; Gross, Nicholas
2017-03-01
The purpose of this paper is to present wavelet tools that enable the detection of temporal interactions of concurrent processes. In particular, the determination of interaction coherence of time-varying signals is achieved using a complex continuous wavelet transform. This paper uses an electrocardiogram (ECG) and seismocardiogram (SCG) data set to demonstrate multiple continuous wavelet analysis techniques based on the Morlet wavelet transform. A MATLAB Graphical User Interface (GUI), developed in the reported research to assist in quick and simple data analysis, is presented. These software tools can discover the interaction dynamics of time-varying signals; hence they can reveal their correlation in phase and amplitude, as well as their non-linear interconnections. The user-friendly MATLAB GUI makes the developed software easy to use: the user loads the two processes under investigation, chooses the required processing parameters, and then performs the analysis. The software developed is a useful tool for researchers who need to investigate the interaction dynamics of concurrent processes.
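The reported MATLAB tools are not available from the abstract; a minimal Python sketch, assuming a complex Morlet wavelet and invented test signals, of the cross-wavelet quantity that exposes the shared amplitude and relative phase of two concurrent signals:

import numpy as np
import pywt

fs = 250.0                             # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 5.0 * t)                       # test signal 1
y = np.sin(2 * np.pi * 5.0 * t + 0.8) + 0.3 * rng.standard_normal(t.size)

scales = np.arange(2, 128)
wx, freqs = pywt.cwt(x, scales, "cmor1.5-1.0", sampling_period=1 / fs)
wy, _ = pywt.cwt(y, scales, "cmor1.5-1.0", sampling_period=1 / fs)

cross = wx * np.conj(wy)               # cross-wavelet transform
amplitude = np.abs(cross)              # co-varying power
phase_lag = np.angle(cross)            # relative phase at each time/scale
print(freqs[amplitude.mean(axis=1).argmax()])  # dominant shared frequency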
Development of a Scale-up Tool for Pervaporation Processes
Thiess, Holger; Strube, Jochen
2018-01-01
In this study, an engineering tool for the design and optimization of pervaporation processes is developed based on physico-chemical modelling coupled with laboratory/mini-plant experiments. The model incorporates the solution-diffusion mechanism, polarization effects (concentration and temperature), axial dispersion, pressure drop and the temperature drop in the feed channel due to vaporization of the permeating components. The permeance, being the key model parameter, was determined via dehydration experiments on a mini-plant scale for the binary mixtures ethanol/water and ethyl acetate/water. A second set of experimental data was utilized for the validation of the model for the two chemical systems. The industrially relevant ternary mixture, ethanol/ethyl acetate/water, was investigated close to its azeotropic point and compared to a simulation conducted with the determined binary permeance data. Experimental and simulation data proved to agree very well for the investigated process conditions. In order to test the scalability of the developed engineering tool, large-scale data from an industrial pervaporation plant used for the dehydration of ethanol was compared to a process simulation conducted with the validated physico-chemical model. Since the membranes employed at both mini-plant and industrial scale were of the same type, the permeance data could be transferred. The comparison of the measured and simulated data proved the scalability of the derived model. PMID:29342956
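A minimal sketch of the solution-diffusion flux relation such a model is built on (the permeance and conditions below are invented, not the study's fitted values):

# Solution-diffusion flux across a pervaporation membrane:
#   J_i = Q_i * (x_i * gamma_i * p_sat_i - y_i * p_perm)
# All numbers below are illustrative, not the study's fitted permeances.
Q_water = 2.0e-6      # permeance (mol / (m^2 s Pa))
x_water = 0.12        # liquid mole fraction of water in the feed
gamma_water = 2.3     # activity coefficient (e.g. from an NRTL model)
p_sat_water = 19.9e3  # saturation pressure at feed temperature (Pa)
y_water = 0.95        # permeate mole fraction
p_perm = 2.0e3        # permeate-side pressure (Pa)

J_water = Q_water * (x_water * gamma_water * p_sat_water - y_water * p_perm)
print(f"water flux = {J_water:.3e} mol/(m^2 s)")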
NASA Astrophysics Data System (ADS)
Vincent, Timothy J.; Rumpfkeil, Markus P.; Chaudhary, Anil
2018-03-01
The complex, multi-faceted physics of laser-based additive metals processing tends to demand high-fidelity models and costly simulation tools to provide predictions accurate enough to aid in selecting process parameters. Of particular difficulty is the accurate determination of melt pool shape and size, which are useful for predicting lack-of-fusion, as this typically requires an adequate treatment of thermal and fluid flow. In this article we describe a novel numerical simulation tool which aims to achieve a balance between accuracy and cost. This is accomplished by making simplifying assumptions regarding the behavior of the gas-liquid interface for processes with a moderate energy density, such as Laser Engineered Net Shaping (LENS). The details of the implementation, which is based on the solver simpleFoam of the well-known software suite OpenFOAM, are given here and the tool is verified and validated for a LENS process involving Ti-6Al-4V. The results indicate that the new tool predicts width and height of a deposited track to engineering accuracy levels.
A review of cultural adaptations of screening tools for autism spectrum disorders.
Soto, Sandra; Linas, Keri; Jacobstein, Diane; Biel, Matthew; Migdal, Talia; Anthony, Bruno J
2015-08-01
Screening children to determine risk for Autism Spectrum Disorders has become more common, although some question the advisability of such a strategy. The purpose of this systematic review is to identify autism screening tools that have been adapted for use in cultures different from that in which they were developed, evaluate the cultural adaptation process, report on the psychometric properties of the adapted instruments, and describe the implications for further research and clinical practice. A total of 21 articles met criteria for inclusion, reporting on the cultural adaptation of autism screening in 19 countries and in 10 languages. The cultural adaptation process was not always clearly outlined and often did not include the recommended guidelines. Cultural/linguistic modifications to the translated tools tended to increase with the rigor of the adaptation process. Differences between the psychometric properties of the original and adapted versions were common, indicating the need to obtain normative data on populations to increase the utility of the translated tool. © The Author(s) 2014.
Watershed forest management using decision support technology
Mark Twery; Robert Northrop
2004-01-01
Using innovative partnerships and a variety of decision support tools, we identified the needs and goals of Baltimore, Maryland, for their reservoir properties containing over 17000 forested acres; developed a management plan; determined the information necessary to evaluate conditions, processes, and context; chose tools to use; collected, organized, and analyzed data...
Lateral position detection and control for friction stir systems
Fleming, Paul; Lammlein, David H.; Cook, George E.; Wilkes, Don Mitchell; Strauss, Alvin M.; Delapp, David R.; Hartman, Daniel A.
2012-06-05
An apparatus and computer program are disclosed for processing at least one workpiece using a rotary tool with rotating member for contacting and processing the workpiece. The methods include oscillating the rotary tool laterally with respect to a selected propagation path for the rotating member with respect to the workpiece to define an oscillation path for the rotating member. The methods further include obtaining force signals or parameters related to the force experienced by the rotary tool at least while the rotating member is disposed at the extremes of the oscillation. The force signals or parameters associated with the extremes can then be analyzed to determine a lateral position of the selected path with respect to a target path and a lateral offset value can be determined based on the lateral position. The lateral distance between the selected path and the target path can be decreased based on the lateral offset value.
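The patent does not specify a particular estimator; a minimal sketch, assuming the simplest possible mapping from the force imbalance at the two oscillation extremes to a lateral offset through a hypothetical calibration gain:

import numpy as np

# Force samples recorded while the rotating member sits at each lateral
# extreme of the oscillation (values invented for illustration).
force_left = np.array([510.0, 515.0, 508.0, 512.0])    # N
force_right = np.array([540.0, 543.0, 538.0, 541.0])   # N

k_gain = 0.02                        # mm per N, hypothetical calibration
imbalance = force_right.mean() - force_left.mean()
lateral_offset = k_gain * imbalance  # sign indicates which side of the seam
print(f"correct tool path by {lateral_offset:+.2f} mm")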
Lateral position detection and control for friction stir systems
Fleming, Paul [Boulder, CO; Lammlein, David H [Houston, TX; Cook, George E [Brentwood, TN; Wilkes, Don Mitchell [Nashville, TN; Strauss, Alvin M [Nashville, TN; Delapp, David R [Ashland City, TN; Hartman, Daniel A [Fairhope, AL
2011-11-08
Friction stir methods are disclosed for processing at least one workpiece using a rotary tool with rotating member for contacting and processing the workpiece. The methods include oscillating the rotary tool laterally with respect to a selected propagation path for the rotating member with respect to the workpiece to define an oscillation path for the rotating member. The methods further include obtaining force signals or parameters related to the force experienced by the rotary tool at least while the rotating member is disposed at the extremes of the oscillation. The force signals or parameters associated with the extremes can then be analyzed to determine a lateral position of the selected path with respect to a target path and a lateral offset value can be determined based on the lateral position. The lateral distance between the selected path and the target path can be decreased based on the lateral offset value.
2008-08-18
fidelity will be used to reduce the massive experimental testing and associated time required for qualification of new materials. ... developing a model of the thermo-oxidative process for polymer systems that incorporates the effects of reaction rates, Fickian diffusion, and time-varying degradation processes.
Medina-Valverde, M José; Rodríguez-Borrego, M Aurora; Luque-Alcaraz, Olga; de la Torre-Barbero, M José; Parra-Perea, Julia; Moros-Molina, M del Pilar
2012-01-01
To identify problems and critical points in the software application. Assessment of the implementation of the software tool "Azahar" used to manage nursing care processes. The monitored population consisted of nurses who used the tool at the hospital and those who benefited from it in primary care. Each group was selected randomly, and the number of participants was determined by data saturation. A qualitative approach was employed, using in-depth interviews and group discussion as data collection techniques. The nurses considered that the most beneficial and useful applications of the tool were the initial assessment and the continuity-of-care release forms, as well as the recording of all data on the nursing process to ensure quality. The disadvantages and weaknesses identified were associated with the continuous variability of their daily care. The nurses associated an increase in workload with the impossibility of entering records into the computer, which forced them to keep paper records and thus duplicated the recording process. Likewise, they considered that the operating system of the software should be improved in terms of simplicity and functionality. The simplicity of the tool and the adjustment of workloads would favour its use and, as a result, continuity of care. Copyright © 2010 Elsevier España, S.L. All rights reserved.
TU-FG-201-05: Varian MPC as a Statistical Process Control Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carver, A; Rowbottom, C
Purpose: Quality assurance in radiotherapy requires the measurement of various machine parameters to ensure they remain within permitted values over time. In Truebeam release 2.0 the Machine Performance Check (MPC) was released, allowing beam output and machine axis movements to be assessed in a single test. We aim to evaluate the Varian Machine Performance Check (MPC) as a tool for Statistical Process Control (SPC). Methods: Varian's MPC tool was used on three Truebeam and one EDGE linac for a period of approximately one year. MPC was commissioned against independent systems. After this period the data were reviewed to determine whether or not the MPC was useful as a process control tool. Individual tests were analysed using Shewhart control charts, with Matlab used for the analysis. Principal component analysis was used to determine whether a multivariate model was of any benefit in analysing the data. Results: Control charts were found to be useful to detect beam output changes, worn T-nuts, and jaw calibration issues. Upper and lower control limits were defined at the 95% level. Multivariate SPC was performed using Principal Component Analysis. We found little evidence of clustering beyond that which might be naively expected, such as beam uniformity and beam output. While this makes multivariate analysis of little use, it suggests that each test gives independent information. Conclusion: The variety of independent parameters tested in MPC makes it a sensitive tool for routine machine QA. We have determined that using control charts in our QA programme would rapidly detect changes in machine performance. The use of control charts allows large quantities of tests to be performed on all linacs without visual inspection of all results. The use of control limits alerts users when data are inconsistent with previous measurements before they become out of specification. A. Carver has received a speaker's honorarium from Varian.
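A minimal sketch of the Shewhart-style monitoring described, with control limits at the 95% level (±1.96 sigma) estimated from a baseline period; the MPC output data are simulated, not the clinic's:

import numpy as np

# Daily beam-output ratios from MPC (invented data, nominal = 1.0).
rng = np.random.default_rng(1)
output = rng.normal(1.000, 0.004, 120)
output[100:] += 0.01                 # simulated drift after day 100

baseline = output[:30]               # commissioning period sets the limits
center = baseline.mean()
sigma = baseline.std(ddof=1)
ucl, lcl = center + 1.96 * sigma, center - 1.96 * sigma   # 95% limits

out_of_control = np.where((output > ucl) | (output < lcl))[0]
print(f"first flagged measurement: day {out_of_control.min()}")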
Lateral position detection and control for friction stir systems
Fleming, Paul; Lammlein, David; Cook, George E.; Wilkes, Don Mitchell; Strauss, Alvin M.; Delapp, David; Hartman, Daniel A.
2010-12-14
A friction stir system for processing at least a first workpiece includes a spindle actuator coupled to a rotary tool comprising a rotating member for contacting and processing the first workpiece. A detection system is provided for obtaining information related to a lateral alignment of the rotating member. The detection system comprises at least one sensor for measuring a force experienced by the rotary tool or a parameter related to the force experienced by the rotary tool during processing, wherein the sensor provides sensor signals. A signal processing system is coupled to receive and analyze the sensor signals and determine a lateral alignment of the rotating member relative to a selected lateral position, a selected path, or a direction to decrease a lateral distance relative to the selected lateral position or selected path. In one embodiment, the friction stir system can be embodied as a closed loop tracking system, such as a robot-based tracked friction stir welding (FSW) or friction stir processing (FSP) system.
Temperature and emissivity determination of liquid steel S235
NASA Astrophysics Data System (ADS)
Schöpp, H.; Sperl, A.; Kozakov, R.; Gött, G.; Uhrlandt, D.; Wilhelm, G.
2012-06-01
Temperature determination of liquid metals is difficult but necessary for improving materials and processes such as arc welding in the metal-working industry. A method to determine the surface temperature of the weld pool is described. A TIG welding process and absolutely calibrated optical emission spectroscopy are used, combined with high-speed photography, and 2D temperature profiles are obtained. The emissivity of the radiating surface has an important influence on the temperature determination. A temperature-dependent emissivity for liquid steel is given for the spectral region between 650 and 850 nm.
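A minimal sketch of the underlying radiometric step, inverting Planck's law for temperature given an absolutely calibrated spectral radiance and an emissivity value (the radiance and emissivity below are invented):

import numpy as np
from scipy.constants import h, c, k

def temperature(L, wavelength, emissivity):
    """Invert Planck's law: measured radiance L = eps * B(lambda, T).

    L in W / (m^2 sr m), wavelength in m; returns T in K.
    """
    c1 = 2.0 * h * c**2
    c2 = h * c / k
    return c2 / (wavelength * np.log(1.0 + emissivity * c1
                                     / (wavelength**5 * L)))

wl = 750e-9      # within the 650-850 nm band used in the study
eps = 0.33       # assumed emissivity of liquid steel at this wavelength
L_meas = 3.9e9   # calibrated spectral radiance (invented value)
print(f"T = {temperature(L_meas, wl, eps):.0f} K")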
Analyzing the effect of tool edge radius on cutting temperature in micro-milling process
NASA Astrophysics Data System (ADS)
Liang, Y. C.; Yang, K.; Zheng, K. N.; Bai, Q. S.; Chen, W. Q.; Sun, G. Y.
2010-10-01
Cutting heat is one of the important physical phenomena in the cutting process. Cutting heat and the resulting cutting temperature directly affect tool wear and tool life, as well as workpiece precision and surface quality. In micro-milling, the feature size of the workpiece is usually several microns, so tiny changes in cutting temperature affect the surface quality and accuracy of the workpiece. Cutting heat and temperature therefore behave significantly differently in micro-milling than in conventional cutting. In this paper, a two-dimensional coupled thermal-mechanical finite element model is adopted to determine the thermal fields and cutting temperature during the micro-milling process, using the software Deform-2D. The effects of tool edge radius on effective stress, effective strain, velocity field, and cutting temperature distribution in micro-milling of aluminum alloy Al2024-T6 were investigated and analyzed, and the transient cutting temperature distribution was simulated dynamically. The simulation results show that the cutting temperature in micro-milling is lower than in conventional milling processes due to the small loads and low cutting velocity. With increasing tool edge radius, the maximum-temperature region gradually shifts to the contact region between the finished surface and the flank face of the micro-cutter, instead of the rake face or the corner of the micro-cutter; this phenomenon shows an obvious size effect.
Funding Solar Projects at Federal Agencies: Mechanisms and Selection Criteria (Brochure)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Implementing solar energy projects at federal facilities is a process. The project planning phase of the process includes determining goals, building a team, determining site feasibility and selecting the appropriate project funding tool. This fact sheet gives practical guidance to assist decision-makers with understanding and selecting the funding tool that would best address their site goals. Because project funding tools are complex, federal agencies should seek project assistance before making final decisions. High capital requirements combined with limits on federal agency energy contracts create challenges for funding solar projects. Solar developers typically require long-term contracts (15-20 years) to spread out the initial investment and to enable payments similar to conventional utility bill payments. In the private sector, 20-year contracts have been developed, vetted, and accepted, but the General Services Administration (GSA) contract authority (federal acquisition regulation [FAR] part 41) typically limits contract terms to 10 years. Payments on shorter-term contracts make solar economically unattractive compared with conventional generation. However, in several instances, the federal sector has utilized innovative funding tools that allow long-term contracts or has created a project package that is economically attractive within a shorter contract term.
USDA-ARS?s Scientific Manuscript database
Photography has been a welcome tool in documenting and conveying qualitative soil information. When coupled with image analysis software, the usefulness of digital cameras can be increased to advance the field of micropedology. The determination of a Representative Elementary Area (REA) still rema...
Non Destructive Analysis of Fsw Welds using Ultrasonic Signal Analysis
NASA Astrophysics Data System (ADS)
Pavan Kumar, T.; Prabhakar Reddy, P.
2017-08-01
Friction Stir Welding is an evolving metal joining technique, mostly used for joining materials which cannot easily be joined by other available welding techniques; it can also be used for welding dissimilar materials. The strength of the weld joint is determined by how well the materials mix with each other; since no filler material is used in the welding process, this intermixing is of significant importance. The complication with the friction stir welding process is that many process parameters affect the intermixing, such as tool geometry, tool rotation speed, and traverse speed. In this study an attempt is made to compare the material flow and weld quality of various weldments obtained by changing these parameters. Ultrasonic signal analysis is used to characterize the microstructure of the weldments; the use of ultrasonic waves is a non-destructive, accurate, and fast way of characterizing microstructure. In this method, the relationship between the measured ultrasonic parameters and the microstructure is evaluated using background-echo and backscattered-signal processing techniques. The ultrasonic velocity and attenuation measurements depend on the elastic modulus, and any change in the microstructure is reflected in the ultrasonic velocity. Insight into material flow is essential to determine the quality of the weld, so this study also examines the relationship between tool geometry, the pattern of material flow, and the resulting weld quality. Experiments were conducted to weld dissimilar aluminum alloys, and the weldments were characterized using ultrasonic signal processing as well as Scanning Electron Microscopy. A good correlation was observed between the ultrasonic signal processing results and the Scanning Electron Microscopy observations of the precipitates. Tensile and hardness tests were conducted on the weldments and compared to determine weld quality.
Orbit Determination for the Lunar Reconnaissance Orbiter Using an Extended Kalman Filter
NASA Technical Reports Server (NTRS)
Slojkowski, Steven; Lowe, Jonathan; Woodburn, James
2015-01-01
Orbit determination (OD) analysis results are presented for the Lunar Reconnaissance Orbiter (LRO) using a commercially available Extended Kalman Filter, Analytical Graphics' Orbit Determination Tool Kit (ODTK). Process noise models for lunar gravity and solar radiation pressure (SRP) are described and OD results employing the models are presented. Definitive accuracy using ODTK meets mission requirements and is better than that achieved using the operational LRO OD tool, the Goddard Trajectory Determination System (GTDS). Results demonstrate that a Vasicek stochastic model produces better estimates of the coefficient of solar radiation pressure than a Gauss-Markov model, and prediction accuracy using a Vasicek model meets mission requirements over the analysis span. Modeling the effect of antenna motion on range-rate tracking considerably improves residuals and filter-smoother consistency. Inclusion of off-axis SRP process noise and generalized process noise improves filter performance for both definitive and predicted accuracy. Definitive accuracy from the smoother is better than achieved using GTDS and is close to that achieved by precision OD methods used to generate definitive science orbits. Use of a multi-plate dynamic spacecraft area model with ODTK's force model plugin capability provides additional improvements in predicted accuracy.
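A minimal sketch contrasting the two stochastic models named, using Euler-Maruyama steps with invented parameters: a first-order Gauss-Markov process relaxes toward zero, whereas a Vasicek process relaxes toward a non-zero long-term mean, which is why it can track a slowly varying SRP coefficient better:

import numpy as np

dt, n = 60.0, 1440                     # 1-min steps over one day (assumed)
tau, sigma = 7200.0, 1e-3              # time constant (s), noise strength
b = 1.3                                # Vasicek long-term mean for Cr

rng = np.random.default_rng(2)
gm = np.empty(n)
vas = np.empty(n)
gm[0], vas[0] = 1.3, 1.3
for i in range(1, n):
    w = rng.normal(0.0, np.sqrt(dt))
    gm[i] = gm[i-1] - (gm[i-1] / tau) * dt + sigma * w           # Gauss-Markov
    vas[i] = vas[i-1] + ((b - vas[i-1]) / tau) * dt + sigma * w  # Vasicek
print(gm[-1], vas[-1])   # GM decays toward 0; Vasicek stays near b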
NASA Astrophysics Data System (ADS)
Hegazy, Maha A.; Lotfy, Hayam M.; Mowaka, Shereen; Mohamed, Ekram Hany
2016-07-01
Wavelets have been adapted for a vast number of signal-processing applications due to the amount of information that can be extracted from a signal. In this work, a comparative study was conducted on the efficiency of the continuous wavelet transform (CWT) as a signal processing tool in univariate regression and as a pre-processing tool in multivariate analysis using partial least squares (CWT-PLS). These were applied to complex spectral signals of ternary and quaternary mixtures. The CWT-PLS method succeeded in the simultaneous determination of a quaternary mixture of drotaverine (DRO), caffeine (CAF), paracetamol (PAR) and p-aminophenol (PAP, the major impurity of paracetamol). In contrast, the univariate CWT failed to simultaneously determine the quaternary mixture components; it was able to determine only PAR and PAP, the ternary mixture of DRO, CAF, and PAR, and the ternary mixture of CAF, PAR, and PAP. During the CWT calculations, different wavelet families were tested. The univariate CWT method was validated according to the ICH guidelines. For the development of the CWT-PLS model, a calibration set was prepared by means of an orthogonal experimental design, and the absorption spectra were recorded and processed by CWT. The CWT-PLS model was constructed by regression between the wavelet coefficients and concentration matrices, and validation was performed by both cross-validation and external validation sets. Both methods were successfully applied for the determination of the studied drugs in pharmaceutical formulations.
Business Process-Based Resource Importance Determination
NASA Astrophysics Data System (ADS)
Fenz, Stefan; Ekelhart, Andreas; Neubauer, Thomas
Information security risk management (ISRM) heavily depends on realistic impact values representing the resources’ importance in the overall organizational context. Although a variety of ISRM approaches have been proposed, well-founded methods that provide an answer to the following question are still missing: How can business processes be used to determine resources’ importance in the overall organizational context? We answer this question by measuring the actual importance level of resources based on business processes. Therefore, this paper presents our novel business process-based resource importance determination method which provides ISRM with an efficient and powerful tool for deriving realistic resource importance figures solely from existing business processes. The conducted evaluation has shown that the calculation results of the developed method comply to the results gained in traditional workshop-based assessments.
Competitive control of cognition in rhesus monkeys.
Kowaguchi, Mayuka; Patel, Nirali P; Bunnell, Megan E; Kralik, Jerald D
2016-12-01
The brain has evolved different approaches to solve problems, but the mechanisms that determine which approach to take remain unclear. One possibility is that control progresses from simpler processes, such as associative learning, to more complex ones, such as relational reasoning, when the simpler ones prove inadequate. Alternatively, control could be based on competition between the processes. To test between these possibilities, we posed the support problem to rhesus monkeys using a tool-use paradigm, in which subjects could pull an object (the tool) toward themselves to obtain an otherwise out-of-reach goal item. We initially provided one problem exemplar as a choice: for the correct option, a food item placed on the support tool; for the incorrect option, the food item placed off the tool. Perceptual cues were also correlated with outcome: e.g., red, triangular tool correct, blue, rectangular tool incorrect. Although the monkeys simply needed to touch the tool to register a response, they immediately pulled it, reflecting a relational reasoning process between themselves and another object (R_self-other), rather than an associative one between the arbitrary touch response and reward (A_resp-reward). Probe testing then showed that all four monkeys used a conjunction of perceptual features to select the correct option, reflecting an associative process between stimuli and reward (A_stim-reward). We then added a second problem exemplar and subsequent testing revealed that the monkeys switched to using the on/off relationship, reflecting a relational reasoning process between two objects (R_other-other). Because behavior appeared to reflect R_self-other rather than A_resp-reward, and A_stim-reward prior to R_other-other, our results suggest that cognitive processes are selected via competitive control dynamics. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Sarkari Khorrami, Mahmoud; Kokabi, Amir Hossein; Movahedi, Mojtaba
2015-05-01
In this work, friction stir soldering (FSS) is introduced as a new approach for the fabrication of copper/copper lap joints. This process is principally based on friction stir processing (FSP) and can be performed using FSP tools with and without a pin on the top sheet. In the present study, a Pb-Sn foil was used as the solder, which was melted and then extruded into the area between the copper sheets during the FSS process. The process was carried out using tools with and without a pin at rotation speeds of 1200, 1400, and 1600 rpm and a traverse speed of 32 mm/min. The same joint was also fabricated using furnace soldering to compare the mechanical properties obtained with the FSS and furnace soldering processes. It was observed that FSS possesses some advantages over the conventional furnace soldering process, including the formation of more bond area at the interface, corresponding to the higher fracture load of FSS joints compared with furnace-soldered ones. Moreover, it was concluded that the thickness of intermetallic compounds (IMCs) and the formation of voids at the joint interface were the predominant factors determining the mechanical properties of the joints produced by the FSS tool with and without a pin, respectively. Microstructural examinations revealed that the Cu-Sn IMCs Cu3Sn and Cu6Sn5 were formed at the joint interface. The FSS joint produced by the tool with a pin experienced a higher peak temperature than that produced by the pin-free tool, which may lead to the formation of thicker IMCs at the interface. The thickness of the IMCs can, of course, be controlled by choosing proper FSS parameters, especially the rotation speed of the tool.
Comparing Noun Phrasing Techniques for Use with Medical Digital Library Tools.
ERIC Educational Resources Information Center
Tolle, Kristin M.; Chen, Hsinchun
2000-01-01
Describes a study that investigated the use of a natural language processing technique called noun phrasing to determine whether it is a viable technique for medical information retrieval. Evaluates four noun phrase generation tools for their ability to isolate noun phrases from medical journal abstracts, focusing on precision and recall.…
NASA Technical Reports Server (NTRS)
Waters, Eric D.
2013-01-01
Recent high-level interest in the capability of small launch vehicles has placed significant demand on determining the trade space these vehicles occupy. This has led to the development of a zero-level analysis tool that can quickly determine the minimum expected vehicle gross liftoff weight (GLOW) in terms of vehicle stage specific impulse (Isp) and propellant mass fraction (pmf) for any given payload value. Drawing on extensive Earth-to-orbit trajectory experience, the total delta-v the vehicle must achieve, including relevant loss terms, can be estimated. This foresight into expected losses allows for more specific assumptions relating to the initial estimates of thrust-to-weight values for each stage. The tool was further validated against a trajectory model, in this case the Program to Optimize Simulated Trajectories (POST), to determine whether the initial sizing delta-v was adequate to meet payload expectations. Presented here is a description of how the tool is set up and the approach the analyst must take when using it. Expected outputs, which depend on the type of small launch vehicle being sized, are also displayed. The method of validation is discussed, as well as where the sizing tool fits into the vehicle design process.
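A minimal sketch of the kind of zero-level sizing described, assuming a two-stage vehicle, an even delta-v split, and the ideal rocket equation (all numbers illustrative):

import math

G0 = 9.80665

def stage_propellant(m_above, dv, isp, pmf):
    """Propellant mass for one stage carrying m_above through dv (ideal
    rocket equation); returns None if the stage cannot achieve dv."""
    R = math.exp(dv / (G0 * isp))
    denom = 1.0 - R * (1.0 - pmf)
    if denom <= 0.0:
        return None
    return pmf * (R - 1.0) * m_above / denom

# Illustrative two-stage sizing: 9.3 km/s total dv incl. losses, even split.
payload, dv_total = 150.0, 9300.0             # kg, m/s (assumed)
stages = [(310.0, 0.90), (285.0, 0.88)]       # (Isp s, pmf), upper stage first
mass = payload
for isp, pmf in stages:                       # size from the top down
    mp = stage_propellant(mass, dv_total / 2, isp, pmf)
    assert mp is not None, "stage cannot achieve its delta-v share"
    mass += mp / pmf                          # add stage propellant + dry mass
print(f"GLOW = {mass:.0f} kg")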
Cooper Screening of Information Processing (C-SIP). Administrator's Manual.
ERIC Educational Resources Information Center
Cooper, Richard
This document is designed to assist individuals administering the Cooper Screening of Information Processing (C-SIP), which is intended as a diagnostic teaching tool that allows teachers or others to determine, in a conversational setting, whether a person manifests any common characteristics of learning problems. After a brief introduction, a…
Honda, Satoshi; Tsunoda, Hiroko; Fukuda, Wataru; Saida, Yukihisa
2014-12-01
The purpose is to develop a new image toggle tool with automatic density normalization (ADN) and automatic alignment (AA) for comparing serial digital mammograms (DMGs). We developed an ADN and AA process to compare the images of serial DMGs. In image density normalization, a linear interpolation was applied by taking two points in high- and low-brightness areas. The alignment was calculated by determining the point of greatest correlation while shifting the alignment between the current and prior images. These processes were performed on a PC with a 3.20-GHz Xeon processor and 8 GB of main memory. We selected 12 suspected breast cancer patients who had undergone screening DMGs in the past, and automatic processing was retrospectively performed on their images. Two radiologists evaluated the results subjectively. The developed algorithm took approximately 1 s per image. In our preliminary experience, two images could not be aligned appropriately; when alignment succeeded, image toggling allowed differences between examinations to be detected easily. We developed a new tool to facilitate comparative reading of DMGs on a mammography viewing system. Using this tool for toggling comparisons might improve the interpretation efficiency of serial DMGs.
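Both steps reduce to short array operations. The following sketch assumes 2-D grayscale NumPy arrays; the function names and the exhaustive integer shift search are our illustration, not the authors' implementation.

```python
# Illustrative sketch of the two steps described above.
import numpy as np

def normalize_density(img, lo_ref, hi_ref):
    """Linearly map the image so that a low- and a high-brightness
    reference level land on fixed target values (here 0 and 1)."""
    return (img.astype(float) - lo_ref) / (hi_ref - lo_ref)

def best_shift(current, prior, max_shift=20):
    """Exhaustively search integer (dy, dx) shifts and keep the one with
    the greatest correlation between the images (edge wrap ignored)."""
    best, best_r = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(prior, dy, axis=0), dx, axis=1)
            r = np.corrcoef(current.ravel(), shifted.ravel())[0, 1]
            if r > best_r:
                best_r, best = r, (dy, dx)
    return best
```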
Multilayer composition coatings for cutting tools: formation and performance properties
NASA Astrophysics Data System (ADS)
Tabakov, Vladimir P.; Vereschaka, Anatoly S.; Vereschaka, Alexey A.
2018-03-01
The paper considers the concept of a multi-layer coating architecture in which each layer has a predetermined functionality. The latest generation of multi-layered coatings for cutting tools secures the dual nature of the coating: coatings should not only improve the mechanical and physical characteristics of the cutting tool material, but also reduce the thermo-mechanical effect on the cutting tool that determines wear intensity. The results of the development of combined methods of forming multi-layer coatings with improved properties are presented. A combined coating method using a pulsed laser reduced the excessively high levels of compressive residual stress and increased the microhardness of the multilayered coatings. Tests of coated HSS tools showed that additional pulsed laser processing increases tool life up to 3 times. Using filtered cathodic vacuum arc deposition to generate multilayer coatings based on the TiAlN compound increased the wear resistance of carbide tools 2-fold compared with the tool life of cutting tools with commercial TiN coatings. The aim of this study was to develop an innovative methodological approach to the deposition of multilayer coatings for cutting tools, with the architecture, properties, and parameters of the coating selected on the basis of sound knowledge of coating failure in the machining process.
In-Line Monitoring of Fab Processing Using X-Ray Diffraction
NASA Astrophysics Data System (ADS)
Gittleman, Bruce; Kozaczek, Kris
2005-09-01
As the materials shift that started with Cu continues to advance in the semiconductor industry, new issues related to materials microstructure have arisen. While x-ray diffraction (XRD) has long been used in development applications, in this paper we show that results generated in real time by a unique, high-throughput, fully automated XRD metrology tool can be used to develop metrics for qualification and monitoring of critical processes in current and future manufacturing. It will be shown that these metrics provide a unique set of data that correlate to manufacturing issues. For example, ionized sputtering is the current deposition method of choice for both the Cu seed and TaNx/Ta barrier layers. The alpha phase of Ta is widely used in production for the upper layer of the barrier stack, but complete elimination of the beta phase requires a TaNx layer with sufficient N content, yet not so much as to start poisoning the target and generating particle issues. This is a well-documented issue, but traditional monitoring by sheet resistance methods cannot guarantee the absence of the beta phase, whereas XRD can determine the presence of even small amounts of beta. Nickel silicide for gate metallization is another example where monitoring of phase is critical. As well as being able to qualify an anneal process that gives only the desired NiSi phase everywhere across the wafer, XRD can be used to determine whether full silicidation of the Ni has occurred and to characterize the crystallographic microstructure of the Ni to determine any effect of that microstructure on the anneal process. The post-anneal nickel silicide phase and the uniformity of the silicide microstructure can all be monitored in production. Other examples of the application of XRD to process qualification and production monitoring are derived from the dependence of certain processes, some types of defect generation, and device performance on crystallographic texture. The data presented will show that CMP dishing problems could be traced to the texture of the barrier layer and mitigated by adjusting the barrier process. The density of pits developed during CMP of electrochemically deposited (ECD) Cu depends on the fraction of (111)-oriented grains. It must be emphasized that crystallographic texture is not only a key parameter for qualification of high-yielding and reliable processes, but also serves as a critical parameter for monitoring tool health. The textures of Cu and W are sensitive not only to deviations in performance of the tool depositing or annealing a particular film, but also highly sensitive to the texture of the barrier underlayers and thus to any performance deviations in those tools. The XRD metrology tool has been designed with production monitoring in mind and has been fully integrated into both 200 mm and 300 mm fabs. Rapid analysis is achieved by using a high-intensity fixed x-ray source coupled with a large-area 2D detector. The output metrics from one point are generated while the tool is measuring a subsequent point, giving true on-the-fly analysis; no post-processing of data is necessary. Spatial resolution on the wafer surface ranging from 35 μm to 1 mm is available, making the tool suitable for monitoring of product wafers. Typical analysis times range from 10 seconds to 2 minutes per point, depending on the film thickness and spot size. Current metrics used for process qualification and production monitoring are phase, FWHM of the primary phase peaks (for mean grain size tracking), and crystallographic texture.
Plasma Diagnostics: Use and Justification in an Industrial Environment
NASA Astrophysics Data System (ADS)
Loewenhardt, Peter
1998-10-01
The usefulness and importance of plasma diagnostics have played a major role in the development of plasma processing tools in the semiconductor industry. As can be seen through marketing materials from semiconductor equipment manufacturers, results from plasma diagnostic equipment can be a powerful tool in selling the technological leadership of tool design. Some diagnostics have long been used for simple process control such as optical emission for endpoint determination, but in recent years more sophisticated and involved diagnostic tools have been utilized in chamber and plasma source development and optimization. It is now common to find an assortment of tools at semiconductor equipment companies such as Langmuir probes, mass spectrometers, spatial optical emission probes, impedance, ion energy and ion flux probes. An outline of how the importance of plasma diagnostics has grown at an equipment manufacturer over the last decade will be given, with examples of significant and useful results obtained. Examples will include the development and optimization of an inductive plasma source, trends and hardware effects on ion energy distributions, mass spectrometry influences on process development and investigations of plasma-wall interactions. Plasma diagnostic focus, in-house development and proliferation in an environment where financial justification requirements are both strong and necessary will be discussed.
Providing Nutritional Care in the Office Practice: Teams, Tools, and Techniques.
Kushner, Robert F
2016-11-01
Provision of dietary counseling in the office setting is enhanced by using team-based care and electronic tools. Effective provider-patient communication is essential for fostering behavior change: the key component of lifestyle medicine. The principles of communication and behavior change are skill-based and grounded in scientific theories and models. Motivational interviewing and shared decision making (a collaborative process between patients and their providers to reach agreement about a health decision) are important processes in counseling. The stages of change, self-determination, the health belief model, the social cognitive model, the theory of planned behavior, and cognitive behavioral therapy are used in the counseling process. Copyright © 2016 Elsevier Inc. All rights reserved.
Thermo-Mechanical Processing in Friction Stir Welds
NASA Technical Reports Server (NTRS)
Schneider, Judy
2003-01-01
Friction stir welding is a solid-phase joining, or welding, process that was invented in 1991 at The Welding Institute (TWI). The process is potentially capable of joining a wide variety of aluminum alloys that are traditionally difficult to fusion weld. The friction stir welding (FSW) process produces welds by moving a non-consumable rotating pin tool along a seam between work pieces that are firmly clamped to an anvil. At the start of the process, the rotating pin is plunged into the material to a pre-determined load. The required heat is produced by a combination of frictional and deformation heating. The shape of the tool shoulder and supporting anvil promotes a high hydrostatic pressure along the joint line as the tool shears and literally stirs the metal together. To produce a defect-free weld, the process variables (rpm, traverse speed, and downward force) and the tool pin design must be chosen carefully. An accurate model of the material flow during the process is necessary to guide process variable selection. At MSFC a plastic slip-line model of the process has been synthesized based on macroscopic images of the resulting weld material. Although this model appears to have captured the main features of the process, material-specific interactions are not understood. The objective of the present research was to develop a basic understanding of the evolution of the microstructure in order to relate it to the deformation process variables of strain, strain rate, and temperature.
Toward an Efficient Icing CFD Process Using an Interactive Software Toolkit: Smagglce 2D
NASA Technical Reports Server (NTRS)
Vickerman, Mary B.; Choo, Yung K.; Schilling, Herbert W.; Baez, Marivell; Braun, Donald C.; Cotton, Barbara J.
2001-01-01
Two-dimensional CFD analysis for iced airfoils can be a labor-intensive task. The software toolkit SmaggIce 2D is being developed to help streamline the CFD process and provide the unique features needed for icing. When complete, it will include a combination of partially automated and fully interactive tools for all aspects of the tasks leading up to the flow analysis: geometry preparation, domain decomposition, block boundary discretization, gridding, and linking with a flow solver. It also includes tools to perform ice shape characterization, an important aid in determining the relationship between ice characteristics and their effects on aerodynamic performance. Completed tools, work in progress, and planned features of the software toolkit are presented here.
Influence of Processing Parameters on the Flow Path in Friction Stir Welding
NASA Technical Reports Server (NTRS)
Schneider, J. A.; Nunes, A. C., Jr.
2006-01-01
Friction stir welding (FSW) is a solid-phase welding process that unites thermal and mechanical aspects to produce a high-quality joint. The process variables are rpm, translational weld speed, and downward plunge force. The strain-temperature history of a metal element at each point on the cross-section of the weld is determined by the individual flow path taken by the particular filament of metal flowing around the tool, as influenced by the process variables. The resulting properties of the weld are determined by this strain-temperature history. Thus, to control FSW properties, an improved understanding of the effect of the processing parameters on the metal flow path is necessary.
Towards Automated Screening of Two-dimensional Crystals
Cheng, Anchi; Leung, Albert; Fellmann, Denis; Quispe, Joel; Suloway, Christian; Pulokas, James; Carragher, Bridget; Potter, Clinton S.
2007-01-01
Screening trials to determine the presence of two-dimensional (2D) protein crystals suitable for three-dimensional structure determination using electron crystallography are a very labor-intensive process. Methods compatible with fully automated screening have been developed for the process of crystal production by dialysis and for producing negatively stained grids of the resulting trials. Further automation via robotic handling of the EM grids, and semi-automated transmission electron microscopic imaging and evaluation of the trial grids, is also possible. We, and others, have developed working prototypes for several of these tools and tested and evaluated them in a simple screen of 24 crystallization conditions. While further development of these tools is certainly required for a turn-key system, the goal of fully automated screening appears to be within reach. PMID:17977016
Kennen, Jonathan G.; Henriksen, James A.; Nieswand, Steven P.
2007-01-01
The natural flow regime paradigm and parallel stream ecological concepts and theories have established the benefits of maintaining or restoring the full range of natural hydrologic variation for physiochemical processes, biodiversity, and the evolutionary potential of aquatic and riparian communities. A synthesis of recent advances in hydroecological research coupled with stream classification has resulted in a new process to determine environmental flows and assess hydrologic alteration. This process has national and international applicability. It allows classification of streams into hydrologic stream classes and identification of a set of non-redundant and ecologically relevant hydrologic indices for 10 critical sub-components of flow. Three computer programs have been developed for implementing the Hydroecological Integrity Assessment Process (HIP): (1) the Hydrologic Indices Tool (HIT), which calculates 171 ecologically relevant hydrologic indices on the basis of daily-flow and peak-flow stream-gage data; (2) the New Jersey Hydrologic Assessment Tool (NJHAT), which can be used to establish a hydrologic baseline period, provide options for setting baseline environmental-flow standards, and compare past and proposed streamflow alterations; and (3) the New Jersey Stream Classification Tool (NJSCT), designed for placing unclassified streams into pre-defined stream classes. Biological and multivariate response models including principal-component, cluster, and discriminant-function analyses aided in the development of software and implementation of the HIP for New Jersey. A pilot effort is currently underway by the New Jersey Department of Environmental Protection in which the HIP is being used to evaluate the effects of past and proposed surface-water use, ground-water extraction, and land-use changes on stream ecosystems while determining the most effective way to integrate the process into ongoing regulatory programs. Ultimately, this scientifically defensible process will help to quantify the effects of anthropogenic changes and development on hydrologic variability and help planners and resource managers balance current and future water requirements with ecological needs.
Present and future of membrane protein structure determination by electron crystallography.
Ubarretxena-Belandia, Iban; Stokes, David L
2010-01-01
Membrane proteins are critical to cell physiology, playing roles in signaling, trafficking, transport, adhesion, and recognition. Despite their relative abundance in the proteome and their prevalence as targets of therapeutic drugs, structural information about membrane proteins is in short supply. This chapter describes the use of electron crystallography as a tool for determining membrane protein structures. Electron crystallography offers distinct advantages relative to the alternatives of X-ray crystallography and NMR spectroscopy. Namely, membrane proteins are placed in their native membranous environment, which is likely to favor a native conformation and allow changes in conformation in response to physiological ligands. Nevertheless, there are significant logistical challenges in finding appropriate conditions for inducing membrane proteins to form two-dimensional arrays within the membrane and in using electron cryo-microscopy to collect the data required for structure determination. A number of developments are described for high-throughput screening of crystallization trials and for automated imaging of crystals with the electron microscope. These tools are critical for exploring the necessary range of factors governing the crystallization process. There have also been recent software developments to facilitate the process of structure determination. However, further innovations in the algorithms used for processing images and electron diffraction are necessary to improve throughput and to make electron crystallography truly viable as a method for determining atomic structures of membrane proteins. Copyright © 2010 Elsevier Inc. All rights reserved.
Present and future of membrane protein structure determination by electron crystallography
Ubarretxena-Belandia, Iban; Stokes, David L.
2011-01-01
Membrane proteins are critical to cell physiology, playing roles in signaling, trafficking, transport, adhesion, and recognition. Despite their relative abundance in the proteome and their prevalence as targets of therapeutic drugs, structural information about membrane proteins is in short supply. This review describes the use of electron crystallography as a tool for determining membrane protein structures. Electron crystallography offers distinct advantages relative to the alternatives of X-ray crystallography and NMR spectroscopy. Namely, membrane proteins are placed in their native membranous environment, which is likely to favor a native conformation and allow changes in conformation in response to physiological ligands. Nevertheless, there are significant logistical challenges in finding appropriate conditions for inducing membrane proteins to form two-dimensional arrays within the membrane and in using electron cryo-microscopy to collect the data required for structure determination. A number of developments are described for high-throughput screening of crystallization trials and for automated imaging of crystals with the electron microscope. These tools are critical for exploring the necessary range of factors governing the crystallization process. There have also been recent software developments to facilitate the process of structure determination. However, further innovations in the algorithms used for processing images and electron diffraction are necessary to improve throughput and to make electron crystallography truly viable as a method for determining atomic structures of membrane proteins. PMID:21115172
Tahmasbi, Vahid; Ghoreishi, Majid; Zolfaghari, Mojtaba
2017-11-01
The bone drilling process is very prominent in orthopedic surgeries and in the repair of bone fractures. It is also very common in dentistry and bone sampling operations. Due to the complexity of bone and the sensitivity of the process, bone drilling is one of the most important and sensitive processes in biomedical engineering. Orthopedic surgeries can be improved using robotic systems and mechatronic tools. The most crucial problem during drilling is an unwanted increase in process temperature (above 47 °C), which causes thermal osteonecrosis (cell death) and local burning of the bone tissue. Moreover, imposing higher forces on the bone may lead to breaking or cracking and consequently cause serious damage. In this study, a mathematical second-order linear regression model as a function of tool drilling speed, feed rate, tool diameter, and their effective interactions is introduced to predict temperature and force during the bone drilling process. This model can determine the maximum speed of surgery that keeps the process within an acceptable temperature range. Moreover, for the first time, using designed experiments, the bone drilling process was modeled and the drilling speed, feed rate, and tool diameter were optimized. Then, using response surface methodology and applying a multi-objective optimization, drilling force was minimized so as to sustain an acceptable temperature range without damaging the bone or the surrounding tissue. In addition, for the first time, Sobol statistical sensitivity analysis is used to ascertain the effect of the process input parameters on process temperature and force. The results show that, among all the input parameters, tool rotational speed, feed rate, and tool diameter have the greatest influence on process temperature and force. The behavior of each output parameter under variation of each input parameter is further investigated. Finally, a multi-objective optimization was performed considering all the aforementioned parameters; it yielded a set of data that can considerably improve orthopedic osteosynthesis outcomes.
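The final step, minimizing force under a temperature ceiling, can be phrased as a small constrained optimization. In the sketch below the quadratic response surfaces are placeholders with invented coefficients, standing in for the fitted regression models of the study.

```python
# Sketch: minimize thrust force subject to a 47 degC temperature ceiling,
# given second-order regression models.  Coefficients are placeholders.
import numpy as np
from scipy.optimize import minimize

def temperature(x):            # x = (speed rpm, feed mm/min, diameter mm)
    n, f, d = x
    return 20 + 0.004*n + 0.8*f + 1.5*d + 1e-6*n**2 + 2e-4*n*f

def force(x):                  # thrust force, N
    n, f, d = x
    return 60 - 0.003*n + 2.5*f + 1.2*d + 5e-7*n**2

res = minimize(force, x0=np.array([2000.0, 40.0, 3.0]),
               bounds=[(500, 5000), (10, 120), (2, 6)],
               constraints=[{"type": "ineq",
                             "fun": lambda x: 47.0 - temperature(x)}])
print(res.x, force(res.x), temperature(res.x))
```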
Dimensional Analysis in Physics and the Buckingham Theorem
ERIC Educational Resources Information Center
Misic, Tatjana; Najdanovic-Lukic, Marina; Nesic, Ljubisa
2010-01-01
Dimensional analysis is a simple, clear and intuitive method for determining the functional dependence of physical quantities that are of importance to a certain process. However, in physics textbooks, very little space is usually given to this approach and it is often presented only as a diagnostic tool used to determine the validity of…
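A standard worked example, a common textbook illustration rather than one drawn from the article, shows the method's reach: the period of a simple pendulum.

```latex
Assume the period depends on length, mass, and gravity,
$T = C\, l^{a} m^{b} g^{c}$.
Equating dimensions,
$\mathrm{s} = \mathrm{m}^{a+c}\,\mathrm{kg}^{b}\,\mathrm{s}^{-2c}$,
gives $b = 0$, $a + c = 0$, and $-2c = 1$, hence
\[
  T = C \sqrt{l/g},
\]
with the dimensionless constant $C$ ($= 2\pi$ for small oscillations)
left undetermined by the dimensional argument alone.
```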
The study on dynamic properties of monolithic ball end mills with various slenderness
NASA Astrophysics Data System (ADS)
Wojciechowski, Szymon; Tabaszewski, Maciej; Krolczyk, Grzegorz M.; Maruda, Radosław W.
2017-10-01
The reliable determination of the modal mass, damping, and stiffness coefficients (modal parameters) for a particular machine-toolholder-tool system is essential for the accurate estimation of vibrations, stability, and thus the machined surface finish formed during the milling process. Therefore, this paper focuses on the analysis of ball end mills' dynamical properties. The tools investigated during this study are monolithic ball end mills with different slenderness values, made of coated cemented carbide. These kinds of tools are very often applied during the precise milling of curvilinear surfaces. The research program included an impulse test carried out for the investigated tools clamped in a hydraulic toolholder. The obtained modal parameters were then applied in the developed model of the tool's instantaneous deflection, in order to estimate the vibrations of the tool's working part during precise milling. The application of the proposed dynamics model also involved the determination of instantaneous cutting forces on the basis of the mechanistic approach. The research revealed that the ball end mill's slenderness can be considered an important indicator of milling dynamics and machined surface quality.
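For a single-degree-of-freedom approximation of such a tool, the modal parameters follow directly from the impulse-test response. The sketch below uses the logarithmic decrement of successive response peaks; the peak amplitudes, period, and modal mass are invented values, not measurements from the study.

```python
# Modal parameters from an impulse test, SDOF approximation.
import numpy as np

peaks = np.array([1.00, 0.62, 0.38, 0.24, 0.15])  # successive peak amplitudes
T = 1.0 / 850.0                                   # assumed damped period, s

delta = np.mean(np.log(peaks[:-1] / peaks[1:]))   # logarithmic decrement
zeta = delta / np.sqrt(4*np.pi**2 + delta**2)     # damping ratio
wd = 2*np.pi / T                                  # damped natural frequency
wn = wd / np.sqrt(1 - zeta**2)                    # undamped natural frequency

m = 0.05                                          # assumed modal mass, kg
k = m * wn**2                                     # modal stiffness, N/m
c = 2 * zeta * np.sqrt(k * m)                     # modal damping, N*s/m
print(f"zeta={zeta:.3f}, fn={wn/2/np.pi:.0f} Hz, k={k:.3e} N/m, c={c:.2f}")
```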
Wang, Pei; Zhang, Hui; Yang, Hailong; Nie, Lei; Zang, Hengchang
2015-02-25
Near-infrared (NIR) spectroscopy has been developed into an indispensable tool for both academic research and industrial quality control in a wide field of applications. The feasibility of NIR spectroscopy for monitoring the concentrations of puerarin, daidzin, daidzein, and total isoflavonoid (TIF) during the extraction process of kudzu (Pueraria lobata) was verified in this work. NIR spectra were collected in transmission mode and pretreated with smoothing and derivatives. Partial least squares regression (PLSR) was used to establish calibration models. Three different variable selection methods, including the correlation coefficient method, interval partial least squares (iPLS), and the successive projections algorithm (SPA), were performed and compared with models based on all of the variables. The results showed that the approach was efficient and environmentally friendly for rapid determination of the four quality indices (QIs) in the kudzu extraction process. The method established here may have the potential to be used as a process analytical technology (PAT) tool in the future. Copyright © 2014 Elsevier B.V. All rights reserved.
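A minimal version of such a calibration pipeline, derivative pretreatment followed by PLS with cross-validation, can be sketched with SciPy and scikit-learn. The spectra and reference values below are synthetic placeholders; the window length, polynomial order, and number of latent variables are assumptions.

```python
# Hedged sketch of an NIR calibration: Savitzky-Golay first derivative
# followed by PLS regression with 10-fold cross-validation.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 700))        # 60 samples x 700 wavelengths (synthetic)
y = rng.normal(size=60)               # reference concentrations (placeholder)

X_d1 = savgol_filter(X, window_length=15, polyorder=2, deriv=1, axis=1)

pls = PLSRegression(n_components=5)
y_cv = cross_val_predict(pls, X_d1, y, cv=10).ravel()
rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
print(f"RMSECV = {rmsecv:.3f}")
```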
Logic gate scanner focus control in high-volume manufacturing using scatterometry
NASA Astrophysics Data System (ADS)
Dare, Richard J.; Swain, Bryan; Laughery, Michael
2004-05-01
Tool matching and optimal process control are critical requirements for success in semiconductor manufacturing. It is imperative that a tool's operating conditions are understood and controlled in order to create a process that is repeatable and produces devices within specifications. Likewise, it is important, where possible, to match multiple systems using some methodology, so that regardless of which tool is used the process remains in control. Agere Systems is currently using Timbre Technologies' Optical Digital Profilometry (ODP) scatterometry for controlling Nikon scanner focus at the most critical lithography layer: the logic gate. By adjusting focus settings and verifying the resultant changes in resist profile shape using ODP, it becomes possible to actively control scanner focus to achieve a desired resist profile. Since many critical lithography processes are designed to produce slightly re-entrant resist profiles, this type of focus control is not possible via Critical Dimension Scanning Electron Microscopy (CDSEM), where re-entrant profiles cannot be accurately determined. Additionally, the high throughput and non-destructive nature of this measurement technique saves both cycle time and wafer costs compared to cross-section SEM. By implementing a daily ODP process check, as well as a check after any maintenance on a scanner, Agere successfully enabled focus drift control, i.e., making necessary focus or equipment changes in order to maintain a desired resist profile.
Technology Readiness Level Guidebook
DOT National Transportation Integrated Search
2017-09-01
This guidebook provides the necessary information for conducting a Technology Readiness Level (TRL) Assessment. TRL Assessments are a tool for determining the maturity of technologies and identifying next steps in the research process. This guidebook...
Development of gas-pressure bonding process for air-cooled turbine blades
NASA Technical Reports Server (NTRS)
Meiners, K. E.
1972-01-01
An investigation was conducted on the application of gas-pressure bonding to the joining of components for convectively cooled turbine blades and vanes. A processing procedure was established for joining the fins of Udimet 700 and TD NiCr sheet metal airfoil shells to cast B1900 struts without the use of internal support tooling. Alternative methods employing support tooling were investigated. Testing procedures were developed and employed to determine shear strengths and internal burst pressures of flat and cylindrical bonded finned shell configurations at room temperature and 1750 F. Strength values were determined parallel and transverse to the cooling fin direction. The effect of thermal cycles from 1750 F to room temperature on strength was also investigated.
On the impact of rolling direction and tool orientation angle in Rotary Peen Forming
NASA Astrophysics Data System (ADS)
Gottschalk, M.; Hirt, G.
2016-10-01
Shot Peen Forming processes are suitable for producing the surface curvatures commonly required for aircraft fuselage as well as structural components. So-called Rotary Peen Forming is an alternative process for manufacturing sheet metals with slight curvature. The forming tool consists of impactors which are connected flexibly to a rotating hub and thus move on a circular trajectory. An industrial robot guides the Rotary Peen Forming tool; as a result, the machine design is more compact compared to traditional Shot Peen Forming. In the present work, the impact of both the tool orientation angle and the rolling direction on the curvature of aluminum AA5083 samples is examined. By means of a point laser measurement, the set-up enables distance control to adjust a determined indentation depth. It can be shown that the highest curvature is achieved when the tool is oriented parallel, and the rolling direction of the sheet metal is transversal, to the curvature plane.
Interactive visualization of public health indicators to support policymaking: An exploratory study
Zakkar, Moutasem; Sedig, Kamran
2017-01-01
Purpose: The purpose of this study is to examine the use of interactive visualizations to represent data/information related to social determinants of health and public health indicators, and to investigate the benefits of such visualizations for health policymaking. Methods: The study developed a prototype for an online interactive visualization tool that represents the social determinants of health. The study participants explored and used the tool, which was evaluated using the informal user experience evaluation method; in this method, prospective users of a tool use and play with it, and their feedback is collected through interviews. Results: Using visualizations to represent and interact with health indicators has advantages over traditional representation techniques that do not allow users to interact with the information. Communicating healthcare indicators to policymakers is a complex task because of the complexity of the indicators, the diversity of audiences, and differing audience needs. This complexity can lead to information misinterpretation, which occurs when users of health data ignore, or do not know, why, where, and how the data have been produced, or where and how they can be used. Conclusions: Public health policymaking is a complex process, and data is only one element among others needed in this complex process. Researchers and healthcare organizations should conduct a strategic evaluation to assess the usability of interactive visualizations and decision support tools before investing in these tools. Such an evaluation should take into consideration the cost, ease of use, learnability, and efficiency of the tools, as well as the factors that influence policymaking. PMID:29026455
Johnson, Christina; Wilhelmsson, Susan; Börjeson, Sussanne; Lindberg, Malou
2015-06-01
The aim of this study was to develop a self-assessment tool to raise telenurses' awareness of their communication and interpersonal competence and to highlight areas in need of improvement. Several studies have revealed the need to develop communication competence in telenursing. Structured analysis of conversations with patients/callers is one way to increase telenurses' awareness of their unique communication and interpersonal competence. The design was instrument development with validation assessment using the Content Validity Index method. The process to determine content validity was done in two stages: the development stage and the assessment stage. The development stage started with a literature search. The assessment stage was separated into two phases: assessment by an expert group, and assessment and testing by telenurses. The telenurses also participated in consensus discussions. A telenursing self-assessment tool with 58 items was developed, with the items sorted into five sections according to the nursing process. This study describes the thorough development process of the telenursing self-assessment tool, which is intended to make telenurses aware of their unique communication and interpersonal competence when analysing their own conversations with patients/callers. As a formative tool, it helps the telenurse follow the nursing process and remain patient-centred, provides self-direction, feedback, and coaching, and creates learning opportunities. The tool can contribute to the development of communication and interpersonal competence in telephone advice nursing. Further development of the tool may provide an objective scoring instrument for evaluating communication training and education in the field. © 2014 John Wiley & Sons Ltd.
Emission of nanoparticles during friction stir welding (FSW) of aluminium alloys.
Gomes, J F; Miranda, R M; Santos, T J; Carvalho, P A
2014-01-01
Friction stir welding (FSW) is now well established as a welding process capable of joining different types of metallic materials, as it was (1) found to be a reliable and economical way of producing high-quality welds, and (2) considered a "clean" welding process that does not involve fusion of the metal, as is the case with other traditional welding processes. The aim of this study was to determine whether the particles emitted in the nanorange during FSW of the most commonly used aluminum (Al) alloys, AA 5083 and AA 6082, originated from the Al alloy itself due to friction of the welding tool against the item being welded. Another goal was to measure the alveolar deposited surface area of particles emitted during FSW of these Al alloys. Nanoparticle dimensions were predominantly in the 40- to 70-nm range. This study demonstrated that microparticles were also emitted during FSW, but due to tool wear. However, the biological relevance and toxic manifestations of these particles remain to be determined.
Depth of manual dismantling analysis: A cost–benefit approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Achillas, Ch., E-mail: c.achillas@ihu.edu.gr; Aidonis, D.; Vlachokostas, Ch.
Highlights: ► A mathematical modeling tool for OEMs. ► The tool can be used by OEMs, recyclers of electr(on)ic equipment or WEEE management systems' regulators. ► The tool makes use of cost–benefit analysis in order to determine the optimal depth of product disassembly. ► The reusable materials and the quantity of metals and plastics recycled can be quantified in an easy-to-comprehend manner. - Abstract: This paper presents a decision support tool for manufacturers and recyclers towards end-of-life strategies for waste electrical and electronic equipment. A mathematical formulation based on the cost–benefit analysis concept is herein analytically described in order to determine the parts and/or components of an obsolete product that should be either non-destructively recovered for reuse or be recycled. The framework optimally determines the depth of disassembly for a given product, taking into account economic considerations. On this basis, it embeds all relevant cost elements to be included in the decision-making process, such as recovered materials and (depreciated) parts/components, labor costs, energy consumption, equipment depreciation, quality control and warehousing. This tool can be part of the strategic decision-making process in order to maximize profitability or minimize end-of-life management costs. A case study to demonstrate the model's applicability is presented for a typical electronic product in terms of structure and material composition. Taking into account the market values of the pilot product's components, the manual disassembly is proven profitable, with the marginal revenues from recovered reusable materials estimated at 2.93–23.06 €, depending on the level of disassembly.
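The decision logic at the heart of such a model can be illustrated with a toy greedy rule: keep disassembling while the marginal revenue of the parts freed at a level exceeds the marginal cost of reaching it. The levels, costs, and revenues below are invented; a real formulation like the one above would optimize over the full disassembly structure rather than stopping at the first unprofitable level.

```python
# Toy cost-benefit sketch of disassembly depth (illustrative numbers).
levels = [
    # (level, labor+energy cost EUR, revenue of parts/materials freed EUR)
    (1, 0.80, 3.10),
    (2, 1.20, 2.40),
    (3, 1.50, 1.10),   # no longer pays off
    (4, 2.00, 0.60),
]

depth, profit = 0, 0.0
for level, cost, revenue in levels:
    if revenue - cost <= 0:        # stop at the first unprofitable level
        break
    depth, profit = level, profit + (revenue - cost)

print(f"optimal depth = {depth}, net revenue = {profit:.2f} EUR")
```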
Video analysis of projectile motion using tablet computers as experimental tools
NASA Astrophysics Data System (ADS)
Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.
2014-01-01
Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and g in order to explore the underlying laws of motion. This experiment can easily be performed by students themselves, providing more autonomy in their problem-solving processes than traditional learning approaches. We believe that this autonomy and the authenticity of the experimental tool both foster their motivation.
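The analysis step amounts to a quadratic fit of the vertical position against time; gravity follows from the quadratic coefficient. A short sketch, with fabricated sample points standing in for the tracked video data:

```python
# Determine g from a parabolic fit of y(t); data are fabricated.
import numpy as np

t = np.linspace(0.0, 0.6, 13)                    # frame times, s
y = 0.9 + 3.0*t - 0.5*9.81*t**2                  # ideal vertical trajectory
y += np.random.default_rng(1).normal(0, 0.005, t.size)  # tracking noise

a, b, c = np.polyfit(t, y, 2)                    # y = a t^2 + b t + c
print(f"g = {-2*a:.2f} m/s^2, v0 = {b:.2f} m/s, y0 = {c:.2f} m")
```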
Linking evidence to action on social determinants of health using Urban HEART in the Americas.
Prasad, Amit; Groot, Ana Maria Mahecha; Monteiro, Teofilo; Murphy, Kelly; O'Campo, Patricia; Broide, Emilia Estivalet; Kano, Megumi
2013-12-01
To evaluate the experience of select cities in the Americas using the Urban Health Equity Assessment and Response Tool (Urban HEART) launched by the World Health Organization in 2010 and to determine its utility in supporting government efforts to improve health equity using the social determinants of health (SDH) approach. The Urban HEART experience was evaluated in four cities from 2010-2013: Guarulhos (Brazil), Toronto (Canada), and Bogotá and Medellín (Colombia). Reports were submitted by Urban HEART teams in each city and supplemented by first-hand accounts of key informants. The analysis considered each city's networks and the resources it used to implement Urban HEART; the process by which each city identified equity gaps and prioritized interventions; and finally, the facilitators and barriers encountered, along with next steps. In three cities, local governments spearheaded the process, while in the fourth (Toronto), academia initiated and led the process. All cities used Urban HEART as a platform to engage multiple stakeholders. Urban HEART's Matrix and Monitor were used to identify equity gaps within cities. While Bogotá and Medellín prioritized among existing interventions, Guarulhos adopted new interventions focused on deprived districts. Actions were taken on intermediate determinants, e.g., health systems access, and structural SDH, e.g., unemployment and human rights. Urban HEART provides local governments with a simple and systematic method for assessing and responding to health inequity. Through the SDH approach, the tool has provided a platform for intersectoral action and community involvement. While some areas of guidance could be strengthened, Urban HEART is a useful tool for directing local action on health inequities, and should be scaled up within the Region of the Americas, building upon current experience.
Strategic thinking for radiology.
Schilling, R B
1997-08-01
We have now analyzed the use and benefits of four Strategic Thinking Tools for Radiology: the Vision Statement, the High Five, the Two-by-Two, and Real-Win-Worth. Additional tools will be provided during the tutorial. The tools provided above should be considered as examples. They all contain the 10 benefits outlined earlier to varying degrees. It is extremely important that the tools be used in a manner consistent with the Vision Statement of the organization. The specific situation, the effectiveness of the team, and the experience developed with the tools over time will determine the true benefits of the process. It has also been shown that with active use of the types of tools provided above, teams have learned to modify the tools for increased effectiveness and have created additional tools for specific purposes. Once individuals in the organization become committed to improving communication and to using tools/frameworks for solving problems as a team, effectiveness becomes boundless.
Numerical modelling of tool wear in turning with cemented carbide cutting tools
NASA Astrophysics Data System (ADS)
Franco, P.; Estrems, M.; Faura, F.
2007-04-01
A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on the cutting tool surface in turning processes due to the wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along the cutting tool surface can be analysed, and the worn surface shape during workpiece machining can be determined. The proposed model analyses the gradual degradation of the cutting tool during the turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained to describe the material loss on the cutting tool surface, and the effects of the distinct wear mechanisms on surface shape can be studied. The parameters required for the tool wear model are obtained from the bibliography and from experimental observation for AISI 4340 steel turning with WC-Co cutting tools.
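The incremental character of such a model can be conveyed by a crude time-integration of a combined wear rate. The rate law, constants, and the wear-temperature coupling below are placeholders for illustration, not the paper's formulation.

```python
# Schematic integration of flank wear VB over cutting time.
import numpy as np

def wear_rate(v_c, T):
    """Combined wear rate (mm/min): an abrasion term proportional to
    cutting speed plus a thermally activated adhesion term (made up)."""
    k_abr, k_adh, Q = 2e-5, 0.5, 4000.0
    return k_abr * v_c + k_adh * np.exp(-Q / T)

vb, dt, t_end = 0.0, 0.1, 30.0          # mm, min, min
for _ in np.arange(0.0, t_end, dt):
    T = 900.0 + 150.0 * vb              # crude coupling: wear raises temperature (K)
    vb += wear_rate(v_c=180.0, T=T) * dt

print(f"VB after {t_end:.0f} min: {vb:.3f} mm")
```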
Fraser, Kirk A.; St-Georges, Lyne; Kiss, Laszlo I.
2014-01-01
Recognition of the friction stir welding process is growing in the aeronautical and aerospace industries. To make the process more available to the structural fabrication industry (buildings and bridges), it is desirable to be able to model the process and so determine the highest speed of advance that will not cause unwanted welding defects. A numerical solution to the transient two-dimensional heat diffusion equation for the friction stir welding process is presented. A non-linear heat generation term based on an arbitrary piecewise linear model of friction as a function of temperature is used. The solution is used to solve for the temperature distribution in the Al 6061-T6 work pieces. The finite difference solution of the non-linear problem is used to perform a Monte-Carlo simulation (MCS). A polynomial response surface (maximum welding temperature as a function of advancing and rotational speed) is constructed from the MCS results. The response surface is used to determine the optimum tool speed of advance and rotational speed. The exterior penalty method is used to find the highest speed of advance and the associated rotational speed of the tool for the FSW process considered. We show that good agreement with experimental optimization work is possible with this simplified model. Using our approach an optimal weld pitch of 0.52 mm/rev is obtained for 3.18 mm thick AA6061-T6 plate. Our method provides an estimate of the optimal welding parameters in less than 30 min of calculation time. PMID:28788627
Fraser, Kirk A; St-Georges, Lyne; Kiss, Laszlo I
2014-04-30
Recognition of the friction stir welding process is growing in the aeronautical and aerospace industries. To make the process more available to the structural fabrication industry (buildings and bridges), it is desirable to be able to model the process and so determine the highest speed of advance that will not cause unwanted welding defects. A numerical solution to the transient two-dimensional heat diffusion equation for the friction stir welding process is presented. A non-linear heat generation term based on an arbitrary piecewise linear model of friction as a function of temperature is used. The solution is used to solve for the temperature distribution in the Al 6061-T6 work pieces. The finite difference solution of the non-linear problem is used to perform a Monte-Carlo simulation (MCS). A polynomial response surface (maximum welding temperature as a function of advancing and rotational speed) is constructed from the MCS results. The response surface is used to determine the optimum tool speed of advance and rotational speed. The exterior penalty method is used to find the highest speed of advance and the associated rotational speed of the tool for the FSW process considered. We show that good agreement with experimental optimization work is possible with this simplified model. Using our approach an optimal weld pitch of 0.52 mm/rev is obtained for 3.18 mm thick AA6061-T6 plate. Our method provides an estimate of the optimal welding parameters in less than 30 min of calculation time.
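The chain of steps, sampling a thermal model, fitting a response surface, then applying an exterior penalty, can be compressed into a short script. Here the "thermal model" is a made-up algebraic stand-in for the finite difference solver, and the temperature window, parameter ranges, and penalty weight are all illustrative assumptions.

```python
# Condensed sketch: Monte-Carlo sampling -> response surface -> exterior penalty.
import numpy as np

rng = np.random.default_rng(3)

def peak_temp(v, w):           # stand-in for the transient FD solution (degC)
    return 200.0 + 0.28 * w - 0.5 * v

# 1) Monte-Carlo sampling of advance speed v (mm/min) and rotation w (rpm)
v = rng.uniform(50, 600, 400)
w = rng.uniform(400, 1600, 400)
T = peak_temp(v, w) + rng.normal(0, 3, v.size)   # "measured" peak temps

# 2) Quadratic response surface T(v, w) by least squares
A = np.column_stack([np.ones_like(v), v, w, v*w, v**2, w**2])
coef, *_ = np.linalg.lstsq(A, T, rcond=None)
surf = lambda v, w: np.dot(coef, [1, v, w, v*w, v**2, w**2])

# 3) Exterior penalty: maximize v while keeping 420 <= T <= 460 (assumed window)
def cost(x, mu=1e3):
    v, w = x
    Ts = surf(v, w)
    pen = max(0.0, 420 - Ts)**2 + max(0.0, Ts - 460)**2
    return -v + mu * pen

best = min((cost([vi, wi]), vi, wi)
           for vi in np.linspace(50, 600, 111)
           for wi in np.linspace(400, 1600, 121))
print(f"v* = {best[1]:.0f} mm/min at {best[2]:.0f} rpm")
```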
NASA Astrophysics Data System (ADS)
Chan-Amaya, Alejandro; Anaya-Pérez, María Elena; Benítez-Baltazar, Víctor Hugo
2017-08-01
Companies are constantly looking for improvements in productivity to increase their competitiveness, and the use of automation technologies is a tool that has proven effective in achieving this. Some companies are not familiar with the process of acquiring automation technologies; they therefore abstain from investment and miss the opportunity to take advantage of it. The present document proposes a methodology for determining the level of automation appropriate to a production process, and thus for minimizing automation costs while improving production, taking the ergonomics factor into consideration.
Implementation of a Distributed Object-Oriented Database Management System
1989-03-01
...and heuristic algorithms. A method for determining unit allocation by splitting relations in the conceptual schema based on queries and updates is...level frameworks can provide to the user the appearance of many tools being closely integrated. In particular, the KBSA tools use many high level...development process should begin first with conceptual design of the system. Approximately one month should be used to decide how the new projects
Developing the Scale of Teacher Self-Efficacy in Teaching Process
ERIC Educational Resources Information Center
Korkmaz, Fahrettin; Unsal, Serkan
2016-01-01
The purpose of this study is to develop a reliable and valid measurement tool which will reveal teachers' self-competence in the education process. Participants of the study are 300 teachers working at state primary schools in the province of Gaziantep. Results of the exploratory factor analysis administered to the scale in order to determine its…
A design aid for determining width of filter strips
M.G. Dosskey; M.J. Helmers; D.E. Eisenhauer
2008-01-01
Watershed planners need a tool for determining the width of filter strips that is accurate enough for developing cost-effective site designs and easy enough to use for making quick determinations on a large number and variety of sites. This study employed the process-based Vegetative Filter Strip Model to evaluate the relationship between filter strip width and trapping...
Hand-rearing and sex determination tool for the Taveta golden weaver (Ploceus castaneiceps).
Breeding, Shawnlei; Ferrie, Gina M; Schutz, Paul; Leighty, Katherine A; Plassé, Chelle
2012-01-01
Improvements in the ability to hand-rear birds in captivity have aided zoological institutions in the sustainable management of these species and have provided opportunities to examine their physical growth under varying conditions. Monitoring the weight gain and development of chicks is an important aspect of developing a hand-rearing protocol. In this paper we provide the institutional history of a colonial passerine species, the Taveta golden weaver, at Disney's Animal Kingdom®, in order to demonstrate the methods of establishing a successful breeding program that largely incorporates hand-rearing in the management of the population. We also tested whether we could accurately predict the sex of chicks using weights collected on Day 14 of the hand-rearing process. Using this tool, we were able to correctly determine sex before fledging in more than 83% of chicks. Early sex determination is important in captive species for genetic management and husbandry purposes. While genetic sexing can be expensive, we found that using growth curves to determine sex can be a reliable and cost-effective tool for population management of a colonial passerine. © 2012 Wiley Periodicals, Inc.
Tool wear modeling using abductive networks
NASA Astrophysics Data System (ADS)
Masory, Oren
1992-09-01
A tool wear model based on abductive networks, which consist of a network of 'polynomial' nodes, is described. The model relates the cutting parameters, the components of the cutting force, and the machining time to flank wear, so that real-time measurements of the cutting force can be used to monitor the machining process. The model is obtained by a training process in which the connectivity between the network's nodes and the polynomial coefficients of each node are determined by optimizing a performance criterion. Actual wear measurements of coated and uncoated carbide inserts were used for training and evaluating the established model.
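A single node of such a network is just a low-order polynomial fitted by least squares; the network then layers and prunes many of them. The sketch below shows one two-input quadratic node (GMDH-style) trained on synthetic stand-ins for force and machining-time data.

```python
# One "polynomial node" of an abductive (GMDH-type) network.
import numpy as np

def fit_node(x1, x2, y):
    """Fit y = w0 + w1*x1 + w2*x2 + w3*x1*x2 + w4*x1^2 + w5*x2^2."""
    A = np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w

rng = np.random.default_rng(7)
feed_force = rng.uniform(100, 400, 50)          # N (synthetic)
cut_time = rng.uniform(0, 20, 50)               # min (synthetic)
flank_wear = 0.01*cut_time + 2e-4*feed_force + rng.normal(0, 0.01, 50)

w = fit_node(feed_force, cut_time, flank_wear)
print("node coefficients:", np.round(w, 5))
```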
A senior manufacturing laboratory for determining injection molding process capability
NASA Technical Reports Server (NTRS)
Wickman, Jerry L.; Plocinski, David
1992-01-01
The following is a laboratory experiment designed to further understanding of materials science. This subject material is directed at an upper level undergraduate/graduate student in an Engineering or Engineering Technology program. It is assumed that the student has a thorough understanding of the process and quality control. The format of this laboratory does not follow that which is normally recommended because of the nature of process capability and that of the injection molding equipment and tooling. This laboratory is instead developed to be used as a point of departure for determining process capability for any process in either a quality control laboratory or a manufacturing environment where control charts, process capability, and experimental or product design are considered important topics.
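The quantity at the center of such a laboratory, process capability, is computed from a sample of part measurements against the specification limits. A minimal sketch with invented data:

```python
# Process capability indices Cp and Cpk from sample measurements.
import numpy as np

measurements = np.array([25.02, 24.98, 25.05, 25.01, 24.97,
                         25.03, 25.00, 24.99, 25.04, 25.02])  # mm
LSL, USL = 24.90, 25.10                                       # spec limits, mm

mu, sigma = measurements.mean(), measurements.std(ddof=1)
cp = (USL - LSL) / (6 * sigma)
cpk = min(USL - mu, mu - LSL) / (3 * sigma)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")   # >= 1.33 is a common criterion
```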
NASA Astrophysics Data System (ADS)
Chernov, Ya. B.; Filatov, E. S.
2017-08-01
The kinetics of thermal diffusion boriding in a melt based on calcium chloride with a boron oxide additive is studied using reversed current. The main temperature, concentration, and current parameters of the process are determined. The phase composition of the coating is determined by a metallographic method.
Evaluation of mixed hardwood studs manufactured by the Saw-Dry-Rip (SDR) process
R. R. Maeglin; R. S. Boone
1985-01-01
This paper describes increment cores (a useful tool in forestry and wood technology) and their uses which include age determination, growth increment, specific gravity determination, fiber length measurements, fibril angle measurements, cell measurements, and pathological investigations. Also described is the use and care of the increment borer which is essential in...
Life cycle impacts of manufacturing redwood decking in Northern California
Richard D. Bergman; Elaine Oneil; Ivan L. Eastin; Han-Sup Han
2014-01-01
Awareness of the environmental footprint of building construction and use has led to increasing interest in green building. Defining a green building is an evolving process with life cycle inventory and life cycle impact assessment (LCIA) emerging as key tools in that evolution and definition process. This study used LCIA to determine the environmental footprint...
ERIC Educational Resources Information Center
Abramov, Alexsandr P.; Chuikov, Oleg E.; Gavrikov, Fedor A.; Ludwig, Sergey D.
2017-01-01
The article reveals the essence of the sociocultural approach as a universal tool that allows the process of modernization of cadet education in modern Russia to be considered in the complex of conditions and factors determining it. The basic mechanisms of functioning of the cadet education system are the processes that form the equilibrium diad…
Hegazy, Maha A; Lotfy, Hayam M; Mowaka, Shereen; Mohamed, Ekram Hany
2016-07-05
Wavelets have been adapted for a vast number of signal-processing applications due to the amount of information that can be extracted from a signal. In this work, a comparative study was conducted on the efficiency of the continuous wavelet transform (CWT) as a signal-processing tool in univariate regression and as a pre-processing tool in multivariate analysis using partial least squares (CWT-PLS). These were applied to complex spectral signals of ternary and quaternary mixtures. The CWT-PLS method succeeded in the simultaneous determination of a quaternary mixture of drotaverine (DRO), caffeine (CAF), paracetamol (PAR) and p-aminophenol (PAP, the major impurity of paracetamol). The univariate CWT failed to simultaneously determine the quaternary mixture components; it was able to determine only PAR and PAP, and the ternary mixtures of DRO, CAF, and PAR and of CAF, PAR, and PAP. During the CWT calculations, different wavelet families were tested. The univariate CWT method was validated according to the ICH guidelines. For the development of the CWT-PLS model, a calibration set was prepared by means of an orthogonal experimental design, and its absorption spectra were recorded and processed by CWT. The CWT-PLS model was constructed by regression between the wavelet coefficients and the concentration matrices, and validation was performed by both cross-validation and external validation sets. Both methods were successfully applied for the determination of the studied drugs in pharmaceutical formulations. Copyright © 2016 Elsevier B.V. All rights reserved.
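The CWT-PLS idea, wavelet coefficients as the regression block, is compact to express. The sketch below uses PyWavelets and scikit-learn on synthetic spectra; the wavelet, scale, and number of latent variables are arbitrary choices for illustration, not the settings of the study.

```python
# CWT coefficients as the input block for a PLS2 regression.
import numpy as np
import pywt
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
spectra = rng.normal(size=(40, 300))    # 40 mixtures x 300 wavelengths (synthetic)
conc = rng.normal(size=(40, 4))         # DRO, CAF, PAR, PAP (placeholder values)

def cwt_features(spectrum, scale=20, wavelet="mexh"):
    """Continuous wavelet transform at a single scale, flattened to a row."""
    coefs, _ = pywt.cwt(spectrum, scales=[scale], wavelet=wavelet)
    return coefs.ravel()

X = np.array([cwt_features(s) for s in spectra])
model = PLSRegression(n_components=6).fit(X, conc)
print(model.predict(X[:2]))             # predicted concentrations, first 2 samples
```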
ERIC Educational Resources Information Center
Rosa, Victor M.
2013-01-01
Purpose: The purpose of this study was to determine the extent to which California public high school principals perceive the WASC Self-Study Process as a valuable tool for bringing about school improvement. The study specifically examines the principals' perceptions of five components within the Self-Study Process: (1) The creation of the…
Wang, Lu; Zeng, Shanshan; Chen, Teng; Qu, Haibin
2014-03-01
A promising process analytical technology (PAT) tool has been introduced for batch process monitoring. Direct analysis in real time mass spectrometry (DART-MS), a means of rapid fingerprint analysis, was applied to a percolation process with multi-constituent substances for an anti-cancer botanical preparation. Fifteen batches were carried out, including ten normal operations and five abnormal batches with artificial variations. The obtained multivariate data were analyzed by a multi-way partial least squares (MPLS) model. Control trajectories were derived from eight normal batches, and the qualification was tested by R(2) and Q(2). The accuracy and diagnostic capability of the batch model were then validated with the remaining batches. Assisted by high performance liquid chromatography (HPLC) determination, process faults were explained by the corresponding variable contributions. Furthermore, a batch-level model was developed to compare and assess the model performance. The present study has demonstrated that DART-MS is very promising for process monitoring in botanical manufacturing. Compared with general PAT tools, DART-MS offers particular insight into the effective constituents and can potentially be used to improve the batch quality and process consistency of samples in complex matrices. Copyright © 2014 Elsevier B.V. All rights reserved.
Reducing the stair step effect of layer manufactured surfaces by ball burnishing
NASA Astrophysics Data System (ADS)
Hiegemann, Lars; Agarwal, Chiranshu; Weddeling, Christian; Tekkaya, A. Erman
2016-10-01
Layer technology enables fast and flexible additive manufacturing of forming tools. A disadvantage of this technology is the formation of stair steps in the region of tool radii. Within this work, a new method to smooth these stair steps by ball burnishing is introduced. This includes studies on the general feasibility of the process and the determination of the influence of the rolling parameters. The investigations are carried out experimentally and numerically. Ultimately, the gained knowledge is applied to finish a deep drawing tool manufactured by layer technology.
ERIC Educational Resources Information Center
Schmidbauer, Hollace J.
2010-01-01
In the late 1990s, teachers in five pilot districts in Ohio were trained during the Baldrige in Education Initiative (BiE IN). Training included Baldrige's theory, quality process and quality tools. The study was a follow-up to determine the effect of the use of the Ohio Baldrige Initiative training in the pilot districts (and other early…
Van Meer, Ryan; Hohenadel, Karin; Fitzgerald-Husek, Alanna; Warshawsky, Bryna; Sider, Doug; Schwartz, Brian; Nelder, Mark P
To determine the Ontario-specific risk of local and travel-related Zika virus transmission in the context of a public health emergency of international concern, Public Health Ontario (PHO) completed a rapid risk assessment (RRA) on January 29, 2016, using a newly developed RRA guidance tool. The RRA concluded that risk of local mosquito-borne transmission was low, with a high risk of imported cases through travel. The RRA was updated 3 times based on predetermined triggers. An independent evaluation assessed both the application of the RRA guidance tool (process evaluation) and the usefulness of the RRA (outcome evaluation). We conducted face-to-face, semi-structured interviews with 7 individuals who participated in the creation or review of the Zika virus RRA and 4 end-users at PHO and the Ministry of Health and Long-Term Care. An inductive thematic analysis of responses was undertaken, whereby themes were directly informed by the data. The process evaluation determined that most steps outlined in the RRA guidance tool were adhered to, including forming a cross-functional writing team, clarifying the scope and describing context, completing the RRA summary report, and updating the RRA based on predefined triggers. The outcome evaluation found that end-users judged the Zika virus RRA as evidence-informed, useful, consistent, and timely. The evaluation established that the locally tailored guidance tool, adapted from national and international approaches to RRAs, facilitated a systematic, evidence-informed, and timely formal RRA process at PHO for the Zika virus RRA, which met the needs of end-users. Based on the evaluation, PHO will modify future RRAs by incorporating some flexibility into the literature review process to support timeliness of the RRA, explicitly describing the limitations of studies used to inform the RRA, and refining risk algorithms to better suit emerging infectious disease threats. It is anticipated that these refinements will improve upon the timely assessment of novel or reemerging infectious diseases.
NASA Technical Reports Server (NTRS)
Tahmasebi, Farhad; Pearce, Robert
2016-01-01
Description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. The strategic planning process for determining the community Outcomes is also briefly described. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples are also presented.
Rapid tooling for functional prototyping of metal mold processes. CRADA final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zacharia, T.; Ludtka, G.M.; Bjerke, M.A.
1997-12-01
The overall scope of this endeavor was to develop an integrated computer system, running on a network of heterogeneous computers, that would allow the rapid development of tool designs, and then use process models to determine whether the initial tooling would have characteristics which produce the prototype parts. The major thrust of this program for ORNL was the definition of the requirements for the development of the integrated die design system, with the functional purpose of linking part design, tool design, and component fabrication through a seamless software environment. The principal product would be a system control program that would coordinate the various application programs and implement the data transfer so that any networked workstation would be useable. The overall system control architecture was required to easily facilitate any changes, upgrades, or replacements of the model from either the manufacturing end or the design criteria standpoint. The initial design of such a program is described in the section labeled "Control Program Design". A critical aspect of this research was the design of the system flow chart showing the exact system components and the data to be transferred. All of the major system components would have been configured to ensure data file compatibility and transferability across the Internet. The intent was to use commercially available packages to model the various manufacturing processes for creating the die and die inserts, in addition to modeling the processes for which these parts were to be used. In order to meet all of these requirements, investigative research was conducted to determine the system flow features and software components within the various organizations contributing to this project. This research is summarized.
Determination of optimal tool parameters for hot mandrel bending of pipe elbows
NASA Astrophysics Data System (ADS)
Tabakajew, Dmitri; Homberg, Werner
2018-05-01
Seamless pipe elbows are important components in mechanical, plant and apparatus engineering. Typically, they are produced by the so-called 'Hamburg process'. In this hot forming process, the initial pipes are subsequently pushed over an ox-horn-shaped bending mandrel. The geometric shape of the mandrel influences the diameter, bending radius and wall thickness distribution of the pipe elbow. This paper presents the numerical simulation model of the hot mandrel bending process created to ensure that the optimum mandrel geometry can be determined at an early stage. A fundamental analysis was conducted to determine the influence of significant parameters on the pipe elbow quality. The chosen methods and approach as well as the corresponding results are described in this paper.
Automated clustering-based workload characterization
NASA Technical Reports Server (NTRS)
Pentakalos, Odysseas I.; Menasce, Daniel A.; Yesha, Yelena
1996-01-01
The demands placed on the mass storage systems at various federal agencies and national laboratories are continuously increasing in intensity. This forces system managers to constantly monitor the system, evaluate the demand placed on it, and tune it appropriately using either heuristics based on experience or analytic models. Performance models require an accurate workload characterization. This can be a laborious and time consuming process. It became evident from our experience that a tool is necessary to automate the workload characterization process. This paper presents the design and discusses the implementation of a tool for workload characterization of mass storage systems. The main features of the tool discussed here are: (1) Automatic support for peak-period determination. Histograms of system activity are generated and presented to the user for peak-period determination; (2) Automatic clustering analysis. The data collected from the mass storage system logs is clustered using clustering algorithms and tightness measures to limit the number of generated clusters; (3) Reporting of varied file statistics. The tool computes several statistics on file sizes such as average, standard deviation, minimum, maximum, frequency, as well as average transfer time. These statistics are given on a per cluster basis; (4) Portability. The tool can easily be used to characterize the workload in mass storage systems of different vendors. The user needs to specify, through a simple log description language, how a specific log should be interpreted. The rest of this paper is organized as follows. Section two presents basic concepts in workload characterization as they apply to mass storage systems. Section three describes clustering algorithms and tightness measures. The following section presents the architecture of the tool. Section five presents some results of workload characterization using the tool. Finally, section six presents some concluding remarks.
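A short sketch of the clustering step under stated assumptions: k-means over two log-derived features, with the mean within-cluster distance standing in for the tightness measure that caps the number of generated clusters. Feature choices and thresholds are illustrative, not the tool's actual ones.

```python
# Illustrative clustering of mass storage log records: grow k until the
# tightness measure (mean within-cluster distance) stops improving.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# columns: log10(file size), transfer time (s) -- parsed from a site log
requests = np.vstack([
    rng.normal([3.0, 0.5], 0.2, size=(200, 2)),   # small, fast transfers
    rng.normal([6.0, 20.0], 1.0, size=(50, 2)),   # large, slow transfers
])

def tightness(X, labels, centers):
    return np.mean(np.linalg.norm(X - centers[labels], axis=1))

best = None
for k in range(1, 8):                  # cap k to limit generated clusters
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(requests)
    t = tightness(requests, km.labels_, km.cluster_centers_)
    if best and t > 0.9 * best[1]:     # <10% improvement: stop growing k
        break
    best = (km, t)

km = best[0]
for c in range(km.n_clusters):         # per-cluster file statistics
    members = requests[km.labels_ == c]
    print(c, len(members), members.mean(axis=0), members.std(axis=0))
```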
Characterization of the interfacial heat transfer coefficient for hot stamping processes
NASA Astrophysics Data System (ADS)
Luan, Xi; Liu, Xiaochuan; Fang, Haomiao; Ji, Kang; El Fakir, Omer; Wang, LiLiang
2016-08-01
In hot stamping processes, the interfacial heat transfer coefficient (IHTC) between the forming tools and hot blank is an essential parameter which determines the quenching rate of the process and hence the resulting material microstructure. The present work focuses on the characterization of the IHTC between an aluminium alloy 7075-T6 blank and two different die materials, cast iron (G3500) and H13 die steel, at various contact pressures. It was found that the IHTC between AA7075 and cast iron had values 78.6% higher than that obtained between AA7075 and H13 die steel. Die materials and contact pressures had pronounced effects on the IHTC, suggesting that the IHTC can be used to guide the selection of stamping tool materials and the precise control of processing parameters.
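IHTC values of this kind are typically backed out of measured temperature histories. The sketch below uses the simplest lumped-capacitance reading, in which the blank temperature relaxes exponentially toward the die temperature with rate hA/(mc_p); all numbers are assumed, and the authors' actual identification procedure may differ.

```python
# Minimal lumped-capacitance sketch for extracting an IHTC from a
# quenching curve. All values are illustrative placeholders.
import numpy as np

m, cp, A = 0.05, 960.0, 0.01     # blank mass (kg), specific heat (J/kg K), area (m^2)
T_die = 25.0                     # die temperature (C)
t = np.linspace(0.0, 2.0, 100)   # s
h_true = 4000.0                  # W/m^2 K, used only to synthesize the data
T = T_die + (450.0 - T_die) * np.exp(-h_true * A / (m * cp) * t)

# log-linear fit: ln(T - T_die) = ln(T0 - T_die) - (hA/(m cp)) t
slope = np.polyfit(t, np.log(T - T_die), 1)[0]
h_est = -slope * m * cp / A
print(f"estimated IHTC = {h_est:.0f} W/m^2 K")
```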
Relationship between cattle temperament, as determined by exit velocity, and carcass merit in beef cattle
USDA-ARS?s Scientific Manuscript database
The objective of this trial was to use cattle temperament, as determined by exit velocity only, as a means to evaluate the impact of temperament on carcass merit and the possible utilization of exit velocity alone as a sorting tool within the feedlot. At the time of processing, exit velocity and bod...
Buying a Car: Using On-Line Tools. Technology Update.
ERIC Educational Resources Information Center
McCoy, Kimberly
This lesson plan was created to assist learners in adult basic and literacy education programs with the car-buying process. The goal for the lesson is to "effectively use the Internet to research necessary items before purchasing a car." These nine learning objectives are set: (1) determine what kind of car is needed; (2) determine how…
NASA Astrophysics Data System (ADS)
Hoverman, Suzanne; Ayre, Margaret
2012-12-01
Indigenous land owners of the Tiwi Islands, Northern Territory, Australia have begun the first formal freshwater allocation planning process in Australia entirely within Indigenous lands and waterways. The process is managed by the Northern Territory government agency responsible for water planning, the Department of Natural Resources, Environment, The Arts and Sport, in partnership with the Tiwi Land Council, the principal representative body for Tiwi Islanders on matters of land and water management and governance. Participatory planning methods ('tools') were developed to facilitate community participation in Tiwi water planning. The tools, selected for their potential to generate involvement in the planning process, needed both to incorporate Indigenous knowledge of water use and management and to raise awareness in the Indigenous community of Western science and water resources management. In consultation with the water planner and Tiwi Land Council officers, the researchers selected four main tools to develop, trial and evaluate. Results demonstrate that the tools provided mechanisms which acknowledge traditional management systems, improve community engagement, and build confidence in the water planning process. The researchers found that participatory planning approaches supported Tiwi natural resource management institutions both in determining appropriate institutional arrangements and in clarifying roles and responsibilities in the Islands' Water Management Strategy.
Automated infrasound signal detection algorithms implemented in MatSeis - Infra Tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, Darren
2004-07-01
MatSeis's infrasound analysis tool, Infra Tool, uses frequency slowness processing to deconstruct the array data into three outputs per processing step: correlation, azimuth and slowness. Until now, infrasound signal detection was accomplished manually by an experienced analyst trained to recognize patterns observed in the signal processing outputs. Our goal was to automate the process of infrasound signal detection. The critical aspect of infrasound signal detection is to identify consecutive processing steps where the azimuth is constant (flat) while the time-lag correlation of the windowed waveform is above the background value. These two statements describe the arrival of a correlated set of wavefronts at an array. The Hough Transform and Inverse Slope methods are used to determine the representative slope for a specified number of azimuth data points. The representative slope is then used in conjunction with the associated correlation value and azimuth data variance to determine if and when an infrasound signal was detected. A format for an infrasound signal detection output file is also proposed. The detection output file lists the processed array element names, followed by detection characteristics for each method. Each detection is supplied with a listing of frequency slowness processing characteristics: human time (YYYY/MM/DD HH:MM:SS.SSS), epochal time, correlation, fstat, azimuth (deg) and trace velocity (km/s). As an example, a ground truth event was processed using the four-element DLIAR infrasound array located in New Mexico. The event is known as the Watusi chemical explosion, which occurred on 2002/09/28 at 21:25:17 with an explosive yield of 38,000 lb TNT equivalent. Knowing the source and array location, the array-to-event distance was computed to be approximately 890 km. This test determined the station-to-event azimuth (281.8 and 282.1 degrees) to within 1.6 and 1.4 degrees for the Inverse Slope and Hough Transform detection algorithms, respectively, and the detection window closely correlated to the theoretical stratospheric arrival time. Further testing will be required to tune the detection threshold parameters for different types of infrasound events.
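The detection logic can be illustrated with a sketch of the inverse-slope idea: fit a line to azimuth estimates in a sliding window and declare a detection when the slope is near zero while the windowed correlation exceeds background. Window length and thresholds are assumptions, not the tuned values discussed above.

```python
# Sketch of "flat azimuth + elevated correlation" infrasound detection.
import numpy as np

def detect(times, azimuth, correlation,
           win=10, slope_tol=0.5, corr_floor=0.5):
    hits = []
    for i in range(len(times) - win):
        sl = np.polyfit(times[i:i+win], azimuth[i:i+win], 1)[0]  # deg/step
        if abs(sl) < slope_tol and correlation[i:i+win].mean() > corr_floor:
            hits.append((times[i], azimuth[i:i+win].mean()))
    return hits

t = np.arange(200, dtype=float)
az = np.random.uniform(0, 360, 200)     # background: scattered azimuths
corr = np.random.uniform(0.0, 0.3, 200) # background correlation
az[80:120], corr[80:120] = 282.0, 0.8   # simulated signal: flat azimuth
print(detect(t, az, corr)[:3])
```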
EMHP: an accurate automated hole masking algorithm for single-particle cryo-EM image processing.
Berndsen, Zachary; Bowman, Charles; Jang, Haerin; Ward, Andrew B
2017-12-01
The Electron Microscopy Hole Punch (EMHP) is a streamlined suite of tools for quick assessment, sorting and hole masking of electron micrographs. With recent advances in single-particle electron cryo-microscopy (cryo-EM) data processing allowing for the rapid determination of protein structures using a smaller computational footprint, we saw the need for a fast and simple tool for data pre-processing that could run independent of existing high-performance computing (HPC) infrastructures. EMHP provides a data preprocessing platform in a small package that requires minimal python dependencies to function. https://www.bitbucket.org/chazbot/emhp Apache 2.0 License. bowman@scripps.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
Investigation of fatigue strength of tool steels in sheet-bulk metal forming
NASA Astrophysics Data System (ADS)
Pilz, F.; Gröbel, D.; Merklein, M.
2018-05-01
To address trends towards the efficient production of complex functional components in forming technology, the process class of sheet-bulk metal forming (SBMF) can be applied. SBMF is characterized by the application of bulk forming operations on sheet metal, often in combination with sheet forming operations [1]. The combination of these conventional process classes leads to locally varying load conditions. The resulting load conditions cause high tool loads, which lead to reduced tool life, and an uncontrolled material flow. Several studies have shown that locally modified tool surfaces, so-called tailored surfaces, have the potential to control the material flow and thus to increase the die filling of functional elements [2]. The combination of these modified tool surfaces and high tool loads in SBMF is furthermore critical for the tool life and leads to fatigue. Tool fatigue is hardly predictable and, due to a lack of data [3], a challenge in tool design. Thus, it is necessary to provide such data for tool steels used in SBMF. The aim of this study is the investigation of the influence of tailored surfaces on the fatigue strength of the powder metallurgical tool steel ASP2023 (1.3344, AISI M3:2), which is typically used in cold forging applications, with a hardness of 60 HRC ± 1 HRC. The rotating bending test is chosen for this investigation. As tailored surfaces, a DLC coating and a surface manufactured by a high-feed milling process are chosen. As a reference, a polished surface typical of cold forging tools is used. Before the rotating bending test, the surface integrity is characterized by measuring topography and residual stresses. After testing, the determined values of the surface integrity are correlated with the fracture load cycles reached in order to derive functional relations. Based on the results gained, the investigated tailored surfaces are evaluated regarding their feasibility for modifying tool surfaces within SBMF.
ERIC Educational Resources Information Center
de Freitas Guilhermino Trindade, Daniela; Guimaraes, Cayley; Antunes, Diego Roberto; Garcia, Laura Sanchez; Lopes da Silva, Rafaella Aline; Fernandes, Sueli
2012-01-01
This study analysed the role of knowledge management (KM) tools used to cultivate a community of practice (CP) in its knowledge creation (KC), transfer, learning processes. The goal of such observations was to determine requirements that KM tools should address for the specific CP formed by Deaf and non-Deaf members of the CP. The CP studied is a…
Spray Cooling Processes for Space Applications
NASA Technical Reports Server (NTRS)
Kizito, John P.; VanderWal, Randy L.; Berger, Gordon; Tryggvason, Gretar
2004-01-01
The present paper reports ongoing work to develop numerical and modeling tools used to design efficient and effective spray cooling processes and to determine characteristic non-dimensional parametric dependence for practical fluids and conditions. In particular, we present data that will delineate conditions towards control of the impingement dynamics of droplets upon a heated substrate germane to practical situations.
NASA Astrophysics Data System (ADS)
Behrens, B.-A.; Bouguecha, A.; Bonk, C.; Matthias, T.
2017-10-01
Solid formed components are often used in areas where they are subjected to very high loads. For most solid components, locally divergent and sometimes contradictory requirements exist. Despite these contradictory requirements, almost exclusively monomaterials are used nowadays for the production of solid components. These components often reach their material-specific limits because of increasing demands on the products. Thus, a significant increase in product quality and profitability would result from combining different materials in order to create tailored properties. This topic is investigated in the Collaborative Research Center (CRC) 1153 "Tailored Forming" at the Leibniz Universität Hannover. The primary objective of the CRC 1153 is to develop and investigate new tailored manufacturing processes to produce reliable hybrid solid semi-finished components. In contrast to existing production processes for hybrid solid components, semi-finished workpieces in the CRC 1153 are joined before the forming phase. Thus, it will be possible to produce complex and highly stressable solid components made of different metals, which cannot yet be produced with currently used technologies. In this work, the material and friction characteristics are investigated and the forming tool for the production of hybrid bevel gears made of different steel alloys (C22 and 41Cr4) is designed by numerical simulations. For this purpose, flow curves of both materials are determined by means of upsetting tests at process-related forming temperatures and strain rates. The temperature range for the forming process of the semi-finished product is determined by comparing the respective flow curves regarding similar flow stresses. Furthermore, the friction between the tool and the joining materials is investigated by means of ring upsetting tests at a process-relevant temperature. Finally, a stress analysis of the forming tools is carried out.
Friction Stir Welding at MSFC: Kinematics
NASA Technical Reports Server (NTRS)
Nunes, A. C., Jr.
2001-01-01
In 1991 The Welding Institute of the United Kingdom patented the Friction Stir Welding (FSW) process. In FSW a rotating pin-tool is inserted into a weld seam and literally stirs the faying surfaces together as it moves up the seam. By April 2000 the American Welding Society International Welding and Fabricating Exposition featured several exhibits of commercial FSW processes and the 81st Annual Convention devoted a technical session to the process. The FSW process is of interest to Marshall Space Flight Center (MSFC) as a means of avoiding hot-cracking problems presented by the 2195 aluminum-lithium alloy, which is the primary constituent of the Lightweight Space Shuttle External Tank. The process has been under development at MSFC for External Tank applications since the early 1990's. Early development of the FSW process proceeded by cut-and-try empirical methods. A substantial and complex body of data resulted. A theoretical model was wanted to deal with the complexity and reduce the data to concepts serviceable for process diagnostics, optimization, parameter selection, etc. A first step in understanding the FSW process is to determine the kinematics, i.e., the flow field in the metal in the vicinity of the pin-tool. Given the kinematics, the dynamics, i.e., the forces, can be targeted. Given a completed model of the FSW process, attempts at rational design of tools and selection of process parameters can be made.
Ellis, Heidi J C; Nowling, Ronald J; Vyas, Jay; Martyn, Timothy O; Gryk, Michael R
2011-04-11
The CONNecticut Joint University Research (CONNJUR) team is a group of biochemical and software engineering researchers at multiple institutions. The vision of the team is to develop a comprehensive application that integrates a variety of existing analysis tools with workflow and data management to support the process of protein structure determination using Nuclear Magnetic Resonance (NMR). The use of multiple disparate tools and lack of data management, currently the norm in NMR data processing, provides strong motivation for such an integrated environment. This manuscript briefly describes the domain of NMR as used for protein structure determination and explains the formation of the CONNJUR team and its operation in developing the CONNJUR application. The manuscript also describes the evolution of the CONNJUR application through four prototypes and describes the challenges faced while developing the CONNJUR application and how those challenges were met.
Kozunov, Vladimir; Nikolaeva, Anastasia; Stroganova, Tatiana A
2017-01-01
The brain mechanisms that integrate the separate features of sensory input into a meaningful percept depend upon the prior experience of interaction with the object and differ between categories of objects. Recent studies using representational similarity analysis (RSA) have characterized either the spatial patterns of brain activity for different categories of objects or described how category structure in neuronal representations emerges in time, but never simultaneously. Here we applied a novel, region-based, multivariate pattern classification approach in combination with RSA to magnetoencephalography data to extract activity associated with qualitatively distinct processing stages of visual perception. We asked participants to name what they saw whilst viewing bitonal visual stimuli of two categories predominantly shaped by either value-dependent or sensorimotor experience, namely faces and tools, and meaningless images. We aimed to disambiguate the spatiotemporal patterns of brain activity between the meaningful categories and determine which differences in their processing were attributable to either perceptual categorization per se, or later-stage mentalizing-related processes. We extracted three stages of cortical activity corresponding to low-level processing, category-specific feature binding, and supra-categorical processing. All face-specific spatiotemporal patterns were associated with bilateral activation of ventral occipito-temporal areas during the feature binding stage at 140-170 ms. The tool-specific activity was found both within the categorization stage and in a later period not thought to be associated with binding processes. The tool-specific binding-related activity was detected within a 210-220 ms window and was localized to the intraparietal sulcus of the left hemisphere. Brain activity common to both meaningful categories started at 250 ms and included widely distributed assemblies within parietal, temporal, and prefrontal regions. Furthermore, we hypothesized and tested whether activity within face- and tool-specific binding-related patterns would demonstrate oppositely acting effects following procedural perceptual learning. We found that activity in the ventral, face-specific network increased following stimulus repetition. In contrast, tool processing in the dorsal network adapted by reducing its activity over the repetition period. Altogether, we have demonstrated that activity associated with visual processing of faces and tools during the categorization stage differs in processing timing, brain areas involved, and in its dynamics underlying stimulus learning.
A GUI-based Tool for Bridging the Gap between Models and Process-Oriented Studies
NASA Astrophysics Data System (ADS)
Kornfeld, A.; Van der Tol, C.; Berry, J. A.
2014-12-01
Models used for simulation of photosynthesis and transpiration by canopies of terrestrial plants typically have subroutines such as STOMATA.F90, PHOSIB.F90 or BIOCHEM.m that solve for photosynthesis and associated processes. Key parameters such as the Vmax for Rubisco and temperature response parameters are required by these subroutines. These are often taken from the literature or determined by separate analysis of gas exchange experiments. It is useful to note, however, that subroutines can be extracted and run as standalone models to simulate leaf responses collected in gas exchange experiments. Furthermore, there are excellent non-linear fitting tools that can be used to optimize the parameter values in these models to fit the observations. Ideally the Vmax fit in this way should be the same as that determined by a separate analysis, but it may not be because of interactions with other kinetic constants and the temperature dependence of these in the full subroutine. We submit that it is more useful to fit the complete model to the calibration experiments rather than to fit disaggregated constants separately. We designed a graphical user interface (GUI) based tool that uses gas exchange photosynthesis data to directly estimate model parameters in the SCOPE (Soil Canopy Observation, Photochemistry and Energy fluxes) model and, at the same time, allow researchers to change parameters interactively to visualize how variation in model parameters affects predicted outcomes such as photosynthetic rates, electron transport, and chlorophyll fluorescence. We have also ported some of this functionality to an Excel spreadsheet, which could be used as a teaching tool to help integrate process-oriented and model-oriented studies.
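A minimal sketch of the fitting idea, with a toy Michaelis-Menten A-Ci response standing in for the full photosynthesis subroutine; the model form and parameter names are placeholders, not SCOPE's.

```python
# Sketch of fitting model parameters directly to gas exchange data.
import numpy as np
from scipy.optimize import curve_fit

def assim(ci, vmax, km, rd):
    """Toy leaf model: Rubisco-limited assimilation minus respiration."""
    return vmax * ci / (km + ci) - rd

ci = np.linspace(50, 1000, 30)                 # intercellular CO2 (ppm)
obs = assim(ci, 60.0, 300.0, 1.5)              # synthetic "measurements"
obs += np.random.default_rng(2).normal(0, 0.5, ci.size)

popt, pcov = curve_fit(assim, ci, obs, p0=[40.0, 200.0, 1.0])
print("vmax, km, rd =", popt, "+/-", np.sqrt(np.diag(pcov)))
```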
Multiwavelength digital holography for polishing tool shape measurement
NASA Astrophysics Data System (ADS)
Lédl, Vít.; Psota, Pavel; Václavík, Jan; Doleček, Roman; Vojtíšek, Petr
2013-09-01
Classical mechano-chemical polishing is still a valuable technique, which gives unbeatable results for some types of optical surfaces. For example, optics for high power lasers requires minimized subsurface damage, very high cosmetic quality, and low mid-spatial-frequency error. One can hardly achieve this with the use of subaperture polishing. The shape of the polishing tool plays a crucial role in achieving the required form of the optical surface. Often the shape of the polishing tool or pad is not known precisely enough during the manufacturing process. The tool shape is usually premachined and later changes during the polishing procedure. An experienced worker can estimate the shape of the tool indirectly from the shape of the polished element, and can therefore achieve the required shape in a few reasonably long iterative steps. Therefore the lack of exact tool shape knowledge is tolerated. Sometimes, this indirect method is not feasible even if small parts are considered. Moreover, if processes on machines like planetary (continuous) polishers are considered, an incorrect shape of the polishing pad can extend the polishing times dramatically. Every iteration step takes hours. Even worse, the polished piece could be wasted if the pad has a poor shape. The ability to determine the tool shape would be very valuable in these types of lengthy processes. This was our primary motivation to develop a contactless measurement method for large diffusive surfaces and to demonstrate its usability. The proposed method is based on the application of multiwavelength digital holographic interferometry with phase shifting.
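The multiwavelength principle rests on the synthetic (beat) wavelength Λ = λ₁λ₂/|λ₁ − λ₂|, which turns the phase difference between two single-wavelength reconstructions into a measurement with a much longer unambiguous range, easing unwrapping on rough, diffusive tool surfaces. A sketch, with illustrative wavelengths:

```python
# Two-wavelength height retrieval sketch for digital holography.
import numpy as np

lam1, lam2 = 632.8e-9, 594.1e-9                  # m, illustrative pair
lam_synth = lam1 * lam2 / abs(lam1 - lam2)       # ~9.7 um beat wavelength

phi1 = np.random.uniform(-np.pi, np.pi, (4, 4))  # wrapped phase, lambda 1
phi2 = np.random.uniform(-np.pi, np.pi, (4, 4))  # wrapped phase, lambda 2

dphi = np.angle(np.exp(1j * (phi1 - phi2)))      # wrap difference to (-pi, pi]
height = dphi / (4 * np.pi) * lam_synth          # reflection geometry
print(f"synthetic wavelength = {lam_synth*1e6:.2f} um, max h = {height.max():.2e} m")
```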
Van de Velde, Stijn; Roshanov, Pavel; Kortteisto, Tiina; Kunnamo, Ilkka; Aertgeerts, Bert; Vandvik, Per Olav; Flottorp, Signe
2016-03-05
A computerised clinical decision support system (CCDSS) is a technology that uses patient-specific data to provide relevant medical knowledge at the point of care. It is considered to be an important quality improvement intervention, and the implementation of CCDSS is growing substantially. However, the significant investments do not consistently result in value for money due to content, context, system and implementation issues. The Guideline Implementation with Decision Support (GUIDES) project aims to improve the impact of CCDSS through optimised implementation based on high-quality evidence-based recommendations. To achieve this, we will develop tools that address the factors that determine successful CCDSS implementation. We will develop the GUIDES tools in four steps, using the methods and results of the Tailored Implementation for Chronic Diseases (TICD) project as a starting point: (1) a review of research evidence and frameworks on the determinants of implementing recommendations using CCDSS; (2) a synthesis of a comprehensive framework for the identified determinants; (3) the development of tools for use of the framework and (4) pilot testing the utility of the tools through the development of a tailored CCDSS intervention in Norway, Belgium and Finland. We selected the conservative management of knee osteoarthritis as a prototype condition for the pilot. During the process, the authors will collaborate with an international expert group to provide input and feedback on the tools. This project will provide guidance and tools on methods of identifying implementation determinants and selecting strategies to implement evidence-based recommendations through CCDSS. We will make the GUIDES tools available to CCDSS developers, implementers, researchers, funders, clinicians, managers, educators, and policymakers internationally. The tools and recommendations will be generic, which makes them scalable to a large spectrum of conditions. Ultimately, the better implementation of CCDSS may lead to better-informed decisions and improved care and patient outcomes for a wide range of conditions. PROSPERO, CRD42016033738.
An innovative framework for psychosocial assessment in complex mental capacity evaluations.
Newberry, Andrea M; Pachet, Arlin K
2008-08-01
This study describes an innovative tool developed by the Regional Capacity Assessment Team (RCAT) to assess unique psychosocial factors related to capacity evaluations. Capacity is a socio-legal construct entailing the ability to understand choices, appreciate consequences and follow through (or direct a surrogate) with chosen options. RCAT's targeted psychosocial assessment includes medico-legal factors, social history and supports, coping skills, religious/cultural factors and risk of abuse. RCAT completes the psychosocial assessment to determine whether a full capacity assessment is required (referral disposition) and to determine the impact of an adult's social functioning on their decision-making capacity (capacity determination). RCAT's psychosocial assessment protocol was developed after a comprehensive literature review of capacity assessment and incorporates recommended practices in geriatric social work and psychology. This study synthesises the pertinent literature, discusses cultural interviewing processes significant to capacity and caregiver assessment, and describes the tool itself. Suggestions for future research and appropriate implementation of this tool are provided.
Improving material removal determinacy based on the compensation of tool influence function
NASA Astrophysics Data System (ADS)
Zhong, Bo; Chen, Xian-hua; Deng, Wen-hui; Zhao, Shi-jie; Zheng, Nan
2018-03-01
In the process of computer-controlled optical surfacing (CCOS), the key to correcting the surface error of optical components is to ensure consistency between the simulated tool influence function and the actual tool influence function (TIF). The existing removal model usually adopts the fixed-point TIF to remove material along the planned path and velocity, and it assumes that the polishing process is linear and time-invariant. However, in the actual polishing process, the TIF is a function of the feed speed. In this paper, the relationship between the actual TIF and the feed speed (i.e. the compensation relationship between static removal and dynamic removal) is determined experimentally. Then, the existing removal model is modified based on the compensation relationship to improve the agreement between simulated and actual processing. Finally, surface error correction tests are carried out. The results show that the fitting degree of the simulated surface and the experimental surface is better than 88%, and the surface correction accuracy can be better than λ/10 (λ = 632.8 nm).
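A sketch of the compensation idea under stated assumptions: the static (fixed-point) TIF is scaled by a feed-speed-dependent ratio interpolated from calibration spots, then converted to removal per unit path length. The Gaussian TIF, the g(v) table and all numbers are invented for illustration.

```python
# Feed-speed-compensated TIF sketch for CCOS removal simulation.
import numpy as np

x = np.linspace(-5, 5, 201)                      # mm across the spot
tif_static = 0.8 * np.exp(-x**2 / 2.0)           # nm/s peak removal (assumed)

# calibration: measured dynamic/static removal ratio vs feed speed (mm/min)
v_cal = np.array([10.0, 50.0, 100.0, 200.0])
ratio = np.array([1.00, 0.93, 0.85, 0.74])       # assumed calibration data

def tif_dynamic(v):
    """Scale the static TIF by the interpolated feed-speed ratio g(v)."""
    return tif_static * np.interp(v, v_cal, ratio)

v = 120.0                                        # mm/min feed speed
removal = tif_dynamic(v) / (v / 60.0)            # nm per mm of path (dwell = 1/v)
print(f"peak removal at {v} mm/min: {removal.max():.3f} nm/mm")
```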
“Investigations on the machinability of Waspaloy under dry environment”
NASA Astrophysics Data System (ADS)
Deepu, J.; Kuppan, P.; SBalan, A. S.; Oyyaravelu, R.
2016-09-01
Nickel based superalloy Waspaloy is extensively used in gas turbine, aerospace and automobile industries because of its unique combination of properties like high strength at elevated temperatures, resistance to chemical degradation and excellent wear resistance in many hostile environments. It is considered one of the difficult-to-machine superalloys due to excessive tool wear and poor surface finish. The present paper is an attempt to remove cutting fluids from the turning process of Waspaloy and to make the process environmentally safe. For this purpose, the effect of machining parameters such as cutting speed and feed rate on the cutting force, cutting temperature, surface finish and tool wear was investigated. The tool coating acted as a thermal barrier; consequently, the strength, tool wear resistance and tool life increased significantly. Response Surface Methodology (RSM) has been used for developing and analyzing a mathematical model which describes the relationship between machining parameters and output variables. Subsequently, ANOVA was used to check the adequacy of the regression model as well as of each machining variable. The optimal cutting parameters were determined based on multi-response optimization by the composite desirability approach in order to minimize cutting force, average surface roughness and maximum flank wear. The results obtained from the experiments show that, when machining Waspaloy using coated carbide tools within specific ranges of parameters, cutting fluid can be completely removed from the machining process.
Mathematical tool from corn stover TGA to determine its composition.
Freda, Cesare; Zimbardi, Francesco; Nanna, Francesco; Viola, Egidio
2012-08-01
Corn stover was treated by a steam explosion process at four different temperatures. A fraction of the four exploded matters was extracted by water. The eight samples (four from steam explosion and four from water extraction of exploded matters) were analysed by a wet chemical method to quantify the amount of cellulose, hemicellulose and lignin. Thermogravimetric analysis in an air atmosphere was executed on the eight samples. A mathematical tool was developed, using the TGA data, to determine the composition of corn stover in terms of cellulose, hemicellulose and lignin. It expresses the biomass degradation temperature as a multiple linear function of the cellulose, hemicellulose and lignin content of the biomass, with interactive terms. The mathematical tool predicted cellulose, hemicellulose and lignin contents with average absolute errors of 1.69, 5.59 and 0.74%, respectively, compared to the wet chemical method.
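A sketch of the regression idea: the degradation temperature is written as a multiple linear function of the cellulose (C), hemicellulose (H) and lignin (L) fractions with interactive terms, calibrated on samples of known wet-chemistry composition. Coefficients and data below are synthetic placeholders.

```python
# Multiple linear regression with interaction terms, TGA-style sketch.
import numpy as np

rng = np.random.default_rng(3)
comp = rng.dirichlet([4, 3, 2], size=12)          # known C, H, L fractions
C, H, L = comp.T
T_deg = 280 + 60*C + 25*H - 15*L + 40*C*H + rng.normal(0, 1.0, 12)

# design matrix with interactive terms
X = np.column_stack([np.ones_like(C), C, H, L, C*H, C*L, H*L])
beta, *_ = np.linalg.lstsq(X, T_deg, rcond=None)

def predict(c, h, l):
    """Predicted degradation temperature for a given composition."""
    return np.array([1, c, h, l, c*h, c*l, h*l]) @ beta

print(predict(0.40, 0.30, 0.20))
```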
Optimization and Simulation of SLM Process for High Density H13 Tool Steel Parts
NASA Astrophysics Data System (ADS)
Laakso, Petri; Riipinen, Tuomas; Laukkanen, Anssi; Andersson, Tom; Jokinen, Antero; Revuelta, Alejandro; Ruusuvuori, Kimmo
This paper demonstrates the successful printing and the optimization of processing parameters of high-strength H13 tool steel by Selective Laser Melting (SLM). A D-Optimal Design of Experiments (DOE) approach is used for parameter optimization of laser power, scanning speed and hatch width. With 50 test samples (1 × 1 × 1 cm) we establish parameter windows for these three parameters in relation to part density. The calculated numerical model is found to be in good agreement with the density data obtained from the samples using image analysis. A thermomechanical finite element simulation model is constructed of the SLM process and validated by comparing the calculated densities retrieved from the model with the experimentally determined densities. With the simulation tool one can explore the effect of different parameters on density before printing any samples. Establishing a parameter window provides the user with freedom for parameter selection, such as choosing parameters that result in the fastest print speed.
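One common way to reduce the three parameters to a single screening variable is the volumetric energy density E = P/(v·h·t). The sketch below filters candidate parameter sets against an assumed window; the threshold values and layer thickness are illustrative, not the paper's.

```python
# Parameter-window filtering via volumetric energy density (sketch).

def energy_density(power_w, speed_mm_s, hatch_mm, layer_mm):
    """Volumetric energy density in J/mm^3."""
    return power_w / (speed_mm_s * hatch_mm * layer_mm)

layer = 0.03                                     # mm, assumed layer thickness
candidates = [(p, v, h) for p in (150, 200, 250)     # W
              for v in (600, 800, 1000)              # mm/s
              for h in (0.08, 0.10, 0.12)]           # mm

# keep parameter sets inside an assumed window that yields high density
window = [(p, v, h) for p, v, h in candidates
          if 60 <= energy_density(p, v, h, layer) <= 90]
fastest = max(window, key=lambda s: s[1] * s[2])     # build-rate proxy v*h
print(len(window), "candidate sets in window; fastest:", fastest)
```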
Research on criticality analysis method of CNC machine tools components under fault rate correlation
NASA Astrophysics Data System (ADS)
Gui-xiang, Shen; Xian-zhuo, Zhao; Zhang, Ying-zhi; Chen-yu, Han
2018-02-01
In order to determine the key components of CNC machine tools under fault rate correlation, a system component criticality analysis method is proposed. Based on fault mechanism analysis, the component fault relations are determined, and an adjacency matrix is introduced to describe them. Then, the fault structure relation is organized hierarchically using the interpretive structural model (ISM). Assuming that the propagation of faults obeys a Markov process, the fault association matrix is described and transformed, and the PageRank algorithm is used to determine the relative influence values; combined with the component fault rate under time correlation, a comprehensive fault rate can be obtained. Based on the fault mode frequency and fault influence, the criticality of the components under fault rate correlation is determined, and the key components are identified to provide a correct basis for formulating reliability assurance measures. Finally, taking machining centers as an example, the effectiveness of the method is verified.
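A sketch of the ranking step: a PageRank-style power iteration over the fault-propagation adjacency matrix, with each component's influence score then weighted by its fault rate. The matrix and rates are illustrative, not data from the study.

```python
# PageRank-style criticality ranking over a fault-propagation graph.
import numpy as np

A = np.array([[0, 1, 0, 0],      # A[i, j] = 1: a fault in i affects j
              [0, 0, 1, 1],
              [1, 0, 0, 0],
              [0, 0, 1, 0]], dtype=float)
fault_rate = np.array([0.02, 0.05, 0.01, 0.03])  # faults/hour (assumed)

P = A / A.sum(axis=1, keepdims=True)             # row-stochastic transitions
d, n = 0.85, len(A)
r = np.full(n, 1.0 / n)
for _ in range(100):                             # power iteration
    r = (1 - d) / n + d * (P.T @ r)

criticality = r * fault_rate                     # influence x occurrence
print("components, most critical first:", np.argsort(criticality)[::-1])
```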
APT: what it has enabled us to do
NASA Astrophysics Data System (ADS)
Blacker, Brett S.; Golombek, Daniel
2004-09-01
With the development and operational deployment of the Astronomer's Proposal Tool (APT), Hubble Space Telescope (HST) proposers have been provided with an integrated toolset for Phase I and Phase II. This toolset consists of editors for filling out proposal information, an Orbit Planner for determining observation feasibility, a Visit Planner for determining schedulability, diagnostic and reporting tools and an integrated Visual Target Tuner (VTT) for viewing exposure specifications. The VTT can also overlay HST's field of view on user-selected Flexible Image Transport System (FITS) images, perform bright object checks and query the HST archive. In addition to these direct benefits for the HST user, STScI's internal Phase I process has been able to take advantage of the APT products. APT has enabled a substantial streamlining of the process and software processing tools, which enabled a compression by three months of the Phase I to Phase II schedule, allowing observations to be scheduled earlier and thus further benefiting HST observers. Some of the improvements to our process include: creating a compact disk (CD) of Phase I products; being able to print all proposals on the day of the deadline; linking the proposal in Portable Document Format (PDF) with a database; and being able to run all Phase I software on a single platform. In this paper we will discuss the operational results of using APT for HST's Cycles 12 and 13 Phase I process and will show the improvements for the users and the overall process that is allowing STScI to obtain scientific results with HST three months earlier than in previous years. We will also show how APT can be and is being used for multiple missions.
Awotwe Otoo, David; Agarabi, Cyrus; Khan, Mansoor A
2014-07-01
The aim of the present study was to apply an integrated process analytical technology (PAT) approach to control and monitor the effect of the degree of supercooling on critical process and product parameters of a lyophilization cycle. Two concentrations of a mAb formulation were used as models for lyophilization. ControLyo™ technology was applied to control the onset of ice nucleation, whereas tunable diode laser absorption spectroscopy (TDLAS) was utilized as a noninvasive tool for the inline monitoring of the water vapor concentration and vapor flow velocity in the spool during primary drying. The instantaneous measurements were then used to determine the effect of the degree of supercooling on critical process and product parameters. Controlled nucleation resulted in uniform nucleation at lower degrees of supercooling for both formulations, higher sublimation rates, lower mass transfer resistance, lower product temperatures at the sublimation interface, and shorter primary drying times compared with the conventional shelf-ramped freezing. Controlled nucleation also resulted in lyophilized cakes with more elegant and porous structure with no visible collapse or shrinkage, lower specific surface area, and shorter reconstitution times compared with the uncontrolled nucleation. Uncontrolled nucleation however resulted in lyophilized cakes with relatively lower residual moisture contents compared with controlled nucleation. TDLAS proved to be an efficient tool to determine the endpoint of primary drying. There was good agreement between data obtained from TDLAS-based measurements and SMART™ technology. ControLyo™ technology and TDLAS showed great potential as PAT tools to achieve enhanced process monitoring and control during lyophilization cycles. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
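The TDLAS measurement reduces to a mass balance: sublimation rate ≈ vapor concentration × flow velocity × duct cross-section, with the primary drying endpoint taken where the flux decays to a plateau. A sketch with simulated signals; the duct area and endpoint threshold are assumptions, not instrument output.

```python
# Sublimation-rate and endpoint sketch from TDLAS-style measurements.
import numpy as np

duct_area = 0.018                     # m^2, spool cross-section (assumed)
t = np.linspace(0, 20, 200)           # h
conc = 0.9 * np.exp(-t / 6.0) + 0.02  # g/m^3 water vapor (simulated)
vel = 80.0 * np.exp(-t / 6.0) + 2.0   # m/s vapor flow velocity (simulated)

mdot = conc * vel * duct_area * 3600  # g/h sublimation rate
endpoint = t[np.argmax(mdot < 0.05 * mdot.max())]   # first drop below 5% of peak
print(f"primary drying endpoint ~ {endpoint:.1f} h")
```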
Villoria Sáez, Paola; del Río Merino, Mercedes; Porras-Amores, César
2012-02-01
The management planning of construction and demolition (C&D) waste currently uses a single indicator, which does not provide enough detailed information. Therefore, other innovative and more precise indicators should be determined and implemented. The aim of this research work is to improve existing C&D waste quantification tools for the construction of new residential buildings in Spain. For this purpose, several housing projects were studied to estimate the C&D waste generated during their construction process. This paper determines the values of three indicators to estimate the generation of C&D waste in new residential buildings in Spain, itemizing types of waste and construction stages. The inclusion of two more accurate indicators, in addition to the global one commonly in use, provides a significant improvement in C&D waste quantification tools and management planning.
Heat Treatment Optimization and Properties Correlation for H11-Type Hot-Work Tool Steel
NASA Astrophysics Data System (ADS)
Podgornik, B.; Puš, G.; Žužek, B.; Leskovšek, V.; Godec, M.
2018-02-01
The aim of this research was to determine the effect of vacuum-heat-treatment process parameters on the material properties and their correlations for low-Si-content AISI H11-type hot-work tool steel using a single Circumferentially Notched and fatigue Pre-cracked Tensile Bar (CNPTB) test specimen. The work was also focused on the potential of the proposed approach for designing advanced tempering diagrams and optimizing the vacuum heat treatment and design of forming tools. The results show that the CNPTB specimen allows a simultaneous determination and correlation of multiple properties for hot-work tool steels, with the compression and bending strength both increasing with hardness, and the strain-hardening exponent and bending strain increasing with the fracture toughness. On the other hand, the best machinability and surface quality of the hardened hot-work tool steel are obtained for hardness values between 46 and 50 HRC and a fracture toughness below 60 MPa√m.
ERIC Educational Resources Information Center
Dastjerdi, Negin Barat
2016-01-01
This research aims to evaluate ICT use in the teaching-learning process for students of Isfahan elementary schools. The method of this research is a descriptive survey. The statistical population of the study was all teachers of Isfahan elementary schools. The sample size was determined to be 350 persons, selected through cluster sampling…
Distinguishing dose, focus, and blur for lithography characterization and control
NASA Astrophysics Data System (ADS)
Ausschnitt, Christopher P.; Brunner, Timothy A.
2007-03-01
We derive a physical model to describe the dependence of pattern dimensions on dose, defocus and blur. The coefficients of our model are constants of a given lithographic process. Model inversion applied to dimensional measurements then determines effective dose, defocus and blur for wafers patterned with the same process. In practice, our approach entails the measurement of proximate grating targets of differing dose and focus sensitivity. In our embodiment, the measured attribute of one target is exclusively sensitive to dose, whereas the measured attributes of a second target are distinctly sensitive to defocus and blur. On step-and-scan exposure tools, z-blur is varied in a controlled manner by adjusting the across-slit tilt of the image plane. The effects of z-blur and x,y-blur are shown to be equivalent. Furthermore, the exposure slit width is shown to determine the tilt response of the grating attributes. Thus, the response of the measured attributes can be characterized by a conventional focus-exposure matrix (FEM), over which the exposure tool settings are intentionally changed. The model coefficients are determined by a fit to the measured FEM response. The model then fully defines the response for wafers processed under "fixed" dose, focus and blur conditions. Model inversion applied to measurements from the same targets on all such wafers enables the simultaneous determination of effective dose and focus/tilt (DaFT) at each measurement site.
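As an illustrative stand-in for the inversion (the paper's exact model is not reproduced here), one can fit a Bossung-style form CD(E, F, s) = c₀ + c₁/E + (c₂ + c₃/E)(F² + s²), in which blur adds to defocus in quadrature, and recover effective dose, defocus and blur from three targets with known focus offsets:

```python
# Sketch of FEM fitting and dose/defocus/blur inversion (assumed model).
import numpy as np
from scipy.optimize import curve_fit, least_squares

def cd_model(X, c0, c1, c2, c3):
    E, F, s = X
    return c0 + c1 / E + (c2 + c3 / E) * (F**2 + s**2)

# synthetic focus-exposure matrix with controlled blur (tilt)
E, F, s = np.meshgrid([0.9, 1.0, 1.1], [-0.1, 0, 0.1], [0, 0.05])
X = np.vstack([E.ravel(), F.ravel(), s.ravel()])
cd = cd_model(X, 50.0, 12.0, -300.0, 80.0)
coef, _ = curve_fit(cd_model, X, cd, p0=[40, 10, -200, 50])

# inversion: recover (E, F, s) from CDs of 3 targets with focus offsets.
# Note: blur enters as s**2, so its sign is not identifiable.
def resid(p, cds, f_off):
    return [cd_model((p[0], p[1] + fo, p[2]), *coef) - c
            for c, fo in zip(cds, f_off)]

truth = (1.05, 0.04, 0.03)
cds = [cd_model((truth[0], truth[1] + fo, truth[2]), *coef)
       for fo in (0.0, 0.05, -0.05)]
sol = least_squares(resid, x0=[1.0, 0.0, 0.01], args=(cds, (0.0, 0.05, -0.05)))
print("effective dose, defocus, blur =", sol.x)
```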
Dataflow Design Tool: User's Manual
NASA Technical Reports Server (NTRS)
Jones, Robert L., III
1996-01-01
The Dataflow Design Tool is a software tool for selecting a multiprocessor scheduling solution for a class of computational problems. The problems of interest are those that can be described with a dataflow graph and are intended to be executed repetitively on a set of identical processors. Typical applications include signal processing and control law problems. The software tool implements graph-search algorithms and analysis techniques based on the dataflow paradigm. Dataflow analyses provided by the software are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool provides performance optimization through the inclusion of artificial precedence constraints among the schedulable tasks. The user interface and tool capabilities are described. Examples are provided to demonstrate the analysis, scheduling, and optimization functions facilitated by the tool.
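Two of the classic bounds such dataflow analyses report can be sketched directly: a processor bound (total work divided by processor count) and a critical-path bound over the precedence graph. The graph and execution times below are invented for illustration, not taken from the tool.

```python
# Dataflow performance-bound sketch over a small precedence graph.
import functools

tasks = {"read": 2, "fft": 5, "ctrl": 3, "out": 1}           # exec times
succ = {"read": ["fft"], "fft": ["ctrl"], "ctrl": ["out"], "out": []}

@functools.lru_cache(maxsize=None)
def longest(task):
    """Critical-path length starting at task."""
    return tasks[task] + max((longest(s) for s in succ[task]), default=0)

processors = 2
processor_bound = sum(tasks.values()) / processors           # work bound
critical_path = max(longest(t) for t in tasks)               # precedence bound
schedule_bound = max(processor_bound, critical_path)
print(processor_bound, critical_path, schedule_bound)
```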
Development of the Gliding Hole of the Dynamic Compression Plate
NASA Astrophysics Data System (ADS)
Salim, U. A.; Suyitno; Magetsari, R.; Mahardika, M.
2017-02-01
The gliding hole of the dynamic compression plate is designed to facilitate relative movement of the pedicle screw during surgical application. The gliding hole shape is therefore geometrically complex. The gliding hole is usually manufactured by machining processes employing a ball-nose cutting tool; production cost is then high due to long production times. This study proposes to increase the productivity of DCP products by introducing a forming process (cold forming). The forming process involves press tool devices. In closed-die forming, the press tool is designed with little allowance, so the workpiece becomes trapped in the mould after forming. It is therefore very important to determine the hole geometry and the dimensions of the raw material in order for the forming process to succeed. This study optimized the hole sizes with both geometric analysis and experiments. Success of the forming process was achieved by increasing the hole sizes in the raw materials. The hole size to be prepared is a diameter of 5.5 mm with a length of 11.4 mm for a plate thickness of 3 mm, and a diameter of 6 mm with a length of 12.5 mm for a plate thickness of 4 mm.
Stability analysis of multipoint tool equipped with metal cutting ceramics
NASA Astrophysics Data System (ADS)
Maksarov, V. V.; Khalimonenko, A. D.; Matrenichev, K. G.
2017-10-01
The article highlights the issues of determining the stability of the cutting process for a multipoint cutting tool equipped with cutting ceramics. Recommendations are offered on the choice of parameters of replaceable ceramic cutting plates for milling, based on the conducted research. It is proposed that ceramic plates for milling be selected on the basis of the value of their electrical volume resistivity.
TaN resistor process development and integration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romero, Kathleen; Martinez, Marino John; Clevenger, Jascinda
This paper describes the development and implementation of an integrated resistor process based on reactively sputtered tantalum nitride. Image reversal lithography was shown to be a superior method for liftoff patterning of these films. The results of a response surface DOE for the sputter deposition of the films are discussed. Several approaches to stabilization baking were examined and the advantages of the hot plate method are shown. In support of a new capability to produce special-purpose HBT-based Small-Scale Integrated Circuits (SSICs), we developed our existing TaN resistor process, designed for research prototyping, into one with greater maturity and robustness. Included in this work was the migration of our TaN deposition process from a research-oriented tool to a tool more suitable for production. Also included was the implementation and optimization of a liftoff process for the sputtered TaN to avoid the complicating effects of subtractive etching over potentially sensitive surfaces. Finally, the method and conditions for stabilization baking of the resistors were experimentally determined to complete the full implementation of the resistor module. Much of the work to be described involves the migration between sputter deposition tools - from a Kurt J. Lesker CMS-18 to a Denton Discovery 550. Though they use nominally the same deposition technique (reactive sputtering of Ta with N+ in an RF-excited Ar plasma), they differ substantially in their design and produce clearly different results in terms of resistivity, conformity of the film and the difference between as-deposited and stabilized films. We will describe the design of and results from the design of experiments (DOE)-based method of process optimization on the new tool and compare this to what had been used on the old tool.
Statistical process control: separating signal from noise in emergency department operations.
Pimentel, Laura; Barrueto, Fermin
2015-05-01
Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
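A sketch of the individuals/moving-range (XmR) chart the authors favor, using the standard 2.66 and 3.267 constants to set limits from the mean moving range; the data are simulated stand-ins for an ED metric.

```python
# XmR (individuals/moving-range) control chart limits, sketched.
import numpy as np

x = np.random.default_rng(4).normal(45, 5, 30)   # minutes, one value/day
mr = np.abs(np.diff(x))                          # moving ranges

x_bar, mr_bar = x.mean(), mr.mean()
ucl_x, lcl_x = x_bar + 2.66 * mr_bar, x_bar - 2.66 * mr_bar
ucl_mr = 3.267 * mr_bar                          # MR chart upper limit

signals = np.where((x > ucl_x) | (x < lcl_x))[0] # special-cause variation
print(f"limits: {lcl_x:.1f}-{ucl_x:.1f} min; signal days: {signals}")
```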
Lovley, D.R.; Chapelle, F.H.; Woodward, J.C.
1994-01-01
The potential for using concentrations of dissolved H2 to determine the distribution of redox processes in anoxic groundwaters was evaluated. In pristine aquifers in which standard geochemical measurements indicated that Fe-(III) reduction, sulfate reduction, or methanogenesis was the terminal electron accepting process (TEAP), the H2 concentrations were similar to the H2 concentrations that have previously been reported for aquatic sediments with the same TEAPs. In two aquifers contaminated with petroleum products, it was impossible with standard geochemical analyses to determine which TEAPs predominated in specific locations. However, the TEAPs predicted from measurements of dissolved H2 were the same as those determined directly through measurements of microbial processes in incubated aquifer material. These results suggest that H2 concentrations may be a useful tool for analyzing the redox chemistry of nonequilibrium groundwaters.
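The mapping from H2 concentration to predominant TEAP can be sketched as a simple threshold classifier. The ranges below are approximate literature values for anoxic sediments and groundwater, not figures taken from this paper, so treat them as assumptions.

```python
# Threshold-classifier sketch for TEAP assignment from dissolved H2.
def teap_from_h2(h2_nM):
    # Approximate literature ranges (assumed), not values from this study.
    if h2_nM < 0.5:
        return "Fe(III) reduction (roughly 0.1-0.8 nM)"
    if h2_nM < 4.0:
        return "sulfate reduction (roughly 1-4 nM)"
    return "methanogenesis (roughly 5-25 nM)"

for c in (0.2, 2.0, 12.0):
    print(c, "nM ->", teap_from_h2(c))
```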
Evaluation of reliability modeling tools for advanced fault tolerant systems
NASA Technical Reports Server (NTRS)
Baker, Robert; Scheper, Charlotte
1986-01-01
The Computer Aided Reliability Estimation (CARE III) and Automated Reliability Interactive Estimation System (ARIES 82) reliability tools were evaluated for application to advanced fault-tolerant aerospace systems. To determine reliability modeling requirements, the evaluation focused on the Draper Laboratories' Advanced Information Processing System (AIPS) architecture as an example architecture for fault-tolerant aerospace systems. Advantages and limitations were identified for each reliability evaluation tool. The CARE III program was designed primarily for analyzing ultrareliable flight control systems. The ARIES 82 program's primary use was to support university research and teaching. Neither CARE III nor ARIES 82 was suited for determining the reliability of complex nodal networks of the type used to interconnect processing sites in the AIPS architecture. It was concluded that ARIES was not suitable for modeling advanced fault-tolerant systems. It was further concluded that, subject to some limitations (the difficulty in modeling systems with unpowered spare modules, systems where equipment maintenance must be considered, systems where failure depends on the sequence in which faults occurred, and systems where multiple faults greater than double near-coincident faults must be considered), CARE III is best suited for evaluating the reliability of advanced fault-tolerant systems for air transport.
The Impact of Concept Mapping on the Process of Problem-Based Learning
ERIC Educational Resources Information Center
Zwaal, Wichard; Otting, Hans
2012-01-01
A concept map is a graphical tool to activate and elaborate on prior knowledge, to support problem solving, promote conceptual thinking and understanding, and to organize and memorize knowledge. The aim of this study is to determine if the use of concept mapping (CM) in a problem-based learning (PBL) curriculum enhances the PBL process. The paper…
Methods and Research for Multi-Component Cutting Force Sensing Devices and Approaches in Machining
Liang, Qiaokang; Zhang, Dan; Wu, Wanneng; Zou, Kunlin
2016-01-01
Multi-component cutting force sensing systems in manufacturing processes applied to cutting tools are gradually becoming the most significant monitoring indicator. Their signals have been extensively applied to evaluate the machinability of workpiece materials, predict cutter breakage, estimate cutting tool wear, control machine tool chatter, determine stable machining parameters, and improve surface finish. Robust and effective sensing systems with capability of monitoring the cutting force in machine operations in real time are crucial for realizing the full potential of cutting capabilities of computer numerically controlled (CNC) tools. The main objective of this paper is to present a brief review of the existing achievements in the field of multi-component cutting force sensing systems in modern manufacturing. PMID:27854322
Development of a Software Tool to Automate ADCO Flight Controller Console Planning Tasks
NASA Technical Reports Server (NTRS)
Anderson, Mark G.
2011-01-01
This independent study project covers the development of the International Space Station (ISS) Attitude Determination and Control Officer (ADCO) Planning Exchange (APEX) Tool. The primary goal of the tool is to streamline existing manual and time-intensive planning tools into a more automated, user-friendly application that interfaces with existing products and allows the ADCO to produce accurate products and timelines more effectively. This paper will survey the current ISS attitude planning process and its associated requirements, goals, documentation, and software tools, and will show how a software tool could simplify and automate many of the planning actions that occur at the ADCO console. The project will be covered from inception through the initial prototype delivery in November 2011 and will include development of design requirements and software as well as design verification and testing.
EpiTools, A software suite for presurgical brain mapping in epilepsy: Intracerebral EEG.
Medina Villalon, S; Paz, R; Roehri, N; Lagarde, S; Pizzo, F; Colombet, B; Bartolomei, F; Carron, R; Bénar, C-G
2018-06-01
In pharmacoresistant epilepsy, exploration with depth electrodes may be needed to precisely define the epileptogenic zone. Accurate localization of these electrodes is thus essential for the interpretation of Stereotaxic EEG (SEEG) signals. As SEEG analysis increasingly relies on signal processing, it is crucial to link these results to the patient's anatomy. Our aims were thus to develop a suite of software tools, called "EpiTools", able to (i) precisely and automatically localize the position of each SEEG contact and (ii) display the results of signal analysis in each patient's anatomy. The first tool, GARDEL (GUI for Automatic Registration and Depth Electrode Localization), automatically localizes SEEG contacts and labels each contact according to a pre-specified nomenclature (for instance that of FreeSurfer or MarsAtlas). The second tool, 3Dviewer, enables visualization, within the 3D anatomy of the patient, of signal processing results such as rates of biomarkers, connectivity graphs, or the Epileptogenicity Index. GARDEL was validated in 30 patients by clinicians and proved highly reliable in determining the actual location of contacts within each patient's individual anatomy. GARDEL is a fully automatic electrode localization tool requiring limited user interaction (only for electrode naming or contact correction). The 3Dviewer reads signal processing results and displays them in relation to the patient's anatomy. EpiTools can help speed up the interpretation of SEEG data and improve its precision.
Virtual tryout planning in automotive industry based on simulation metamodels
NASA Astrophysics Data System (ADS)
Harsch, D.; Heingärtner, J.; Hortig, D.; Hora, P.
2016-11-01
Deep-drawn sheet metal parts are increasingly designed to the feasibility limit, so achieving robust manufacturing is often challenging. Fluctuations in process and material properties often lead to robustness problems. Therefore, numerical simulations are used to detect the critical regions. To enhance agreement with real process conditions, the material data are acquired through a variety of experiments, and the force distribution is taken into account. The simulation metamodel contains the virtual knowledge of a particular forming process, determined from a series of finite element simulations with variable input parameters. Based on the metamodels, virtual process windows can be displayed for different configurations. This helps to improve the operating point as well as to adjust process settings in case the process becomes unstable. Furthermore, the time for tool tryout can be shortened by transferring the virtual knowledge contained in the metamodels to the optimisation of the drawbeads. This allows the tool manufacturer to focus on the essentials, to save time, and to recognize complex relationships.
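A minimal sketch of the metamodel idea: fit a quadratic response surface to a handful of simulated runs and then evaluate it cheaply inside the process window. The input names and data below are invented placeholders, not values from the study.

```python
import numpy as np

# Toy stand-in for FEM results: inputs are (blank holder force, friction),
# output is a thinning measure. All names and numbers are assumptions.
X = np.array([[200, 0.05], [200, 0.10], [300, 0.05],
              [300, 0.10], [250, 0.075], [225, 0.06]], dtype=float)
y = np.array([0.12, 0.18, 0.15, 0.24, 0.17, 0.14])

def basis(X):
    """Quadratic response-surface basis: 1, x1, x2, x1*x2, x1^2, x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

coef, *_ = np.linalg.lstsq(basis(X), y, rcond=None)

def metamodel(x1, x2):
    """Cheap surrogate evaluation, replacing a full FEM run."""
    return float(basis(np.array([[x1, x2]], dtype=float)) @ coef)

print(metamodel(275, 0.08))
```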
BMDExpress Data Viewer: A Visualization Tool to Analyze BMDExpress Datasets
Regulatory agencies increasingly apply benchmark dose (BMD) modeling to determine points of departure in human risk assessments. BMDExpress applies BMD modeling to transcriptomics datasets and groups genes to biological processes and pathways for rapid assessment of doses at whic...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moirano, J
Purpose: An accurate dose estimate is necessary for effective patient management after a fetal exposure. In the case of a high-dose exposure, it is critical to use all resources available in order to make the most accurate assessment of the fetal dose. This work will demonstrate a methodology for accurate fetal dose estimation using tools that have recently become available in many clinics, and show examples of best practices for collecting data and performing the fetal dose calculation. Methods: A fetal dose estimate calculation was performed using modern data collection tools to determine parameters for the calculation. The reference point air kerma as displayed by the fluoroscopic system was checked for accuracy. A cumulative dose incidence map and DICOM header mining were used to determine the displayed reference point air kerma. Corrections for attenuation caused by the patient table and pad were measured and applied in order to determine the peak skin dose. The position and depth of the fetus were determined by ultrasound imaging and consultation with a radiologist. The data collected were used to determine a normalized uterus dose from Monte Carlo simulation data. Fetal dose values from this process were compared to other accepted calculation methods. Results: An accurate high-dose fetal dose estimate was made. Comparisons to accepted legacy methods were within 35% of estimated values. Conclusion: Modern data collection and reporting methods ease the process of estimating fetal dose from interventional fluoroscopy exposures. Many aspects of the calculation can now be quantified rather than estimated, which should allow for a more accurate estimation of fetal dose.
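A minimal sketch of the arithmetic described in the Methods, with every numeric factor a hypothetical placeholder rather than a clinical value (and ignoring refinements such as the inverse-square correction to the skin plane):

```python
# Illustrative fetal dose estimate following the workflow described above.
# All numeric factors below are hypothetical placeholders, not clinical values.
displayed_ref_air_kerma_mGy = 1200.0   # from DICOM header / cumulative dose map
table_pad_transmission      = 0.75     # measured attenuation of table + pad
backscatter_factor          = 1.3      # entrance-surface backscatter
uterus_dose_per_esk         = 0.35     # normalized uterus dose (Monte Carlo table)
                                       # looked up for the measured fetal depth

peak_skin_dose = (displayed_ref_air_kerma_mGy * table_pad_transmission
                  * backscatter_factor)
fetal_dose = (displayed_ref_air_kerma_mGy * table_pad_transmission
              * uterus_dose_per_esk)
print(f"peak skin dose ~ {peak_skin_dose:.0f} mGy, fetal dose ~ {fetal_dose:.0f} mGy")
```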
Sustainable cooling method for machining titanium alloy
NASA Astrophysics Data System (ADS)
Boswell, B.; Islam, M. N.
2016-02-01
Hard-to-machine materials such as titanium alloy Ti-6Al-4V Grade 5 are known to generate high temperatures and adverse reactions between the workpiece and tool tip materials. These conditions all contribute to an increase in wear mechanisms, reducing tool life. Titanium alloy, for example, always requires coolant during machining. However, traditional flood cooling needs to be replaced due to environmental issues, and an alternative cooling method must be found that has minimal impact on the environment. For truly sustainable cooling of the tool it is necessary to account for all energy used in the cooling process, including the energy involved in producing the coolant. Previous research has established that efficient cooling of the tool interface improves tool life and cutting action. The objective of this research is to determine the most appropriate sustainable cooling method that can also reduce the rate of wear at the tool interface.
Moffitt, Christine M.
2017-01-01
This project tested and revised a risk assessment/management tool authored by Moffitt and Stockton, designed to give hatchery biologists and others a structure for measuring risk and tools to control, prevent, or eliminate invasive New Zealand mudsnails (NZMS) and other invasive mollusks in fish hatcheries and hatchery operations. The document has two parts: the risk assessment tool, and an appendix that summarizes options for control or management. The framework of the guidance document combines approaches used by the Hazard Analysis and Critical Control Points (HACCP) process with those developed by the Commission for Environmental Cooperation (CEC), of Canada, Mexico, and the United States, in the Tri-National Risk Assessment Guidelines for Aquatic Alien Invasive Species. The framework assesses risk potential through two activities: the probability of infestation and the consequences of infestation. Each activity is weighted equally in determining the risk potential. These two activities are divided into seven basic elements that draw on scientific, technical, and other relevant information in the risk assessment process. To determine the probability of infestation, four steps are scored and the scores averaged, following a familiar HACCP process to assess pathways of entry, entry potential, colonization potential, and spread potential. The consequences are considered as economic impact, environmental impact, and social and cultural influences. To test this document, the Principal Investigator identified interested hatchery managers through contacts at regional aquaculture meetings, fish health meetings, and through the network of invasive species managers and scientists participating in the Western Regional Panel on Aquatic Nuisance Species, the 100th Meridian Initiative's Columbia River Basin Team, and the Western New Zealand Mudsnail Conference in Seattle. Targeted hatchery workshops were conducted with staff at Dworshak National Fish Hatchery Complex (ID), Similkameen Pond, Oroville (WA), and Ringold Springs State Hatchery (WA). As a result of communications with hatchery staff, invasive species managers, and on-site assessments of hatchery facilities, the document was modified and enhanced, and additional resources were added to keep it up to date. The result is a simplified tool that can lead hatchery or management personnel through the risk assessment process and provide an introduction to the risk management and communication process. In addition to the typical HACCP processes, this tool adds steps to rate and consider uncertainty and the weight of evidence regarding options and monitoring results. Uncertainty of outcome exists in most tools that can be used to control or prevent NZMS or other invasive mollusks from infesting an area. In addition, this document emphasizes that specific control tools and plans must be tailored to each specific setting to consider the economic, environmental, and social influences.
From the testing and evaluation process, there was strong recognition that a number of control and prevention tools previously suggested and reported in the literature from laboratory and small-scale trials may not be compatible with regional and national regulations, economic constraints, social or cultural constraints, or the engineering or water chemistry characteristics of each facility. The options for control are summarized in the second document, Review of Control Measures for Hatcheries Infested with NZMS (Appendix A), which provides sources for additional resources and specific tools, and guidance regarding the feasibility and success of each approach. This tool also emphasizes that management plans need to be adaptive and incorporate oversight from professionals familiar with measuring the risks of fish diseases and treatments (e.g., fish health practitioners and water quality and effluent management teams). Finally, with such a team, the adaptive management approach must be ongoing and become a regular component of hatchery operations. Although it was the intent that this two-part document would be included as part of the revised National Management and Control Plan for the NZMS proposed by the U.S. Fish and Wildlife Service (USFWS) and others, it is provided as a stand-alone document.
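A minimal sketch of the two-part risk calculation described above; the 1-4 scoring scale and the equal weighting are assumptions for illustration, not the published tool.

```python
# Two-part risk potential: probability of infestation and consequences,
# weighted equally. Scale and scores below are invented placeholders.
def mean(scores):
    return sum(scores) / len(scores)

# Probability of infestation: four HACCP-style steps, scored and averaged.
probability = mean([
    3,  # pathways of entry
    2,  # entry potential
    4,  # colonization potential
    3,  # spread potential
])

# Consequences of infestation.
consequence = mean([
    2,  # economic impact
    3,  # environmental impact
    2,  # social and cultural influences
])

risk_potential = mean([probability, consequence])
print(f"risk potential: {risk_potential:.2f} (assumed 1-4 scale)")
```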
Cultivation of bacteria with ecological capsules in space
NASA Astrophysics Data System (ADS)
Sugiura, K.; Hashimoto, H.; Ishikawa, Y.; Kawasaki, Y.; Kobayashi, K.; Seki, K.; Koike, J.; Saito, T.
1999-01-01
A hermetically materially-closed aquatic microcosm containing bacteria, algae, and invertebrates was developed as a tool for determining changes in ecological systems in space. The species composition was maintained for more than 365 days, and the microcosm could be readily replicated. The results obtained from the simulation models indicated that there is a self-regulating homeostasis in the coupling of production and consumption, which makes the microcosm remarkably stable, and that the transfer of metabolites by diffusion is one of the important factors determining the behavior of the system. The microcosms were continuously irradiated using a 60Co source. After 80 days, no elimination of organisms was found at any of the three irradiation levels (0.015, 0.55, and 3.0 mGy/day), and the number of radio-resistant bacterial mutants did not increase at any level. When an aquatic ecosystem comes under stress due to the microgravity and enhanced radiation environment in space, it is not known whether the ecosystem is self-sustaining. An aquatic ecosystem shows what happens as a result of the self-organizational processes of selection and adaptation, and a microcosm is a useful tool for understanding such processes. We have therefore proposed researching whether this microcosm is self-sustainable in space. The benefits of this project will be: (1) acquiring data for the design of a Controlled Ecological Life Support System, and (2) assessing the possibility of microbial mutation in a space station. We report that a hermetically materially-closed microcosm, which could be a useful tool for determining changes in ecological processes in space, was developed, and that the effects of microgravity and enhanced radiation on it were estimated through measurements on Earth and simulation models.
Investigation into the Use of the Concept Laser QM System as an In-Situ Research and Evaluation Tool
NASA Technical Reports Server (NTRS)
Bagg, Stacey
2014-01-01
The NASA Marshall Space Flight Center (MSFC) is using a Concept Laser Fusing (Cusing) M2 powder-bed additive manufacturing system for the build of space flight prototypes and hardware. NASA MSFC is collecting and analyzing data from the M2 QM Meltpool and QM Coating systems for builds. These data are intended to aid in understanding the powder-bed additive manufacturing process and in developing a thermal model for the process. The QM systems are marketed by Concept Laser GmbH as in-situ quality management modules. The QM Meltpool system uses both a high-speed near-IR camera and a photodiode to monitor the melt pool generated by the laser. The software determines the size of the melt pool from the camera images; the camera also measures the integrated intensity of the IR radiation, and the photodiode gives an intensity value based on the brightness of the melt pool. The QM Coating system uses a high-resolution optical camera to image the surface after each layer has been formed. The objective of this investigation was to determine the adequacy of the QM Meltpool system as a research instrument for in-situ measurement of melt pool size and temperature and its applicability to NASA's objectives in (1) developing a process thermal model and (2) quantifying feedback measurements with the intent of meeting quality requirements or specifications. Note that Concept Laser markets the system only as capable of giving an indication of changes between builds, not as an in-situ research and evaluation tool. A secondary objective of the investigation was to determine the adequacy of the QM Coating system as an in-situ layer-wise geometry and layer quality evaluation tool.
Using the scanning electron microscope on the production line to assure quality semiconductors
NASA Technical Reports Server (NTRS)
Adolphsen, J. W.; Anstead, R. J.
1972-01-01
The use of the scanning electron microscope to detect metallization defects introduced during batch processing of semiconductor devices is discussed. A method of determining metallization integrity was developed which culminates in a procurement specification using the scanning microscope on the production line as a quality control tool. Batch process control of the metallization operation is monitored early in the manufacturing cycle.
A Pipeline Software Architecture for NMR Spectrum Data Translation
Ellis, Heidi J.C.; Weatherby, Gerard; Nowling, Ronald J.; Vyas, Jay; Fenwick, Matthew; Gryk, Michael R.
2012-01-01
The problem of formatting data so that it conforms to the required input for scientific data processing tools pervades scientific computing. The CONNecticut Joint University Research Group (CONNJUR) has developed a data translation tool based on a pipeline architecture that partially solves this problem. The CONNJUR Spectrum Translator supports data format translation for experiments that use Nuclear Magnetic Resonance to determine the structure of large protein molecules. PMID:24634607
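A minimal sketch of the pipeline pattern the abstract describes: a reader stage followed by composable transform stages. The stage names and data are invented placeholders, not the actual CONNJUR translator stages.

```python
# Pipeline-style translator sketch: reader -> transforms -> writer.
from functools import reduce

def read_fid(path):
    # stand-in for parsing a vendor-format NMR file into a dict
    return {"path": path, "data": [1.0, 2.0, 3.0], "format": "vendor"}

def reorder_points(spectrum):
    # illustrative transform stage: reverse the point order
    return dict(spectrum, data=list(reversed(spectrum["data"])))

def convert_format(spectrum):
    # illustrative transform stage: relabel to the target format
    return dict(spectrum, format="target")

def run_pipeline(path, stages):
    """Thread the spectrum through each stage in order."""
    return reduce(lambda s, stage: stage(s), stages, read_fid(path))

result = run_pipeline("sample.fid", [reorder_points, convert_format])
print(result["format"], result["data"])
```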
Friction Stir Spot Welding of Advanced High Strength Steels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hovanski, Yuri; Grant, Glenn J.; Santella, M. L.
Friction stir spot welding techniques were developed to successfully join several advanced high strength steels. Two distinct tool materials were evaluated to determine the effect of tool materials on the process parameters and joint properties. Welds were characterized primarily via lap shear, microhardness, and optical microscopy. Friction stir spot welds were compared to resistance spot welds in similar strength alloys by using the AWS standard for resistance spot welding high strength steels. As further comparison, a primitive cost comparison between the two joining processes was developed, which included an evaluation of the future cost prospects of friction stir spot welding in advanced high strength steels.
Sohl, Stephanie Jean; Birdee, Gurjeet; Elam, Roy
2015-01-01
Improving health behaviors is fundamental to preventing and controlling chronic disease. Healthcare providers who have a patient-centered communication style and appropriate behavioral change tools can empower patients to engage in and sustain healthy behaviors. This review highlights motivational interviewing and mindfulness along with other evidence-based strategies for enhancing patient-centered communication and the behavior change process. Motivational interviewing and mindfulness are especially useful for empowering patients to set self-determined, or autonomous, goals for behavior change. This is important because autonomously motivated behavioral change is more sustainable. Additional strategies such as self-monitoring are discussed as useful for supporting the implementation and maintenance of goals. Thus, there is a need for healthcare providers to develop such tools to empower sustained behavior change. The additional support of a new role, a health coach who specializes in facilitating the process of health-related behavior change, may be required to substantially impact public health. PMID:28239308
Adaptive Management and Monitoring as Fundamental Tools to Effective Salt Marsh Restoration
Adaptive management as applied to ecological restoration is a systematic decision-making process in which the results of restoration activities are repeatedly monitored and evaluated to provide guidance that can be used in determining any necessary future restoration actions. In...
Skoulikidis, N Th; Amaxidis, Y; Bertahas, I; Laschou, S; Gritzalis, K
2006-06-01
Twenty-nine small- and mid-sized permanent rivers (thirty-six sites) scattered throughout Greece and equally distributed among three geo-chemical-climatic zones were investigated on a seasonal basis. Hydrochemical types were determined and spatio-temporal variations interpreted in relation to environmental characteristics and anthropogenic pressures. Multivariate statistical techniques were used to identify the factors and processes affecting hydrochemical variability and the driving forces that control aquatic composition. Spatial variation of aquatic quality was shown to be mainly governed by geological and hydrogeological factors. Due to geological and climatic variability, the three zones have different hydrochemical characteristics. Temporal hydrological variations in combination with hydrogeological factors control seasonal hydrochemical trends. Respiration processes due to municipal wastewaters dominate in summer and enhance nutrient, chloride, and sodium concentrations, while nitrate originates primarily from agriculture. Photosynthetic processes dominate in spring. Carbonate chemistry is controlled by hydrogeological factors and biological activity. A possible enrichment of surface waters with nutrients in "pristine" forested catchments is attributed to soil leaching and mineralisation processes. Two management tools have been developed: a nutrient classification system and a tool for rapid prediction of aquatic composition.
NASA Astrophysics Data System (ADS)
Devillez, Arnaud; Dudzinski, Daniel
2007-01-01
Today, knowledge of a process is very important for engineers seeking the optimal combination of control parameters to guarantee productivity, quality, and operation without defects and failures. In our laboratory, we carry out research in the field of high speed machining with modelling, simulation, and experimental approaches. The aim of our investigation is to develop software allowing cutting conditions to be optimised, to limit the number of predictive tests, and allowing process monitoring, to prevent any trouble during machining operations. This software is based on models and experimental data sets which constitute the knowledge of the process. In this paper, we deal with the problem of vibrations occurring during a machining operation. These vibrations may cause failures and defects in the process, such as workpiece surface alteration and rapid tool wear. To measure the tool micro-movements on line, we equipped a lathe with specific instrumentation using eddy current sensors. The signals obtained were correlated with surface finish, and a signal processing algorithm was used to determine whether a test was stable or unstable. A fuzzy classification method was then proposed to classify the tests in a space defined by the width of cut and the cutting speed. Finally, it was shown that the fuzzy classification takes the measurement uncertainty into account when computing the stability limit or stability lobes of the process.
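A minimal sketch of the kind of on-line stability check described above: reduce the sensor signal to an RMS level and map it to a fuzzy "unstable" membership. The threshold values and the synthetic signal are assumptions for illustration.

```python
import numpy as np

# Stand-in for a measured eddy-current displacement signal (mm).
rng = np.random.default_rng(0)
signal = 0.002 * rng.standard_normal(4096)

rms = float(np.sqrt(np.mean(signal**2)))

def chatter_membership(rms_mm, lo=0.001, hi=0.01):
    """Fuzzy degree of 'unstable' in [0, 1]: 0 below lo, 1 above hi.
    The ramp bounds lo/hi are assumed, not from the paper."""
    return min(1.0, max(0.0, (rms_mm - lo) / (hi - lo)))

print(f"RMS = {rms * 1000:.2f} um, unstable membership = {chatter_membership(rms):.2f}")
```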
ArcCN-Runoff: An ArcGIS tool for generating curve number and runoff maps
Zhan, X.; Huang, M.-L.
2004-01-01
The development and application of the ArcCN-Runoff tool, an extension of ESRI® ArcGIS software, are reported. This tool can be applied to determine curve numbers and to calculate runoff or infiltration for a rainfall event in a watershed. Implementation of GIS techniques such as dissolving, intersecting, and a curve-number reference table improves efficiency. Technical processing time may be reduced from days, if not weeks, to hours for producing spatially varied curve number and runoff maps. An application example for a watershed in Lyon County and Osage County, Kansas, USA, is presented.
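The curve-number runoff computation at the heart of such a tool follows the standard SCS relations Q = (P - Ia)^2 / (P - Ia + S) with S = 1000/CN - 10 and Ia = 0.2S (English units); a minimal sketch:

```python
def scs_runoff_inches(precip_in: float, curve_number: float) -> float:
    """SCS curve-number runoff depth (inches) for a storm event.
    Uses the standard initial abstraction Ia = 0.2 * S."""
    s = 1000.0 / curve_number - 10.0   # potential maximum retention (in)
    ia = 0.2 * s
    if precip_in <= ia:
        return 0.0
    return (precip_in - ia) ** 2 / (precip_in + 0.8 * s)

print(scs_runoff_inches(3.0, 80))  # 3-inch event on CN-80 land cover
```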
Kozunov, Vladimir; Nikolaeva, Anastasia; Stroganova, Tatiana A.
2018-01-01
The brain mechanisms that integrate the separate features of sensory input into a meaningful percept depend upon the prior experience of interaction with the object and differ between categories of objects. Recent studies using representational similarity analysis (RSA) have characterized either the spatial patterns of brain activity for different categories of objects or described how category structure in neuronal representations emerges in time, but never simultaneously. Here we applied a novel, region-based, multivariate pattern classification approach in combination with RSA to magnetoencephalography data to extract activity associated with qualitatively distinct processing stages of visual perception. We asked participants to name what they see whilst viewing bitonal visual stimuli of two categories predominantly shaped by either value-dependent or sensorimotor experience, namely faces and tools, and meaningless images. We aimed to disambiguate the spatiotemporal patterns of brain activity between the meaningful categories and determine which differences in their processing were attributable to either perceptual categorization per se, or later-stage mentalizing-related processes. We have extracted three stages of cortical activity corresponding to low-level processing, category-specific feature binding, and supra-categorical processing. All face-specific spatiotemporal patterns were associated with bilateral activation of ventral occipito-temporal areas during the feature binding stage at 140–170 ms. The tool-specific activity was found both within the categorization stage and in a later period not thought to be associated with binding processes. The tool-specific binding-related activity was detected within a 210–220 ms window and was located to the intraparietal sulcus of the left hemisphere. Brain activity common for both meaningful categories started at 250 ms and included widely distributed assemblies within parietal, temporal, and prefrontal regions. Furthermore, we hypothesized and tested whether activity within face and tool-specific binding-related patterns would demonstrate oppositely acting effects following procedural perceptual learning. We found that activity in the ventral, face-specific network increased following the stimuli repetition. In contrast, tool processing in the dorsal network adapted by reducing its activity over the repetition period. Altogether, we have demonstrated that activity associated with visual processing of faces and tools during the categorization stage differ in processing timing, brain areas involved, and in their dynamics underlying stimuli learning. PMID:29379426
A drilling tool design and in situ identification of planetary regolith mechanical parameters
NASA Astrophysics Data System (ADS)
Zhang, Weiwei; Jiang, Shengyuan; Ji, Jie; Tang, Dewei
2018-05-01
The physical and mechanical properties as well as the heat flux of regolith are critical evidence in the study of planetary origin and evolution. Moreover, the mechanical properties of planetary regolith have great value for guiding future human planetary activities. For planetary subsurface exploration, an inchworm boring robot (IBR) has been proposed to penetrate the regolith, and the mechanical properties of the regolith are expected to be simultaneously investigated during the penetration process using the drilling tool on the IBR. This paper provides a preliminary study of an in situ method for measuring planetary regolith mechanical parameters using a drilling tool on a test bed. A conical-screw drilling tool was designed, and its drilling load characteristics were experimentally analyzed. Based on the drilling tool-regolith interaction model, two identification methods for determining the planetary regolith bearing and shearing parameters are proposed. The bearing and shearing parameters of lunar regolith simulant were successfully determined according to the pressure-sinkage tests and shear tests conducted on the test bed. The effects of the operating parameters on the identification results were also analyzed. The results indicate a feasible scheme for future planetary subsurface exploration.
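A minimal sketch of one plausible identification step: fitting a Bekker-type pressure-sinkage law p = k * z^n to test-bed data by log-log least squares. The model form is a standard terramechanics relation and the data are invented; the paper's actual interaction model and parameter values are not reproduced here.

```python
import numpy as np

# Made-up pressure-sinkage test data for illustration.
z = np.array([0.005, 0.010, 0.020, 0.030])   # sinkage (m)
p = np.array([8e3, 15e3, 27e3, 38e3])        # pressure (Pa)

# Linearize p = k * z**n as log(p) = log(k) + n * log(z) and solve.
A = np.column_stack([np.ones_like(z), np.log(z)])
(c0, n), *_ = np.linalg.lstsq(A, np.log(p), rcond=None)
k = np.exp(c0)
print(f"n = {n:.2f}, k = {k:.3g} Pa/m^n")
```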
DOT National Transportation Integrated Search
2006-01-01
The implementation of an effective performance-based construction quality management requires a tool for determining impacts of construction quality on the life-cycle performance of pavements. This report presents an update on the efforts in the deve...
New methodology to baseline and match AME polysilicon etcher using advanced diagnostic tools
NASA Astrophysics Data System (ADS)
Poppe, James; Shipman, John; Reinhardt, Barbara E.; Roussel, Myriam; Hedgecock, Raymond; Fonda, Arturo
1999-09-01
As process controls tighten in the semiconductor industry, the need to understand the variables that determine system performance becomes more important. For plasma etch systems, process success depends on the control of key parameters such as vacuum integrity, pressure, gas flows, and RF power. It is imperative to baseline, monitor, and control these variables. This paper presents an overview of the methods and tools used by the Motorola BMC fabrication facility to characterize an Applied Materials polysilicon etcher. Tool performance data obtained from our traditional measurement techniques are limited in scope and do not provide a complete picture of ultimate tool performance. Presently, the BMC's traditional characterization tools provide a snapshot of the static operation of the equipment under test (EUT); the dynamic performance, however, cannot be evaluated without the aid of specialized diagnostic equipment. To provide a complete system baseline evaluation of the polysilicon etcher, three diagnostic tools were utilized: the Lucas Labs Vacuum Diagnostic System, a Residual Gas Analyzer, and the ENI Voltage/Impedance Probe. The diagnostic methodology used to baseline and match key parameters of qualified production equipment has had an immense impact on other equipment characterization in the facility, and has resulted in reduced cycle time for new equipment introduction as well.
A Holistic Framework for Environmental Flows Determination in Hydropower Contexts
DOE Office of Scientific and Technical Information (OSTI.GOV)
McManamay, Ryan A; Bevelhimer, Mark S
2013-05-01
Among the ecological science community, the consensus view is that the natural flow regime sustains the ecological integrity of river systems. This prevailing viewpoint has progressively led many environmental stakeholders to put increased pressure on hydropower dam owners to change plant operations affecting downstream river flows, with the intention of providing better conditions for aquatic biological communities. Identifying the necessary magnitude, frequency, duration, timing, or rate of change of stream flows to meet ecological needs in a hydropower context is challenging because the ecological responses to changes in flows may not be fully known, there are usually a multitude of competing users of flow, and implementing environmental flows usually comes at a price to energy production. Realistically, hydropower managers must develop a reduced set of goals that provide the most benefit to the identified ecological needs. As part of the Department of Energy (DOE) Water Power Program, the Instream Flow Project (IFP) was carried out by Oak Ridge National Laboratory (ORNL), Pacific Northwest National Laboratory (PNNL), and Argonne National Laboratory (ANL) as an attempt to develop tools aimed at defining environmental flow needs for hydropower operations. The application of these tools ranges from national to site-specific scales; thus, the utility of each tool will depend on the phase of the environmental flow process. Given the complexity and sheer volume of applications used to determine environmentally acceptable flows for hydropower, a framework is needed to organize efforts into a staged process dependent upon spatial, temporal, and functional attributes. By far, the predominant domain for determining environmental flows related to hydropower is the Federal Energy Regulatory Commission (FERC) relicensing process, which can take multiple years and can be very expensive depending on the scale of each hydropower project. The utility of such a framework is that it can expedite the environmental flow process by (1) organizing data and applications to identify predictable relationships between flows and ecology, and (2) suggesting when and where tools should be used in the environmental flow process. In addition to regulatory procedures, a framework should also provide the coordination for a comprehensive research agenda to guide the science of environmental flows. This research program has further-reaching benefits than environmental flow determination alone, by providing modeling applications, data, and geospatial layers to inform potential hydropower development. We address several objectives within this document that highlight the limitations of existing environmental flow paradigms and their applications to hydropower while presenting a new framework catered to hydropower needs. Herein, we address the following objectives: (1) provide a brief overview of the Natural Flow Regime paradigm and existing environmental flow frameworks that have been used to determine ecologically sensitive stream flows for hydropower operations; (2) describe a new conceptual framework, centered on determining predictable relationships between flow and ecological responses, to aid in determining flows needed to meet ecological objectives with regard to hydropower operations; and (3) provide evidence of how efforts from ORNL, PNNL, and ANL have filled some of the gaps in this broader framework, and suggest how the framework can be used to set the stage for a research agenda for environmental flows.
Streamlining the Design-to-Build Transition with Build-Optimization Software Tools.
Oberortner, Ernst; Cheng, Jan-Fang; Hillson, Nathan J; Deutsch, Samuel
2017-03-17
Scaling-up capabilities for the design, build, and test of synthetic biology constructs hold great promise for the development of new applications in fuels, chemical production, and cellular-behavior engineering. Construct design is an essential component of this process; however, not every designed DNA sequence can be readily manufactured, even using state-of-the-art DNA synthesis methods. Current biological computer-aided design and manufacture (bioCAD/CAM) tools do not adequately consider the limitations of DNA synthesis technologies when generating their outputs. Designed sequences that violate DNA synthesis constraints may require substantial sequence redesign or lead to price premiums and temporal delays, which adversely impact the efficiency of the DNA manufacturing process. We have developed a suite of build-optimization software tools (BOOST) to streamline the design-build transition in synthetic biology engineering workflows. BOOST incorporates knowledge of DNA synthesis success determinants into the design process to output ready-to-build sequences, preempting the need for sequence redesign. The BOOST web application is available at https://boost.jgi.doe.gov, and its Application Program Interfaces (APIs) enable integration into automated, customized DNA design processes. The results presented herein highlight the effectiveness of BOOST in reducing DNA synthesis costs and timelines.
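A minimal sketch of the kind of synthesis-constraint screen such a tool performs; the specific limits (GC window, homopolymer length) are assumed values for illustration, not BOOST's actual rules.

```python
# Screen a sequence against assumed synthesis constraints:
# windowed GC content and maximum homopolymer run length.
def violates_synthesis_constraints(seq: str, gc_lo=0.25, gc_hi=0.75,
                                   window=100, max_homopolymer=8):
    seq = seq.upper()
    # homopolymer runs
    run, prev = 1, ""
    for base in seq:
        run = run + 1 if base == prev else 1
        if run > max_homopolymer:
            return True
        prev = base
    # windowed GC content
    for i in range(0, max(1, len(seq) - window + 1)):
        w = seq[i:i + window]
        gc = (w.count("G") + w.count("C")) / len(w)
        if not gc_lo <= gc <= gc_hi:
            return True
    return False

print(violates_synthesis_constraints("ATGC" * 50))        # balanced: False
print(violates_synthesis_constraints("A" * 20 + "TGC"))   # long poly-A run: True
```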
Applying Parallel Processing Techniques to Tether Dynamics Simulation
NASA Technical Reports Server (NTRS)
Wells, B. Earl
1996-01-01
The focus of this research has been to determine the effectiveness of applying parallel processing techniques to a sizable real-world problem, the simulation of the dynamics associated with a tether which connects two objects in low earth orbit, and to explore the degree to which the parallelization process can be automated through the creation of new software tools. The goal has been to utilize this specific application problem as a base to develop more generally applicable techniques.
Investigation of effects of process parameters on properties of friction stir welded joints
NASA Astrophysics Data System (ADS)
Chauhan, Atul; Soota, Tarun; Rajput, S. K.
2018-03-01
This work deals with the application of friction stir welding (FSW) using a Taguchi orthogonal array. The FSW procedure is used to join aluminium alloy AA6063-T0 plates in butt configuration with an orthogonal combination of factors and their levels. The factors, tool rotation speed, tool travel speed, and tool pin profile, are each varied over three levels. Grey relational analysis (GRA) has been applied to select the optimum level of factors for optimising the UTS, ductility, and hardness of the joint. Experiments have been conducted with two different tool materials (HSS and HCHCr steel) at various factor-level combinations for joining AA6063-T0. On the basis of grey relational grades at different levels of factors and analysis of variance (ANOVA), the ideal combination of factors is determined. The influence of tool material is also studied.
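A minimal sketch of the GRA computation for larger-is-better responses: normalize each response, form deviation sequences, compute grey relational coefficients, and average them into grades. The response matrix is invented for illustration.

```python
import numpy as np

# Rows = experiments, columns = responses (e.g., UTS, ductility, hardness).
# The numbers are made up for illustration.
Y = np.array([[210., 8.0, 60.],
              [225., 7.2, 63.],
              [218., 9.1, 58.]])

norm = (Y - Y.min(axis=0)) / (Y.max(axis=0) - Y.min(axis=0))  # larger-is-better
delta = 1.0 - norm                                            # deviation sequences
zeta = 0.5                                                    # distinguishing coefficient
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
grades = grc.mean(axis=1)                                     # grey relational grades

print("grades:", np.round(grades, 3), "best run:", int(grades.argmax()) + 1)
```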
Harnessing ISO/IEC 12207 to Examine the Extent of SPI Activity in an Organisation
NASA Astrophysics Data System (ADS)
Clarke, Paul; O'Connor, Rory
The quality of the software development process directly affects the quality of the software product. To be successful, software development organisations must respond to changes in technology and business circumstances, and therefore software process improvement (SPI) is required. SPI activity relates to any modification that is performed to the software process in order to improve an aspect of the process. Although multiple process assessments could be employed to examine SPI activity, they present an inefficient tool for such an examination. This paper presents an overview of a new survey-based resource that utilises the process reference model in ISO/IEC 12207 in order to expressly and directly determine the level of SPI activity in a software development organisation. This survey instrument can be used by practitioners, auditors and researchers who are interested in determining the extent of SPI activity in an organisation.
Extending i-line capabilities through variance characterization and tool enhancement
NASA Astrophysics Data System (ADS)
Miller, Dan; Salinas, Adrian; Peterson, Joel; Vickers, David; Williams, Dan
2006-03-01
Continuous economic pressures have moved a large percentage of integrated device manufacturing (IDM) operations either overseas or to foundries over the last 10 years. These pressures have left the IDM fabs in the U.S. requiring cost-of-ownership (COO) improvements in order to maintain operations domestically. While the assets of many of these factories are at a very favorable point in the depreciation life cycle, the equipment and processes are constrained by the quality of the equipment in its original state and its degradation over its installed life. With the objective of enhancing output and improving process performance, this factory and its primary lithography process tool supplier have been able to extend the usable life of the existing process tools, increase the output of the tool base, and improve the distribution of the CDs on the product produced. Texas Instruments Incorporated led an investigation with the POLARIS® Systems & Services business of FSI International to determine the sources of variance in the i-line processing of a wide array of IC device types. Sources of variance such as PEB temperature, PEB delay time, develop recipe, develop time, and develop programming were investigated. While PEB processes are a primary driver of acid-catalyzed resists, the develop mode is shown in this work to have an overwhelming impact on the wafer-to-wafer and across-wafer CD performance of these i-line processes. These changes have improved the wafer-to-wafer CD distribution by more than 80%, and the within-wafer CD distribution by more than 50%, while enabling a greater than 50% increase in lithography cluster throughput. The paper will discuss the contribution from each of the sources of variance and their importance in overall system performance.
Tamjidy, Mehran; Baharudin, B. T. Hang Tuah; Paslar, Shahla; Matori, Khamirul Amin; Sulaiman, Shamsuddin; Fadaeifard, Firouz
2017-01-01
The development of Friction Stir Welding (FSW) has provided an alternative approach for producing high-quality welds, in a fast and reliable manner. This study focuses on the mechanical properties of the dissimilar friction stir welding of AA6061-T6 and AA7075-T6 aluminum alloys. The FSW process parameters such as tool rotational speed, tool traverse speed, tilt angle, and tool offset influence the mechanical properties of the friction stir welded joints significantly. A mathematical regression model is developed to determine the empirical relationship between the FSW process parameters and mechanical properties, and the results are validated. In order to obtain the optimal values of process parameters that simultaneously optimize the ultimate tensile strength, elongation, and minimum hardness in the heat affected zone (HAZ), a metaheuristic, multi objective algorithm based on biogeography based optimization is proposed. The Pareto optimal frontiers for triple and dual objective functions are obtained and the best optimal solution is selected through using two different decision making techniques, technique for order of preference by similarity to ideal solution (TOPSIS) and Shannon’s entropy. PMID:28772893
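A minimal TOPSIS sketch of the decision-making step described above: rank candidate solutions by relative closeness to the ideal. The decision matrix, weights, and benefit directions are invented placeholders, not the study's Pareto set.

```python
import numpy as np

# Rows = candidate welds; columns = UTS (MPa), elongation (%), HAZ hardness (HV).
D = np.array([[240., 9.0, 95.],
              [255., 7.5, 90.],
              [248., 8.2, 99.]])
benefit = np.array([True, True, True])   # assumed larger-is-better for all
w = np.array([0.5, 0.3, 0.2])            # assumed criterion weights

R = D / np.linalg.norm(D, axis=0)        # vector normalization
V = R * w                                # weighted normalized matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
nadir = np.where(benefit, V.min(axis=0), V.max(axis=0))
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - nadir, axis=1)
closeness = d_neg / (d_pos + d_neg)      # relative closeness to the ideal
print("ranking (best first):", np.argsort(-closeness) + 1)
```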
Development of a Mobile Tool That Semiautomatically Screens Patients for Stroke Clinical Trials.
Spokoyny, Ilana; Lansberg, Maarten; Thiessen, Rosita; Kemp, Stephanie M; Aksoy, Didem; Lee, YongJae; Mlynash, Michael; Hirsch, Karen G
2016-10-01
Despite several national coordinated research networks, enrollment in many cerebrovascular trials remains challenging. An electronic tool was needed that would improve the efficiency and efficacy of screening for multiple simultaneous acute clinical stroke trials by automating the evaluation of inclusion and exclusion criteria, improving screening procedures and streamlining the communication process between the stroke research coordinators and the stroke clinicians. A multidisciplinary group consisting of physicians, study coordinators, and biostatisticians designed and developed an electronic clinical trial screening tool on a HIPAA (Health Insurance Portability and Accountability Act)-compliant platform. A web-based tool was developed that uses branch logic to determine eligibility for simultaneously enrolling clinical trials and automatically notifies the study coordinator teams about eligible patients. After 12 weeks of use, 225 surveys were completed, and 51 patients were enrolled in acute stroke clinical trials. Compared with the 12 weeks before implementation of the tool, there was an increase in enrollment from 16.5% of patients screened to 23.4% of patients screened (P<0.05). Clinicians and coordinators reported increased satisfaction with the process and improved ease of screening. We created a semiautomated electronic screening tool that uses branch logic to screen patients for stroke clinical trials. The tool has improved efficiency and efficacy of screening, and it could be adapted for use at other sites and in other medical fields.
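A minimal sketch of branch-logic eligibility screening across simultaneous trials; the trial names and criteria below are invented placeholders, not the actual protocols behind the tool.

```python
# Each trial is a list of predicate rules; a patient is eligible for a trial
# if every rule passes. Criteria are hypothetical examples.
TRIALS = {
    "trial_A": [
        lambda p: p["nihss"] >= 6,
        lambda p: p["hours_from_onset"] <= 6,
    ],
    "trial_B": [
        lambda p: 18 <= p["age"] <= 85,
        lambda p: not p["anticoagulated"],
    ],
}

def eligible_trials(patient):
    """Return the trials whose criteria the patient satisfies."""
    return [name for name, criteria in TRIALS.items()
            if all(rule(patient) for rule in criteria)]

patient = {"nihss": 9, "hours_from_onset": 3.5, "age": 72, "anticoagulated": False}
print(eligible_trials(patient))   # eligible trials -> notify coordinator teams
```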
Bishai, David; Sherry, Melissa; Pereira, Claudia C; Chicumbe, Sergio; Mbofana, Francisco; Boore, Amy; Smith, Monica; Nhambi, Leonel; Borse, Nagesh N
2016-01-01
This study describes the development of a self-audit tool for public health and the associated methodology for implementing a district health system self-audit tool that can provide quantitative data on how district governments perceive their performance of the essential public health functions. Development began with a consensus-building process to engage Ministry of Health and provincial health officers in Mozambique and Botswana. We then worked with lists of relevant public health functions as determined by these stakeholders to adapt a self-audit tool describing essential public health functions to each country's health system. We then piloted the tool across districts in both countries and conducted interviews with district health personnel to determine health workers' perception of the usefulness of the approach. Country stakeholders were able to develop consensus around 11 essential public health functions that were relevant in each country. Pilots of the self-audit tool enabled the tool to be effectively shortened. Pilots also disclosed a tendency to upcode during self-audits that was checked by group deliberation. Convening sessions at the district enabled better attendance and representative deliberation. Instant feedback from the audit was a feature that 100% of pilot respondents found most useful. The development of metrics that provide feedback on public health performance can be used as an aid in the self-assessment of health system performance at the district level. Measurements of practice can open the door to future applications for practice improvement and research into the determinants and consequences of better public health practice. The current tool can be assessed for its usefulness to district health managers in improving their public health practice. The tool can also be used by the Ministry of Health or external donors in the African region for monitoring the district-level performance of the essential public health functions.
Levy, Andrew E; Shah, Nishant R; Matheny, Michael E; Reeves, Ruth M; Gobbel, Glenn T; Bradley, Steven M
2018-04-25
Reporting standards promote clarity and consistency of stress myocardial perfusion imaging (MPI) reports, but do not require an assessment of post-test risk. Natural Language Processing (NLP) tools could potentially help estimate this risk, yet it is unknown whether reports contain adequate descriptive data to use NLP. Among VA patients who underwent stress MPI and coronary angiography between January 1, 2009 and December 31, 2011, 99 stress test reports were randomly selected for analysis. Two reviewers independently categorized each report for the presence of critical data elements essential to describing post-test ischemic risk. Few stress MPI reports provided a formal assessment of post-test risk within the impression section (3%) or the entire document (4%). In most cases, risk was determinable by combining critical data elements (74% impression, 98% whole). If ischemic risk was not determinable (25% impression, 2% whole), inadequate description of systolic function (9% impression, 1% whole) and inadequate description of ischemia (5% impression, 1% whole) were most commonly implicated. Post-test ischemic risk was determinable but rarely reported in this sample of stress MPI reports. This supports the potential use of NLP to help clarify risk. Further study of NLP in this context is needed.
Peeling Onions: Some Tools and a Recipe for Solving Ethical Dilemmas.
ERIC Educational Resources Information Center
Gordon, Joan Claire
1993-01-01
Presents a process for solving ethical dilemmas: define the problem; identify facts; determine values; "slice" the problem different ways--duties, virtues, rights, and common good; rank ethical considerations; consult colleagues; and take action. (SK)
Guidance on Systematic Planning Using the Data Quality Objectives Process, EPA QA/G-4
Provides a standard working tool for project managers and planners to develop DQOs for determining the type, quantity, and quality of data needed to reach defensible decisions or make credible estimates.
Process for selecting engineering tools: applied to selecting a SysML tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Spain, Mark J.; Post, Debra S.; Taylor, Jeffrey L.
2011-02-01
Process for Selecting Engineering Tools outlines the process and tools used to select a SysML (Systems Modeling Language) tool. The process is general in nature and can be applied to the selection of most engineering tools and software applications.
The chance of Sweden's public health targets making a difference.
Lager, Anton; Guldbrandsson, Karin; Fossum, Bjöörn
2007-03-01
There is a trend in health policy towards more focus on determinants and societal interventions and less on individuals. The Swedish public health targets are in line with this trend. The value of public health targets lies in their ability to function as a tool in governing with targets, and this paper examines the possibility of the Swedish targets functioning as such a tool. Document analyses were performed to examine three prerequisites of governing with targets: (1) the influence of the administration in the target-setting process, (2) the explicitness of the targets, and (3) the follow-up system. The material consisted of the documents from the committee drafting the targets, the written opinions on the drafts, and the governmental bill with the adopted public health targets. The administration influenced the target-setting process. Further, the government invests in a follow-up system that makes indicators of health determinants visible. However, although explicit targets existed earlier in the process, the final targets in the bill are not explicit enough. The Swedish public health targets are therefore not explicit enough to function in governing with targets, and the reasons for this were political rather than technical. This suggests that policy makers focusing on health determinants should not put time and resources into technical target formulation; instead, they could make indicators visible, thereby drawing attention to trends that are political by nature.
Consensus statements for screening and assessment tools.
Bédard, Michel; Dickerson, Anne E
2014-04-01
Occupational therapists, both generalists and specialists, have a critical role in providing services to senior drivers. These services include evaluating fitness-to-drive, developing interventions to support community mobility, and facilitating the transition from driving to non-driving when necessary for personal and community safety. The evaluation component and decision-making process about fitness-to-drive are highly dependent on the use of screening and assessment tools. The purpose of this paper is to briefly present the rationale and context for 12 consensus statements about the usefulness and appropriateness of screening and assessment tools to determine fitness-to-drive, within the occupational therapy clinical setting, and their implications on community mobility.
A guided interview process to improve student pharmacists' identification of drug therapy problems.
Rovers, John; Miller, Michael J; Koenigsfeld, Carrie; Haack, Sally; Hegge, Karly; McCleeary, Erin
2011-02-10
To measure agreement between advanced pharmacy practice experience students using a guided interview process and experienced clinical pharmacists using standard practices to identify drug therapy problems. Student pharmacists enrolled in an advanced pharmacy practice experience (APPE) and clinical pharmacists conducted medication therapy management interviews to identify drug therapy problems in elderly patients recruited from the community. Student pharmacists used a guided interview tool, while clinical pharmacists' interviews were conducted using their usual and customary practices. Student pharmacists also were surveyed to determine their perceptions of the interview tool. Fair to moderate agreement was observed on student and clinical pharmacists' identification of 4 of 7 drug therapy problems. Of those, agreement was significantly higher than chance for 3 drug therapy problems (adverse drug reaction, dosage too high, and needs additional drug therapy) and not significant for 1 (unnecessary drug therapy). Students strongly agreed that the interview tool was useful but agreed less strongly on recommending its use in practice. The guided interview process served as a useful teaching aid to assist student pharmacists to identify drug therapy problems.
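A minimal sketch of the agreement statistic behind phrases like "fair to moderate agreement": Cohen's kappa for two raters on a binary judgment, computed here on invented ratings rather than the study's data.

```python
def cohens_kappa(a, b):
    """a, b: equal-length lists of 0/1 ratings (problem absent/present)."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n            # observed agreement
    pa = (sum(a) / n) * (sum(b) / n) \
         + (1 - sum(a) / n) * (1 - sum(b) / n)            # chance agreement
    return (po - pa) / (1 - pa)

# Invented example: did each rater flag a drug therapy problem per patient?
students    = [1, 0, 1, 1, 0, 1, 0, 0]
pharmacists = [1, 0, 0, 1, 0, 1, 1, 0]
print(f"kappa = {cohens_kappa(students, pharmacists):.2f}")  # 0.50 -> moderate
```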
The Evolution of Friction Stir Welding Theory at Marshall Space Flight Center
NASA Technical Reports Server (NTRS)
Nunes, Arthur C.
2012-01-01
From 1995 to the present, the friction stir welding (FSW) process has been under study at Marshall Space Flight Center (MSFC). This is an account of the progressive emergence of a set of conceptual tools, beginning with the discovery of the shear surface, wiping metal transfer, and the invention of a kinematic model, making possible a treatment of both metallurgical structure formation and process dynamics in friction stir welding from a unified point of view. It is generally observed that the bulk of the deformation of weld metal around the FSW pin takes place in a very narrow, almost discontinuous zone with high deformation rates characteristic of metal cutting. By 1999 it was realized that this zone could be treated as a shear surface like that in simple metal cutting models. At the shear surface the seam is drawn out and compressed, and pressure and flow conditions determine whether or not a sound weld is produced. The discovery of the shear surface was followed by the synthesis of a simple 3-flow kinematic model of the FSW process. Relative to the tool, the flow components are: (1) an approaching translational flow at weld speed V, (2) a rotating cylindrical plug flow with the angular velocity ω of the tool, and (3) a relatively slow ring vortex flow (like a smoke ring) encircling the tool and driven by shoulder scrolls and pin threads. The rotating plug flow picks up an element of weld metal, rotates it around with the tool, and deposits it behind the tool ("wiping metal transfer"); it forms plan-section loops in tracers cut through by the tool. Radially inward flow from the ring vortex component retains metal longer in the rotating plug, and outward flow expels metal earlier; this interaction forms the looping weld seam trace and the tongue-and-groove bimetallic weld contour. The radial components of the translational and ring vortex flows introduce parent metal intrusions into the small-grained nugget material close to the tool shoulder; if this feature is pronounced, nugget collapse may result. Certain weld features, in particular internal banding seen in transverse section as "onion rings" and associated surface ridges called "tool marks", have long implied an oscillation flow component, but have only recently been attributed in the literature to tool eccentricity. The rotating plug shape, typically a hollow cylinder flared at the end where it sticks to the shoulder, varies as the pressure distribution on the tool determines where sticking occurs. Simplified power input estimates balanced against heat loss estimates give reasonable temperature estimates, explain why the power requirement changes hardly at all over a wide range of RPMs, and yield isotherms that seem to fall along the boundaries of parameter windows of operation.
Microtube strip heat exchanger
NASA Astrophysics Data System (ADS)
Doty, F. D.
1991-04-01
During the last quarter, Doty Scientific, Inc. (DSI) continued to make progress on the microtube strip (MTS) heat exchangers. The team has begun a heat exchanger stress analysis; however, they have been concentrating the bulk of their analytical energies on a computational fluid dynamics (CFD) model to determine the location and magnitude of shell-side flow maldistribution which decreases heat exchanger effectiveness. DSI received 120 fineblanked tubestrips from Southern Fineblanking (SFB) for manufacturing process development. Both SFB and NIST provided inspection reports of the tubestrips. DSI completed the tooling required to encapsulate a tube array and press tubestrips on the array. Pressing the tubestrips on tube arrays showed design deficiencies both in the tubestrip design and the tooling design. DSI has a number of revisions in process to correct these deficiencies. The research effort has identified a more economical fusible alloy for encapsulating the tube array, and determined the parameters required to successfully encapsulate the tube array with the new alloy. A more compact MTS heat exchanger bank was designed.
Dynamic ultrasonic contact detection using acoustic emissions.
Turner, S L; Rabani, A; Axinte, D A; King, C W
2014-03-01
For a non-contact ultrasonic material removal process, the control of the standoff position can be crucial to process performance; particularly where the requirement is for a standoff of the order of <20 μm. The standoff distance relative to the surface to be machined can be set by first contacting the ultrasonic tool tip with the surface and then withdrawing the tool to the required position. Determination of this contact point in a dynamic system at ultrasonic frequencies (>20 kHz) is achieved by force measurement or by detection of acoustic emissions (AE). However, where detection of distance from a surface must be determined without contact taking place, an alternative method must be sought. In this paper, the effect of distance from contact of an ultrasonic tool is measured by detection of AE through the workpiece. At the point of contact, the amplitude of the signal at the fundamental frequency increases significantly, but the strength of the 2nd and 3rd harmonic signals increases more markedly. Closer examination of these harmonics shows that an increase in their intensities can be observed in the 10 μm prior to contact, providing a mechanism to detect near contact (<10 μm) without the need to first contact the surface in order to set a standoff.
The use of self-organising maps for anomalous behaviour detection in a digital investigation.
Fei, B K L; Eloff, J H P; Olivier, M S; Venter, H S
2006-10-16
The dramatic increase in crime relating to the Internet and computers has caused a growing need for digital forensics. Digital forensic tools have been developed to assist investigators in conducting a proper investigation into digital crimes. In general, the bulk of the digital forensic tools available on the market permit investigators to analyse data that has been gathered from a computer system. However, current state-of-the-art digital forensic tools simply cannot handle large volumes of data in an efficient manner. With the advent of the Internet, many employees have been given access to new and more interesting possibilities via their desktop. Consequently, excessive Internet usage for non-job purposes and even blatant misuse of the Internet have become a problem in many organisations. Since storage media are steadily growing in size, the process of analysing multiple computer systems during a digital investigation can easily consume an enormous amount of time. Identifying a single suspicious computer from a set of candidates can therefore reduce the human processing time and monetary costs involved in gathering evidence. The focus of this paper is to demonstrate how, in a digital investigation, digital forensic tools and the self-organising map (SOM), an unsupervised neural network model, can aid investigators in determining anomalous behaviours (or activities) among employees (or computer systems) in a far more efficient manner. By analysing the different SOMs (one for each computer system), anomalous behaviours are identified and investigators are helped to conduct the analysis more efficiently. The paper demonstrates how the easy visualisation of the SOM enhances the investigators' ability to interpret and explore the data generated by digital forensic tools so as to determine anomalous behaviours.
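Because the SOM is the paper's central mechanism, a compact sketch may make the idea concrete. The Python fragment below is a minimal illustration, not the authors' implementation: the per-machine feature vectors are hypothetical, and anomaly is flagged by quantisation error (distance to the best-matching unit), one common SOM-based criterion.

import numpy as np

def train_som(data, grid=(8, 8), iters=2000, lr0=0.5, sigma0=3.0, seed=0):
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        # Best-matching unit: the node whose weight vector is closest to the sample.
        bmu = np.unravel_index(np.argmin(((weights - x) ** 2).sum(-1)), (h, w))
        lr = lr0 * np.exp(-t / iters)           # decaying learning rate
        sigma = sigma0 * np.exp(-t / iters)     # shrinking neighbourhood
        dist2 = ((coords - np.array(bmu)) ** 2).sum(-1)
        nbh = np.exp(-dist2 / (2 * sigma ** 2))[..., None]
        weights += lr * nbh * (x - weights)
    return weights

def quantisation_errors(data, weights):
    # Distance from each sample to its best-matching unit; unusually large
    # errors indicate behaviour the map did not learn, i.e. a candidate anomaly.
    flat = weights.reshape(-1, weights.shape[-1])
    return np.min(np.linalg.norm(data[:, None, :] - flat[None, :, :], axis=-1), axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    normal = rng.normal(0.0, 1.0, size=(200, 5))      # typical usage profiles
    outlier = np.array([[8.0, 8.0, 8.0, 8.0, 8.0]])   # one suspicious machine
    data = np.vstack([normal, outlier])
    errs = quantisation_errors(data, train_som(normal))
    print("most anomalous sample index:", int(np.argmax(errs)))  # expect 200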
NASA Astrophysics Data System (ADS)
Omran, Adel; Dietrich, Schröder; Abouelmagd, Abdou; Michael, Märker
2016-09-01
Damage caused by flash flood hazards is an increasing phenomenon, especially in arid and semi-arid areas. Thus, the need to evaluate these areas for flash flood risk using maps and hydrological models is also becoming more important. For ungauged watersheds, a tentative analysis can be carried out based on the geomorphometric characteristics of the terrain. To process regions with larger watersheds, where perhaps hundreds of watersheds have to be delineated, processed and classified, the overall process needs to be automated. GIS packages such as ESRI's ArcGIS offer a number of sophisticated tools that help with such analyses. Yet there are still gaps and pitfalls that need to be considered if the tools are combined into a geoprocessing model to automate the complete assessment workflow. These gaps include issues such as i) assigning stream order according to the Strahler theory, ii) calculating the threshold value for stream network extraction, and iii) determining the pour points for each of the nodes of the Strahler-ordered stream network. In this study, a completely automated workflow based on ArcGIS Model Builder using standard tools is introduced and discussed. Some additional tools have been implemented to complete the overall workflow. These tools have been programmed using Python and Java in the context of ArcObjects. The workflow has been applied to digital data from the southwestern Sinai Peninsula, Egypt. An optimum threshold value was selected to optimize the drainage configuration by statistically comparing all of the stream configurations extracted from the DEM with the available reference data from topographic maps. The code succeeded in estimating the correct ranking of specific stream orders automatically, without additional manual steps. As a result, the code has proven to save time and effort; hence it is considered a very useful tool for processing large catchment basins.
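The Strahler-ordering step that the workflow automates can be shown with a toy implementation. This sketch is plain Python on a hypothetical link network, not the paper's ArcObjects code: a headwater link has order 1, and order increases only where the two highest upstream orders are equal.

def strahler_order(children):
    """children maps a node to the list of upstream nodes draining into it."""
    def order(node):
        ups = children.get(node, [])
        if not ups:                      # headwater link
            return 1
        orders = sorted((order(u) for u in ups), reverse=True)
        # Strahler rule: order increases only when the two highest
        # upstream orders are equal.
        if len(orders) > 1 and orders[0] == orders[1]:
            return orders[0] + 1
        return orders[0]
    return order

# Example: two first-order headwaters join (-> order 2), then meet a
# first-order tributary (order stays 2).
net = {"confluence1": ["head1", "head2"], "outlet": ["confluence1", "head3"]}
print(strahler_order(net)("outlet"))  # 2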
Molecular epidemiology: new rules for new tools?
Merlo, Domenico Franco; Sormani, Maria Pia; Bruzzi, Paolo
2006-08-30
Molecular epidemiology combines biological markers and epidemiological observations in the study of the environmental and genetic determinants of cancer and other diseases. The potential advantages associated with biomarkers are manifold and include: (a) increased sensitivity and specificity to carcinogenic exposures; (b) more precise evaluation of the interplay between genetic and environmental determinants of cancer; (c) earlier detection of carcinogenic effects of exposure; (d) characterization of disease subtypes-etiologies patterns; (e) evaluation of primary prevention measures. These, in turn, may translate into better tools for etiologic research, individual risk assessment, and, ultimately, primary and secondary prevention. An area that has not received sufficient attention concerns the validation of these biomarkers as surrogate endpoints for cancer risk. Validation of a candidate biomarker's surrogacy is the demonstration that it possesses the properties required for its use as a substitute for a true endpoint. The principles underlying the validation process underwent remarkable developments and discussion in therapeutic research. However, the challenges posed by the application of these principles to epidemiological research, where the basic tool for this validation (i.e., the randomized study) is seldom possible, have not been thoroughly explored. The validation process of surrogacy must be applied rigorously to intermediate biomarkers of cancer risk before using them as risk predictors at the individual as well as at the population level.
A Critical Assessment of Vector Control for Dengue Prevention
Achee, Nicole L.; Gould, Fred; Perkins, T. Alex; Reiner, Robert C.; Morrison, Amy C.; Ritchie, Scott A.; Gubler, Duane J.; Teyssou, Remy; Scott, Thomas W.
2015-01-01
Recently, the Vaccines to Vaccinate (v2V) initiative was reconfigured into the Partnership for Dengue Control (PDC), a multi-sponsored and independent initiative. This redirection is consistent with the growing consensus among the dengue-prevention community that no single intervention will be sufficient to control dengue disease. The PDC's expectation is that when an effective dengue virus (DENV) vaccine is commercially available, the public health community will continue to rely on vector control because the two strategies complement and enhance one another. Although the concept of integrated intervention for dengue prevention is gaining increasingly broader acceptance, to date, no consensus has been reached regarding the details of how and what combination of approaches can be most effectively implemented to manage disease. To fill that gap, the PDC proposed a three step process: (1) a critical assessment of current vector control tools and those under development, (2) outlining a research agenda for determining, in a definitive way, what existing tools work best, and (3) determining how to combine the best vector control options, which have systematically been defined in this process, with DENV vaccines. To address the first step, the PDC convened a meeting of international experts during November 2013 in Washington, DC, to critically assess existing vector control interventions and tools under development. This report summarizes those deliberations. PMID:25951103
The development of an online decision support tool for organizational readiness for change.
Khan, Sobia; Timmings, Caitlyn; Moore, Julia E; Marquez, Christine; Pyka, Kasha; Gheihman, Galina; Straus, Sharon E
2014-05-10
Much importance has been placed on assessing readiness for change as one of the earliest steps of implementation, but measuring it can be a complex and daunting task. Organizations and individuals struggle with how to reliably and accurately measure readiness for change. Several measures have been developed to help organizations assess readiness, but these are often underused due to the difficulty of selecting the right measure. In response to this challenge, we will develop and test a prototype of a decision support tool that is designed to guide individuals interested in implementation in the selection of an appropriate readiness assessment measure for their setting. A multi-phase approach will be used to develop the decision support tool. First, we will identify key measures for assessing organizational readiness for change from a recently completed systematic review. Included measures will be those developed for healthcare settings (e.g., acute care, public health, mental health) and that have been deemed valid and reliable. Second, study investigators and field experts will engage in a mapping exercise to categorize individual items of included measures according to key readiness constructs from an existing framework. Third, a stakeholder panel will be recruited and consulted to determine the feasibility and relevance of the selected measures using a modified Delphi process. Fourth, findings from the mapping exercise and stakeholder consultation will inform the development of a decision support tool that will guide users in appropriately selecting change readiness measures. Fifth, the tool will undergo usability testing. Our proposed decision support tool will address current challenges in the field of organizational change readiness by aiding individuals in selecting a valid and reliable assessment measure that is relevant to user needs and practice settings. We anticipate that implementers and researchers who use our tool will be more likely to conduct readiness for change assessments in their settings when planning for implementation. This, in turn, may contribute to more successful implementation outcomes. We will test this tool in a future study to determine its efficacy and impact on implementation processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schunk, Peter Randall; King, William P.; Sun, Amy Cha-Tien
2006-08-01
This paper presents continuum simulations of polymer flow during nanoimprint lithography (NIL). The simulations capture the underlying physics of polymer flow from the nanometer to millimeter length scale and examine geometry and thermophysical process quantities affecting cavity filling. Variations in embossing tool geometry and polymer film thickness during viscous flow distinguish different flow driving mechanisms. Three parameters can predict the polymer deformation mode: the cavity width to polymer thickness ratio, the polymer supply ratio, and the capillary number. The ratio of cavity width to initial polymer film thickness determines vertically or laterally dominant deformation. The ratio of indenter width to residual film thickness measures the polymer supply beneath the indenter, which determines Stokes or squeeze flow. The local geometry ratios can predict a fill time based on laminar flow between plates, Stokes flow, or squeeze flow. A characteristic NIL capillary number based on the geometry-dependent fill time distinguishes between capillary- and viscous-driven flows. The three parameters predict filling modes observed in published studies of NIL deformation over nanometer to millimeter length scales. The work seeks to establish process design rules for NIL and to provide tools for the rational design of NIL master templates, resist polymers, and process parameters.
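A small sketch of the three dimensionless groups named above may help; the function signature and the example magnitudes are illustrative assumptions, not values from the study.

def nil_parameters(cavity_width, film_thickness, indenter_width,
                   residual_thickness, viscosity, velocity, surface_tension):
    """Return the three groups said to predict the deformation mode."""
    width_ratio = cavity_width / film_thickness         # vertical vs lateral flow
    supply_ratio = indenter_width / residual_thickness  # Stokes vs squeeze flow
    capillary = viscosity * velocity / surface_tension  # Ca: viscous vs capillary
    return width_ratio, supply_ratio, capillary

# Plausible NIL magnitudes: 100 nm cavity in a 200 nm film, 500 nm indenter,
# 100 nm residual layer, a 1e4 Pa*s polymer melt embossed at 1 um/s, 30 mN/m.
print(nil_parameters(100e-9, 200e-9, 500e-9, 100e-9, 1e4, 1e-6, 0.03))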
Determining Spacecraft Reaction Wheel Friction Parameters
NASA Technical Reports Server (NTRS)
Sarani, Siamak
2009-01-01
Software was developed to characterize the drag in each of the Cassini spacecraft's Reaction Wheel Assemblies (RWAs) to determine the RWA friction parameters. This tool measures the drag torque of RWAs for not only the high spin rates (greater than 250 RPM), but also the low spin rates (less than 250 RPM) where there is a lack of an elastohydrodynamic boundary layer in the bearings. RWA rate and drag torque profiles as functions of time are collected via telemetry once every 4 seconds and once every 8 seconds, respectively. Intermediate processing steps single-out the coast-down regions. A nonlinear model for the drag torque as a function of RWA spin rate is incorporated in order to characterize the low spin rate regime. The tool then uses a nonlinear parameter optimization algorithm based on the Nelder-Mead simplex method to determine the viscous coefficient, the Dahl friction, and the two parameters that account for the low spin-rate behavior.
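The fitting step lends itself to a short sketch. The drag-torque model below (a viscous term, a Dahl/Coulomb-like term, and a two-parameter exponential low-rate correction) is an assumed form for illustration, not the Cassini model; the data are synthetic, and the fit uses SciPy's Nelder-Mead simplex method as the abstract describes.

import numpy as np
from scipy.optimize import minimize

def drag_torque(omega, c, tau_d, a, b):
    # c: viscous coefficient; tau_d: Dahl friction level;
    # a, b: amplitude and rate scale of the assumed low-spin-rate term.
    return c * omega + np.sign(omega) * (tau_d + a * np.exp(-np.abs(omega) / b))

def fit(omega, torque):
    cost = lambda p: np.sum((drag_torque(omega, *p) - torque) ** 2)
    res = minimize(cost, x0=[1e-6, 1e-3, 1e-3, 50.0], method="Nelder-Mead")
    return res.x

# Synthetic coast-down data (rad/s -> N.m) just to exercise the fit.
rng = np.random.default_rng(0)
omega = np.linspace(5.0, 300.0, 200)
true = drag_torque(omega, 2e-6, 2e-3, 1.5e-3, 40.0)
print(fit(omega, true + rng.normal(0, 1e-5, omega.size)))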
Residency application screening tools: A survey of academic medical centers.
Hillebrand, Kristen; Leinum, Corey J; Desai, Sonya; Pettit, Natasha N; Fuller, Patrick D
2015-06-01
The current use and content of screening tools utilized by ASHP-accredited pharmacy residency programs were assessed. A survey consisting of 19 questions assessing residency programs and the screening of pharmacy residency program applicants was e-mailed to residency directors of 362 pharmacy residency programs at 105 University HealthSystem Consortium (UHC)-member institutions. Questions gathered general program demographic information, data related to applicant growth from residency years 2010-11 to 2011-12, and information about the residency screening processes currently used. Responses were received from 73 residency program sites (69.5%) of the 105 UHC-member institutions to whom the e-mail was sent. Many sites used screening tools to calculate applicants' scores and then determined which candidates to invite for an onsite interview based on applicants' scores and group discussion. Seventy-eight percent (n = 57) of the 73 responding institutions reported the use of a screening tool or rubric to select applicants to invite for onsite interviews. The most common method of evaluation was individual applicant review before meeting as a group to discuss candidate selection. The most important factor for determining which residency candidate to interview was the overall impression based on the candidate's curriculum vitae (CV) and letters of recommendation. Most residency programs in UHC-member hospitals used a screening tool to determine which applicants to invite for an onsite interview.
Determination of the Actual Land Use Pattern Using Unmanned Aerial Vehicles and Multispectral Camera
NASA Astrophysics Data System (ADS)
Dindaroğlu, T.; Gündoğan, R.; Gülci, S.
2017-11-01
The international initiatives developed in the context of combating global warming are based on the monitoring of Land Use, Land-Use Change and Forestry (LULUCF). Determination of changes in land use patterns is used to determine the effects of greenhouse gas emissions and to reduce adverse effects in subsequent processes. This task, which requires the investigation and control of quite large areas, has undoubtedly increased the importance of technological tools and equipment. The use of carrier platforms and various commercially cheaper sensors has become widespread. In this study, a multispectral camera was used to determine the land use pattern with high sensitivity. Unmanned aerial flights were carried out in the research fields of the Kahramanmaras Sutcu Imam University campus area. An unmanned aerial vehicle (UAV), a multi-propeller hexacopter, was used as the carrier platform for the aerial photographs.
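One common way to turn multispectral imagery into a land use pattern is an NDVI threshold map. The sketch below is a generic illustration: the thresholds, class labels, and pixel values are made-up assumptions, not the study's workflow.

import numpy as np

def ndvi(red, nir):
    red = red.astype(float)
    nir = nir.astype(float)
    return (nir - red) / (nir + red + 1e-9)  # small epsilon avoids 0/0

def classify(ndvi_map):
    # Later assignments override earlier ones, so order matters.
    classes = np.full(ndvi_map.shape, "bare", dtype=object)
    classes[ndvi_map > 0.2] = "sparse vegetation"
    classes[ndvi_map > 0.5] = "dense vegetation"
    classes[ndvi_map < 0.0] = "water"
    return classes

red = np.array([[0.30, 0.05], [0.20, 0.10]])
nir = np.array([[0.35, 0.60], [0.10, 0.55]])
print(classify(ndvi(red, nir)))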
Creating a user friendly GIS tool to define functional process zones
The goal of this research is to develop methods and indicators that are useful for evaluating the condition of aquatic communities, for assessing the restoration of aquatic communities in response to mitigation and best management practices, and for determining the exposure of aq...
ERIC Educational Resources Information Center
Lippert, Robert
2004-01-01
Student portfolios can offer great benefits to both high school and postsecondary education programs. They assist instructors in determining the progress of a student's performance and provide students with a vital self-promotion tool for job searches or the higher education application process. This brief article describes a few considerations to…
2013-01-01
Background Several measurement tools have been developed to measure health literacy. The tools vary in their approach and design, but few have focused on comprehensive health literacy in populations. This paper describes the design and development of the European Health Literacy Survey Questionnaire (HLS-EU-Q), an innovative, comprehensive tool to measure health literacy in populations. Methods Based on a conceptual model and definition, the process involved item development, pre-testing, field-testing, external consultation, a plain language check, and translation from English into Bulgarian, Dutch, German, Greek, Polish, and Spanish. Results The development process resulted in the HLS-EU-Q, which entailed two sections: a core health literacy section and a section on determinants and outcomes associated with health literacy. The health literacy section included 47 items addressing self-reported difficulties in accessing, understanding, appraising and applying information in tasks concerning decision making in healthcare, disease prevention, and health promotion. The second section included items related to health behaviour, health status, health service use, community participation, and socio-demographic and socio-economic factors. Conclusions By illuminating the detailed steps in the design and development process of the HLS-EU-Q, the aim is to provide a deeper understanding of its purpose, its capabilities and its limitations for others using the tool. By stimulating wide application, the vision is that the HLS-EU-Q will be validated in more countries to enhance the understanding of health literacy in different populations. PMID:24112855
Diagnostic tools for mixing models of stream water chemistry
Hooper, Richard P.
2003-01-01
Mixing models provide a useful null hypothesis against which to evaluate processes controlling stream water chemical data. Because conservative mixing of end‐members with constant concentration is a linear process, a number of simple mathematical and multivariate statistical methods can be applied to this problem. Although mixing models have been most typically used in the context of mixing soil and groundwater end‐members, an extension of the mathematics of mixing models is presented that assesses the “fit” of a multivariate data set to a lower dimensional mixing subspace without the need for explicitly identified end‐members. Diagnostic tools are developed to determine the approximate rank of the data set and to assess lack of fit of the data. This permits identification of processes that violate the assumptions of the mixing model and can suggest the dominant processes controlling stream water chemical variation. These same diagnostic tools can be used to assess the fit of the chemistry of one site into the mixing subspace of a different site, thereby permitting an assessment of the consistency of controlling end‐members across sites. This technique is applied to a number of sites at the Panola Mountain Research Watershed located near Atlanta, Georgia.
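The rank and lack-of-fit diagnostics can be sketched with standard linear algebra. The fragment below is illustrative (synthetic data, a simplified residual statistic) rather than a reproduction of the paper's diagnostics: it projects a samples-by-solutes matrix onto a k-dimensional subspace via the SVD and reports a relative residual per solute.

import numpy as np

def mixing_diagnostics(X, k):
    """X: samples x solutes matrix; k: candidate mixing-subspace dimension."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    proj = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]   # rank-k approximation
    residuals = Xc - proj
    # Relative RMSE per solute: structure in the residuals suggests k is
    # too small or that conservative mixing is violated.
    rrmse = np.sqrt((residuals ** 2).mean(0)) / X.std(0)
    return residuals, rrmse

# Two end-members mixed in random proportions -> after centring, the data
# should fit a one-dimensional mixing subspace with small residuals.
rng = np.random.default_rng(0)
e1, e2 = np.array([10.0, 1.0, 5.0]), np.array([2.0, 6.0, 1.0])
f = rng.random((100, 1))
X = f * e1 + (1 - f) * e2 + rng.normal(0, 0.05, (100, 3))
print(mixing_diagnostics(X, 1)[1])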
Advances in the production of freeform optical surfaces
NASA Astrophysics Data System (ADS)
Tohme, Yazid E.; Luniya, Suneet S.
2007-05-01
Recent market demands for free-form optics have challenged the industry to find new methods and techniques to manufacture free-form optical surfaces with a high level of accuracy and reliability. Production techniques are becoming a mix of multi-axis single-point diamond machining centers or deterministic ultra-precision grinding centers coupled with capable measurement systems to accomplish the task. It has been determined that a complex software tool is required to seamlessly integrate all aspects of the manufacturing process chain. Advances in computational power and improved performance of computer-controlled precision machinery have driven the use of such software programs to measure, visualize, analyze, produce and re-validate the 3D free-form design, thus making the manufacture of such complex surfaces a viable task. Consolidation of the entire production cycle in a comprehensive software tool that can interact with all systems in the design, production and measurement phases will enable manufacturers to solve these complex challenges, providing improved product quality, simplified processes, and enhanced performance. The work presented here describes the latest advancements in developing such a software package for the entire fabrication process chain for aspheric and free-form shapes. It applies a rational B-spline based kernel to transform an optical design in the form of a parametric definition (optical equation), a standard CAD format, or a cloud of points into a central format that drives the simulation. This software tool creates a closed loop for the fabrication process chain. It integrates surface analysis and compensation, tool path generation, and measurement analysis in one package.
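As background on the rational B-spline kernel mentioned above, the toy fragment below evaluates a point on a NURBS curve using the textbook Cox-de Boor recursion. This is generic NURBS machinery, not the software package described in the paper.

import numpy as np

def bspline_basis(i, p, u, knots):
    # Cox-de Boor recursion for the B-spline basis function N_{i,p}(u).
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    d1 = knots[i + p] - knots[i]
    if d1 > 0:
        left = (u - knots[i]) / d1 * bspline_basis(i, p - 1, u, knots)
    d2 = knots[i + p + 1] - knots[i + 1]
    if d2 > 0:
        right = (knots[i + p + 1] - u) / d2 * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_point(u, ctrl, weights, knots, p):
    N = np.array([bspline_basis(i, p, u, knots) for i in range(len(ctrl))])
    wN = weights * N
    return (wN[:, None] * ctrl).sum(0) / wN.sum()  # rational (weighted) blend

# Quadratic NURBS arc: with these weights the curve is an exact quarter circle.
ctrl = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
weights = np.array([1.0, np.sqrt(2) / 2, 1.0])
knots = [0, 0, 0, 1, 1, 1]
print(nurbs_point(0.5, ctrl, weights, knots, 2))  # lies on the unit circle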
Proximity matching for ArF and KrF scanners
NASA Astrophysics Data System (ADS)
Kim, Young Ki; Pohling, Lua; Hwee, Ng Teng; Kim, Jeong Soo; Benyon, Peter; Depre, Jerome; Hong, Jongkyun; Serebriakov, Alexander
2009-03-01
Many IC manufacturers around the world use various exposure systems and work to very demanding requirements in order to establish and maintain stable lithographic processes at 65 nm, 45 nm and below. Once a process is established, the manufacturer wants to be able to run it on whichever tools are available. This is why proximity matching plays a key role in maximizing tool utilization, in terms of productivity, across different types of exposure tools. In this paper, we investigate the sources of error that cause optical proximity mismatch and evaluate several approaches for proximity matching of different types of 193 nm and 248 nm scanner systems, such as set-get sigma calibration, contrast adjustment, and, finally, tuning imaging parameters by optimization with Manual Scanner Matcher. First, to monitor the proximity mismatch, we collect CD measurement data for the reference tool and for the tool-to-be-matched. Normally, the measurement is performed for a set of line or space through-pitch structures. Secondly, by simulation or experiment, we determine the sensitivity of the critical structures with respect to small adjustments of exposure settings such as NA, sigma inner, sigma outer, dose, focus scan range, etc., which are called 'proximity tuning knobs'. Then, with the help of special optimization software, we compute the proximity knob adjustment that has to be applied to the tool-to-be-matched to match the reference tool. Finally, we verify successful matching by exposing on the tool-to-be-matched with the tuned exposure settings. This procedure is applicable to inter- and intra-scanner-type matching, but possibly also to process transfers to the design targets. To illustrate the approach, we show experimental data as well as results of imaging simulations. These demonstrate successful matching of critical structures for ArF scanners of different tool generations.
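The knob-adjustment computation can be viewed as a small least-squares problem: given a sensitivity matrix (CD change per unit of each tuning knob) and the measured through-pitch CD mismatch, solve for the adjustments. All numbers below are invented for illustration; the actual optimization software is not reproduced here.

import numpy as np

# Rows: CD structures (e.g., dense / semi-dense / isolated pitch);
# columns: tuning knobs (e.g., sigma outer, dose). Units: nm per knob unit.
S = np.array([[0.8, 0.3],
              [0.5, 0.6],
              [0.1, 0.9]])
delta_cd = np.array([1.2, 0.9, 0.4])  # reference minus tool-to-be-matched (nm)

adjust, *_ = np.linalg.lstsq(S, delta_cd, rcond=None)
print("knob adjustments:", adjust)
print("residual CD mismatch:", delta_cd - S @ adjust)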
ERIC Educational Resources Information Center
Kaya, Deniz
2017-01-01
The purpose of the study is to perform a thorough lower-dimensional visualization process for determining the students' images of the concept of angle. The Ward clustering analysis combined with the Self-Organizing Neural Network Map (SOM) has been used for the dimension-reduction process. The Conceptual Understanding Tool, which consisted…
Implementation of lean manufacturing for frozen fish process at PT. XYZ
NASA Astrophysics Data System (ADS)
Setiyawan, D. T.; Pertiwijaya, H. R.; Effendi, U.
2018-03-01
PT. XYZ is a company specializing in the processing of fishery products, particularly frozen fish fillets. The purpose of this research was to identify the types of waste and determine recommendations for minimizing them. A lean manufacturing approach was used to identify waste by drawing the Value Stream Mapping (VSM) and selecting tools from the Value Stream Analysis Tools (VALSAT). The results of this research showed that the most frequent waste generated was the defect of leaking packaging on fillet products, with an average of 1.21%. In addition to defects, other shortcomings were found, such as unnecessary motion, unnecessary overhead, and waiting time. Recommended improvements include reducing time at several stages of the process, creating production schedules, and conducting regular machine maintenance. VSM analysis shows lead time reduced from 582.04 minutes to 572.01 minutes.
Development of Universal Portable Spray Stand for Touch-Up Process in The Automotive Paintshop
NASA Astrophysics Data System (ADS)
Fatah Muhamed Mukhtar, Muhamed Abdul; Mohideen Shahul Hameed, Rasool
2016-02-01
A spray stand is a custom-made tool used to hold automotive body parts, as well as the devices used to assist the operator, during the touch-up process in paint shop production. This paper discusses the development of the Universal Portable Spray Stand (UPSS) as a tool to hold various types of automotive body parts and car models during the painting process. The main objective of this study is to determine the effective application of the UPSS at the International College of Automotive (ICAM) and in the automotive industry. This will help add features to the current spray stand at ICAM and add value to the spray stand based on the selected criteria: universality, portability, and cost saving. In addition, the UPSS is also expected to reduce the cycle time of the touch-up process, paint defects, and ergonomic issues among the operators.
2007-09-10
KENNEDY SPACE CENTER, FLA. -- In bay 3 of the Orbiter Processing Facility, a tool storage assembly unit is being moved for storage in Discovery's payload bay. The tools may be used on a spacewalk, yet to be determined, during mission STS-120. In an unusual operation, the payload bay doors had to be reopened after closure to accommodate the storage. Space shuttle Discovery is targeted to launch Oct. 23 to the International Space Station. It will carry the U.S. Node 2, a connecting module, named Harmony, for assembly on the space station. Photo credit: NASA/Amanda Diller
Determining the spatial altitude of the hydraulic fractures.
NASA Astrophysics Data System (ADS)
Khamiev, Marsel; Kosarev, Victor; Goncharova, Galina
2016-04-01
Mathematical modeling and numerical simulation are the most widely used approaches for solving geological problems. They rely on software tools based on the Monte Carlo method. The results of this project show the possibility of using a PNL tool to determine fracture location. The modeled medium is a homogeneous rock (limestone) cut by a vertical borehole (d = 216 mm) with a 9 mm thick metal casing. The cement sheath is 35 mm thick. The borehole is filled with fresh water. The rock mass is cut by a crack filled with a mixture of doped (gadolinium oxide, Gd2O3) proppant (75%) and water (25%). A pulsed neutron logging (PNL) tool is used for quality control in hydraulic fracturing operations. It includes a fast neutron source (a so-called "neutron generator") and a set of thermal (or epithermal) neutron-sensing devices forming the so-called near (ND) and far (FD) detectors. To evaluate the neutron properties of various segments (sectors) of the rock mass, the detector must register only neutrons that come from that very formation. This is possible if the detecting block includes several (for example, six) thermal neutron detectors arranged circumferentially inside the tool. As a result, we obtain several independent well logs, each corresponding to a defined rock sector. After processing the synthetic logs, we can determine the spatial position of the hydraulic fracture.
EasyDIAg: A tool for easy determination of interrater agreement.
Holle, Henning; Rein, Robert
2015-09-01
Reliable measurements are fundamental for the empirical sciences. In observational research, measurements often consist of observers categorizing behavior into nominal-scaled units. Since the categorization is the outcome of a complex judgment process, it is important to evaluate the extent to which these judgments are reproducible, by having multiple observers independently rate the same behavior. A challenge in determining interrater agreement for timed-event sequential data is to develop clear objective criteria to determine whether two raters' judgments relate to the same event (the linking problem). Furthermore, many studies presently report only raw agreement indices, without considering the degree to which agreement can occur by chance alone. Here, we present a novel, free, and open-source toolbox (EasyDIAg) designed to assist researchers with the linking problem, while also providing chance-corrected estimates of interrater agreement. Additional tools are included to facilitate the development of coding schemes and rater training.
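To show why chance correction matters, here is a minimal Cohen's kappa on hypothetical two-rater codes. EasyDIAg's linking of timed events and its agreement statistics are more involved; this sketch only contrasts raw agreement with a chance-corrected value.

from collections import Counter

def cohens_kappa(r1, r2):
    n = len(r1)
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n        # raw agreement
    c1, c2 = Counter(r1), Counter(r2)
    p_chance = sum(c1[k] * c2[k] for k in c1) / n ** 2     # expected by chance
    return (p_obs - p_chance) / (1 - p_chance)

rater1 = ["gesture", "gesture", "rest", "gesture", "rest", "rest"]
rater2 = ["gesture", "rest", "rest", "gesture", "rest", "gesture"]
print(cohens_kappa(rater1, rater2))  # raw agreement is 4/6; kappa is only 0.33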
Vector production in an academic environment: a tool to assess production costs.
Boeke, Aaron; Doumas, Patrick; Reeves, Lilith; McClurg, Kyle; Bischof, Daniela; Sego, Lina; Auberry, Alisha; Tatikonda, Mohan; Cornetta, Kenneth
2013-02-01
Generating gene and cell therapy products under good manufacturing practices is a complex process. When determining the cost of these products, researchers must consider the large number of supplies used for manufacturing and the personnel and facility costs to generate vector and maintain a cleanroom facility. To facilitate cost estimates, the Indiana University Vector Production Facility teamed with the Indiana University Kelley School of Business to develop a costing tool that, in turn, provides pricing. The tool is designed in Microsoft Excel and is customizable to meet the needs of other core facilities. It is available from the National Gene Vector Biorepository. The tool allows cost determinations using three different costing methods and was developed in an effort to meet the A21 circular requirements for U.S. core facilities performing work for federally funded projects. The costing tool analysis reveals that the cost of vector production does not have a linear relationship with batch size. For example, increasing the production from 9 to 18 liters of a retroviral vector product increases total costs a modest 1.2-fold rather than doubling in total cost. The analysis discussed in this article will help core facilities and investigators plan a cost-effective strategy for gene and cell therapy production.
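The reported nonlinearity is what a mostly fixed cost base would produce: if facility and personnel costs are incurred per production run regardless of volume, doubling the batch raises total cost far less than twofold. The toy model below uses invented figures, not values from the costing tool, and merely mirrors the reported 1.2-fold effect.

def batch_cost(liters, fixed_run_cost=40000.0, cost_per_liter=1000.0):
    # fixed_run_cost: cleanroom, personnel, QC per run (invented placeholder)
    # cost_per_liter: consumables that scale with volume (invented placeholder)
    return fixed_run_cost + cost_per_liter * liters

c9, c18 = batch_cost(9), batch_cost(18)
print(c18 / c9)  # ~1.2x total cost for a 2x batch size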
Flame analysis using image processing techniques
NASA Astrophysics Data System (ADS)
Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng
2018-04-01
This paper presents image processing techniques with a fuzzy logic and neural network approach to perform flame analysis. Flame diagnostics are important in industry for extracting relevant information from flame images. Experimental tests were carried out in a model industrial burner with different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermoacoustic oscillations and background noise affect the stability of the flame. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. The power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to determine flame stability automatically. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
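The PSD step can be sketched briefly: estimate the spectrum of a mean-intensity trace taken from successive frames and read off the dominant oscillation. The frame rate, oscillation frequency, and noise level below are assumptions for illustration, not measurements from the burner experiments.

import numpy as np
from scipy.signal import welch

fs = 200.0                      # assumed camera frame rate, Hz
t = np.arange(0, 5, 1 / fs)
# Mean image intensity oscillating at 30 Hz over broadband noise.
signal = (1.0 + 0.2 * np.sin(2 * np.pi * 30 * t)
          + 0.05 * np.random.default_rng(0).normal(size=t.size))

f, psd = welch(signal, fs=fs, nperseg=256)   # Welch PSD estimate
print("dominant frequency: %.1f Hz" % f[np.argmax(psd[1:]) + 1])  # skip DC bin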
Kahl, Johannes; Bodroza-Solarov, Marija; Busscher, Nicolaas; Hajslova, Jana; Kneifel, Wolfgang; Kokornaczyk, Maria Olga; van Ruth, Saskia; Schulzova, Vera; Stolz, Peter
2014-10-01
Organic food quality determination needs multi-dimensional evaluation tools. The main focus is on authentication as an analytical verification of the certification process. New fingerprinting approaches such as ultra-performance liquid chromatography-mass spectrometry, gas chromatography-mass spectrometry, direct analysis in real time-high-resolution mass spectrometry, as well as crystallization with and without the presence of additives, seem to be promising methods in terms of time of analysis and detection of organic system-related parameters. For further methodological development, a system approach is recommended, which also takes into account food structure aspects. Furthermore, the authentication of processed organic samples needs more attention, since most organic food is complex and processed.
Hotspot Patterns: The Formal Definition and Automatic Detection of Architecture Smells
2015-01-15
serious question for a project manager or architect: how to determine which parts of the code base should be given higher priority for maintenance and... services framework; Hadoop is a tool for distributed processing of large data sets; HBase is the Hadoop database; Ivy is a dependency management tool... To answer this question more rigorously, we conducted a Pearson correlation analysis to test the dependency between the number of issues a file involves...
Identification of novel peptides for horse meat speciation in highly processed foodstuffs.
Claydon, Amy J; Grundy, Helen H; Charlton, Adrian J; Romero, M Rosario
2015-01-01
There is a need for robust analytical methods to support enforcement of food labelling legislation. Proteomics is emerging as a complementary methodology to existing tools such as DNA and antibody-based techniques. Here we describe the development of a proteomics strategy for the determination of meat species in highly processed foods. A database of specific peptides for nine relevant animal species was used to enable semi-targeted species determination. This principle was tested for horse meat speciation, and a range of horse-specific peptides were identified as heat stable marker peptides for the detection of low levels of horse meat in mixtures with other species.
Farnbach, Sara; Evans, John; Eades, Anne-Marie; Gee, Graham; Fernando, Jamie; Hammond, Belinda; Simms, Matty; DeMasi, Karrina; Hackett, Maree
2017-11-03
Process evaluations are conducted alongside research projects to identify the context, impact and consequences of research, determine whether it was conducted per protocol and to understand how, why and for whom an intervention is effective. We present a process evaluation protocol for the Getting it Right research project, which aims to determine validity of a culturally adapted depression screening tool for use by Aboriginal and Torres Strait Islander people. In this process evaluation, we aim to: (1) explore the context, impact and consequences of conducting Getting It Right, (2) explore primary healthcare staff and community representatives' experiences with the research project, (3) determine if it was conducted per protocol and (4) explore experiences with the depression screening tool, including perceptions about how it could be implemented into practice (if found to be valid). We also describe the partnerships established to conduct this process evaluation and how the national Values and Ethics: Guidelines for Ethical Conduct in Aboriginal and Torres Strait Islander Health Research is met. Realist and grounded theory approaches are used. Qualitative data include semistructured interviews with primary healthcare staff and community representatives involved with Getting it Right. Iterative data collection and analysis will inform a coding framework. Interviews will continue until saturation of themes is reached, or all participants are considered. Data will be triangulated against administrative data and patient feedback. An Aboriginal and Torres Strait Islander Advisory Group guides this research. Researchers will be blinded from validation data outcomes for as long as is feasible. The University of Sydney Human Research Ethics Committee, Aboriginal Health and Medical Research Council of New South Wales and six state ethics committees have approved this research. Findings will be submitted to academic journals and presented at conferences. ACTRN12614000705684.
Using strategic foresight to assess conservation opportunity.
Cook, Carly N; Wintle, Bonnie C; Aldrich, Stephen C; Wintle, Brendan A
2014-12-01
The nature of conservation challenges can foster a reactive, rather than proactive approach to decision making. Failure to anticipate problems before they escalate results in the need for more costly and time-consuming solutions. Proactive conservation requires forward-looking approaches to decision making that consider possible futures without being overly constrained by the past. Strategic foresight provides a structured process for considering the most desirable future and for mapping the most efficient and effective approaches to promoting that future with tools that facilitate creative thinking. The process involves 6 steps: setting the scope, collecting inputs, analyzing signals, interpreting the information, determining how to act, and implementing the outcomes. Strategic foresight is ideal for seeking, recognizing, and realizing conservation opportunities because it explicitly encourages a broad-minded, forward-looking perspective on an issue. Despite its potential value, the foresight process is rarely used to address conservation issues, and previous attempts have generally failed to influence policy. We present the strategic foresight process as it can be used for proactive conservation planning, describing some of the key tools in the foresight tool kit and how they can be used to identify and exploit different types of conservation opportunities. Scanning is an important tool for collecting and organizing diverse streams of information and can be used to recognize new opportunities and those that could be created. Scenario planning explores how current trends, drivers of change, and key uncertainties might influence the future and can be used to identify barriers to opportunities. Backcasting is used to map out a path to a goal and can determine how to remove barriers to opportunities. We highlight how the foresight process was used to identify conservation opportunities during the development of a strategic plan to address climate change in New York State. The plan identified solutions that should be effective across a range of possible futures. Illustrating the application of strategic foresight to identify conservation opportunities should provide the impetus for decision makers to explore strategic foresight as a way to support more proactive conservation policy, planning, and management.
Demonstrating Functional Equivalence of Pilot and Production Scale Freeze-Drying of BCG
ten Have, R.; Reubsaet, K.; van Herpen, P.; Kersten, G.; Amorij, J.-P.
2016-01-01
Process analytical technology (PAT)-tools were used to monitor freeze-drying of Bacille Calmette-Guérin (BCG) at pilot and production scale. Among the evaluated PAT-tools, there is the novel use of the vacuum valve open/close frequency for determining the endpoint of primary drying at production scale. The duration of primary drying, the BCG survival rate, and the residual moisture content (RMC) were evaluated using two different freeze-drying protocols and were found to be independent of the freeze-dryer scale evidencing functional equivalence. The absence of an effect of the freeze-dryer scale on the process underlines the feasibility of the pilot scale freeze-dryer for further BCG freeze-drying process optimization which may be carried out using a medium without BCG. PMID:26981867
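The valve open/close endpoint idea admits a simple illustration: while sublimation loads the chamber with vapour, the pressure-control valve cycles frequently, and the cycling rate falls off as primary drying ends. The detector below is a toy with an assumed window size and threshold, not the validated production logic.

import numpy as np

def endpoint_index(valve_events_per_min, window=10, threshold=2.0):
    """Return the first index where the windowed mean event rate drops low."""
    rates = np.convolve(valve_events_per_min, np.ones(window) / window, mode="valid")
    below = np.nonzero(rates < threshold)[0]
    return int(below[0]) + window - 1 if below.size else None

# Synthetic trace: busy valve during sublimation, quiet afterwards.
trace = np.concatenate([np.full(120, 8.0), np.linspace(8, 0.5, 30), np.full(50, 0.5)])
print("primary drying endpoint at minute:", endpoint_index(trace))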
Aron, Miles; Browning, Richard; Carugo, Dario; Sezgin, Erdinc; Bernardino de la Serna, Jorge; Eggeling, Christian; Stride, Eleanor
2017-05-12
Spectral imaging with polarity-sensitive fluorescent probes enables the quantification of cell and model membrane physical properties, including local hydration, fluidity, and lateral lipid packing, usually characterized by the generalized polarization (GP) parameter. With the development of commercial microscopes equipped with spectral detectors, spectral imaging has become a convenient and powerful technique for measuring GP and other membrane properties. The existing tools for spectral image processing, however, are insufficient for processing the large data sets afforded by this technological advancement, and are unsuitable for processing images acquired with rapidly internalized fluorescent probes. Here we present a MATLAB spectral imaging toolbox with the aim of overcoming these limitations. In addition to common operations, such as the calculation of distributions of GP values, generation of pseudo-colored GP maps, and spectral analysis, a key highlight of this tool is reliable membrane segmentation for probes that are rapidly internalized. Furthermore, handling for hyperstacks, 3D reconstruction and batch processing facilitates the analysis of data sets generated by time series, z-stack, and area scan microscope operations. Finally, the object size distribution is determined, which can provide insight into the mechanisms underlying changes in membrane properties and is desirable for, e.g., studies involving model membranes and surfactant-coated particles. Analysis is demonstrated for cell membranes, cell-derived vesicles, model membranes, and microbubbles with the environmentally sensitive probes Laurdan, carboxyl-modified Laurdan (C-Laurdan), Di-4-ANEPPDHQ, and Di-4-AN(F)EPPTEA (FE), for quantification of the local lateral density of lipids or lipid packing. The Spectral Imaging Toolbox is a powerful tool for the segmentation and processing of large spectral imaging datasets, with a reliable method for membrane segmentation and no programming ability required. The Spectral Imaging Toolbox can be downloaded from https://uk.mathworks.com/matlabcentral/fileexchange/62617-spectral-imaging-toolbox.
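The core GP computation is compact enough to sketch. The fragment below applies the conventional Laurdan definition GP = (I440 - I490)/(I440 + I490) per pixel; it is written in Python for illustration (the toolbox itself is MATLAB), and the channel wavelengths and sample arrays are assumptions.

import numpy as np

def gp_map(i_ordered, i_disordered):
    i_ordered = i_ordered.astype(float)        # e.g., ~440 nm channel
    i_disordered = i_disordered.astype(float)  # e.g., ~490 nm channel
    total = i_ordered + i_disordered
    # Guard against empty pixels; GP is undefined where total intensity is 0.
    safe = np.where(total > 0, total, 1.0)
    return np.where(total > 0, (i_ordered - i_disordered) / safe, np.nan)

ch440 = np.array([[120, 40], [90, 10]])
ch490 = np.array([[60, 80], [90, 10]])
print(gp_map(ch440, ch490))  # values in [-1, 1]; higher GP = more ordered membrane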
Applying the Karma Provenance tool to NASA's AMSR-E Data Production Stream
NASA Astrophysics Data System (ADS)
Ramachandran, R.; Conover, H.; Regner, K.; Movva, S.; Goodman, H. M.; Pale, B.; Purohit, P.; Sun, Y.
2010-12-01
Current procedures for capturing and disseminating provenance, or data product lineage, are limited in both what is captured and how it is disseminated to the science community. For example, the Advanced Microwave Scanning Radiometer for the Earth Observing System (AMSR-E) Science Investigator-led Processing System (SIPS) generates Level 2 and Level 3 data products for a variety of geophysical parameters. Data provenance and quality information for these data sets is either very general (e.g., user guides, a list of anomalous data receipt and processing conditions over the life of the missions) or difficult to access or interpret (e.g., quality flags embedded in the data, production history files not easily available to users). Karma is a provenance collection and representation tool designed and developed for data driven workflows such as the productions streams used to produce EOS standard products. Karma records uniform and usable provenance metadata independent of the processing system while minimizing both the modification burden on the processing system and the overall performance overhead. Karma collects both the process and data provenance. The process provenance contains information about the workflow execution and the associated algorithm invocations. The data provenance captures metadata about the derivation history of the data product, including algorithms used and input data sources transformed to generate it. As part of an ongoing NASA funded project, Karma is being integrated into the AMSR-E SIPS data production streams. Metadata gathered by the tool will be presented to the data consumers as provenance graphs, which are useful in validating the workflows and determining the quality of the data product. This presentation will discuss design and implementation issues faced while incorporating a provenance tool into a structured data production flow. Prototype results will also be presented in this talk.
Measuring Social-Emotional Skills to Advance Science and Practice
ERIC Educational Resources Information Center
McKown, Clark; Russo-Ponsaran, Nicole; Johnson, Jason
2016-01-01
The ability to understand and effectively interact with others is a critical determinant of academic, social, and life success (DiPerna & Elliott, 2002). An area in particular need of scalable, feasible, usable, and scientifically sound assessment tools is social-emotional comprehension, which includes mental processes enlisted to encode,…
Red Hair, Hot Tempers, and Hasty Assertions.
ERIC Educational Resources Information Center
Kelly, Ivan; Ryan, Alan
1983-01-01
Explains the use of contingency tables as a tool in assessing variables to determine whether a relationship exists. Develops an example hypothesis step-by-step, noting the scientific processes and attitudes being addressed. Cautions that a large difference, which suggests a relationship, is not explanation since correlation does not guarantee…
Frictional conditions between alloy AA6060 aluminium and tool steel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wideroee, Fredrik; Welo, Torgeir
The frictional conditions in the new process of screw extrusion of aluminium have been investigated. The contact behaviour between the aluminium alloy and the tool steel in the extruder is vital for understanding the extrusion process. Using a compressive-rotational method for friction measurements, the conditions for unlubricated sticking friction between aluminium alloy AA6060 and tool steel at different combinations of temperatures and pressures have been investigated. In this method, the samples, in the form of disks, are put under hydrostatic pressure while simultaneously being rotated at one end. Pins made from contrast material have been inserted into the samples to measure the deformation introduced. This approach, along with 3D simulations, forms a method for determining the frictional conditions. The paper describes the test method and the results. It was found that the necessary pressure for sticking to occur between the aluminium AA6060 and the different parts of the extruder is heavily influenced by the temperature.
Detection of LiveLock in BPMN Using Process Expression
NASA Astrophysics Data System (ADS)
Tantitharanukul, Nasi; Jumpamule, Watcharee
Although the Business Process Modeling Notation (BPMN) is a popular tool for modeling business processes at a conceptual level, the resulting diagram may contain structural problems. One such structural problem is livelock: one token proceeds to the end event while another token remains in the process with no progression. In this paper, we introduce an expression-like method to detect livelock in a BPMN diagram. Our approach utilizes the declarative power of expressions to determine all of the possible process chains and to indicate whether livelock exists. As a result, we have shown that our method can detect livelock where it is present.
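The paper's expression-based detection is not reproduced here, but a simplified graph-reachability sketch in Python conveys the underlying idea: a node a token can reach from the start event, but from which the end event is unreachable, is a livelock candidate. The example diagram is hypothetical.

```python
edges = {  # hypothetical BPMN flow as an adjacency list
    "start": ["split"],
    "split": ["taskA", "taskB"],
    "taskA": ["end"],
    "taskB": ["taskC"],  # taskB <-> taskC cycle with no path to "end"
    "taskC": ["taskB"],
}

def reachable(graph, source):
    """All nodes a token starting at `source` can reach."""
    seen, stack = set(), [source]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph.get(node, []))
    return seen

# Nodes a token can occupy but from which the end event is unreachable
# are livelock (or deadlock) candidates.
livelocked = sorted(n for n in reachable(edges, "start")
                    if "end" not in reachable(edges, n))
print(livelocked)  # ['taskB', 'taskC']
```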
Naidu, Venkata Ramana; Deshpande, Rucha S; Syed, Moinuddin R; Wakte, Pravin S
2018-07-01
A direct imaging system (Eyecon™) was used as a Process Analytical Technology (PAT) tool to monitor a fluid bed coating process. Eyecon™ generated real-time onscreen images and particle size and shape information for two identically manufactured laboratory-scale batches. Eyecon™ has an accuracy of ±1 μm when measuring the particle size increase of particles in the size range of 50-3000 μm. Eyecon™ captured data every 2 s during the entire process. The moving average of the D90 particle size values recorded by Eyecon™ was calculated every 30 min to determine the radial coating thickness of the coated particles. After the completion of the coating process, the radial coating thickness was found to be 11.3 and 9.11 μm, with standard deviations of ±0.68 and ±1.8 μm for Batch 1 and Batch 2, respectively. The coating thickness was also correlated with percent weight build-up by gel permeation chromatography (GPC) and dissolution. GPC indicated weight build-up of 10.6% and 9.27% for Batch 1 and Batch 2, respectively. In conclusion, a weight build-up of 10% can be correlated with a 10 ± 2 μm increase in the coating thickness of pellets, indicating the potential applicability of real-time imaging as an endpoint determination tool for fluid bed coating processes.
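A back-of-the-envelope Python sketch of the thickness estimate described above, assuming hypothetical smoothed D90 values: the radial coating thickness is half the growth in particle diameter over the run.

```python
# Radial coating thickness from smoothed D90 growth; the data are hypothetical.
import numpy as np

# One 30-min moving-average D90 value per interval, in μm.
d90 = np.array([412.0, 415.1, 418.3, 421.0, 424.6, 428.2, 431.9, 434.5])

# Diameter growth divided by two gives the radial thickness increase.
radial_thickness = (d90[-1] - d90[0]) / 2.0
print(f"estimated radial coating thickness: {radial_thickness:.1f} μm")  # 11.3
```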
NASA Astrophysics Data System (ADS)
Grigoriev, S. N.; Bobrovskij, N. M.; Melnikov, P. A.; Bobrovskij, I. N.
2017-05-01
The modern development of machining technologies is aimed at the transition to environmentally safe, “green” technologies. The concept of “green technology” includes a set of signs of knowledge intended for practical use (“technology”). One of the ways to improve the quality of production is the use of surface plastic deformation (SPD) processing methods. The advantage of SPD is its capability to combine the effects of finishing and strengthening treatment. SPD processing can replace operations such as fine turning, grinding, or polishing. SPD is a forceful contact impact of an indenter on the workpiece surface under relative motion. It is difficult to implement the core SPD technologies (burnishing, roller burnishing, etc.) while maintaining their core technological advantages without the use of lubricating and cooling technology (metalworking fluids, MWF). The “green” SPD technology developed by the authors for dry processing does not have such shortcomings. When processing with SPD without MWF, the requirements for tool durability are most significant, especially in the conditions of mass production. It is important to determine the durability period of the tool at the design stage of the technological process in order to prevent wastage. This paper presents the results of durability research on natural and synthetic diamonds (polycrystalline diamond, ASPK), as well as the precision of polycrystalline superabrasive tools made of dense boron nitride (DBN), during SPD processing without application of MWF.
Screening for sepsis in general hospitalized patients: a systematic review.
Alberto, L; Marshall, A P; Walker, R; Aitken, L M
2017-08-01
Sepsis is a condition widely observed outside critical care areas. To examine the application of sepsis screening tools for early recognition of sepsis in general hospitalized patients to: (i) identify the accuracy of these tools; (ii) determine the outcomes associated with their implementation; and (iii) describe the implementation process. A systematic review method was used. PubMed, CINAHL, Cochrane, Scopus, Web of Science, and Embase databases were systematically searched for primary articles, published from January 1990 to June 2016, that investigated screening tools or alert mechanisms for early identification of sepsis in adult general hospitalized patients. The review protocol was registered with PROSPERO (CRD42016042261). More than 8000 citations were screened for eligibility after duplicates had been removed. Six articles met the inclusion criteria, testing two types of sepsis screening tools. Electronic tools can capture and recognize abnormal variables and activate an alert in real time. However, the accuracy of these tools was inconsistent across studies, with only one demonstrating high specificity and sensitivity. Paper-based, nurse-led screening tools appear to be more sensitive in the identification of septic patients but were only studied in small samples and particular populations. Process-of-care measures appear to be enhanced; however, demonstrating improved outcomes is more challenging. Implementation details are rarely reported. Heterogeneity of studies prevented meta-analysis. Clinicians, researchers and health decision-makers should consider these findings and limitations when implementing screening tools, research or policy on sepsis recognition in general hospitalized patients. Copyright © 2017 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
Making business decisions using trend information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prevette, S.S., Westinghouse Hanford, Richland, WA
1997-11-24
Performance Measures, and the trend information that results from their analyses, can help managers in their decision making process. The business decisions to be discussed are: assignment of limited Resources, Funding, Budget; Contractor Rewards/Incentives; where to focus Process Improvement and Reengineering efforts; when to ask “What Happened?!”; and determining if a previous decision was effectively implemented. Trending can provide an input for rational Business Decisions. The key element is determination of whether or not a significant trend exists - segregating Common Cause from Special Cause. The Control Chart is the tool for accomplishing trending and determining if you are meeting your Business Objectives. Eliminate Numerical Targets; the goal is Significant Improvement. Profound Knowledge requires integrating data results with gut feeling.
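As a minimal sketch of the control-chart approach advocated above, the following Python fragment computes individuals-chart limits from the moving range and flags special-cause points (the data are hypothetical):

```python
# Individuals control chart: 3-sigma limits from the average moving range.
import numpy as np

x = np.array([12.1, 11.8, 12.4, 12.0, 11.9, 12.2, 14.9, 12.1, 11.7, 12.3])
mr = np.abs(np.diff(x))              # moving ranges between consecutive points
center = x.mean()
sigma_est = mr.mean() / 1.128        # d2 constant for subgroups of size 2
ucl, lcl = center + 3 * sigma_est, center - 3 * sigma_est

for i, v in enumerate(x):
    if v > ucl or v < lcl:
        print(f"point {i}: {v} is special cause (limits {lcl:.2f}..{ucl:.2f})")
```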
A quality-based cost model for new electronic systems and products
NASA Astrophysics Data System (ADS)
Shina, Sammy G.; Saigal, Anil
1998-04-01
This article outlines a method for developing a quality-based cost model for the design of new electronic systems and products. The model incorporates a methodology for determining a cost-effective design margin allocation for electronic products and systems and its impact on manufacturing quality and cost. A spreadsheet-based cost estimating tool was developed to help implement this methodology in order for the system design engineers to quickly estimate the effect of design decisions and tradeoffs on the quality and cost of new products. The tool was developed with automatic spreadsheet connectivity to current process capability and with provisions to consider the impact of capital equipment and tooling purchases to reduce the product cost.
Drilling of Hybrid Titanium Composite Laminate (HTCL) with Electrical Discharge Machining.
Ramulu, M; Spaulding, Mathew
2016-09-01
An experimental investigation was conducted to determine the applicability of die sinker electrical discharge machining (EDM) to a hybrid titanium thermoplastic composite laminate material. Holes were drilled using a die sinker EDM. The effects of peak current, pulse time, and percent on-time on machinability of the hybrid titanium composite material were evaluated in terms of material removal rate (MRR), tool wear rate, and cut quality. Experimental models relating each process response to the input parameters were developed, and optimum operating conditions (a short cutting time achieving the highest workpiece MRR with very little tool wear) were determined to occur at a peak current of 8.60 A, a percent on-time of 36.12%, and a pulse time of 258 microseconds. After observing the data acquired from experimentation, it was determined that while use of EDM is possible, it is not fast enough at the desired quality for industrial application.
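The study's fitted experimental models are not reproduced here; purely as a hedged sketch of the general approach, the fragment below fits a linear response model for MRR by ordinary least squares on invented data and evaluates it near the reported optimum:

```python
# Empirical response-model fit; all data values are hypothetical.
import numpy as np

# Columns: peak current (A), on-time (%), pulse time (microseconds).
X = np.array([[4.0, 20.0, 100.0],
              [4.0, 40.0, 300.0],
              [8.0, 20.0, 300.0],
              [8.0, 40.0, 100.0],
              [6.0, 30.0, 200.0]])
mrr = np.array([0.8, 1.4, 2.1, 2.6, 1.7])  # mm^3/min, hypothetical

A = np.column_stack([np.ones(len(X)), X])        # add intercept column
coef, *_ = np.linalg.lstsq(A, mrr, rcond=None)   # [b0, b_current, b_on, b_pulse]

def predict(current, on_time, pulse):
    return coef @ np.array([1.0, current, on_time, pulse])

print(predict(8.60, 36.12, 258.0))  # evaluate near the reported optimum
```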
Monleón, Daniel; Colson, Kimberly; Moseley, Hunter N B; Anklin, Clemens; Oswald, Robert; Szyperski, Thomas; Montelione, Gaetano T
2002-01-01
Rapid data collection, spectral referencing, processing by time domain deconvolution, peak picking and editing, and assignment of NMR spectra are necessary components of any efficient integrated system for protein NMR structure analysis. We have developed a set of software tools designated AutoProc, AutoPeak, and AutoAssign, which function together with the data processing and peak-picking programs NMRPipe and Sparky, to provide an integrated software system for rapid analysis of protein backbone resonance assignments. In this paper we demonstrate that these tools, together with high-sensitivity triple resonance NMR cryoprobes for data collection and a Linux-based computer cluster architecture, can be combined to provide nearly complete backbone resonance assignments and secondary structures (based on chemical shift data) for a 59-residue protein in less than 30 hours of data collection and processing time. In this optimum case of a small protein providing excellent spectra, extensive backbone resonance assignments could also be obtained using less than 6 hours of data collection and processing time. These results demonstrate the feasibility of high throughput triple resonance NMR for determining resonance assignments and secondary structures of small proteins, and the potential for applying NMR in large scale structural proteomics projects.
NASA Astrophysics Data System (ADS)
Baird, M. E.; Walker, S. J.; Wallace, B. B.; Webster, I. T.; Parslow, J. S.
2003-03-01
A simple model of estuarine eutrophication is built on biomechanical (or mechanistic) descriptions of a number of the key ecological processes in estuaries. Mechanistically described processes include the nutrient uptake and light capture of planktonic and benthic autotrophs, and the encounter rates of planktonic predators and prey. Other more complex processes, such as sediment biogeochemistry, detrital processes and phosphate dynamics, are modelled using empirical descriptions from the Port Phillip Bay Environmental Study (PPBES) ecological model. A comparison is made between the mechanistically determined rates of ecological processes and the analogous empirically determined rates in the PPBES ecological model. The rates generally agree, with a few significant exceptions. Model simulations were run at a range of estuarine depths and nutrient loads, with outputs presented as the annually averaged biomass of autotrophs. The simulations followed a simple conceptual model of eutrophication, suggesting a simple biomechanical understanding of estuarine processes can provide a predictive tool for ecological processes in a wide range of estuarine ecosystems.
The COLA Collision Avoidance Method
NASA Astrophysics Data System (ADS)
Assmann, K.; Berger, J.; Grothkopp, S.
2009-03-01
In the following we present a collision avoidance method named COLA. The method has been designed to predict collisions for Earth-orbiting spacecraft on any orbits, including orbit changes, with other space-borne objects. The point in time of a collision and the collision probability are determined. To guarantee effective processing, the COLA method uses a modular design and is composed of several components which are either developed within this work or deduced from existing algorithms: a filtering module, the close approach determination, the collision detection and the collision probability calculation. A software tool which implements the COLA method has been verified using various test cases built from sample missions. This software has been implemented in the C++ programming language and serves as a universal collision detection tool at LSE Space Engineering & Operations AG.
NASA Technical Reports Server (NTRS)
Moser, D. E.; Suggs, R. M.; Kupferschmidt, L.; Feldman, J.
2015-01-01
A bright impact flash detected by the NASA Lunar Impact Monitoring Program in March 2013 brought into focus the importance of determining the impact flash location. A process for locating the impact flash, and presumably its associated crater, was developed using commercially available software tools. The process was successfully applied to the March 2013 impact flash and put into production on an additional 300 impact flashes. The goal today: provide a description of the geolocation technique developed.
Gershengorn, Hayley B; Kocher, Robert; Factor, Phillip
2014-03-01
The success of quality-improvement projects relies heavily on both project design and the metrics chosen to assess change. In Part II of this three-part American Thoracic Society Seminars series, we begin by describing methods for determining which data to collect, tools for data presentation, and strategies for data dissemination. As Avedis Donabedian detailed a half century ago, defining metrics in healthcare can be challenging; algorithmic determination of the best type of metric (outcome, process, or structure) can help intensive care unit (ICU) managers begin this process. Choosing appropriate graphical data displays (e.g., run charts) can prompt discussions about and promote quality improvement. Similarly, dashboards/scorecards are useful in presenting performance improvement data either publicly or privately in a visually appealing manner. To have compelling data to show, ICU managers must plan quality-improvement projects well. The second portion of this review details four quality-improvement tools: checklists, Six Sigma methodology, lean thinking, and Kaizen. Checklists have become commonplace in many ICUs to improve care quality; thinking about how to maximize their effectiveness is now of prime importance. Six Sigma methodology, lean thinking, and Kaizen are techniques that use multidisciplinary teams to organize thinking about process improvement, formalize change strategies, actualize initiatives, and measure progress. None originated within healthcare, but each has been used in the hospital environment with success. To conclude this part of the series, we demonstrate how to use these tools through an example of improving the timely administration of antibiotics to patients with sepsis.
Mechanical Properties and Microstructure of High-Strength Steel Controlled by Hot Stamping Process
NASA Astrophysics Data System (ADS)
Ou, Hang; Zhang, Xu; Xu, Junrui; Li, Guangyao; Cui, Junjia
2018-03-01
A novel design and manufacturing method, dubbed "precast," of the cooling system and tools for a hot forming process was proposed in this paper. The integrated structures of the punch and blank holder were determined by analyzing the bending and reverse-bending deformation of the forming parts. The desired crashworthiness performance of an automotive front bumper constructed with this process was obtained by a tailored phase transformation, which generated martensite-bainite in the middle and full martensite transformation in the corner areas. Varying cooling effects in the formed parts caused the highest temperature to be located at the bottom and the lowest at the end of the formed parts. Moreover, the microstructural distributions demonstrated that the bottom possessed a relatively lower content of martensite, while, conversely, the end possessed a higher content. These were precisely the most desired phase distributions for the hot formed parts. For stamping over six process cycles, the temperatures reached a stable status after an initial rapid increase in the first three process cycles. The microstructural results verified the feasibility of the hot forming tools under multiprocess cycles.
Solid State Joining of Magnesium to Steel
NASA Astrophysics Data System (ADS)
Jana, Saumyadeep; Hovanski, Yuri; Pilli, Siva P.; Field, David P.; Yu, Hao; Pan, Tsung-Yu; Santella, M. L.
Friction stir welding and ultrasonic welding techniques were applied to join automotive magnesium alloys to steel sheet. The effect of tooling and process parameters on the post-weld microstructure, texture and mechanical properties was investigated. Static and dynamic loading were utilized to investigate the joint strength of both cast and wrought magnesium alloys including their susceptibility and degradation under corrosive media. The conditions required to produce joint strengths in excess of 75% of the base metal strength were determined, and the effects of surface coatings, tooling and weld parameters on weld properties are presented.
Probabilistic Structural Analysis of the SRB Aft Skirt External Fitting Modification
NASA Technical Reports Server (NTRS)
Townsend, John S.; Peck, J.; Ayala, S.
1999-01-01
NASA has funded several major programs (the PSAM Project is an example) to develop Probabilistic Structural Analysis Methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element design tool, known as NESSUS, is used to determine the reliability of the Space Shuttle Solid Rocket Booster (SRB) aft skirt critical weld. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process.
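NESSUS itself uses specialized probabilistic finite element methods; purely as an illustration of the underlying reliability question, a toy Monte Carlo estimate of weld failure probability might look like this (the distribution parameters are assumptions, not SRB data):

```python
# Toy stress-strength reliability estimate by Monte Carlo sampling.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
stress = rng.normal(300.0, 25.0, n)     # MPa, applied weld stress (assumed)
strength = rng.normal(420.0, 30.0, n)   # MPa, weld strength (assumed)

p_fail = np.mean(stress > strength)     # fraction of sampled failures
print(f"estimated probability of failure: {p_fail:.2e}")
```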
Anvil Forecast Tool in the Advanced Weather Interactive Processing System
NASA Technical Reports Server (NTRS)
Barrett, Joe H., III; Hood, Doris
2009-01-01
Meteorologists from the 45th Weather Squadron (45 WS) and National Weather Service Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) was tasked to create a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) that indicates the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input. The tool creates a graphic depicting the potential location of thunderstorm anvils one, two, and three hours into the future. The locations are based on the average of the upper level observed or forecasted winds. The graphic includes 10 and 20 n mi standoff circles centered at the location of interest, as well as one-, two-, and three-hour arcs in the upwind direction. The arcs extend outward across a 30° sector width based on a previous AMU study that determined thunderstorm anvils move in a direction plus or minus 15° of the upper-level wind direction. The AMU was then tasked to transition the tool to the Advanced Weather Interactive Processing System (AWIPS). SMG later requested the tool be updated to provide more flexibility and quicker access to model data. This presentation describes the work performed by the AMU to transition the tool into AWIPS, as well as the subsequent improvements made to the tool.
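A rough Python sketch of the graphic's geometry, under the stated ±15° sector and hourly-arc rules (the wind values are hypothetical and the constant-transport assumption is a deliberate simplification):

```python
# Anvil threat arcs: radius grows with transport time; sector spans +/- 15 deg.
wind_dir_deg = 250.0   # direction the upper-level wind blows from (assumed)
wind_speed_kt = 40.0   # knots; assumed anvil transport speed

for hours in (1, 2, 3):
    dist_nm = wind_speed_kt * hours      # arc radius in nautical miles
    for offset in (-15.0, 0.0, 15.0):    # sector edges and centerline
        bearing = (wind_dir_deg + offset) % 360.0
        print(f"{hours} h arc: {dist_nm:.0f} n mi at bearing {bearing:.0f} deg")
```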
Modeling the curing process of thick-section autoclave cured composites
NASA Technical Reports Server (NTRS)
Loos, A. C.; Dara, P. H.
1985-01-01
Temperature gradients are significant during cure of large area, thick-section composites. Such temperature gradients result in nonuniformly cured parts with high void contents, poor ply compaction, and variations in the fiber/resin distribution. A model was developed to determine the temperature distribution in thick-section autoclave cured composites. Using the model, along with temperature measurements obtained from the thick-section composites, the effects of various processing parameters on the thermal response of the composites were examined. A one-dimensional heat transfer model was constructed for the composite-tool assembly. The governing differential equations and associated boundary conditions describing one-dimensional unsteady heat conduction in the composite, tool plate, and pressure plate are given. Solution of the thermal model was obtained using an implicit finite difference technique.
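As a compact, hedged sketch of such an implicit finite-difference solution of one-dimensional unsteady heat conduction (placeholder material properties, with fixed-temperature boundaries standing in for the tool and pressure plates):

```python
# Backward-Euler (implicit) finite differences for 1-D unsteady conduction.
import numpy as np

alpha, L, nx = 1e-7, 0.02, 21      # thermal diffusivity (m^2/s), thickness (m)
dt, nt = 1.0, 600                  # time step (s), number of steps
dx = L / (nx - 1)
r = alpha * dt / dx**2

T = np.full(nx, 20.0)              # initial laminate temperature, deg C
T[0] = T[-1] = 180.0               # boundary temperatures (assumed)

# Assemble (I + r*K) for the interior nodes; Dirichlet rows at the ends.
A = np.zeros((nx, nx))
for i in range(1, nx - 1):
    A[i, i - 1], A[i, i], A[i, i + 1] = -r, 1 + 2 * r, -r
A[0, 0] = A[-1, -1] = 1.0

for _ in range(nt):
    b = T.copy()
    b[0], b[-1] = 180.0, 180.0     # enforce boundary values each step
    T = np.linalg.solve(A, b)      # unconditionally stable implicit update

print(f"mid-plane temperature after {nt * dt:.0f} s: {T[nx // 2]:.1f} deg C")
```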
Cutting Zone Temperature Identification During Machining of Nickel Alloy Inconel 718
NASA Astrophysics Data System (ADS)
Czán, Andrej; Daniš, Igor; Holubják, Jozef; Zaušková, Lucia; Czánová, Tatiana; Mikloš, Matej; Martikáň, Pavol
2017-12-01
The quality of a machined surface is affected by the quality of the cutting process. There are many parameters which influence the quality of the cutting process. The cutting temperature is one of the most important parameters that influence the tool life and the quality of machined surfaces. Its identification and determination is a key objective in specialized machining processes such as dry machining of hard-to-machine materials. It is well known that the maximum temperature is obtained on the tool rake face in the vicinity of the cutting edge. A moderate level of cutting edge temperature and a low thermal shock reduce the tool wear phenomena, and a low temperature gradient in the machined sublayer reduces the risk of high tensile residual stresses. The thermocouple method was used to measure the temperature directly in the cutting zone. An original thermocouple was specially developed for measuring the temperature in the cutting zone and in the surface and subsurface layers of the machined surface. This paper deals with identification of the temperature and temperature gradient during dry peripheral milling of Inconel 718. The measurements were used to identify the temperature gradients and to reconstruct the thermal distribution in the cutting zone under various cutting conditions.
ERIC Educational Resources Information Center
Vouchilas, Gus; George, Gretchen
2016-01-01
The Professional Development Portfolio (PDP) in family and consumer sciences nutrition and dietetics programs is a tool that can help students in their transition to professionals. Significant issues in the portfolio development process are: content selection, decision to create paper or online formatting, determination of proper timing to begin…
Cultural Strategies for Teaching HIV/AIDS Prevention to American Indians
ERIC Educational Resources Information Center
McIntosh, Dannette R.
2012-01-01
The purpose of this study was to describe what tools and strategies Native Americans who live in Oklahoma believe are important in learning about HIV/AIDS, to determine if culturally specific information is important in developing prevention programs, and to ascertain learning strategies. Data collection was a two-part process. First, the Cultural…
NASA Technical Reports Server (NTRS)
Miller, Darcy
2000-01-01
Foreign object debris (FOD) is an important concern while processing space flight hardware. FOD can be defined as "The debris that is left in or around flight hardware, where it could cause damage to that flight hardware," (United Space Alliance, 2000). Just one small screw left unintentionally in the wrong place could delay a launch schedule while it is retrieved, increase the cost of processing, or cause a potentially fatal accident. At this time, there is not a single solution to help reduce the number of dropped parts such as screws, bolts, nuts, and washers during installation. Most of the effort is currently focused on training employees and on capturing the parts once they are dropped. Advances in ergonomics and hand tool design suggest that a solution may be possible, in the form of specialty hand tools, which secure the small parts while they are being handled. To assist in the development of these new advances, a test methodology was developed to conduct a usability evaluation of hand tools, while performing tasks with risk of creating FOD. The methodology also includes hardware in the form of a testing board and the small parts that can be installed onto the board during a test. The usability of new hand tools was determined based on efficiency and the number of dropped parts. To validate the methodology, participants were tested while performing a task that is representative of the type of work that may be done when processing space flight hardware. Test participants installed small parts using their hands and two commercially available tools. The participants were from three groups: (1) students, (2) engineers / managers and (3) technicians. The test was conducted to evaluate the differences in performance when using the three installation methods, as well as the difference in performance of the three participant groups.
NASA Astrophysics Data System (ADS)
Mia, Mozammel; Bashir, Mahmood Al; Dhar, Nikhil Ranjan
2016-07-01
Hard turning is gradually replacing the time-consuming conventional turning process, which is typically followed by grinding, by producing surface quality comparable to that of grinding. The hard turned surface roughness depends on the cutting parameters, machining environment and tool insert configuration. In this article the variation of the surface roughness of the produced surfaces with changes in tool insert configuration, use of coolant and different cutting parameters (cutting speed, feed rate) has been investigated. This investigation was performed by machining AISI 1060 steel, hardened to 56 HRC by heat treatment, using coated carbide inserts under two different machining environments. The depth of cut, fluid pressure and material hardness were kept constant. A Design of Experiments (DOE) was performed to determine the number and combinations of the different cutting parameters. A full factorial analysis was performed to examine the effects of the main factors as well as the interaction effects of factors on surface roughness. A statistical analysis of variance (ANOVA) was employed to determine the combined effect of cutting parameters, environment and tool configuration. The results of this analysis reveal that environment has the most significant impact on surface roughness, followed by feed rate and tool configuration, respectively.
Simulation of the Press Hardening Process and Prediction of the Final Mechanical Material Properties
NASA Astrophysics Data System (ADS)
Hochholdinger, Bernd; Hora, Pavel; Grass, Hannes; Lipp, Arnulf
2011-08-01
Press hardening is a well-established production process in the automotive industry today. The current trend in this process technology points towards the manufacturing of parts with tailored properties. Since knowledge of the mechanical properties of a structural part after forming and quenching is essential for evaluating, for example, the crash performance, a virtual assessment of the production process that is as accurate as possible is more necessary than ever. In order to achieve this, the definition of reliable input parameters and boundary conditions for the thermo-mechanically coupled simulation of the process steps is required. One of the most important input parameters, especially regarding the final properties of the quenched material, is the contact heat transfer coefficient (CHTC). The CHTC depends on the effective pressure or the gap distance between part and tool. The CHTC at different contact pressures and gap distances is determined through inverse parameter identification. Furthermore, a simulation strategy for the subsequent steps of the press hardening process as well as adequate modeling approaches for part and tools are discussed. For the prediction of the yield curves of the material after press hardening, a phenomenological model is presented. This model requires knowledge of the microstructure within the part. By post-processing the nodal temperature history with a CCT diagram, the quantitative distribution of the phase fractions martensite, bainite, ferrite and pearlite after press hardening is determined. The model itself is based on a Hockett-Sherby approach, with the Hockett-Sherby parameters defined as functions of the phase fractions and a characteristic cooling rate.
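A hedged Python sketch of the phenomenological idea: a Hockett-Sherby flow curve whose parameters are interpolated from the local phase fractions by a rule of mixtures. The per-phase parameter values below are placeholders, not the paper's identified parameters.

```python
# Hockett-Sherby flow curve mixed over phase fractions (placeholder values).
import math

def hockett_sherby(eps, sigma_i, sigma_sat, n, p):
    """Flow stress (MPa) at plastic strain eps."""
    return sigma_sat - (sigma_sat - sigma_i) * math.exp(-n * eps**p)

# Per-phase parameter sets: (sigma_i, sigma_sat, N, p) -- assumed, not fitted.
params = {"martensite":       (1000.0, 1600.0, 6.0, 0.6),
          "bainite":          (500.0,   900.0, 8.0, 0.7),
          "ferrite_pearlite": (250.0,   550.0, 10.0, 0.8)}
fractions = {"martensite": 0.7, "bainite": 0.25, "ferrite_pearlite": 0.05}

def mixed_flow_stress(eps):
    # Rule-of-mixtures interpolation over the local phase fractions.
    return sum(f * hockett_sherby(eps, *params[ph])
               for ph, f in fractions.items())

print(f"flow stress at eps = 0.05: {mixed_flow_stress(0.05):.0f} MPa")
```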
Brouckaert, Davinia; De Meyer, Laurens; Vanbillemont, Brecht; Van Bockstal, Pieter-Jan; Lammens, Joris; Mortier, Séverine; Corver, Jos; Vervaet, Chris; Nopens, Ingmar; De Beer, Thomas
2018-04-03
Near-infrared chemical imaging (NIR-CI) is an emerging tool for process monitoring because it combines the chemical selectivity of vibrational spectroscopy with spatial information. Whereas traditional near-infrared spectroscopy is an attractive technique for water content determination and solid-state investigation of lyophilized products, chemical imaging opens up possibilities for assessing the homogeneity of these critical quality attributes (CQAs) throughout the entire product. In this contribution, we aim to evaluate NIR-CI as a process analytical technology (PAT) tool for at-line inspection of continuously freeze-dried pharmaceutical unit doses based on spin freezing. The chemical images of freeze-dried mannitol samples were resolved via multivariate curve resolution, allowing us to visualize the distribution of mannitol solid forms throughout the entire cake. Second, a mannitol-sucrose formulation was lyophilized with variable drying times for inducing changes in water content. Analyzing the corresponding chemical images via principal component analysis, vial-to-vial variations as well as within-vial inhomogeneity in water content could be detected. Furthermore, a partial least-squares regression model was constructed for quantifying the water content in each pixel of the chemical images. It was hence concluded that NIR-CI is inherently a most promising PAT tool for continuously monitoring freeze-dried samples. Although some practicalities are still to be solved, this analytical technique could be applied in-line for CQA evaluation and for detecting the drying end point.
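As a minimal sketch of the pixel-wise water-content calibration step, the following uses scikit-learn's PLS regression on synthetic spectra; in practice, each pixel of the chemical image would contribute one spectrum. All data here are simulated.

```python
# PLS calibration of water content from spectra; the data are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n_samples, n_wavelengths = 40, 120
water = rng.uniform(0.5, 5.0, n_samples)       # % water, reference values
basis = rng.normal(size=n_wavelengths)         # synthetic water absorption band
spectra = np.outer(water, basis) + rng.normal(scale=0.1,
                                              size=(n_samples, n_wavelengths))

pls = PLSRegression(n_components=3)
pls.fit(spectra, water)
pred = pls.predict(spectra[:5]).ravel()        # apply per pixel in practice
print(np.round(pred, 2), np.round(water[:5], 2))
```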
Hart, Tae L; Blacker, Susan; Panjwani, Aliza; Torbit, Lindsey; Evans, Michael
2015-03-01
To create informational tools for breast cancer patients with low levels of health literacy. Tools were developed through a three-stage process. (1) Focus groups were conducted with breast cancer survivors and interviews were held with health educators to determine the content, source of information, format and medium of the tools. (2) Based on this feedback, a suite of tools was developed. (3) Focus groups were reconvened and health educators re-interviewed to obtain feedback and determine satisfaction. We developed a suite of five informational tools using low health literacy principles, which focused on learning about breast cancer resources, learning about the members of one's healthcare team, understanding the "journey" or trajectory of care beginning at diagnosis, hearing from other breast cancer patients about their own journeys, and becoming informed about what to expect pre- and post-surgery for breast cancer. The final products were rated highly by breast cancer survivors. The developed materials, designed for patients who read below an 8th grade level, reflect the informational needs reported by breast cancer patients. Healthcare providers must consider utilizing design principles and theories of adult learning appropriate for those with low health literacy. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Automatic Coding of Dialogue Acts in Collaboration Protocols
ERIC Educational Resources Information Center
Erkens, Gijsbert; Janssen, Jeroen
2008-01-01
Although protocol analysis can be an important tool for researchers to investigate the process of collaboration and communication, the use of this method of analysis can be time consuming. Hence, an automatic coding procedure for coding dialogue acts was developed. This procedure helps to determine the communicative function of messages in online…
Machining of AISI D2 Tool Steel with Multiple Hole Electrodes by EDM Process
NASA Astrophysics Data System (ADS)
Prasad Prathipati, R.; Devuri, Venkateswarlu; Cheepu, Muralimohan; Gudimetla, Kondaiah; Uzwal Kiran, R.
2018-03-01
In recent years, with advancing technology, the demand for machining of newly developed materials is increasing. Conventional machining processes are not adequate to meet the accuracy requirements of machining these materials. Among non-conventional machining processes, electrical discharge machining is one of the most efficient and is widely used for machining high-accuracy products in various industries. The optimum selection of process parameters is very important in machining processes such as electrical discharge machining, as they determine the surface quality and dimensional precision of the obtained parts, even though the time consumption is higher for machining of large-dimension features. In this work, D2 high carbon and chromium tool steel has been machined using electrical discharge machining with the multiple hole electrode technique. D2 steel has several applications such as forming dies, extrusion dies and thread rolling. However, the machining of this tool steel is very hard because of its hard alloying elements of V, Cr and Mo, which enhance its strength and wear properties. Nevertheless, machining is possible using the electrical discharge machining process, and the present study implemented a new technique to reduce the machining time using a multiple hole copper electrode. In this technique, while machining with the multiple hole electrode, fin-like projections are obtained, which can be removed easily by chipping. The finishing is then done using a solid electrode. The machining time is reduced by around 50% when using the multiple hole electrode technique for electrical discharge machining.
Muirhead, David; Aoun, Patricia; Powell, Michael; Juncker, Flemming; Mollerup, Jens
2010-08-01
The need for higher efficiency, maximum quality, and faster turnaround time is a continuous focus for anatomic pathology laboratories and drives changes in work scheduling, instrumentation, and management control systems. To determine the costs of generating routine, special, and immunohistochemical microscopic slides in a large, academic anatomic pathology laboratory using a top-down approach. The Pathology Economic Model Tool was used to analyze workflow processes at The Nebraska Medical Center's anatomic pathology laboratory. Data from the analysis were used to generate complete cost estimates, which included not only materials, consumables, and instrumentation but also specific labor and overhead components for each of the laboratory's subareas. The cost data generated by the Pathology Economic Model Tool were compared with the cost estimates generated using relative value units. Despite the use of automated systems for different processes, the workflow in the laboratory was found to be relatively labor intensive. The effect of labor and overhead on per-slide costs was significantly underestimated by traditional relative-value unit calculations when compared with the Pathology Economic Model Tool. Specific workflow defects with significant contributions to the cost per slide were identified. The cost of providing routine, special, and immunohistochemical slides may be significantly underestimated by traditional methods that rely on relative value units. Furthermore, a comprehensive analysis may identify specific workflow processes requiring improvement.
NASA Astrophysics Data System (ADS)
Ham, Boo-Hyun; Kim, Il-Hwan; Park, Sung-Sik; Yeo, Sun-Young; Kim, Sang-Jin; Park, Dong-Woon; Park, Joon-Soo; Ryu, Chang-Hoon; Son, Bo-Kyeong; Hwang, Kyung-Bae; Shin, Jae-Min; Shin, Jangho; Park, Ki-Yeop; Park, Sean; Liu, Lei; Tien, Ming-Chun; Nachtwein, Angelique; Jochemsen, Marinus; Yan, Philip; Hu, Vincent; Jones, Christopher
2017-03-01
As critical dimensions for advanced two dimensional (2D) DUV patterning continue to shrink, the exact process window becomes increasingly difficult to determine. The defect size criteria shrink with the patterning critical dimensions and are well below the resolution of current optical inspection tools. As a result, it is more challenging for traditional bright field inspection tools to accurately discover the hotspots that define the process window. In this study, we use a novel computational inspection method to identify the depth-of-focus limiting features of a 10 nm node mask with 2D metal structures (single exposure) and compare the results to those obtained with a traditional process window qualification (PWQ) method based on utilizing a focus-modulated wafer and bright field inspection (BFI) to detect hotspot defects. The method is extended to litho-etch litho-etch (LELE) on a different test vehicle to show that overlay-related bridging hotspots can also be identified.
Identifying sediment sources in the sediment TMDL process
Gellis, Allen C.; Fitzpatrick, Faith A.; Schubauer-Berigan, Joseph P.; Landy, R.B.; Gorman Sanisaca, Lillian E.
2015-01-01
Sediment is an important pollutant contributing to aquatic-habitat degradation in many waterways of the United States. This paper discusses the application of sediment budgets in conjunction with sediment fingerprinting as tools to determine the sources of sediment in impaired waterways. These approaches complement monitoring, assessment, and modeling of sediment erosion, transport, and storage in watersheds. Combining the sediment fingerprinting and sediment budget approaches can help determine specific adaptive management plans and techniques applied to targeting hot spots or areas of high erosion.
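A simplified sediment-fingerprinting unmixing sketch in Python: estimate the source proportions that best reproduce the target sediment's tracer signature, constrained to be non-negative and (via a heavily weighted augmented row) to sum to one. Tracer values are hypothetical, and this least-squares formulation is one common choice rather than the authors' specific method.

```python
# Constrained unmixing of sediment sources from tracer signatures.
import numpy as np
from scipy.optimize import lsq_linear

# Rows: tracers; columns: sources (cropland, banks, forest) -- hypothetical.
sources = np.array([[12.0,  4.0,  2.0],
                    [ 0.8,  2.5,  0.3],
                    [30.0, 10.0, 55.0]])
target = np.array([7.1, 1.6, 24.0])   # tracer signature of target sediment

# Augment with a heavily weighted sum-to-one constraint on the proportions.
A = np.vstack([sources, 1e3 * np.ones(3)])
b = np.append(target, 1e3)

res = lsq_linear(A, b, bounds=(0.0, 1.0))
print(np.round(res.x, 3))             # estimated source proportions
```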
AutoFACT: An Automatic Functional Annotation and Classification Tool
Koski, Liisa B; Gray, Michael W; Lang, B Franz; Burger, Gertraud
2005-01-01
Background: Assignment of function to new molecular sequence data is an essential step in genomics projects. The usual process involves similarity searches of a given sequence against one or more databases, an arduous process for large datasets. Results: We present AutoFACT, a fully automated and customizable annotation tool that assigns biologically informative functions to a sequence. Key features of this tool are that it (1) analyzes nucleotide and protein sequence data; (2) determines the most informative functional description by combining multiple BLAST reports from several user-selected databases; (3) assigns putative metabolic pathways, functional classes, enzyme classes, GeneOntology terms and locus names; and (4) generates output in HTML, text and GFF formats for the user's convenience. We have compared AutoFACT to four well-established annotation pipelines. The error rate of functional annotation is estimated to be only 1–2%. Comparison of AutoFACT to the traditional top-BLAST-hit annotation method shows that our procedure increases the number of functionally informative annotations by approximately 50%. Conclusion: AutoFACT will serve as a useful annotation tool for smaller sequencing groups lacking dedicated bioinformatics staff. It is implemented in PERL and runs on LINUX/UNIX platforms. AutoFACT is available at . PMID:15960857
French, William R; Zimmerman, Lisa J; Schilling, Birgit; Gibson, Bradford W; Miller, Christine A; Townsend, R Reid; Sherrod, Stacy D; Goodwin, Cody R; McLean, John A; Tabb, David L
2015-02-06
We report the implementation of high-quality signal processing algorithms into ProteoWizard, an efficient, open-source software package designed for analyzing proteomics tandem mass spectrometry data. Specifically, a new wavelet-based peak-picker (CantWaiT) and a precursor charge determination algorithm (Turbocharger) have been implemented. These additions into ProteoWizard provide universal tools that are independent of vendor platform for tandem mass spectrometry analyses and have particular utility for intralaboratory studies requiring the advantages of different platforms convergent on a particular workflow or for interlaboratory investigations spanning multiple platforms. We compared results from these tools to those obtained using vendor and commercial software, finding that in all cases our algorithms resulted in a comparable number of identified peptides for simple and complex samples measured on Waters, Agilent, and AB SCIEX quadrupole time-of-flight and Thermo Q-Exactive mass spectrometers. The mass accuracy of matched precursor ions also compared favorably with vendor and commercial tools. Additionally, typical analysis runtimes (∼1-100 ms per MS/MS spectrum) were short enough to enable the practical use of these high-quality signal processing tools for large clinical and research data sets.
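As a toy illustration of precursor charge determination from isotope spacing (the general idea behind a tool like Turbocharger, not its actual algorithm): in m/z units, adjacent isotope peaks of a charge-z ion are roughly 1.003/z apart.

```python
# Pick the charge whose predicted isotope spacing best matches the peaks.
NEUTRON_SPACING = 1.00335  # Da, approximate average isotope mass spacing

def charge_from_isotopes(mz_peaks, max_z=6):
    spacing = (mz_peaks[-1] - mz_peaks[0]) / (len(mz_peaks) - 1)  # mean gap
    return min(range(1, max_z + 1),
               key=lambda z: abs(spacing - NEUTRON_SPACING / z))

peaks = [650.832, 651.334, 651.836]  # hypothetical isotope envelope (z = 2)
print(charge_from_isotopes(peaks))   # -> 2
```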
Manufacturing methods of a composite cell case for a Ni-Cd battery
NASA Technical Reports Server (NTRS)
Bauer, J. L.; Bogner, R. S.; Lowe, E. P.; Orlowski, E.
1979-01-01
Graphite epoxy material for a nickel cadmium battery cell case has been evaluated and determined to perform in the simulated environment of the battery. The basic manufacturing method requires refinement to demonstrate production feasibility. The various facets of production scale-up, i.e., process and tooling development together with material and process control, have been integrated into a comprehensive manufacturing process that assures production reproducibility and product uniformity. Test results substantiate that a battery cell case produced from graphite epoxy pre-impregnated material utilizing internal pressure bag fabrication method is feasible.
Web-based interactive data processing: application to stable isotope metrology.
Verkouteren, R M; Lee, J N
2001-08-01
To address a fundamental need in stable isotope metrology, the National Institute of Standards and Technology (NIST) has established a web-based interactive data-processing system accessible through a common gateway interface (CGI) program on the internet site http://www.nist.gov/widps-co2. This is the first application of a web-based tool that improves the measurement traceability afforded by a series of NIST standard materials. Specifically, this tool promotes the proper usage of isotope reference materials (RMs) and improves the quality of reported data from extensive measurement networks. Through the International Atomic Energy Agency (IAEA), we have defined standard procedures for stable isotope measurement and data-processing, and have determined and applied consistent reference values for selected NIST and IAEA isotope RMs. Measurement data of samples and RMs are entered into specified fields on the web-based form. These data are submitted through the CGI program on a NIST Web server, where appropriate calculations are performed and results returned to the client. Several international laboratories have independently verified the accuracy of the procedures and algorithm for measurements of naturally occurring carbon-13 and oxygen-18 abundances and slightly enriched compositions up to approximately 150% relative to natural abundances. To conserve the use of the NIST RMs, users may determine value assignments for a secondary standard to be used in routine analysis. Users may also wish to validate proprietary algorithms embedded in their laboratory instrumentation, or specify the values of fundamental variables that are usually fixed in reduction algorithms to see the effect on the calculations. The results returned from the web-based tool are limited in quality only by the measurements themselves, and further value may be realized through the normalization function. When combined with stringent measurement protocols, two- to threefold improvements have been realized in the reproducibility of carbon-13 and oxygen-18 determinations across laboratories.
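The core arithmetic behind such a data-processing service can be sketched in a few lines; the reference-material values below are placeholders, not NIST assignments:

```python
# Delta-value arithmetic and two-point normalization (placeholder RM values).
def delta_per_mil(r_sample, r_standard):
    """delta = (Rsample/Rstandard - 1) * 1000, in per mil."""
    return (r_sample / r_standard - 1.0) * 1000.0

def normalize_two_point(d_meas, rm1_meas, rm1_true, rm2_meas, rm2_true):
    """Linearly rescale a measured delta using two reference materials."""
    slope = (rm2_true - rm1_true) / (rm2_meas - rm1_meas)
    return rm1_true + slope * (d_meas - rm1_meas)

print(round(delta_per_mil(0.0111421, 0.0111802), 3))            # ~ -3.408
print(round(normalize_two_point(-9.8, -10.1, -10.45, 1.9, 1.95), 3))
```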
NASA Astrophysics Data System (ADS)
Liu, Ronghua; Sun, Qiaofeng; Hu, Tian; Li, Lian; Nie, Lei; Wang, Jiayue; Zhou, Wanhui; Zang, Hengchang
2018-03-01
As a powerful process analytical technology (PAT) tool, near infrared (NIR) spectroscopy has been widely used in real-time monitoring. In this study, NIR spectroscopy was applied to monitor multiple parameters of the traditional Chinese medicine (TCM) Shenzhiling oral liquid during the concentration process to guarantee the quality of the products. Five lab-scale batches were employed to construct quantitative models to determine five chemical ingredients and a physical property (sample density) during the concentration process. Paeoniflorin, albiflorin, liquiritin and sample density were modeled by partial least squares regression (PLSR), while the contents of glycyrrhizic acid and cinnamic acid were modeled by support vector machine regression (SVMR). Standard normal variate (SNV) and/or Savitzky-Golay (SG) smoothing with derivative methods were adopted for spectral pretreatment. Variable selection methods including correlation coefficient (CC), competitive adaptive reweighted sampling (CARS) and interval partial least squares regression (iPLS) were performed to optimize the models. The results indicated that NIR spectroscopy is an effective tool for successfully monitoring the concentration process of Shenzhiling oral liquid.
A computer method for schedule processing and quick-time updating.
NASA Technical Reports Server (NTRS)
Mccoy, W. H.
1972-01-01
A schedule analysis program is presented which can be used to process any schedule with continuous flow and with no loops. Although generally thought of as a management tool, it has applicability to such extremes as music composition and computer program efficiency analysis. Other possibilities for its use include the determination of electrical power usage during some operation such as spacecraft checkout, and the determination of impact envelopes for the purpose of scheduling payloads in launch processing. At the core of the described computer method is an algorithm which computes the position of each activity bar on the output waterfall chart. The algorithm is basically a maximal-path computation which gives to each node in the schedule network the maximal path from the initial node to the given node.
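A minimal Python rendering of the described maximal-path computation over an acyclic schedule network (activity names and durations are hypothetical):

```python
# Longest path from the initial node gives each activity's bar position.
from functools import lru_cache

durations = {"A": 3, "B": 5, "C": 2, "D": 4}                # days
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}  # predecessors

@lru_cache(maxsize=None)
def start(node):
    """Maximal path length from the initial node to this activity's start."""
    return max((start(p) + durations[p] for p in preds[node]), default=0)

for n in durations:
    print(f"{n}: starts day {start(n)}, ends day {start(n) + durations[n]}")
```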
Detecting measurement outliers: remeasure efficiently
NASA Astrophysics Data System (ADS)
Ullrich, Albrecht
2010-09-01
Shrinking structures, advanced optical proximity correction (OPC) and complex measurement strategies continually challenge critical dimension (CD) metrology tools and recipe creation processes. One important quality-ensuring task is the control of measurement outlier behavior. Outliers can trigger false-positive alarms for specification violations, impacting cycle time or potentially yield. A constantly high level of outliers not only deteriorates cycle time but also puts unnecessary stress on tool operators, eventually leading to human errors. At the tool level, the sources of outliers are natural variations (e.g., beam current), drifts, contrast conditions, focus determination or pattern recognition issues, etc. Some of these can result from suboptimal or even wrong recipe settings, such as focus position or measurement box size. Such outliers, created by an automatic recipe creation process faced with more complicated structures, would manifest themselves as systematic variation of measurements rather than as the 'pure' variation caused by the tool. I analyzed several statistical methods to detect outliers. These range from classical outlier tests for extrema and robust metrics like the interquartile range (IQR) to methods evaluating the distribution of different populations of measurement sites, like the Cochran test. The latter especially suits the detection of systematic effects. The next level of outlier detection entwines additional information about the mask and the manufacturing process with the measurement results. The methods were reviewed for measured variations assumed to be normally distributed with zero mean, but also for the presence of a statistically significant spatial process signature. I arrive at the conclusion that intelligent outlier detection can greatly influence the efficiency and cycle time of CD metrology. In combination with process information like target, typical platform variation and signature, one can tailor the detection to the needs of the photomask at hand. By monitoring the outlier behavior carefully, weaknesses of the automatic recipe creation process can be spotted.
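The IQR screen mentioned above fits in a few lines of Python; the CD measurements are hypothetical:

```python
# Tukey-style IQR outlier screen on CD measurements (hypothetical data, nm).
import numpy as np

cd = np.array([64.8, 65.1, 64.9, 65.0, 65.2, 64.7, 65.1, 67.9, 65.0])
q1, q3 = np.percentile(cd, [25, 75])
iqr = q3 - q1
low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr

outliers = cd[(cd < low) | (cd > high)]
print(outliers)  # -> [67.9], a candidate for efficient remeasurement
```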
Swartz, Jordan; Koziatek, Christian; Theobald, Jason; Smith, Silas; Iturrate, Eduardo
2017-05-01
Testing for venous thromboembolism (VTE) is associated with cost and risk to patients (e.g. radiation). To assess the appropriateness of imaging utilization at the provider level, it is important to know that provider's diagnostic yield (percentage of tests positive for the diagnostic entity of interest). However, determining diagnostic yield typically requires either time-consuming, manual review of radiology reports or the use of complex and/or proprietary natural language processing software. The objectives of this study were twofold: 1) to develop and implement a simple, user-configurable, and open-source natural language processing tool to classify radiology reports with high accuracy and 2) to use the results of the tool to design a provider-specific VTE imaging dashboard, consisting of both utilization rate and diagnostic yield. Two physicians reviewed a training set of 400 lower extremity ultrasound (UTZ) and computed tomography pulmonary angiogram (CTPA) reports to understand the language used in VTE-positive and VTE-negative reports. The insights from this review informed the arguments to the five modifiable parameters of the NLP tool. A validation set of 2,000 studies was then independently classified by the reviewers and by the tool; the classifications were compared and the performance of the tool was calculated. The tool was highly accurate in classifying the presence and absence of VTE for both the UTZ (sensitivity 95.7%; 95% CI 91.5-99.8, specificity 100%; 95% CI 100-100) and CTPA reports (sensitivity 97.1%; 95% CI 94.3-99.9, specificity 98.6%; 95% CI 97.8-99.4). The diagnostic yield was then calculated at the individual provider level and the imaging dashboard was created. We have created a novel NLP tool designed for users without a background in computer programming, which has been used to classify venous thromboembolism reports with a high degree of accuracy. The tool is open-source and available for download at http://iturrate.com/simpleNLP. Results obtained using this tool can be applied to enhance quality by presenting information about utilization and yield to providers via an imaging dashboard. Copyright © 2017 Elsevier B.V. All rights reserved.
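The authors' configurable tool is available at the URL given in the abstract; purely as a schematic of the keyword-rule idea behind such report classification (the patterns below are illustrative, not the published rule set):

```python
# Rule-based classification of radiology reports for VTE (illustrative rules).
import re

POSITIVE = [r"\bacute (deep venous |pulmonary )?thromb",
            r"\bpositive for (dvt|pe)\b"]
NEGATIVE = [r"\bno evidence of (dvt|pe|thromb)",
            r"\bnegative for (dvt|pe)\b"]

def classify(report: str) -> str:
    text = report.lower()
    if any(re.search(p, text) for p in NEGATIVE):  # negation rules first
        return "VTE-negative"
    if any(re.search(p, text) for p in POSITIVE):
        return "VTE-positive"
    return "indeterminate"

print(classify("IMPRESSION: No evidence of DVT in either lower extremity."))
print(classify("Findings consistent with acute pulmonary thromboembolism."))
```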
A Qualitative Research on Example Generation Capabilities of University Students
ERIC Educational Resources Information Center
Saglam, Yasemin; Dost, Senol
2016-01-01
Examples which are used in exploring a procedure or comprehending/concretizing a mathematical concept are powerful teaching tools. Generating examples other than conventional ones is both a means for research and a pedagogical method. The aim of this study is to determine the transition process between example generation strategies, and the…
Addressing Misconceptions in Geometry through Written Error Analyses
ERIC Educational Resources Information Center
Kembitzky, Kimberle A.
2009-01-01
This study examined the improvement of students' comprehension of geometric concepts through analytical writing about their own misconceptions using a reflective tool called an ERNIe (acronym for ERror aNalysIs). The purpose of this study was to determine whether the ERNIe process could be used to correct geometric misconceptions, as well as how…
Benech, P D; Patatian, A
2014-12-01
There is no doubt that DNA microarray-based technology has contributed to increasing our knowledge of a wide range of processes. However, integrating genes into functional networks, rather than terms describing generic characteristics, remains an important challenge. The highly context-dependent function of a given gene and feedback mechanisms greatly complicate the interpretation of the data. Moreover, it is difficult to determine whether changes in gene expression are the result or the cause of pathologies or physiological events. In both cases, the difficulty lies in the involvement of processes that, at an early stage, can be protective and, later on, deleterious because of their runaway. Each individual cell has its own transcription profile that determines its behaviour and its relationships with its neighbours. This is particularly true when a mechanism such as the cell cycle is concerned. Another issue concerns analyses of samples from different donors. Whereas statistical tools identify common features among groups, they tend to smooth the overall data, and consequently the selected values represent the 'tip of the iceberg'. There is significant overlap in the sets of genes identified by the different studies on skin ageing processes described in the present review. The reason for this overlap is that most of these genes belong to the basic machinery controlling cell growth and arrest. To get a fuller picture of these processes, hard work still has to be done to determine the precise mechanisms conferring the cell-type specificity of ageing. Integrative biology applied to the huge amount of existing microarray data should fill gaps, through the characterization of additional actors accounting for the activation of specific signalling pathways at crossing points. Furthermore, computational tools have to be developed that take into account that expression values among similar groups may not vary 'by chance' but may reflect, along with other subtle changes, specific features of one given donor. Through better stratification, these tools will allow genes to be recovered from the 'bottom of the iceberg'. Identifying these genes should contribute to understanding how skin ages among individuals, thus paving the way for personalized skin care. © 2014 Society of Cosmetic Scientists and the Société Française de Cosmétologie.
McGinty, Meghan D; Burke, Thomas A; Resnick, Beth; Barnett, Daniel J; Smith, Katherine C; Rutkow, Lainie
Evacuation and shelter-in-place decision making for hospitals is complex, and the existing literature contains little information about how these decisions are made in practice. To describe decision-making processes and identify determinants of acute care hospital evacuation and shelter-in-place during Hurricane Sandy. Semistructured interviews were conducted from March 2014 to February 2015 with key informants who had authority and responsibility for evacuation and shelter-in-place decisions for hospitals during Hurricane Sandy in 2012. Interviews were recorded, transcribed, and thematically analyzed. Interviewees included hospital executives and state and local public health, emergency management, and emergency medical service officials from Delaware, Maryland, New Jersey, and New York. Interviewees identified decision processes and determinants of acute care hospital evacuation and shelter-in-place during Hurricane Sandy. We interviewed 42 individuals from 32 organizations. Decision makers reported relying on their instincts rather than employing guides or tools to make evacuation and shelter-in-place decisions during Hurricane Sandy. Risk to patient health from evacuation, prior experience, cost, and the ability to maintain continuity of operations were the most influential factors in decision making. Flooding and utility outages, which were predicted to impact or actually impacted continuity of operations, were the primary determinants of evacuation. Evacuation and shelter-in-place decision making for hospitals can be improved by ensuring hospital emergency plans address flooding and include explicit thresholds that, if exceeded, would trigger evacuation. Comparative risk assessments that inform decision making would be enhanced by improved collection, analysis, and communication of data on morbidity and mortality associated with evacuation versus sheltering-in-place of hospitals. In addition, administrators and public officials can improve their preparedness to make evacuation and shelter-in-place decisions by practicing the use of decision-making tools during training and exercises.
NASA Astrophysics Data System (ADS)
Bandrowski, D.; Lai, Y.; Bradley, N.; Gaeuman, D. A.; Murauskas, J.; Som, N. A.; Martin, A.; Goodman, D.; Alvarez, J.
2014-12-01
In the field of river restoration sciences there is a growing need for analytical modeling tools and quantitative processes to help identify and prioritize project sites. 2D hydraulic models have become more common in recent years, and with the availability of robust data sets and computing technology it is now possible to evaluate large river systems at the reach scale. The Trinity River Restoration Program is now analyzing a 40-mile segment of the Trinity River to determine priority and implementation sequencing for its Phase II rehabilitation projects. A comprehensive approach and quantitative tool, referred to as 2D-Hydrodynamic Based Logic Modeling (2D-HBLM), has recently been developed to analyze this complex river system. This tool utilizes various hydraulic output parameters combined with biological, ecological, and physical metrics at user-defined spatial scales. These metrics and their associated algorithms are the underpinnings of the 2D-HBLM habitat module used to evaluate geomorphic characteristics, riverine processes, and habitat complexity. The habitat metrics are further integrated into a comprehensive Logic Model framework to perform statistical analyses to assess project prioritization. The Logic Model will analyze various potential project sites by evaluating connectivity using principal component methods. The 2D-HBLM tool will help inform management and decision makers by using a quantitative process to optimize desired response variables while balancing important limiting factors in determining the highest-priority locations within the river corridor at which to implement restoration projects. Effective river restoration prioritization starts with well-crafted goals that identify the biological objectives, address underlying causes of habitat change, and recognize that social, economic, and land-use limiting factors may constrain restoration options (Beechie et al. 2008). Applying natural resource management actions, like restoration prioritization, is essential for successful project implementation (Conroy and Peterson, 2013). Evaluating tradeoffs and examining alternatives to improve fish habitat through optimization modeling is not just a trend but rather a scientific strategy that management should embrace and apply in its decision framework.
Relevance of deterministic chaos theory to studies in functioning of dynamical systems
NASA Astrophysics Data System (ADS)
Glagolev, S. N.; Bukhonova, S. M.; Chikina, E. D.
2018-03-01
The paper considers the chaotic behavior of dynamical systems typical of social and economic processes. Approaches to the analysis and evaluation of system development processes are studied from the point of view of controllability and determinacy. Explanations are given for the necessity of applying non-standard mathematical tools, on the basis of fractal theory, to explain the states of dynamical social and economic systems. Features of fractal structures, such as non-regularity, self-similarity, dimensionality and fractionality, are considered.
Unraveling the Processing Parameters in Friction Stir Welding
NASA Technical Reports Server (NTRS)
Schneider, Judy; Nunes, Arthur C., Jr.
2005-01-01
In friction stir welding (FSW), a rotating threaded pin tool is translated along a weld seam, literally stirring the edges of the seam together. To determine optimal processing parameters for producing a defect-free weld, a better understanding of the resulting metal deformation flow path or paths is required. In this study, various wire markers are used to trace the flow paths of the metal. X-ray radiographs record the segmentation and position of the wire. Several variations in the trajectories can be differentiated within the weld zone.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sterner, J.W.; Steele, D.K.; Shirts, M.B.
The Bureau of Mines conducted studies on four makes of Japanese automobiles, three 1981 and one 1982 model years, received from three manufacturers to determine if their materials composition would present problems to the current technology used to process junk automobiles for metal recovery. One of each make of automobile was hand-dismantled to determine the materials composition. In addition, two nearly identical automobiles of each make were shredded at a commercial operation where all metal products and rejects were collected for analysis to determine metal and nonmetal distribution. The average weight of the four automobiles to be dismantled, less batteries, tools, and fluids, was 1,938.3 lb. There were no materials used in the manufacture of the late model Japanese automobiles that should present handling or processing problems to the steelmaking or secondary metal recyclers.
Optimization of a hardware implementation for pulse coupled neural networks for image applications
NASA Astrophysics Data System (ADS)
Gimeno Sarciada, Jesús; Lamela Rivera, Horacio; Warde, Cardinal
2010-04-01
Pulse coupled neural networks (PCNNs) are a very useful tool for image processing and visual applications, since they have the advantage of being invariant to image changes such as rotation, scaling, or certain distortions. Among other characteristics, the PCNN changes a given image input into a temporal representation which can easily be analyzed later for pattern recognition. The structure of a PCNN, though, makes it necessary to determine all of its parameters very carefully in order for it to function optimally, so that the responses to the kinds of inputs to which it will be subjected are clearly discriminated, allowing for easy and fast post-processing yielding useful results. This tweaking of the system is a taxing process. In this paper we analyze and compare two methods for modeling PCNNs. A purely mathematical model is programmed and a similar circuital model is also designed. Both are then used to determine the optimal values of the several parameters of a PCNN: gain, threshold, and the time constants for feed-in, linking, and threshold decay, leading to an optimal design for image recognition. The results are compared for usefulness, accuracy and speed, as well as the performance and time requirements for fast and easy design, thus providing a tool for future ease of management of a PCNN for different tasks.
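As a concrete reference for the mathematical model, a minimal discrete PCNN iteration (standard Eckhorn-style feeding/linking/threshold equations) can be written as follows; the parameter values and kernel are illustrative defaults, not the optimized values sought in the paper.

```python
import numpy as np
from scipy.signal import convolve2d

def pcnn(stimulus, steps=10, beta=0.2, vF=0.01, vL=1.0, vT=20.0,
         aF=0.1, aL=1.0, aT=0.5):
    """Minimal discrete PCNN; parameters are illustrative, not optimized."""
    kernel = np.array([[0.5, 1, 0.5], [1, 0, 1], [0.5, 1, 0.5]])
    F = np.zeros_like(stimulus); L = np.zeros_like(stimulus)
    Y = np.zeros_like(stimulus); T = np.ones_like(stimulus)
    fires = []
    for _ in range(steps):
        W = convolve2d(Y, kernel, mode="same")    # neighbor pulses
        F = np.exp(-aF) * F + vF * W + stimulus   # feeding input
        L = np.exp(-aL) * L + vL * W              # linking input
        U = F * (1 + beta * L)                    # internal activity
        Y = (U > T).astype(float)                 # pulse output
        T = np.exp(-aT) * T + vT * Y              # dynamic threshold
        fires.append(Y.sum())                     # temporal signature
    return fires

img = np.random.rand(32, 32)
print(pcnn(img))  # firing counts per step form the temporal representation
```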
Students' Opinions on the Use of Tablet Computers in Education
ERIC Educational Resources Information Center
Duran, Muharrem; Aytaç, Tufan
2016-01-01
One of the most important tools for the integration of ICT in education, especially with tablet computers, has been employed in Turkey through the FATIH Project. This study aimed to determine students' views on the use of tablet computers in learning and teaching processes. Eighty-four first-year high school students studying at three schools in…
ERIC Educational Resources Information Center
Basham, James D.; Smith, Sean J.; Satter, Allyson L.
2016-01-01
In the process of evaluating online learning products for accessibility, researchers in the Center on Online Learning and Students with Disabilities concluded that most often consultation guides and assessment tools were useful in determining sensory accessibility but did not extend to critical aspects of learning within the Universal Design for…
Process Damping and Cutting Tool Geometry in Machining
NASA Astrophysics Data System (ADS)
Taylor, C. M.; Sims, N. D.; Turner, S.
2011-12-01
Regenerative vibration, or chatter, limits the performance of machining processes. Consequences of chatter include tool wear and poor machined surface finish. Process damping by tool-workpiece contact can reduce chatter effects and improve productivity. Process damping occurs when the flank (also known as the relief face) of the cutting tool makes contact with waves on the workpiece surface, created by chatter motion. Tool edge features can act to increase the damping effect. This paper examines how a tool's edge condition combines with the relief angle to affect process damping. An analytical model of cutting with chatter leads to a two-section curve describing how process damped vibration amplitude changes with surface speed for radiussed tools. The tool edge dominates the process damping effect at the lowest surface speeds, with the flank dominating at higher speeds. A similar curve is then proposed regarding tools with worn edges. Experimental data supports the notion of the two-section curve. A rule of thumb is proposed which could be useful to machine operators, regarding tool wear and process damping. The question is addressed of whether a tool of a given geometry, used for a given application, should be considered sharp, radiussed or worn with regard to process damping.
NASA Astrophysics Data System (ADS)
Dasgupta, S.; Mukherjee, S.
2016-09-01
One of the most significant factors in metal cutting is tool life. In this research work, the effects of machining parameters on tool life under a wet machining environment were studied. Tool life characteristics of a brazed carbide cutting tool machining mild steel were examined, and the machining parameters were optimized based on the Taguchi design of experiments. The experiments were conducted using three factors, spindle speed, feed rate and depth of cut, each having three levels. Nine experiments were performed on a high-speed semi-automatic precision central lathe. ANOVA was used to determine the level of importance of the machining parameters on tool life. The optimum machining parameter combination was obtained by the analysis of the S/N ratio. A mathematical model based on multiple regression analysis was developed to predict the tool life. Taguchi's orthogonal array analysis revealed the optimal combination of parameters at the lower levels of spindle speed, feed rate and depth of cut, which are 550 rpm, 0.2 mm/rev and 0.5 mm respectively. The Main Effects plot reiterated the same. The variation of tool life with different process parameters has been plotted. Feed rate has the most significant effect on tool life, followed by spindle speed and depth of cut.
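Since tool life is to be maximized, the Taguchi analysis uses the larger-the-better signal-to-noise ratio, S/N = -10·log10((1/n)·Σ 1/y²). A small sketch with hypothetical tool-life replicates for one orthogonal-array run:

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi larger-the-better S/N ratio: -10*log10(mean(1/y^2))."""
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(1.0 / y**2))

# Hypothetical tool-life replicates (minutes) for one L9 run:
print(round(sn_larger_is_better([42.0, 45.5, 40.8]), 2))  # -> ~32.6 dB
```

The parameter level whose runs average the highest S/N ratio is selected, which is how the 550 rpm / 0.2 mm/rev / 0.5 mm combination would be identified.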
ON DEVELOPING TOOLS AND METHODS FOR ENVIRONMENTALLY BENIGN PROCESSES
Two types of tools are generally needed for designing processes and products that are cleaner from environmental impact perspectives. The first kind is called process tools. Process tools are based on information obtained from experimental investigations in chemistry, materials…
Amino acid profile as a feasible tool for determination of the authenticity of fruit juices.
Asadpoor, Mostafa; Ansarin, Masoud; Nemati, Mahboob
2014-12-01
Fruit juice is a nutrient-rich food product with a direct connection to public health. The purpose of this research was to determine the amino acid profile of juices and provide a quick and accurate indicator for determining their authenticity. The method of analysis was HPLC with a fluorescence detector and pre-column derivatization with ortho-phthalaldehyde (OPA). Sixty-six samples of fruit juices were analyzed, and fourteen amino acids were identified and quantified in the sampled fruit juices. The fruit samples used for this analysis were apples, oranges, cherry, pineapple, mango, apricot, pomegranate, peach and grapes. The results showed that 32% of the samples tested in this study had a lower concentrate percentage than stated on their labels and/or other possible authenticity problems in the manufacturing process. The following samples showed probable adulteration: four cherry juice samples, two pomegranate juice samples, one mango, three grape, four peach, seven orange, two apple and one apricot juice samples. In general, determining the amount of amino acids and comparing sample amino acid profiles with the standard values appears to be an indicator for quality control. This method can provide regulatory agencies with a tool to help produce a healthier juice. Analytical control of fruit juice composition is becoming an important issue, and HPLC can provide an important and essential tool for more accurate research as well as for routine analysis.
ERIC Educational Resources Information Center
Kalessopoulou, Despina
2017-01-01
Children's museums and exhibitions designed for children are well-thought adult projects; however, children rarely participate in the design process. Visual research methods can provide a remedy for this scarcity of children's agency in determining significant exhibition qualities. The paper will discuss a visual research methodology that empowers…
The Use of Poster Projects as a Motivational and Learning Tool in Managerial Accounting Courses
ERIC Educational Resources Information Center
Altintas, Nergis Nalan; Suer, Ayca Zeynep; Sari, Emre Selcuk; Ulker, Mirac Sema
2014-01-01
Poster projects are an alternative method of motivation, learning, and information dissemination in education. The main purpose of this initial study was to determine the effect of poster projects on the motivational and learning process of managerial accounting students. In addition, the authors aimed to compare the opinions of managerial…
NMR studies of protein-nucleic acid interactions.
Varani, Gabriele; Chen, Yu; Leeper, Thomas C
2004-01-01
Protein-DNA and protein-RNA complexes play key functional roles in every living organism. Therefore, the elucidation of their structure and dynamics is an important goal of structural and molecular biology. Nuclear magnetic resonance (NMR) studies of protein and nucleic acid complexes have common features with studies of protein-protein complexes: the interaction surfaces between the molecules must be carefully delineated, the relative orientation of the two species needs to be accurately and precisely determined, and close intermolecular contacts defined by nuclear Overhauser effects (NOEs) must be obtained. However, differences in NMR properties (e.g., chemical shifts) and biosynthetic pathways for sample production generate important differences. Chemical shift differences between the protein and nucleic acid resonances can aid the NMR structure determination process; however, the relatively limited dispersion of the RNA ribose resonances makes the process of assigning intermolecular NOEs more difficult. The analysis of the resulting structures requires computational tools unique to nucleic acid interactions. This chapter summarizes the most important elements of the structure determination by NMR of protein-nucleic acid complexes and their analysis. The main emphasis is on recent developments (e.g., residual dipolar couplings and new Web-based analysis tools) that have facilitated NMR studies of these complexes and expanded the type of biological problems to which NMR techniques of structural elucidation can now be applied.
Taking Care of the Small Computer: A Guide for Librarians.
ERIC Educational Resources Information Center
Williams, Gene
1986-01-01
Describes how to identify microcomputer problems and determine whether the services of a technician are required by troubleshooting, or using a process of elimination, without needing a technical background or special tools. Prevention methods and the use of diagnostic programs are also explained. (EM)
The hidden KPI: registration accuracy.
Shorrosh, Paul
2011-09-01
Determining the registration accuracy rate is fundamental to improving revenue cycle key performance indicators. A registration quality assurance (QA) process allows errors to be corrected before bills are sent and helps registrars learn from their mistakes. Tools are available to help patient access staff who perform registration QA manually.
Microstructural Evolution during DPRM Process of Semisolid Ledeburitic D2 Tool Steel
Mohammed, M. N.; Omar, M. Z.; Syarif, J.; Sajuri, Z.; Salleh, M. S.; Alhawari, K. S.
2013-01-01
Semisolid metal processing is a relatively new technology that offers several advantages over liquid processing and solid processing because of the unique behaviour and characteristic microstructure of metals in this state. With the aim of finding a minimum process chain for the manufacture of high-quality production at minimal cost for forming, the microstructural evolution of the ledeburitic AISI D2 tool steel in the semisolid state was studied experimentally. The potential of the direct partial remelting (DPRM) process for the production of AISI D2 with a uniform globular microstructure was revealed. The liquid fraction was determined using differential scanning calorimetry. The microstructures of the samples were investigated using an optical microscope and a scanning electron microscope equipped with an energy dispersive spectroscopy analyser, while X-ray phase analysis was performed to identify the phase evolution and the type of carbides. Mechanical characterisation was completed by hardness measurements. The typical microstructure after DPRM consists of metastable austenite which was located particularly in the globular grains (average grain size about 50 μm), while the remaining interspaces were filled by precipitated eutectic carbides on the grain boundaries and lamellar network. PMID:24223510
Wireless Monitoring of the Height of Condensed Water in Steam Pipes
NASA Technical Reports Server (NTRS)
Lee, Hyeong Jae; Bar-Cohen, Yoseph; Lih, Shyh-Shiuh; Badescu, Mircea; Dingizian, Arsham; Takano, Nobuyuki; Blosiu, Julian O.
2014-01-01
A wireless health monitoring system has been developed for determining the height of water condensation in steam pipes, with data acquisition done remotely over a wireless network. The developed system is designed to operate in the harsh environment encountered in manholes and at pipe temperatures of over 200 °C. The test method is ultrasonic pulse-echo, and the hardware includes a pulser, a receiver and a wireless modem for communication. Data acquisition and signal processing software were developed to determine the water height using adaptive signal processing and data communication that can be controlled while the hardware is installed in a manhole. A statistical decision-making tool is being developed based on the field test data to determine the height of the condensed water under high-noise conditions and other environmental factors.
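The pulse-echo principle underlying the measurement is simple: the echo's round-trip time and the speed of sound in the condensate give the water height as h = c·t/2. A minimal sketch (1480 m/s is room-temperature water; a deployed system would correct the sound speed for the actual condensate temperature):

```python
def water_height(echo_time_s, sound_speed_m_s=1480.0):
    """Pulse-echo ranging: the pulse travels down and back, so divide by 2.
    The default sound speed assumes room-temperature water."""
    return sound_speed_m_s * echo_time_s / 2.0

print(water_height(68e-6) * 1000, "mm")  # ~50 mm for a 68-microsecond echo
```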
Thermomechanical conditions and stresses on the friction stir welding tool
NASA Astrophysics Data System (ADS)
Atthipalli, Gowtam
Friction stir welding has been commercially used as a joining process for aluminum and other soft materials. However, the use of this process in joining of hard alloys is still developing, primarily because of the lack of cost-effective, long-lasting tools. Here I have developed numerical models to understand the thermomechanical conditions experienced by the FSW tool and to improve its reusability. A heat transfer and visco-plastic flow model is used to calculate the torque and traverse force on the tool during FSW. The computed values of torque and traverse force are validated using the experimental results for FSW of AA7075, AA2524, AA6061 and Ti-6Al-4V alloys. The computed torque components are used to determine the optimum tool shoulder diameter based on the maximum use of torque and maximum grip of the tool on the plasticized workpiece material. The estimation of the optimum tool shoulder diameter for FSW of AA6061 and AA7075 was verified with experimental results. The computed values of traverse force and torque are used to calculate the maximum shear stress on the tool pin to determine the load-bearing ability of the tool pin. The load-bearing ability calculations are used to explain the failure of an H13 steel tool during welding of AA7075 and of a commercially pure tungsten tool during welding of L80 steel. Artificial neural network (ANN) models are developed to predict the important FSW output parameters as functions of selected input parameters. These ANNs consider tool shoulder radius, pin radius, pin length, welding velocity, tool rotational speed and axial pressure as input parameters. The total torque, sliding torque, sticking torque, peak temperature, traverse force, maximum shear stress and bending stress are considered as the outputs for the ANN models. These output parameters are selected since they define the thermomechanical conditions around the tool during FSW. The developed ANN models are used to understand the effect of various input parameters on the total torque and traverse force during FSW of AA7075 and 1018 mild steel. The ANN models are also used to determine the tool safety factor for a wide range of input parameters. A numerical model is developed to calculate the strain and strain rates along the streamlines during FSW. The strain and strain rate values are calculated for FSW of AA2524. Three simplified models are also developed for quick estimation of output parameters such as the material velocity field, torque and peak temperature. The material velocity fields are computed by adopting an analytical method of calculating velocities for the flow of a non-compressible fluid between two discs, one of which is rotating while the other is stationary. The peak temperature is estimated from a non-dimensional correlation with dimensionless heat input. The dimensionless heat input is computed using known welding parameters and material properties. The torque is computed using an analytical function based on the shear strength of the workpiece material. These simplified models are shown to be able to predict these output parameters successfully.
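As a rough illustration of the ANN modeling step, the sketch below fits a small multilayer-perceptron regressor mapping the six input parameters to torque; the training data here are synthetic stand-ins generated from an arbitrary formula, not the FSW model outputs used in the dissertation.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical inputs: [shoulder radius, pin radius, pin length,
# welding velocity, rotational speed, axial pressure] -> total torque.
X = rng.uniform([5, 2, 3, 1, 200, 10], [15, 6, 8, 10, 1500, 80], size=(200, 6))
y = 0.04 * X[:, 0]**2 * X[:, 5] / X[:, 4]**0.5 + rng.normal(0, 0.1, 200)  # stand-in physics

# Scaling matters because the inputs span very different ranges.
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 16),
                                   max_iter=5000, random_state=0))
model.fit(X, y)
print(model.predict(X[:3]))  # predicted torque for three parameter sets
```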
ERIC Educational Resources Information Center
Meeks, Glenn E.; Fisher, Ricki; Loveless, Warren
Personnel involved in planning or developing schools lack the costing tools that will enable them to determine educational technology costs. This report presents an overview of the technology costing process and the general costs used in estimating educational technology systems on a macro-budget basis, along with simple cost estimates for…
Peng Zhao; Hui-Juan Zhou; Daniel Potter; Yi-Heng Hu; Xiao-Jia Feng; Meng Dang; Li Feng; Saman Zulfiqar; Wen-Zhe Liu; Gui-Fang Zhao; Keith Woeste
2018-01-01
Genomic data are a powerful tool for elucidating the processes involved in the evolution and divergence of species. The speciation and phylogenetic relationships among Chinese Juglans remain unclear. Here, we used results from phylogenomic and population genetic analyses, transcriptomics, Genotyping-By-Sequencing (GBS), and whole chloroplast...
Using the Moon as a Tool for Discovery-Oriented Learning.
ERIC Educational Resources Information Center
Cummins, Robert Hays; Ritger, Scott David; Myers, Christopher Adam
1992-01-01
Students test the hypothesis that the moon revolves east to west around the earth, determine by observation approximately how many degrees the moon revolves per night, and develop a scale model of the earth-sun-moon system in this laboratory exercise. Students are actively involved in the scientific process and are introduced to the importance of…
Mathematical modeling of a radio-frequency path for IEEE 802.11ah based wireless sensor networks
NASA Astrophysics Data System (ADS)
Tyshchenko, Igor; Cherepanov, Alexander; Dmitrii, Vakhnin; Popova, Mariia
2017-09-01
This article discusses the process of creating a mathematical model of a radio-frequency path for IEEE 802.11ah based wireless sensor networks using MATLAB Simulink CAD tools. In addition, it describes the perturbing effects that occur and the determination of the presence of a useful signal in the received mixture.
Capolongo, S; Bellini, E; Nachiero, D; Rebecchi, A; Buffoli, M
2014-01-01
The design of hospital environments is determined by functional requirements and technical regulations, as well as numerous protocols, which define the structure and system characteristics that such environments need to achieve. In order to improve people's well-being and the quality of their experience within public hospitals, design elements (soft qualities) are added to those 'necessary' features. The aim of this research has been to test a new design process and also to create health care spaces with high environmental quality, capable of meeting users' emotional and perceptual needs. Such needs were investigated with the help of qualitative research tools, and the design criteria for one of these soft qualities - colour - were subsequently defined on the basis of the findings. The colour scheme design for the new San Paolo Hospital Emergency Department in Milan was used as a case study. Focus groups were fundamental in defining the project's goals and criteria. The issues raised have led us to believe that the proper procedure is not the mere consultation of the users in order to define the goals: users should rather be involved in the whole design process and become co-agents of the choices that determine the environment's characteristics, so as to meet the quality requirements identified by the users themselves. The case study has shown the possibility of developing a design methodology made up of three steps (or operational tools) in which user groups are involved in the choices, leading to environments in which compliance with expectations is already implied and verified by means of the process itself. Thus, the method leads to the creation of soft qualities in healthcare.
Overlay Tolerances For VLSI Using Wafer Steppers
NASA Astrophysics Data System (ADS)
Levinson, Harry J.; Rice, Rory
1988-01-01
In order for VLSI circuits to function properly, the masking layers used in the fabrication of those devices must overlay each other to within the manufacturing tolerance incorporated in the circuit design. The capabilities of the alignment tools used in the masking process determine the overlay tolerances to which circuits can be designed. It is therefore of considerable importance that these capabilities be well characterized. Underestimation of the overlay accuracy results in unnecessarily large devices, resulting in poor utilization of wafer area and possible degradation of device performance. Overestimation will result in significant yield loss because of the failure to conform to the tolerances of the design rules. The proper methodology for determining the overlay capabilities of wafer steppers, the most commonly used alignment tool for the production of VLSI circuits, is the subject of this paper. Because cost-effective manufacturing process technology has been the driving force of VLSI, the impact on productivity is a primary consideration in all discussions. Manufacturers of alignment tools advertise the capabilities of their equipment. It is notable that no manufacturer currently characterizes his aligners in a manner consistent with the requirements of producing very large integrated circuits, as will be discussed. This has resulted in the situation in which the evaluation and comparison of the capabilities of alignment tools require the attention of a lithography specialist. Unfortunately, lithographic capabilities must be known by many other people, particularly the circuit designers and the managers responsible for the financial consequences of the high prices of modern alignment tools. All too frequently, the designer or manager is confronted with contradictory data, one set coming from his lithography specialist, and the other coming from a sales representative of an equipment manufacturer. Since the latter generally attempts to make his merchandise appear as attractive as possible, the lithographer is frequently placed in the position of having to explain subtle issues in order to justify his decisions. It is the purpose of this paper to provide that explanation.
McDonald, Catherine M
2008-04-01
According to the 2002 Cystic Fibrosis (CF) Foundation nutrition consensus report, children with CF should grow normally. Cross-sectional data from the foundation's patient registry concluded that a body mass index at or greater than the 50th percentile is associated with better lung function. A consistent, evidence-based screening process can identify those individuals with CF having nutrition risk factors associated with a decrease in pulmonary function, target early intervention, and prevent further decline. A tool for screening nutrition risk is described to identify those children with CF who would benefit from more extensive nutrition intervention. The proposed screening tool is a risk-based classification system with 3 categories: weight gain, height velocity, and body mass index. The CF Foundation recommendations regarding these parameters are incorporated, with risk points assigned when minimum body mass index, weight gain, and/or height gain standards are unmet. An interrater measure of agreement determined a satisfactory level of reliability (kappa = 0.85). Patient records (n = 85) were reviewed to determine nutrition status category (no risk or at risk) of this tool compared with the CF Foundation 2002 Nutrition Consensus, yielding sensitivity and specificity at 84% and 75%, respectively. A second comparison was made with combined, independent nutrition risk factors not included in the screening tool. The sensitivity and specificity of the screening tool compared with the combined risk factors were 86% and 78%, respectively. This tool for screening nutrition risk for CF is reliable and valid, with consistent, reproducible results, free from subject or observer bias.
Panjikar, Santosh; Parthasarathy, Venkataraman; Lamzin, Victor S; Weiss, Manfred S; Tucker, Paul A
2005-04-01
The EMBL-Hamburg Automated Crystal Structure Determination Platform is a system that combines a number of existing macromolecular crystallographic computer programs and several decision-makers into a software pipeline for automated and efficient crystal structure determination. The pipeline can be invoked as soon as X-ray data from derivatized protein crystals have been collected and processed. It is controlled by a web-based graphical user interface for data and parameter input, and for monitoring the progress of structure determination. A large number of possible structure-solution paths are encoded in the system and the optimal path is selected by the decision-makers as the structure solution evolves. The processes have been optimized for speed so that the pipeline can be used effectively for validating the X-ray experiment at a synchrotron beamline.
Michelson, Kelly N; Frader, Joel; Sorce, Lauren; Clayman, Marla L; Persell, Stephen D; Fragen, Patricia; Ciolino, Jody D; Campbell, Laura C; Arenson, Melanie; Aniciete, Danica Y; Brown, Melanie L; Ali, Farah N; White, Douglas
2016-12-01
Stakeholder-developed interventions are needed to support pediatric intensive care unit (PICU) communication and decision-making. Few publications delineate methods and outcomes of stakeholder engagement in research. We describe the process and impact of stakeholder engagement on developing a PICU communication and decision-making support intervention. We also describe the resultant intervention. Stakeholders included parents of PICU patients, healthcare team members (HTMs), and research experts. Through a year-long iterative process, we involved 96 stakeholders in 25 meetings and 26 focus groups or interviews. Stakeholders adapted an adult navigator model by identifying core intervention elements and then determining how to operationalize those core elements in pediatrics. The stakeholder input led to PICU-specific refinements, such as supporting transitions after PICU discharge and including ancillary tools. The resultant intervention includes navigator involvement with parents and HTMs and navigator-guided use of ancillary tools. Subsequent research will test the feasibility and efficacy of our intervention.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schlipf, David; Raach, Steffen; Haizmann, Florian
2015-12-14
This paper presents first steps toward an adaptive lidar data processing technique crucial for lidar-assisted control in wind turbines. The prediction time and the quality of the wind preview from lidar measurements depend on several factors and are not constant. If the data processing is not continually adjusted, the benefit of lidar-assisted control cannot be fully exploited, or can even result in harmful control action. An online analysis of the lidar and turbine data is necessary to continually reassess the prediction time and lidar data quality. In this work, a structured process to develop an analysis tool for the prediction time and a new hardware setup for lidar-assisted control are presented. The tool consists of an online estimation of the rotor effective wind speed from lidar and turbine data and the implementation of an online cross correlation to determine the time shift between both signals. Further, initial results from an ongoing campaign in which this system was employed for providing lidar preview for feed-forward pitch control are presented.
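The cross-correlation step can be sketched as follows: shift the turbine-based estimate against the lidar preview and take the lag with maximal correlation as the prediction time. The signals below are synthetic, and the sample rate and lag search window are assumptions.

```python
import numpy as np

def best_lag(lidar, turbine, max_lag=60):
    """Return the lag (in samples) maximizing the correlation between the
    lidar preview and the turbine-estimated rotor-effective wind speed."""
    lidar = (lidar - lidar.mean()) / lidar.std()
    turbine = (turbine - turbine.mean()) / turbine.std()
    corrs = [np.corrcoef(lidar[:len(lidar) - k], turbine[k:])[0, 1]
             for k in range(max_lag + 1)]
    return int(np.argmax(corrs)), max(corrs)

t = np.arange(600)
wind = 8 + np.sin(t / 30.0)
lidar = wind.copy()             # the lidar sees the gust first...
turbine = np.roll(wind, 25)     # ...the rotor feels it 25 samples later
print(best_lag(lidar, turbine))  # -> (25, ~1.0)
```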
Collette, Fabienne; Van der Linden, Martial; Salmon, Eric
2010-01-01
A decline of cognitive functioning affecting several cognitive domains has frequently been reported in patients with frontotemporal dementia. We were interested in determining whether these deficits can be interpreted as reflecting an impairment of controlled cognitive processes by using an assessment tool specifically developed to explore the distinction between automatic and controlled processes, namely the process dissociation procedure (PDP) developed by Jacoby. The PDP was applied to a word stem completion task to determine the contribution of automatic and controlled processes to episodic memory performance and was administered to a group of 12 patients with the behavioral variant of frontotemporal dementia (bv-FTD) and 20 control subjects (CS). Bv-FTD patients obtained a lower performance than CS for the estimates of controlled processes, but no group difference was observed for the estimates of automatic processes. The between-groups comparison of the estimates of controlled and automatic processes showed a larger contribution of automatic processes to performance in bv-FTD, while a slightly more important contribution of controlled processes was observed in control subjects. These results are clearly indicative of an alteration of controlled memory processes in bv-FTD.
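Jacoby's process dissociation equations combine performance under inclusion and exclusion instructions to estimate the two contributions: controlled C = I − E and automatic A = E/(1 − C). A minimal sketch with hypothetical stem-completion rates:

```python
def pdp_estimates(p_inclusion, p_exclusion):
    """Jacoby's process dissociation procedure:
    controlled C = I - E;  automatic A = E / (1 - C)."""
    c = p_inclusion - p_exclusion
    a = p_exclusion / (1 - c) if c < 1 else float("nan")
    return c, a

# Hypothetical completion rates for one participant:
print(pdp_estimates(p_inclusion=0.62, p_exclusion=0.28))  # -> (0.34, ~0.42)
```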
Antibiogramj: A tool for analysing images from disk diffusion tests.
Alonso, C A; Domínguez, C; Heras, J; Mata, E; Pascual, V; Torres, C; Zarazaga, M
2017-05-01
Disk diffusion testing, known as the antibiogram, is widely applied in microbiology to determine the antimicrobial susceptibility of microorganisms. The measurement of the diameter of the zone of growth inhibition of microorganisms around the antimicrobial disks in the antibiogram is frequently performed manually by specialists using a ruler. This is a time-consuming and error-prone task that might be simplified using automated or semi-automated inhibition zone readers. However, most readers are usually expensive instruments with embedded software that require significant changes in laboratory design and workflow. Based on the workflow employed by specialists to determine the antimicrobial susceptibility of microorganisms, we have designed a software tool that semi-automatises the process from images of disk diffusion tests. Standard computer vision techniques are employed to achieve such automatisation. We present AntibiogramJ, a user-friendly and open-source software tool to semi-automatically determine, measure and categorise inhibition zones in images from disk diffusion tests. AntibiogramJ is implemented in Java and deals with images captured with any device that incorporates a camera, including digital cameras and mobile phones. The fully automatic procedure of AntibiogramJ for measuring inhibition zones achieves an overall agreement of 87% with an expert microbiologist; moreover, AntibiogramJ includes features to easily detect when the automatic reading is not correct and fix it manually to obtain the correct result. AntibiogramJ is a user-friendly, platform-independent, open-source, and free tool that, to the best of our knowledge, is the most complete software tool for antibiogram analysis without requiring any investment in new equipment or changes in the laboratory.
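A measurement pipeline of the kind AntibiogramJ automates can be sketched with standard computer vision steps (threshold, label, measure). The sketch below uses Python and scikit-image rather than the tool's actual Java/ImageJ implementation, and the minimum-size filter is an assumption.

```python
import numpy as np
from skimage import filters, measure, morphology

def zone_diameters_px(image):
    """Segment dark inhibition zones on a lighter bacterial lawn and return
    each zone's equivalent-circle diameter in pixels (calibration to mm
    would be a separate step, e.g. via the known 6 mm disk diameter)."""
    thresh = filters.threshold_otsu(image)
    zones = image < thresh                              # dark = no growth
    zones = morphology.remove_small_objects(zones, min_size=500)
    labels = measure.label(zones)
    return [round(r.equivalent_diameter, 1) for r in measure.regionprops(labels)]

# Synthetic plate: uniform lawn with one circular clear zone of radius 40 px.
yy, xx = np.mgrid[:200, :200]
plate = np.full((200, 200), 0.8)
plate[(yy - 100)**2 + (xx - 100)**2 < 40**2] = 0.2
print(zone_diameters_px(plate))  # -> [~80.0]
```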
Multi-Attribute Consensus Building Tool
ERIC Educational Resources Information Center
Shyyan, Vitaliy; Christensen, Laurene; Thurlow, Martha; Lazarus, Sheryl
2013-01-01
The Multi-Attribute Consensus Building (MACB) method is a quantitative approach for determining a group's opinion about the importance of each item (strategy, decision, recommendation, policy, priority, etc.) on a list (Vanderwood & Erickson, 1994). This process enables a small or large group of participants to generate and discuss a set…
TOC, ATP AND RESPIRATION RATE AS CONTROL PARAMETERS FOR THE ACTIVATED SLUDGE PROCESS
This research was conducted to determine the feasibility of using TOC, ATP and respiration rates as tools for controlling a complete mix activated sludge plant handling a significant amount of industrial waste. Control methodology was centered on using F/M ratio which was determi...
The use of KPIs to determine waste in the production process
NASA Astrophysics Data System (ADS)
Borsos, G.; Iacob, C. C.; Calefariu, G.
2016-11-01
The Lean approach to the forms of waste in production processes (muda) and the VSM (Value Stream Map) method, one of the most effective methods for identifying value-generating activities within industrial companies, are well known in management theory and practice. The concern of specialists for performance measurement, regardless of the purview of their organizations, is equally evident. The literature review has shown that the link between performance indicators and the objectives of companies has been researched in detail. However, the correlation between indicators and the forms of waste that generate deviations from the setpoints is rather practical in nature, and it depends on the talent and managerial skills of those directing production processes. The paper presents the results of an applied study, performed by the authors, which sought to create a system of performance indicators specific to manufacturing activity that would be a useful tool for quantifying losses and for determining ways to reduce them.
[Logistic and production process in a regional blood center: modeling and analysis].
Baesler, Felipe; Martínez, Cristina; Yaksic, Eduardo; Herrera, Claudia
2011-09-01
The blood supply chain is a complex system that comprises different interconnected elements that have to be synchronized correctly to satisfy, in quality and quantity, the final patient requirements. The aims were to determine the blood center's maximum production capacity and to identify the changes necessary for a future expansion of production capacity. This work was carried out in the Blood Center of Concepción, Chile, where operations management tools were applied to model the center and to propose improvement alternatives for the production process. The use of simulation is highlighted, which permitted the replication of the center's behavior and the evaluation of expansion alternatives. It is possible to absorb a 100% increase in blood demand without making major changes or investments in the production process. It was also possible to determine the subsequent steps, in terms of investments in equipment and human resources, for a future expansion of the center's coverage. The techniques used to model the production process of the blood center of Concepción, Chile, allowed us to analyze how it operates, to detect bottlenecks, and to support the decision-making process for a future expansion of its capacity.
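A discrete-event simulation of the kind described can be sketched in a few lines with SimPy; the arrival rate, processing time, and station count below are illustrative, not the Concepción center's actual figures. Doubling STATIONS and rerunning is the sort of expansion experiment such a model supports.

```python
import random
import simpy

# Illustrative parameters only.
ARRIVALS_PER_HOUR = 6
PROCESS_MIN = 45        # minutes to process one donation
STATIONS = 5

def donation(env, bench, processed):
    with bench.request() as req:           # wait for a free station
        yield req
        yield env.timeout(random.expovariate(1.0 / PROCESS_MIN))
        processed.append(env.now)

def source(env, bench, processed):
    while True:                            # Poisson arrivals
        yield env.timeout(random.expovariate(ARRIVALS_PER_HOUR / 60.0))
        env.process(donation(env, bench, processed))

random.seed(1)
env = simpy.Environment()
bench = simpy.Resource(env, capacity=STATIONS)
processed = []
env.process(source(env, bench, processed))
env.run(until=8 * 60)                      # one 8-hour shift, in minutes
print(len(processed), "units processed")
```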
An Automated Tool for Supporting FMEAs of Digital Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yue,M.; Chu, T.-L.; Martinez-Guridi, G.
2008-09-07
Although designs of digital systems can be very different from each other, they typically use many of the same types of generic digital components. Determining the impacts of the failure modes of these generic components on a digital system can be used to support development of a reliability model of the system. A novel approach was proposed for such a purpose by decomposing the system into a level of the generic digital components and propagating failure modes to the system level, which generally is time-consuming and difficult to implement. To overcome the associated issues of implementing the proposed FMEA approach, an automated tool for a digital feedwater control system (DFWCS) has been developed in this study. The automated FMEA tool is in nature a simulation platform developed by using or recreating the original source code of the different module software interfaced by input and output variables that represent physical signals exchanged between modules, the system, and the controlled process. For any given failure mode, its impacts on associated signals are determined first and the variables that correspond to these signals are modified accordingly by the simulation. Criteria are also developed, as part of the simulation platform, to determine whether the system has lost its automatic control function, which is defined as a system failure in this study. The conceptual development of the automated FMEA support tool can be generalized and applied to support FMEAs for reliability assessment of complex digital systems.
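The failure-injection idea behind such a tool can be illustrated with a toy control loop: run the simulation once per failure mode, override the affected signal, and apply a system-failure criterion to the resulting trace. Everything below (the loop dynamics, failure modes, and criterion) is a simplified stand-in, not the DFWCS platform itself.

```python
def simulate_level(fault=None, steps=200):
    """Toy feedwater loop: proportional control of a tank level."""
    level, setpoint, history = 0.8, 1.0, []
    for _ in range(steps):
        sensed = fault(level) if fault else level   # failure mode acts on the signal
        valve = max(0.0, min(1.0, 0.5 * (setpoint - sensed) + 0.5))
        level += 0.05 * (valve - 0.5)               # inflow minus steady demand
        history.append(level)
    return history

FAILURE_MODES = {
    "none": None,
    "sensor stuck high": lambda lvl: 1.5,
    "sensor stuck low": lambda lvl: 0.0,
}

for name, fault in FAILURE_MODES.items():
    trace = simulate_level(fault)
    failed = max(trace) > 1.3 or min(trace) < 0.5   # system-failure criterion
    print(f"{name:18s} -> {'SYSTEM FAILURE' if failed else 'controlled'}")
```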
Direct Metal Deposition of H13 Tool Steel on Copper Alloy Substrate: Parametric Investigation
NASA Astrophysics Data System (ADS)
Imran, M. Khalid; Masood, S. H.; Brandt, Milan
2015-12-01
Over the past decade, researchers have demonstrated interest in tribology and prototyping by laser-aided material deposition processes. Laser-aided direct metal deposition (DMD) enables the formation of a uniform clad by melting metal powder to form the desired component. In this research, H13 tool steel has been clad onto a copper alloy substrate using DMD. The effects of laser parameters on the quality of the DMD-deposited clad have been investigated and acceptable processing parameters have been determined, largely through trial-and-error approaches. The relationships between DMD process parameters and product characteristics such as porosity, micro-cracks and microhardness have been analysed using a scanning electron microscope (SEM), image analysis software (ImageJ) and a microhardness tester. It has been found that DMD parameters such as laser power, powder mass flow rate, feed rate and focus size play an important role in clad quality and crack formation.
Scattering effects of machined optical surfaces
NASA Astrophysics Data System (ADS)
Thompson, Anita Kotha
1998-09-01
Optical fabrication is one of the most labor-intensive industries in existence. Lensmakers use pitch to affix glass blanks to metal chucks that hold the glass as they grind it with tools that have not changed much in fifty years. Recent demands placed on traditional optical fabrication processes in terms of surface accuracy, smoothness, and cost effectiveness have resulted in the exploitation of precision machining technology to develop a new generation of computer numerically controlled (CNC) optical fabrication equipment. This new kind of precision machining process is called deterministic microgrinding. The most conspicuous feature of optical surfaces manufactured by precision machining processes (such as single-point diamond turning or deterministic microgrinding) is the presence of residual cutting tool marks. These residual tool marks exhibit a highly structured topography of periodic azimuthal or radial deterministic marks in addition to random microroughness. These distinct topographic features give rise to surface scattering effects that can significantly degrade optical performance. In this dissertation project we investigate the scattering behavior of machined optical surfaces and their imaging characteristics. In particular, we will characterize the residual optical fabrication errors and relate the resulting scattering behavior to the tool and machine parameters in order to evaluate and improve the deterministic microgrinding process. Other desired information derived from the investigation of scattering behavior is the set of optical fabrication tolerances necessary to satisfy specific image quality requirements. Optical fabrication tolerances are a major cost driver for any precision optical manufacturing technology. The derivation and control of the optical fabrication tolerances necessary for different applications and operating wavelength regimes will play a unique and central role in establishing deterministic microgrinding as a preferred and cost-effective optical fabrication process. Other well-understood optical fabrication processes will also be reviewed, and a performance comparison with the conventional grinding and polishing technique will be made to determine any inherent advantages in the optical quality of surfaces produced by other techniques.
Bainbridge, Daryl; Brazil, Kevin; Krueger, Paul; Ploeg, Jenny; Taniguchi, Alan; Darnay, Julie
2015-05-01
In many countries formal or informal palliative care networks (PCNs) have evolved to better integrate community-based services for individuals with a life-limiting illness. We conducted a cross-sectional survey using a customized tool to determine the perceptions of the processes of palliative care delivery reflective of horizontal integration from the perspective of nurses, physicians and allied health professionals working in a PCN, as well as to assess the utility of this tool. The process elements examined were part of a conceptual framework for evaluating integration of a system of care and centred on interprofessional collaboration. We used the Index of Interdisciplinary Collaboration (IIC) as a basis of measurement. The 86 respondents (85% response rate) placed high value on working collaboratively and most reported being part of an interprofessional team. The survey tool showed utility in identifying strengths and gaps in integration across the network and in detecting variability in some factors according to respondent agency affiliation and profession. Specifically, support for interprofessional communication and evaluative activities were viewed as insufficient. Impediments to these aspects of horizontal integration may be reflective of workload constraints, differences in agency operations or an absence of key structural features.
State Share of Instruction Funding to Ohio Public Community Colleges: A Policy Analysis
ERIC Educational Resources Information Center
Johnson, Betsy
2012-01-01
This study investigated various state policies to determine their impact on the state share of instruction (SSI) funding to community colleges in the state of Ohio. To complete the policy analysis, the researcher utilized three policy analysis tools, defined by Gill and Saunders (2010) as iterative processes, intuition and judgment, and advice and…
An Alternative Evaluation: Online Puzzle as a Course-End Activity
ERIC Educational Resources Information Center
Genç, Zülfü; Aydemir, Emrah
2015-01-01
Purpose: The purpose of this study is to determine whether the use of online puzzles in the instructional process has an effect on student achievement and learning retention. This study examined students' perceptions and experiences of the use of puzzles as an alternative evaluation tool. To achieve this aim, the following hypotheses were tested: using…
Can a Multimedia Tool Help Students' Learning Performance in Complex Biology Subjects?
ERIC Educational Resources Information Center
Koseoglu, Pinar; Efendioglu, Akin
2015-01-01
The aim of the present study was to determine the effects of multimedia-based biology teaching (Mbio) and teacher-centered biology (TCbio) instruction approaches on learners' biology achievements, as well as their views towards learning approaches. During the research process, an experimental design with two groups, TCbio (n = 22) and Mbio (n =…
Rachel Riemann; Kathy Tillman
1999-01-01
The increasing proximity of human development to forest lands and the extent of forest fragmentation caused by this development are major concerns for natural resource managers. Forest fragmentation affects the biodiversity of native flora and fauna, hydrologic processes, and management opportunities. Knowing the extent and location of forest fragmentation and...
Akseli, Ilgaz; Xie, Jingjin; Schultz, Leon; Ladyzhynsky, Nadia; Bramante, Tommasina; He, Xiaorong; Deanne, Rich; Horspool, Keith R; Schwabe, Robert
2017-01-01
Enabling the paradigm of quality by design requires the ability to quantitatively correlate material properties and process variables to measurable product performance attributes. Conventional, quality-by-test methods for determining tablet breaking force and disintegration time usually involve destructive tests, which consume a significant amount of time and labor and provide limited information. Recent advances in material characterization, statistical analysis, and machine learning have provided multiple tools that have the potential to develop nondestructive, fast, and accurate approaches in drug product development. In this work, a methodology to predict the breaking force and disintegration time of tablet formulations using nondestructive ultrasonics and machine learning tools was developed. The input variables to the model include intrinsic properties of the formulation and extrinsic process variables influencing the tablet during manufacturing. The model has been applied to predict breaking force and disintegration time using small quantities of active pharmaceutical ingredient and prototype formulation designs. The novel approach presented is a step forward toward rational design of a robust drug product based on insight into the performance of common materials during formulation and process development. It may also help expedite the drug product development timeline and reduce active pharmaceutical ingredient usage while improving the efficiency of the overall process.
The application of virtual prototyping methods to determine the dynamic parameters of a mobile robot
NASA Astrophysics Data System (ADS)
Kurc, Krzysztof; Szybicki, Dariusz; Burghardt, Andrzej; Muszyńska, Magdalena
2016-04-01
The paper presents methods used to determine the parameters necessary to build a mathematical model of an underwater robot with a crawler drive. The parameters present in the dynamics equation will be determined by means of advanced mechatronic design tools, including CAD/CAE software and FEM modules. The virtual prototyping process is described as well as the various possible uses (design adaptability) depending on the optional accessories added to the vehicle. A mathematical model is presented to show the kinematics and dynamics of the underwater crawler robot, essential for the design stage.
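For the kinematic part of such a model, a common starting point is slip-free skid-steer (track-drive) kinematics integrated over time. The sketch below is generic, and neglecting track slip is a strong assumption for a real underwater crawler.

```python
import math

def integrate_pose(v_left, v_right, track_width, dt, steps, pose=(0.0, 0.0, 0.0)):
    """Skid-steer kinematics: linear and angular speed from the two track
    speeds, integrated with forward Euler. Track slip is neglected."""
    x, y, th = pose
    for _ in range(steps):
        v = (v_right + v_left) / 2.0            # forward speed
        w = (v_right - v_left) / track_width    # yaw rate
        x += v * math.cos(th) * dt
        y += v * math.sin(th) * dt
        th += w * dt
    return x, y, th

# 0.4 m track spacing, gentle left turn for 5 s:
print(integrate_pose(v_left=0.20, v_right=0.25, track_width=0.4, dt=0.01, steps=500))
```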
NASA Astrophysics Data System (ADS)
Balaykin, A. V.; Bezsonov, K. A.; Nekhoroshev, M. V.; Shulepov, A. P.
2018-01-01
This paper dwells upon a variance parameterization method. Variance, or dimensional, parameterization is based on sketching, with various parametric links superimposed on the sketch objects and user-imposed constraints in the form of an equation system that determines the parametric dependencies. This method is fully integrated into a top-down design methodology to enable the creation of multi-variant and flexible fixture assembly models, as all the modeling operations are hierarchically linked in the build tree. In this research the authors consider a parameterization method for the machine tooling used to manufacture parts on multiaxial CNC machining centers in a real manufacturing process. The developed method makes it possible to significantly reduce tooling design time when changes are made to a part's geometric parameters. The method can also reduce the time needed to design and engineer preproduction, in particular for the development of control programs for CNC equipment and coordinate measuring machines, and it can automate the release of design and engineering documentation. Variance parameterization helps to optimize the construction of parts as well as machine tooling using integrated CAE systems. In the framework of this study, the authors demonstrate a comprehensive approach to parametric modeling of machine tooling in the CAD package used in the real manufacturing process of aircraft engines.
Enhanced methodology of focus control and monitoring on scanner tool
NASA Astrophysics Data System (ADS)
Chen, Yen-Jen; Kim, Young Ki; Hao, Xueli; Gomez, Juan-Manuel; Tian, Ye; Kamalizadeh, Ferhad; Hanson, Justin K.
2017-03-01
As technology nodes shrink from 14 nm to 7 nm, the reliability of tool monitoring techniques in advanced semiconductor fabs to achieve high yield and quality becomes more critical. Tool health monitoring methods involve periodic sampling of moderately processed test wafers to check for particles, defects, and tool stability in order to ensure proper tool health. For lithography TWINSCAN scanner tools, the requirements for overlay stability and focus control are very strict. Current scanner tool health monitoring methods include running BaseLiner to ensure proper tool stability on a periodic basis. The focus measurement on YIELDSTAR by real-time or library-based reconstruction of critical dimensions (CD) and side wall angle (SWA) has been demonstrated as an accurate metrology input to the control loop. The high accuracy and repeatability of the YIELDSTAR focus measurement provide a common reference for scanner setup and user process. In order to further improve the metrology and matching performance, Diffraction Based Focus (DBF) metrology, enabling accurate, fast, and non-destructive focus acquisition, has been successfully utilized for focus monitoring/control of TWINSCAN NXT immersion scanners. The optimal DBF target was determined to have minimized dose crosstalk, dynamic precision, set-get residual, and lens aberration sensitivity. By exploiting this new measurement target design, 80% improvement in tool-to-tool matching, >16% improvement in run-to-run mean focus stability, and >32% improvement in focus uniformity have been demonstrated compared to the previous BaseLiner methodology. Matching <2.4 nm across multiple NXT immersion scanners has been achieved with the new baseline-reference methodology. This baseline technique, with either the conventional BaseLiner low numerical aperture (NA = 1.20) mode or the advanced-illumination high-NA mode (NA = 1.35), has also been evaluated to have consistent performance. This enhanced methodology of focus control and monitoring under multiple illumination conditions opens an avenue to significantly reduce Focus-Exposure Matrix (FEM) wafer exposures for new product/layer best focus (BF) setup.
EconoMe-Develop - a calculation tool for multi-risk assessment and benefit-cost-analysis
NASA Astrophysics Data System (ADS)
Bründl, M.
2012-04-01
Public money is used to finance the protection of human life, material assets, and the environment against natural hazards. This limited resource should be used so that it achieves the maximum possible effect, reducing as much risk as possible. Decision-makers therefore face the question of which mitigation measures to prioritize. Benefit-Cost Analysis (BCA) is a recognized method for determining the economic efficiency of investments in mitigation measures. In Switzerland, the Federal Office for the Environment (FOEN) judges the benefit-cost ratio of mitigation projects on the basis of results from the calculation tool "EconoMe" [1]. Checking the economic efficiency of mitigation projects with an investment of more than 1 million CHF (800,000 EUR) using "EconoMe" has been mandatory in Switzerland since 2008. Within "EconoMe", most calculation parameters cannot be changed by the user, which keeps results comparable. Based on the risk guideline "RIKO" [2], an extended version of the operational "EconoMe", called "EconoMe-Develop", was developed. "EconoMe-Develop" can deal with various natural hazard processes and thus allows multi-risk assessments, since the restrictions of the operational version, such as limits on the number of scenarios and expositions, vulnerability, spatial probability of processes, and probability of presence of objects, are removed. Additionally, the influence of uncertainty in calculation factors, such as vulnerability, on the final results can be determined. "EconoMe-Develop" offers import and export of data, e.g., results of GIS analyses. The possibility of adapting the tool to user-specific requirements makes "EconoMe-Develop" an easy-to-use tool for risk experts assessing risk and the economic efficiency of mitigation projects. In the paper we present the most important features of the tool and illustrate its application with a practical example.
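Although the abstract does not give EconoMe's internal formulas, the benefit-cost logic it implements can be sketched in a few lines. The following Python sketch is illustrative only: the risk decomposition (frequency × spatial probability × probability of presence × vulnerability × value) is a standard risk-analysis formulation, and all names and numbers are invented, not EconoMe parameters.

```python
# Minimal sketch of the benefit-cost logic behind an EconoMe-style tool.
# All parameter names and values are illustrative, not taken from EconoMe.

def annual_risk(scenarios):
    """Collective annual risk in CHF/yr: sum over scenarios of
    frequency * spatial probability * presence probability * vulnerability * value."""
    return sum(s["frequency"] * s["p_spatial"] * s["p_presence"]
               * s["vulnerability"] * s["value"] for s in scenarios)

def benefit_cost_ratio(risk_before, risk_after, annual_cost):
    """Benefit = annual risk reduction; cost = annualized cost of the measure."""
    return (risk_before - risk_after) / annual_cost

# Example: a 30-year event scenario, before and after a mitigation measure
# that lowers vulnerability from 0.5 to 0.1 (hypothetical numbers).
before = [{"frequency": 1/30, "p_spatial": 0.8, "p_presence": 0.6,
           "vulnerability": 0.5, "value": 2_000_000}]
after  = [{"frequency": 1/30, "p_spatial": 0.8, "p_presence": 0.6,
           "vulnerability": 0.1, "value": 2_000_000}]

print(benefit_cost_ratio(annual_risk(before), annual_risk(after), 25_000))
```

A ratio above 1 would indicate an economically efficient measure under these assumptions.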
DiMaio, F; Chiu, W
2016-01-01
Electron cryo-microscopy (cryoEM) has advanced dramatically to become a viable tool for high-resolution structural biology research. The ultimate outcome of a cryoEM study is an atomic model of a macromolecule or its complex with interacting partners. This chapter describes a variety of algorithms and software to build a de novo model based on the cryoEM 3D density map, to optimize the model with the best stereochemistry restraints and finally to validate the model with proper protocols. The full process of atomic structure determination from a cryoEM map is described. The tools outlined in this chapter should prove extremely valuable in revealing atomic interactions guided by cryoEM data. © 2016 Elsevier Inc. All rights reserved.
Probst, Yasmine; Morrison, Evan; Sullivan, Emma; Dam, Hoa Khanh
2016-07-28
Standardizing the background diet of participants during a dietary randomized controlled trial is vital to trial outcomes. For this process, dietary modeling based on food groups and their target servings is employed via a dietary prescription before an intervention, often using a manual process. Partial automation has employed the use of linear programming. Validity of the modeling approach is critical to allow trial outcomes to be translated to practice. This paper describes the first-stage development of a tool to automatically perform dietary modeling using food group and macronutrient requirements as a test case. The Dietary Modeling Tool (DMT) was then compared with existing approaches to dietary modeling (manual and partially automated), which were previously available to dietitians working within a dietary intervention trial. Constraint optimization techniques were implemented to determine whether nonlinear constraints are best suited to the development of the automated dietary modeling tool using food composition and food consumption data. Dietary models were produced and compared with a manual Microsoft Excel calculator, a partially automated Excel Solver approach, and the automated DMT that was developed. The web-based DMT was produced using nonlinear constraint optimization, incorporating estimated energy requirement calculations, nutrition guidance systems, and the flexibility to amend food group targets for individuals. Percentage differences between modeling tools revealed similar results for the macronutrients. Polyunsaturated fatty acids and monounsaturated fatty acids showed greater variation between tools (practically equating to a 2-teaspoon difference), although this was not considered clinically significant when the whole diet, as opposed to targeted nutrients or energy requirements, was being addressed. Automated modeling tools can streamline the modeling process for dietary intervention trials, ensuring consistency of the background diets, although appropriate constraints must be used in their development to achieve desired results. The DMT was found to be a valid automated tool producing similar results to tools with less automation. The results of this study suggest interchangeability of the modeling approaches used, although implementation should reflect the requirements of the dietary intervention trial in which it is used.
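As a rough illustration of the constraint-optimization idea behind a tool like the DMT, the sketch below chooses food-group servings that stay close to a recommended serving pattern while keeping macronutrients within tolerance of their targets. It is a minimal stand-in under stated assumptions: the composition data, tolerances, and quadratic objective are invented, and the actual tool's constraints and food composition database are not reproduced here.

```python
# Hedged sketch: diet modeling as constrained optimization.
import numpy as np
from scipy.optimize import minimize

# grams of (protein, fat, carbohydrate) per serving for five food groups (mock data)
composition = np.array([[3.0, 0.5, 15.0],    # fruit
                        [2.0, 0.2, 5.0],     # vegetables
                        [3.0, 1.0, 30.0],    # grains
                        [25.0, 8.0, 0.0],    # meat/alternatives
                        [0.0, 5.0, 0.0]])    # fats/oils
target = np.array([90.0, 60.0, 230.0])       # daily macro targets (g), illustrative
recommended = np.array([2.0, 5.0, 6.0, 2.5, 6.0])   # serving guidance to stay near

def distance_from_guidance(servings):
    # prefer diets that stay close to the recommended serving pattern
    return np.sum((servings - recommended) ** 2)

constraints = []
for i in range(3):                            # each macro within +/-10% of target
    constraints.append({"type": "ineq",
                        "fun": lambda s, i=i: composition[:, i] @ s - 0.9 * target[i]})
    constraints.append({"type": "ineq",
                        "fun": lambda s, i=i: 1.1 * target[i] - composition[:, i] @ s})

res = minimize(distance_from_guidance, recommended,
               bounds=[(0.0, 12.0)] * 5, constraints=constraints)
print(np.round(res.x, 1), "servings ->", np.round(composition.T @ res.x), "g macros")
```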
Optimized filtration for reduced defectivity and improved dispense recipe in 193-nm BARC lithography
NASA Astrophysics Data System (ADS)
Do, Phong; Pender, Joe; Lehmann, Thomas; Mc Ardle, Leo P.; Gotlinsky, Barry; Mesawich, Michael
2004-05-01
The implementation of 193-nm lithography into production has been complicated by high defectivity. Many companies have struggled with high defect densities, forcing process and lithography engineers to focus their efforts on chemical filtration instead of process development. After-etch defects have further complicated efforts to reduce this problem. In particular, it has been determined that chemical filtration at the 90-nm node and below is a crucial item that current industry-standard pump recipes and material choices cannot address. LSI Logic and Pall Corporation have been working together to explore alternative materials and resist pump process parameters to address these issues; these changes free up process development time by reducing high-defect-density excursions. This paper provides a fundamental understanding of how 20-nm filtration, combined with optimized resist pump set-up and dispense, can significantly reduce defects in 193-nm lithography. The purpose of this study is to examine the effectiveness of 20-nm-rated filters in reducing various defects observed in bottom anti-reflective coating (BARC) materials. Multiple filter types were installed on a Tokyo Electron Limited Clean Track ACT8 tool utilizing two-stage resist pumps. Lithographic performance of the filtered resist and defect analysis of patterned and non-patterned wafers were performed. Optimized pump start-up and dispense recipes were also evaluated to determine their effect on defect improvement. The track system used in this experiment was a standard production tool and was not modified from its original specifications.
Investigation of Polyurethane Electrospinning Process Efficiency
NASA Astrophysics Data System (ADS)
Kimmer, Dusan; Zatloukal, Martin; Petras, David; Vincent, Ivo; Slobodian, Petr
2009-07-01
The electrospinning process efficiency of different PUs has been investigated. Specific attention has been paid to understanding the role of PU soft segments and synthesis type in the stability of the PU solution and the electrospinning process, as well as in the quality/property changes of the produced nanofibres. PU samples taken before and after the process were analyzed rheologically, and relaxation spectra were determined for all of them from frequency-dependent loss and storage moduli measurements. It has been found that rheological analysis of the PU used for electrospinning can be a useful tool for assessing and optimizing electrospinning process efficiency. The homogeneity of nanolayers produced over several hours of manufacture in the optimized electrospinning process is demonstrated by selected aerosol filtration properties.
Cutting process simulation of flat drill
NASA Astrophysics Data System (ADS)
Tamura, Shoichi; Matsumura, Takashi
2018-05-01
Flat drills with a point angle of 180 deg. have recently been developed for drilling automobile parts with inclined workpiece surfaces. This paper studies the cutting processes of flat drills in an analytical simulation. A predictive force model is applied to simulate the cutting force together with the chip flow direction. The chip flow model is constructed by stacking orthogonal cuttings in the planes containing the cutting velocities and the chip flow velocities, in which the chip flow direction is determined so as to minimize the cutting energy. The cutting force is then predicted for the chip flow direction so determined. The typical cutting force of the flat drill is discussed in comparison with that of a standard drill; characteristic differences are confirmed in the cutting force change during tool engagement and disengagement. The cutting force is then simulated in drilling an inclined workpiece with a flat drill, and the horizontal components of the cutting forces are simulated for varying plate inclination angles. The horizontal force component in flat drilling is stable enough to be controlled, which benefits machining accuracy and helps avoid tool breakage.
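The energy-minimization step described above can be sketched as a one-dimensional search over the chip flow angle. The element energy function below is a placeholder, since the abstract does not reproduce the paper's force model; only the structure (sum the energy over orthogonal cutting elements, minimize over the flow angle) reflects the described approach.

```python
# Illustrative sketch of the energy-minimization step in a chip flow model.
# The per-element energy expression is a stand-in, not the paper's force model.
import numpy as np
from scipy.optimize import minimize_scalar

def cutting_energy(eta, elements):
    """Total cutting energy for a trial chip flow angle eta (rad),
    summed over orthogonal cutting elements along the drill lip."""
    # Each element dissipates more energy as the global chip flow deviates
    # from the element's local flow direction (illustrative model).
    return sum(e["shear_energy"]
               + e["friction_energy"] * (1 - np.cos(eta - e["inclination"]))
               for e in elements)

# Mock elements along the lip with varying local inclination angles
elements = [{"shear_energy": 1.0, "friction_energy": 0.4, "inclination": inc}
            for inc in np.linspace(-0.3, 0.5, 9)]

res = minimize_scalar(cutting_energy, bounds=(-np.pi / 2, np.pi / 2),
                      args=(elements,), method="bounded")
print(f"chip flow angle minimizing energy: {res.x:.3f} rad")
```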
Distributed Aerodynamic Sensing and Processing Toolbox
NASA Technical Reports Server (NTRS)
Brenner, Martin; Jutte, Christine; Mangalam, Arun
2011-01-01
A Distributed Aerodynamic Sensing and Processing (DASP) toolbox was designed and fabricated for flight test applications with an Aerostructures Test Wing (ATW) mounted under the fuselage of an F-15B on the Flight Test Fixture (FTF). DASP monitors and processes the aerodynamics with the structural dynamics using nonintrusive, surface-mounted, hot-film sensing. This aerodynamic measurement tool benefits programs devoted to static/dynamic load alleviation, body freedom flutter suppression, buffet control, improvement of aerodynamic efficiency through cruise control, supersonic wave drag reduction through shock control, etc. This DASP toolbox measures local and global unsteady aerodynamic load distribution with distributed sensing. It determines correlation between aerodynamic observables (aero forces) and structural dynamics, and allows control authority increase through aeroelastic shaping and active flow control. It offers improvements in flutter suppression and, in particular, body freedom flutter suppression, as well as aerodynamic performance of wings for increased range/endurance of manned/unmanned flight vehicles. Other improvements include inlet performance with closed-loop active flow control, and development and validation of advanced analytical and computational tools for unsteady aerodynamics.
Using task analysis to understand the Data System Operations Team
NASA Technical Reports Server (NTRS)
Holder, Barbara E.
1994-01-01
The Data Systems Operations Team (DSOT) currently monitors the Multimission Ground Data System (MGDS) at JPL. The MGDS currently supports five spacecraft and within the next five years, it will support ten spacecraft simultaneously. The ground processing element of the MGDS consists of a distributed UNIX-based system of over 40 nodes and 100 processes. The MGDS system provides operators with little or no information about the system's end-to-end processing status or end-to-end configuration. The lack of system visibility has become a critical issue in the daily operation of the MGDS. A task analysis was conducted to determine what kinds of tools were needed to provide DSOT with useful status information and to prioritize the tool development. The analysis provided the formality and structure needed to get the right information exchange between development and operations. How even a small task analysis can improve developer-operator communications is described, and the challenges associated with conducting a task analysis in a real-time mission operations environment are examined.
Hanke, Alexander T; Tsintavi, Eleni; Ramirez Vazquez, Maria Del Pilar; van der Wielen, Luuk A M; Verhaert, Peter D E M; Eppink, Michel H M; van de Sandt, Emile J A X; Ottens, Marcel
2016-09-01
Knowledge-based development of chromatographic separation processes requires efficient techniques to determine the physicochemical properties of the product and the impurities to be removed. These characterization techniques are usually divided into approaches that determine molecular properties, such as charge, hydrophobicity and size, or molecular interactions with auxiliary materials, commonly in the form of adsorption isotherms. In this study we demonstrate the application of a three-dimensional liquid chromatography approach to a clarified cell homogenate containing a therapeutic enzyme. Each separation dimension determines a molecular property relevant to the chromatographic behavior of each component. Matching of the peaks across the different separation dimensions and against a high-resolution reference chromatogram makes it possible to assign the determined parameters to pseudo-components and to identify the most promising technique for the removal of each impurity. More detailed process design using mechanistic models requires isotherm parameters. For this purpose, the second dimension consists of multiple linear gradient separations on columns in a high-throughput-screening-compatible format, which allow regression of isotherm parameters with an average standard error of 8%. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1283-1291, 2016. © 2016 American Institute of Chemical Engineers.
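For the isotherm-regression step, one commonly used approach for linear gradient ion-exchange data relates the normalized gradient slope GH to the elution ionic strength I_R; the sketch below fits two isotherm parameters to mock data with nonlinear least squares. Treat the GH-I_R relation, the data, and the initial guesses as illustrative assumptions, not the study's actual model or results.

```python
# Hedged sketch: regressing ion-exchange-like isotherm parameters from linear
# gradient elution data. The relation I_R = (K*(nu+1)*GH)**(1/(nu+1)) is one
# commonly used form; K, nu, and the data below are illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def elution_salt(GH, K, nu):
    """Predicted elution ionic strength I_R for a normalized gradient slope GH."""
    return (K * (nu + 1) * GH) ** (1.0 / (nu + 1))

GH = np.array([0.05, 0.1, 0.2, 0.4])      # normalized gradient slopes (mock)
I_R = np.array([0.30, 0.35, 0.40, 0.46])  # measured elution salt, M (mock)

(K, nu), cov = curve_fit(elution_salt, GH, I_R, p0=(0.02, 3.0),
                         bounds=(0, np.inf))
stderr = np.sqrt(np.diag(cov))            # standard errors of the fit
print(f"K = {K:.3g} +/- {stderr[0]:.2g}, nu = {nu:.2f} +/- {stderr[1]:.2f}")
```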
Logistics Process Analysis Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
2008-03-31
LPAT is the integrated system resulting from the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.
Innovative Stormwater Quality Tools by SARA for Holistic Watershed Master Planning
NASA Astrophysics Data System (ADS)
Thomas, S. M.; Su, Y. C.; Hummel, P. R.
2016-12-01
Stormwater management strategies such as Best Management Practices (BMP) and Low-Impact Development (LID) have increasingly gained attention in urban runoff control, becoming vital to holistic watershed master plans. These strategies can help address existing water quality impairments and support regulatory compliance, as well as guide planning and management of future development when substantial population growth and urbanization are projected to occur. However, past efforts have been limited to qualitative planning due to the lack of suitable tools for quantitative assessment. The San Antonio River Authority (SARA), with the assistance of Lockwood, Andrews & Newnam, Inc. (LAN) and AQUA TERRA Consultants (a division of RESPEC), developed comprehensive hydrodynamic and water quality models using the Hydrological Simulation Program-FORTRAN (HSPF) for several urban watersheds in the San Antonio River Basin. These models enabled watershed managers to examine water quality issues on a more refined temporal and spatial scale than the limited monitoring data allow. They also provided a means to locate and quantify potential water quality impairments and evaluate the effects of mitigation measures. To support the models, a suite of software tools was developed, including: 1) the SARA Timeseries Utility Tool for managing and processing large model timeseries files, 2) the SARA Load Reduction Tool to determine the load reductions needed to achieve screening levels for each modeled constituent on a sub-basin basis, and 3) the SARA Enhanced BMP Tool to determine the optimal combination of BMP types and units needed to achieve the required load reductions. Using these SARA models and tools, water quality agencies and stormwater professionals can determine the optimal combinations of BMP/LID to accomplish their goals and save substantial stormwater infrastructure and management costs. The tools can also help regulators and permittees evaluate the feasibility of achieving compliance using BMP/LID. The project has gained national attention, being showcased in multiple newsletters, professional magazines, and conference presentations. The project also won the Texas American Council of Engineering Companies (ACEC) Gold Medal Award and the ACEC National Recognition Award in 2016.
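In its simplest form, the load-reduction arithmetic attributed to the SARA Load Reduction Tool reduces to computing the fractional cut needed to bring each sub-basin's modeled load down to its screening level. A minimal sketch, with hypothetical sub-basin names, loads, and screening levels:

```python
# Illustrative sketch of per-sub-basin load reduction logic.
# Names and numbers are hypothetical, not SARA tool internals.

def required_reduction(modeled_load, screening_level):
    """Fraction of the modeled constituent load that must be removed
    so the remaining load meets the screening level."""
    if modeled_load <= screening_level:
        return 0.0
    return 1.0 - screening_level / modeled_load

subbasins = {"SB-01": (1250.0, 800.0),   # (modeled load, screening level), kg/yr
             "SB-02": (640.0, 800.0),
             "SB-03": (2100.0, 800.0)}

for name, (load, target) in subbasins.items():
    print(f"{name}: reduce by {required_reduction(load, target):.0%}")
```

The required reductions would then feed a BMP-selection step that picks the cheapest combination of practices achieving them.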
Initial Navigation Alignment of Optical Instruments on GOES-R
NASA Astrophysics Data System (ADS)
Isaacson, P.; DeLuccia, F.; Reth, A. D.; Igli, D. A.; Carter, D.
2016-12-01
The GOES-R satellite is the first in NOAA's next-generation series of geostationary weather satellites. In addition to a number of space weather sensors, it will carry two principal optical earth-observing instruments, the Advanced Baseline Imager (ABI) and the Geostationary Lightning Mapper (GLM). During launch, currently scheduled for November of 2016, the alignment of these optical instruments is anticipated to shift from that measured during pre-launch characterization. While both instruments have image navigation and registration (INR) processing algorithms to enable automated geolocation of the collected data, the launch-derived misalignment may be too large for these approaches to function without an initial adjustment to calibration parameters. The parameters that may require adjustment are for Line of Sight Motion Compensation (LMC), and the adjustments will be estimated on orbit during the post-launch test (PLT) phase. We have developed approaches to estimate the initial alignment errors for both ABI and GLM image products. Our approaches involve comparison of ABI and GLM images collected during PLT to a set of reference ("truth") images using custom image processing tools and other software (the INR Performance Assessment Tool Set, or "IPATS") being developed for other INR assessments of ABI and GLM data. IPATS is based on image correlation approaches to determine offsets between input and reference images, and these offsets are the fundamental input to our estimate of the initial alignment errors. Initial testing of our alignment algorithms on proxy datasets lends high confidence that their application will determine the initial alignment errors to within sufficient accuracy to enable the operational INR processing approaches to proceed in a nominal fashion. We will report on the algorithms, implementation approach, and status of these initial alignment tools being developed for the GOES-R ABI and GLM instruments.
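The offset-estimation step that IPATS bases on image correlation can be illustrated with textbook phase correlation; the sketch below recovers a known integer-pixel shift between a collected image and a reference. This is a generic illustration of the approach, not the IPATS implementation.

```python
# Hedged sketch: estimating the (row, col) shift between a collected image and
# a reference "truth" image via phase correlation.
import numpy as np

def phase_correlation_shift(image, reference):
    """Integer-pixel (dy, dx) shift of `image` relative to `reference`."""
    cross = np.fft.fft2(image) * np.conj(np.fft.fft2(reference))
    cross /= np.abs(cross) + 1e-12              # keep phase information only
    corr = np.real(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > corr.shape[0] // 2:                 # wrap to signed shifts
        dy -= corr.shape[0]
    if dx > corr.shape[1] // 2:
        dx -= corr.shape[1]
    return dy, dx

rng = np.random.default_rng(0)
reference = rng.random((128, 128))
collected = np.roll(reference, (3, -5), axis=(0, 1))   # simulate a misalignment
print(phase_correlation_shift(collected, reference))   # -> (3, -5)
```

Real imagery requires sub-pixel refinement and care with edges and clouds, which a production tool set would add on top of this core idea.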
Microstructure Modeling of 3rd Generation Disk Alloys
NASA Technical Reports Server (NTRS)
Jou, Herng-Jeng
2010-01-01
The objective of this program is to model, validate, and predict the precipitation microstructure evolution, using PrecipiCalc (QuesTek Innovations LLC) software, for 3rd generation Ni-based gas turbine disc superalloys during processing and service, with a set of logical and consistent experiments and characterizations. Furthermore, within this program, the originally research-oriented microstructure simulation tool will be further improved and implemented to be a useful and user-friendly engineering tool. In this report, the key accomplishment achieved during the second year (2008) of the program is summarized. The activities of this year include final selection of multicomponent thermodynamics and mobility databases, precipitate surface energy determination from nucleation experiment, multiscale comparison of predicted versus measured intragrain precipitation microstructure in quench samples showing good agreement, isothermal coarsening experiment and interaction of grain boundary and intergrain precipitates, primary microstructure of subsolvus treatment, and finally the software implementation plan for the third year of the project. In the following year, the calibrated models and simulation tools will be validated against an independently developed experimental data set, with actual disc heat treatment process conditions. Furthermore, software integration and implementation will be developed to provide material engineers valuable information in order to optimize the processing of the 3rd generation gas turbine disc alloys.
An intelligent advisor for the design manager
NASA Technical Reports Server (NTRS)
Rogers, James L.; Padula, Sharon L.
1989-01-01
A design problem is viewed as a complex system divisible into modules. Before the design of a complex system can begin, much time and money are spent in determining the couplings among modules and the presence of iterative loops. This is important because the design manager must know how to group the modules into subsystems and how to assign subsystems to design teams so that changes in one subsystem will have predictable effects on other subsystems. Determining these subsystems is not an easy, straightforward process, and important couplings are often overlooked. Moreover, the planning task must be repeated as new information becomes available or as the design specifications change. The purpose of this research effort is to develop a knowledge-based tool to act as an intelligent advisor for the design manager. This tool identifies the subsystems of a complex design problem, orders them into a well-structured format, and marks the couplings among the subsystems to facilitate the use of multilevel tools. The tool was tested in the decomposition of the COFS (Control of Flexible Structures) mast design, which has about 50 modules. This test indicated that this type of approach could lead to substantial savings by organizing and displaying a complex problem as a sequence of subsystems easily divisible among design teams.
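One standard way to realize this kind of decomposition is to treat the modules as a directed graph, detect iterative loops as strongly connected components, and order the result topologically. The sketch below uses Tarjan's algorithm on a made-up module graph; it illustrates the general technique, not the knowledge-based tool itself.

```python
# Hedged sketch: find iterative design loops (strongly connected components)
# among coupled modules, then order them so information flows forward.
from collections import defaultdict

def tarjan_scc(graph):
    """Return strongly connected components (each an iterative design loop)."""
    index, low, on_stack, stack, sccs = {}, {}, set(), [], []
    counter = [0]

    def visit(v):
        index[v] = low[v] = counter[0]; counter[0] += 1
        stack.append(v); on_stack.add(v)
        for w in graph[v]:
            if w not in index:
                visit(w); low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:              # v is the root of an SCC
            scc = []
            while True:
                w = stack.pop(); on_stack.discard(w); scc.append(w)
                if w == v:
                    break
            sccs.append(scc)

    for v in list(graph):
        if v not in index:
            visit(v)
    return sccs                              # emitted in reverse topological order

# Made-up module couplings: "aero" and "loads" feed each other (a loop)
graph = defaultdict(list, {"geometry": ["aero", "structure"],
                           "aero": ["loads"], "loads": ["aero", "structure"],
                           "structure": ["weights"], "weights": []})
for subsystem in reversed(tarjan_scc(graph)):
    print(subsystem)     # geometry, then the aero/loads loop, then downstream
```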
Keeley, Jon E.
2015-01-01
In grasslands fire may play a role in the plant invasion process, both by creating disturbances that potentially favour non-native invasions and as a possible tool for controlling alien invasions. Havill et al. (Applied Vegetation Science, 18, 2015, this issue) determine how native and non-native species respond to different fire regimes as a first step in understanding the potential control of invasive grasses.
ERIC Educational Resources Information Center
Abrahams, Fatima; Friedrich, Christian; Tredoux, Nanette
2012-01-01
South African higher education institutions are experiencing challenges regarding access, redress and the successful completion of programmes in an environment where there are still imbalances in the schooling system. Tools are needed that will assist with the process of selecting students. The aim of this study is to determine whether a test…
ERIC Educational Resources Information Center
Kadyrova, Alina A.; Valeev, Agzam A.
2016-01-01
A number of trends have been identified in the process of intensifying higher-school training, including the integration of professional, linguistic, and cultural training of professionals in unity with the development of their personal qualities. For this reason, modern educational technologies serve as a tool for practical implementation…
ERIC Educational Resources Information Center
Nilsson, Tor; Niedderer, Hans
2012-01-01
In undergraduate chemical thermodynamics, teachers often include equations and view the manipulation of variables as understanding. Undergraduate students are often unable to describe the meaning of these equations. In chemistry, enthalpy and its change are introduced to describe some features of chemical reactions. In the process of measuring heat…
Overview of Digital Forensics Algorithms in Dslr Cameras
NASA Astrophysics Data System (ADS)
Aminova, E.; Trapeznikov, I.; Priorov, A.
2017-05-01
The widespread use of mobile technologies and improvements in digital photographic devices have led to more frequent cases of image falsification, including in judicial practice. Consequently, a pressing task for modern digital image processing tools is the development of algorithms for determining the source and model of a DSLR (Digital Single Lens Reflex) camera, and the improvement of image formation algorithms. Most research in this area is based on the observation that a unique sensor trace of a DSLR camera can be extracted at a certain stage of the in-camera imaging process. The study focuses on the problem of determining unique features of DSLR cameras based on optical subsystem artifacts and sensor noise.
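A common sensor-trace technique in this literature is PRNU fingerprinting: average the noise residuals of several images from one camera, then correlate a query image's residual against that fingerprint. The sketch below is a simplified illustration with simulated data (Gaussian denoising stands in for the wavelet filters used in practice); it shows the idea, not a forensic-grade pipeline.

```python
# Hedged sketch of PRNU-style camera fingerprinting on simulated images.
import numpy as np
from scipy.ndimage import gaussian_filter

def noise_residual(img):
    """High-frequency residual left after removing denoised image content."""
    return img - gaussian_filter(img, sigma=2)

def fingerprint(images):
    return np.mean([noise_residual(i) for i in images], axis=0)

def similarity(residual, fp):
    r, f = residual.ravel(), fp.ravel()
    r = (r - r.mean()) / r.std()
    f = (f - f.mean()) / f.std()
    return float(np.mean(r * f))            # normalized cross-correlation

rng = np.random.default_rng(1)
prnu = 0.02 * rng.standard_normal((64, 64))          # simulated sensor pattern
shots = [np.clip(0.5 + 0.1 * rng.standard_normal((64, 64)) + prnu, 0, 1)
         for _ in range(8)]
fp = fingerprint(shots)
same_cam = np.clip(0.5 + 0.1 * rng.standard_normal((64, 64)) + prnu, 0, 1)
other_cam = np.clip(0.5 + 0.1 * rng.standard_normal((64, 64)), 0, 1)
print(similarity(noise_residual(same_cam), fp),      # noticeably positive
      similarity(noise_residual(other_cam), fp))     # near zero
```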
Development of Tool Representations in the Dorsal and Ventral Visual Object Processing Pathways
Kersey, Alyssa J.; Clark, Tyia S.; Lussier, Courtney A.; Mahon, Bradford Z.; Cantlon, Jessica F.
2016-01-01
Tools represent a special class of objects, because they are processed across both the dorsal and ventral visual object processing pathways. Three core regions are known to be involved in tool processing: the left posterior middle temporal gyrus, the medial fusiform gyrus (bilaterally), and the left inferior parietal lobule. A critical and relatively unexplored issue concerns whether, in development, tool preferences emerge at the same time and to a similar degree across all regions of the tool-processing network. To test this issue, we used functional magnetic resonance imaging to measure the neural amplitude, peak location, and the dispersion of tool-related neural responses in the youngest sample of children tested to date in this domain (ages 4–8 years). We show that children recruit overlapping regions of the adult tool-processing network and also exhibit similar patterns of co-activation across the network to adults. The amplitude and co-activation data show that the core components of the tool-processing network are established by age 4. Our findings on the distributions of peak location and dispersion of activation indicate that the tool network undergoes refinement between ages 4 and 8 years. PMID:26108614
Cobbledick, Jeffrey; Nguyen, Alexander; Latulippe, David R
2014-07-01
The current challenges associated with the design and operation of net-energy-positive wastewater treatment plants demand sophisticated approaches for the monitoring of polymer-induced flocculation. In anaerobic digestion (AD) processes, the dewaterability of the sludge is typically assessed from off-line lab-bench tests; the capillary suction time (CST) test is one of the most common. Focused beam reflectance measurement (FBRM) is a promising technique for real-time monitoring of critical performance attributes in large-scale processes and is ideally suited for dewatering applications. The flocculation performance of twenty-four cationic polymers that spanned a range of polymer size and charge properties was measured using both the FBRM and CST tests. Analysis of the data revealed a decreasing monotonic trend; the samples that had the highest percent removal of particles less than 50 microns in size, as determined by FBRM, had the lowest CST values. A subset of the best-performing polymers was used to evaluate the effects of dosage amount and digestate sources on dewatering performance. The results from this work show that FBRM is a powerful tool that can be used for optimization and on-line monitoring of dewatering processes. Copyright © 2014 Elsevier Ltd. All rights reserved.
Does a selection interview predict year 1 performance in dental school?
McAndrew, R; Ellis, J; Valentine, R A
2017-05-01
It is important for dental schools to select students who will complete their degree and progress on to become the dentists of the future. The process should be transparent, fair and ethical and should utilise selection tools that identify appropriate students. The interview is an integral part of UK dental schools' student selection procedures. This study was undertaken to determine whether different interview methods (Cardiff with a multiple mini interview and Newcastle with a more traditional interview process), along with other components used in selection, predicted academic performance. The admissions selection data for two dental schools (Cardiff and Newcastle) were collected and analysed alongside student performance in academic examinations in Year 1 of the respective schools. Correlation statistics were used to determine whether selection tools had any relevance to academic performance once students were admitted to their respective universities. Data were available for a total of 177 students (77 Cardiff and 100 Newcastle). Examination performance did not correlate with admission interview scores at either school; however, UKCAT score was linked to poor academic performance. Although interview methodology does not appear to correlate with academic performance, the interview remains an integral and very necessary part of the admissions process. Ultimately, schools need to be comfortable that their admissions procedures attract and select the calibre of students they desire. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
ASQ Program Observation Instrument: A Tool for Assessing School-Age Child Care Quality.
ERIC Educational Resources Information Center
O'Connor, Susan; And Others
ASQ (Assessing School-Age Child Care Quality) is a system for determining the quality of school-age child care programs. The ASQ Program Observation Instrument is a ten-step, self-assessment process to guide program improvement. This instrument does not work well in full-day programs that have a single focus, but works well in programs that offer…
Depth of manual dismantling analysis: a cost-benefit approach.
Achillas, Ch; Aidonis, D; Vlachokostas, Ch; Karagiannidis, A; Moussiopoulos, N; Loulos, V
2013-04-01
This paper presents a decision support tool for manufacturers and recyclers addressing end-of-life strategies for waste electrical and electronic equipment. A mathematical formulation based on the cost-benefit analysis concept is described analytically in order to determine the parts and/or components of an obsolete product that should either be non-destructively recovered for reuse or be recycled. The framework optimally determines the depth of disassembly for a given product, taking economic considerations into account. On this basis, it embeds all relevant cost elements in the decision-making process, such as recovered materials and (depreciated) parts/components, labor costs, energy consumption, equipment depreciation, quality control and warehousing. This tool can be part of the strategic decision-making process in order to maximize profitability or minimize end-of-life management costs. A case study demonstrating the model's applicability is presented for a product that is typical of electronic equipment in terms of structure and material composition. Taking into account the market values of the pilot product's components, manual disassembly proves profitable, with the marginal revenues from recovered reusable materials estimated at 2.93-23.06 €, depending on the level of disassembly. Copyright © 2013 Elsevier Ltd. All rights reserved.
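The depth-of-disassembly decision can be sketched as accumulating net revenue level by level and stopping where it peaks. The per-level revenues and costs below are invented for illustration, not the paper's case-study figures.

```python
# Minimal sketch of choosing the profit-maximizing disassembly depth.
# Per-level revenues and costs are illustrative placeholders.

def best_depth(levels):
    """levels[i] = (revenue from parts recovered at level i+1,
                    cost of labor/energy/etc. to reach that level).
    Returns the depth with the highest cumulative net revenue."""
    best, best_net, net = 0, 0.0, 0.0
    for depth, (revenue, cost) in enumerate(levels, start=1):
        net += revenue - cost
        if net > best_net:
            best, best_net = depth, net
    return best, best_net

levels = [(4.10, 1.20), (6.50, 3.00), (3.20, 4.80), (5.00, 2.10)]
depth, profit = best_depth(levels)
print(f"disassemble to level {depth}, net revenue {profit:.2f} EUR")
```

Note that a locally unprofitable level (level 3 above) can still be worth passing through if deeper levels recover enough value, which is why the search keeps accumulating instead of stopping at the first loss.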
Persson, Johanna; Dalholm, Elisabeth Hornyánszky; Johansson, Gerd
2014-01-01
To demonstrate the use of visualization and simulation tools for involving stakeholders and informing hospital change processes, illustrated by an empirical study from a children's emergency clinic. Reorganization and redevelopment of a hospital is a complex activity that involves many stakeholders and demands. Visualization and simulation tools have proven useful for involving practitioners and eliciting relevant knowledge. More knowledge is desired about how these tools can be implemented in practice for hospital planning processes. A participatory planning process including practitioners and researchers was executed over a 3-year period to evaluate a combination of visualization and simulation tools for involving stakeholders in the planning process and eliciting knowledge about needs and requirements. The initial clinic proposal from the architect was discarded as a result of the empirical study. Much general knowledge about the needs of the organization was extracted by means of the adopted tools. Some of the tools proved to be more accessible than others for the practitioners participating in the study. The combination of tools added value to the process by presenting information in alternative ways and eliciting questions from different angles. Visualization and simulation tools inform a planning process (or other types of change processes) by providing the means to see beyond present demands and current work structures. Long-term involvement in combination with accessible tools is central to creating a participatory setting where the practitioners' knowledge guides the process. © 2014 Vendome Group, LLC.
Jérôme, Marc; Martinsohn, Jann Thorsten; Ortega, Delphine; Carreau, Philippe; Verrez-Bagnis, Véronique; Mouchel, Olivier
2008-05-28
Traceability in the fish food sector plays an increasingly important role in consumer protection and confidence building. This is reflected by the introduction of legislation and rules covering traceability at national and international levels. Although traceability through labeling is well established and supported by respective regulations, monitoring and enforcement of these rules are still hampered by the lack of efficient diagnostic tools. We describe protocols using a direct sequencing method based on 212-274-bp diagnostic sequences derived from species-specific mitochondrial DNA cytochrome b, 16S rRNA, and cytochrome oxidase subunit I sequences, which can efficiently be applied to unambiguously determine even closely related fish species in processed food products labeled "anchovy". Traceability of anchovy-labeled products is supported by the public online database AnchovyID (http://anchovyid.jrc.ec.europa.eu), which provides data obtained during our study and tools for analytical purposes.
Islam, Rafiqul
2013-07-01
Today's bioanalytical CROs face increasing global competition, highly variable demand, high fixed costs, pricing pressure, and increasing demands for quality and speed. Most bioanalytical laboratories have responded to these challenges by implementing automation and process improvement methodologies (e.g., Six Sigma). These solutions have not resulted in significant improvements in productivity and profitability, since none of them can predict upturns or downturns in demand. High demand volatility causes long lead times and high costs during peak demand and poor productivity during trough demand. Most bioanalytical laboratories lack the tools to align supply efficiently with changing demand. In this paper, sales and operations planning (S&OP) is investigated as a tool to balance supply and demand. The S&OP process, when executed effectively, can be the single greatest determinant of profitability for a bioanalytical business.
Computed tomography-based volumetric tool for standardized measurement of the maxillary sinus
Giacomini, Guilherme; Pavan, Ana Luiza Menegatti; Altemani, João Mauricio Carrasco; Duarte, Sergio Barbosa; Fortaleza, Carlos Magno Castelo Branco; Miranda, José Ricardo de Arruda
2018-01-01
Volume measurements of maxillary sinus may be useful to identify diseases affecting paranasal sinuses. However, literature shows a lack of consensus in studies measuring the volume. This may be attributable to different computed tomography data acquisition techniques, segmentation methods, focuses of investigation, among other reasons. Furthermore, methods for volumetrically quantifying the maxillary sinus are commonly manual or semiautomated, which require substantial user expertise and are time-consuming. The purpose of the present study was to develop an automated tool for quantifying the total and air-free volume of the maxillary sinus based on computed tomography images. The quantification tool seeks to standardize maxillary sinus volume measurements, thus allowing better comparisons and determinations of factors that influence maxillary sinus size. The automated tool utilized image processing techniques (watershed, threshold, and morphological operators). The maxillary sinus volume was quantified in 30 patients. To evaluate the accuracy of the automated tool, the results were compared with manual segmentation that was performed by an experienced radiologist using a standard procedure. The mean percent differences between the automated and manual methods were 7.19% ± 5.83% and 6.93% ± 4.29% for total and air-free maxillary sinus volume, respectively. Linear regression and Bland-Altman statistics showed good agreement and low dispersion between both methods. The present automated tool for maxillary sinus volume assessment was rapid, reliable, robust, accurate, and reproducible and may be applied in clinical practice. The tool may be used to standardize measurements of maxillary volume. Such standardization is extremely important for allowing comparisons between studies, providing a better understanding of the role of the maxillary sinus, and determining the factors that influence maxillary sinus size under normal and pathological conditions. PMID:29304130
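The processing chain named in the abstract (threshold, morphological operators, watershed) can be illustrated on a mock CT slice as follows. The HU threshold and structuring-element sizes are placeholders, not the validated tool settings, and a real pipeline would operate on the full 3D volume.

```python
# Hedged sketch of a threshold/morphology/watershed pipeline on one CT slice.
import numpy as np
from scipy import ndimage as ndi
from skimage.morphology import binary_closing, disk
from skimage.segmentation import watershed

def segment_air_cavities(ct_slice_hu):
    """Label air-filled cavities on a CT slice given in Hounsfield units."""
    air = ct_slice_hu < -400                    # air is strongly negative in HU
    air = binary_closing(air, disk(2))          # close small gaps in the mask
    distance = ndi.distance_transform_edt(air)
    # Seeds at cavity cores; watershed then separates touching cavities
    seeds, _ = ndi.label(distance > 0.6 * distance.max())
    return watershed(-distance, seeds, mask=air)

# Mock slice: soft tissue (~40 HU) containing two air pockets (~-1000 HU)
slice_hu = np.full((128, 128), 40.0)
slice_hu[30:60, 30:60] = -1000.0
slice_hu[70:110, 70:100] = -1000.0
labels = segment_air_cavities(slice_hu)
print(np.unique(labels), "pixels per label:", np.bincount(labels.ravel())[1:])
```

Summing labeled voxels over all slices, scaled by the voxel volume, would give the total and air-free volumes the tool reports.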
NASA Astrophysics Data System (ADS)
Cioată, V. G.; Kiss, I.; Alexa, V.; Raţiu, S. A.; Racov, M.
2018-01-01
In the machining process, workpieces are installed in machining fixtures in order to establish a strictly determined position with respect to the cutting tool or its trajectory. During cutting, the weight of the workpiece, inertial forces and moments, cutting forces and moments, clamping forces, and the heat released during the cutting process determine the contact forces between the locators and the workpiece. The magnitude of these forces is important because too large a value can damage the surface of the workpiece, while too small a value can cause the workpiece to slip on the locators or even lose contact with them. Both situations must be avoided. The paper presents a study, carried out with CAE software, of the influence of the cutting temperature on the magnitude of the contact forces in a machining fixture used for milling a rectangular workpiece.
Galí, A; García-Montoya, E; Ascaso, M; Pérez-Lozano, P; Ticó, J R; Miñarro, M; Suñé-Negre, J M
2016-09-01
Although tablet coating processes are widely used in the pharmaceutical industry, they often lack adequate robustness, and up-scaling can be challenging, as minor changes in parameters can lead to varying quality results. The aim of this work was to select critical process parameters (CPPs) using retrospective data for a commercial product and to establish a design of experiments (DoE) that would improve the robustness of the coating process. A retrospective analysis of data from 36 commercial batches was performed. Batches were selected based on the quality results generated during batch release, some of which revealed quality deviations concerning the appearance of the coated tablets. The product is already marketed and belongs to the portfolio of a multinational pharmaceutical company. The Statgraphics 5.1 software was used for data processing to determine critical process parameters and to propose new working ranges. This study confirms that it is possible to determine the critical process parameters and create design spaces based on retrospective data from commercial batches; this type of analysis thus becomes a tool for optimizing the robustness of existing processes. Our results show that a design space can be established with minimal investment in experiments, since existing commercial batch data are processed statistically.
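The retrospective screening idea (associating historical process parameters with a quality attribute across the batches) can be sketched as follows. The parameter names and data are invented, and the study itself used Statgraphics rather than Python; a real analysis would also consider interactions and confounding, not just marginal correlations.

```python
# Hedged sketch: rank candidate process parameters by strength of association
# with a quality attribute across historical batches. Mock data throughout.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
n = 36                                           # number of commercial batches
batches = pd.DataFrame({
    "spray_rate": rng.normal(100, 8, n),         # hypothetical parameters
    "inlet_temp": rng.normal(60, 3, n),
    "drum_speed": rng.normal(12, 1, n),
})
# Mock response: appearance defects driven mainly by spray rate
batches["defect_rate"] = (0.05 * batches["spray_rate"]
                          - 0.02 * batches["inlet_temp"]
                          + rng.normal(0, 0.2, n))

corr = batches.corr()["defect_rate"].drop("defect_rate").abs()
print(corr.sort_values(ascending=False))         # candidate CPPs ranked first
```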
Surface enhancement of cold work tool steels by friction stir processing with a pinless tool
NASA Astrophysics Data System (ADS)
Costa, M. I.; Verdera, D.; Vieira, M. T.; Rodrigues, D. M.
2014-03-01
The microstructure and mechanical properties of enhanced tool steel (AISI D2) surfaces produced using a friction stir welding (FSW)-related procedure, called friction stir processing (FSP), are analysed in this work. The surfaces of the tool steel samples were processed using a WC-Co pinless tool under varying processing conditions. Microstructural analysis revealed that, whereas the original substrate structure consisted of a heterogeneous distribution of coarse carbides in a ferritic matrix, the transformed surfaces consisted of very small carbides homogeneously distributed in a ferrite-bainite-martensite matrix. The morphology of the surfaces, as well as their mechanical properties, evaluated by hardness and tensile testing, was found to vary with increasing tool rotation speed. Surface hardness was drastically increased relative to the initial hardness of the bulk steel. This was attributed to ferrite and carbide refinement, as well as to martensite formation during solid-state processing. At the highest rotation rates, tool sliding during processing deeply compromised the characteristics of the processed surfaces.
Knowledge mapping as a technique to support knowledge translation.
Ebener, S.; Khan, A.; Shademani, R.; Compernolle, L.; Beltran, M.; Lansang, Ma; Lippman, M.
2006-01-01
This paper explores the possibility of integrating knowledge mapping into a conceptual framework that could serve as a tool for understanding the many complex processes, resources and people involved in a health system, and for identifying potential gaps within knowledge translation processes in order to address them. After defining knowledge mapping, this paper presents various examples of the application of this process in health, before looking at the steps that need to be taken to identify potential gaps, to determine to what extent these gaps affect the knowledge translation process and to establish their cause. This is followed by proposals for interventions aimed at strengthening the overall process. Finally, potential limitations on the application of this framework at the country level are addressed. PMID:16917651
BMDExpress Data Viewer: A Visualization Tool to Analyze ...
Regulatory agencies increasingly apply benchmark dose (BMD) modeling to determine points of departure in human risk assessments. BMDExpress applies BMD modeling to transcriptomics datasets and groups genes to biological processes and pathways for rapid assessment of doses at which biological perturbations occur. However, graphing and analytical capabilities within BMDExpress are limited, and the analysis of output files is challenging. We developed a web-based application, BMDExpress Data Viewer, for visualization and graphical analyses of BMDExpress output files. The software application consists of two main components: ‘Summary Visualization Tools’ and ‘Dataset Exploratory Tools’. We demonstrate through two case studies that the ‘Summary Visualization Tools’ can be used to examine and assess the distributions of probe and pathway BMD outputs, as well as derive a potential regulatory BMD through the modes or means of the distributions. The ‘Functional Enrichment Analysis’ tool presents biological processes in a two-dimensional bubble chart view. By applying filters of pathway enrichment p-value and minimum number of significant genes, we showed that the Functional Enrichment Analysis tool can be applied to select pathways that are potentially sensitive to chemical perturbations. The ‘Multiple Dataset Comparison’ tool enables comparison of BMDs across multiple experiments (e.g., across time points, tissues, or organisms, etc.). The ‘BMDL-BM
Summarizing health inequalities in a Balanced Scorecard. Methodological considerations.
Auger, Nathalie; Raynault, Marie-France
2006-01-01
The association between social determinants and health inequalities is well recognized. What are now needed are tools to assist in disseminating such information. This article describes how the Balanced Scorecard may be used for summarizing data on health inequalities. The process begins by selecting appropriate social groups and indicators, and is followed by the measurement of differences across person, place, or time. The next step is to decide whether to focus on absolute versus relative inequality. The last step is to determine the scoring method, including whether to address issues of depth of inequality.
Priority setting for health in emerging markets.
Glassman, Amanda; Giedion, Ursula; McQueston, Kate
2013-05-01
The use of health technology assessment research in emerging economies is becoming an increasingly important tool to determine the uses of health spending. As low- and middle-income countries' gross domestic product grows, the funding available for health has increased in tandem. There is growing evidence that comparative effectiveness research and cost-effectiveness can be used to improve health outcomes within a predefined financial space. The use of these evaluation tools, combined with a systematized process of priority setting, can help inform national and global health payers. This review of country institutions for health technology assessment illustrates two points: the efforts underway to use research to inform priorities are widespread and not confined to wealthier countries; and many countries' efforts to create evidence-based policy are incomplete and more country-specific research will be needed. Further evidence shows that there is scope to reduce these gaps and opportunity to support better incorporation of data through better-defined priority-setting processes.
Analysis of Video-Based Microscopic Particle Trajectories Using Kalman Filtering
Wu, Pei-Hsun; Agarwal, Ashutosh; Hess, Henry; Khargonekar, Pramod P.; Tseng, Yiider
2010-01-01
The fidelity of the trajectories obtained from video-based particle tracking determines the success of a variety of biophysical techniques, including in situ single cell particle tracking and in vitro motility assays. However, the image acquisition process is complicated by system noise, which causes positioning error in the trajectories derived from image analysis. Here, we explore the possibility of reducing the positioning error by the application of a Kalman filter, a powerful algorithm to estimate the state of a linear dynamic system from noisy measurements. We show that the optimal Kalman filter parameters can be determined in an appropriate experimental setting, and that the Kalman filter can markedly reduce the positioning error while retaining the intrinsic fluctuations of the dynamic process. We believe the Kalman filter can potentially serve as a powerful tool to infer a trajectory of ultra-high fidelity from noisy images, revealing the details of dynamic cellular processes. PMID:20550894
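The filtering approach can be illustrated with a constant-velocity Kalman filter applied to a noisy synthetic track. The process and measurement noise levels here are arbitrary placeholders; the paper's point is precisely that the optimal values must be determined experimentally.

```python
# Hedged sketch: constant-velocity Kalman filter smoothing a noisy 1D track.
import numpy as np

def kalman_smooth_track(measurements, dt=1.0, q=0.01, r=0.25):
    """Filter noisy positions; state = [position, velocity]."""
    F = np.array([[1.0, dt], [0.0, 1.0]])        # constant-velocity dynamics
    H = np.array([[1.0, 0.0]])                   # only position is measured
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    R = np.array([[r]])
    x, P = np.array([measurements[0], 0.0]), np.eye(2)
    estimates = []
    for z in measurements:
        x = F @ x                                # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                      # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
        x = x + (K @ (np.array([z]) - H @ x)).ravel()   # update
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return np.array(estimates)

rng = np.random.default_rng(2)
true_pos = np.cumsum(np.full(100, 0.3))          # steady drift of a particle
noisy = true_pos + rng.normal(0, 0.5, 100)       # simulated positioning error
smoothed = kalman_smooth_track(noisy)
print(f"raw RMSE {np.sqrt(np.mean((noisy - true_pos)**2)):.3f}, "
      f"filtered RMSE {np.sqrt(np.mean((smoothed - true_pos)**2)):.3f}")
```

Tuning q and r against known test conditions mirrors the experimental parameter determination the authors describe.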
Kinjo, Akira R.; Bekker, Gert-Jan; Suzuki, Hirofumi; Tsuchiya, Yuko; Kawabata, Takeshi; Ikegawa, Yasuyo; Nakamura, Haruki
2017-01-01
The Protein Data Bank Japan (PDBj, http://pdbj.org), a member of the worldwide Protein Data Bank (wwPDB), accepts and processes the deposited data of experimentally determined macromolecular structures. While maintaining the archive in collaboration with other wwPDB partners, PDBj also provides a wide range of services and tools for analyzing structures and functions of proteins. We herein outline the updated web user interfaces together with RESTful web services and the backend relational database that support the former. To enhance the interoperability of the PDB data, we have previously developed PDB/RDF, PDB data in the Resource Description Framework (RDF) format, which is now a wwPDB standard called wwPDB/RDF. We have enhanced the connectivity of the wwPDB/RDF data by incorporating various external data resources. Services for searching, comparing and analyzing the ever-increasing large structures determined by hybrid methods are also described. PMID:27789697
NASA Astrophysics Data System (ADS)
Tan, Samantha H.; Chen, Ning; Liu, Shi; Wang, Kefei
2003-09-01
As part of the semiconductor industry "contamination-free manufacturing" effort, significant emphasis has been placed on reducing potential sources of contamination from process equipment and process equipment components. Process tools contain process chambers and components that are exposed to the process environment or process chemistry and in some cases are in direct contact with production wafers. Any contamination from these sources must be controlled or eliminated in order to maintain high process yields, device performance, and device reliability. This paper discusses new nondestructive analytical methods for quantitative measurement of the cleanliness of metal, quartz, polysilicon and ceramic components that are used in process equipment tools. The goal of these new procedures is to measure the effectiveness of cleaning procedures and to verify whether a tool component part is sufficiently clean for installation and subsequent routine use in the manufacturing line. These procedures provide a reliable "qualification method" for tool component certification and also provide a routine quality control method for reliable operation of cleaning facilities. Cost advantages to wafer manufacturing include higher yields due to improved process cleanliness and elimination of yield loss and downtime resulting from the installation of "bad" components in process tools. We also discuss a representative example of wafer contamination having been linked to a specific process tool component.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hübner, M.; Lang, N.; Röpcke, J.
2015-01-19
Dielectric etching plasma processes for modern interlevel dielectrics are becoming more and more complex with the introduction of new ultra-low-k dielectrics. One challenge is the minimization of sidewall damage while etching ultra-low-k porous SiCOH with fluorocarbon plasmas. Optimizing this process requires a deeper understanding of the concentration of the CF₂ radical, which acts as a precursor in the polymerization of the etch sample surfaces. In an industrial dielectric etching plasma reactor, the CF₂ radical was measured in situ using a continuous-wave quantum cascade laser (cw-QCL) around 1106.2 cm⁻¹. We measured Doppler-resolved ro-vibrational absorption lines and determined absolute densities using transitions in the ν₃ fundamental band of CF₂ with the aid of an improved simulation of the line strengths. We found that the CF₂ radical concentration during the etching plasma process correlates directly with the layer structure of the etched wafer. Hence, this correlation can serve as a diagnostic tool for dielectric etching plasma processes. Applying QCL-based absorption spectroscopy opens up the way for advanced process monitoring and etch control in semiconductor manufacturing.
Human Centered Computing for Mars Exploration
NASA Technical Reports Server (NTRS)
Trimble, Jay
2005-01-01
The science objectives are to determine the aqueous, climatic, and geologic history of a site on Mars where conditions may have been favorable to the preservation of evidence of prebiotic or biotic processes. Human Centered Computing is a development process that starts with users and their needs, rather than with technology. The goal is a system design that serves the user, where the technology fits the task and the complexity is that of the task not of the tool.
NASA Astrophysics Data System (ADS)
Ducoté, Julien; Dettoni, Florent; Bouyssou, Régis; Le-Gratiet, Bertrand; Carau, Damien; Dezauzier, Christophe
2015-03-01
Patterning process control for advanced nodes has required major changes over the last few years. The process control needs of critical patterning levels since the 28nm technology node are extremely aggressive, showing that metrology accuracy/sensitivity must be finely tuned. The introduction of pitch splitting (Litho-Etch-Litho-Etch) at the 14FDSOI node requires the development of specific metrologies to adopt advanced process control (for CD, overlay, and focus corrections). The pitch splitting process leads to final line CD uniformities that combine the CD uniformities of the two exposures, while the space CD uniformities depend on both CD and overlay variability. In this paper, investigations of CD and overlay process control of the 64nm minimum pitch at the Metal1 level of the 14FDSOI technology, within the double patterning process flow (litho, hard mask etch, line etch), are presented. Various measurements with SEMCD tools (Hitachi) and overlay tools (KT for Image Based Overlay - IBO, and ASML for Diffraction Based Overlay - DBO) are compared. Metrology targets are embedded within a block instanced several times within the field to characterize intra-field process variations. Specific SEMCD targets were designed for independent measurement of both line CDs (A and B) and space CDs (A to B and B to A) for each exposure within a single measurement during the DP flow. Based on those measurements, the correlation between overlay determined with SEMCD and with standard overlay tools can be evaluated. Such correlation at different steps through the DP flow is investigated with respect to the metrology type. Process correction models are evaluated with respect to the measurement type and the intra-field sampling.
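The coupling between space CDs, line CDs, and overlay in a LELE flow follows from simple geometry, sketched below under the assumption of ideal placement (A lines on a regular grid, B lines nominally centered between them). The numbers are nominal illustrations for a 64 nm final pitch, not measured data.

```python
# Hedged sketch of LELE space-CD geometry: how the two spaces depend on the
# two line CDs and the overlay between the A and B exposures.

def lele_space_cds(final_pitch_nm, cd_a_nm, cd_b_nm, overlay_nm):
    """A lines sit on a 2*final_pitch grid; B lines are nominally centered
    between them and shifted by overlay_nm toward the next A line.
    Returns the two space CDs (A-to-B, B-to-A)."""
    gap = final_pitch_nm - (cd_a_nm + cd_b_nm) / 2.0
    return gap + overlay_nm, gap - overlay_nm

# 64 nm final pitch, both line CDs on target at 32 nm, 2 nm overlay error:
print(lele_space_cds(64.0, 32.0, 32.0, 2.0))   # -> (34.0, 30.0)
```

The asymmetry of the two spaces under an overlay error is what lets space-CD measurements on such targets be compared against dedicated overlay metrology.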
Exploring the Relationship between Self-Regulated Vocabulary Learning and Web-Based Collaboration
ERIC Educational Resources Information Center
Liu, Sarah Hsueh-Jui; Lan, Yu-Ju; Ho, Cloudia Ya-Yu
2014-01-01
Collaborative learning has placed an emphasis on co-constructing knowledge by sharing and negotiating meaning for problem-solving activities, and this cannot be accomplished without governing the self-regulatory processes of students. This study employed a Web-based tool, Google Docs, to determine the effects of Web-based collaboration on…
Classroom Communication and Instructional Processes: Advances through Meta-Analysis
ERIC Educational Resources Information Center
Gayle, Barbara Mae, Ed.; Preiss, Raymond W., Ed.; Burrell, Nancy, Ed.; Allen, Mike, Ed.
2006-01-01
This volume offers a systematic review of the literature on communication education and instruction. Making meta-analysis findings accessible and relevant, the editors of this volume approach the topic from the perspective that meta-analysis serves as a useful tool for summarizing experiments and for determining how and why specific teaching and…
ERIC Educational Resources Information Center
Hardin, Belinda J.; Scott-Little, Catherine; Mereoiu, Mariana
2013-01-01
With the increasing number of preschool-age children of Latino heritage entering U.S. schools comes a growing need to accurately determine children's individual needs and identify potential disabilities, beginning with the screening process. Unfortunately, teachers face many challenges when screening English language learners. Often, parents have…
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.
1982-06-01
In conventional holographic interferometry, the observed fringe patterns are determined by the object displacement and deformation, and by the illumination and observation configurations. The obtained information may not be in the most convenient form for further data processing. To overcome this problem, and to create new possibilities, holographic fringe patterns can be changed by modifying the optical setup. As a result of these modifications, well-known procedures of the moire method can be applied to holographic interferometry. Components of displacement and components of the strain tensor can be isolated and measured separately. Surface contours and slopes can also be determined.
A Data-Driven Solution for Performance Improvement
NASA Technical Reports Server (NTRS)
2002-01-01
Marketed as the "Software of the Future," Optimal Engineering Systems P.I. EXPERT(TM) technology offers statistical process control and optimization techniques that are critical to businesses looking to restructure or accelerate operations in order to gain a competitive edge. Kennedy Space Center granted Optimal Engineering Systems the funding and aid necessary to develop a prototype of the process monitoring and improvement software. Completion of this prototype demonstrated that it was possible to integrate traditional statistical quality assurance tools with robust optimization techniques in a user-friendly format that is visually compelling. Using an expert system knowledge base, the software allows the user to determine objectives, capture constraints and out-of-control processes, predict results, and compute optimal process settings.
A Deep Space Orbit Determination Software: Overview and Event Prediction Capability
NASA Astrophysics Data System (ADS)
Kim, Youngkwang; Park, Sang-Young; Lee, Eunji; Kim, Minsik
2017-06-01
This paper presents an overview of deep space orbit determination software (DSODS), as well as validation and verification results on its event prediction capabilities. DSODS was developed in the MATLAB object-oriented programming environment to support the Korea Pathfinder Lunar Orbiter (KPLO) mission. DSODS has three major capabilities: celestial event prediction for spacecraft, orbit determination with deep space network (DSN) tracking data, and DSN tracking data simulation. To achieve its functionality requirements, DSODS consists of four modules: orbit propagation (OP), event prediction (EP), data simulation (DS), and orbit determination (OD) modules. This paper explains the highest-level data flows between modules in the event prediction, orbit determination, and tracking data simulation processes. Furthermore, to address the event prediction capability of DSODS, this paper introduces the OP and EP modules. The role of the OP module is to handle time and coordinate system conversions, to propagate spacecraft trajectories, and to handle the ephemerides of spacecraft and celestial bodies. Currently, the OP module utilizes the General Mission Analysis Tool (GMAT) as a third-party software component for high-fidelity deep space propagation, as well as time and coordinate system conversions. The role of the EP module is to predict celestial events, including eclipses and ground station visibilities, and this paper presents the functionality requirements of the EP module. The validation and verification results show that, for most cases, event prediction errors were less than 10 milliseconds when compared with flight-proven mission analysis tools such as GMAT and Systems Tool Kit (STK). Thus, we conclude that DSODS is capable of predicting events for the KPLO in real mission applications.
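The abstract does not spell out the eclipse algorithm itself; purely as a point of orientation, a minimal cylindrical-shadow eclipse test of the kind an EP module might perform can be written as follows (geometry and numbers are illustrative only, not DSODS code):

    import numpy as np

    def in_umbra(r_sc, r_body, r_sun, R_body):
        """Cylindrical-shadow eclipse test: the spacecraft is eclipsed when it
        lies on the anti-Sun side of the occulting body and within one body
        radius of the shadow axis. All vectors in a common inertial frame [km]."""
        s_hat = (r_sun - r_body) / np.linalg.norm(r_sun - r_body)
        d = r_sc - r_body
        along = d @ s_hat                      # signed distance toward the Sun
        perp = np.linalg.norm(d - along * s_hat)
        return (along < 0.0) and (perp < R_body)

    # Spacecraft 2,000 km behind the Moon on the anti-Sun side (illustrative):
    r_sun = np.array([1.496e8, 0.0, 0.0])
    r_moon = np.zeros(3)
    r_sc = np.array([-2000.0, 500.0, 0.0])
    print(in_umbra(r_sc, r_moon, r_sun, R_body=1737.4))   # True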
Creating a Positive PLA Experience: A Step-by-Step Look at University PLA
ERIC Educational Resources Information Center
Leiste, Sara M.; Jensen, Kathryn
2011-01-01
A prior learning assessment (PLA) can be an intimidating process for adult learners. Capella University's PLA team has developed best practices, resources, and tools to foster a positive experience and to remove barriers in PLA and uses three criteria to determine how to best administer the assessment. First, a PLA must be motivating, as described…
Tool and ideological knowledge in Street Outreach Office working process.
Kami, Maria Terumi Maruyama; Larocca, Liliana Muller; Chaves, Maria Marta Nolasco; Piosiadlo, Laura Christina Macedo; Albuquerque, Guilherme Souza
2016-01-01
To identify the ideological knowledge and tool knowledge that support the Street Outreach Office working process. Qualitative and exploratory research. Data were collected from twenty Street Outreach Office professionals and six users in a municipality in southern Brazil, applying different semi-structured interview schedules for each category of participants. The resulting categories were analyzed in light of the tool and ideological knowledge present in the working process. From the participant discourses the following ideological knowledge emerged: public policies and the needs of the person in a street situation; and the following tool knowledge: devices and tools for the care of people in street situations, and a weekly schedule. The centrality of the discourses on the working process, supported by ideological knowledge, was verified. The structural dimension of the objective reality of the population in street situations was perceptible in the social determination of being situated on the street. By unveiling contradictions in everyday practice, limits to be overcome within the working process were identified.
In-line monitoring of pellet coating thickness growth by means of visual imaging.
Oman Kadunc, Nika; Sibanc, Rok; Dreu, Rok; Likar, Boštjan; Tomaževič, Dejan
2014-08-15
Coating thickness is the most important attribute of coated pharmaceutical pellets as it directly affects release profiles and stability of the drug. Quality control of the coating process of pharmaceutical pellets is thus of utmost importance for assuring the desired end product characteristics. A visual imaging technique is presented and examined as a process analytical technology (PAT) tool for noninvasive continuous in-line and real time monitoring of coating thickness of pharmaceutical pellets during the coating process. Images of pellets were acquired during the coating process through an observation window of a Wurster coating apparatus. Image analysis methods were developed for fast and accurate determination of pellets' coating thickness during a coating process. The accuracy of the results for pellet coating thickness growth obtained in real time was evaluated through comparison with an off-line reference method and a good agreement was found. Information about the inter-pellet coating uniformity was gained from further statistical analysis of the measured pellet size distributions. Accuracy and performance analysis of the proposed method showed that visual imaging is feasible as a PAT tool for in-line and real time monitoring of the coating process of pharmaceutical pellets.
Method and system for downhole clock synchronization
Hall, David R.; Bartholomew, David B.; Johnson, Monte; Moon, Justin; Koehler, Roger O.
2006-11-28
A method and system for use in synchronizing at least two clocks in a downhole network are disclosed. The method comprises determining a total signal latency between a controlling processing element and at least one downhole processing element in a downhole network and sending a synchronizing time over the downhole network to the at least one downhole processing element adjusted for the signal latency. Electronic time stamps may be used to measure latency between processing elements. A system for electrically synchronizing at least two clocks connected to a downhole network comprises a controlling processing element connected to a synchronizing clock in communication over a downhole network with at least one downhole processing element comprising at least one downhole clock. Preferably, the downhole network is integrated into a downhole tool string.
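The patent text gives no formulas, but the classic two-way time-transfer calculation from four electronic time stamps, assuming symmetric up- and down-going signal latencies, looks like this (a sketch of the general technique, not the patented method):

    def offset_and_latency(t1, t2, t3, t4):
        """Two-way time transfer from four electronic time stamps:
        t1 = request sent (surface clock), t2 = request received (downhole clock),
        t3 = reply sent (downhole clock),  t4 = reply received (surface clock).
        Assumes symmetric one-way latencies on the downhole network."""
        latency = ((t4 - t1) - (t3 - t2)) / 2.0   # one-way signal latency
        offset = ((t2 - t1) + (t3 - t4)) / 2.0    # downhole clock minus surface
        return offset, latency

    # Illustrative numbers [s]: downhole clock 0.250 s ahead, 2 ms latency.
    offset, latency = offset_and_latency(t1=100.000, t2=100.252,
                                         t3=100.260, t4=100.012)
    print(offset, latency)   # ~0.250, ~0.002

The synchronizing time sent over the network would then be advanced by the measured one-way latency so that it is correct on arrival at the downhole clock.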
A Taguchi study of the aeroelastic tailoring design process
NASA Technical Reports Server (NTRS)
Bohlmann, Jonathan D.; Scott, Robert C.
1991-01-01
A Taguchi study was performed to determine the important players in the aeroelastic tailoring design process and to find the best composition of the optimization's objective function. The Wing Aeroelastic Synthesis Procedure (TSO) was used to ascertain the effects that factors such as composite laminate constraints, roll effectiveness constraints, and built-in wing twist and camber have on the optimum, aeroelastically tailored wing skin design. The results show the Taguchi method to be a viable engineering tool for computational inquiries, and provide some valuable lessons about the practice of aeroelastic tailoring.
Exploring Flow Procedures for Diazonium Formation.
Hu, Te; Baxendale, Ian R; Baumann, Marcus
2016-07-14
The synthesis of diazonium salts is historically an important transformation extensively utilized in dye manufacture. However, the highly reactive nature of the diazonium functionality has additionally led to the development of many new reactions, including several carbon-carbon bond forming processes. It is therefore highly desirable to determine optimum conditions for the formation of diazonium compounds utilizing the latest processing tools, such as flow chemistry, to take advantage of the increased safety and continuous manufacturing capabilities. Herein we report a series of flow-based procedures to prepare diazonium salts for subsequent in situ consumption.
An application of computer aided requirements analysis to a real time deep space system
NASA Technical Reports Server (NTRS)
Farny, A. M.; Morris, R. V.; Hartsough, C.; Callender, E. D.; Teichroew, D.; Chikofsky, E.
1981-01-01
The entire procedure of incorporating the requirements and goals of a space flight project into integrated, time-ordered sequences of spacecraft commands is called the uplink process. The Uplink Process Control Task (UPCT) was created to examine the uplink process and determine ways to improve it. The Problem Statement Language/Problem Statement Analyzer (PSL/PSA), designed to assist the designer/analyst/engineer in the preparation of specifications of an information system, is used as a supporting tool to aid in the analysis. Attention is given to a definition of the uplink process, the definition of PSL/PSA, the construction of a PSA database, the value of analysis to the study of the uplink process, and the PSL/PSA lessons learned.
Development of Tool Representations in the Dorsal and Ventral Visual Object Processing Pathways.
Kersey, Alyssa J; Clark, Tyia S; Lussier, Courtney A; Mahon, Bradford Z; Cantlon, Jessica F
2016-07-01
Tools represent a special class of objects, because they are processed across both the dorsal and ventral visual object processing pathways. Three core regions are known to be involved in tool processing: the left posterior middle temporal gyrus, the medial fusiform gyrus (bilaterally), and the left inferior parietal lobule. A critical and relatively unexplored issue concerns whether, in development, tool preferences emerge at the same time and to a similar degree across all regions of the tool-processing network. To test this issue, we used functional magnetic resonance imaging to measure the neural amplitude, peak location, and the dispersion of tool-related neural responses in the youngest sample of children tested to date in this domain (ages 4-8 years). We show that children recruit overlapping regions of the adult tool-processing network and also exhibit similar patterns of co-activation across the network to adults. The amplitude and co-activation data show that the core components of the tool-processing network are established by age 4. Our findings on the distributions of peak location and dispersion of activation indicate that the tool network undergoes refinement between ages 4 and 8 years.
NASA Astrophysics Data System (ADS)
Aziz, Fazilah Abdul; Razali, Noraini; Najmiyah Jaafar, Nur
2016-02-01
Currently, many automotive companies are still unable to effectively prevent the consequences of poor ergonomics in their manufacturing processes. The purpose of this study is to determine the surrounding factors that influence low ergonomics risk awareness among staff at the early product development phase in the Malaysian automotive industry. In this study there are four variables: low ergonomic risk awareness, inappropriate methods and tools, tight development schedule, and lack of management support. The survey data were gathered from 245 respondents at local automotive companies in Malaysia. The data were analysed through multiple regression and moderated regression using the IBM SPSS software. The results revealed that low ergonomic risk awareness is influenced by inappropriate methods and tools and by a tight development schedule. There were positive linear relationships between low ergonomic risk awareness and both predictors: the more inappropriate the methods and tools applied, the lower the ergonomic risk awareness; the tighter the development schedule, the lower the ergonomic risk awareness. The relationship between low ergonomic risk awareness and inappropriate methods and tools depends on staff age and education level, while the relationship between low ergonomic risk awareness and tight development schedule depends on staff working experience and number of project involvements. The main contribution of this paper is the identification of these factors of low ergonomics risk awareness, offering a better understanding of ergonomics to researchers and automotive manufacturers' employees during the product development process.
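As a sketch of the kind of analysis described (multiple regression plus moderated regression with an interaction term), the following uses the statsmodels formula API; the column names are placeholders for the study's survey constructs and the data are synthetic stand-ins for the 245 responses:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic stand-in for the survey data (Likert-scale scores assumed).
    rng = np.random.default_rng(0)
    n = 245
    df = pd.DataFrame({
        "method_tools": rng.normal(3, 1, n),    # inappropriate methods/tools
        "tight_schedule": rng.normal(3, 1, n),
        "age": rng.integers(22, 55, n),
    })
    df["low_awareness"] = (0.5 * df.method_tools + 0.4 * df.tight_schedule
                           + 0.01 * df.age + rng.normal(0, 0.5, n))

    # Multiple regression (main effects), as in the study.
    main = smf.ols("low_awareness ~ method_tools + tight_schedule", data=df).fit()

    # Moderated regression: '*' adds the interaction term, testing whether
    # the methods/tools effect depends on age (the study's moderator analysis).
    moderated = smf.ols("low_awareness ~ method_tools * age", data=df).fit()
    print(main.params)
    print(moderated.pvalues["method_tools:age"])   # significance of moderation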
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurnik, Charles W; Benton, Nathanael; Burns, Patrick
Compressed-air systems are used widely throughout industry for many operations, including pneumatic tools, packaging and automation equipment, conveyors, and other industrial process operations. Compressed-air systems are defined as a group of subsystems composed of air compressors, air treatment equipment, controls, piping, pneumatic tools, pneumatically powered machinery, and process applications using compressed air. A compressed-air system has three primary functional subsystems: supply, distribution, and demand. Air compressors are the primary energy consumers in a compressed-air system and are the primary focus of this protocol. The two compressed-air energy efficiency measures specifically addressed in this protocol are: high-efficiency/variable speed drive (VSD) compressor replacing modulating, load/unload, or constant-speed compressor; and compressed-air leak survey and repairs. This protocol provides direction on how to reliably verify savings from these two measures using a consistent approach for each.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, T. Jr
Volume IV represents the results of one of four major study areas under the Automotive Manufacturing Assessment System (AMAS) sponsored by the DOT/Transportation Systems Center. AMAS was designed to assist in the evaluation of industry's capability to produce fuel efficient vehicles. An analysis of automotive engine manufacturing was conducted in order to determine the impact of regulatory changes on tooling costs and the production process. The 351W CID V-8 engine at Ford's Windsor No. 1 Plant was the subject of the analysis. A review of plant history and its product is presented along with an analysis of manufacturing operations, including material and production flow, plant layout, machining and assembly processes, tooling, supporting facilities, inspection, service, and repair. Four levels of product change intensity, showing the impact on manufacturing methods and cost, are also presented.
NASA Technical Reports Server (NTRS)
2005-01-01
The purpose of this document is to analyze the impact of Remotely Operated Aircraft (ROA) operations on current and planned Air Traffic Control (ATC) automation systems in the En Route, Terminal, and Traffic Flow Management domains. The operational aspects of ROA flight, while similar, are not entirely identical to those of their manned counterparts and may not have been considered within the time-horizons of the automation tools. This analysis was performed to determine whether the flight characteristics of ROAs would be compatible with current and future NAS automation tools. Improvements to existing systems/processes are recommended that would give Air Traffic Controllers an indication that a particular aircraft is an ROA, along with modifications to IFR flight plan processing algorithms and/or designation of airspace where an ROA will be operating for long periods of time.
NASA Astrophysics Data System (ADS)
Di Lorenzo, R.; Ingarao, G.; Fonti, V.
2007-05-01
The crucial task in the prevention of ductile fracture is the availability of a tool for the prediction of such defect occurrence. The technical literature presents wide investigation of this topic, and contributions have been given by many authors following different approaches. The main class of approaches regards the development of fracture criteria: generally, such criteria are expressed by determining a critical value of a damage function which depends on stress and strain paths; ductile fracture is assumed to occur when this critical value is reached during the analysed process. There is a relevant drawback related to the utilization of ductile fracture criteria: each criterion usually performs well in the prediction of fracture for particular stress-strain paths, i.e., it works very well for certain processes but may provide poor results for other processes. On the other hand, the approaches based on damage mechanics formulations are very effective from a theoretical point of view, but they are very complex and their proper calibration is quite difficult. In this paper, two different approaches are investigated to predict fracture occurrence in cold forming operations. The final aim of the proposed method is the achievement of a tool with general reliability, i.e., one able to predict fracture for different forming processes. The proposed approach represents a step forward within a research project focused on the utilization of innovative predictive tools for ductile fracture. The paper presents a comparison between an artificial neural network design procedure and an approach based on statistical tools; both approaches were aimed at predicting fracture occurrence or absence based on a set of stress and strain path data. The proposed approach is based on the utilization of available experimental data, for a given material, on fracture occurrence in different processes. In more detail, the approach consists of the analysis of experimental tests in which fracture occurs, followed by numerical simulations of such processes in order to track the stress-strain paths in the workpiece region where fracture is expected. Such data are utilized to build up a proper data set, which was used both to train an artificial neural network and to perform a statistical analysis aimed at predicting fracture occurrence. The developed statistical tool is properly designed and optimized and is able to recognize fracture occurrence. The reliability and predictive capability of the statistical method were compared with those obtained from an artificial neural network developed to predict fracture occurrence. Moreover, the approach is also validated in forming processes characterized by complex fracture mechanics.
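A minimal sketch of the comparison described, pitting a small neural network against a statistical classifier on synthetic stand-in stress-strain-path features (the paper's actual data come from experiments plus FEM simulations, and its statistical tool is its own design):

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score

    # X: features extracted from tracked stress-strain paths (e.g. maximum
    # triaxiality, equivalent plastic strain); y: fracture occurred (1) or not (0).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + 0.5 * X[:, 1] + 0.2 * rng.normal(size=200) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    ann = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                        random_state=0).fit(X_tr, y_tr)
    stat = LogisticRegression().fit(X_tr, y_tr)   # statistical counterpart

    print("ANN accuracy:", accuracy_score(y_te, ann.predict(X_te)))
    print("Statistical model accuracy:", accuracy_score(y_te, stat.predict(X_te)))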
Software Process Automation: Experiences from the Trenches.
1996-07-01
Examples of process and tool integration reported include: integration of a problem database (Weaver); a process integrating WordPerfect, All-in-One, Oracle, and CM (Weaver System); and a process built around FrameMaker and CM to handle change requests and problem reports. Tools mentioned include: Autoplan, a project management tool; FrameMaker, a document processing system; Worldview, a document …; Cadre Teamwork; something for requirements traceability; their own homegrown scheduling tool; and their own homegrown tool integrator.
NASA Astrophysics Data System (ADS)
Irish, Teresa J.
The aim of this study was to provide insights addressing national concerns in Science, Technology, Engineering, and Mathematics (STEM) education by examining how a set of six perimeter urban K-12 schools were transformed into STEM-focused professional learning communities (PLC). The concept of a STEM Academy as a STEM-focused PLC emphasizes the development of a STEM culture where professional discourse and teaching are focused on STEM learning. The STEM Academies examined used the STEM Academy Measurement Tool and Rubric (Tool) as a catalyst for discussion and change. This Tool was developed with input from stakeholders and used for school-wide initiatives, teacher professional development, and K-12 student engagement to improve STEM teaching and learning. Two primary goals of this study were to assess the levels of awareness and use of the Tool by all stakeholders involved in the project and to determine how the Tool assisted in the development and advancement of these schools as STEM PLCs. Data from the STEM Academy Participant Survey were analyzed to determine stakeholders' perceptions of the Tool in terms of (i) how aware stakeholders were of the Tool, (ii) whether they participated in the use of the Tool, (iii) how the characteristics of PLCs were perceived in their schools, and finally (iv) how awareness of the Tool influenced teachers' perceptions of the presence of PLC characteristics. Findings indicate that school faculty were aware of the Tool on a number of different levels, and evidence exists that the use of the Tool assisted in the development of STEM Academies; however, impact varied from school to school. Implications of this study suggest that the survey should be used for a longer period of time to gain more in-depth knowledge of teachers' perceptions of the Tool as a catalyst across time. Additional findings indicate that the process for using the Tool should be ongoing and involve the stakeholders in order to have the greatest impact on school culture. This research contributes to the knowledge base related to building STEM PLCs aimed at improving K-12 teacher content and pedagogical knowledge as well as student learning and achievement in STEM education.
Gupta, Ratika; Fonacier, Luz S
2016-06-01
Inhaled, intranasal, and cutaneous steroids are prescribed by physicians for a plethora of disease processes including asthma and rhinitis. While the high efficacy of this class of medication is well known, the wide range of adverse effects, both local and systemic, is not well elucidated. It is imperative to monitor total steroid burden in its varied forms as well as tracking for possible side effects that may be caused by a high cumulative dose of steroids. This review article highlights the adverse effects of different steroid modalities as well as suggests a monitoring tool to determine steroid totality and side effects.
Parallel workflow tools to facilitate human brain MRI post-processing
Cui, Zaixu; Zhao, Chenxi; Gong, Gaolang
2015-01-01
Multi-modal magnetic resonance imaging (MRI) techniques are widely applied in human brain studies. To obtain specific brain measures of interest from MRI datasets, a number of complex image post-processing steps are typically required. Parallel workflow tools have recently been developed, concatenating individual processing steps and enabling fully automated processing of raw MRI data to obtain the final results. These workflow tools are also designed to make optimal use of available computational resources and to support the parallel processing of different subjects or of independent processing steps for a single subject. Automated, parallel MRI post-processing tools can greatly facilitate relevant brain investigations and are being increasingly applied. In this review, we briefly summarize these parallel workflow tools and discuss relevant issues.
Detection and diagnosis of bearing and cutting tool faults using hidden Markov models
NASA Astrophysics Data System (ADS)
Boutros, Tony; Liang, Ming
2011-08-01
Over the last few decades, the research for new fault detection and diagnosis techniques in machining processes and rotating machinery has attracted increasing interest worldwide. This development was mainly stimulated by the rapid advance in industrial technologies and the increase in complexity of machining and machinery systems. In this study, the discrete hidden Markov model (HMM) is applied to detect and diagnose mechanical faults. The technique is tested and validated successfully using two scenarios: tool wear/fracture and bearing faults. In the first case the model correctly detected the state of the tool (i.e., sharp, worn, or broken) whereas in the second application, the model classified the severity of the fault seeded in two different engine bearings. The success rate obtained in our tests for fault severity classification was above 95%. In addition to the fault severity, a location index was developed to determine the fault location. This index has been applied to determine the location (inner race, ball, or outer race) of a bearing fault with an average success rate of 96%. The training time required to develop the HMMs was less than 5 s in both the monitoring cases.
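The classification step can be illustrated with the standard scaled forward algorithm: one discrete HMM is trained per condition, and a new observation sequence is assigned to the model with the highest likelihood. The toy two-state models and symbol sequence below are invented for illustration, not the paper's trained parameters:

    import numpy as np

    def log_likelihood(obs, pi, A, B):
        """Forward algorithm for a discrete HMM, scaled to avoid underflow.
        obs: symbol indices; pi: initial state probs; A: transitions; B: emissions."""
        alpha = pi * B[:, obs[0]]
        ll = np.log(alpha.sum())
        alpha = alpha / alpha.sum()
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]
            s = alpha.sum()
            ll += np.log(s)
            alpha = alpha / s
        return ll

    # One model per tool condition, over vibration features quantized into
    # 3 symbols (toy parameters standing in for trained HMMs).
    models = {
        "sharp": (np.array([0.9, 0.1]),
                  np.array([[0.95, 0.05], [0.10, 0.90]]),
                  np.array([[0.8, 0.15, 0.05], [0.2, 0.50, 0.30]])),
        "worn":  (np.array([0.5, 0.5]),
                  np.array([[0.80, 0.20], [0.20, 0.80]]),
                  np.array([[0.3, 0.40, 0.30], [0.1, 0.30, 0.60]])),
    }

    obs = [2, 2, 1, 2, 2, 1, 2]   # a new quantized observation sequence
    print(max(models, key=lambda k: log_likelihood(obs, *models[k])))  # 'worn'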
A Surrogate Technique for Investigating Deterministic Dynamics in Discrete Human Movement.
Taylor, Paul G; Small, Michael; Lee, Kwee-Yum; Landeo, Raul; O'Meara, Damien M; Millett, Emma L
2016-10-01
Entropy is an effective tool for investigation of human movement variability. However, before applying entropy, it can be beneficial to employ analyses to confirm that observed data are not solely the result of stochastic processes. This can be achieved by contrasting observed data with that produced using surrogate methods. Unlike continuous movement, no appropriate method has been applied to discrete human movement. This article proposes a novel surrogate method for discrete movement data, outlining the processes for determining its critical values. The proposed technique reliably generated surrogates for discrete joint angle time series, destroying fine-scale dynamics of the observed signal, while maintaining macro structural characteristics. Comparison of entropy estimates indicated observed signals had greater regularity than surrogates and were not only the result of stochastic but also deterministic processes. The proposed surrogate method is both a valid and reliable technique to investigate determinism in other discrete human movement time series.
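A simplified illustration of the surrogate logic: compare an entropy estimate of the observed series against an ensemble of surrogates. The shuffle surrogate below destroys all temporal structure, which is cruder than the proposed method (which preserves macro structural characteristics), but it shows the shape of the test:

    import numpy as np

    def sample_entropy(x, m=2, r=None):
        """SampEn(m, r): negative log of the conditional probability that
        sequences similar for m points remain similar at m+1 points."""
        x = np.asarray(x, float)
        if r is None:
            r = 0.2 * x.std()
        def count(mm):
            templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            c = 0
            for i in range(len(templates)):
                d = np.max(np.abs(templates - templates[i]), axis=1)
                c += np.sum(d <= r) - 1        # exclude the self-match
            return c
        return -np.log(count(m + 1) / count(m))

    rng = np.random.default_rng(1)
    signal = np.sin(np.linspace(0, 20 * np.pi, 300)) + 0.1 * rng.normal(size=300)

    surrogates = [rng.permutation(signal) for _ in range(19)]
    se_obs = sample_entropy(signal)
    se_sur = [sample_entropy(s) for s in surrogates]
    # Determinism is suggested if the observed entropy falls below all surrogates.
    print(se_obs, min(se_sur))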
Kumar, Vijay; Taylor, Michael K; Mehrotra, Amit; Stagner, William C
2013-06-01
Focused beam reflectance measurement (FBRM) was used as a process analytical technology tool to perform inline real-time particle size analysis of a proprietary granulation manufactured using a continuous twin-screw granulation-drying-milling process. A significant relationship between the D20, D50, and D80 length-weighted chord lengths and sieve particle size was observed, with a p value of <0.0001 and an R² of 0.886. A central composite response surface statistical design was used to evaluate the effect of granulator screw speed and Comil® impeller speed on the length-weighted chord length distribution (CLD) and particle size distribution (PSD) determined by FBRM and nested sieve analysis, respectively. The effects of granulator speed and mill speed on bulk density, tapped density, Compressibility Index, and Flowability Index were also investigated. An inline FBRM probe placed below the Comil® generated chord lengths and CLD data at designated times. The collection of the milled samples for sieve analysis and PSD evaluation was coordinated with the timing of the FBRM determinations. Both FBRM and sieve analysis resulted in similar bimodal distributions for all ten manufactured batches studied. Within the experimental space studied, the granulator screw speed (650-850 rpm) and Comil® impeller speed (1,000-2,000 rpm) did not have a significant effect on CLD, PSD, bulk density, tapped density, Compressibility Index, or Flowability Index (p value > 0.05).
ERIC Educational Resources Information Center
Bergil, Ayfer Su; Sariçoban, Arif
2017-01-01
The current practices in the field of foreign language teacher education have a heavy inclination to make use of traditional means especially throughout the assessment process of student teachers at foreign language departments. Observing the world in terms of teacher education makes it urgent to include more reflective and objective tools in…
Computer analysis of potentiometric data of complexes formation in the solution
NASA Astrophysics Data System (ADS)
Jastrzab, Renata; Kaczmarek, Małgorzata T.; Tylkowski, Bartosz; Odani, Akira
2018-02-01
The determination of equilibrium constants is an important process for many branches of chemistry. In this review we provide the reader with a discussion of computer methods which have been applied to the elaboration of potentiometric experimental data generated during complex formation in solution. The review describes both the general basis of the modeling tools and examples of the use of calculated stability constants.
Fengjing Liu; Carolyn Hunsaker; Roger C. Bales
2012-01-01
Processes controlling streamflow generation were determined using geochemical tracers for water years 2004–2007 at eight headwater catchments at the Kings River Experimental Watersheds in the southern Sierra Nevada. Four catchments are snow-dominated, and four receive a mix of rain and snow. Results of diagnostic tools of mixing models indicate that Ca2+...
Flow in the Proximity of the Pin-Tool in Friction Stir Welding and Its Relation to Weld Homogeneity
NASA Technical Reports Server (NTRS)
Nunes, Arthur C., Jr.
2000-01-01
In the Friction Stir Welding (FSW) process a rotating pin inserted into a seam literally stirs the metal from each side of the seam together. It is proposed that the flow in the vicinity of the pin-tool comprises a primary rapid shear over a cylindrical envelope covering the pin-tool and a relatively slow secondary flow taking the form of a ring vortex about the tool circumference. This model is consistent with a plastic characterization of metal flow, where discontinuities in shear flow are allowed but not viscous effects. It is consistent with experiments employing several different kinds of tracer: atomic markers, shot, and wire. If a rotating disc with angular velocity ω is superposed on a translating continuum with linear velocity v, the trajectories of tracer points become circular arcs centered upon a point displaced laterally a distance v/ω from the center of rotation of the disc in the direction of the advancing side of the disc. In the present model a stream of metal approaching the tool (taken as the coordinate system of observation) is sheared at the slip surface, rapidly rotated around the tool, sheared again on the opposite side of the tool, and deposited in the wake of the tool. Local shearing rates are high in this model, comparable to metal cutting. The flow patterns in the vicinity of the pin-tool determine the level of homogenization and dispersal of contaminants that occurs in the FSW process. The approaching metal streams enfold one another as they are rotated around the tool. Neglecting mixing, they return to the same lateral position in the wake of the tool, preserving lateral tracer positions as if the metal had flowed past the tool like an extrusion instead of being rotated around it. (The seam is, however, obliterated.) The metal stream, of thickness approximately that of the tool diameter D, is wiped past the tool at elevated temperatures, drawn out to a thickness of v/(2ω) in the wiping zone. Mixing distances in the wiping zone are multiplied in the unfolded metal. Inhomogeneities on a smaller scale than the mixing length are obliterated, but structure on a larger scale may be transmitted to the wake of a FSW weld.
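A quick worked example of the kinematic quantities in this model, using illustrative welding parameters rather than values from the paper:

    import math

    # Illustrative FSW parameters (assumed): 2 mm/s travel, 400 rpm, 10 mm pin.
    v = 2.0                                # translation speed [mm/s]
    rpm = 400.0
    D = 10.0                               # pin diameter [mm]

    omega = 2.0 * math.pi * rpm / 60.0     # angular velocity [rad/s]
    offset = v / omega                     # lateral offset of arc centers [mm]
    sheared = v / (2.0 * omega)            # wiping-zone stream thickness [mm]

    print(f"omega = {omega:.1f} rad/s")
    print(f"v/omega = {offset*1000:.0f} um, v/(2*omega) = {sheared*1000:.0f} um")
    print(f"stream thinning ratio D/(v/(2*omega)) = {D/sheared:.0f}x")

For these numbers the approaching stream, roughly the pin diameter thick, is drawn out to a few tens of micrometres in the wiping zone, a thinning of several hundredfold, which is why inhomogeneities below the mixing length are obliterated.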
NASA Astrophysics Data System (ADS)
O'Neill, B. C.; Kauffman, B.; Lawrence, P.
2016-12-01
Integrated analysis of questions regarding land, water, and energy resources often requires integration of models of different types. One type of integration is between human and earth system models, since both societal and physical processes influence these resources. For example, human processes such as changes in population, economic conditions, and policies govern the demand for land, water, and energy, while the interactions of these resources with physical systems determine their availability and environmental consequences. We have begun to develop and use a toolkit for linking human and earth system models called the Toolbox for Human-Earth System Integration and Scaling (THESIS). THESIS consists of models and software tools to translate, scale, and synthesize information from and between human system models and earth system models (ESMs), with initial application to linking the NCAR integrated assessment model, iPETS, with the NCAR earth system model, CESM. Initial development is focused on urban areas and agriculture, sectors that are explicitly represented in both CESM and iPETS. Tools are being made available to the community as they are completed (see https://www2.cgd.ucar.edu/sections/tss/iam/THESIS_tools). We discuss four general types of functions that THESIS tools serve (Spatial Distribution, Spatial Properties, Consistency, and Outcome Evaluation). Tools are designed to be modular and can be combined in order to carry out more complex analyses. We illustrate their application both to the exposure of population to climate extremes and to the evaluation of climate impacts on the agriculture sector. For example, projecting exposure to climate extremes involves the use of THESIS tools for spatial population, spatial urban land cover, the characteristics of both, and a tool to bring urban climate information together with spatial population information. Development of THESIS tools is continuing and open to the research community.
Analysis of fracture in sheet bending and roll forming
NASA Astrophysics Data System (ADS)
Deole, Aditya D.; Barnett, Matthew; Weiss, Matthias
2018-05-01
The bending limit or minimum bending radius of sheet metal is conventionally measured in a wiping (swing arm) or vee bend test and reported as the minimum radius of the tool over which the sheet can be bent without fracture. Frequently the material kinks while bending, so that the actual inner bend radius of the sheet metal is smaller than the tool radius, giving rise to inaccuracy in these methods. It has been shown in previous studies that conventional bend test methods may under-estimate formability in bending-dominated processes such as roll forming. A new test procedure is proposed here to improve the understanding and measurement of fracture in bending and roll forming. In this study, conventional wiping and vee bend tests were performed on martensitic steel to determine the minimum bend radius. In addition, the vee bend test was performed in an Erichsen sheet metal tester equipped with the GOM Aramis system to enable strain measurement on the outer surface during bending. The strain measurement before the onset of fracture is then used to determine the minimum bend radius. To compare this result with a technological process, a vee channel was roll formed and in-situ strain measurement carried out with the Vialux Autogrid system. The strain distribution at fracture in the roll forming process is compared with that predicted by the conventional bending tests and by the improved process. It is shown that for this forming operation and material, the improved procedure gives a more accurate prediction of fracture.
O'Connor, Annette M; Tsafnat, Guy; Gilbert, Stephen B; Thayer, Kristina A; Wolfe, Mary S
2018-01-09
The second meeting of the International Collaboration for Automation of Systematic Reviews (ICASR) was held 3-4 October 2016 in Philadelphia, Pennsylvania, USA. ICASR is an interdisciplinary group whose aim is to maximize the use of technology for conducting rapid, accurate, and efficient systematic reviews of scientific evidence. Having automated tools for systematic review should enable more transparent and timely review, maximizing the potential for identifying and translating research findings to practical application. The meeting brought together multiple stakeholder groups including users of summarized research, methodologists who explore production processes and systematic review quality, and technologists such as software developers, statisticians, and vendors. This diversity of participants was intended to ensure effective communication with numerous stakeholders about progress toward automation of systematic reviews and stimulate discussion about potential solutions to identified challenges. The meeting highlighted challenges, both simple and complex, and raised awareness among participants about ongoing efforts by various stakeholders. An outcome of this forum was to identify several short-term projects that participants felt would advance the automation of tasks in the systematic review workflow including (1) fostering better understanding about available tools, (2) developing validated datasets for testing new tools, (3) determining a standard method to facilitate interoperability of tools such as through an application programming interface or API, and (4) establishing criteria to evaluate the quality of tools' output. ICASR 2016 provided a beneficial forum to foster focused discussion about tool development and resources and reconfirm ICASR members' commitment toward systematic reviews' automation.
Wirges, M; Funke, A; Serno, P; Knop, K; Kleinebudde, P
2013-05-05
Incorporation of an active pharmaceutical ingredient (API) into the coating layer of film-coated tablets is a method mainly used to formulate fixed-dose combinations. Uniform and precise spray-coating of an API represents a substantial challenge, which could be overcome by applying Raman spectroscopy as a process analytical tool. In the pharmaceutical industry, Raman spectroscopy is still mainly used as a bench-top laboratory analytical method and is usually not implemented in the production process. Concerning application in the production process, many scientific approaches stop at the level of feasibility studies and never manage the step to production-scale process applications. The present work puts the scale-up of an active coating process into focus, which is a step of highest importance during pharmaceutical development. Active coating experiments were performed at lab and production scale. Using partial least squares (PLS), a multivariate model was constructed by correlating in-line measured Raman spectral data with the coated amount of API. By transferring this model, implemented for a lab-scale process, to a production-scale process, the robustness of this analytical method and thus its applicability as a Process Analytical Technology (PAT) tool for correct endpoint determination in pharmaceutical manufacturing could be shown. Finally, this method was validated according to the European Medicines Agency (EMA) guideline with respect to the special requirements of the applied in-line model development strategy.
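A minimal sketch of the PLS calibration and endpoint-detection idea using scikit-learn, with synthetic stand-in spectra (the actual model is built from in-line process spectra correlated against a reference assay of coated API):

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    # X: rows are Raman spectra taken during coating; y: coated amount of API.
    rng = np.random.default_rng(0)
    n_spectra, n_wavenumbers = 60, 500
    coated_amount = np.linspace(0.0, 10.0, n_spectra)       # mg API per tablet
    peak = np.exp(-((np.arange(n_wavenumbers) - 250) / 20.0) ** 2)  # API band
    X = (np.outer(coated_amount, peak)
         + 0.05 * rng.normal(size=(n_spectra, n_wavenumbers)))

    pls = PLSRegression(n_components=3)
    pls.fit(X, coated_amount)

    # Endpoint detection: stop coating when the prediction reaches the target.
    target = 9.5
    pred = pls.predict(X).ravel()
    print("endpoint reached at spectrum", int(np.argmax(pred >= target)))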
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fehrmann, Henning; Perdue, Robert
2012-07-01
Cementation of radioactive waste is a common technology. The waste is mixed with cement and water and forms a stable, solid block. Physical properties like compression strength or low leachability depend strongly on the cement recipe. Because this waste-cement mixture has to fulfill special requirements, recipe development is necessary. The Six Sigma™ DMAIC methodology, together with the design of experiments (DoE) approach, was employed to optimize the process of recipe development for cementation at the Ling Ao nuclear power plant (NPP) in China. DMAIC offers a structured, systematic, and traceable process to derive test parameters. The DoE test plans and statistical analysis are efficient regarding the number of test runs and the benefit gained by obtaining a transfer function. A transfer function enables simulation, which is useful for optimizing the later process and being responsive to changes. The DoE method was successfully applied to developing a cementation recipe for both evaporator concentrate and resin waste in the plant. The key input parameters were determined and evaluated, and the control of these parameters was included in the design. The applied Six Sigma™ tools can help to organize the thinking during the engineering process. Data are organized and clearly presented. Numerous variables can be reduced to the most important ones. The Six Sigma™ tools help to make the thinking and decision process traceable, and they can support data-driven decisions (e.g., the C and E Matrix). But the tools are not the only golden way: results from scoring tools like the C and E Matrix need close review before use. DoE is an effective tool for generating test plans. DoE can be used with a small number of test runs, yet gives a valuable result from an engineering perspective in terms of a transfer function. The DoE prediction results, however, are only valid in the tested area, so a careful selection of input parameters and their limits when setting up a DoE is very important. Extrapolation of results is not recommended because the results are not reliable outside the tested area.
Selection into medical school: from tools to domains.
Wilkinson, Tom M; Wilkinson, Tim J
2016-10-03
Most research into the validity of admissions tools focuses on the isolated correlations of individual tools with later outcomes. Instead, looking at how domains of attributes, rather than tools, predict later success is likely to be more generalizable. We aim to produce a blueprint for an admissions scheme that is broadly relevant across institutions. We broke down all measures used for admissions at one medical school into the smallest possible component scores. We grouped these into domains on the basis of a multicollinearity analysis, and conducted a regression analysis to determine the independent validity of each domain to predict outcomes of interest. We identified four broad domains: logical reasoning and problem solving, understanding people, communication skills, and biomedical science. Each was independently and significantly associated with performance in final medical school examinations. We identified two potential errors in the design of admissions schema that can undermine their validity: focusing on tools rather than outcomes, and including a wide range of measures without objectively evaluating the independent contribution of each. Both could be avoided by following a process of programmatic assessment for selection.
Development of Bio-impedance Analyzer (BIA) for Body Fat Calculation
NASA Astrophysics Data System (ADS)
Riyadi, Munawar A.; Nugraha, A.; Santoso, M. B.; Septaditya, D.; Prakoso, T.
2017-04-01
Common weight scales cannot assess body composition or determine the fat mass and fat-free mass that make up the body weight. This research proposes a bio-impedance analysis (BIA) tool capable of body composition assessment. The tool uses four electrodes, two of which pass a 50 kHz sine-wave current through the body, while the other two measure the voltage produced by the body for impedance analysis. Parameters such as height, weight, age, and gender are provided individually. These parameters, together with the impedance measurements, are then processed to produce a body fat percentage. The experimental results show impressive repeatability for successive measurements (stdev ≤ 0.25% fat mass). Moreover, results on the hand-to-hand node scheme reveal an average absolute difference between the two analyzer tools of 0.48% (fat mass), with a maximum absolute discrepancy of 1.22% (fat mass). Furthermore, the relative error normalized to Omron's HBF-306 as a comparison tool is less than 2%. As a result, the system performance offers a good evaluation tool for fat mass in the body.
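A minimal sketch of the computation chain, assuming a fixed measuring current so that body impedance follows directly from the measured voltage. The fat-free mass (FFM) regression below is a generic published form with illustrative coefficients, not the device's actual equation (which also uses age and the comparison device's proprietary model):

    # BIA sketch: impedance from V/I, then a generic FFM regression.
    def body_fat_percent(v_rms, i_rms, height_cm, weight_kg, is_male):
        z = v_rms / i_rms                          # impedance magnitude [ohm]
        # Generic FFM form a*(H^2/Z) + b*W + c*sex + d -- coefficients assumed
        ffm = (0.518 * height_cm**2 / z + 0.231 * weight_kg
               + 4.229 * (1 if is_male else 0) - 4.104)       # [kg]
        fat_mass = weight_kg - ffm
        return 100.0 * fat_mass / weight_kg

    # 0.36 V measured across the body at 0.8 mA gives z = 450 ohm:
    print(body_fat_percent(v_rms=0.36, i_rms=0.0008, height_cm=170,
                           weight_kg=70, is_male=True))       # ~29% fat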
NASA Astrophysics Data System (ADS)
Chatwin, Christopher R.; McDonald, Donald W.; Scott, Brian F.
1989-07-01
The absence of an applications-led design philosophy has compromised both the development of laser source technology and its effective implementation into manufacturing technology in particular. For example, CO2 lasers are still incapable of processing some classes of refractory and non-ferrous metals. Whilst the scope of this paper is restricted to high power CO2 lasers, the design methodology reported herein is applicable to source technology in general, which, when exploited, will effect an expansion of applications. The CO2 laser operational envelope should not only be expanded to incorporate high damage threshold materials but also offer a greater degree of controllability. By a combination of modelling and experimentation, the requisite beam characteristics at the workpiece were determined and then utilised to design the Laser Manufacturing System. The design of sub-system elements was achieved by a combination of experimentation and simulation which benefited from a comprehensive set of software tools. By linking these tools the physical processes in the laser - electron processes in the plasma, the history of photons in the resonator, etc. - can be related, in a detailed model, to the heating mechanisms in the workpiece.
NASA Technical Reports Server (NTRS)
Lih, Shyh-Shiuh; Bar-Cohen, Yoseph; Lee, Hyeong Jae; Takano, Nobuyuki; Bao, Xiaoqi
2013-01-01
An advanced signal processing methodology is being developed to monitor the height of condensed water through the wall of a steel pipe while operating at temperatures as high as 250°. Using existing techniques, a previous study indicated that, when the water height is low or there is disturbance in the environment, the predicted water height may not be accurate. In recent years, the use of autocorrelation and envelope techniques in signal processing has been demonstrated to be a very useful tool for practical applications. In this paper, various signal processing techniques, including autocorrelation, the Hilbert transform, and the Shannon energy envelope method, were studied and implemented to determine the water height in the steam pipe. The results have shown that the developed method provides a good capability for monitoring the height under regular conditions. An alternative solution for shallow-water or no-water conditions, based on a hybrid method combining the Hilbert transform (HT) with a high-pass filter and an optimized windowing technique, is suggested. Further development of the reported methods would provide a powerful tool for the identification of disturbances of the water height inside the pipe.
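A sketch of the envelope-based idea: extract the Hilbert envelope of a pulse-echo trace and convert the delay between envelope peaks into a path length. The trace, sampling rate, and sound speed below are synthetic and illustrative, not the paper's measurement setup:

    import numpy as np
    from scipy.signal import hilbert, find_peaks

    fs = 50e6                       # sampling rate [Hz] -- illustrative
    c_steel = 5900.0                # longitudinal sound speed in steel [m/s]

    # Synthetic pulse-echo trace: a 5 MHz burst plus an echo 20 us later.
    t = np.arange(0, 50e-6, 1 / fs)
    burst = np.sin(2 * np.pi * 5e6 * t) * np.exp(-((t - 5e-6) / 2e-6) ** 2)
    echo = 0.4 * np.sin(2 * np.pi * 5e6 * t) * np.exp(-((t - 25e-6) / 2e-6) ** 2)
    trace = burst + echo + 0.02 * np.random.default_rng(0).normal(size=t.size)

    envelope = np.abs(hilbert(trace))       # Hilbert (analytic-signal) envelope
    peaks, _ = find_peaks(envelope, height=0.2, distance=int(5e-6 * fs))
    dt = (peaks[1] - peaks[0]) / fs         # time of flight between peaks
    print(f"round-trip {dt*1e6:.1f} us -> path {c_steel * dt / 2 * 1000:.1f} mm")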
Structural design/margin assessment
NASA Technical Reports Server (NTRS)
Ryan, R. S.
1993-01-01
Determining structural design inputs and the structural margins following design completion is one of the major activities in space exploration. The end result is a statement of these margins as stability, safety factors on ultimate and yield stresses, fracture limits (fracture control), fatigue lifetime, reuse criteria, operational criteria and procedures, stability factors, deflections, clearance, handling criteria, etc. The process is normally called a load cycle and is time consuming, very complex, and involves much more than structures. The key to successful structural design is the proper implementation of the process. It depends on many factors: leadership and management of the process, adequate analysis and testing tools, data basing, communications, people skills, and training. This process and the various factors involved are discussed.
Rocha, Joana; Coelho, Francisco J R C; Peixe, Luísa; Gomes, Newton C M; Calado, Ricardo
2014-11-11
For several years, knowledge on the microbiome associated with marine invertebrates was impaired by the challenges associated with the characterization of bacterial communities. With the advent of culture independent molecular tools it is possible to gain new insights on the diversity and richness of microorganisms associated with marine invertebrates. In the present study, we evaluated if different preservation and processing methodologies (prior to DNA extraction) can affect the bacterial diversity retrieved from snakelocks anemone Anemonia viridis. Denaturing gradient gel electrophoresis (DGGE) community fingerprints were used as proxy to determine the bacterial diversity retrieved (H'). Statistical analyses indicated that preservation significantly affects H'. The best approach to preserve and process A. viridis biomass for bacterial community fingerprint analysis was flash freezing in liquid nitrogen (preservation) followed by the use of a mechanical homogenizer (process), as it consistently yielded higher H'. Alternatively, biomass samples can be processed fresh followed by cell lyses using a mechanical homogenizer or mortar & pestle. The suitability of employing these two alternative procedures was further reinforced by the quantification of the 16S rRNA gene; no significant differences were recorded when comparing these two approaches and the use of liquid nitrogen followed by processing with a mechanical homogenizer.
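The diversity proxy H' is the Shannon index computed over relative DGGE band intensities; a minimal sketch with hypothetical band profiles (numbers invented for illustration):

    import numpy as np

    def shannon_diversity(band_intensities):
        """H' = -sum(p_i * ln p_i) over relative DGGE band intensities."""
        p = np.asarray(band_intensities, float)
        p = p[p > 0] / p.sum()
        return float(-(p * np.log(p)).sum())

    # Hypothetical band profiles from two preservation/processing treatments:
    flash_frozen = [12, 9, 7, 6, 5, 4, 3, 3, 2, 2]   # more, more even bands
    fresh_mortar = [20, 10, 4, 2, 1]
    print(shannon_diversity(flash_frozen))   # higher H' -> more diversity retrieved
    print(shannon_diversity(fresh_mortar))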
Analysis and design of friction stir welding tool
NASA Astrophysics Data System (ADS)
Jagadeesha, C. B.
2016-12-01
Since its inception, no one has reported a systematic analysis and design of the FSW tool; initial dimensions of the FSW tool are typically decided by educated guess. Optimum stresses on the tool pin have been determined, at optimized parameters, for bead-on-plate welding on AZ31B-O Mg alloy plate. Fatigue analysis showed that the FSW tool chosen for the welding experiment does not have infinite life; the life of the FSW tool was determined to be 2.66×10⁵ cycles (revolutions). One can therefore conclude that an arbitrarily dimensioned FSW tool generally has finite life and cannot be assumed to last indefinitely. In general, one can determine in advance the suitability of a tool and its material for FSW of given workpiece materials by this analysis, in terms of the fatigue life of the tool.
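The abstract does not state the fatigue model used; as one common way to arrive at a finite life of this order, a Basquin-type stress-life estimate with illustrative coefficients looks like this:

    import math

    # Basquin's relation N = (sigma_a / sigma_f) ** (1 / b): a standard way to
    # estimate finite fatigue life from the alternating stress on the tool pin.
    # All coefficients below are illustrative, not the paper's values.
    sigma_f = 900.0     # fatigue strength coefficient [MPa] -- assumed
    b = -0.09           # fatigue strength exponent -- assumed
    sigma_a = 280.0     # alternating bending stress on the pin [MPa] -- assumed

    N = (sigma_a / sigma_f) ** (1.0 / b)
    print(f"predicted life: {N:.2e} cycles")   # of the order of 10^5 cycles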
Gossip, Drama, and Technology: How South Asian American Young Women Negotiate Gender On and Offline
ERIC Educational Resources Information Center
Subramanian, Mathangi
2013-01-01
Gossip, defined as evaluative talk about a third party, is a powerful tool for establishing in- and out-group norms and determining belonging. Drama, a form of gossip that is evolving in online spaces, is the process of fighting back against gossip and rumors designed to isolate and ostracise. While literature commonly portrays women as victims or…
A Framework for Automating Cost Estimates in Assembly Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Calton, T.L.; Peters, R.R.
1998-12-09
When a product concept emerges, the manufacturing engineer is asked to sketch out a production strategy and estimate its cost. The engineer is given an initial product design, along with a schedule of expected production volumes. The engineer then determines the best approach to manufacturing the product, comparing a variety of alternative production strategies. The engineer must consider capital cost, operating cost, lead-time, and other issues in an attempt to maximize profits. After making these basic choices and sketching the design of overall production, the engineer produces estimates of the required capital, operating costs, and production capacity. This process may iterate as the product design is refined in order to improve its performance or manufacturability. The focus of this paper is on the development of computer tools to aid manufacturing engineers in their decision-making processes. This computer software tool provides a framework in which accurate cost estimates can be seamlessly derived from design requirements at the start of any engineering project. The result is faster cycle times through first-pass success; lower life cycle cost due to requirements-driven design and accurate cost estimates derived early in the process.
James, Pam; Bebee, Patty; Beekman, Linda; Browning, David; Innes, Mathew; Kain, Jeannie; Royce-Westcott, Theresa; Waldinger, Marcy
2011-11-01
Quantifying data management and regulatory workload for clinical research is a difficult task that would benefit from a robust tool to assess and allocate effort. As in most clinical research environments, The University of Michigan Comprehensive Cancer Center (UMCCC) Clinical Trials Office (CTO) struggled to effectively allocate data management and regulatory time with frequently inaccurate estimates of how much time was required to complete the specific tasks performed by each role. In a dynamic clinical research environment in which volume and intensity of work ebbs and flows, determining requisite effort to meet study objectives was challenging. In addition, a data-driven understanding of how much staff time was required to complete a clinical trial was desired to ensure accurate trial budget development and effective cost recovery. Accordingly, the UMCCC CTO developed and implemented a Web-based effort-tracking application with the goal of determining the true costs of data management and regulatory staff effort in clinical trials. This tool was developed, implemented, and refined over a 3-year period. This article describes the process improvement and subsequent leveling of workload within data management and regulatory that enhanced the efficiency of UMCCC's clinical trials operation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Portwood, J.T.
1995-12-31
This paper discusses a database of information collected and organized during the past eight years from 2,000 producing oil wells in the United States, all of which have been treated with special applications techniques developed to improve the effectiveness of MEOR technology. The database, believed to be the first of its kind, has been generated for the purpose of statistically evaluating the effectiveness and economics of the MEOR process in a wide variety of oil reservoir environments, and is a tool that can be used to improve the predictability of treatment response. The information in the database has also been evaluated to determine which, if any, reservoir characteristics are dominant factors in determining the applicability of MEOR.
NASA Astrophysics Data System (ADS)
Budi Harja, Herman; Prakosa, Tri; Raharno, Sri; Yuwana Martawirya, Yatna; Nurhadi, Indra; Setyo Nogroho, Alamsyah
2018-03-01
In the job-shop industry, products have wide variety but small batch sizes, so every machine tool is shared among production processes under dynamic loads. This dynamic operating condition directly affects the reliability of machine tool components. Hence, the maintenance schedule for every component should be calculated based on the actual usage of the machine tool components. This paper describes a study on the development of a monitoring system for obtaining information about the usage of each CNC machine tool component in real time, based on grouping components by their operation phase. A special device has been developed for monitoring machine tool component usage by utilizing usage-phase activity data taken from certain electronic components within the CNC machine: the adaptor, servo driver, and spindle driver, together with additional components such as a microcontroller and relays. The obtained data are used to detect machine utilization phases such as the power-on state, machine-ready state, or spindle-running state. Experimental results have shown that the developed CNC machine tool monitoring system is capable of obtaining phase information on machine tool usage, along with its duration, and displays the information in the user interface application.
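A minimal sketch of the phase-detection logic described above, assuming boolean power/ready/spindle signals sampled from the relays; the signal names, sampling interval, and mapping are hypothetical, not the authors' implementation.

    def classify_phase(power_on: bool, servo_ready: bool, spindle_running: bool) -> str:
        """Map raw relay/driver signals to a machine usage phase (assumed mapping)."""
        if not power_on:
            return "off"
        if spindle_running:
            return "spindle running"
        if servo_ready:
            return "machine ready"
        return "power on"

    def accumulate_durations(samples, dt=1.0):
        """Sum the time spent in each phase from periodic samples taken every dt seconds."""
        durations = {}
        for power, ready, spindle in samples:
            phase = classify_phase(power, ready, spindle)
            durations[phase] = durations.get(phase, 0.0) + dt
        return durations

    # Example: five one-second samples covering start-up, running, and shutdown
    samples = [(True, False, False), (True, True, False),
               (True, True, True), (True, True, True), (False, False, False)]
    print(accumulate_durations(samples))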
[Development of a Text-Data Based Learning Tool That Integrates Image Processing and Displaying].
Shinohara, Hiroyuki; Hashimoto, Takeyuki
2015-01-01
We developed a text-data based learning tool that integrates image processing and displaying in Excel. Knowledge required for programming this tool is limited to using absolute, relative, and composite cell references and learning approximately 20 mathematical functions available in Excel. The new tool is capable of resolution translation, geometric transformation, spatial-filter processing, Radon transform, Fourier transform, convolutions, correlations, deconvolutions, wavelet transform, mutual information, and simulation of proton density-, T1-, and T2-weighted MR images. The processed images of 128 x 128 pixels or 256 x 256 pixels are observed directly within Excel worksheets without using any particular image display software. The results of image processing using this tool were compared with those using the C language, and the new tool was judged to have sufficient accuracy to be practically useful. The images displayed on Excel worksheets were compared with images shown by binary-data display software; this comparison indicated that the image quality of the Excel worksheets was nearly equal to that of the latter in visual impression. Since image processing is performed on text data, the process is visible and can be checked directly against the mathematical equations within the program. We concluded that the newly developed tool is adequate as a computer-assisted learning tool for use in medical image processing.
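As an illustration of the cell-reference style of computation the tool relies on, here is a Python stand-in for a 3x3 mean filter, where each output pixel is an explicit sum over relative neighbors, exactly what an Excel worksheet formula expresses with relative cell references. The actual tool uses worksheet formulas, not code.

    # 3x3 mean filter written with explicit neighbor sums, mirroring how the
    # Excel tool computes each output cell from relative cell references.

    def mean_filter(img):
        h, w = len(img), len(img[0])
        out = [[0.0] * w for _ in range(h)]
        for i in range(1, h - 1):
            for j in range(1, w - 1):
                out[i][j] = sum(img[i + di][j + dj]
                                for di in (-1, 0, 1)
                                for dj in (-1, 0, 1)) / 9.0
        return out

    img = [[0, 0, 0, 0],
           [0, 9, 9, 0],
           [0, 9, 9, 0],
           [0, 0, 0, 0]]
    print(mean_filter(img)[1][1])  # 4.0: the center pixel averaged with its neighbors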
NASA Astrophysics Data System (ADS)
Luo, Xichun; Tong, Zhen; Liang, Yingchun
2014-12-01
In this article, the shape transferability of nanoscale multi-tip diamond tools in diamond turning for the scale-up manufacturing of nanostructures is demonstrated. Atomistic multi-tip diamond tool models were built with different tool geometries, in terms of differences in tip cross-sectional shape, tip angle, and tool tip configuration, to determine their effect on the applied forces and the machined nano-groove geometries. The quality of machined nanostructures was characterized by the thickness of the deformed layers and the dimensional accuracy achieved. Simulation results show that diamond turning using nanoscale multi-tip tools offers tremendous shape transferability in machining nanostructures. Both periodic and non-periodic nano-grooves with different cross-sectional shapes can be successfully fabricated using the multi-tip tools. A hypothesis of a minimum designed ratio of tool tip distance to tip base width (L/Wf) of the nanoscale multi-tip diamond tool for the high-precision machining of nanostructures was proposed, based on an analytical study of the quality of the nanostructures fabricated using different types of multi-tip tools. Nanometric cutting trials using nanoscale multi-tip diamond tools (differing in L/Wf) fabricated by focused ion beam (FIB) were then conducted to verify the hypothesis. The investigations done in this work imply the potential of the nanoscale multi-tip diamond tool for the deterministic fabrication of periodic and non-periodic nanostructures, which opens up the feasibility of using the process as a versatile manufacturing technique in nanotechnology.
Hoben, Matthias; Bär, Marion; Mahler, Cornelia; Berger, Sarah; Squires, Janet E; Estabrooks, Carole A; Kruse, Andreas; Behrens, Johann
2014-01-31
To study the association between organizational context and research utilization in German residential long term care (LTC), we translated three Canadian assessment instruments: the Alberta Context Tool (ACT), Estabrooks' Kinds of Research Utilization (RU) items and the Conceptual Research Utilization Scale. Target groups for the tools were health care aides (HCAs), registered nurses (RNs), allied health professionals (AHPs), clinical specialists and care managers. Through a cognitive debriefing process, we assessed response processes validity-an initial stage of validity, necessary before more advanced validity assessment. We included 39 participants (16 HCAs, 5 RNs, 7 AHPs, 5 specialists and 6 managers) from five residential LTC facilities. We created lists of questionnaire items containing problematic items plus items randomly selected from the pool of remaining items. After participants completed the questionnaires, we conducted individual semi-structured cognitive interviews using verbal probing. We asked participants to reflect on their answers for list items in detail. Participants' answers were compared to concept maps defining the instrument concepts in detail. If at least two participants gave answers not matching concept map definitions, items were revised and re-tested with new target group participants. Cognitive debriefings started with HCAs. Based on the first round, we modified 4 of 58 ACT items, 1 ACT item stem and all 8 items of the RU tools. All items were understood by participants after another two rounds. We included revised HCA ACT items in the questionnaires for the other provider groups. In the RU tools for the other provider groups, we used different wording than the HCA version, as was done in the original English instruments. Only one cognitive debriefing round was needed with each of the other provider groups. Cognitive debriefing is essential to detect and respond to problematic instrument items, particularly when translating instruments for heterogeneous, less well educated provider groups such as HCAs. Cognitive debriefing is an important step in research tool development and a vital component of establishing response process validity evidence. Publishing cognitive debriefing results helps researchers to determine potentially critical elements of the translated tools and assists with interpreting scores.
Biomimetics: process, tools and practice.
Fayemi, P E; Wanieck, K; Zollfrank, C; Maranzana, N; Aoussat, A
2017-01-23
Biomimetics applies principles and strategies abstracted from biological systems to engineering and technological design. With a huge potential for innovation, biomimetics could evolve into a key process in businesses. Yet challenges remain within the process of biomimetics, especially from the perspective of potential users. We work to clarify the understanding of the process of biomimetics. Therefore, we briefly summarize the terminology of biomimetics and bioinspiration. The implementation of biomimetics requires a clearly stated process. Therefore, we present a model of the problem-driven process of biomimetics that can be used for problem-solving activity. The process of biomimetics can be facilitated by existing tools and creative methods. We mapped a set of tools to the biomimetic process model and set up assessment sheets to evaluate the theoretical and practical value of these tools. We analyzed the tools in interdisciplinary research workshops and present the characteristics of the tools. We also present a first attempt at a utility tree which, once finalized, could be used to guide users through the process by choosing appropriate tools according to their own expertise. The aim of this paper is to foster the dialogue and facilitate a closer collaboration within the field of biomimetics.
Detection of Cutting Tool Wear using Statistical Analysis and Regression Model
NASA Astrophysics Data System (ADS)
Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin
2010-10-01
This study presents a new method for detecting cutting tool wear based on measured cutting force signals. A statistics-based method, the Integrated Kurtosis-based Algorithm for Z-Filter technique (I-kaz), was used to develop a regression model and a 3D graphic presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out on a Colchester Master Tornado T4 CNC turning machine under dry cutting conditions. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, each with its own I-kaz 3D coefficient. This coefficient was examined and its relationship with the flank wear land (VB) was determined. A regression model was developed from this relationship, and the results show that the I-kaz 3D coefficient value decreases as tool wear increases. The results can then be used for real-time tool wear monitoring.
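A hedged sketch of the final regression step: fitting the reported inverse relationship between the signal coefficient and flank wear VB by ordinary least squares. The data are synthetic and the simple linear form is a stand-in for the paper's I-kaz 3D model.

    import numpy as np

    # Fit the reported inverse relationship between a signal coefficient and
    # flank wear VB with ordinary least squares. The numbers below are
    # illustrative; the paper derives its I-kaz 3D coefficient from measured
    # cutting-force signals.

    VB = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30])   # flank wear, mm (assumed)
    coeff = np.array([8.2, 7.1, 6.3, 5.0, 4.2, 3.1])      # I-kaz-like coefficient (assumed)

    slope, intercept = np.polyfit(VB, coeff, 1)
    print(f"coefficient ≈ {slope:.2f} * VB + {intercept:.2f}")
    # The negative slope reproduces the abstract's finding: the coefficient
    # decreases as tool wear increases, so the fit can flag wear in real time.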
Process tool monitoring and matching using interferometry technique
NASA Astrophysics Data System (ADS)
Anberg, Doug; Owen, David M.; Mileham, Jeffrey; Lee, Byoung-Ho; Bouche, Eric
2016-03-01
The semiconductor industry makes dramatic device technology changes over short time periods. As the semiconductor industry advances towards the 10 nm device node, more precise management and control of processing tools has become a significant manufacturing challenge. Some processes require multiple tool sets, and some tools have multiple chambers for mass production. Tool and chamber matching has become a critical consideration for meeting today's manufacturing requirements. Additionally, process tool and chamber conditions have to be monitored to ensure uniform process performance across the tool and chamber fleet. There are many parameters for managing and monitoring tools and chambers. Particle defect monitoring is a well-known and established example, where defect inspection tools can directly detect particles on the wafer surface. However, leading-edge processes are driving the need to also monitor invisible defects (stress, contamination, etc.), because some device failures cannot be directly correlated with traditional visualized defect maps or other known sources. Some failure maps show the same signatures as stress or contamination maps, which implies a correlation with device performance or yield. In this paper we present process tool monitoring and matching using an interferometry technique. There are many types of interferometry techniques used for various process monitoring applications. We use a Coherent Gradient Sensing (CGS) interferometer, which is self-referencing and enables high throughput measurements. Using this technique, we can quickly measure the topography of an entire wafer surface and obtain stress and displacement data from the topography measurement. For improved tool and chamber matching and reduced device failure, wafer stress measurements can be implemented as a regular tool or chamber monitoring test for either unpatterned or patterned wafers, serving as a good criterion for improved process stability.
Research on operation mode of abrasive grain during grinding
NASA Astrophysics Data System (ADS)
Ivanova, T. N.; Dement’ev, V. B.; Nikitina, O. V.
2018-03-01
The processing of materials by cutting with an abrasive tool is carried out by means of thousands of grains bonded together as a single whole. The quality of the abrasive tool is defined by the cutting properties of the abrasive grains and depends on how the temperature field spreads in time and through the abrasive grain volume. Grains are exposed to heating and cooling during work. This leads to undesired effects such as a decrease in the durability of grain retention in the binder and in hardness, intensification of diffusion and oxidation processes between the binder and the grain, and the occurrence of considerable temperature stresses in the grain itself. The obtained equation, which allows calculation of the temperature field of a grain over one rotation of the grinding wheel, shows that the wheel temperature depends on the grinding modes and the thermophysical properties of the abrasive material. Thus, as the contact time of a grain with the processed material increases, the temperature in the cutting area rises; as the thermophysical properties improve, the temperature in the cutting area decreases. Thermal working conditions differ depending on the contact time of the grain and the material. For example, in creep-feed grinding the peak temperature is higher than in multistep grinding and the depth of heat penetration is greater, while the speed of the thermal process in creep-feed grinding is 2-3 times lower than in multistep grinding and the gradient is reduced 3-4 times. The analysis of machining methods shows that creep-feed grinding ensures a greater depth of grain heating, a smaller heating rate, and a reduced temperature gradient. This decreases the probability of allotropic modifications and prevents thermal shock, that is, cracking of grains caused by steep temperature drops. Consequently, creep-feed grinding should be employed to increase the efficiency of abrasive tool use. Three operation modes of the grinding wheel (blunting, full self-sharpening, and emergency wear) were identified as the result of research evaluating the cutting ability of grinding wheels. Recommendations are given on the working capacity of grinding wheels in each operation mode and on the transition from one mode to another. As a result of the research, dependencies were determined that govern the extent of the influence of granularity, differences in grain height and concentration, and the geometry of the workpiece and the grinding wheel on the machining modes and the thickness of the layer cut off by one grain, all of which influence the grinding process.
Flight Operations Analysis Tool
NASA Technical Reports Server (NTRS)
Easter, Robert; Herrell, Linda; Pomphrey, Richard; Chase, James; Wertz Chen, Julie; Smith, Jeffrey; Carter, Rebecca
2006-01-01
Flight Operations Analysis Tool (FLOAT) is a computer program that partly automates the process of assessing the benefits of planning spacecraft missions to incorporate various combinations of launch vehicles and payloads. Designed primarily for use by an experienced systems engineer, FLOAT makes it possible to perform a preliminary analysis of trade-offs and costs of a proposed mission in days, whereas previously, such an analysis typically lasted months. FLOAT surveys a variety of prior missions by querying data from authoritative NASA sources pertaining to 20 to 30 mission and interface parameters that define space missions. FLOAT provides automated, flexible means for comparing the parameters to determine compatibility or the lack thereof among payloads, spacecraft, and launch vehicles, and for displaying the results of such comparisons. Sparseness, typical of the data available for analysis, does not confound this software. FLOAT effects an iterative process that identifies modifications of parameters that could render compatible an otherwise incompatible mission set.
Ayuela Azcárate, J M; Clau-Terré, F; Vicho Pereira, R; Guerrero de Mier, M; Carrillo López, A; Ochagavia, A; López Pérez, J M; Trenado Alvarez, J; Pérez, L; Llompart-Pou, J A; González de Molina, F J; Fojón, S; Rodríguez Salgado, A; Martínez Díaz, M C; Royo Villa, C; Romero Bermejo, F J; Ruíz Bailén, M; Arroyo Díez, M; Argueso García, M; Fernández Fernández, J L
2014-01-01
Ultrasound has become an essential tool in the care of critically ill patients. Its knowledge, use, and instruction require a statement by the scientific societies involved in its development and implementation. Our aims are to determine the use of the technique in intensive care medicine, the clinical situations in which its application is recommended, the levels of knowledge, the associated responsibility, and the learning process, and to implement the ultrasound technique as a common tool in all intensive care units, as in the rest of the European countries. The SEMICYUC Working Group on Cardiac Intensive Care and CPR, after a literature review and assessment of the scientific evidence, has established a consensus document which sets out the requirements for accreditation in ultrasound applied to the critically ill patient and how to acquire the necessary skills. Training and learning require a structured process within the specialty. The SEMICYUC must agree to disseminate this document, build relationships with other scientific societies, and give legal cover through accreditation of the training units, training courses, and different levels of training. Copyright © 2013 Elsevier España, S.L. y SEMICYUC. All rights reserved.
Crutzen, Rik; Peters, Gjalt-Jorn Ygram; Noijen, Judith
2017-01-01
When developing an intervention aimed at behavior change, one of the crucial steps in the development process is to select the most relevant social-cognitive determinants. These determinants can be seen as the buttons one needs to push to establish behavior change. Insight into these determinants is needed to select behavior change methods (i.e., general behavior change techniques that are applied in an intervention) in the development process. Therefore, a study on determinants is often conducted as formative research in the intervention development process. Ideally, all relevant determinants identified in such a study are addressed by an intervention. However, when developing a behavior change intervention, there are limits in terms of, for example, resources available for intervention development and the amount of content that participants of an intervention can be exposed to. Hence, it is important to select those determinants that are most relevant to the target behavior as these determinants should be addressed in an intervention. The aim of the current paper is to introduce a novel approach to select the most relevant social-cognitive determinants and use them in intervention development. This approach is based on visualization of confidence intervals for the means and correlation coefficients for all determinants simultaneously. This visualization facilitates comparison, which is necessary when making selections. By means of a case study on the determinants of using a high dose of 3,4-methylenedioxymethamphetamine (commonly known as ecstasy), we illustrate this approach. We provide a freely available tool to facilitate the analyses needed in this approach.
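A minimal sketch of the visualization idea described above: for each determinant, plot the confidence interval of its mean (room for change) alongside its correlation with behavior (relevance), side by side so that selections can be compared at a glance. The determinant names and numbers below are invented for illustration, not data from the ecstasy case study.

    import numpy as np
    import matplotlib.pyplot as plt

    # Invented example data: four candidate determinants of a target behavior.
    determinants = ["attitude", "norms", "self-efficacy", "risk perception"]
    means = np.array([3.1, 4.2, 2.6, 3.8]); mean_err = np.array([0.3, 0.2, 0.4, 0.3])
    corrs = np.array([0.45, 0.12, 0.51, 0.22]); corr_err = np.array([0.10, 0.09, 0.11, 0.10])

    fig, (ax1, ax2) = plt.subplots(1, 2, sharey=True, figsize=(8, 3))
    y = np.arange(len(determinants))
    ax1.errorbar(means, y, xerr=mean_err, fmt="o")   # CI of the mean: room for change
    ax1.set_xlabel("mean (1-5 scale)")
    ax1.set_yticks(y); ax1.set_yticklabels(determinants)
    ax2.errorbar(corrs, y, xerr=corr_err, fmt="o")   # CI of r: relevance to behavior
    ax2.set_xlabel("correlation with behavior")
    plt.tight_layout(); plt.show()

Determinants with both a low-to-middling mean (room to move) and a high correlation (relevance), such as "self-efficacy" in this toy example, would be the candidates to address in the intervention.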
Pollock, James; Bolton, Glen; Coffman, Jon; Ho, Sa V; Bracewell, Daniel G; Farid, Suzanne S
2013-04-05
This paper presents an integrated experimental and modelling approach to evaluate the potential of semi-continuous chromatography for the capture of monoclonal antibodies (mAb) in clinical and commercial manufacture. Small-scale single-column experimental breakthrough studies were used to derive design equations for the semi-continuous affinity chromatography system. Verification runs with the semi-continuous 3-column and 4-column periodic counter current (PCC) chromatography system indicated the robustness of the design approach. The product quality profiles and step yields (after wash step optimisation) achieved were comparable to the standard batch process. The experimentally-derived design equations were incorporated into a decisional tool comprising dynamic simulation, process economics and sizing optimisation. The decisional tool was used to evaluate the economic and operational feasibility of whole mAb bioprocesses employing PCC affinity capture chromatography versus standard batch chromatography across a product's lifecycle from clinical to commercial manufacture. The tool predicted that PCC capture chromatography would offer more significant savings in direct costs for early-stage clinical manufacture (proof-of-concept) (∼30%) than for late-stage clinical (∼10-15%) or commercial (∼5%) manufacture. The evaluation also highlighted the potential facility fit issues that could arise with a capture resin (MabSelect) that experiences losses in binding capacity when operated in continuous mode over lengthy commercial campaigns. Consequently, the analysis explored the scenario of adopting the PCC system for clinical manufacture and switching to the standard batch process following product launch. The tool determined the PCC system design required to operate at commercial scale without facility fit issues and with similar costs to the standard batch process whilst pursuing a process change application. A retrofitting analysis established that the direct cost savings obtained by 8 proof-of-concept batches would be sufficient to pay back the investment cost of the pilot-scale semi-continuous chromatography system. Copyright © 2013 Elsevier B.V. All rights reserved.
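The breakthrough-based design idea can be sketched in a few lines: record a single-column breakthrough curve, then choose the load time at which the effluent reaches an allowed fraction of the feed concentration, which sets the column-switch point in a PCC cycle. The curve values and the 10% criterion below are assumptions for illustration, not the paper's derived design equations.

    import numpy as np

    # Single-column breakthrough data: effluent/feed concentration vs load time.
    # Values are invented for illustration.
    t = np.array([0, 10, 20, 30, 40, 50, 60], float)            # load time, min
    c_over_c0 = np.array([0.0, 0.01, 0.03, 0.10, 0.30, 0.60, 0.85])

    switch_at = 0.10                               # allowed breakthrough fraction (assumed)
    t_switch = np.interp(switch_at, c_over_c0, t)  # interpolate on the monotone curve
    print(f"switch columns after ≈ {t_switch:.0f} min of loading")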
Rallis, Austin; Fercho, Kelene A; Bosch, Taylor J; Baugh, Lee A
2018-01-31
Tool use is associated with three visual streams (dorso-dorsal, ventro-dorsal, and ventral), which are involved in processing online motor planning, action semantics, and tool semantics features, respectively. Little is known about the way in which the brain represents virtual tools. To directly assess this question, a virtual tool paradigm was created that provided the ability to manipulate tool components in isolation of one another. During functional magnetic resonance imaging (fMRI), adult participants performed a series of virtual tool manipulation tasks in which vision and movement kinematics of the tool were manipulated. Reaction time and hand movement direction were monitored while the tasks were performed. Functional imaging revealed that activity within all three visual streams was present, in a pattern similar to what would be expected with physical tool use. However, a previously unreported network of right-hemisphere activity was found, including the right inferior parietal lobule, middle and superior temporal gyri, and supramarginal gyrus, regions well known to be associated with tool processing within the left hemisphere. These results provide evidence that both virtual and physical tools are processed within the same brain regions, though virtual tools recruit bilateral tool processing regions to a greater extent than physical tools. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Wernicke, S.; Dang, T.; Gies, S.; Tekkaya, A. E.
2018-05-01
The trend toward a higher variety of products requires economical manufacturing processes suitable for the production of prototypes and small batches. In the case of complex hollow-shaped parts, single point incremental forming (SPIF) represents a highly flexible process. The flexibility of this process, however, comes at the cost of a very long process time. To decrease the process time, a new incremental forming approach with multiple forming tools is investigated. The influence of two incremental forming tools on the resulting mechanical and geometrical component properties, compared to SPIF, is presented. Sheets made of EN AW-1050A were formed into frustums of a pyramid using different tool-path strategies, and several variations of the tool-path strategy were analyzed. A time saving between 40% and 60% was observed, depending on the tool-path and the radii of the forming tools, while the mechanical properties remained unchanged. This knowledge can increase the cost efficiency of incremental forming processes.
Interactive entity resolution in relational data: a visual analytic tool and its evaluation.
Kang, Hyunmo; Getoor, Lise; Shneiderman, Ben; Bilgic, Mustafa; Licamele, Louis
2008-01-01
Databases often contain uncertain and imprecise references to real-world entities. Entity resolution, the process of reconciling multiple references to underlying real-world entities, is an important data cleaning process required before accurate visualization or analysis of the data is possible. In many cases, in addition to noisy data describing entities, there is data describing the relationships among the entities. This relational data is important during the entity resolution process; it is useful both for the algorithms which determine likely database references to be resolved and for visual analytic tools which support the entity resolution process. In this paper, we introduce a novel user interface, D-Dupe, for interactive entity resolution in relational data. D-Dupe effectively combines relational entity resolution algorithms with a novel network visualization that enables users to make use of an entity's relational context for making resolution decisions. Since resolution decisions often are interdependent, D-Dupe facilitates understanding this complex process through animations which highlight combined inferences and a history mechanism which allows users to inspect chains of resolution decisions. An empirical study with 12 users confirmed the benefits of the relational context visualization on the performance of entity resolution tasks in relational data in terms of time as well as users' confidence and satisfaction.
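A toy sketch of the relational idea behind this style of entity resolution: a match score that blends string similarity with shared-neighbor (relational context) evidence. The weighting scheme and records are invented for illustration, not D-Dupe's actual algorithm.

    from difflib import SequenceMatcher

    def name_sim(a, b):
        """String similarity between two name strings, in [0, 1]."""
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    def match_score(ref1, ref2, alpha=0.7):
        """Blend name similarity with Jaccard overlap of relational context.
        alpha is an assumed weighting, not a value from the paper."""
        shared = len(ref1["coauthors"] & ref2["coauthors"])
        total = len(ref1["coauthors"] | ref2["coauthors"]) or 1
        relational = shared / total
        return alpha * name_sim(ref1["name"], ref2["name"]) + (1 - alpha) * relational

    r1 = {"name": "J. Smith", "coauthors": {"Lee", "Getoor"}}
    r2 = {"name": "John Smith", "coauthors": {"Lee", "Getoor", "Kang"}}
    print(match_score(r1, r2))  # combines string and relational evidence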
Polymerase chain reaction: A molecular diagnostic tool in periodontology
Maheaswari, Rajendran; Kshirsagar, Jaishree Tukaram; Lavanya, Nallasivam
2016-01-01
This review discusses the principles of polymerase chain reaction (PCR) and its application as a diagnostic tool in periodontology. The relevant MEDLINE and PubMed indexed journals were searched manually and electronically by typing PCR, applications of PCR, PCR in periodontics, polymorphism studies in periodontitis, and molecular techniques in periodontology. The searches were limited to articles in English language and the articles describing PCR process and its relation to periodontology were collected and used to prepare a concise review. PCR has now become a standard diagnostic and research tool in periodontology. Various studies reveal that its sensitivity and specificity allow it as a rapid, efficient method of detecting, identifying, and quantifying organism. Different immune and inflammatory markers can be identified at the mRNA expression level, and also the determination of genetic polymorphisms, thus providing the deeper insight into the mechanisms underlying the periodontal disease. PMID:27143822
Horan, Thomas A; Daniels, Susan M; Feldman, Sue S
2009-07-01
The disability community could benefit significantly from the widespread adoption of health information technology, in particular from its ability to streamline and accelerate processing of the estimated 3 million disability benefits applications filed with the Social Security Administration each year. Disability determination is an inefficient, largely paper-based process requiring large volumes of clinical data compiled from multiple provider sources. That, coupled with a lack of transparency within the process, adds unnecessary delays and expense. The objective of this paper is to outline the case for how personal health records, particularly those populated with information from provider-held electronic health records and payer claims data, offer a means to achieve financial savings from shortened disability determination processes, as well as a tool for disability health self-management and care coordination. Drawing from research and policy forums and testimony before the American Health Information Community, the importance of including the disability community as the nation moves forward with health information technology initiatives is explored. Our research suggests that systemwide improvements such as the Nationwide Health Information Network and other such health information technology initiatives could be used to bring benefits to the disability community. The time has come to use health information technology initiatives so that federal policy makers can take steps to reduce the inefficiencies in the Social Security Administration disability determination process while improving the program's value to those who need it the most.
Development of Waypoint Planning Tool in Response to NASA Field Campaign Challenges
NASA Technical Reports Server (NTRS)
He, Matt; Hardin, Danny; Conover, Helen; Graves, Sara; Meyer, Paul; Blakeslee, Richard; Goodman, Michael
2012-01-01
Airborne real time observations are a major component of NASA's Earth Science research and satellite ground validation studies. For mission scientists, planning a research aircraft mission within the context of meeting the science objectives is a complex task because it requires real time situational awareness of the weather conditions that affect the aircraft track. Multiple aircrafts are often involved in NASA field campaigns. The coordination of the aircrafts with satellite overpasses, other airplanes and the constantly evolving, dynamic weather conditions often determines the success of the campaign. A flight planning tool is needed to provide situational awareness information to the mission scientists, and help them plan and modify the flight tracks. Scientists at the University of Alabama-Huntsville and the NASA Marshall Space Flight Center developed the Waypoint Planning Tool, an interactive software tool that enables scientists to develop their own flight plans (also known as waypoints) with point-and-click mouse capabilities on a digital map filled with real time raster and vector data. The development of this Waypoint Planning Tool demonstrates the significance of mission support in responding to the challenges presented during NASA field campaigns. Analysis during and after each campaign helped identify both issues and new requirements, and initiated the next wave of development. Currently the Waypoint Planning Tool has gone through three rounds of development and analysis processes. The development of this waypoint tool is directly affected by the technology advances on GIS/Mapping technologies. From the standalone Google Earth application and simple KML functionalities, to Google Earth Plugin and Java Web Start/Applet on web platform, and to the rising open source GIS tools with new JavaScript frameworks, the Waypoint Planning Tool has entered its third phase of technology advancement. The newly innovated, cross-platform, modular designed JavaScript-controlled Waypoint Tool is planned to be integrated with NASA Airborne Science Mission Tool Suite. Adapting new technologies for the Waypoint Planning Tool ensures its success in helping scientists reach their mission objectives. This presentation will discuss the development processes of the Waypoint Planning Tool in responding to field campaign challenges, identify new information technologies, and describe the capabilities and features of the Waypoint Planning Tool with the real time aspect, interactive nature, and the resultant benefits to the airborne science community.
Alchemy to reason: Effective use of Cumulative Effects Assessment in resource management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hegmann, George, E-mail: george.hegmann@stantec.com; Yarranton, G.A., E-mail: yarran@shaw.ca
2011-09-15
Cumulative Effects Assessment (CEA) is a tool that can be useful in making decisions about natural resource management and allocation. The decisions to be made include (i) those necessary to construct planning and regulatory frameworks to control development activity so that societal goals will be achieved and (ii) whether or not to approve individual development projects, with or without conditions. The evolution of CEA into a more successful tool cannot occur independently of the evolution of decision making processes. Currently progress is painfully slow on both fronts. This paper explores some opportunities to accelerate improvements in decision making in natural resource management and in the utility of CEA as a tool to assist in making such decisions. The focus of the paper is on how to define the public interest by determining what is acceptable.
NASA Astrophysics Data System (ADS)
Schneider, Robert; Haberl, Alexander; Rascher, Rolf
2017-06-01
The trend in the optics industry shows that it is increasingly important to be able to manufacture complex lens geometries with a high level of precision. Beyond a certain limit on the required shape accuracy of optical workpieces, processing changes from two-dimensional to point-shaped processing. It is very important that the process is as stable as possible during point-shaped processing. To ensure stability, usually only one process parameter is varied during processing; commonly this parameter is the feed rate, which corresponds to the dwell time. In the research project ArenA-FOi (Application-oriented analysis of resource-saving and energy-efficient design of industrial facilities for the optical industry), a contact-based procedure with point-shaped tool engagement is used, and it is examined closely whether varying several process parameters during processing is meaningful. The ADAPT tool in size R20 from Satisloh AG is used, which is commercially available. The behavior of the tool is tested under constant conditions in the MCP 250 CNC machine by OptoTech GmbH. A series of experiments should enable the TIF (tool influence function) to be determined using three variable parameters. Furthermore, the maximum error frequency that can be processed is calculated as an example for one parameter set and serves as an outlook for further investigations. The test results serve as the basis for the later removal simulation, which must be able to handle a variable TIF. This topic has already been successfully implemented in another research project of the Institute for Precision Manufacturing and High-Frequency Technology (IPH), and thus this algorithm can be reused. The next step is the useful implementation of the collected knowledge. The TIF must be selected on the basis of the measured data; it is important to know the error frequencies in order to select the optimal TIF. Thus, it is possible to compare the simulated results with real measurement data and to carry out a revision. From this point onward, it is possible to evaluate the potential of this approach, and in the ideal case it will be researched further and later find its way into production.
NASA Astrophysics Data System (ADS)
Adhitama, Egy; Fauzi, Ahmad
2018-05-01
In this study, a pendulum experimental tool with a light-based timer has been developed to measure the period of a simple pendulum. The obtained data were automatically recorded in an Excel spreadsheet. The intensity of monochromatic light, sensed by a 3DU5C phototransistor, changes dynamically as the pendulum swings. The changing intensity varies the resistance value, which is processed by the ATMega328 microcontroller to obtain the signal period as a function of time and brightness as the pendulum crosses the light beam. Through the experiment, using the calculated average periods, the gravitational acceleration value was determined accurately and precisely.
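The underlying computation is the standard small-angle pendulum relation T = 2π√(L/g), inverted to g = 4π²L/T². A minimal sketch with example numbers (not the study's measurements):

    import math

    # Recover g from averaged pendulum periods: T = 2*pi*sqrt(L/g)
    # =>  g = 4*pi^2 * L / T^2. Length and periods are example values.

    L = 0.50                                  # pendulum length, m (assumed)
    periods = [1.419, 1.421, 1.417, 1.420]    # timer readings, s (assumed)
    T = sum(periods) / len(periods)           # average measured period
    g = 4 * math.pi ** 2 * L / T ** 2
    print(f"g ≈ {g:.3f} m/s²")                # ≈ 9.80 m/s² for these inputs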
Risk Assessment as an Environmental Management Tool: Considerations for Freshwater Wetlands
A. Dennis Lemly
1997-01-01
This paper presents a foundation for improving the risk assessment process for freshwater wetlands. Integrating wetland science, i.e., use of an ecosystem-based approach, is the key concept. Each biotic and abiotic wetland component should be identified and its contribution to ecosystem functions and societal values determined when deciding whether a stressor poses an...
López-Bolaños, Lizbeth; Campos-Rivera, Marisol; Villanueva-Borbolla, María Ángeles
2018-01-01
Objective. To reflect on the process of committing to participation in the implementation of a health strategic plan, using Participative Systematization of Social Experiences as a tool. Our study was a qualitative research-intervention study, based on the Dialectical Methodological Conception approach. We designed and implemented a two-day workshop, six hours daily, using the Systematization methodology with a Community Work Group (CWG). During the workshop, women systematized their experience, with commitment as the axis of the process. Using Grounded Theory techniques, we applied micro-analysis to the data in order to identify and strengthen categories that emerged during the systematization process, completing open and axial coding. The CWG identified that commitment and participation are influenced by group dynamics and structural determinants. They also reconsidered the way they understood and exercised commitment and participation, and generated knowledge that empowered them to improve their future practice. Commitment and participation were determined by group dynamics and structural factors such as socioeconomic conditions and gender roles. These determinants must be made visible and understood in order to generate proposals aimed at strengthening the participation and organization of groups.
Liu, Yan; Shen, Yali; Zheng, Shasha; Liao, Jiayu
2015-12-01
SUMOylation (the process of adding SUMO [small ubiquitin-like modifier] to substrates) is an important post-translational modification of critical proteins in multiple processes. Sentrin/SUMO-specific proteases (SENPs) act as endopeptidases to process pre-SUMO or as isopeptidases to deconjugate SUMO from its substrate. Determining the kinetics of SENPs is important for understanding their activities. Förster resonance energy transfer (FRET) technology has been widely used in biomedical research and is a powerful tool for elucidating protein interactions. In this paper we report a novel quantitative FRET-based protease assay for SENP2 endopeptidase activity that accounts for the self-fluorescent emissions of the donor (CyPet) and the acceptor (YPet). The kinetic parameters k(cat), K(M), and catalytic efficiency (k(cat)/K(M)) of the SENP2 catalytic domain toward pre-SUMO1/2/3 were obtained by this novel design. Although we use SENP2 to demonstrate our method, the general principles of this quantitative FRET-based protease kinetic determination can be readily applied to other proteases.
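A hedged sketch of the kinetic analysis: fitting initial rates to the Michaelis-Menten equation to recover k(cat) and K(M). The enzyme concentration and rate data below are synthetic stand-ins for the FRET measurements, not the paper's results.

    import numpy as np
    from scipy.optimize import curve_fit

    E0 = 5e-9  # enzyme concentration, M (assumed)

    def mm_rate(S, kcat, Km):
        # Michaelis-Menten initial rate: v = kcat * [E]0 * [S] / (Km + [S])
        return kcat * E0 * S / (Km + S)

    S = np.array([0.1, 0.25, 0.5, 1.0, 2.0, 5.0]) * 1e-6   # pre-SUMO, M (synthetic)
    v = np.array([0.8, 1.7, 2.8, 4.0, 5.2, 6.3]) * 1e-9    # initial rate, M/s (synthetic)

    (kcat, Km), _ = curve_fit(mm_rate, S, v, p0=[2.0, 1e-6])
    print(f"kcat ≈ {kcat:.2f} 1/s, KM ≈ {Km:.2e} M, kcat/KM ≈ {kcat/Km:.2e} 1/(M·s)")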
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, David; Bucknor, Matthew; Jerden, James
2016-02-01
The overall objective of the SFR Regulatory Technology Development Plan (RTDP) effort is to identify and address potential impediments to the SFR regulatory licensing process. In FY14, an analysis by Argonne identified the development of an SFR-specific MST methodology as an existing licensing gap with high regulatory importance and a potentially long lead-time to closure. This work was followed by an initial examination of the current state-of-knowledge regarding SFR source term development (ANL-ART-3), which reported several potential gaps. Among these were the potential inadequacies of current computational tools to properly model and assess the transport and retention of radionuclides during a metal fuel pool-type SFR core damage incident. The objective of the current work is to determine the adequacy of existing computational tools, and the associated knowledge database, for the calculation of an SFR MST. To accomplish this task, a trial MST calculation will be performed using available computational tools to establish their limitations with regard to relevant radionuclide release/retention/transport phenomena. The application of existing modeling tools will provide a definitive test to assess their suitability for an SFR MST calculation, while also identifying potential gaps in the current knowledge base and providing insight into open issues regarding regulatory criteria/requirements. The findings of this analysis will assist in determining future research and development needs.
NASA Astrophysics Data System (ADS)
Bhat, C.; Dix, B.; Choate, A.; Wong, A.; Asam, S.; Schultz, P. A.
2016-12-01
Policy makers can implement more effective climate change adaptation programs if they are provided with two tools: accessible information on the impacts that they need to prepare for, and clear guidance on how to integrate climate change considerations into their work. This presentation will highlight recent and ongoing efforts at the City of Philadelphia to integrate climate science into their decision-making. These efforts include developing a climate change information visualization tool, climate change risk assessments across the city, and processes to integrate climate change into routine planning and budgeting practices. The goal of these efforts is to make climate change science highly targeted to decision maker needs, non-political, easily accessible, and actionable. While sea level rise inundation maps have been available to communities for years, the maps do not effectively communicate how the design of a building or a piece of infrastructure would need to be modified to protect it. The Philadelphia Flood Risk Viewer is an interactive planning tool that allows Philadelphia to identify projected depths of flooding for any location within the City, for a variety of sea level rise and storm surge scenarios. Users can also determine whether a location is located in a FEMA floodplain. By having access to information on the projected depth of flooding at a given location, the City can determine what flood protection measures may be effective, or even inform the long-term viability of developing a particular area. With an understanding of climate vulnerabilities, cities have the opportunity to make smart, climate-resilient investments with their capital budgets that will yield multiple benefits for years to come. Few, however, have established protocols for doing so. Philadelphia, with support from ICF, developed a guidance document that identifies recommendations for integrating climate change considerations throughout the Capital Program and capital budgeting process. For each recommendation, the guidance also provides supplemental resources and information to make the recommendations actionable. Philadelphia is applying the guidance in their FY 2017 capital planning activities and taking advantage of opportunities to grow stronger in the face of climate change.
The Way Point Planning Tool: Real Time Flight Planning for Airborne Science
NASA Technical Reports Server (NTRS)
He, Yubin; Blakeslee, Richard; Goodman, Michael; Hall, John
2012-01-01
Airborne real time observations are a major component of NASA's Earth Science research and satellite ground validation studies. For mission scientists, planning a research aircraft mission within the context of meeting the science objectives is a complex task because it requires real time situational awareness of the weather conditions that affect the aircraft track. Multiple aircraft are often involved in NASA field campaigns; the coordination of the aircraft with satellite overpasses, other airplanes and the constantly evolving dynamic weather conditions often determines the success of the campaign. A flight planning tool is needed to provide situational awareness information to the mission scientists and help them plan and modify the flight tracks successfully. Scientists at the University of Alabama in Huntsville and the NASA Marshall Space Flight Center developed the Waypoint Planning Tool (WPT), an interactive software tool that enables scientists to develop their own flight plans (also known as waypoints), with point-and-click mouse capabilities on a digital map filled with real time raster and vector data. The development of this Waypoint Planning Tool demonstrates the significance of mission support in responding to the challenges presented during NASA field campaigns. Analyses during and after each campaign helped identify both issues and new requirements, initiating the next wave of development. Currently the Waypoint Planning Tool has gone through three rounds of development and analysis processes. The development of this waypoint tool is directly affected by technology advances in GIS/mapping technologies. From the standalone Google Earth application and simple KML functionalities, to the Google Earth Plugin and Java Web Start/Applet on the web platform, as well as to the rising open source GIS tools with new JavaScript frameworks, the Waypoint Planning Tool has entered its third phase of technology advancement. The newly innovated, cross-platform, modular designed JavaScript-controlled Waypoint Tool is planned to be integrated with the NASA Airborne Science Mission Tool Suite. Adapting new technologies for the Waypoint Planning Tool ensures its success in helping scientists reach their mission objectives. This presentation will discuss the development process of the Waypoint Planning Tool in responding to field campaign challenges, identify new information technologies, and describe the capabilities and features of the Waypoint Planning Tool with its real time aspect, interactive nature, and the resultant benefits to the airborne science community.
Category-selective attention modulates unconscious processes in the middle occipital gyrus.
Tu, Shen; Qiu, Jiang; Martens, Ulla; Zhang, Qinglin
2013-06-01
Many studies have revealed top-down modulation (spatial attention, attentional load, etc.) of unconscious processing. However, there is little research on how category-selective attention modulates unconscious processing. In the present study, using functional magnetic resonance imaging (fMRI), the results showed that category-selective attention modulated unconscious face/tool processing in the middle occipital gyrus (MOG). Interestingly, the MOG effects were of opposite direction for face and tool processes. During unconscious face processing, activation in MOG decreased under face-selective attention compared with tool-selective attention, in line with the predictive coding theory. During unconscious tool processing, however, activation in MOG increased under tool-selective attention compared with face-selective attention. The different effects might be ascribed to an interaction between top-down category-selective processes and bottom-up processes at the partial awareness level, as proposed by Kouider, De Gardelle, Sackur, and Dupoux (2010). Specifically, we propose an "excessive activation" hypothesis. Copyright © 2013 Elsevier Inc. All rights reserved.
Design and fabrication of a freeform phase plate for high-order ocular aberration correction
NASA Astrophysics Data System (ADS)
Yi, Allen Y.; Raasch, Thomas W.
2005-11-01
In recent years it has become possible to measure and in some instances to correct the high-order aberrations of human eyes. We have investigated the correction of wavefront error of human eyes by using phase plates designed to compensate for that error. The wavefront aberrations of the four eyes of two subjects were experimentally determined, and compensating phase plates were machined with an ultraprecision diamond-turning machine equipped with four independent axes. A slow-tool servo freeform trajectory was developed for the machine tool path. The machined phase-correction plates were measured and compared with the original design values to validate the process. The position of the phase-plate relative to the pupil is discussed. The practical utility of this mode of aberration correction was investigated with visual acuity testing. The results are consistent with the potential benefit of aberration correction but also underscore the critical positioning requirements of this mode of aberration correction. This process is described in detail from optical measurements, through machining process design and development, to final results.
Data Processing on Database Management Systems with Fuzzy Query
NASA Astrophysics Data System (ADS)
Şimşek, Irfan; Topuz, Vedat
In this study, a fuzzy query tool (SQLf) for non-fuzzy database management systems was developed, and sample fuzzy queries were run on real data using the tool. The performance of SQLf was tested with data on Marmara University students' food grants. The food grant data were collected in a MySQL database through a form filled in on the web, in which students described their social and economic conditions for the food grant request. The form consists of questions with both fuzzy and crisp answers. The main purpose of the fuzzy query is to determine the students who deserve the grant; SQLf easily found the eligible students through predefined fuzzy values. The fuzzy query tool could be used just as easily with other database systems such as Oracle and SQL Server.
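As a minimal sketch of the idea behind such a tool (not the SQLf implementation itself), the code below scores rows returned from an ordinary SQL database with a trapezoidal-style fuzzy membership function for "low income" and filters on a membership threshold; the database file, table, column names, and membership breakpoints are assumptions for the example.

```python
import sqlite3

def low_income(x, c=1500.0, d=3000.0):
    """Membership in 'low income'; breakpoints c, d are illustrative."""
    if x <= c:
        return 1.0
    if x >= d:
        return 0.0
    return (d - x) / (d - c)

con = sqlite3.connect("grants.db")   # hypothetical database file
rows = con.execute("SELECT name, income FROM students").fetchall()

# Keep students whose membership in 'low income' exceeds a crisp threshold,
# mimicking a fuzzy WHERE clause evaluated on top of an ordinary SQL result.
eligible = [(name, low_income(income)) for name, income in rows
            if low_income(income) > 0.7]
```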
Challa, Shruthi; Potumarthi, Ravichandra
2013-01-01
Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products in order to maintain critical quality attributes and build quality into the product. PAT can be successfully implemented in the pharmaceutical and biopharmaceutical industries not only to impart quality into products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct, non-destructive sampling. However, to successfully adapt PAT tools to the pharmaceutical and biopharmaceutical environment, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data sets generated by PAT instruments. Chemometrics is a chemical discipline that incorporates both statistical and mathematical methods to extract and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods are discussed, along with their advantages and working principles. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is represented diagrammatically.
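Partial least squares (PLS) regression is one of the chemometric methods commonly paired with spectral PAT tools. The sketch below fits a PLS model relating spectra to a quality attribute using scikit-learn; the data here are random stand-ins, not real spectra, and the number of latent variables is chosen arbitrarily for the example.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 200))        # 50 simulated spectra, 200 wavelengths
y = X[:, 40] * 0.8 + rng.normal(scale=0.1, size=50)  # synthetic attribute

pls = PLSRegression(n_components=3)   # latent variables, illustrative choice
pls.fit(X, y)
y_hat = pls.predict(X[:5])            # predict the attribute for new spectra
```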
Analytic hierarchy process (AHP) as a tool in asset allocation
NASA Astrophysics Data System (ADS)
Zainol Abidin, Siti Nazifah; Mohd Jaffar, Maheran
2013-04-01
Allocating capital investment across different assets is the best way to balance risk and reward, and can prevent the loss of large amounts of money. The aim of this paper is thus to help investors make wise asset allocation decisions. The paper proposes modifying and adapting the Analytic Hierarchy Process (AHP) model, which is widely used in decision-making across various fields of study. The results of the case studies show that the proposed model can categorize stocks and determine the portion of capital to invest in each. Hence, it can assist investors in the decision-making process and reduce the risk of loss in stock market investment.
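At the core of AHP is the derivation of priority weights from a pairwise comparison matrix via its principal eigenvector, together with a consistency check. The sketch below does this for a hypothetical three-criterion comparison; the matrix entries are invented, and the random index RI = 0.58 for n = 3 is Saaty's standard value.

```python
import numpy as np

# Hypothetical pairwise comparisons of three criteria (Saaty's 1-9 scale).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                 # priority weights, sum to 1

lam_max = eigvals[k].real
n = A.shape[0]
CI = (lam_max - n) / (n - 1)                 # consistency index
CR = CI / 0.58                               # RI = 0.58 for n = 3
print(w, CR)                                 # CR < 0.1 => acceptably consistent
```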
Stamp forming optimization for formability and crystallinity
NASA Astrophysics Data System (ADS)
Donderwinkel, T. G.; Rietman, B.; Haanappel, S. P.; Akkerman, R.
2016-10-01
The stamp forming process is well suited for high-volume production of thermoplastic composite parts. The process is highly non-isothermal, as it involves local quench-cooling of a molten thermoplastic composite blank where it makes contact with colder tooling. The formability of the thermoplastic composite depends on the viscoelastic behavior of the matrix material, which is sensitive to temperature and degree of crystallinity. An experimental study was performed to determine the effect of temperature and crystallinity on the storage modulus during cooling for a woven glass fiber polyamide-6 composite. An increase of two decades in modulus was observed during crystallization. As this significantly impedes blank formability, the onset of crystallization effectively governs the time available for forming. Besides the experimental work, a numerical model was developed to study the temperature and crystallinity throughout the stamp forming process. A process window can be determined by feeding the model with the experimentally obtained crystallization data.
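Non-isothermal crystallization during quench-cooling is often described with Nakamura-type kinetics. The sketch below integrates a simple Nakamura model for relative crystallinity during a constant-rate quench to estimate when a crystallinity threshold (and thus the end of the forming window) is reached; the rate-constant shape, Avrami exponent, cooling rate, and threshold are illustrative assumptions, not the paper's fitted values.

```python
import numpy as np

def K(T):
    """Illustrative Gaussian rate constant peaking at 190 C (assumed, 1/s)."""
    return 0.5 * np.exp(-((T - 190.0) / 25.0) ** 2)

n_avr = 3.0                 # assumed Avrami exponent
T, alpha = 280.0, 1e-6      # start: melt temperature (C), tiny seed crystallinity
cool_rate, dt = 50.0, 0.001 # quench at 50 C/s, Euler time step (s)

t = 0.0
while alpha < 0.05 and T > 60.0:   # 5% relative crystallinity ~ forming limit
    # Nakamura kinetics: da/dt = n*K(T)*(1-a)*(-ln(1-a))^((n-1)/n)
    dadt = n_avr * K(T) * (1 - alpha) * (-np.log(1 - alpha)) ** ((n_avr - 1) / n_avr)
    alpha += dadt * dt
    T -= cool_rate * dt
    t += dt

print(f"forming window ~ {t:.2f} s (T = {T:.0f} C at threshold)")
```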
NASA Astrophysics Data System (ADS)
Kraft, M.; Bürgel, U.
2017-09-01
Modern press shops in the automotive industry have to deal with many challenges, one of which is achieving consistent part quality. To reach this target, modern press systems and tools are equipped with several types of sensors, for example, sensors to measure characteristic values of the blanks or to measure the temperature in the tools; often several sensors are used simultaneously. A significant parameter for determining the quality of drawn panels is the draw-in amount. Previously, the draw-in amount could only be measured at selected points, using sensors in the tools, and all known sensors have disadvantages: for example, they are subject to wear or susceptible to contamination. In this paper, a sensor system is introduced that allows the measurement of the global draw-in amount of a drawn panel. Here, the draw-in amount is not measured in the draw die but during the transportation of the part to the following operation. Within the short transport time, the part can be fully covered by an optical system. This leads to a multitude of advantages compared with previously known systems. For example, it is no longer necessary to equip every tool with sensor technology to measure the draw-in amount; it is sufficient to equip every press line with a single measurement system. This not only lowers costs but also simplifies tool design, and the risk of contamination of the sensor system is greatly reduced. The paper also introduces an actuator built to locally vary the blankholder forces in a sheet metal forming process, together with an FEM model that allows the determination of the effective range of these actuators. Based on the FEM simulation, an approach for open-loop control is presented. With this approach, the press shops at Opel are developing a control procedure to positively influence the stamping process.
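The global draw-in can be expressed as the local displacement of the blank edge relative to a reference outline. As a minimal geometric sketch (the optical acquisition itself is out of scope here), the code below computes, for each point of a measured edge polyline, the distance to the nearest point of the reference outline; all coordinates are invented for the example.

```python
import numpy as np

# Hypothetical blank outlines in mm: reference (before drawing) and measured edge.
reference = np.array([[0, 0], [100, 0], [100, 60], [0, 60]], dtype=float)
measured = np.array([[4, 3], [96, 4], [95, 57], [5, 56]], dtype=float)

# Draw-in at each measured edge point = distance to nearest reference point.
# (A dense sampling of both outlines would be used in practice.)
d = np.linalg.norm(measured[:, None, :] - reference[None, :, :], axis=2)
draw_in = d.min(axis=1)
print(draw_in)          # per-point draw-in amounts in mm
```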
NASA Astrophysics Data System (ADS)
Lin, S. Y.; Chung, C. T.; Cheng, Y. Y.
2011-01-01
The main objective of this study is to develop a thermo-elastic-plastic coupled model, under large deformation, for the machining of Inconel 718 alloy with a combination of ultrasonically assisted cutting and cryogenic cooling. The extent of improvement in cutting performance and tool life can be examined from this investigation. The critical value of the strain energy density of the workpiece will be utilized as the criterion for chip separation and discontinuous chip segmentation. Forced convection cooling and a hydrodynamic lubrication model will be considered and formulated in the model, and the finite element method will be applied to create a complete numerical solution for this ultrasonic vibration cutting model. During the analysis, the cutting tool is incrementally advanced with superimposed ultrasonic vibration in a back-and-forth, step-by-step manner, from the incipient stage of tool-workpiece engagement to a steady state of chip formation; a whole simulation of the orthogonal cutting process under plane-strain deformation is thus undertaken. The fluctuation of the shear angle induced by the high shear strength, the high shear strain rate, the variation of chip types and chip morphology, the variation of tool-chip contact length, the temperature distributions within the workpiece, chip, and tool, and the periodic fluctuation of the cutting forces can all be determined from the developed model. A complete comparison of machining characteristics between different combinations of ultrasonically assisted cutting and cryogenic cooling and conventional cutting can be acquired. Finally, high-speed turning experiments on Inconel 718 alloy will be conducted in the laboratory to validate the accuracy of the model; progressive flank wear, crater wear, notching, and chipping of the tool edge will also be measured.
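In ultrasonically assisted cutting the tool tip follows the nominal feed plus a superimposed oscillation, x(t) = v t + A sin(2π f t), and the tool periodically separates from the chip only when the peak vibratory speed 2π f A exceeds the cutting speed v. The sketch below checks this condition and tabulates the trajectory; the amplitude, frequency, and cutting speed are assumptions, not the study's values.

```python
import numpy as np

A = 15e-6          # vibration amplitude (m), assumed
f = 20e3           # vibration frequency (Hz), assumed
v = 0.5            # cutting speed (m/s), assumed

v_peak = 2 * np.pi * f * A        # peak vibratory speed of the tool tip
separates = v_peak > v            # tool-chip separation occurs only if True
print(f"peak vibration speed {v_peak:.2f} m/s -> separation: {separates}")

t = np.linspace(0, 3 / f, 300)    # three vibration cycles
x_tool = v * t + A * np.sin(2 * np.pi * f * t)   # tool tip position (m)
```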
Rosenbaum, Benjamin P; Silkin, Nikolay; Miller, Randolph A
2014-01-01
Real-time alerting systems typically warn providers about abnormal laboratory results or medication interactions. For more complex tasks, institutions create site-wide 'data warehouses' to support quality audits and longitudinal research. Sophisticated systems like i2b2 or Stanford's STRIDE utilize data warehouses to identify cohorts for research and quality monitoring. However, substantial resources are required to install and maintain such systems. For more modest goals, an organization desiring merely to identify patients with 'isolation' orders, or to determine patients' eligibility for clinical trials, may adopt a simpler, limited approach based on processing the output of one clinical system, and not a data warehouse. We describe a limited, order-entry-based, real-time 'pick off' tool, utilizing public domain software (PHP, MySQL). Through a web interface the tool assists users in constructing complex order-related queries and auto-generates corresponding database queries that can be executed at recurring intervals. We describe successful application of the tool for research and quality monitoring.
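As a minimal sketch of this "pick off" pattern (the original tool was built in PHP/MySQL; Python with SQLite stands in here so the example is self-contained), the code below re-runs a stored order-related query at a fixed interval and reports matching patients; the schema and the isolation-order criterion are invented for illustration.

```python
import sqlite3
import time

QUERY = """
SELECT patient_id, order_text
FROM orders
WHERE order_text LIKE '%isolation%'      -- invented criterion for the sketch
  AND entered_at > :since
"""

def poll(db_path="orders.db", interval_s=300):
    """Re-run the stored query every `interval_s` seconds (a simple scheduler)."""
    since = "1970-01-01 00:00:00"
    while True:
        con = sqlite3.connect(db_path)
        for patient_id, text in con.execute(QUERY, {"since": since}):
            print(f"match: patient {patient_id}: {text}")
        con.close()
        since = time.strftime("%Y-%m-%d %H:%M:%S")
        time.sleep(interval_s)
```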
Development and Testing of the Church Environment Audit Tool.
Kaczynski, Andrew T; Jake-Schoffman, Danielle E; Peters, Nathan A; Dunn, Caroline G; Wilcox, Sara; Forthofer, Melinda
2018-05-01
In this paper, we describe development and reliability testing of a novel tool to evaluate the physical environment of faith-based settings pertaining to opportunities for physical activity (PA) and healthy eating (HE). Tool development was a multistage process including a review of similar tools, stakeholder review, expert feedback, and pilot testing. Final tool sections included indoor opportunities for PA, outdoor opportunities for PA, food preparation equipment, kitchen type, food for purchase, beverages for purchase, and media. Two independent audits were completed at 54 churches. Interrater reliability (IRR) was determined with Kappa and percent agreement. Of 218 items, 102 were assessed for IRR and 116 could not be assessed because they were not present at enough churches. Percent agreement for all 102 items was over 80%. For 42 items, the sample was too homogeneous to assess Kappa. Forty-six of the remaining items had Kappas greater than 0.60 (25 items 0.80-1.00; 21 items 0.60-0.79), indicating substantial to almost perfect agreement. The tool proved reliable and efficient for assessing church environments and identifying potential intervention points. Future work can focus on applications within faith-based partnerships to understand how church environments influence diverse health outcomes.
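Interrater reliability for items like these is computed per item from the two raters' codes. The sketch below computes percent agreement and Cohen's kappa for one hypothetical dichotomous item across churches using scikit-learn; the ratings are invented for the example.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical presence/absence ratings for one item at 12 churches.
rater_a = np.array([1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 0, 1])
rater_b = np.array([1, 0, 1, 0, 0, 0, 1, 1, 0, 1, 1, 1])

agreement = np.mean(rater_a == rater_b) * 100   # percent agreement
kappa = cohen_kappa_score(rater_a, rater_b)     # chance-corrected agreement
print(f"agreement {agreement:.0f}%, kappa {kappa:.2f}")
```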
Three Dimensional Transient Turbulent Simulations of Scramjet Fuel Injection and Combustion
NASA Astrophysics Data System (ADS)
Bahbaz, Marwane
2011-11-01
A scramjet is a propulsion system that is effective for hypersonic flight (M > 5). The main objective of the simulation is to understand both the mixing and the combustion of air flow with hydrogen fuel in a high-speed environment. This understanding is used to determine the number of fuel injectors required to increase combustion efficiency and energy transfer. Due to the complexity of the simulation, multiple software tools are used. First, SolidWorks is used to draw a scramjet combustor with accurate dimensions. The second tool is Gambit, which is used to create several types of meshes for the combustor. Finally, OpenFOAM and CFD++ are used to process and post-process the scramjet combustor. The simulation is divided into two categories. The cold-flow category is a series of simulations of subsonic and supersonic turbulent air flow across the combustor channel with fuel injection from one or more injectors. The second category comprises the combustion simulations, which involve fluid flow and fuel mixing with ignition. The simulation and modeling of the scramjet combustor will help investigate and understand the combustion process and energy transfer in a hypersonic environment.
Feasibility study tool for semi-rigid joints design of high-rise buildings steel structures
NASA Astrophysics Data System (ADS)
Bagautdinov, Ruslan; Monastireva, Daria; Bodak, Irina; Potapova, Irina
2018-03-01
There are many ways to evaluate the final cost of high-rise building structures and to determine which design variants are the most effective from different points of view. Research by Jaakko Haapio at Tampere University of Technology aims to develop a method for determining the manufacturing and installation costs of steel structures already at the tender phase, while taking their details into account. This paper analyzes the Feature-Based Costing Method for skeletal steel structures proposed by Jaakko Haapio and derives the most appropriate ways to improve the tool and to implement it under Russian conditions for high-rise building design. The presented tool can be useful not only for designers but also for steel structure manufacturing organizations, helping them utilize BIM technologies in process organization and factory-floor control.
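Feature-based costing prices a steel assembly by summing the cost of each manufacturing feature (material, cutting, holes, welds, coating, and so on). The sketch below shows the idea with a small dictionary of invented unit rates; the features and flat rates are placeholders, not Haapio's actual workstation-based cost functions.

```python
# Illustrative unit rates (EUR); Haapio's method uses detailed, time-based
# cost functions per workstation, so these flat rates are a simplification.
RATES = {
    "material_per_kg": 1.2,
    "cut_per_m": 4.0,
    "hole_each": 1.5,
    "weld_per_m": 18.0,
    "paint_per_m2": 7.0,
}

def beam_cost(mass_kg, cut_m, holes, weld_m, paint_m2):
    """Sum feature costs for one member at the tender phase."""
    return (mass_kg * RATES["material_per_kg"]
            + cut_m * RATES["cut_per_m"]
            + holes * RATES["hole_each"]
            + weld_m * RATES["weld_per_m"]
            + paint_m2 * RATES["paint_per_m2"])

print(beam_cost(mass_kg=420, cut_m=1.2, holes=8, weld_m=0.6, paint_m2=5.4))
```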
Forecasting municipal solid waste generation using prognostic tools and regression analysis.
Ghinea, Cristina; Drăgoi, Elena Niculina; Comăniţă, Elena-Diana; Gavrilescu, Marius; Câmpean, Teofil; Curteanu, Silvia; Gavrilescu, Maria
2016-11-01
For adequate planning of waste management systems, accurate forecasting of waste generation is an essential step, since various factors can affect waste trends. Predictive and prognostic models are useful tools and a reliable support for decision-making processes. In this paper, indicators such as number of residents, population age, urban life expectancy, and total municipal solid waste were used as input variables in prognostic models to predict the amounts of solid waste fractions. We applied the Waste Prognostic Tool, regression analysis, and time series analysis to forecast municipal solid waste generation and composition for the case study of Iasi, Romania. Regression equations were determined for six solid waste fractions (paper, plastic, metal, glass, biodegradable, and other waste). Accuracy measures were calculated, and the results showed that the S-curve trend model is the most suitable for municipal solid waste (MSW) prediction. Copyright © 2016 Elsevier Ltd. All rights reserved.
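An S-curve trend model is typically a logistic (or Gompertz) curve fitted to the historical series. The sketch below fits a logistic curve to an invented yearly MSW series with SciPy and extrapolates it a few years ahead; the data and the starting guesses are illustrative, not the Iasi figures.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, L, k, t0):
    """S-curve: saturation level L, growth rate k, midpoint year t0."""
    return L / (1 + np.exp(-k * (t - t0)))

years = np.arange(2000, 2015)
msw = np.array([15, 19, 25, 32, 41, 52, 63, 74, 84, 92,   # invented series,
                98, 103, 106, 108, 110], dtype=float)     # kt/year

p0 = [115.0, 0.35, 2006.0]                   # rough starting guesses
(L, k, t0), _ = curve_fit(logistic, years, msw, p0=p0)

forecast = logistic(np.arange(2015, 2020), L, k, t0)
print(forecast)          # projected MSW generation, kt/year
```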
Communicators' perspective on snow avalanche risk communication
NASA Astrophysics Data System (ADS)
Charriere, M. K. M.; Bogaard, T.; Mostert, E.
2014-12-01
Among natural hazards, snow avalanches are the only one for which a public danger scale is used globally. It consists of five danger levels, each displayed with a given number and colour and accompanied by behavioural advice. Even though the scale is standardized in most countries affected by this hazard, the tools (usually websites or smartphone applications) with which the information is disseminated to the general public differ, particularly in terms of target audience and level of detail. This study gathers the perspectives of several communicators responsible for these communication practices. The survey was created to assess how and why choices were made in the design of the communication tools and to determine how their effectiveness is evaluated. Along with a review of existing avalanche risk communication tools, this study provides guidelines for communication and for the evaluation of its effectiveness.
The X-windows interactive navigation data editor
NASA Technical Reports Server (NTRS)
Rinker, G. C.
1992-01-01
A new computer program called the X-Windows Interactive Data Editor (XIDE) was developed and demonstrated as a prototype application for editing radio metric data in the orbit-determination process. The program runs on a variety of workstations and employs pull-down menus and graphical displays, which allow users to easily inspect and edit radio metric data in the orbit data files received from the Deep Space Network (DSN). The XIDE program is based on the Open Software Foundation OSF/Motif Graphical User Interface (GUI) and has proven to be an efficient tool for editing radio metric data in the navigation operations environment. It was adopted by the Magellan Navigation Team as their primary data-editing tool. Because the software was designed from the beginning to be portable, the prototype was successfully moved to new workstation environments. It was also integrated into the design of the next-generation software tool for DSN multimission navigation interactive launch support.
NASA Astrophysics Data System (ADS)
Hunt, M. J.; Nuttle, W. K.; Cosby, B. J.; Marshall, F. E.
2005-05-01
Establishing minimum flow requirements in aquatic ecosystems is one way to stipulate controls on water withdrawals in a watershed. The basis of the determination is identifying the amount of flow needed to sustain a threshold ecological function. To develop minimum flow criteria, an understanding of ecological response in relation to flow is essential. Several steps are needed: (1) identification of important resources and ecological functions, (2) compilation of available information, (3) determination of historical conditions, (4) establishment of technical relationships between inflow and resources, and (5) identification of numeric criteria that reflect the threshold at which resources are harmed. The process is interdisciplinary, requiring the integration of hydrologic and ecological principles with quantitative assessments. The tools used quantify the ecological response, and key questions about how the quantity of flow influences the ecosystem are examined by comparing minimum flow determinations in two different aquatic systems in South Florida, each characterized by substantial hydrologic alteration. The first, the Caloosahatchee River, is a riverine system located on the southwest coast of Florida. The second, the Everglades-Florida Bay ecotone, is a wetland mangrove ecosystem located on the southern tip of the Florida peninsula. In both cases, freshwater submerged aquatic vegetation (Vallisneria americana or Ruppia maritima) located in areas of the saltwater-freshwater interface has been identified as a basis for minimum flow criteria. The integration of field studies, laboratory studies, and literature review was required. From this information we developed ecological modeling tools to quantify and predict plant growth in response to varying environmental variables; coupled with hydrologic modeling tools, questions relating to the quantity and timing of flow and the ecological consequences in relation to normal variability are addressed.
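A common way to couple such vegetation response to flow is a growth model whose rate depends on salinity, which in turn depends on freshwater inflow. The sketch below implements an illustrative logistic growth model with a Gaussian salinity response; all coefficients and the inflow-salinity relation are invented, not the study's calibrated values.

```python
import numpy as np

def salinity(inflow_m3s):
    """Illustrative inverse relation: more inflow pushes salinity down (psu)."""
    return max(0.0, 30.0 - 2.5 * inflow_m3s)

def grow(biomass, sal, r_max=0.08, K=500.0, sal_opt=5.0, sal_tol=8.0):
    """One daily logistic growth step, scaled by a Gaussian salinity response."""
    f_sal = np.exp(-((sal - sal_opt) / sal_tol) ** 2)
    return biomass + r_max * f_sal * biomass * (1 - biomass / K)

b = 50.0                                   # initial biomass (g/m^2), invented
for inflow in [10, 8, 6, 4, 2, 1, 1, 1]:   # a declining-inflow scenario
    b = grow(b, salinity(inflow))
print(f"biomass after scenario: {b:.1f} g/m^2")
```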
Risk Management Implementation Tool
NASA Technical Reports Server (NTRS)
Wright, Shayla L.
2004-01-01
Continuous Risk Management (CRM) is a software engineering practice with processes, methods, and tools for managing risk in a project. It provides a controlled environment for practical decision making: continually assessing what could go wrong, determining which risks are important to deal with, implementing strategies to deal with those risks, and measuring the effectiveness of the implemented strategies. Continuous Risk Management provides many training workshops and courses to teach staff how to apply risk management to their various experiments and projects. The steps of the CRM process are identification, analysis, planning, tracking, and control; with these steps and the methods and tools that go along with them, identifying and dealing with risk is clear-cut. The office I worked in was the Risk Management Office (RMO). The RMO at NASA works hard to uphold NASA's mission of exploration and advancement of scientific knowledge and technology by defining and reducing program risk. The RMO is one of the divisions that fall under the Safety and Assurance Directorate (SAAD). I worked under Cynthia Calhoun, Flight Software Systems Engineer. My task was to develop a help screen for the Continuous Risk Management Implementation Tool (RMIT). The RMIT will be used by many NASA managers to identify, analyze, track, control, and communicate risks in their programs and projects, and will provide a means for NASA to continuously assess risks. The goals and purposes of this tool are to provide a simple means to manage risk, to be used by program and project managers throughout NASA, and to support an aggressive approach to advertising and advocating the use of RMIT at each NASA center.
Reduction of Defects in Germanium-Silicon
NASA Technical Reports Server (NTRS)
Szofran, F. R.; Benz, K. W.; Cobb, S. D.; Croell, A.; Dold, P.; Kaiser, N.; Motakel, S.; Walker, J. S.
2000-01-01
Crystals grown without contact with a container have far superior quality to otherwise similar crystals grown in direct contact with a container. In addition to float-zone processing, detached-Bridgman growth is a promising tool for improving crystal quality without the limitations of float zoning. Detached growth has been found to occur frequently during microgravity experiments, and considerable improvements in crystal quality have been reported for those cases. However, no thorough understanding of the process or quantitative assessment of the quality improvements exists so far. This project is determining the means to reproducibly grow Ge-Si alloys in the detached mode.
García-Hernández, M-Noelia; Fraga-Hernández, Ma Elena; Mahtani-Chugani, Vinita
2014-12-01
To determine, from the health care professionals' perspective, the impact on clinical practice of incorporating an assessment (triage) tool in primary care paediatric emergency. This was a qualitative study based on the collection of written documents: twenty-four extensive and detailed documents were collected and analysed thematically. Participants were 9 nurses and 7 paediatricians, all with experience in the paediatric emergency department. The results are grouped into three areas: perception of the previous situation, perceived benefits, and difficulties of the change process related to the triage instrument. The perceived benefits include the achievement of the objectives related to triage as well as collateral benefits for the organization and distribution of structural resources, adequacy of human resources, self-assessment and professional recognition, improvement of team communication, and users' perception of the service. The difficulties identified relate to the feasibility of using the instrument when patient flow is high and to the need for specialized training. All participants perceived more benefits than disadvantages, and both nurses and paediatricians experienced the process positively. The introduction of the assessment tool had a broader impact than expected.
Influence of Process Parameters on the Process Efficiency in Laser Metal Deposition Welding
NASA Astrophysics Data System (ADS)
Güpner, Michael; Patschger, Andreas; Bliedtner, Jens
Conventionally manufactured tools are often constructed entirely of a high-alloyed, expensive tool steel. An alternative is to combine a cost-efficient mild steel with a functional coating in the interaction zone of the tool. Thermal processing methods such as laser metal deposition are always accompanied by thermal distortion, and the resistance to thermal distortion decreases as the material thickness is reduced. As a consequence, laser-based coating of thin parts or tools requires special process management. The experimental approach in the present paper is to keep the energy and mass per unit length constant while varying the laser power, the feed rate, and the powder mass flow. The typical seam parameters are measured in order to characterize the cladding process, define process limits, and evaluate the process efficiency. Ways to optimize dilution, angular distortion, and clad height are presented.
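Keeping the energy and mass per unit length constant means scaling laser power P and powder mass flow proportionally with the feed rate v, since the line energy is E_l = P/v and the line mass is m_l = ṁ/v. The sketch below generates such a parameter series for illustrative baseline values; the numbers are assumptions, not the paper's settings.

```python
# Baseline (assumed): 1000 W at 10 mm/s with 10 g/min powder feed.
E_l = 1000.0 / 10.0      # line energy, J/mm (P / v)
m_l = 10.0 / 10.0        # line mass, (g/min) per (mm/s) (m_dot / v)

for v in [5.0, 10.0, 15.0, 20.0]:        # feed rates in mm/s
    P = E_l * v                          # laser power (W), keeping E_l constant
    m_dot = m_l * v                      # powder flow (g/min), keeping m_l constant
    print(f"v={v:5.1f} mm/s  ->  P={P:6.0f} W, powder={m_dot:4.1f} g/min")
```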
An automated performance budget estimator: a process for use in instrumentation
NASA Astrophysics Data System (ADS)
Laporte, Philippe; Schnetler, Hermine; Rees, Phil
2016-08-01
Present-day astronomy projects continue to increase in size and complexity, regardless of wavelength domain, while risks in terms of safety, cost, and operability have to be reduced to ensure an affordable total cost of ownership. All of these drivers have to be considered carefully during the development of an astronomy project, at the same time as there is a strong drive to shorten the development life cycle. From the systems engineering point of view, this evolution is a significant challenge. Big instruments imply the management of interfaces within large consortia and tight design-phase schedules, which necessitate efficient and rapid interactions between all stakeholders to ensure, firstly, that the system is defined correctly and, secondly, that the designs will meet all the requirements. It is essential that team members respond quickly so that the time available to the design team is maximized. In this context, performance prediction tools can be very helpful during the concept phase of a project in selecting the best design solution. In the first section of this paper we present the development of such a prediction tool, which can be used by the systems engineer to determine the overall performance of the system and to evaluate the impact on the science of the proposed design. The tool can also be used in "what-if" design analyses to assess the impact of design choices on the overall performance of the system. Having such a tool available from the beginning of a project allows for a faster turn-around, firstly between the design engineers and the systems engineer and secondly between the systems engineer and the instrument scientist. We then describe the process for constructing a performance estimator tool and present three projects in which such a tool has been utilized in astronomy: EAGLE, one of the European Extremely Large Telescope (E-ELT) Multi-Object Spectrograph (MOS) instruments, studied from 2007 to 2009; the Multi-Object Optical and Near-Infrared Spectrograph (MOONS) for the European Southern Observatory's Very Large Telescope (VLT), currently under development; and SST-GATE.
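A typical instrument performance budget multiplies the throughput of each element along the optical train and root-sum-squares the independent wavefront error contributions. The sketch below implements both roll-ups for invented element values; the elements and numbers are placeholders, not any of the cited instruments' budgets.

```python
import math

# Invented element-level contributions for the sketch.
throughputs = {"telescope": 0.85, "relay": 0.92, "spectrograph": 0.70,
               "detector_qe": 0.88}
wfe_nm = {"telescope": 60.0, "relay": 35.0, "spectrograph": 50.0}  # rms, nm

total_throughput = math.prod(throughputs.values())
total_wfe = math.sqrt(sum(w ** 2 for w in wfe_nm.values()))  # RSS of rms errors

print(f"end-to-end throughput: {total_throughput:.3f}")
print(f"total wavefront error: {total_wfe:.0f} nm rms")
```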
Applications of colored petri net and genetic algorithms to cluster tool scheduling
NASA Astrophysics Data System (ADS)
Liu, Tung-Kuan; Kuo, Chih-Jen; Hsiao, Yung-Chin; Tsai, Jinn-Tsong; Chou, Jyh-Horng
2005-12-01
In this paper, we propose a method that uses Coloured Petri Nets (CPN) and genetic algorithms (GA) to obtain an optimal deadlock-free schedule and to solve the re-entrant problem for the flexible process of a cluster tool. The process of a cluster tool for producing a wafer can usually be classified into three types: 1) sequential, 2) parallel, and 3) sequential-parallel. But these processes are not economical enough for producing a variety of wafers in small volumes. This paper therefore proposes the flexible process, in which the operations for fabricating wafers can be arranged freely to achieve the best utilization of the cluster tool. However, the flexible process may have deadlock and re-entrant problems, which can be detected by the CPN. On the other hand, GAs have been applied to find optimal schedules for many types of manufacturing processes. We therefore integrate CPN and GAs to obtain an optimal schedule in the presence of deadlock and re-entrant problems for the flexible process of the cluster tool.
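As a minimal sketch of the GA half of such an approach (the CPN deadlock detection is omitted), the code below evolves a wafer-operation ordering to minimize an invented objective using truncation selection and swap mutation; a real implementation would reject or repair orderings the CPN flags as deadlocked.

```python
import random

JOBS = list(range(8))                    # 8 wafer operations, invented
PROC = [4, 2, 6, 3, 5, 2, 4, 3]          # processing times, invented

def cost(order):
    """Toy objective: total processing time plus a penalty for bad adjacencies."""
    penalty = sum(2 for a, b in zip(order, order[1:]) if abs(a - b) == 1)
    return sum(PROC[j] for j in order) + penalty

def mutate(order):
    i, j = random.sample(range(len(order)), 2)
    child = order[:]
    child[i], child[j] = child[j], child[i]   # swap two operations
    return child

pop = [random.sample(JOBS, len(JOBS)) for _ in range(30)]
for _ in range(200):                     # generations
    pop.sort(key=cost)
    parents = pop[:10]                   # truncation selection
    pop = parents + [mutate(random.choice(parents)) for _ in range(20)]

best = min(pop, key=cost)
print(best, cost(best))
```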
An entrustable professional activity (EPA) for handoffs as a model for EPA assessment development.
Aylward, Michael; Nixon, James; Gladding, Sophia
2014-10-01
Medical education is moving toward assessment of educational outcomes rather than educational processes. The American Board of Internal Medicine and American Board of Pediatrics milestones and the concept of entrustable professional activities (EPA)--skills essential to the practice of medicine that educators progressively entrust learners to perform--provide new approaches to assessing outcomes. Although some defined EPAs exist for internal medicine and pediatrics, the continued development and implementation of EPAs remains challenging. As residency programs are expected to begin reporting milestone-based performance, however, they will need examples of how to overcome these challenges. The authors describe a model for the development and implementation of an EPA using the resident handoff as an example. The model includes nine steps: selecting the EPA, determining where skills are practiced and assessed, addressing barriers to assessment, determining components of the EPA, determining needed assessment tools, developing new assessments if needed, determining criteria for advancement through entrustment levels, mapping milestones to the EPA, and faculty development. Following implementation, 78% of interns at the University of Minnesota Medical School were observed giving handoffs and provided feedback. The authors suggest that this model of EPA development--which includes engaging stakeholders, an iterative process to describing the behavioral characteristics of each domain at each level of entrustment, and the development of specific assessment tools that support both formative feedback and summative decisions about entrustment--can serve as a model for EPA development for other clinical skills and specialty areas.
Linear positioning laser calibration setup of CNC machine tools
NASA Astrophysics Data System (ADS)
Sui, Xiulin; Yang, Congjing
2002-10-01
The linear positioning laser calibration setup for CNC machine tools is capable of executing machine tool laser calibration and backlash compensation. Using this setup, hole locations on CNC machine tools will be correct, and machine tool geometry can be evaluated and adjusted. Machine tool laser calibration and backlash compensation is a simple and straightforward process. The first step is to find the stroke limits of the axis; the laser head is then brought into correct alignment. The second step is to move the machine axis to the other extreme, where the laser head is aligned using rotation and elevation adjustments. Finally, the machine is moved to the start position and the final alignment is verified. The stroke of the machine and the machine compensation interval dictate the amount of data required for each axis; these factors determine the time required for a thorough compensation of the linear positioning accuracy. The laser calibrator system monitors the material temperature and the air density, thereby taking into account machine thermal growth and laser beam frequency. This linear positioning laser calibration setup can be used on CNC machine tools, CNC lathes, horizontal centers, and vertical machining centers.
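The calibration data reduce to a table of position errors (laser-measured minus commanded positions), from which the controller's compensation values are the negated errors at each compensation interval; backlash can be estimated from the difference between approaches in the two directions. The sketch below computes both from invented measurements.

```python
import numpy as np

commanded = np.array([0.0, 100.0, 200.0, 300.0, 400.0])            # mm
meas_fwd = np.array([0.000, 100.012, 200.021, 300.035, 400.048])   # mm, invented
meas_rev = np.array([0.009, 100.020, 200.030, 300.044, 400.057])   # mm, invented

# Pitch-error compensation: negate the forward-direction error at each point.
comp = -(meas_fwd - commanded)
# Backlash: average gap between reverse and forward approaches.
backlash = np.mean(meas_rev - meas_fwd)

print("compensation (mm):", np.round(comp, 4))
print(f"backlash estimate: {backlash * 1000:.1f} um")
```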
Testing linen disinfection procedures in practice with phage-charged-bioindicators.
Gerhardts, Anja; Mucha, Helmut; Höfer, Dirk
2012-01-01
Disinfecting laundry processes are essential to avoid contamination of laundering machines and linen during commercial laundry reprocessing in the health care sector. Recently, a bacteriophage-charged bioindicator using MS2 as a surrogate virus was developed for practice-related testing of the efficacy of low-temperature disinfecting laundry processes against viruses. This paper therefore investigates the application of MS2 bioindicators in chemothermal processes under practical conditions (phase 2/step 2) and in practice (phase 3). The experimental design was developed and modified according to the German Society for Hygiene and Microbiology (DGHM) Standard Methods for Testing Chemical Disinfection Processes. Tests under practical conditions were performed at 60 °C and 70 °C, and additional tests in tunnel washers were carried out at 60 °C and 70 °C. In all experiments, validated disinfecting laundry processes recommended for bactericidal and virucidal performance (categories A and B) were applied. The results show a temperature-dependent, graded efficacy against the test virus MS2, with reduction values of more than 8 log10 steps. MS2 bioindicators thus prove to be a suitable tool for determining the performance of disinfection procedures against viruses in practice. Phage-charged bioindicators may provide further insight into the reliability of antiviral laundry processes for health care quality management and infection control.
Respirometric screening of several types of manure and mixtures intended for composting.
Barrena, Raquel; Turet, Josep; Busquets, Anna; Farrés, Moisès; Font, Xavier; Sánchez, Antoni
2011-01-01
The viability of mixtures of manure and agricultural wastes as composting feedstocks was systematically studied using physicochemical and biological characterization. The combination of parameters such as C:N ratio, free air space (FAS), and moisture content can help in the formulation of the mixtures. Nevertheless, the composting process may be challenging, particularly at industrial scale. The results of this study suggest that if the respirometric potential is known, it is possible to predict the behaviour of a full-scale composting process, and that respiration indices can be used as a tool for determining the suitability for composting of manure and complementary wastes. Accordingly, manure and agricultural wastes with a high potential for composting, and some proposed mixtures, were characterized in terms of respiration activity. Specifically, the potential of samples to be composted was determined by means of the oxygen uptake rate (OUR) and the dynamic respirometric index (DRI). During this study, four of these mixtures were composted at full scale in a system consisting of a confined pile with forced aeration. The biological activity was monitored by means of the oxygen uptake rate inside the material (OURinsitu); this new parameter represents the real activity of the process. The comparison of the potential respirometric activities measured at laboratory scale with the in situ respirometric activity observed at full scale may be a useful tool in the design and optimization of composting systems for manure and other organic agricultural wastes. Copyright © 2010 Elsevier Ltd. All rights reserved.
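The oxygen uptake rate is obtained from the rate of change of the oxygen concentration, scaled by the aeration rate and sample mass. The sketch below estimates OUR from an invented exhaust-O2 time series using a linear fit; it is a simplification of the standard dynamic respirometry calculation, and the airflow, mass, and readings are assumptions.

```python
import numpy as np

t_h = np.array([0.0, 0.5, 1.0, 1.5, 2.0])          # time, hours
o2_pct = np.array([20.9, 20.1, 19.4, 18.6, 17.9])  # exhaust O2, %, invented

slope = np.polyfit(t_h, o2_pct, 1)[0]              # % O2 per hour (negative)

airflow_Lh = 120.0      # assumed aeration rate, L/h
dry_mass_kg = 0.5       # assumed sample dry mass, kg
# mg O2 consumed per hour per kg dry matter (~1.43 g O2 per L at 0 C, 1 atm).
our = -slope / 100.0 * airflow_Lh * 1.43 * 1000.0 / dry_mass_kg
print(f"OUR ~ {our:.0f} mg O2 / (kg DM * h)")
```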