Hydrogasification reactor and method of operating same
Hobbs, Raymond; Karner, Donald; Sun, Xiaolei; Boyle, John; Noguchi, Fuyuki
2013-09-10
The present invention provides a system and method for evaluating effects of process parameters on hydrogasification processes. The system includes a hydrogasification reactor, a pressurized feed system, a hopper system, a hydrogen gas source, and a carrier gas source. Pressurized carbonaceous material, such as coal, is fed to the reactor using the carrier gas and reacted with hydrogen to produce natural gas.
Production of synthetic fuels using syngas from a steam hydrogasification and reforming process
NASA Astrophysics Data System (ADS)
Raju, Arun Satheesh Kumar
This thesis addresses the research, optimization, and development of a thermo-chemical process for the production of synthesis gas (a mixture of H2 and CO) with a flexible H2 to CO ratio, using coupled steam hydrogasification and steam reforming. The steam hydrogasification step generates a product gas containing significant amounts of methane by gasifying a carbonaceous feed material with steam and internally generated H2. This product gas is converted to synthesis gas with excess H2 relative to CO in the steam reformer. Research involving experimental and simulation work has been conducted on steam hydrogasification, steam reforming, and the Fischer-Tropsch reaction. The Aspen Plus simulation tool has been used to develop a process model that performs heat and mass balance calculations for the whole process using built-in reactor modules and an empirical FT model available in the literature. This model has been used to estimate optimum feed ratios and process conditions for specific feedstocks and products. Steam hydrogasification of coal and wood mixtures of varying coal to wood ratios has been performed in a stirred batch reactor. The carbon conversion of the feedstocks to gaseous products is around 60% at 700°C and 80% at 800°C. The coal to wood ratio of the feedstock does not exert a significant influence on the carbon conversion. The rates of formation of CO, CO2, and CH4 during gasification have been calculated from the experimental results using a simple kinetic model. Experimental research on steam reforming has also been performed; it shows that temperature and the feed CO2/CH4 ratio play a dominant role in determining the product gas H2/CO ratio. Reforming of a typical steam hydrogasification product-gas stream has been investigated over a commercial steam reforming catalyst. The results demonstrate that the combined use of the steam hydrogasification process with a reformer can generate a synthesis gas with a predetermined H2/CO ratio from carbonaceous feedstocks. Experimental work on the Fischer-Tropsch synthesis has also been performed. A life cycle analysis has been carried out to compare the life cycle energy consumption and emissions of synthetic diesel fuel produced through the CE-CERT process with other fuel/vehicle combinations. The experimental and simulation results presented here demonstrate that the CE-CERT process is versatile and can potentially handle a number of different feedstocks. The CE-CERT process appears suitable for commercialization both at very large scale with a coal feedstock and in a distributed network of smaller-scale reactors utilizing localized renewable feedstocks.
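As a reading aid, the chemistry summarized in this abstract can be sketched with idealized overall reactions (my reconstruction, not equations quoted from the thesis; side reactions and equilibrium effects captured by the Aspen Plus model are omitted):

```latex
% Steam hydrogasification: methane-rich product gas
\mathrm{C} + 2\,\mathrm{H_2} \rightarrow \mathrm{CH_4}, \qquad
\mathrm{C} + \mathrm{H_2O} \rightarrow \mathrm{CO} + \mathrm{H_2}
% Steam reforming of the product gas to synthesis gas
\mathrm{CH_4} + \mathrm{H_2O} \rightarrow \mathrm{CO} + 3\,\mathrm{H_2}
% Dry (CO2) reforming, which lowers the product H2/CO ratio
\mathrm{CH_4} + \mathrm{CO_2} \rightarrow 2\,\mathrm{CO} + 2\,\mathrm{H_2}
```

The last two reactions make clear why temperature and the feed CO2/CH4 ratio dominate the product H2/CO ratio, as the abstract reports: steam reforming yields H2/CO = 3, dry reforming yields H2/CO = 1, and blending the feeds tunes the ratio in between.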
EVALUATION OF BIOMASS REACTIVITY IN HYDROGASIFICATION FOR THE HYNOL PROCESS
The report gives results of an evaluation of the reactivity of poplar wood in hydrogasification under the operating conditions specific for the Hynol process, using a thermobalance reactor. Parameters affecting gasification behavior (e.g., gas velocity, particle size, system pres...
Experimental Study of Hydrogasification of Lignite and Subbituminous Coal Chars
Gil, Stanisław
2015-01-01
The experimental facility for pressure hydrogasification research was adapted to a pressure of 10 MPa and a temperature of 1300 K, which ensured repeatability of results and heating of the hydrogen to the process temperature. The hydrogasification reaction of chars produced from two coals of different rank was investigated at temperatures up to 1173 K, pressures up to 8 MPa, and gas flow rates of 0.5–5 dm³/min (at normal conditions). Reactivity of the "Szczerców" lignite char was found to be slightly higher than that of the subbituminous "Janina" coal char produced under the same conditions. Char reactivity remained high up to a certain carbon conversion degree, above which a sharp drop took place. It was shown that, to achieve adequate carbon conversion, the hydrogasification reaction must proceed at a temperature above 1200 K. PMID:26065028
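The abstract does not define its reactivity measure; a common convention for char gasification, assumed here for orientation only, expresses reactivity through the carbon conversion degree X:

```latex
X(t) = \frac{m_0 - m(t)}{m_0 - m_{\mathrm{ash}}}, \qquad
R(t) = \frac{1}{1 - X}\,\frac{\mathrm{d}X}{\mathrm{d}t}
```

where m_0 is the initial char mass, m(t) the instantaneous mass, and m_ash the ash mass. On this reading, the reported behavior corresponds to R staying high up to some conversion degree and then falling steeply as the remaining carbon becomes less reactive.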
Production of Substitute Natural Gas from Coal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrew Lucero
2009-01-31
The goal of this research program was to develop and demonstrate a novel gasification technology to produce substitute natural gas (SNG) from coal. The technology relies on a continuous sequential processing method that differs substantially from historic methanation or hydrogasification processing technologies. The thermo-chemistry relies on all the same reactions, but the processing sequences are different. The proposed concept is appropriate for western sub-bituminous coals, which tend to be composed of about half fixed carbon and about half volatile matter (dry ash-free basis). In the most general terms the process requires four steps: (1) separating the fixed carbon from the volatile matter (pyrolysis); (2) converting the volatile fraction into syngas (reforming); (3) reacting the syngas with heated carbon to make methane-rich fuel gas (methanation and hydrogasification); and (4) generating process heat by combusting the residual char (combustion). A key feature of this technology is that no oxygen plant is needed for char combustion.
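The four steps can be summarized with idealized reactions (a sketch inferred from the step names in the abstract, not the report's actual thermochemistry):

```latex
\text{(1) pyrolysis:}\quad \text{coal} \rightarrow \text{char (fixed C)} + \text{volatiles} \\
\text{(2) reforming:}\quad \mathrm{C}_x\mathrm{H}_y + x\,\mathrm{H_2O} \rightarrow x\,\mathrm{CO} + \left(x + \tfrac{y}{2}\right)\mathrm{H_2} \\
\text{(3) methanation / hydrogasification:}\quad \mathrm{CO} + 3\,\mathrm{H_2} \rightarrow \mathrm{CH_4} + \mathrm{H_2O}, \qquad \mathrm{C} + 2\,\mathrm{H_2} \rightarrow \mathrm{CH_4} \\
\text{(4) char combustion for process heat:}\quad \mathrm{C} + \mathrm{O_2} \rightarrow \mathrm{CO_2}
```

Because step (4) only raises process heat rather than producing syngas, the char can be burned in air, which is why no oxygen plant is needed.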
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Xiaolei; Rink, Nancy T
2011-04-29
This report presents an integrated energy system that combines the production of substitute natural gas through coal hydrogasification with an algae process for beneficial carbon dioxide (CO2) use and biofuel production (funded under Department of Energy (DOE) contract DE-FE0001099). The project planned to develop, test, operate and evaluate a 2 ton-per-day coal hydrogasification plant and a 25-acre algae farm at the Arizona Public Service (APS) 1000 Megawatt (MW) Cholla coal-fired power plant in Joseph City, Arizona. Conceptual design of the integrated system was undertaken with APS partners Air Liquide (AL) and Parsons. The process engineering was separated into five major areas: flue gas preparation and CO2 delivery, algae farming, water management, hydrogasification, and biofuel production. Process flow diagrams, energy and material balances, and preliminary major equipment needs for each major area were prepared to reflect integrated process considerations and the site infrastructure design basis. The total project also included research and development on a bench-scale hydrogasifier, one-dimensional (1-D) kinetic-model simulation, extensive algae stressing, oil extraction, lipid analysis, and a half-acre algae farm demonstration at APS's Redhawk testing facility. During the project, a two-acre algae testing facility with a half-acre algae cultivation area was built at the APS Redhawk 1000 MW natural gas combined cycle power plant located 55 miles west of Phoenix. The test site integrated flue gas delivery, CO2 capture and distribution, algae cultivation, an algae nursery, algae harvesting, dewatering and onsite storage, as well as water treatment. The site environmental, engineering, and biological parameters for the cultivators were monitored remotely. Direct biodiesel production from biomass via an acid-catalyzed transesterification reaction and a supercritical methanol transesterification reaction was evaluated. The highest oil-to-biodiesel conversion of 79.9% was achieved with a stressed algae sample containing 40% algae oil. The effort concluded that producing biodiesel directly from the algae biomass could be an efficient, cost-effective and readily scalable way to produce biodiesel by eliminating the oil extraction process.
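For scale, the two reported figures imply an overall biomass-to-biodiesel mass yield of roughly (assuming both the 40% oil content and the 79.9% conversion are mass fractions, which the abstract does not state explicitly):

```latex
Y_{\text{biodiesel}} \approx 0.40 \times 0.799 \approx 0.32 \ \text{kg biodiesel per kg stressed algae biomass}
```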
A FEASIBILITY STUDY FOR THE COPROCESSING OF FOSSIL FUELS WITH BIOMASS BY THE HYDROCARB PROCESS
The report describes and gives results of an assessment of a new process concept for the production of carbon and methanol from fossil fuels. The Hydrocarb Process consists of the hydrogasification of carbonaceous material to produce methane, which is subsequently thermally decom...
NASA Technical Reports Server (NTRS)
1975-01-01
Hydrogen production from coal by hydrogasification is described. The process involves the solubilization of coal to form coal liquids, which are hydrogasified to produce synthetic pipeline gas; steam reforming of this synthetic gas, using a nuclear heat source, produces hydrogen. A description is given of the hydrogen plant, its performance, and its effect on the environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sansone, M.J.
1979-02-01
On the basis of simple, first-approximation calculations, it has been shown that catalytic gasification and hydrogasification are inherently superior to conventional gasification with respect to carbon utilization and thermal efficiency. However, most processes directed toward the production of substitute natural gas (SNG) by direct combination of coal with steam at low temperatures (catalytic processes) or with hydrogen (hydrogasification) will require a step for separating product SNG from a recycle stream. The success or failure of the process could well depend upon the economics of this separation scheme. The energetics of the separation of mixtures of ideal gases has been considered in some detail. Minimum energies for complete separation of representative effluent mixtures have been calculated, as well as energies for separation into product and recycle streams. The gas mixtures include binary systems of H2 and CH4 and ternary mixtures of H2, CH4, and CO. A brief summary of a number of different real separation schemes has also been included. We have arbitrarily divided these into five categories: liquefaction, absorption, adsorption, chemical, and diffusional methods. These separation methods will be screened and the more promising methods examined in more detail in later reports. Finally, a brief mention of alternative coal conversion processes concludes this report.
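The "minimum energies for complete separation" of ideal gas mixtures referred to here follow from the entropy of mixing; a standard expression (consistent with, though not quoted from, the report) is

```latex
W_{\min} = -\,n\,R\,T \sum_i y_i \ln y_i
```

where the y_i are mole fractions. For an equimolar H2/CH4 binary at 300 K this gives W_min = RT ln 2 ≈ 1.7 kJ per mole of mixture, a lower bound against which the liquefaction, absorption, adsorption, chemical, and diffusional schemes surveyed in the report can be judged.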
Assessment of advanced coal gasification processes
NASA Technical Reports Server (NTRS)
Mccarthy, J.; Ferrall, J.; Charng, T.; Houseman, J.
1981-01-01
A technical assessment of the following advanced coal gasification processes is presented: the high throughput gasification (HTG) process; the single-stage high mass flux (HMF) process; the Cities Service/Rockwell (CS/R) hydrogasification process; and the catalytic coal gasification (CCG) process. Each process is evaluated for its potential to produce synthetic natural gas from a bituminous coal. Key similarities, differences, strengths, weaknesses, and potential improvements to each process are identified. The HTG and HMF gasifiers share similarities with respect to short residence time (SRT), high throughput rate, slagging operation, and syngas as the initial raw product gas. The CS/R hydrogasifier is also SRT, but is nonslagging and produces a raw gas high in methane content. The CCG gasifier is a long-residence-time, catalytic, fluid-bed reactor producing all of the raw product methane in the gasifier.
We expect to find that the application of our technology can significantly benefit the school, environmentally and economically. Our lab-scale demonstration results would lead to larger demonstration scale investigations and ultimately implementation on campus showcasing the p...
Hydrogen production from coal using a nuclear heat source
NASA Technical Reports Server (NTRS)
Quade, R. N.
1976-01-01
A strong candidate for hydrogen production in the intermediate time frame of 1985 to 1995 is a coal-based process using a high-temperature gas-cooled reactor (HTGR) as a heat source. Expected process efficiencies in the range of 60 to 70% are considerably higher than those of all other hydrogen production processes except steam reforming of natural gas. The process involves the preparation of a coal liquid, hydrogasification of that liquid, and steam reforming of the resulting gaseous or light liquid product. A study of process efficiency and cost of hydrogen versus nuclear reactor core outlet temperature has been completed; it shows diminishing returns at process temperatures above about 1500°F. A possible scenario combining the relatively abundant and low-cost Western coal deposits with the Gulf Coast hydrogen users is presented, providing high-energy-density transportation utilizing coal liquids and uranium.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Xiaolei; Rink, Nancy
This report presents the results of the research and development conducted on an Advanced Hydrogasification Process (AHP) conceived and developed by Arizona Public Service Company (APS) under U.S. Department of Energy (DOE) contract DE-FC26-06NT42759 for substitute natural gas (SNG) production from western coal. A double-wall (i.e., a hydrogasification reactor contained within a pressure shell) down-flow hydrogasification reactor was designed, engineered, constructed, commissioned and operated by APS, Phoenix, AZ. The reactor is ASME-certified under Section VIII with a rating of 1150 pounds per square inch gauge (psig) maximum allowable working pressure at 1950 degrees Fahrenheit (°F). The reaction zone had a 1.75-inch inner diameter and a 13-foot length. Initial testing of a sub-bituminous coal demonstrated ~50% carbon conversion and ~10% methane yield in the product gas at 1625°F and 1000 psig, with an 11-second (s) residence time and a 0.4 hydrogen-to-coal mass ratio. Liquid by-products mainly contained benzene, toluene, xylene (BTX) and tar. Char collected from the bottom of the reactor had a heating value of 9000 British thermal units per pound (Btu/lb). A three-dimensional (3D) computational fluid dynamics model simulation of the hydrodynamics around the reactor head was utilized to design the nozzles for injecting the hydrogen into the gasifier, optimizing gas-solid mixing to achieve improved carbon conversion. The report also presents the evaluation of using algae for carbon dioxide (CO2) management and biofuel production. Nannochloropsis, Selenastrum and Scenedesmus were determined to be the best algae strains for the project purpose and were studied in an outdoor system which included a 6-meter (6M) radius cultivator with a total surface area of 113 square meters (m2) and a total culture volume between 10,000 and 15,000 liters (L); a CO2 on-demand feeding system; an on-line data collection system for temperature, pH, Photosynthetically Active Radiation (PAR) and dissolved oxygen (DO); and a ~2 gallons-per-minute (gpm) algae culture dewatering system. Among the three algae strains, Scenedesmus showed the most tolerance to the temperature and irradiance conditions in Phoenix and the best self-settling characteristics. Experimental findings and operational strategies determined through these tests guided the operation of the algae cultivation system for the scale-up study. Effects of power plant flue gas, especially heavy metals, on algae growth and biomass adsorption were evaluated as well.
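As a sanity check on the reported geometry, a short script (a sketch; the dimensions come from the abstract, the unit conversions are mine) gives the reaction-zone volume:

```python
import math

# Reaction zone geometry reported in the abstract
inner_diameter_m = 1.75 * 0.0254  # 1.75 in -> m
length_m = 13 * 0.3048            # 13 ft -> m

radius_m = inner_diameter_m / 2.0
volume_l = math.pi * radius_m**2 * length_m * 1000.0  # m^3 -> liters

print(f"Reaction zone volume: {volume_l:.1f} L")  # ~6.1 L
```

A roughly 6-liter reaction zone confirms that this was a bench-scale unit.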
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raymond Hobbs
2007-05-31
The Advanced Hydrogasification Process (AHP), which converts coal to methane, is being developed through NETL with a DOE grant and has successfully completed its first phase of development. The results so far are encouraging and have led to a commitment by DOE/NETL to begin a second phase: bench-scale reactor vessel testing, expanded engineering analysis and economic perspective review. During the next decade new means of generating electricity, and other forms of energy, will be introduced. The members of the AHP team envision a need for expanded sources of natural gas, or substitutes for natural gas, to fuel power generating plants. The initial work the team has completed on a process to use hydrogen to convert coal to methane (pipeline-ready gas) shows promising potential. The team has intentionally slanted its efforts toward the needs of US electric utilities, particularly on fuels that can be used near urban centers where the greatest need for new electric generation is found. The process, as it has evolved, would produce methane from coal by adding hydrogen. The process appears to be efficient using western coals for conversion to a highly sought-after fuel with significantly reduced CO2 emissions. Utilities have a natural interest in the preservation of their industry, which will require a dramatic reduction in stack emissions and an increase in sustainable technologies. Utilities tend to rank long-term stable supplies of fuel higher than most industries and are willing to trade some ratio of cost for stability. The need for sustainability, stability and environmentally compatible production are key drivers in the formation and progression of the AHP development. In Phase II, the team will add a focus on water conservation to determine how the basic gasification process can best be integrated with all the plant components to minimize water consumption during SNG production. The process allows for several CO2 reduction options, including consumption of the CO2 in the original process as it is converted to methane. Under another option, the process could avoid emissions following the conversion to SNG through an adjunct algae conversion process; the algae would then be converted to fuels or other products. An additional application of the algae process at the end-use natural-gas-fired plant could further reduce emissions. The APS team fully recognizes the competition facing the process from natural gas and imported liquefied natural gas. While we expect those resources to set the price for methane in the near term, the team's work to date indicates that the AHP process can be commercially competitive, with the added benefit of assuring long-term energy supplies from North American resources. Conversion of coal to a more readily transportable fuel that can be employed near load centers, with an overall reduction of greenhouse gases, is edging closer to reality.
System and method for producing substitute natural gas from coal
Hobbs, Raymond [Avondale, AZ
2012-08-07
The present invention provides a system and method for producing substitute natural gas and electricity while mitigating production of any greenhouse gases. The system includes a hydrogasification reactor, to form a gas stream including natural gas and a char stream, and an oxygen burner to combust the char material to form carbon oxides. The system also includes an algae farm to convert the carbon oxides to hydrocarbon material and oxygen.
Assessment of Advanced Coal Gasification Processes
NASA Technical Reports Server (NTRS)
McCarthy, John; Ferrall, Joseph; Charng, Thomas; Houseman, John
1981-01-01
This report represents a technical assessment of the following advanced coal gasification processes: the AVCO High Throughput Gasification (HTG) Process; the Bell Single-Stage High Mass Flux (HMF) Process; the Cities Service/Rockwell (CS/R) Hydrogasification Process; and the Exxon Catalytic Coal Gasification (CCG) Process. Each process is evaluated for its potential to produce SNG from a bituminous coal. In addition to identifying the new technology these processes represent, key similarities/differences, strengths/weaknesses, and potential improvements to each process are identified. The AVCO HTG and the Bell HMF gasifiers share similarities with respect to short residence time (SRT), high throughput rate, slagging operation, and syngas as the initial raw product gas. The CS/R Hydrogasifier is also SRT but is non-slagging and produces a raw gas high in methane content. The Exxon CCG gasifier is a long-residence-time, catalytic, fluid-bed reactor producing all of the raw product methane in the gasifier. The report makes the following assessments: 1) while each process has significant potential as a coal gasifier, the CS/R and Exxon processes are better suited for SNG production; 2) the Exxon process is the closest to a commercial level for near-term SNG production; and 3) the SRT processes require significant development, including scale-up and turndown demonstration, char processing and/or utilization demonstration, and development of reactor control and safety features.
NASA Astrophysics Data System (ADS)
Steinberg, M.; Dong, Yuanji
1993-10-01
The Hynol process is proposed to meet the demand for an economical process for methanol production with reduced CO2 emission. This new process consists of three reaction steps: (1) hydrogasification of biomass, (2) steam reforming of the produced gas with additional natural gas feedstock, and (3) methanol synthesis from the hydrogen and carbon monoxide produced in the previous two steps. The H2-rich gas remaining after methanol synthesis is recycled to gasify the biomass in an energy-neutral reactor, so there is no need for the expensive oxygen plant required by commercial steam gasifiers. Recycling gas allows the methanol synthesis reactor to operate at a lower pressure than conventional designs while the plant still maintains a high methanol yield. Energy recovery designed into the process minimizes heat loss and increases the process thermal efficiency. If Hynol methanol is used as an alternative and more efficient automotive fuel, an overall 41% reduction in CO2 emission can be achieved compared with the use of conventional gasoline. A preliminary economic estimate shows that the total capital investment for a Hynol plant is 40% lower than that for a conventional biomass gasification plant. The methanol production cost is $0.43/gal for a 1085 million gal/yr Hynol plant, which is competitive with current U.S. methanol and equivalent gasoline prices. A process flowsheet and simulation data using biomass and natural gas as cofeedstocks are presented. The Hynol process can convert any condensed carbonaceous material, especially municipal solid waste (MSW), to methanol.
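The three reaction steps can be written compactly (idealized stoichiometry; the abstract names the steps but does not give equations):

```latex
\text{(1) hydrogasification:}\quad \mathrm{C}_{\text{biomass}} + 2\,\mathrm{H_2} \rightarrow \mathrm{CH_4} \\
\text{(2) steam reforming:}\quad \mathrm{CH_4} + \mathrm{H_2O} \rightarrow \mathrm{CO} + 3\,\mathrm{H_2} \\
\text{(3) methanol synthesis:}\quad \mathrm{CO} + 2\,\mathrm{H_2} \rightarrow \mathrm{CH_3OH}
```

The H2-rich purge from step (3) supplies the hydrogen for step (1), closing the recycle loop that eliminates the oxygen plant.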
DOE Coal Gasification Multi-Test Facility: fossil fuel processing technical/professional services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hefferan, J.K.; Lee, G.Y.; Boesch, L.P.
1979-07-13
A conceptual design, including process descriptions, heat and material balances, process flow diagrams, utility requirements, schedule, capital and operating cost estimate, and alternative design considerations, is presented for the DOE Coal Gasification Multi-Test Facility (GMTF). The GMTF, an engineering-scale facility, is to provide a complete plant into which different types of gasifiers and conversion/synthesis equipment can be readily integrated for testing in an operational environment at relatively low cost. The design allows for operation of several gasifiers simultaneously at a total coal throughput of 2500 tons/day; individual gasifiers operate at up to 1200 tons/day and 600 psig using air or oxygen. Ten different test gasifiers can be in place at the facility, but only three can be operated at one time. The GMTF can produce a spectrum of saleable products, including low-Btu, synthesis and pipeline gases, hydrogen (for fuel cells or hydrogasification), methanol, gasoline, diesel and fuel oils, organic chemicals, and (potentially) electrical power. In 1979 dollars, the base facility requires a $288 million capital investment for common-use units, $193 million for four gasification units and four synthesis units, and $305 million for six years of operation. Critical reviews of detailed vendor designs are appended for a methanol synthesis unit, three entrained-flow gasifiers, a fluidized-bed gasifier, and a hydrogasifier/slag-bath gasifier.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agrawal, P.D.; Butt, N.M.; Sarma, K.R.
This report covers a preliminary conceptual design and economic evaluation of a commercial-scale plant capable of converting high-sulfur bituminous caking coal to a high-Btu, pipeline-quality SNG. The plant, which has a rated capacity of 250 billion Btu per day of SNG, is based on Cities Service/Rockwell hydrogasification technology. Two cases of plant design were examined to produce cost estimates accurate to ±25% in 1979 dollars. The base case, designed for moderate production of liquids (5.8% conversion of carbon to liquid product), has an SNG cost of $4.43/MMBtu using the utility financing method (UFM) and $6.42/MMBtu using the discounted cash flow method (DCFM). The alternate case, zero liquids production, has gas costs of $5.00 (UFM) and $6.96 (DCFM). Further tests by Rockwell have indicated that 11.4% carbon conversion to liquid products (99% benzene) is possible. If the plant is scaled up to produce the same amount of SNG with this increased yield of liquid, and if the value of the benzene produced is estimated at $0.90 per gallon, the costs of gas for this case are $4.38/MMBtu (UFM) and $6.48/MMBtu (DCFM). If the value of benzene is taken as $2.00 per gallon, these costs become $3.14/MMBtu (UFM) and $5.23/MMBtu (DCFM). The economic assumptions involved in these calculations are detailed.
NASA Technical Reports Server (NTRS)
Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.
1974-01-01
A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.
NASA Astrophysics Data System (ADS)
Raju, B. S.; Sekhar, U. Chandra; Drakshayani, D. N.
2017-08-01
This paper investigates optimization of the stereolithography process for SL5530 epoxy resin material to enhance part quality. The performance characteristics selected to evaluate the process are tensile strength, flexural strength, impact strength and density, and the corresponding process parameters are layer thickness, orientation and hatch spacing. Because the process intrinsically involves tuning multiple parameters, grey relational analysis, which uses the grey relational grade as a performance index, is adopted to determine the optimal combination of process parameters. Moreover, principal component analysis is applied to evaluate the weighting values corresponding to the various performance characteristics so that their relative importance can be properly and objectively described. The results of confirmation experiments reveal that grey relational analysis coupled with principal component analysis can effectively acquire the optimal combination of process parameters. This confirms that the proposed approach can be a useful tool to improve the process parameters in stereolithography, which is very useful information for machine designers as well as RP machine users.
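A minimal sketch of the grey relational grade computation described here (generic formulation with illustrative numbers; the paper's exact normalization, PCA-derived weights, and measured data are not reproduced):

```python
import numpy as np

# Rows: experiments; columns: responses (e.g., tensile, flexural,
# impact strength), all treated as larger-is-better. Values illustrative.
y = np.array([[52.0, 81.0, 0.30],
              [55.0, 79.0, 0.35],
              [49.0, 85.0, 0.28]])

# Normalize each response to [0, 1] (larger-is-better convention)
norm = (y - y.min(axis=0)) / (y.max(axis=0) - y.min(axis=0))

# Grey relational coefficient against the ideal sequence (all ones)
zeta = 0.5                        # distinguishing coefficient
delta = np.abs(1.0 - norm)        # deviation sequences
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# Grey relational grade: weighted mean per experiment
# (equal weights here; the paper derives weights via PCA)
grade = grc.mean(axis=1)
print(grade)  # the experiment with the highest grade is preferred
```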
Criteria for Comparing Domain Analysis Approaches Version 01.00.00
1991-12-01
[Extraction residue from the report's table of contents and introduction. Recoverable items: a figure titled "Down-Bottom-Up Domain Analysis Process (1990 Version)"; Figure 8, "FODA's Domain Analysis Process"; a discussion of FODA, which uses the Design Approach for Real-Time Systems (DARTS) design method (Gomaa 1984) and illustrates its process using window management as an example; and the observation that domain analysis is still immature.]
NASA Technical Reports Server (NTRS)
Deckert, George
2010-01-01
This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts
Designing Image Analysis Pipelines in Light Microscopy: A Rational Approach.
Arganda-Carreras, Ignacio; Andrey, Philippe
2017-01-01
With the progress of microscopy techniques and the rapidly growing amounts of acquired imaging data, there is an increased need for automated image processing and analysis solutions in biological studies. Each new application requires the design of a specific image analysis pipeline, by assembling a series of image processing operations. Many commercial and free bioimage analysis software packages are now available, and several textbooks and reviews have presented the mathematical and computational fundamentals of image processing and analysis. Tens, if not hundreds, of algorithms and methods have been developed and integrated into image analysis software, resulting in a combinatorial explosion of possible image processing sequences. This paper presents a general guideline methodology for rationally addressing the design of image processing and analysis pipelines. The originality of the proposed approach is to follow an iterative, backwards procedure from the target objectives of analysis. The proposed goal-oriented strategy should help biologists to better apprehend image analysis in the context of their research and should allow them to interact efficiently with image processing specialists.
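As a toy illustration of "assembling a series of image processing operations" into a pipeline, in the spirit of the paper (a sketch using scikit-image, which is an assumption; the paper itself is tool-agnostic):

```python
import numpy as np
from skimage import data, filters, measure

# A pipeline as an ordered series of operations:
# denoise -> threshold -> label -> quantify
image = data.coins()                        # sample image bundled with skimage
smoothed = filters.gaussian(image, sigma=2)
binary = smoothed > filters.threshold_otsu(smoothed)
labels = measure.label(binary)
regions = measure.regionprops(labels)

print(f"{len(regions)} objects, mean area "
      f"{np.mean([r.area for r in regions]):.1f} px")
```

Designing backwards from the target objective, as the paper proposes, would mean starting from the final quantification (object counts and areas) and choosing each upstream operation to serve it.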
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boggs, Paul T.; Althsuler, Alan; Larzelere, Alex R.
2005-08-01
The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending on the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
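The three levers the team quantifies (once-through time, iteration probability, rework fraction) can be related in a simple expected-value model (my sketch, not the team's actual community model). If a step takes time t_1 once through, loops back with probability p, and each rework pass costs a fraction r of t_1, then

```latex
E[T] \;=\; t_1\left(1 + \frac{p}{1-p}\,r\right)
```

so reducing t_1, p, or r each lowers E[T], consistent with the report's finding that rework-fraction reduction offers roughly 40% to 80% of the benefit of the other two levers.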
Methods utilized in evaluating the profitability of commercial space processing
NASA Technical Reports Server (NTRS)
Bloom, H. L.; Schmitt, P. T.
1976-01-01
Profitability analysis is applied to commercial space processing on the basis of business concept definition and assessment and the relationship between ground and space functions. Throughput analysis is demonstrated by analysis of the space manufacturing of surface acoustic wave devices. The paper describes a financial analysis model for space processing and provides key profitability measures for space processed isoenzymes.
QUAGOL: a guide for qualitative data analysis.
Dierckx de Casterlé, Bernadette; Gastmans, Chris; Bryon, Els; Denier, Yvonne
2012-03-01
Data analysis is a complex and contested part of the qualitative research process, which has received limited theoretical attention. Researchers are often in need of useful instructions or guidelines on how to analyze the mass of qualitative data, but face a lack of clear guidance for using particular analytic methods. The aim of this paper is to propose and discuss the Qualitative Analysis Guide of Leuven (QUAGOL), a guide developed to truly capture the rich insights of qualitative interview data. The article describes six major problems researchers often struggle with during the process of qualitative data analysis. Consequently, the QUAGOL is proposed as a guide to facilitate the process of analysis. Challenges that emerged and lessons learned from the authors' own extensive experience with qualitative data analysis within the Grounded Theory Approach, as well as from the experiences of other researchers (as described in the literature), are discussed, and recommendations are presented. Strengths and pitfalls of the proposed method are discussed in detail. The QUAGOL offers a comprehensive method to guide the process of qualitative data analysis. The process consists of two parts, each consisting of five stages. The method is systematic but not rigid. It is characterized by iterative processes of digging deeper, constantly moving between the various stages of the process. As such, it aims to stimulate the researcher's intuition and creativity as much as possible. The QUAGOL is a theory- and practice-based guide that supports and facilitates the process of analysis of qualitative interview data. Although the method can facilitate the process of analysis, it cannot guarantee automatic quality. The skills of the researcher and the quality of the research team remain the most crucial components of a successful process of analysis. Additionally, the importance of constantly moving between the various stages throughout the research process cannot be overstated.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-12
... of performing the technical analysis, management assessment, and program evaluation tasks required to.... Analysis of elements of the review process (including the presubmission process, and investigational device... time to facilitate a more efficient process. This includes analysis of root causes for inefficiencies...
Uncertainty Budget Analysis for Dimensional Inspection Processes (U)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valdez, Lucas M.
2012-07-26
This paper is intended to provide guidance and describe how to prepare an uncertainty analysis of a dimensional inspection process through the use of an uncertainty budget analysis. The uncertainty analysis follows the methodology of the ISO GUM standard for calibration and testing. A specific distinction is drawn between how Type A and Type B uncertainty analyses are used in general and specific processes. Theory and applications are presented both for a generalized approach to estimating measurement uncertainty and for reporting and presenting these estimates for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the assumptions necessarily in place for best possible results.
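In GUM terms, the budget combines the Type A (statistical) and Type B (other-knowledge) standard uncertainty contributions in quadrature; a standard statement of the combination (not a quotation from the paper) is

```latex
u_c(y) = \sqrt{\sum_i c_i^2\, u^2(x_i)}, \qquad U = k\,u_c(y)
```

where the c_i are sensitivity coefficients and k is the coverage factor (k = 2 for roughly 95% coverage).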
Tidal analysis and Arrival Process Mining Using Automatic Identification System (AIS) Data
2017-01-01
files, organized by location. The data were processed using the Python programming language (van Rossum and Drake 2001) and the Pandas data analysis library. (Report ERDC/CHL TR-17-2, Coastal Inlets Research Program, Coastal and Hydraulics Laboratory, Brandan M. Scully, January 2017.)
Post-test navigation data analysis techniques for the shuttle ALT
NASA Technical Reports Server (NTRS)
1975-01-01
Postflight test analysis data processing techniques for shuttle approach and landing test (ALT) navigation data are defined. Postflight test processor requirements are described, along with operational and design requirements, data input requirements, and software test requirements. The postflight test data processing is described based on the natural test sequence: quick-look analysis, postflight navigation processing, and error isolation processing. Emphasis is placed on the tradeoffs that must remain open and subject to analysis until final definition is achieved in the shuttle data processing system and the overall ALT plan. A development plan for the implementation of the ALT postflight test navigation data processing system is presented. Conclusions are presented.
Schaub, Jochen; Clemens, Christoph; Kaufmann, Hitto; Schulz, Torsten W
2012-01-01
Development of efficient bioprocesses is essential for cost-effective manufacturing of recombinant therapeutic proteins. To achieve further process improvement and process rationalization, comprehensive analysis of both process data and phenotypic cell-level data is essential. Here, we present a framework for advanced bioprocess data analysis consisting of multivariate data analysis (MVDA), metabolic flux analysis (MFA), and pathway analysis for mapping of large-scale gene expression data sets. This data analysis platform was applied in a process development project with an IgG-producing Chinese hamster ovary (CHO) cell line in which the maximal product titer could be increased from about 5 to 8 g/L. Principal component analysis (PCA), k-means clustering, and partial least-squares (PLS) models were applied to analyze the macroscopic bioprocess data. MFA and gene expression analysis revealed intracellular information on the characteristics of high-performance cell cultivations. By MVDA, for example, correlations between several essential amino acids and the product concentration were observed, and a grouping into cell-specific-productivity-driven and process-control-driven processes could be unraveled. By MFA, phenotypic characteristics in glycolysis, glutaminolysis, the pentose phosphate pathway, the citrate cycle, the coupling of amino acid metabolism to the citrate cycle, and the energy yield could be identified. By gene expression analysis, 247 deregulated metabolic genes were identified, involved, inter alia, in amino acid metabolism, transport, and protein synthesis.
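A minimal sketch of the MVDA portion of such a framework (PCA followed by k-means on scaled process variables; random illustrative data, and scikit-learn is an assumed toolchain, not necessarily the authors'):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Rows: cultivation runs; columns: process variables
# (e.g., amino acid levels, viable cell density, titer). Illustrative only.
X = rng.normal(size=(40, 12))

scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(X))
groups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print(groups)  # e.g., productivity-driven vs. process-control-driven runs
```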
Dynamic analysis of process reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shadle, L.J.; Lawson, L.O.; Noel, S.D.
1995-06-01
The approach and methodology of conducting a dynamic analysis is presented in this poster session in order to describe how this type of analysis can be used to evaluate the operation and control of process reactors. Dynamic analysis of the PyGas™ gasification process is used to illustrate the utility of this approach. PyGas™ is the gasifier being developed for the Gasification Product Improvement Facility (GPIF) by Jacobs-Siffine Engineering and Riley Stoker. In the first step of the analysis, process models are used to calculate the steady-state conditions and associated sensitivities for the process. For the PyGas™ gasifier, the process models are non-linear mechanistic models of the jetting fluidized-bed pyrolyzer and the fixed-bed gasifier. These process sensitivities are key input, in the form of gain parameters or transfer functions, to the dynamic engineering models.
[Process management in the hospital pharmacy for the improvement of patient safety].
Govindarajan, R; Perelló-Juncá, A; Parès-Marimòn, R M; Serrais-Benavente, J; Ferrandez-Martí, D; Sala-Robinat, R; Camacho-Calvente, A; Campabanal-Prats, C; Solà-Anderiu, I; Sanchez-Caparrós, S; Gonzalez-Estrada, J; Martinez-Olalla, P; Colomer-Palomo, J; Perez-Mañosas, R; Rodríguez-Gallego, D
2013-01-01
To define a process management model for a hospital pharmacy in order to measure, analyse and continuously improve patient safety and healthcare quality. In order to implement process management, Igualada Hospital was divided into different processes, one of which was the hospital pharmacy. A multidisciplinary management team was given responsibility for each process. For each sub-process one person was identified as responsible, and a working group was formed under his or her leadership. With the help of each working group, a risk analysis using failure modes and effects analysis (FMEA) was performed, and the corresponding improvement actions were implemented. Sub-process indicators were also identified, and different process management mechanisms were introduced. The first FMEA risk analysis produced more than thirty preventive actions to improve patient safety. Later, the weekly analysis of errors, as well as the monthly analysis of key process indicators, permitted us to monitor process results and, as each sub-process manager participated in these meetings, also to assume accountability and responsibility, thus consolidating a culture of excellence. The introduction of different process management mechanisms, with the participation of the people responsible for each sub-process, provides a participative management tool for the continuous improvement of patient safety and healthcare quality.
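For reference, FMEA ranks failure modes by a risk priority number (the standard formulation; the abstract does not reproduce its scoring scales):

```latex
\mathrm{RPN} = S \times O \times D
```

where S is severity, O is occurrence, and D is detectability (each typically scored 1 to 10); the failure modes with the highest RPN receive preventive actions first, which is presumably how the more than thirty preventive actions were prioritized.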
Global Sensitivity Analysis for Process Identification under Model Uncertainty
NASA Astrophysics Data System (ADS)
Ye, M.; Dai, H.; Walker, A. P.; Shi, L.; Yang, J.
2015-12-01
An environmental system consists of various physical, chemical, and biological processes, and environmental models are built to simulate these processes and their interactions. For model building, improvement, and validation, it is necessary to identify important processes so that limited resources can be used to better characterize them. While global sensitivity analysis has been widely used to identify important processes, process identification has typically been based on a deterministic conceptualization that uses a single model to represent each process. However, environmental systems are complex, and it often happens that a single process may be simulated by multiple alternative models. Ignoring this model uncertainty in process identification may bias the results, in that processes identified as important may not be so in the real world. This study addresses the problem by developing a new method of global sensitivity analysis for process identification. The new method is based on the concepts of Sobol sensitivity analysis and model averaging. Similar to the Sobol analysis used to identify important parameters, the new method evaluates the variance change when a process is fixed at each of its alternative conceptualizations. The variance accounts for both parametric and model uncertainty via model averaging. The method is demonstrated using a synthetic groundwater modeling study that considers a recharge process and a parameterization process, each with two alternative models. Important processes of groundwater flow and transport are evaluated using the new method. The method is mathematically general and can be applied to a wide range of environmental problems.
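The quantity underlying this approach is the Sobol first-order sensitivity index (standard definition, my notation):

```latex
S_i = \frac{\operatorname{Var}_{X_i}\!\left[\mathbb{E}(Y \mid X_i)\right]}{\operatorname{Var}(Y)}
```

The paper's extension evaluates the analogous variance change when a process is fixed at each of its alternative conceptualizations, with Var(Y) computed under model averaging so that both parametric and model uncertainty contribute.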
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haurykiewicz, John Paul; Dinehart, Timothy Grant; Parker, Robert Young
2016-05-12
The purpose of this process analysis was to analyze the Badge Offices' current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered primarily through Badge Office Subject Matter Experts (SMEs) and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to the factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.
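Throughput and wait time in such a queueing setting are linked by Little's law (a general relation, not a result quoted from this analysis):

```latex
L = \lambda\, W
```

where L is the average number of customers in the badge office, λ the throughput rate, and W the average time in the system; raising throughput without lowering W means more customers waiting, which is why the simulation assesses process changes against both objectives together.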
Chemical Sensing in Process Analysis.
ERIC Educational Resources Information Center
Hirschfeld, T.; And Others
1984-01-01
Discusses: (1) rationale for chemical sensors in process analysis; (2) existing types of process chemical sensors; (3) sensor limitations, considering lessons of chemometrics; (4) trends in process control sensors; and (5) future prospects. (JN)
Articulating the Resources for Business Process Analysis and Design
ERIC Educational Resources Information Center
Jin, Yulong
2012-01-01
Effective process analysis and modeling are important phases of the business process management lifecycle. When many activities and multiple resources are involved, it is very difficult to build a correct business process specification. This dissertation provides a resource perspective of business processes. It aims at a better process analysis…
Conducting Qualitative Data Analysis: Qualitative Data Analysis as a Metaphoric Process
ERIC Educational Resources Information Center
Chenail, Ronald J.
2012-01-01
In the second of a series of "how-to" essays on conducting qualitative data analysis, Ron Chenail argues the process can best be understood as a metaphoric process. From this orientation he suggests researchers follow Kenneth Burke's notion of metaphor and see qualitative data analysis as the analyst systematically considering the "this-ness" of…
Techno-economic analysis: process model development for existing and conceptual processes; detailed heat integration; economic analysis of integrated processes; integration of process simulation learnings into control. (Truncated entry: "Conceptual Process Design and Techno-Economic Assessment of Ex Situ Catalytic Fast Pyrolysis of Biomass: A...")
[A SAS macro program for batch processing of univariate Cox regression analysis for large databases].
Yang, Rendong; Xiong, Jie; Peng, Yangqin; Peng, Xiaoning; Zeng, Xiaomin
2015-02-01
To realize batch processing of univariate Cox regression analysis for large databases using a SAS macro program. We wrote a SAS macro program in SAS 9.2 that can filter and integrate results and export P values to Excel. The program was used for screening survival-correlated RNA molecules in ovarian cancer. The SAS macro program accomplished the batch processing of univariate Cox regression analyses and the selection and export of the results. The SAS macro program has potential applications in reducing the workload of statistical analysis and providing a basis for batch processing of univariate Cox regression analysis.
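The same batch-screening pattern can be sketched outside SAS, for example with the Python lifelines package (an illustration of the idea, not the authors' macro; column names are hypothetical):

```python
import pandas as pd
from lifelines import CoxPHFitter

def batch_univariate_cox(df, duration_col, event_col, covariates):
    """Fit one univariate Cox model per covariate and collect p-values."""
    rows = []
    for cov in covariates:
        cph = CoxPHFitter()
        cph.fit(df[[duration_col, event_col, cov]],
                duration_col=duration_col, event_col=event_col)
        rows.append({"covariate": cov, "p": cph.summary.loc[cov, "p"]})
    return pd.DataFrame(rows).sort_values("p")

# Usage (hypothetical columns):
# results = batch_univariate_cox(data, "time", "event", rna_columns)
# results.to_excel("univariate_cox.xlsx")  # export step, as in the macro
```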
Using task analysis to improve the requirements elicitation in health information system.
Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa
2007-01-01
This paper describes the application of task analysis within the design process of a Web-based information system for managing clinical information in hemophilia care, in order to improve the requirements elicitation and, consequently, to validate the domain model obtained in a previous phase of the design process (system analysis). The use of task analysis in this case proved to be a practical and efficient way to improve the requirements engineering process by involving users in the design process.
The finite element simulation analysis research of 38CrSi cylindrical power spinning
NASA Astrophysics Data System (ADS)
Liang, Wei; Lv, Qiongying; Zhao, Yujuan; Lv, Yunxia
2018-01-01
In order to explore the influence of the main cylindrical spinning process parameters on the spinning process, this paper combines a real tube power spinning process with the ABAQUS finite element analysis software to simulate the tube power spinning of 38CrSi steel. Through analysis of the stress and strain during part forming, it examines the influence of the thickness reduction and the feed rate on the forming process, analyzes the variation of the spinning force, and finally determines a reasonable combination of the main spinning process parameters.
Tornado detection data reduction and analysis
NASA Technical Reports Server (NTRS)
Davisson, L. D.
1977-01-01
Data processing and analysis was provided in support of tornado detection by analysis of radio frequency interference in various frequency bands. Sea state determination data from short pulse radar measurements were also processed and analyzed. A backscatter simulation was implemented to predict radar performance as a function of wind velocity. Computer programs were developed for the various data processing and analysis goals of the effort.
Optical analysis of crystal growth
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Passeur, Andrea; Harper, Sabrina
1994-01-01
Processing and data reduction of holographic images from Spacelab presents some interesting challenges in determining the effects of microgravity on crystal growth processes. Evaluation of several processing techniques, including the Computerized Holographic Image Processing System and the image processing software ITEX150, will provide fundamental information for holographic analysis of the space flight data.
Integrated Structural Analysis and Test Program
NASA Technical Reports Server (NTRS)
Kaufman, Daniel
2005-01-01
An integrated structural-analysis and structure-testing computer program is being developed in order to: automate repetitive processes in testing and analysis; accelerate pre-test analysis; accelerate reporting of tests; facilitate planning of tests; improve execution of tests; create a vibration, acoustics, and shock test database; and integrate analysis and test data. The software package includes modules pertaining to sinusoidal and random vibration, shock and time replication, acoustics, base-driven modal survey, and mass properties and static/dynamic balance. The program is commanded by use of ActiveX controls, with minimal need to generate command lines. Analysis or test files are selected by opening a Windows Explorer display. After the desired input file is selected, the program proceeds to either an analysis data process or a test data process, depending on the type of input data. The status of the process is given by a Windows status bar, and when processing is complete, the data are reported in graphical, tabular, and matrix form.
Negotiation Process Analysis: A Research and Training Tool.
ERIC Educational Resources Information Center
Williams, Timothy
This paper proposes the use of interaction process analysis to study negotiation behaviors. Following a review of current literature in the field, the paper presents a theoretical framework for the analysis of both labor/management and social negotiation processes. Central to the framework described are two systems of activities that together…
Quantitative analysis of geomorphic processes using satellite image data at different scales
NASA Technical Reports Server (NTRS)
Williams, R. S., Jr.
1985-01-01
When aerial and satellite photographs and images are used in the quantitative analysis of geomorphic processes, either through direct observation of active processes or by analysis of landforms resulting from inferred active or dormant processes, a number of limitations in the use of such data must be considered. Active geomorphic processes work at different scales and rates; therefore, the capability of imaging an active or dormant process depends primarily on the scale of the process and the spatial-resolution characteristic of the imaging system. Scale is an important factor in recording continuous and discontinuous active geomorphic processes, because what is not recorded will not be considered, or even suspected, in the analysis of orbital images. If the geomorphic process, or the landform change it causes, is less than 200 m in x-y dimension, it will not be recorded. Although the scale factor is critical in recording discontinuous active geomorphic processes, the repeat interval of orbital-image acquisition of a planetary surface is also a consideration, in order to capture a recurring short-lived geomorphic process or to record changes caused by either a continuous or a discontinuous geomorphic process.
Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.
Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias
2016-01-01
To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using as an example a study evaluating a computerized decision support system.
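The segmented regression model referred to here is conventionally written as (standard formulation, not quoted from the article):

```latex
Y_t = \beta_0 + \beta_1 t + \beta_2 X_t + \beta_3 (t - t_0)\,X_t + \varepsilon_t
```

where X_t indicates post-intervention periods and t_0 is the intervention time, so β_2 captures the level change and β_3 the slope change; statistical process control, by contrast, monitors the series against control limits (commonly ±3 standard deviations around the center line).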
Thermodynamic analysis of resources used in manufacturing processes.
Gutowski, Timothy G; Branham, Matthew S; Dahmus, Jeffrey B; Jones, Alissa J; Thiriez, Alexandre
2009-03-01
In this study we use a thermodynamic framework to characterize the material and energy resources used in manufacturing processes. The analysis and data span a wide range of processes, from "conventional" processes such as machining, casting, and injection molding, to so-called "advanced machining" processes such as electrical discharge machining and abrasive waterjet machining, and to the vapor-phase processes used in semiconductor and nanomaterials fabrication. In all, 20 processes are analyzed. The results show that the intensity of materials and energy used per unit mass of material processed (measured either as specific energy or as specific exergy) has increased by at least 6 orders of magnitude over the past several decades. This increase in material/energy intensity has been primarily a consequence of the introduction of new manufacturing processes, rather than changes in traditional technologies; it has been driven by the desire for precise small-scale devices and product features and enabled by stable and declining material and energy prices over this period. We illustrate the relevance of thermodynamics (including exergy analysis) for all processes, in spite of the fact that the long-standing focus in manufacturing has been on product quality, not necessarily energy/material conversion efficiency. We promote the use of thermodynamic tools for the analysis of manufacturing processes within the context of the rapidly increasing relevance of sustainable human enterprises. We confirm that exergy analysis can be used to identify where resources are lost in these processes, which is the first step in proposing and/or redesigning new, more efficient processes.
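The exergy measure used in such analyses has, for a material stream, the standard physical form (a textbook statement rather than the paper's notation):

```latex
ex = (h - h_0) - T_0\,(s - s_0)
```

where h and s are the stream's specific enthalpy and entropy and the subscript 0 denotes the ambient dead state; summing the exergy destroyed across process steps locates where resources are lost, which is the diagnostic use the authors propose.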
Thermochemical Conversion Techno-Economic Analysis | Bioenergy | NREL
NREL's Thermochemical Conversion Analysis team focuses on conceptual process design and techno-economic analysis. The detailed process models and TEA developed under this project provide insights into the potential economic...
Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.
ERIC Educational Resources Information Center
Carlson, David H.
1986-01-01
This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…
Meta-analysis using Dirichlet process.
Muthukumarana, Saman; Tiwari, Ram C
2016-02-01
This article develops a Bayesian approach for meta-analysis using the Dirichlet process. The key aspect of the Dirichlet process in meta-analysis is the ability to assess evidence of statistical heterogeneity, or variation in the underlying effects across studies, while relaxing the distributional assumptions. We assume that the study effects are generated from a Dirichlet process. Under a Dirichlet process model, the study-effect parameters have support on a discrete space, enabling borrowing of information across studies while facilitating clustering among studies. We illustrate the proposed method by applying it to a dataset from the Program for International Student Assessment covering 30 countries. Results from the data analysis, simulation studies, and the log pseudo-marginal likelihood model selection procedure indicate that the Dirichlet process model performs better than conventional alternative methods. © The Author(s) 2012.
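The discrete support and clustering behaviour described here come directly from the stick-breaking construction of the Dirichlet process. A minimal sketch follows; the concentration parameter, base measure, and study count are arbitrary illustrative choices, not the paper's model.

```python
# Stick-breaking sketch of a Dirichlet process prior over study effects:
# draws a (truncated) discrete random measure G ~ DP(alpha, G0) and samples
# study effects from it, so ties across studies induce clustering.
import numpy as np

rng = np.random.default_rng(1)
alpha, n_atoms, n_studies = 1.0, 100, 30

v = rng.beta(1.0, alpha, n_atoms)                        # stick-breaking fractions
w = v * np.concatenate(([1.0], np.cumprod(1 - v)[:-1]))  # atom weights
atoms = rng.normal(0.0, 1.0, n_atoms)                    # atoms from base measure G0

study_effects = rng.choice(atoms, size=n_studies, p=w / w.sum())
print(np.unique(study_effects).size, "clusters among", n_studies, "studies")
```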
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-19
... performing the technical analysis, management assessment, and program evaluation tasks required to address... premarket reviews that meet regulatory review standards. 2. Analysis of elements of the review process... process. This includes analysis of root causes for inefficiencies that may affect review performance and...
NASA Technical Reports Server (NTRS)
Fegley, K. A.; Hayden, J. H.; Rehmann, D. W.
1974-01-01
The feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems is investigated. It is shown that a digital computer may be used in an interactive mode for the design, modeling, analysis, and comparison of power processing systems.
Code of Federal Regulations, 2014 CFR
2014-07-01
..., or interpretation of any geological data and information. Initial analysis and processing are the stages of analysis or processing where the data and information first become available for in-house... geochemical) data and information describing each operation of analysis, processing, and interpretation; (2...
Code of Federal Regulations, 2014 CFR
2014-07-01
... initial analysis, processing, or interpretation of any geological data and information. Initial analysis and processing are the stages of analysis or processing where the data and information first become... information are available for submission, inspection, and selection? 580.40 Section 580.40 Mineral Resources...
Code of Federal Regulations, 2012 CFR
2012-07-01
..., or interpretation of any geological data and information. Initial analysis and processing are the stages of analysis or processing where the data and information first become available for in-house... geochemical) data and information describing each operation of analysis, processing, and interpretation; (2...
Code of Federal Regulations, 2012 CFR
2012-07-01
... initial analysis, processing, or interpretation of any geological data and information. Initial analysis and processing are the stages of analysis or processing where the data and information first become... information are available for submission, inspection, and selection? 580.40 Section 580.40 Mineral Resources...
Code of Federal Regulations, 2013 CFR
2013-07-01
..., or interpretation of any geological data and information. Initial analysis and processing are the stages of analysis or processing where the data and information first become available for in-house... geochemical) data and information describing each operation of analysis, processing, and interpretation; (2...
Code of Federal Regulations, 2013 CFR
2013-07-01
... initial analysis, processing, or interpretation of any geological data and information. Initial analysis and processing are the stages of analysis or processing where the data and information first become... information are available for submission, inspection, and selection? 580.40 Section 580.40 Mineral Resources...
Code of Federal Regulations, 2011 CFR
2011-07-01
... complete the initial analysis, processing, or interpretation of any geological data and information. Initial analysis and processing are the stages of analysis or processing where the data and information... information are available for submission, inspection, and selection? 280.40 Section 280.40 Mineral Resources...
Yan, Binjun; Fang, Zhonghua; Shen, Lijuan; Qu, Haibin
2015-01-01
The batch-to-batch quality consistency of herbal drugs has always been an important issue. The aim of this work was to propose a methodology for batch-to-batch quality control based on HPLC-MS fingerprints and a process knowledgebase. The extraction process of Compound E-jiao Oral Liquid was taken as a case study. After establishing the HPLC-MS fingerprint analysis method, the fingerprints of the extract solutions produced under normal and abnormal operating conditions were obtained. Multivariate statistical models were built for fault detection, and a discriminant analysis model was built using the probabilistic discriminant partial-least-squares method for fault diagnosis. Based on multivariate statistical analysis, process knowledge was acquired and the cause-effect relationship between process deviations and quality defects was revealed. The quality defects were detected successfully by multivariate statistical control charts, and the types of process deviations were diagnosed correctly by discriminant analysis. This work has demonstrated the benefits of combining HPLC-MS fingerprints, process knowledge and multivariate analysis for the quality control of herbal drugs. Copyright © 2015 John Wiley & Sons, Ltd.
NASA Technical Reports Server (NTRS)
Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.
1994-01-01
Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
Analysis of acoustic emission signals and monitoring of machining processes
Govekar; Gradisek; Grabec
2000-03-01
Monitoring of a machining process on the basis of sensor signals requires a selection of informative inputs in order to reliably characterize and model the process. In this article, a system for selection of informative characteristics from signals of multiple sensors is presented. For signal analysis, methods of spectral analysis and methods of nonlinear time series analysis are used. With the aim of modeling relationships between signal characteristics and the corresponding process state, an adaptive empirical modeler is applied. The application of the system is demonstrated by characterization of different parameters defining the states of a turning machining process, such as: chip form, tool wear, and onset of chatter vibration. The results show that, in spite of the complexity of the turning process, the state of the process can be well characterized by just a few proper characteristics extracted from a representative sensor signal. The process characterization can be further improved by joining characteristics from multiple sensors and by application of chaotic characteristics.
Assessing Group Interaction with Social Language Network Analysis
NASA Astrophysics Data System (ADS)
Scholand, Andrew J.; Tausczik, Yla R.; Pennebaker, James W.
In this paper we discuss a new methodology, social language network analysis (SLNA), that combines tools from social language processing and network analysis to assess socially situated working relationships within a group. Specifically, SLNA aims to identify and characterize the nature of working relationships by processing artifacts generated with computer-mediated communication systems, such as instant message texts or emails. Because social language processing is able to identify psychological, social, and emotional processes that individuals are not able to fully mask, social language network analysis can clarify and highlight complex interdependencies between group members, even when these relationships are latent or unrecognized.
Process mining techniques: an application to time management
NASA Astrophysics Data System (ADS)
Khowaja, Ali Raza
2018-04-01
In any environment, people must ensure that all of their work is completed within a given time and to an acceptable quality. To realize the full potential of process mining, one needs to understand all of these processes in detail. Personal information and communication tools in everyday life generate data on daily schedules, locations, environments and, more generally, social media activity; these systems make data available for data analysis through event logs, and also for process analysis that combines environmental and location analysis. Process mining can exploit these real-life processes with the help of event logs already available in such datasets, through user-censored or user-labeled data. These processes can be used to redesign a user's daily flow and to understand the processes in more detail; giving each process a closer look, analyzing it, and then making changes is how its quality can be improved. In this work, we applied process mining techniques to a single dataset, collected in Korea, combining seven different subjects. The paper comments on the efficiency of the processes in the event logs as they relate to time management.
Statistical process control methods allow the analysis and improvement of anesthesia care.
Fasting, Sigurd; Gisvold, Sven E
2003-10-01
Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
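As a rough sketch of the p-chart construction described here, the following fragment computes a centre line and 3-sigma control limits for a monthly adverse-event proportion; the event counts and subgroup sizes are invented for illustration, not the study's data.

```python
# p-chart sketch for an adverse-event rate (hypothetical monthly counts).
import numpy as np

events = np.array([18, 22, 17, 25, 19, 30, 21, 16, 24, 20])  # adverse events
cases = np.full(events.size, 120)                             # anesthetics per month

p = events / cases
p_bar = events.sum() / cases.sum()               # centre line (pooled rate)
sigma = np.sqrt(p_bar * (1 - p_bar) / cases)     # binomial standard error
ucl, lcl = p_bar + 3 * sigma, np.maximum(p_bar - 3 * sigma, 0)

for i, (pi, u, l) in enumerate(zip(p, ucl, lcl)):
    flag = "out of control" if (pi > u or pi < l) else "in control"
    print(f"month {i}: p={pi:.3f} limits [{l:.3f}, {u:.3f}] {flag}")
```

A point outside the limits signals special-cause variation, which is how the report distinguishes stable processes from those disturbed by interventions.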
NASA Technical Reports Server (NTRS)
Singh, S. P.
1979-01-01
The computer software developed to set up a method for Wiener spectrum analysis of photographic films is presented. This method is used for the quantitative analysis of the autoradiographic enhancement process. The software requirements and design for the autoradiographic enhancement process are given, along with the program listings and the user's manual. A software description and program listing modifications of the data analysis software are included.
Logistics Process Analysis ToolProcess Analysis Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
2008-03-31
LPAT is the integrated system combining the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) refrigerant to be returned to a refrigerant reclamation facility that will process it to the appropriate ARI... and Assembly Processes (Process FMEA) and Effects Analysis for Machinery (Machinery FMEA). SAE... Manufacturing and Assembly Processes (Process FMEA), and Potential Failure Mode and Effects Analysis for...
Code of Federal Regulations, 2013 CFR
2013-07-01
...) refrigerant to be returned to a refrigerant reclamation facility that will process it to the appropriate ARI... and Assembly Processes (Process FMEA) and Effects Analysis for Machinery (Machinery FMEA). SAE... Manufacturing and Assembly Processes (Process FMEA), and Potential Failure Mode and Effects Analysis for...
Code of Federal Regulations, 2012 CFR
2012-07-01
...) refrigerant to be returned to a refrigerant reclamation facility that will process it to the appropriate ARI... and Assembly Processes (Process FMEA) and Effects Analysis for Machinery (Machinery FMEA). SAE... Manufacturing and Assembly Processes (Process FMEA), and Potential Failure Mode and Effects Analysis for...
Code of Federal Regulations, 2010 CFR
2010-07-01
...) refrigerant to be returned to a refrigerant reclamation facility that will process it to the appropriate ARI... and Assembly Processes (Process FMEA) and Effects Analysis for Machinery (Machinery FMEA). SAE... Manufacturing and Assembly Processes (Process FMEA), and Potential Failure Mode and Effects Analysis for...
Code of Federal Regulations, 2014 CFR
2014-07-01
...) refrigerant to be returned to a refrigerant reclamation facility that will process it to the appropriate ARI... and Assembly Processes (Process FMEA) and Effects Analysis for Machinery (Machinery FMEA). SAE... Manufacturing and Assembly Processes (Process FMEA), and Potential Failure Mode and Effects Analysis for...
An empirical analysis of the distribution of overshoots in a stationary Gaussian stochastic process
NASA Technical Reports Server (NTRS)
Carter, M. C.; Madison, M. W.
1973-01-01
The frequency distribution of overshoots in a stationary Gaussian stochastic process is analyzed. The primary methods involved in this analysis are computer simulation and statistical estimation. Computer simulation is used to generate stationary Gaussian stochastic processes that have selected autocorrelation functions. An analysis of the simulation results reveals a frequency distribution for overshoots with a functional dependence on the mean and variance of the process. Statistical estimation is then used to estimate the mean and variance of a process. It is shown that, given an autocorrelation function and the mean and variance of the number of overshoots, a frequency distribution for overshoots can be estimated.
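The simulation half of this approach can be sketched as follows, assuming (for illustration only) an exponential autocorrelation realized as an AR(1) recursion; the threshold level and parameters are arbitrary, not the report's cases.

```python
# Sketch: simulate a stationary Gaussian process with an exponentially
# decaying autocorrelation (AR(1) form) and tally overshoots of a threshold.
import numpy as np

rng = np.random.default_rng(2)
n, rho, level = 100_000, 0.95, 1.5           # rho sets the autocorrelation decay

x = np.empty(n)
x[0] = rng.normal()
noise = rng.normal(size=n) * np.sqrt(1 - rho**2)  # keeps unit process variance
for t in range(1, n):
    x[t] = rho * x[t - 1] + noise[t]

up_crossings = np.sum((x[:-1] < level) & (x[1:] >= level))
print("overshoots of level", level, ":", up_crossings)
```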
NASA Technical Reports Server (NTRS)
Casasent, D.
1978-01-01
The article discusses several optical configurations used for signal processing. Electronic-to-optical transducers are outlined, noting fixed window transducers and moving window acousto-optic transducers. Folded spectrum techniques are considered, with reference to wideband RF signal analysis, fetal electroencephalogram analysis, engine vibration analysis, signal buried in noise, and spatial filtering. Various methods for radar signal processing are described, such as phased-array antennas, the optical processing of phased-array data, pulsed Doppler and FM radar systems, a multichannel one-dimensional optical correlator, correlations with long coded waveforms, and Doppler signal processing. Means for noncoherent optical signal processing are noted, including an optical correlator for speech recognition and a noncoherent optical correlator.
Conducting qualitative research in mental health: Thematic and content analyses.
Crowe, Marie; Inder, Maree; Porter, Richard
2015-07-01
The objective of this paper is to describe two methods of qualitative analysis - thematic analysis and content analysis - and to examine their use in a mental health context. A description of the processes of thematic analysis and content analysis is provided. These processes are then illustrated by conducting two analyses of the same qualitative data. Transcripts of qualitative interviews are analysed using each method to illustrate these processes. The illustration of the processes highlights the different outcomes from the same set of data. Thematic and content analyses are qualitative methods that serve different research purposes. Thematic analysis provides an interpretation of participants' meanings, while content analysis is a direct representation of participants' responses. These methods provide two ways of understanding meanings and experiences and provide important knowledge in a mental health context. © The Royal Australian and New Zealand College of Psychiatrists 2015.
Canister Storage Building (CSB) Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
POWERS, T.B.
2000-03-16
This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis process identified hazardous conditions and material-at-risk, determined causes for potential accidents, identified preventive and mitigative features, and qualitatively estimated the frequencies and consequences of specific occurrences. The hazard analysis was performed by a team of cognizant CSB operations and design personnel, safety analysts familiar with the CSB, and technical experts in specialty areas. The material included in this report documents the final state of a nearly two-year-long process involving formal facilitated group sessions and independent hazard and accident analysis work. Attachment A provides two lists of hazard analysis team members and describes the background and experience of each. The first list is a complete list of the hazard analysis team members that have been involved over the two-year-long process; the second is a subset of the first, consisting of those team members that reviewed and agreed to the final hazard analysis documentation. The hazard analysis process led to the selection of candidate accidents for further quantitative analysis. New information relative to the hazards, discovered during the accident analysis, was incorporated into the hazard analysis data in order to compile a complete profile of facility hazards. Through this process, the results of the hazard and accident analyses led directly to the identification of safety structures, systems, and components, technical safety requirements, and other controls required to protect the public, workers, and environment.
Neurophysiological analysis of echolocation in bats
NASA Technical Reports Server (NTRS)
Suga, N.
1972-01-01
An analysis of echolocation and signal processing in brown bats is presented. Data cover echo detection, echo ranging, echolocalization, and echo analysis. Efforts were also made to identify the part of the brain that carries out the most essential processing function for echolocation. Results indicate the inferior colliculus and the auditory nuclei function together to process this information.
National Job Corps Study: Report on the Process Analysis. Research and Evaluation Report Series.
ERIC Educational Resources Information Center
Johnson, Terry; Gritz, Mark; Jackson, Russell; Burghardt, John; Boussy, Carol; Leonard, Jan; Orians, Carlyn
This report presents results of a process analysis that describes and documents Job Corps services and operations. Chapter one provides overviews of Job Corps, the national Job Corps study, and the process analysis. Chapter two describes the administrative structure of Job Corps and presents data on the geographic distribution and characteristics…
Research on the raw data processing method of the hydropower construction project
NASA Astrophysics Data System (ADS)
Tian, Zhichao
2018-01-01
In this paper, based on the characteristics of fixed quota data, various mathematical-statistical analysis methods are compared and the improved Grubbs criterion is chosen to analyze the data. Through analysis of the data processing results, it is shown that this method can be applied to the processing of fixed raw data. This paper provides a reference for reasonably determining effective quota analysis data.
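The paper does not spell out its "improved" Grubbs criterion, so the sketch below implements only the classic single-outlier Grubbs test as a reference point; the data values and significance level are invented.

```python
# One-sided Grubbs outlier test sketch (classic single-outlier form).
import numpy as np
from scipy import stats

def grubbs_statistic(x):
    x = np.asarray(x, float)
    return np.max(np.abs(x - x.mean())) / x.std(ddof=1)

def grubbs_critical(n, alpha=0.05):
    # Critical value from the t distribution with n-2 degrees of freedom.
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    return (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))

data = [10.1, 9.8, 10.3, 9.9, 14.9, 10.0, 10.2]   # toy quota observations
g, g_crit = grubbs_statistic(data), grubbs_critical(len(data))
print(f"G={g:.2f}, critical={g_crit:.2f}, outlier detected: {g > g_crit}")
```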
Usage of information safety requirements in improving tube bending process
NASA Astrophysics Data System (ADS)
Livshitz, I. I.; Kunakov, E.; Lontsikh, P. A.
2018-05-01
This article is devoted to improving the analysis of technological processes through the implementation of information security requirements. The aim of this research is to analyze the competitiveness gains available to aircraft industry enterprises through information technology implementation, using the tube bending process as an example. The article analyzes tube bending types and current techniques. In addition, an analysis of potential risks in the tube bending process is carried out in terms of information security.
Real-time fMRI processing with physiological noise correction - Comparison with off-line analysis.
Misaki, Masaya; Barzigar, Nafise; Zotev, Vadim; Phillips, Raquel; Cheng, Samuel; Bodurka, Jerzy
2015-12-30
While applications of real-time functional magnetic resonance imaging (rtfMRI) are growing rapidly, there are still limitations in real-time data processing compared to off-line analysis. We developed a proof-of-concept real-time fMRI processing (rtfMRIp) system utilizing a personal computer (PC) with a dedicated graphics processing unit (GPU) to demonstrate that it is now possible to perform intensive whole-brain fMRI data processing in real time. The rtfMRIp performs slice-timing correction, motion correction, spatial smoothing, signal scaling, and general linear model (GLM) analysis with multiple noise regressors, including physiological noise modeled with cardiac (RETROICOR) and respiration volume per time (RVT) regressors. The whole-brain data analysis, with more than 100,000 voxels and more than 250 volumes, is completed in less than 300 ms, much faster than the time required to acquire the fMRI volume. Real-time processing implementation cannot be identical to off-line analysis when time-course information is used, such as in slice-timing correction, signal scaling, and GLM. We verified that reduced slice-timing correction for real-time analysis had output comparable to off-line analysis. The real-time GLM analysis, however, showed over-fitting when the number of sampled volumes was small. Our system implemented real-time RETROICOR and RVT physiological noise corrections for the first time, and it is capable of processing these steps on all available data at a given time, without the need for recursive algorithms. Comprehensive data processing in rtfMRI is possible with a PC, although the number of samples should be considered in real-time GLM. Copyright © 2015 Elsevier B.V. All rights reserved.
A framework supporting the development of a Grid portal for analysis based on ROI.
Ichikawa, K; Date, S; Kaishima, T; Shimojo, S
2005-01-01
In our research on brain function analysis, users require two different simultaneous types of processing: interactive processing of a specific part of the data and high-performance batch processing of an entire dataset. The difference between these two types of processing is whether or not the analysis is restricted to data in the region of interest (ROI). In this study, we propose a Grid portal that has a mechanism to freely assign computing resources to users on a Grid environment according to these two different processing requirements. We constructed a Grid portal that integrates interactive processing and batch processing through the following two mechanisms. First, a job steering mechanism controls job execution based on user-tagged priority among organizations with heterogeneous computing resources; interactive jobs are processed in preference to batch jobs by this mechanism. Second, a priority-based result delivery mechanism administers a ranking of data significance. The portal ensures a turn-around time for interactive processing through the priority-based job controlling mechanism, and provides users with quality of service (QoS) for interactive processing. Users can access the analysis results of interactive jobs in preference to the analysis results of batch jobs. The Grid portal has also achieved high-performance computation for MEG analysis with batch processing on the Grid environment. The priority-based job controlling mechanism freely assigns computing resources according to the users' requirements. Furthermore, the achievement of high-performance computation contributes greatly to the overall progress of brain science. The portal has thus made it possible for users to flexibly apply large computational power to what they want to analyze.
Optimization of Parameter Ranges for Composite Tape Winding Process Based on Sensitivity Analysis
NASA Astrophysics Data System (ADS)
Yu, Tao; Shi, Yaoyao; He, Xiaodong; Kang, Chao; Deng, Bo; Song, Shibo
2017-08-01
This study focuses on the parameter sensitivity of the winding process for composite prepreg tape. Methods for multi-parameter relative sensitivity analysis and single-parameter sensitivity analysis are proposed. A polynomial empirical model of interlaminar shear strength is established by the response surface experimental method. Using this model, the relative sensitivity of key process parameters, including temperature, tension, pressure and velocity, is calculated, and single-parameter sensitivity curves are obtained. According to the analysis of the sensitivity curves, the stability and instability ranges of each parameter are recognized. Finally, an optimization method for the winding process parameters is developed. The analysis results show that the optimized ranges of the process parameters for interlaminar shear strength are: temperature within [100 °C, 150 °C], tension within [275 N, 387 N], pressure within [800 N, 1500 N], and velocity within [0.2 m/s, 0.4 m/s], respectively.
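To make the single-parameter sensitivity idea concrete, here is a minimal sketch using an invented one-variable quadratic response surface; the coefficients and the 0.5 stability threshold are illustrative assumptions, not the paper's fitted model.

```python
# Sketch: single-parameter sensitivity from a fitted quadratic response
# surface (illustrative coefficients; temperature is the only variable here).
import numpy as np

# Hypothetical model: strength = b0 + b1*T + b2*T**2
b0, b1, b2 = 20.0, 0.30, -0.001

T = np.linspace(100, 150, 101)
strength = b0 + b1 * T + b2 * T**2
sensitivity = (b1 + 2 * b2 * T) * T / strength   # dimensionless relative sensitivity

stable = T[np.abs(sensitivity) < 0.5]            # arbitrary stability threshold
print(f"stable temperature range ~ {stable.min():.0f} to {stable.max():.0f} degC")
```

Plotting `sensitivity` against `T` reproduces the kind of single-parameter sensitivity curve from which stable and unstable ranges are read off.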
Biomedical image analysis and processing in clouds
NASA Astrophysics Data System (ADS)
Bednarz, Tomasz; Szul, Piotr; Arzhaeva, Yulia; Wang, Dadong; Burdett, Neil; Khassapov, Alex; Chen, Shiping; Vallotton, Pascal; Lagerstrom, Ryan; Gureyev, Tim; Taylor, John
2013-10-01
The Cloud-based Image Analysis and Processing Toolbox project runs on the Australian National eResearch Collaboration Tools and Resources (NeCTAR) cloud infrastructure and provides researchers with access to biomedical image processing and analysis services via remotely accessible user interfaces. By providing user-friendly access to cloud computing resources and new workflow-based interfaces, our solution enables researchers to carry out various challenging image analysis and reconstruction tasks. Several case studies will be presented during the conference.
Problem Based Learning: Cognitive and Metacognitive Processes during Problem Analysis.
ERIC Educational Resources Information Center
De Grave, W. S.; And Others
1996-01-01
To investigate whether problem-based learning leads to conceptual change, the cognitive and metacognitive processes of a group of medical students were studied during the problem analysis phase, and their verbal communication and thinking processes were analyzed. Stimulated recall of the thinking process during the discussion detected a conceptual…
Multi-Disciplinary, Multi-Fidelity Discrete Data Transfer Using Degenerate Geometry Forms
NASA Technical Reports Server (NTRS)
Olson, Erik D.
2016-01-01
In a typical multi-fidelity design process, different levels of geometric abstraction are used for different analysis methods, and transitioning from one phase of design to the next often requires a complete re-creation of the geometry. To maintain consistency between lower-order and higher-order analysis results, Vehicle Sketch Pad (OpenVSP) recently introduced the ability to generate and export several degenerate forms of the geometry, representing the type of abstraction required to perform low- to medium-order analysis for a range of aeronautical disciplines. In this research, the functionality of these degenerate models was extended, so that in addition to serving as repositories for the geometric information that is required as input to an analysis, the degenerate models can also store the results of that analysis mapped back onto the geometric nodes. At the same time, the results are also mapped indirectly onto the nodes of lower-order degenerate models using a process called aggregation, and onto higher-order models using a process called disaggregation. The mapped analysis results are available for use by any subsequent analysis in an integrated design and analysis process. A simple multi-fidelity analysis process for a single-aisle subsonic transport aircraft is used as an example case to demonstrate the value of the approach.
Carroll, Adam J; Badger, Murray R; Harvey Millar, A
2010-07-14
Standardization of analytical approaches and reporting methods via community-wide collaboration can work synergistically with web-tool development to result in rapid community-driven expansion of online data repositories suitable for data mining and meta-analysis. In metabolomics, the inter-laboratory reproducibility of gas chromatography/mass spectrometry (GC/MS) makes it an obvious target for such development. While a number of web-tools offer access to datasets and/or tools for raw data processing and statistical analysis, none of these systems are currently set up to act as a public repository by easily accepting, processing and presenting publicly submitted GC/MS metabolomics datasets for public re-analysis. Here, we present MetabolomeExpress, a new File Transfer Protocol (FTP) server and web-tool for the online storage, processing, visualisation and statistical re-analysis of publicly submitted GC/MS metabolomics datasets. Users may search a quality-controlled database of metabolite response statistics from publicly submitted datasets by a number of parameters (e.g., metabolite, species, organ/biofluid). Users may also perform meta-analysis comparisons of multiple independent experiments or re-analyse public primary datasets via user-friendly tools for t-test, principal components analysis, hierarchical cluster analysis and correlation analysis. They may interact with chromatograms, mass spectra and peak detection results via an integrated raw data viewer. Researchers who register for a free account may upload (via FTP) their own data to the server for online processing via a novel raw data processing pipeline. MetabolomeExpress (https://www.metabolome-express.org) provides a new opportunity for the general metabolomics community to transparently present online the raw and processed GC/MS data underlying their metabolomics publications. Transparent sharing of these data will allow researchers to assess data quality and draw their own insights from published metabolomics datasets.
Viscoelastic properties of chalcogenide glasses and the simulation of their molding processes
NASA Astrophysics Data System (ADS)
Liu, Weiguo; Shen, Ping; Jin, Na
In order to simulate the precision molding process, the viscoelastic properties of chalcogenide glasses at high temperatures were investigated. Thermomechanical analyses were performed to measure and analyze the thermomechanical properties of chalcogenide glasses, and the creep responses of the glasses at different temperatures were obtained. Finite element analysis was applied to the simulation of the molding processes. The simulation results were consistent with previously reported experimental results. Stress concentration and evolution during the molding processes were also described with the simulation results.
Microscopic Evaluation of Friction Plug Welds- Correlation to a Processing Analysis
NASA Technical Reports Server (NTRS)
Rabenberg, Ellen M.; Chen, Poshou; Gorti, Sridhar
2017-01-01
Recently an analysis of dynamic forge load data from the friction plug weld (FPW) process and the corresponding tensile test results showed that good plug welds fit well within an analytically determined processing parameter box. There were, however, some outliers that compromised the predictions. Here the microstructure of the plug weld material is presented in view of the load analysis with the intent of further understanding the FPW process and how it is affected by the grain structure and subsequent mechanical properties.
The application of digital techniques to the analysis of metallurgical experiments
NASA Technical Reports Server (NTRS)
Rathz, T. J.
1977-01-01
The application of a specific digital computer system (known as the Image Data Processing System) to the analysis of three NASA-sponsored metallurgical experiments is discussed in some detail. The basic hardware and software components of the Image Data Processing System are presented. Many figures are presented in the discussion of each experimental analysis in an attempt to show the accuracy and speed that the Image Data Processing System affords in analyzing photographic images dealing with metallurgy, and in particular with material processing.
Vision-sensing image analysis for GTAW process control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Long, D.D.
1994-11-01
Image analysis of a gas tungsten arc welding (GTAW) process was completed using video images from a charge coupled device (CCD) camera inside a specially designed coaxial GTAW electrode holder. Video data were obtained from filtered and unfiltered images, with and without the GTAW arc present, showing weld joint features and locations. Data Translation image processing boards, installed in an IBM PC AT 386 compatible computer, and Media Cybernetics image processing software were used to investigate edge flange weld joint geometry for image analysis.
Online Analysis Enhances Use of NASA Earth Science Data
NASA Technical Reports Server (NTRS)
Acker, James G.; Leptoukh, Gregory
2007-01-01
Giovanni, the Goddard Earth Sciences Data and Information Services Center (GES DISC) Interactive Online Visualization and Analysis Infrastructure, has provided researchers with advanced capabilities to perform data exploration and analysis with observational data from NASA Earth observation satellites. In the past 5-10 years, examining geophysical events and processes with remote-sensing data required a multistep process of data discovery, data acquisition, data management, and ultimately data analysis. Giovanni accelerates this process by enabling basic visualization and analysis directly on the World Wide Web. In the last two years, Giovanni has added new data acquisition functions and expanded analysis options to increase its usefulness to the Earth science research community.
Sensitivity Analysis in RIPless Compressed Sensing
2014-10-01
The compressive sensing framework finds a wide range of applications in signal processing and analysis. More specifically, we show that in a noiseless and RIP-less setting [11], the recovery process of a compressed sensing framework is…
Self-conscious robotic system design process--from analysis to implementation.
Chella, Antonio; Cossentino, Massimo; Seidita, Valeria
2011-01-01
Developing robotic systems endowed with self-conscious capabilities means realizing complex sub-systems that need ad-hoc software engineering techniques for their modelling, analysis and implementation. In this chapter the whole process (from analysis to implementation) for modelling the development of self-conscious robotic systems is presented, and the newly created design process, PASSIC, which supports each part of it, is fully illustrated.
STATISTICAL ANALYSIS OF SNAP 10A THERMOELECTRIC CONVERTER ELEMENT PROCESS DEVELOPMENT VARIABLES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitch, S.H.; Morris, J.W.
1962-12-15
Statistical analysis, primarily analysis of variance, was applied to evaluate several factors involved in the development of suitable fabrication and processing techniques for the production of lead telluride thermoelectric elements for the SNAP 10A energy conversion system. The analysis methods are described as to their application for determining the effects of various processing steps, establishing the value of individual operations, and evaluating the significance of test results. The elimination of unnecessary or detrimental processing steps was accomplished and the number of required tests was substantially reduced by application of these statistical methods to the SNAP 10A production development effort. (auth)
NASA Technical Reports Server (NTRS)
Consoli, Robert David; Sobieszczanski-Sobieski, Jaroslaw
1990-01-01
Advanced multidisciplinary analysis and optimization methods, namely system sensitivity analysis and non-hierarchical system decomposition, are applied to reduce the cost and improve the visibility of an automated vehicle design synthesis process. This process is inherently complex due to the large number of functional disciplines and associated interdisciplinary couplings. Recent developments in system sensitivity analysis as applied to complex non-hierarchic multidisciplinary design optimization problems enable the decomposition of these complex interactions into sub-processes that can be evaluated in parallel. The application of these techniques results in significant cost, accuracy, and visibility benefits for the entire design synthesis process.
NASA Astrophysics Data System (ADS)
Aprilia, Ayu Rizky; Santoso, Imam; Ekasari, Dhita Murita
2017-05-01
Yogurt is a milk-based product with beneficial health effects. The yogurt production process is very susceptible to failure because it involves bacteria and fermentation, and for an industry such risks may cause harm and have a negative impact. For a product to be successful and profitable, the risks that may occur during the production process must be analyzed. Risk analysis can identify risks in detail, prevent them, and determine their handling, so that the risks can be minimized. Therefore, this study analyzes the risks of the production process with a case study in CV.XYZ. The methods used in this research are Fuzzy Failure Mode and Effect Analysis (fuzzy FMEA) and Fault Tree Analysis (FTA). The results showed that there are 6 risks arising from equipment, raw material, and process variables. These include the critical risk of an insufficiently aseptic process, specifically damage to the yogurt starter through contamination by fungi or other bacteria, and a lack of equipment sanitation. The quantitative FTA results showed that the highest probability is that of an insufficiently aseptic process, at 3.902%. The recommendations for improvement include establishing SOPs (Standard Operating Procedures) covering the process, workers, and environment, controlling the yogurt starter, and improving production planning and equipment sanitation using hot-water immersion.
ERIC Educational Resources Information Center
Ho, Hsuan-Fu; Hung, Chia-Chi
2008-01-01
Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…
Identifying influential factors of business process performance using dependency analysis
NASA Astrophysics Data System (ADS)
Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank
2011-02-01
We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
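A minimal sketch of the dependency-tree idea follows, using a shallow regression tree over two invented metrics (service_time as a QoS metric, queue_len as a process metric) to explain a synthetic KPI; the framework's actual mining pipeline is not reproduced here.

```python
# Sketch: learn a dependency tree of a KPI on process/QoS metrics with a
# shallow regression tree (hypothetical metric names and simulated data).
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(3)
n = 500
service_time = rng.normal(200, 40, n)        # ms, QoS metric
queue_len = rng.poisson(5, n)                # process metric
kpi = service_time + 15 * queue_len + rng.normal(0, 10, n)  # e.g. order lead time

X = np.column_stack([service_time, queue_len])
tree = DecisionTreeRegressor(max_depth=3).fit(X, kpi)
print(export_text(tree, feature_names=["service_time", "queue_len"]))
```

The printed tree shows which metric splits explain KPI variation at each level, which is the drill-down structure the framework exposes to business analysts.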
Zeng, Rui; Fu, Juan; Wu, La-Bin; Huang, Lin-Fang
2013-07-01
To analyze the components of Citrus reticulata and salt-processed C. reticulata by ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (UPLC-Q-TOF/MS), and to compare the changes in components before and after processing with salt. Principal component analysis (PCA) and orthogonal partial least squares discriminant analysis (OPLS-DA) were adopted to analyze the differences in fingerprints between crude and processed C. reticulata, showing increased contents of eriocitrin, limonin, nomilin and obacunone in salt-processed C. reticulata. Potential chemical markers were identified as limonin, obacunone and nomilin, which could be used as index components for distinguishing crude from processed C. reticulata.
NASA Astrophysics Data System (ADS)
Han, Zhenyu; Sun, Shouzheng; Fu, Yunzhong; Fu, Hongya
2017-10-01
Viscidity is an important physical indicator for assessing the fluidity of resin, which is beneficial for bringing the resin into effective contact with the fibers and reducing manufacturing defects during the automated fiber placement (AFP) process. However, the effect of processing parameters on viscidity evolution during the AFP process is rarely studied. In this paper, viscidities at different scales are analyzed based on a multi-scale analysis method. Firstly, the viscous dissipation energy (VDE) within a meso-unit under different processing parameters is assessed using the finite element method (FEM). According to a multi-scale energy transfer model, the meso-unit energy is used as the boundary condition for microscopic analysis. Furthermore, the molecular structure of the micro-system is built by the molecular dynamics (MD) method, and viscosity curves are then obtained by integrating the stress autocorrelation function (SACF) over time. Finally, the correlation characteristics of the processing parameters with respect to viscosity are revealed using the grey relational analysis method (GRAM). A group of processing parameters is found that achieves stable viscosity and better fluidity of the resin.
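The SACF-integration step is essentially a Green-Kubo calculation. A toy sketch follows, with a synthetic exponential SACF and assumed volume, temperature, and time step standing in for actual MD output; none of these values come from the paper.

```python
# Green-Kubo-style sketch: viscosity as the time integral of the stress
# autocorrelation function (SACF). Inputs are synthetic placeholders.
import numpy as np

dt = 1e-3                                   # sampling interval (assumed units)
t = np.arange(0, 10, dt)
sacf = np.exp(-t / 0.5)                     # toy SACF; real MD output in practice

V, kB, T = 1e-26, 1.380649e-23, 450.0       # volume (m^3), Boltzmann const, K (assumed)
integral = np.sum(0.5 * (sacf[1:] + sacf[:-1])) * dt   # trapezoidal rule
eta = (V / (kB * T)) * integral             # Green-Kubo shear viscosity estimate
print(f"viscosity estimate: {eta:.3e} (toy units, synthetic SACF)")
```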
Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0
Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...
2008-01-01
The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge to managing and processing the information. Simply characterizing the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering, and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules and the increasing use of performance metadata.
Implementing EVM Data Analysis Adding Value from a NASA Project Manager's Perspective
NASA Technical Reports Server (NTRS)
Counts, Stacy; Kerby, Jerald
2006-01-01
Data analysis is one of the keys to an effective Earned Value Management (EVM) process. Project Managers (PMs) must continually evaluate data in assessing the health of their projects, and good analysis of data can assist PMs in making better decisions in managing projects. To better support our PMs, the National Aeronautics and Space Administration (NASA) Marshall Space Flight Center (MSFC) recently renewed its emphasis on sound EVM data analysis practices and processes. During this presentation we will discuss the approach that MSFC followed in implementing better data analysis across its Center, and address our approach to effectively equipping and supporting our projects in applying a sound data analysis process. In addition, the PM for the Space Station Biological Research Project will share her experiences of how effective data analysis can benefit a PM in the decision-making process, and will discuss how the emphasis on data analysis has helped create a solid method for assessing the project's performance. Used successfully, data analysis can be an effective and efficient tool in today's environment of increasing workloads and downsizing workforces.
NASA Technical Reports Server (NTRS)
Bonine, Lauren
2015-01-01
The presentation provides insight into the schedule risk analysis process used by the Stratospheric Aerosol and Gas Experiment III on the International Space Station Project. The presentation focuses on the schedule risk analysis process highlighting the methods for identification of risk inputs, the inclusion of generic risks identified outside the traditional continuous risk management process, and the development of tailored analysis products used to improve risk informed decision making.
Analysis of Hospital Processes with Process Mining Techniques.
Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises
2015-01-01
Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for the analysis of hospital processes with process mining techniques. The proposed solution intends to achieve the generation of high-quality event logs in the system. The performed analyses allowed for redefining functions in the system and proposing a proper flow of information. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application for making clinical and administrative decisions for the management of hospital activities.
A generic Transcriptomics Reporting Framework (TRF) for 'omics data processing and analysis.
Gant, Timothy W; Sauer, Ursula G; Zhang, Shu-Dong; Chorley, Brian N; Hackermüller, Jörg; Perdichizzi, Stefania; Tollefsen, Knut E; van Ravenzwaay, Ben; Yauk, Carole; Tong, Weida; Poole, Alan
2017-12-01
A generic Transcriptomics Reporting Framework (TRF) is presented that lists parameters that should be reported in 'omics studies used in a regulatory context. The TRF encompasses the processes of transcriptome profiling, from data generation to a processed list of differentially expressed genes (DEGs) ready for interpretation. Included within the TRF is a reference baseline analysis (RBA) that encompasses raw data selection; data normalisation; recognition of outliers; and statistical analysis. The TRF itself does not dictate the methodology for data processing, but deals with what should be reported. Its principles are also applicable to sequencing data and other 'omics. In contrast, the RBA specifies a simple data processing and analysis methodology that is designed to provide a comparison point for other approaches and is exemplified here by a case study. By providing transparency on the steps applied during 'omics data processing and analysis, the TRF will increase confidence in the processing of 'omics data and in their regulatory use. Applicability of the TRF is ensured by its simplicity and generality. The TRF can be applied to all types of regulatory 'omics studies, and it can be executed using different commonly available software tools. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.
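As a loose illustration of what a baseline-style pipeline involves (not the TRF's actual RBA specification), the sketch below runs median normalisation, per-gene t-tests, and Benjamini-Hochberg correction on simulated arrays; all data and cutoffs are invented.

```python
# Sketch of a simple DEG analysis: normalise, test per gene, control FDR.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
control = rng.lognormal(5, 0.3, (1000, 4))            # 1000 genes x 4 arrays
treated = control * rng.lognormal(0, 0.1, (1000, 4))  # mostly unchanged
treated[:50] *= 2.0                                   # spike in 50 true DEGs

def norm(m):                          # median-centre each array on log2 scale
    lm = np.log2(m)
    return lm - np.median(lm, axis=0)

c, t = norm(control), norm(treated)
p = stats.ttest_ind(t, c, axis=1).pvalue
order = np.argsort(p)
bh = p[order] * len(p) / (np.arange(len(p)) + 1)      # BH step-up values
adj = np.minimum.accumulate(bh[::-1])[::-1]           # enforce monotonicity
print("genes passing FDR 0.05:", int(np.sum(adj < 0.05)))
```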
Thread concept for automatic task parallelization in image analysis
NASA Astrophysics Data System (ADS)
Lueckenhaus, Maximilian; Eckstein, Wolfgang
1998-09-01
Parallel processing of image analysis tasks is an essential method to speed up image processing and helps to exploit the full capacity of distributed systems. However, writing parallel code is a difficult and time-consuming process and often leads to an architecture-dependent program that has to be re-implemented when the hardware changes. It is therefore highly desirable to perform the parallelization automatically. For this we have developed a special kind of thread concept for image analysis tasks. Threads derived from one subtask may share objects and run in the same context, but may follow different threads of execution and work on different data in parallel. In this paper we describe the basics of our thread concept and show how it can be used as the basis of automatic task parallelization to speed up image processing. We further illustrate the design and implementation of an agent-based system that uses image analysis threads for generating and processing parallel programs while taking into account the available hardware. Tests made with our system prototype show that the thread concept, combined with the agent paradigm, is suitable for speeding up image processing through automatic parallelization of image analysis tasks.
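A minimal sketch of the underlying idea, running one invented analysis operator over several images in parallel with a shared-context thread pool; the operator, data, and worker count are placeholders, not the paper's system.

```python
# Sketch: task-level parallelism for independent image-analysis steps
# using a thread pool (illustrative operator and synthetic images).
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def edge_strength(img):
    # Stand-in for one analysis operator: mean gradient magnitude.
    gy, gx = np.gradient(img.astype(float))
    return float(np.hypot(gx, gy).mean())

images = [np.random.default_rng(i).random((256, 256)) for i in range(8)]

with ThreadPoolExecutor(max_workers=4) as pool:   # threads share one context
    results = list(pool.map(edge_strength, images))
print(results[:3])
```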
Principal process analysis of biological models.
Casagranda, Stefano; Touzeau, Suzanne; Ropers, Delphine; Gouzé, Jean-Luc
2018-06-14
Understanding the dynamical behaviour of biological systems is challenged by their large number of components and interactions. While efforts have been made in this direction to reduce model complexity, they often prove insufficient to grasp which and when model processes play a crucial role. Answering these questions is fundamental to unravel the functioning of living organisms. We design a method for dealing with model complexity, based on the analysis of dynamical models by means of Principal Process Analysis. We apply the method to a well-known model of circadian rhythms in mammals. The knowledge of the system trajectories allows us to decompose the system dynamics into processes that are active or inactive with respect to a certain threshold value. Process activities are graphically represented by Boolean and Dynamical Process Maps. We detect model processes that are always inactive, or inactive on some time interval. Eliminating these processes reduces the complex dynamics of the original model to the much simpler dynamics of the core processes, in a succession of sub-models that are easier to analyse. We quantify by means of global relative errors the extent to which the simplified models reproduce the main features of the original system dynamics and apply global sensitivity analysis to test the influence of model parameters on the errors. The results obtained prove the robustness of the method. The analysis of the sub-model dynamics allows us to identify the source of circadian oscillations. We find that the negative feedback loop involving proteins PER, CRY, CLOCK-BMAL1 is the main oscillator, in agreement with previous modelling and experimental studies. In conclusion, Principal Process Analysis is a simple-to-use method, which constitutes an additional and useful tool for analysing the complex dynamical behaviour of biological systems.
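The activity-thresholding idea can be sketched very simply: evaluate each process term along a trajectory, normalise by the total, and switch terms on or off against a threshold. The toy two-term model and the 0.1 threshold below are invented for illustration and are not the circadian model's processes.

```python
# Sketch of process-activity scoring: along a simulated trajectory, compute
# each process term's relative weight and mark it inactive below a threshold.
import numpy as np

t = np.linspace(0, 48, 481)                  # hours
x = 5 + 3 * np.sin(2 * np.pi * t / 24)       # toy oscillating state variable

production = 4.0 / (1.0 + x)                 # hypothetical process terms
degradation = 0.3 * x
total = np.abs(production) + np.abs(degradation)

activity = np.abs(production) / total        # relative activity in [0, 1]
active = activity > 0.1                      # one row of a Boolean process map
print("production active fraction of time:", active.mean())
```

Stacking one such Boolean row per process over time gives the Boolean Process Maps described in the abstract, and dropping always-inactive rows yields the simplified sub-models.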
Qin, Kunming; Wang, Bin; Li, Weidong; Cai, Hao; Chen, Danni; Liu, Xiao; Yin, Fangzhou; Cai, Baochang
2015-05-01
In traditional Chinese medicine, raw and processed herbs are used to treat different diseases. Suitable quality assessment methods are crucial for the discrimination between raw and processed herbs. The dried fruits of Arctium lappa L. and their processed products are widely used in traditional Chinese medicine, yet their therapeutic effects are different. In this study, a novel strategy using high-performance liquid chromatography and diode array detection coupled with multivariate statistical analysis to rapidly explore raw and processed Arctium lappa L. was proposed and validated. Four main components in a total of 30 batches of raw and processed Fructus Arctii samples were analyzed, and ten characteristic peaks were identified in the fingerprint common pattern. Furthermore, similarity evaluation, principal component analysis, and hierarchical cluster analysis were applied to demonstrate the distinction. The results suggested that the relative amounts of the chemical components of raw and processed Fructus Arctii samples are different. This new method has been successfully applied to detect raw and processed Fructus Arctii in marketed herbal medicinal products. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
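A small sketch of the PCA-plus-hierarchical-clustering step on simulated peak-area fingerprints follows; the data are invented stand-ins for the measured peak areas of the 30 batches.

```python
# Sketch: PCA then hierarchical clustering to separate raw and processed
# herb fingerprints (simulated peak areas, two batch types).
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(5)
raw = rng.normal(1.0, 0.1, (15, 10))          # 15 batches x 10 peak areas
processed = rng.normal(1.3, 0.1, (15, 10))    # shifted marker-peak intensities
X = np.vstack([raw, processed])

scores = PCA(n_components=2).fit_transform(X)  # project onto first two PCs
clusters = fcluster(linkage(scores, method="ward"), t=2, criterion="maxclust")
print(clusters)  # expect the two batch types to fall into separate clusters
```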
Integration of rocket turbine design and analysis through computer graphics
NASA Technical Reports Server (NTRS)
Hsu, Wayne; Boynton, Jim
1988-01-01
An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.
Launch COLA Gap Analysis for Protection of the International Space Station
NASA Astrophysics Data System (ADS)
Jenkin, Alan B.; McVey, John P.; Peterson, Glenn E.; Sorge, Marlon E.
2013-08-01
For launch missions in general, a collision avoidance (COLA) gap exists between the end of the time interval covered by standard launch COLA screening and the time that other spacecraft can clear a collision with the newly launched objects. To address this issue for the International Space Station (ISS), a COLA gap analysis process has been developed. The first part of the process, nodal separation analysis, identifies launch dates and launch window opportunities when the orbit traces of a launched object and the ISS could cross during the COLA gap. The second and newest part of the analysis process, Monte Carlo conjunction probability analysis, is performed closer to the launch dates of concern to reopen some of the launch window opportunities that would be closed by nodal separation analysis alone. Both parts of the process are described and demonstrated on sample missions.
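A hedged sketch of the Monte Carlo idea described above: sample the relative position at closest approach from an assumed Gaussian dispersion and count samples inside a keep-out distance. The nominal miss, covariance and keep-out radius are invented for illustration and are not the paper's values:

```python
import numpy as np

# Monte Carlo conjunction probability: fraction of sampled relative positions
# of a launched object and the ISS that fall inside a keep-out sphere.
rng = np.random.default_rng(42)
nominal_miss = np.array([8.0, 3.0, 1.0])   # km, assumed nominal separation
cov = np.diag([4.0, 1.0, 0.25])            # km^2, assumed dispersion covariance
keep_out = 5.0                             # km, assumed screening threshold

samples = rng.multivariate_normal(nominal_miss, cov, size=200_000)
miss_distances = np.linalg.norm(samples, axis=1)
p_violation = np.mean(miss_distances < keep_out)
print(f"Estimated probability of keep-out violation: {p_violation:.2e}")
```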
NASA Technical Reports Server (NTRS)
Phillips, Dave; Haas, William; Barth, Tim; Benjamin, Perakath; Graul, Michael; Bagatourova, Olga
2005-01-01
Range Process Simulation Tool (RPST) is a computer program that assists managers in rapidly predicting and quantitatively assessing the operational effects of proposed technological additions to, and/or upgrades of, complex facilities and engineering systems such as the Eastern Test Range. Originally designed for application to space transportation systems, RPST is also suitable for assessing effects of proposed changes in industrial facilities and large organizations. RPST follows a model-based approach that includes finite-capacity schedule analysis and discrete-event process simulation. A component-based, scalable, open architecture makes RPST easily and rapidly tailorable for diverse applications. Specific RPST functions include: (1) definition of analysis objectives and performance metrics; (2) selection of process templates from a process-template library; (3) configuration of process models for detailed simulation and schedule analysis; (4) design of operations-analysis experiments; (5) schedule- and simulation-based process analysis; and (6) optimization of performance by use of genetic algorithms and simulated annealing. The main benefits afforded by RPST are the provision of information that can be used to reduce costs of operation and maintenance, and the capability for affordable, accurate, and reliable prediction and exploration of the consequences of many alternative proposed decisions.
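As a flavour of what discrete-event process simulation computes, here is a minimal single-resource queue in Python; the arrival and service rates and the single-server layout are illustrative assumptions, unrelated to RPST's internals:

```python
import random

# Minimal discrete-event sketch: exponential arrivals at one finite-capacity
# resource; we advance event times directly and collect waiting times.
random.seed(1)
ARRIVAL_RATE, SERVICE_RATE, N_JOBS = 0.8, 1.0, 10_000  # illustrative rates

t, free_at, waits = 0.0, 0.0, []
for _ in range(N_JOBS):
    t += random.expovariate(ARRIVAL_RATE)   # next arrival event
    start = max(t, free_at)                 # job waits while the resource is busy
    waits.append(start - t)
    free_at = start + random.expovariate(SERVICE_RATE)  # departure event

print(f"Average wait: {sum(waits) / len(waits):.2f} time units")
```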
Text analysis devices, articles of manufacture, and text analysis methods
Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C
2013-05-28
Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes processing circuitry configured to analyze initial text to generate a measurement basis usable in analysis of subsequent text. The measurement basis comprises a plurality of measurement features from the initial text, a plurality of dimension anchors from the initial text, and a plurality of associations of the measurement features with the dimension anchors. The processing circuitry is further configured to access a viewpoint indicative of a perspective of interest of a user with respect to the analysis of the subsequent text, and to use the viewpoint to generate the measurement basis.
This Applications Analysis Report evaluates the Soliditech, Inc., solidification/ stabilization process for the on-site treatment of waste materials. The Soliditech process mixes and chemically treats waste material with Urrichem (a proprietary reagent), additives, pozzolanic mat...
NASA Technical Reports Server (NTRS)
Lieberman, S. L.
1974-01-01
Tables are presented which include: material properties; elemental analysis; silicone RTV formulations; polyester systems and processing; epoxy preblends and processing; urethane materials and processing; epoxy-urethane elemental analysis; flammability test results; and vacuum effects.
Applications of High-speed motion analysis system on Solid Rocket Motor (SRM)
NASA Astrophysics Data System (ADS)
Liu, Yang; He, Guo-qiang; Li, Jiang; Liu, Pei-jin; Chen, Jian
2007-01-01
The high-speed motion analysis system records images at up to 12,000 frames per second and analyzes them with an image processing system. The system stores data and images directly in electronic memory, which is convenient for management and analysis. Combining the high-speed motion analysis system with an X-ray radiography system established a high-speed real-time X-ray radiography system capable of diagnosing and measuring dynamic, high-speed processes inside opaque structures. Image processing software was developed to improve the quality of the original images and extract more precise information. Typical applications of the high-speed motion analysis system to solid rocket motors (SRM) are introduced in this paper. The system was used to study anomalous combustion of solid propellant grains with defects, real-time measurement of insulator erosion, the explosion incision process of a motor, the structure and wave character of the plume during ignition and flameout, end burning of solid propellant, flame-front measurement, and compatibility between airplane and missile during missile launch, and significant results were achieved. For application of the high-speed motion analysis system to solid rocket motors, key problems that degraded image quality, such as motor vibration, power-supply instability, geometric distortion, and noise disturbance, were solved. The image processing software improved the ability to measure image characteristics. The experimental results showed that the system is a powerful facility for studying instantaneous, high-speed processes in solid rocket motors, and its capability grows with the development of image processing techniques.
First On-Site Data Analysis System for Subaru/Suprime-Cam
NASA Astrophysics Data System (ADS)
Furusawa, Hisanori; Okura, Yuki; Mineo, Sogo; Takata, Tadafumi; Nakata, Fumiaki; Tanaka, Manobu; Katayama, Nobuhiko; Itoh, Ryosuke; Yasuda, Naoki; Miyazaki, Satoshi; Komiyama, Yutaka; Utsumi, Yousuke; Uchida, Tomohisa; Aihara, Hiroaki
2011-03-01
We developed an automated on-site quick analysis system for mosaic CCD data from Suprime-Cam, a wide-field camera mounted at the prime focus of the Subaru Telescope, Mauna Kea, Hawaii. The first version of the data-analysis system was constructed and began operating in general observations. This system is a new observing-support function at the Subaru Telescope that provides the Subaru user community with automated on-site data evaluation, aiming to improve observers' productivity, especially in large imaging surveys. The new system assists data evaluation by continuously monitoring the characteristics of every data frame during observations. The evaluation results and the data frames processed by this system are also useful for reducing the data-processing time in a full analysis after an observation. The primary analysis functions implemented in the system comprise automated real-time analysis for data evaluation and on-demand analysis, executed upon request, including mosaicing analysis and flat-making analysis. In data evaluation, which is controlled by the organizing software, the database keeps track of the analysis histories as well as the evaluated values of data frames, including seeing and sky background levels; it also helps in the selection of frames for mosaicing and flat-making analysis. We examined the system performance and confirmed an improvement in the data-processing time by a factor of 9 with the aid of distributed parallel data processing and in-memory data processing, which makes the automated data evaluation effective.
Pedagogical issues for effective teaching of biosignal processing and analysis.
Sandham, William A; Hamilton, David J
2010-01-01
Biosignal processing and analysis is generally perceived by many students as a challenging topic, both to understand and to become adept in the necessary analytical skills. This is a direct consequence of the high mathematical content involved and the many abstract features of the topic. The MATLAB and Mathcad software packages offer an excellent algorithm development environment for teaching biosignal processing and analysis modules, and can also be used effectively in many biosignal, and indeed bioengineering, research areas. In this paper, traditional introductory and advanced biosignal processing (and analysis) syllabi are reviewed, and the use of MATLAB and Mathcad for teaching and research is illustrated with a number of examples.
An analysis of the Petri net based model of the human body iron homeostasis process.
Sackmann, Andrea; Formanowicz, Dorota; Formanowicz, Piotr; Koch, Ina; Blazewicz, Jacek
2007-02-01
In the paper, a Petri net based model of human body iron homeostasis is presented and analyzed. Body iron homeostasis is an important but not fully understood complex process. The modeling of the process is expressed in the language of Petri net theory. Applying this theory to the description of biological processes allows for very precise analysis of the resulting models. Here, such an analysis of the body iron homeostasis model from a mathematical point of view is given.
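For readers unfamiliar with the formalism, a minimal Petri net in Python: places carry tokens, and a transition fires when all of its input places are sufficiently marked. The toy two-place net and its biological labels are invented for illustration, not taken from the paper's model:

```python
# Minimal Petri net: a marking maps places to token counts; each transition
# has a pre-set (consumed tokens) and a post-set (produced tokens).
marking = {"Fe_plasma": 1, "transferrin": 1, "Fe_bound": 0}

transitions = {
    "binding": ({"Fe_plasma": 1, "transferrin": 1}, {"Fe_bound": 1}),
}

def enabled(name):
    pre, _ = transitions[name]
    return all(marking[p] >= n for p, n in pre.items())

def fire(name):
    pre, post = transitions[name]
    for p, n in pre.items():
        marking[p] -= n
    for p, n in post.items():
        marking[p] += n

if enabled("binding"):
    fire("binding")
print(marking)  # {'Fe_plasma': 0, 'transferrin': 0, 'Fe_bound': 1}
```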
ERIC Educational Resources Information Center
Verger, Antoni; Hermo, Javier Pablo
2010-01-01
The article analyses two processes of higher education regionalisation, MERCOSUR-Educativo in Latin America and the Bologna Process in Europe, from a comparative perspective. The comparative analysis is centered on the content and the governance of both processes and, specifically, on the reasons of their uneven evolution and implementation. We…
Analysis of Doppler radar windshear data
NASA Technical Reports Server (NTRS)
Williams, F.; Mckinney, P.; Ozmen, F.
1989-01-01
The objective of this analysis is to process Lincoln Laboratory Doppler radar data obtained during FLOWS testing at Huntsville, Alabama, in the summer of 1986, to characterize windshear events. The processing includes plotting velocity and F-factor profiles, histogram analysis to summarize statistics, and correlation analysis to demonstrate any correlation between different data fields.
ERIC Educational Resources Information Center
Tutlys, Vidmantas; Spöttl, Georg
2017-01-01
Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…
IRB Process Improvements: A Machine Learning Analysis.
Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A
2017-06-01
Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on initial identified predictors, changes to IRB workflow and staffing procedures were instituted and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process including type of IRB review to be conducted, whether a protocol falls under Veteran's Administration purview and specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.
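A hedged sketch of this style of analysis: fit a tree-ensemble regressor to predict processing time from protocol attributes and inspect which predictors matter. The feature names echo the predictors mentioned above, but the synthetic data and model choice are illustrative assumptions, not the study's actual pipeline:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500
review_type = rng.integers(0, 3, n)   # e.g. exempt / expedited / full board
va_purview = rng.integers(0, 2, n)    # falls under VA purview?
staff_id = rng.integers(0, 5, n)      # assigned reviewer
# Synthetic processing times with invented effect sizes
days = 10 + 15 * review_type + 8 * va_purview + rng.normal(0, 5, n)

X = np.column_stack([review_type, va_purview, staff_id])
model = RandomForestRegressor(n_estimators=200, random_state=0)
print("CV R^2:", cross_val_score(model, X, days, cv=5).mean())
model.fit(X, days)
print("Feature importances:", model.feature_importances_)
```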
Teaching concept analysis to graduate nursing students.
Schiller, Catharine J
2018-04-01
To provide guidance to educators who use the Wilson (1963) concept analysis method, as modified by Walker and Avant (2011), in their graduate nursing curriculum. While graduate nursing curricula often include a concept analysis assignment, there is a paucity of literature to assist educators in guiding students through this challenging process. This article details one way for educators to assist graduate nursing students in learning how to undertake each step of the Wilson (1963) concept analysis method, as modified by Walker and Avant (2011). Using examples, the article walks the reader through the Walker and Avant (2011) concept analysis process and addresses the issues commonly encountered by educators during this process. Having clear information about the steps involved in developing a concept analysis will make it easier for educators to incorporate it into their graduate nursing curriculum and to effectively guide students on their journey through this process. © 2018 Wiley Periodicals, Inc.
Environmental Assessment: Hurlburt Field Soundside Boathouse and Restroom Facility Construction
2007-08-01
In accordance with the National Environmental Policy Act (NEPA, 42 U.S.C. §4321, et seq.) and Air Force Instruction (AFI) 32-7061, The Environmental Impact Analysis Process, the USAF concludes that the Proposed Action will have no significant environmental impact. These regulations require federal agencies to analyze the potential environmental...
Signal Processing and Interpretation Using Multilevel Signal Abstractions.
1986-06-01
mappings expressed in the Fourier domain. Previously proposed causal analysis techniques for diagnosis are based on the analysis of intermediate data ... can be processed either as individual one-dimensional waveforms or as multichannel data ... for source detection and direction ... microphone data. The signal processing for both spectral analysis of microphone signals and direction determination of acoustic sources involves
NASA Astrophysics Data System (ADS)
Nesladek, Pavel; Wiswesser, Andreas; Sass, Björn; Mauermann, Sebastian
2008-04-01
The critical dimension off-target (CDO) is a key parameter for mask house customers, directly affecting the performance of the mask. The CDO is the difference between the feature size target and the measured feature size. The change of CD during the process is compensated either within the process or by data correction; these compensation methods are commonly called process bias and data bias, respectively. A difference between data bias and process bias in manufacturing results in a systematic CDO error; however, this systematic error does not account for the instability of the process bias, which results from minor variations in manufacturing processes and from changes in materials and/or logistics. Using several masks, the CDO of the manufacturing line can be estimated. Systematic investigation of the unit-process contributions to CDO, and analysis of the factors influencing them, requires a solid understanding of each unit process and a large number of masks. Rough identification of contributing processes, and splitting of the final CDO variation between processes, can be done with approximately 50 masks of identical design, material and process. Such an amount of data allows the main contributors to be identified and their effects estimated by means of analysis of variance (ANOVA) combined with multivariate analysis. The analysis does not provide information about the root cause of the variation within a particular unit process, but it provides a good estimate of the impact of that process on the stability of the manufacturing line. Additionally, the analysis can identify possible interactions between processes, which cannot be investigated if only single processes are considered. The goal of this work is to evaluate the limits of CDO budgeting models given the precision and number of measurements, and to partition the variation within the manufacturing process. The CDO variation splits, according to the suggested model, into contributions from particular processes or process groups. Finally, the power of this method to determine the absolute strength of each parameter is demonstrated. Identifying the root cause of the variation within a unit process itself is not within the scope of this work.
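A hedged sketch of ANOVA-based partitioning of CDO variation between unit processes; the process factors ("write_tool", "etch_chamber"), the effect sizes and the sample of 50 masks are hypothetical placeholders, not the fab's data:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(7)
n = 50  # roughly the number of identical masks the text says is needed
df = pd.DataFrame({
    "write_tool": rng.choice(["W1", "W2"], n),
    "etch_chamber": rng.choice(["E1", "E2", "E3"], n),
})
# Invented per-process offsets (nm) plus residual noise
effect = df["write_tool"].map({"W1": 0.0, "W2": 1.5}) \
       + df["etch_chamber"].map({"E1": 0.0, "E2": 0.8, "E3": -0.5})
df["cdo"] = effect + rng.normal(0, 0.5, n)

model = ols("cdo ~ C(write_tool) + C(etch_chamber)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # sum-of-squares split per process
```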
Salari-Moghaddam, Asma; Milajerdi, Alireza; Larijani, Bagher; Esmaillzadeh, Ahmad
2018-06-01
No earlier study has summarized findings from previous publications on processed red meat intake and risk of chronic obstructive pulmonary disease (COPD). This systematic review and meta-analysis was conducted to examine the association between processed red meat intake and COPD risk. We searched PubMed/Medline, ISI Web of Knowledge, Scopus, EMBASE and Google Scholar up to April 2018 to identify relevant studies. Prospective cohort studies that considered processed red meat as the exposure variable and COPD as the main outcome variable, or as one of the outcomes, were included in the systematic review. Publications in which hazard ratios (HRs) were reported as the effect size were included in the meta-analysis. Finally, five cohort studies were considered in this systematic review and meta-analysis. In total, 289,952 participants, including 8338 subjects with COPD, aged ≥27 years, were included in the meta-analysis. These studies were from Sweden and the US. Linear dose-response meta-analysis revealed that each 50 g/week increase in processed red meat intake was associated with an 8% higher risk of COPD (HR: 1.08; 95% CI: 1.03, 1.13). There was also evidence of a non-linear association between processed red meat intake and risk of COPD (P < 0.001). In this systematic review and meta-analysis, we found a significant positive association between processed red meat intake and risk of COPD. CRD42017077971. Copyright © 2018 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
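A small worked illustration of how such a linear dose-response estimate scales; the extrapolation assumes log-linearity and is only illustrative, not a claim from the study:

```python
import math

# If each 50 g/week increment carries HR = 1.08, a 150 g/week increment
# under the same log-linear assumption corresponds to exp(3 * ln 1.08).
hr_per_50g = 1.08
increment_g = 150
hr = math.exp((increment_g / 50) * math.log(hr_per_50g))
print(f"HR for {increment_g} g/week: {hr:.2f}")  # ~1.26
```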
NASA Astrophysics Data System (ADS)
Yussup, N.; Rahman, N. A. A.; Ibrahim, M. M.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.
2017-01-01
Neutron activation analysis (NAA) has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s. Most of the established procedures, especially from sample registration to sample analysis, are performed manually. These manual procedures, carried out by the NAA laboratory personnel, are time consuming and inefficient. Hence, software to support system automation was developed to provide an effective method that replaces redundant manual data entries and yields faster sample analysis and calculation. This paper describes the design and development of automation software for the NAA process, which consists of three sub-programs: sample registration; hardware control and data acquisition; and sample analysis. The data flow and the connections between the sub-programs are explained. The software was developed using the National Instruments LabVIEW development package.
NASA Astrophysics Data System (ADS)
Zaborowicz, M.; Przybył, J.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.; Lewicki, A.; Przybył, K.
2014-04-01
The aim of the project was to develop software that extracts the characteristics of a greenhouse tomato from its image. Data gathered during image analysis and processing were used to build learning sets for artificial neural networks. The program processes pictures in JPEG format, acquires statistical information from each picture, and exports it to an external file. The software is intended to batch-analyze the collected research material, with the obtained information saved as a CSV file. The program computes 33 independent parameters to describe each tested image. The application is dedicated to processing and image analysis of greenhouse tomatoes, but it can also be used to analyze other fruits and vegetables of spherical shape.
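A minimal sketch of batch feature extraction to CSV in the spirit described above; the file names and the three statistics are illustrative stand-ins for the program's 33 parameters:

```python
import csv
import numpy as np
from PIL import Image

def describe(path):
    # Load the image as an RGB array and compute simple colour descriptors.
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float)
    return {
        "file": path,
        "mean_red": rgb[..., 0].mean(),
        "mean_green": rgb[..., 1].mean(),
        "std_intensity": rgb.mean(axis=-1).std(),
    }

# Hypothetical input files; in practice a whole directory would be globbed.
rows = [describe(p) for p in ["tomato_001.jpg", "tomato_002.jpg"]]
with open("features.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
```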
NASA Technical Reports Server (NTRS)
Davis, Frank W.; Quattrochi, Dale A.; Ridd, Merrill K.; Lam, Nina S.-N.; Walsh, Stephen J.
1991-01-01
This paper discusses some basic scientific issues and research needs in the joint processing of remotely sensed and GIS data for environmental analysis. Two general topics are treated in detail: (1) scale dependence of geographic data and the analysis of multiscale remotely sensed and GIS data, and (2) data transformations and information flow during data processing. The discussion of scale dependence focuses on the theory and applications of spatial autocorrelation, geostatistics, and fractals for characterizing and modeling spatial variation. Data transformations during processing are described within the larger framework of geographical analysis, encompassing sampling, cartography, remote sensing, and GIS. Development of better user interfaces between image processing, GIS, database management, and statistical software is needed to expedite research on these and other impediments to integrated analysis of remotely sensed and GIS data.
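As a concrete instance of the spatial autocorrelation measures discussed above, here is Moran's I, the standard statistic, on a toy transect; the 1-D adjacency weights and pixel values are illustrative choices:

```python
import numpy as np

# Moran's I: I = (n / W) * sum_ij w_ij (x_i - xbar)(x_j - xbar)
#                        / sum_i (x_i - xbar)^2
values = np.array([3.0, 3.2, 3.1, 7.8, 8.1, 8.0])  # toy transect of pixel values
n = len(values)
w = np.zeros((n, n))
for i in range(n - 1):                              # neighbours: adjacent cells
    w[i, i + 1] = w[i + 1, i] = 1.0

dev = values - values.mean()
moran_i = (n / w.sum()) * (dev @ w @ dev) / (dev @ dev)
print(f"Moran's I = {moran_i:.2f}")                 # positive: similar values cluster
```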
Combined process automation for large-scale EEG analysis.
Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E
2012-01-01
Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
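A hedged sketch of steps (2) and (3) of such a pipeline, band-pass filtering and threshold spike detection; the synthetic signal, band edges and 5-sigma threshold are illustrative assumptions, not the authors' settings:

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0                                  # Hz, assumed sampling rate
t = np.arange(0, 10, 1 / fs)
eeg = np.random.default_rng(0).normal(0, 1, t.size)
eeg[::2000] += 8.0                           # inject artificial "spikes"

# (2) user-defined band: 4-12 Hz, 4th-order Butterworth, zero-phase
b, a = butter(4, [4 / (fs / 2), 12 / (fs / 2)], btype="band")
band = filtfilt(b, a, eeg)

# (3) crude spike detection: samples exceeding 5 standard deviations
spikes = np.flatnonzero(np.abs(eeg) > 5 * eeg.std())
print(f"{spikes.size} spike samples detected")
```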
Heat and Mass Transfer Processes in Scrubber of Flue Gas Heat Recovery Device
NASA Astrophysics Data System (ADS)
Veidenbergs, Ivars; Blumberga, Dagnija; Vigants, Edgars; Kozuhars, Grigorijs
2010-01-01
The paper deals with the heat and mass transfer process research in a flue gas heat recovery device, where complicated cooling, evaporation and condensation processes are taking place simultaneously. The analogy between heat and mass transfer is used during the process of analysis. In order to prepare a detailed process analysis based on heat and mass process descriptive equations, as well as the correlation for wet gas parameter calculation, software in the
An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis
NASA Astrophysics Data System (ADS)
Kim, Yongmin; Alexander, Thomas
1986-06-01
In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.
Sapunar, Damir; Grković, Ivica; Lukšić, Davor; Marušić, Matko
2016-05-01
Our aim was to describe a comprehensive model of internal quality management (QM) at a medical school, founded on a business process analysis (BPA) software tool. The BPA software tool was used as the core element for describing all working processes in our medical school, and the resulting system served as a comprehensive model of internal QM. The quality management system at the University of Split School of Medicine included the documentation and analysis of all business processes within the School. The analysis revealed 80 weak points related to one or several business processes. A precise analysis of medical school business processes allows identification of unfinished, unclear and inadequate points in these processes, and subsequently the respective improvements, an increase in the QM level, and ultimately a rationalization of the institution's work. Our approach offers a potential reference model for the development of a common QM framework allowing continuous quality control, i.e. adjustment and adaptation to the contemporary educational needs of medical students. Copyright © 2016 by Academy of Sciences and Arts of Bosnia and Herzegovina.
Summary and recommendations. [reduced gravitational effects on materials manufactured in space
NASA Technical Reports Server (NTRS)
1975-01-01
An economic analysis using econometric and cost benefit analysis techniques was performed to determine the feasibility of space processing of certain products. The overall objectives of the analysis were (1) to determine specific products or processes uniquely connected with space manufacturing, (2) to select a specific product or process from each of the areas of semiconductors, metals, and biochemicals, and (3) to determine the overall price/cost structure of each product or process considered. The economic elements of the analysis involved a generalized decision making format for analyzing space manufacturing, a comparative cost study of the selected processes in space vs. earth manufacturing, and a supply and demand study of the economic relationships of one of the manufacturing processes. Space processing concepts were explored. The first involved the use of the shuttle as the factory with all operations performed during individual flights. The second concept involved a permanent unmanned space factory which would be launched separately. The shuttle in this case would be used only for maintenance and refurbishment. Finally, some consideration was given to a permanent manned space factory.
Clinical process analysis and activity-based costing at a heart center.
Ridderstolpe, Lisa; Johansson, Andreas; Skau, Tommy; Rutberg, Hans; Ahlfeldt, Hans
2002-08-01
Cost studies; productivity, efficiency, and quality-of-care measures; and the links between resources and patient outcomes are fundamental issues for hospital management today. This paper describes the implementation of a model for process analysis and activity-based costing (ABC)/management at a heart center in Sweden as a tool for administrative cost information, strategic decision-making, quality improvement, and cost reduction. A commercial software package (QPR) containing two interrelated parts, "ProcessGuide" and "CostControl," was used. All processes at the Heart Center were mapped and graphically outlined. Processes and activities such as health care procedures, research, and education were identified, together with their causal relationships to costs and products/services. The construction of the ABC model in CostControl was time-consuming. However, after the ABC/management system was created, it opened the way for new possibilities, including process and activity analysis, simulation, and price calculations. Cost analysis showed large variations in the costs obtained for individual patients undergoing coronary artery bypass grafting (CABG) surgery. We conclude that a process-based costing system is applicable and has the potential to be useful in hospital management.
Ultrasound-enhanced bioscouring of greige cotton: regression analysis of process factors
USDA-ARS?s Scientific Manuscript database
Process factors of enzyme concentration, time, power and frequency were investigated for ultrasound-enhanced bioscouring of greige cotton. A fractional factorial experimental design and subsequent regression analysis of the process factors were employed to determine the significance of each factor a...
Quality Assessment of College Admissions Processes.
ERIC Educational Resources Information Center
Fisher, Caroline; Weymann, Elizabeth; Todd, Amy
2000-01-01
This study evaluated the admissions process for a Master's in Business Administration Program using such quality improvement techniques as customer surveys, benchmarking, and gap analysis. Analysis revealed that student dissatisfaction with the admissions process may be a factor influencing declining enrollment. Cycle time and number of student…
Li, Liang; Wang, Yiying; Xu, Jiting; Flora, Joseph R V; Hoque, Shamia; Berge, Nicole D
2018-08-01
Hydrothermal carbonization (HTC) is a wet, low-temperature thermal conversion process that continues to gain attention for the generation of hydrochar. The influence of specific process conditions and feedstock properties on hydrochar characteristics is not well understood. To evaluate this, linear and non-linear models were developed to describe hydrochar characteristics based on data collected from the HTC-related literature. A Sobol analysis was subsequently conducted to identify the parameters that most influence hydrochar characteristics. Results from this analysis indicate that, for each investigated hydrochar property, the model fit and predictive capability of the random forest models are superior to both the linear and regression tree models. Based on the Sobol analysis, the feedstock properties and process conditions most influential on hydrochar yield, carbon content, and energy content were identified. In addition, a variational process parameter sensitivity analysis was conducted to determine how feedstock property importance changes with process conditions. Copyright © 2018 Elsevier Ltd. All rights reserved.
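A hedged sketch of the surrogate-plus-Sobol workflow using scikit-learn and SALib; the synthetic training data, variable names and bounds are placeholders, not the literature dataset used in the study:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from SALib.sample import saltelli
from SALib.analyze import sobol

rng = np.random.default_rng(3)
# Toy HTC data: temperature (C), reaction time (h), feed carbon content (%)
X_train = rng.uniform([180, 0.5, 40], [280, 24, 80], size=(300, 3))
y_train = 0.4 * X_train[:, 2] - 0.05 * X_train[:, 0] + rng.normal(0, 1, 300)

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_train, y_train)

problem = {"num_vars": 3,
           "names": ["temperature", "time", "feed_carbon"],
           "bounds": [[180, 280], [0.5, 24], [40, 80]]}
X_s = saltelli.sample(problem, 1024)          # N * (2D + 2) model evaluations
Si = sobol.analyze(problem, rf.predict(X_s))
print(dict(zip(problem["names"], Si["S1"])))  # first-order Sobol indices
```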
DDS-Suite - A Dynamic Data Acquisition, Processing, and Analysis System for Wind Tunnel Testing
NASA Technical Reports Server (NTRS)
Burnside, Jathan J.
2012-01-01
Wind tunnels have optimized their steady-state data systems for acquisition and analysis, and have even implemented large dynamic-data acquisition systems; however, development of near-real-time processing and analysis tools for dynamic data has lagged. DDS-Suite is a set of tools used to acquire, process, and analyze large amounts of dynamic data. Each phase of the testing process (acquisition, processing, and analysis) is handled by a separate component, so that bottlenecks in one phase do not affect the others, leading to a robust system. DDS-Suite is capable of acquiring 672 channels of dynamic data at a rate of 275 MB/s. More than 300 channels of the system use 24-bit analog-to-digital cards and are capable of producing data with less than 0.01 of phase difference at 1 kHz. System architecture, design philosophy, and examples of use during NASA Constellation and Fundamental Aerodynamic tests are discussed.
Efficient Process Migration for Parallel Processing on Non-Dedicated Networks of Workstations
NASA Technical Reports Server (NTRS)
Chanchio, Kasidit; Sun, Xian-He
1996-01-01
This paper presents the design and preliminary implementation of MpPVM, a software system that supports process migration for PVM application programs in a non-dedicated heterogeneous computing environment. New concepts of migration points, migration point analysis, and necessary data analysis are introduced. In MpPVM, process migrations occur only at previously inserted migration points. Migration point analysis determines appropriate locations to insert migration points, whereas necessary data analysis provides a minimum set of variables to be transferred at each migration point. A new methodology to perform reliable point-to-point data communications in a migration environment is also discussed. Finally, a preliminary implementation of MpPVM and its experimental results are presented, showing the correctness and promising performance of our process migration mechanism in a scalable non-dedicated heterogeneous computing environment. While MpPVM is developed on top of PVM, the process migration methodology introduced in this study is general and can be applied to any distributed software environment.
Hanson, A L; Metzger, L E
2010-02-01
The objective of this study was to determine the effect of increased vitamin D fortification (250 IU/serving) of high-temperature, short-time (HTST)-processed 2% fat milk, UHT-processed 2% fat chocolate milk, and low-fat strawberry yogurt on the sensory characteristics and stability of vitamin D during processing and storage. Three replicates of HTST pasteurized 2% fat milk, UHT pasteurized 2% fat chocolate milk, and low-fat strawberry yogurt were manufactured. Each of the 3 replicates for all products contained a control (no vitamin D fortification), a treatment group with 100 IU vitamin D/serving (current level of vitamin D fortification), and a treatment group with 250 IU vitamin D/serving. A cold-water dispersible vitamin D(3) concentrate was used for all fortifications. The HTST-processed 2% fat milk was stored for 21 d, with vitamin D analysis done before processing and on d 0, 14, and 21. Sensory analysis was conducted on d 14. The UHT-processed 2% fat chocolate milk was stored for 60 d, with vitamin D analysis done before processing and on d 0, 40, and 60. Sensory analysis was conducted on d 40. Low-fat strawberry yogurt was stored for 42 d, with vitamin D analysis done before processing, and on d 0, 28, and 42. Sensory analysis was conducted on d 28. Vitamin D levels in the fortified products were found to be similar to the target levels of fortification (100 and 250 IU vitamin D per serving) for all products, indicating no loss of vitamin D during processing. Vitamin D was also found to be stable over the shelf life of each product. Increasing the fortification of vitamin D from 100 to 250 IU/serving did not result in a change in the sensory characteristics of HTST-processed 2% fat milk, UHT-processed 2% fat chocolate milk, or low-fat strawberry yogurt. These results indicate that it is feasible to increase vitamin D fortification from 100 to 250 IU per serving in these products. Copyright 2010 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Vocational Education Operations Analysis Process.
ERIC Educational Resources Information Center
California State Dept. of Education, Sacramento. Vocational Education Services.
This manual on the vocational education operations analysis process is designed to provide vocational administrators/coordinators with an internal device to collect, analyze, and display vocational education performance data. The first section describes the system and includes the following: analysis worksheet, data sources, utilization, system…
Global Persistent Attack: A Systems Architecture, Process Modeling, and Risk Analysis Approach
2008-06-01
... develop an analysis process for quantifying risk associated with the limitations presented by a fiscally constrained environment. In the second step, previous independent analysis of each force structure provided information for quantifying risk associated with the given force presentations ...
Wet weather highway accident analysis and skid resistance data management system (volume I).
DOT National Transportation Integrated Search
1992-06-01
The objectives and scope of this research are to establish an effective methodology for wet weather accident analysis and to develop a database management system to facilitate information processing and storage for the accident analysis process, skid...
Multiobjective Sensitivity Analysis Of Sediment And Nitrogen Processes With A Watershed Model
This paper presents a computational analysis for evaluating critical non-point-source sediment and nutrient (specifically nitrogen) processes and management actions at the watershed scale. In the analysis, model parameters that bear key uncertainties were presumed to reflect the ...
Code of Federal Regulations, 2011 CFR
2011-07-01
... Acquisition Programs and Major Automated Information System Acquisition Programs. To comply with NEPA and ... ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.1 Purpose. (a) This part implements the Air Force Environmental Impact Analysis Process (EIAP) and provides procedures for environmental impact analysis both within the United...
A DMAIC approach for process capability improvement an engine crankshaft manufacturing process
NASA Astrophysics Data System (ADS)
Sharma, G. V. S. S.; Rao, P. Srinivasa
2014-05-01
The define-measure-analyze-improve-control (DMAIC) approach is a five-stage, scientific approach for reducing deviations and improving the capability levels of manufacturing processes. The present work elaborates on the DMAIC approach applied to reducing the process variation of the stub-end-hole boring operation in the manufacture of crankshafts. This statistical process control study starts with selection of the critical-to-quality (CTQ) characteristic in the define stage. The next stage constitutes the collection of dimensional measurement data for the identified CTQ characteristic. This is followed by the analyze and improve stages, where various quality control tools such as the Ishikawa diagram, physical mechanism analysis, failure modes and effects analysis, and analysis of variance are applied. Finally, process monitoring charts are deployed at the workplace for regular monitoring and control of the concerned CTQ characteristic. By adopting the DMAIC approach, the standard deviation was reduced from 0.003 to 0.002. The process potential capability index (Cp) improved from 1.29 to 2.02 and the process performance capability index (Cpk) improved from 0.32 to 1.45.
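A worked check of these capability indices under their usual definitions, Cp = (USL - LSL) / (6 * sigma) and Cpk = min(USL - mu, mu - LSL) / (3 * sigma); the specification limits and sample below are illustrative, not the paper's data:

```python
import numpy as np

usl, lsl = 24.012, 24.000   # hypothetical bore limits, mm
bores = np.random.default_rng(5).normal(24.006, 0.002, 200)

mu, sigma = bores.mean(), bores.std(ddof=1)
cp = (usl - lsl) / (6 * sigma)                 # potential capability
cpk = min(usl - mu, mu - lsl) / (3 * sigma)    # performance capability
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")       # sigma = 0.002 gives Cp ~ 1.0 here
```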
Lu, Lingbo; Li, Jingshan; Gisler, Paula
2011-06-01
Radiology tests, such as MRI, CT-scan, X-ray and ultrasound, are cost intensive and insurance pre-approvals are necessary to get reimbursement. In some cases, tests may be denied for payments by insurance companies due to lack of pre-approvals, inaccurate or missing necessary information. This can lead to substantial revenue losses for the hospital. In this paper, we present a simulation study of a centralized scheduling process for outpatient radiology tests at a large community hospital (Central Baptist Hospital in Lexington, Kentucky). Based on analysis of the central scheduling process, a simulation model of information flow in the process has been developed. Using such a model, the root causes of financial losses associated with errors and omissions in this process were identified and analyzed, and their impacts were quantified. In addition, "what-if" analysis was conducted to identify potential process improvement strategies in the form of recommendations to the hospital leadership. Such a model provides a quantitative tool for continuous improvement and process control in radiology outpatient test scheduling process to reduce financial losses associated with process error. This method of analysis is also applicable to other departments in the hospital.
NASA Astrophysics Data System (ADS)
Citraresmi, A. D. P.; Wahyuni, E. E.
2018-03-01
The aim of this study was to examine the implementation of Hazard Analysis and Critical Control Point (HACCP) for identification and prevention of potential hazards in the production process of dried anchovy at PT. Kelola Mina Laut (KML), Lobuk unit, Sumenep. Cold storage is needed at each anchovy processing step in order to maintain the product's physical and chemical condition. In addition, a quality assurance system should be implemented to maintain product quality. The research was conducted using a survey method, following the whole anchovy production process from receipt of raw materials to packaging of the final product. Descriptive analysis was used for data analysis. Implementation of HACCP at PT. KML, Lobuk unit, Sumenep was conducted by applying Pre-Requisite Programs (PRP) and a preparation stage consisting of the 5 initial steps and 7 principles of HACCP. The results showed that CCPs were found in the boiling step, with the significant hazard of Listeria monocytogenes, and in the final sorting step, with the significant hazard of foreign-material contamination of the product. The control actions taken were maintaining a boiling temperature of 100–105°C for 3–5 minutes and training the sorting employees.
Walking the Fine Line: Political Decision Making with or without Data.
ERIC Educational Resources Information Center
Merkel-Keller, Claudia
The stages of the policy process are examined and explained in terms of a decision-making framework. The policy process comprises four stages: policy analysis, policy formation, policy decision, and political analysis. Political analysis is the performance of the market analysis needed for a decision. The political weight, rather than the…
The COMPTEL Processing and Analysis Software system (COMPASS)
NASA Astrophysics Data System (ADS)
de Vries, C. P.; COMPTEL Collaboration
The data analysis system of the gamma-ray Compton Telescope (COMPTEL) onboard the Compton-GRO spacecraft is described. A continuous stream of data on the order of 1 kbyte per second is generated by the instrument. The data processing and analysis software is built around a relational database management system (RDBMS) in order to trace the heritage and processing status of all data in the processing pipeline. Four institutes cooperate in this effort, requiring procedures to keep local RDBMS contents identical between the sites and swift exchange of data using network facilities. Lately, there has been a gradual move of the system from central processing facilities towards clusters of workstations.
Human Factors Analysis to Improve the Processing of Ares-1 Launch Vehicle
NASA Technical Reports Server (NTRS)
Dippolito, Gregory M.; Stambolian, Damon B.
2011-01-01
The Constellation Program (CxP) is composed of an array of vehicles used to go to the Moon and Mars. The Ares vehicle, one of the components of CxP, goes through several stages of processing before it is launched at the Kennedy Space Center. In order to have efficient and effective ground processing inside and outside the vehicle, all ground processing activities should be analyzed. The analysis for this program was performed by engineers, technicians, and human factors experts with spacecraft processing experience. Data were gathered by observing human activities within physical mockups. The paper focuses on the procedures, analysis and results from these observations.
Welding process modelling and control
NASA Technical Reports Server (NTRS)
Romine, Peter L.; Adenwala, Jinen A.
1993-01-01
The research and analysis performed, the software developed, and the hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system supporting welding process modeling and control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for data acquisition and analysis. Although the WMS supports many of the functions associated with process control, the system is not intended to be used for welding process control.
Simulation and Analysis of One-time Forming Process of Automobile Steering Ball Head
NASA Astrophysics Data System (ADS)
Shi, Peicheng; Zhang, Xujun; Xu, Zengwei; Zhang, Rongyun
2018-03-01
Die forging of the ball pin suffers from problems such as large machining allowance, low production efficiency and material waste. The cold extrusion process of the ball head was therefore studied, and the forming process was simulated using the finite element analysis software DEFORM-3D. Through analysis of the equivalent stress and strain, the velocity vector field and the load-displacement curve, the flow behaviour of the metal during cold extrusion of the ball pin was clarified, and possible forming defects were predicted. The results showed that this process can solve the forming problem of the ball pin and provides a theoretical basis for actual production.
STARS Conceptual Framework for Reuse Processes (CFRP). Volume 2: application Version 1.0
1993-09-30
Domain analysis and design processes covered include: the DISA/CIM process [DIS93]; the SEI Feature-Oriented Domain Analysis (FODA) process [KCH+90]; and the JIAWG Object-Oriented Domain Analysis process. [KCH+90] Feature-Oriented Domain Analysis (FODA) Feasibility Study, Technical Report CMU/SEI-90-TR-21, Software Engineering Institute, Carnegie Mellon University, Pittsburgh. Electronic Systems Center, Air Force Materiel Command, USAF, Hanscom AFB, MA 01731-5000. Prepared by: The Boeing Company, IBM, Unisys Corporation, Defense...
NASA Astrophysics Data System (ADS)
Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa
We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language C. The use of C allows for a natural and concise modeling of the business process and the associated security policy, and for the automatic analysis of the resulting specification by using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.
Huang, Jun; Kaul, Goldi; Cai, Chunsheng; Chatlapalli, Ramarao; Hernandez-Abad, Pedro; Ghosh, Krishnendu; Nagi, Arwinder
2009-12-01
To facilitate an in-depth process understanding, and offer opportunities for developing control strategies to ensure product quality, a combination of experimental design, optimization and multivariate techniques was integrated into the process development of a drug product. A process DOE was used to evaluate effects of the design factors on manufacturability and final product CQAs, and establish design space to ensure desired CQAs. Two types of analyses were performed to extract maximal information, DOE effect & response surface analysis and multivariate analysis (PCA and PLS). The DOE effect analysis was used to evaluate the interactions and effects of three design factors (water amount, wet massing time and lubrication time), on response variables (blend flow, compressibility and tablet dissolution). The design space was established by the combined use of DOE, optimization and multivariate analysis to ensure desired CQAs. Multivariate analysis of all variables from the DOE batches was conducted to study relationships between the variables and to evaluate the impact of material attributes/process parameters on manufacturability and final product CQAs. The integrated multivariate approach exemplifies application of QbD principles and tools to drug product and process development.
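A hedged sketch of one multivariate step of such a workflow: relating coded DOE factors to a response with partial least squares (PLS). The factor names echo the design factors above, but the synthetic full-factorial data and effect sizes are illustrative assumptions:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
levels = np.array([-1.0, 0.0, 1.0])  # coded levels of each design factor
# Full factorial over (water amount, wet massing time, lubrication time)
X = np.array([[w, m, l] for w in levels for m in levels for l in levels])
dissolution = 85 + 4 * X[:, 0] - 2 * X[:, 1] + rng.normal(0, 0.5, len(X))

pls = PLSRegression(n_components=2).fit(X, dissolution)
print("PLS coefficients (water, massing, lubrication):", pls.coef_.ravel())
```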
Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS
NASA Astrophysics Data System (ADS)
Joshi, D. M.; Patel, H. K.
2015-10-01
Cryogenic engineering deals with the development and improvement of low-temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, used for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and can therefore simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values that maximize the liquefaction yield of the plant under constraints on the other parameters. The analysis results give a clear idea of the parameter values to choose before the actual plant is implemented in the field, and indicate the productivity and profitability of the given plant configuration, leading to the design of an efficient, productive plant.
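For intuition, the steady-state energy balance that a simulator solves for the basic Linde-Hampson cycle reduces to a simple liquid-yield expression, y = (h1 - h2) / (h1 - hf); the enthalpy values in this sketch are illustrative placeholders, not HYSYS property data:

```python
# Energy balance around the heat exchanger, JT valve and separator:
# h1 = enthalpy of low-pressure gas at the warm end,
# h2 = enthalpy of the compressed gas entering the cold box,
# hf = enthalpy of the saturated liquid drawn off.
h1 = 462.0  # kJ/kg, assumed
h2 = 432.0  # kJ/kg, assumed
hf = 92.0   # kJ/kg, assumed

y = (h1 - h2) / (h1 - hf)
print(f"Liquid yield per kg compressed: {y:.3f}")  # ~0.081 with these values
```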
M-DAS: System for multispectral data analysis. [in Saginaw Bay, Michigan
NASA Technical Reports Server (NTRS)
Johnson, R. H.
1975-01-01
M-DAS is a ground data processing system designed for analysis of multispectral data. M-DAS operates on multispectral data from LANDSAT, S-192, M2S and other sources in CCT form. Interactive training by operator-investigators using a variable cursor on a color display was used to derive optimum processing coefficients and data on cluster separability. An advanced multivariate normal-maximum likelihood processing algorithm was used to produce output in various formats: color-coded film images, geometrically corrected map overlays, moving displays of scene sections, coverage tabulations and categorized CCTs. The analysis procedure for M-DAS involves three phases: (1) screening and training, (2) analysis of training data to compute performance predictions and processing coefficients, and (3) processing of multichannel input data into categorized results. Typical M-DAS applications involve iteration between each of these phases. A series of photographs of the M-DAS display are used to illustrate M-DAS operation.
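A small sketch of the multivariate normal maximum-likelihood classification that underlies this kind of processing: assign each pixel to the training class whose Gaussian model gives the highest likelihood. The two toy classes and two spectral bands are illustrative, not M-DAS internals:

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(4)
# Synthetic two-band training pixels for two cover classes
water = rng.multivariate_normal([20, 10], [[4, 1], [1, 3]], 300)
crop = rng.multivariate_normal([35, 40], [[6, 2], [2, 5]], 300)

# Fit a Gaussian (mean vector, covariance matrix) per class
models = {name: multivariate_normal(d.mean(axis=0), np.cov(d.T))
          for name, d in {"water": water, "crop": crop}.items()}

pixel = np.array([33.0, 38.0])  # a pixel's two-band signature
label = max(models, key=lambda k: models[k].pdf(pixel))
print(f"Pixel classified as: {label}")
```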
Comprehensive Mass Analysis for Chemical Processes, a Case Study on L-Dopa Manufacture
To evaluate the “greenness” of chemical processes in route selection and process development, we propose a comprehensive mass analysis to inform the stakeholders from different fields. This is carried out by characterizing the mass intensity for each contributing chemical or wast...
The Community Innovation Process: A Conceptualization and Empirical Analysis.
ERIC Educational Resources Information Center
Agnew, John A.; And Others
1978-01-01
Previous research into the community innovation process has tended to emphasize either intercommunity communication or local socioeconomic and political factors. This article incorporates both sets of factors in an analysis of urban renewal, public housing, automated data processing by local municipalities, and public water fluoridation.…
Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.
Metacognition and evidence analysis instruction: an educational framework and practical experience.
Parrott, J Scott; Rubinstein, Matthew L
2015-08-21
The role of metacognitive skills in the evidence analysis process has received little attention in the research literature. While the steps of the evidence analysis process are well defined, the role of higher-level cognitive operations (metacognitive strategies) in integrating the steps of the process is not well understood. In part, this is because it is not clear where and how metacognition is implicated in the evidence analysis process nor how these skills might be taught. The purposes of this paper are to (a) suggest a model for identifying critical thinking and metacognitive skills in evidence analysis instruction grounded in current educational theory and research and (b) demonstrate how freely available systematic review/meta-analysis tools can be used to focus on higher-order metacognitive skills, while providing a framework for addressing common student weaknesses. The final goal of this paper is to provide an instructional framework that can generate critique and elaboration while providing the conceptual basis and rationale for future research agendas on this topic.
A method for identifying EMI critical circuits during development of a large C3
NASA Astrophysics Data System (ADS)
Barr, Douglas H.
The circuit analysis methods and process Boeing Aerospace used on a large, ground-based military command, control, and communications (C3) system are described. This analysis was designed to help identify electromagnetic interference (EMI) critical circuits. The methodology used the MIL-E-6051 equipment criticality categories as the basis for defining critical circuits, relational database technology to help sort through and account for all of the approximately 5000 system signal cables, and Macintosh Plus personal computers to predict critical circuits based on safety margin analysis. The EMI circuit analysis process systematically examined all system circuits to identify which ones were likely to be EMI critical. The process used two separate, sequential safety margin analyses to identify critical circuits (conservative safety margin analysis, and detailed safety margin analysis). These analyses used field-to-wire and wire-to-wire coupling models using both worst-case and detailed circuit parameters (physical and electrical) to predict circuit safety margins. This process identified the predicted critical circuits that could then be verified by test.
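A hedged sketch of the safety-margin screen described: a circuit is flagged EMI-critical when its margin (susceptibility threshold minus predicted induced level, in dB) falls below a required margin. The 6 dB figure reflects common MIL-E-6051-style practice but is an assumption here, and the circuit data are invented:

```python
REQUIRED_MARGIN_DB = 6.0  # assumed requirement; ordnance circuits often need more

circuits = [
    # (name, susceptibility threshold dBuV, worst-case induced level dBuV)
    ("ordnance_fire_line", 60.0, 57.0),
    ("telemetry_bus", 80.0, 50.0),
]

for name, threshold, induced in circuits:
    margin = threshold - induced
    status = "CRITICAL" if margin < REQUIRED_MARGIN_DB else "ok"
    print(f"{name}: margin {margin:+.1f} dB -> {status}")
```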
Operation, Modeling and Analysis of the Reverse Water Gas Shift Process
NASA Technical Reports Server (NTRS)
Whitlow, Jonathan E.
2001-01-01
The Reverse Water Gas Shift process is a candidate technology for water and oxygen production on Mars under the In-Situ Propellant Production project. This report focuses on the operation and analysis of the Reverse Water Gas Shift (RWGS) process, which has been constructed at Kennedy Space Center. A summary of results from the initial operation of the RWGS process, along with an analysis of these results, is included in this report. In addition, an evaluation of a material balance model developed from the work performed previously under the summer program is included, along with recommendations for further experimental work.
NASA Technical Reports Server (NTRS)
Parrish, R. S.; Carter, M. C.
1974-01-01
This analysis utilizes computer simulation and statistical estimation. Realizations of stationary Gaussian stochastic processes with selected autocorrelation functions are computer simulated. Analysis of the simulated data revealed that the mean and the variance of a process were functionally dependent upon the autocorrelation parameter and crossing level. Using predicted values for the mean and standard deviation, the distribution parameters were estimated by the method of moments. Thus, given the autocorrelation parameter, crossing level, mean, and standard deviation of a process, the probability of exceeding the crossing level for a particular length of time was calculated.
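As a rough illustration of the simulation half of this approach, the sketch below uses an AR(1) process as a discrete stand-in for an exponentially autocorrelated stationary Gaussian process (the parameter values are illustrative) and estimates by Monte Carlo the probability that a realization exceeds a crossing level for at least a given duration.

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1_realizations(n_real, n_steps, phi):
    """Stationary AR(1) realizations, x[t] = phi*x[t-1] + e[t], scaled to
    unit marginal variance (a discrete analogue of an exponentially
    autocorrelated Gaussian process)."""
    x = np.empty((n_real, n_steps))
    x[:, 0] = rng.normal(0.0, 1.0, n_real)
    sigma_e = np.sqrt(1.0 - phi**2)
    for t in range(1, n_steps):
        x[:, t] = phi * x[:, t - 1] + sigma_e * rng.normal(0.0, 1.0, n_real)
    return x

def prob_excursion(x, level, min_len):
    """Fraction of realizations containing an excursion above `level`
    lasting at least `min_len` consecutive samples."""
    above = x > level
    hits = 0
    for row in above:
        run = longest = 0
        for a in row:
            run = run + 1 if a else 0
            longest = max(longest, run)
        if longest >= min_len:
            hits += 1
    return hits / len(above)

x = ar1_realizations(n_real=2000, n_steps=500, phi=0.9)
print("P(excursion above 2.0 for >= 10 samples) ~", prob_excursion(x, 2.0, 10))
```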
TU-AB-BRD-03: Fault Tree Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunscombe, P.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) Learn how to perform failure modes and effects analysis for a given process; (3) Learn what fault trees are all about; (4) Learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis and fault tree analysis. Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
TU-AB-BRD-02: Failure Modes and Effects Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huq, M.
2015-06-15
What carries a mediation process? Configural analysis of mediation.
von Eye, Alexander; Mun, Eun Young; Mair, Patrick
2009-09-01
Mediation is a process that links a predictor and a criterion via a mediator variable. Mediation can be full or partial. This well-established definition operates at the level of variables even if they are categorical. In this article, two new approaches to the analysis of mediation are proposed. Both of these approaches focus on the analysis of categorical variables. The first involves mediation analysis at the level of configurations instead of variables. Thus, mediation can be incorporated into the arsenal of methods of analysis for person-oriented research. Second, it is proposed that Configural Frequency Analysis (CFA) can be used for both exploration and confirmation of mediation relationships among categorical variables. The implications of using CFA are first that mediation hypotheses can be tested at the level of individual configurations instead of variables. Second, this approach leaves the door open for different types of mediation processes to exist within the same set. Using a data example, it is illustrated that aggregate-level analysis can overlook mediation processes that operate at the level of individual configurations.
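To make the configural idea concrete, here is a minimal sketch of a CFA-style cell-wise test on a hypothetical 2x2x2 predictor-mediator-criterion table: each configuration's observed count is compared with its expectation under total independence via a binomial test, and cells are labeled types or antitypes. The counts and the base model are assumptions for illustration; a real CFA would also adjust for multiple testing.

```python
import numpy as np
from scipy.stats import binomtest

# Hypothetical 2x2x2 cross-classification (predictor P, mediator M,
# criterion C); counts[p, m, c] are illustrative only.
counts = np.array([[[30,  5], [10, 15]],
                   [[ 8, 12], [ 4, 36]]])
n = counts.sum()

# Expected cell probabilities under the base model of total independence,
# estimated from the one-way margins.
p_marg = counts.sum(axis=(1, 2)) / n
m_marg = counts.sum(axis=(0, 2)) / n
c_marg = counts.sum(axis=(0, 1)) / n

for p in range(2):
    for m in range(2):
        for c in range(2):
            prob = p_marg[p] * m_marg[m] * c_marg[c]
            test = binomtest(int(counts[p, m, c]), int(n), prob)
            label = "type" if counts[p, m, c] > n * prob else "antitype"
            print(f"config (P={p}, M={m}, C={c}): obs={counts[p, m, c]:3d} "
                  f"exp={n * prob:6.1f} p={test.pvalue:.4f} ({label})")
```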
Metabolic profiling of body fluids and multivariate data analysis.
Trezzi, Jean-Pierre; Jäger, Christian; Galozzi, Sara; Barkovits, Katalin; Marcus, Katrin; Mollenhauer, Brit; Hiller, Karsten
2017-01-01
Metabolome analyses of body fluids are challenging due to pre-analytical variations, such as pre-processing delay and temperature, and constant dynamic changes of biochemical processes within the samples. Therefore, proper sample handling, starting from the time of collection up to the analysis, is crucial to obtain high-quality samples and reproducible results. A metabolomics analysis is divided into 4 main steps: 1) Sample collection, 2) Metabolite extraction, 3) Data acquisition and 4) Data analysis. Here, we describe a protocol for gas chromatography coupled to mass spectrometry (GC-MS) based metabolic analysis for biological matrices, especially body fluids. This protocol can be applied to blood serum/plasma, saliva and cerebrospinal fluid (CSF) samples of humans and other vertebrates. It covers sample collection, sample pre-processing, metabolite extraction, GC-MS measurement and guidelines for the subsequent data analysis. Advantages of this protocol include: robust and reproducible metabolomics results, taking into account pre-analytical variations that may occur during the sampling process; small sample volume required; rapid and cost-effective processing of biological samples; and logistic regression based determination of biomarker signatures for in-depth data analysis.
Comprehensive NMR analysis of compositional changes of black garlic during thermal processing.
Liang, Tingfu; Wei, Feifei; Lu, Yi; Kodani, Yoshinori; Nakada, Mitsuhiko; Miyakawa, Takuya; Tanokura, Masaru
2015-01-21
Black garlic is a processed food product obtained by subjecting whole raw garlic to thermal processing that causes chemical reactions, such as the Maillard reaction, which change the composition of the garlic. In this paper, we report a nuclear magnetic resonance (NMR)-based comprehensive analysis of raw garlic and black garlic extracts to determine the compositional changes resulting from thermal processing. (1)H NMR spectra with a detailed signal assignment showed that 38 components were altered by thermal processing of raw garlic. For example, the contents of 11 l-amino acids increased during the first step of thermal processing over 5 days and then decreased. Multivariate data analysis revealed changes in the contents of fructose, glucose, acetic acid, formic acid, pyroglutamic acid, cycloalliin, and 5-(hydroxymethyl)furfural (5-HMF). Our results provide comprehensive information on changes in NMR-detectable components during thermal processing of whole garlic.
NASA Technical Reports Server (NTRS)
Goldman, H.; Wolf, M.
1979-01-01
Analyses of slicing processes and junction formation processes are presented. A simple method for evaluating the relative economic merits of competing process options with respect to the cost of energy produced by the system is described. An energy consumption analysis was developed and applied to determine the energy consumption in the solar module fabrication process sequence, from the mining of the SiO2 to shipping. The analysis shows that current technology practice involves inordinate energy use in the purification step and large wastage of the invested energy through losses, particularly poor conversion in slicing, as well as inadequate yields throughout. Cell process energy expenditures already show a downward trend based on increased throughput rates. Larger improvements, however, depend on the introduction of a more efficient purification process and of acceptable ribbon growing techniques.
Inverse Thermal Analysis of Titanium GTA Welds Using Multiple Constraints
NASA Astrophysics Data System (ADS)
Lambrakos, S. G.; Shabaev, A.; Huang, L.
2015-06-01
Inverse thermal analysis of titanium gas-tungsten-arc welds using multiple constraint conditions is presented. This analysis employs a methodology based on numerical-analytical basis functions for the inverse thermal analysis of steady-state energy deposition in plate structures. The results of this type of analysis provide parametric representations of weld temperature histories that can be adopted as input data to various types of computational procedures, such as those for prediction of solid-state phase transformations. In addition, these temperature histories can be used to construct parametric function representations for inverse thermal analysis of welds corresponding to other process parameters or welding processes whose process conditions are within similar regimes. The present study applies an inverse thermal analysis procedure that provides for the inclusion of constraint conditions associated with both solidification and phase transformation boundaries.
The Market Responses to the Government Regulation of Chlorinated Solvents: A Policy Analysis
1988-10-01
in the process of statistical estimation of model parameters. The results of the estimation process applied to chlorinated solvent markets show the... Marginal feedstock cost series estimates for process share of total production... policy context for this research. Section III provides analysis necessary to understand the chemicals involved, their production processes and costs, and
29 CFR 1910.119 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2011 CFR
2011-07-01
... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...
29 CFR 1910.119 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2010 CFR
2010-07-01
... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...
Optimization, an Important Stage of Engineering Design
ERIC Educational Resources Information Center
Kelley, Todd R.
2010-01-01
A number of leaders in technology education have indicated that a major difference between the technological design process and the engineering design process is analysis and optimization. The analysis stage of the engineering design process is when mathematical models and scientific principles are employed to help the designer predict design…
The Impact of Meaning and Dimensionality on Copying Accuracy in Individuals with Autism
ERIC Educational Resources Information Center
Sheppard, Elizabeth; Ropar, Danielle; Mitchell, Peter
2007-01-01
Weak Central Coherence (Frith, 1989) predicts that, in autism, perceptual processing is relatively unaffected by conceptual analysis. Enhanced Perceptual Functioning (Mottron & Burack, 2001) predicts that the perceptual processing of those with autism is less influenced by conceptual analysis only when higher-level processing is detrimental to…
Encapsulation Processing and Manufacturing Yield Analysis
NASA Technical Reports Server (NTRS)
Willis, P. B.
1984-01-01
The development of encapsulation processing and a manufacturing productivity analysis for photovoltaic cells are discussed. The goals were: (1) to understand the relationships between both formulation variables and process variables; (2) to define conditions required for optimum performance; (3) to predict manufacturing yield; and (4) to provide documentation to industry.
Vygotsky's Analysis of Children's Meaning Making Processes
ERIC Educational Resources Information Center
Mahn, Holbrook
2012-01-01
Vygotsky's work is extensive and covers many aspects of the development of children's meaning-making processes in social and cultural contexts. However, his main focus is on the examination of the unification of speaking and thinking processes. His investigation centers on the analysis of the entity created by this unification--an internal…
Model prototype utilization in the analysis of fault tolerant control and data processing systems
NASA Astrophysics Data System (ADS)
Kovalev, I. V.; Tsarev, R. Yu; Gruzenkin, D. V.; Prokopenko, A. V.; Knyazkov, A. N.; Laptenok, V. D.
2016-04-01
A procedure for assessing the profit from implementing a control and data processing system is presented in the paper. The rationale for creating and analyzing a model prototype follows from an approach that provides fault tolerance through the inclusion of structural and software redundancy. The developed procedure allows finding the best ratio between the cost of developing and analyzing the model prototype and the earnings from its utilization and the information produced. The suggested approach is illustrated by a model example of profit assessment and analysis of a control and data processing system.
Language-Based Curriculum Analysis: A Collaborative Assessment and Intervention Process.
ERIC Educational Resources Information Center
Prelock, Patricia A.
1997-01-01
Presents a systematic process for completing a language-based curriculum analysis to address curriculum expectations that may challenge students with communication impairments. Analysis of vocabulary and the demands for comprehension, oral, and written expression within specific content areas provides a framework for collaboration between teachers…
Theories of State Analyzing the Policy Process,
1973-11-01
values and goals, which is the heart of the rational process, in reality cannot be separated from the actor's empirical analysis of the situation... rigorous and objective in analysis. How different would our foreign policy actually be? Would it necessarily be better? In fact, would one even need... State, but the fact is that much of the outside research and analysis of the policy process is pointed at the... As Robert Rothstein says in his valuable
Space processing applications payload equipment study. Volume 2A: Experiment requirements
NASA Technical Reports Server (NTRS)
Smith, A. G.; Anderson, W. T., Jr.
1974-01-01
An analysis of the space processing applications payload equipment was conducted. The primary objective was to perform a review and an update of the space processing activity research equipment requirements and specifications that were derived in the first study. The analysis is based on the six major experimental classes of: (1) biological applications, (2) chemical processes in fluids, (3) crystal growth, (4) glass technology, (5) metallurgical processes, and (6) physical processes in fluids. Tables of data are prepared to show the functional requirements for the areas of investigation.
Risk analysis within environmental impact assessment of proposed construction activity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeleňáková, Martina; Zvijáková, Lenka
Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of environmental impacts of selected constructions – flood protection structures – using risk analysis methods. The application of methodology designed for the process of environmental impact assessment will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process. Through the use of risk analysis methods in the environmental impact assessment process, the set objective has been achieved. Highlights: This paper is informed by an effort to develop research with the aim of: • improving existing qualitative and quantitative methods for assessing the impacts; • a better understanding of relations between probabilities and consequences; • a methodology for the EIA of flood protection constructions based on risk analysis; • creative approaches in the search for environmentally friendly proposed activities.
Optimisation of warpage on plastic injection moulding part using response surface methodology (RSM)
NASA Astrophysics Data System (ADS)
Miza, A. T. N. A.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Rashidi, M. M.
2017-09-01
Warpage often occurs during the injection moulding of thin-shell parts, depending on the process conditions. A statistical design-of-experiments method integrating finite element (FE) analysis, Moldflow analysis and response surface methodology (RSM) was investigated as a way to minimize the warpage values in x, y and z on thin-shell plastic parts. A battery cover of a remote controller is one such thin-shell plastic part produced by injection moulding. The optimum process conditions were determined so as to minimize warpage. Packing pressure, cooling time, melt temperature and mould temperature were the four parameters considered in this study. A two-level full factorial experimental design was conducted in Design Expert for the RSM analysis to combine all these parameters. Analysis of variance (ANOVA) of the FE results identified the process parameters that most influenced warpage. Using RSM, a predictive response surface model for the warpage data is presented.
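The effect-estimation step of such a two-level full factorial study can be sketched in a few lines; the 16 warpage values below are illustrative placeholders, not the study's data, and the coded design assumes the four parameters named above.

```python
import itertools
import numpy as np

# Coded two-level full factorial for the four moulding parameters:
# packing pressure (A), cooling time (B), melt temp (C), mould temp (D).
levels = [-1, 1]
runs = np.array(list(itertools.product(levels, repeat=4)), dtype=float)

# Hypothetical measured warpage (mm) for the 16 runs; illustrative only.
warpage = np.array([0.42, 0.38, 0.45, 0.40, 0.36, 0.31, 0.39, 0.33,
                    0.48, 0.43, 0.52, 0.46, 0.41, 0.35, 0.44, 0.37])

# Model matrix: intercept, 4 main effects, 6 two-factor interactions.
cols = [np.ones(len(runs))] + [runs[:, i] for i in range(4)]
names = ["1", "A", "B", "C", "D"]
for i, j in itertools.combinations(range(4), 2):
    cols.append(runs[:, i] * runs[:, j])
    names.append(names[1 + i] + names[1 + j])
X = np.column_stack(cols)

# Least-squares effect estimates, ranked by magnitude.
beta, *_ = np.linalg.lstsq(X, warpage, rcond=None)
for name, b in sorted(zip(names, beta), key=lambda t: -abs(t[1])):
    print(f"{name:>3}: {b:+.4f}")
```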
Zhang, Hanyuan; Tian, Xuemin; Deng, Xiaogang; Cao, Yuping
2018-05-16
As an attractive nonlinear dynamic data analysis tool, global preserving kernel slow feature analysis (GKSFA) has achieved great success in extracting the high nonlinearity and inherently time-varying dynamics of batch process. However, GKSFA is an unsupervised feature extraction method and lacks the ability to utilize batch process class label information, which may not offer the most effective means for dealing with batch process monitoring. To overcome this problem, we propose a novel batch process monitoring method based on the modified GKSFA, referred to as discriminant global preserving kernel slow feature analysis (DGKSFA), by closely integrating discriminant analysis and GKSFA. The proposed DGKSFA method can extract discriminant feature of batch process as well as preserve global and local geometrical structure information of observed data. For the purpose of fault detection, a monitoring statistic is constructed based on the distance between the optimal kernel feature vectors of test data and normal data. To tackle the challenging issue of nonlinear fault variable identification, a new nonlinear contribution plot method is also developed to help identifying the fault variable after a fault is detected, which is derived from the idea of variable pseudo-sample trajectory projection in DGKSFA nonlinear biplot. Simulation results conducted on a numerical nonlinear dynamic system and the benchmark fed-batch penicillin fermentation process demonstrate that the proposed process monitoring and fault diagnosis approach can effectively detect fault and distinguish fault variables from normal variables. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
Khaligh-Razavi, Seyed-Mahdi; Cichy, Radoslaw Martin; Pantazis, Dimitrios; Oliva, Aude
2018-06-07
Animacy and real-world size are properties that describe any object and thus bring basic order into our perception of the visual world. Here, we investigated how the human brain processes real-world size and animacy. For this, we applied representational similarity to fMRI and MEG data to yield a view of brain activity with high spatial and temporal resolutions, respectively. Analysis of fMRI data revealed that a distributed and partly overlapping set of cortical regions extending from occipital to ventral and medial temporal cortex represented animacy and real-world size. Within this set, parahippocampal cortex stood out as the region representing animacy and size stronger than most other regions. Further analysis of the detailed representational format revealed differences among regions involved in processing animacy. Analysis of MEG data revealed overlapping temporal dynamics of animacy and real-world size processing starting at around 150 msec and provided the first neuromagnetic signature of real-world object size processing. Finally, to investigate the neural dynamics of size and animacy processing simultaneously in space and time, we combined MEG and fMRI with a novel extension of MEG-fMRI fusion by representational similarity. This analysis revealed partly overlapping and distributed spatiotemporal dynamics, with parahippocampal cortex singled out as a region that represented size and animacy persistently when other regions did not. Furthermore, the analysis highlighted the role of early visual cortex in representing real-world size. A control analysis revealed that the neural dynamics of processing animacy and size were distinct from the neural dynamics of processing low-level visual features. Together, our results provide a detailed spatiotemporal view of animacy and size processing in the human brain.
Rouhani, M H; Salehi-Abargouei, A; Surkan, P J; Azadbakht, L
2014-09-01
A body of literature exists regarding the association of red and processed meats with obesity; however, the nature and extent of this relation has not been clearly established. The aim of this study is to conduct a systematic review and meta-analysis of the relationship between red and processed meat intake and obesity. We searched multiple electronic databases for observational studies on the relationship between red and processed meat intake and obesity published until July 2013. Odds ratios (ORs) and means for obesity-related indices and for variables that may contribute to heterogeneity were calculated. A systematic review and a meta-analysis were conducted with 21 and 18 studies, respectively (n = 1,135,661). The meta-analysis (n = 113,477) showed that consumption of higher quantities of red and processed meats was a risk factor for obesity (OR: 1.37; 95% CI: 1.14-1.64). Pooled mean body mass index (BMI) and waist circumference (WC) trends showed that in comparison to those in the lowest ntile, subjects in the highest ntile of red and processed meat consumption had higher BMI (mean difference: 1.37; 95% CI: 0.90-1.84 for red meat; mean difference: 1.32; 95% CI: 0.64-2.00 for processed meat) and WC (mean difference: 2.79; 95% CI: 1.86-3.70 for red meat; mean difference: 2.77; 95% CI: 1.87-2.66 for processed meat). The current analysis revealed that red and processed meat intake is directly associated with risk of obesity, and higher BMI and WC. However, the heterogeneity among studies is significant. These findings suggest a decrease in red and processed meat intake. © 2014 The Authors. obesity reviews © 2014 World Obesity.
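The pooling arithmetic behind such a meta-analysis can be sketched in a few lines. The study odds ratios and confidence intervals below are illustrative placeholders, not the review's data; the code recovers log-OR standard errors from the CIs and computes a DerSimonian-Laird random-effects pooled estimate.

```python
import numpy as np

# Hypothetical per-study odds ratios with 95% CIs (illustrative only).
studies = [(1.20, 0.95, 1.52), (1.45, 1.10, 1.91), (1.30, 1.02, 1.66),
           (1.60, 1.15, 2.22), (1.10, 0.85, 1.42)]

log_or = np.array([np.log(o) for o, lo, hi in studies])
# SE recovered from the CI width on the log scale: (ln hi - ln lo) / (2 * 1.96)
se = np.array([(np.log(hi) - np.log(lo)) / (2 * 1.96) for o, lo, hi in studies])

w = 1 / se**2                          # fixed-effect (inverse-variance) weights
fixed = np.sum(w * log_or) / np.sum(w)

# DerSimonian-Laird between-study variance tau^2
q = np.sum(w * (log_or - fixed) ** 2)
df = len(studies) - 1
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)

w_re = 1 / (se**2 + tau2)              # random-effects weights
re = np.sum(w_re * log_or) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))

print(f"pooled OR (random effects): {np.exp(re):.2f} "
      f"(95% CI {np.exp(re - 1.96 * se_re):.2f}-{np.exp(re + 1.96 * se_re):.2f})")
```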
Statistical quality control through overall vibration analysis
NASA Astrophysics Data System (ADS)
Carnero, M. a. Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos
2010-05-01
The present study introduces the concept of statistical quality control in automotive wheel bearing manufacturing processes. Defects in the products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, an evaluation of the quality results of the finished parts under different combinations of process variables is assessed. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follow a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as the chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley and Kruskal-Wallis tests. The analysis presented is the starting point for extending the use of predictive techniques (vibration analysis) to quality control. This paper demonstrates the existence of predictive variables (high-frequency vibration displacements) that are sensitive to the process setup and the quality of the products obtained. Based on the results of this overall vibration analysis, a second paper will analyse self-induced vibration spectra in order to define limit vibration bands, controllable every cycle or connected to permanent vibration-monitoring systems able to adjust sensitive process variables identified by ANOVA once the vibration readings exceed established quality limits.
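A compact version of that testing sequence: check normality (Shapiro-Wilk) and homoscedasticity (Bartlett) first, then run one-way ANOVA, falling back to Kruskal-Wallis when the assumptions fail. The vibration readings are simulated placeholders, not the paper's measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical overall-vibration readings (mm/s RMS) for three grinding
# wheel-speed settings; illustrative data only.
groups = {
    "low":    rng.normal(2.0, 0.3, 20),
    "medium": rng.normal(2.4, 0.3, 20),
    "high":   rng.normal(3.1, 0.3, 20),
}
samples = list(groups.values())

# Check ANOVA assumptions: per-group normality and equal variances.
normal = all(stats.shapiro(g).pvalue > 0.05 for g in samples)
equal_var = stats.bartlett(*samples).pvalue > 0.05

if normal and equal_var:
    stat, p = stats.f_oneway(*samples)
    print(f"one-way ANOVA: F={stat:.2f}, p={p:.4g}")
else:
    # Fall back to a distribution-free test when assumptions fail.
    stat, p = stats.kruskal(*samples)
    print(f"Kruskal-Wallis: H={stat:.2f}, p={p:.4g}")
```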
Gourley, Paul L.; Gourley, Mark F.
1997-01-01
An apparatus and method for microscopic and spectroscopic analysis and processing of biological cells. The apparatus comprises a laser having an analysis region within the laser cavity for containing one or more biological cells to be analyzed. The presence of a cell within the analysis region in superposition with an activated portion of a gain medium of the laser acts to encode information about the cell upon the laser beam, the cell information being recoverable by an analysis means that preferably includes an array photodetector such as a CCD camera and a spectrometer. The apparatus and method may be used to analyze biomedical cells including blood cells and the like, and may include processing means for manipulating, sorting, or eradicating cells after analysis thereof.
Child versus adult psychoanalysis: two processes or one?
Sugarman, Alan
2009-12-01
Child analysis continues to be seen as a different technique from adult analysis because children are still involved in a developmental process and because the primary objects continue to play active roles in their lives. This paper argues that this is a false dichotomy. An extended vignette of the analysis of a latency-aged girl is used to demonstrate that the psychoanalytic process that develops in child analysis is structurally the same as that in adult analysis. Both revolve around the analysis of resistance and transference and use both to promote knowledge of the patient's mind at work. And both techniques formulate interventions based on the analyst's appraisal of the patient's mental organization. It is hoped that stressing the essential commonality of both techniques will promote the development of an overarching theory of psychoanalytic technique.
An application of computer aided requirements analysis to a real time deep space system
NASA Technical Reports Server (NTRS)
Farny, A. M.; Morris, R. V.; Hartsough, C.; Callender, E. D.; Teichroew, D.; Chikofsky, E.
1981-01-01
The entire procedure of incorporating the requirements and goals of a space flight project into integrated, time-ordered sequences of spacecraft commands is called the uplink process. The Uplink Process Control Task (UPCT) was created to examine the uplink process and determine ways to improve it. The Problem Statement Language/Problem Statement Analyzer (PSL/PSA), designed to assist the designer/analyst/engineer in the preparation of specifications of an information system, is used as a supporting tool to aid in the analysis. Attention is given to a definition of the uplink process, the definition of PSL/PSA, the construction of a PSA database, the value of analysis to the study of the uplink process, and the PSL/PSA lessons learned.
Expert system for web based collaborative CAE
NASA Astrophysics Data System (ADS)
Hou, Liang; Lin, Zusheng
2006-11-01
An expert system for web based collaborative CAE was developed based on knowledge engineering, a relational database and commercial FEA (finite element analysis) software. The architecture of the system is illustrated. In this system, the experts' experiences, theories, typical examples and other related knowledge, which are used in the pre-processing stage of FEA, were categorized into analysis process knowledge and object knowledge. Then, the integrated knowledge model, based on object-oriented and rule-based methods, was described. The integrated reasoning process based on CBR (case-based reasoning) and rule-based reasoning was presented. Finally, the analysis process of this expert system in a web based CAE application was illustrated, and an analysis example of a machine tool's column was presented to prove the validity of the system.
Choosing order of operations to accelerate strip structure analysis in parameter range
NASA Astrophysics Data System (ADS)
Kuksenko, S. P.; Akhunov, R. R.; Gazizov, T. R.
2018-05-01
The paper considers the use of iterative methods in solving the sequence of linear algebraic systems obtained in quasistatic analysis of strip structures with the method of moments. Through the analysis of 4 strip structures, the authors show that additional acceleration (up to 2.21 times) of the iterative process can be obtained when solving linear systems repeatedly, by choosing a proper order of operations and a preconditioner. The obtained results can be used to accelerate the computer-aided design of various strip structures. The choice of the order of operations to accelerate the process is quite simple and universal, and could be used not only for strip structure analysis but also for a wide range of computational problems.
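One common way to realize such savings is sketched below with SciPy on a stand-in sparse system (not the authors' moment-method matrices): an ILU preconditioner is factored once and reused across the whole sequence of solves, and GMRES iterations are counted with and without it.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

rng = np.random.default_rng(2)
n = 500

# A stand-in for the moment-method matrix: sparse and diagonally dominant.
A = sp.random(n, n, density=0.01, random_state=2, format="csc")
A = A + sp.eye(n, format="csc") * 5.0

# Build the ILU preconditioner once; reuse it for the whole parameter sweep.
ilu = spla.spilu(A)
M = spla.LinearOperator((n, n), matvec=ilu.solve)

iters = {"with precond": 0, "without": 0}

def counter(key):
    def cb(_):
        iters[key] += 1  # one call per GMRES iteration
    return cb

# Sequence of right-hand sides, as in repeated quasistatic analysis over a
# parameter range (here simply random excitations).
for _ in range(4):
    b = rng.normal(size=n)
    x1, info1 = spla.gmres(A, b, M=M, callback=counter("with precond"))
    x2, info2 = spla.gmres(A, b, callback=counter("without"))
    assert info1 == 0 and info2 == 0  # both solves converged

print("total GMRES iterations:", iters)
```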
Second-Order Analysis of Semiparametric Recurrent Event Processes
Guan, Yongtao
2011-01-01
A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a follow-up period. Such data have become increasingly available in medical and epidemiological studies. In this paper, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on Meningococcal disease cases in Merseyside, UK to illustrate their practical value. PMID:21361885
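A very reduced form of such a second-order check is the index-of-dispersion test sketched below: bin each process's event times and compare the count variance with the count mean, which are equal for a Poisson process. The simulated datasets are illustrative, not the chronic granulomatous disease or Meningococcal data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def dispersion_test(event_times, followup, n_bins=10):
    """Index-of-dispersion check of the Poisson hypothesis: bin the event
    times, then compare (k-1)*var/mean of the counts to chi-square(k-1)."""
    counts, _ = np.histogram(event_times, bins=n_bins, range=(0.0, followup))
    k = n_bins
    d = (k - 1) * counts.var(ddof=1) / counts.mean()
    p = 2 * min(stats.chi2.cdf(d, k - 1), stats.chi2.sf(d, k - 1))  # two-sided
    return d, p

# Illustrative processes: one homogeneous Poisson, one clustered.
poisson_times = np.cumsum(rng.exponential(1.0, 100))
clustered = np.sort(np.concatenate(
    [c + rng.exponential(0.05, 5) for c in rng.uniform(0, 100, 20)]))

for name, t in [("Poisson", poisson_times[poisson_times < 100]),
                ("clustered", clustered[clustered < 100])]:
    d, p = dispersion_test(t, followup=100.0)
    print(f"{name}: dispersion={d:.1f}, p={p:.4f}")
```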
Quantifiable and objective approach to organizational performance enhancement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholand, Andrew Joseph; Tausczik, Yla R.
This report describes a new methodology, social language network analysis (SLNA), that combines tools from social language processing and network analysis to identify socially situated relationships between individuals which, though subtle, are highly influential. Specifically, SLNA aims to identify and characterize the nature of working relationships by processing artifacts generated with computer-mediated communication systems, such as instant message texts or emails. Because social language processing is able to identify psychological, social, and emotional processes that individuals are not able to fully mask, social language network analysis can clarify and highlight complex interdependencies between group members, even when these relationships are latent or unrecognized. This report outlines the philosophical antecedents of SLNA, the mechanics of preprocessing, processing, and post-processing stages, and some example results obtained by applying this approach to a 15-month corporate discussion archive.
Dai, Heng; Ye, Ming; Walker, Anthony P.; ...
2017-03-28
A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
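The flavor of the index can be sketched under heavy simplification: two toy recharge models and two toy conductivity models (pure stand-ins, not the paper's models) are combined with assumed prior model weights, and the recharge-process index is taken as the weighted variance of the conditional mean output across the recharge-model choice, normalized by total variance. The full index in the paper also folds in parametric uncertainty within each process model.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 20000

# Two competing recharge models and two geology (conductivity) models.
# The response functions are purely illustrative stand-ins.
def recharge_a(p): return 0.20 * p
def recharge_b(p): return 0.15 * p + 10.0
def geology_a(r):  return 2.0 * r + rng.normal(0, 1, r.shape)
def geology_b(r):  return 1.5 * r + 25.0 + rng.normal(0, 1, r.shape)

precip = rng.normal(800.0, 100.0, N)               # shared random input
models_R = [(recharge_a, 0.6), (recharge_b, 0.4)]  # assumed model weights
models_G = [(geology_a, 0.5), (geology_b, 0.5)]

# Output samples for every process-model combination, weighted by the
# model probabilities (a crude Monte Carlo version of model averaging).
outputs, weights = [], []
for fR, wR in models_R:
    for fG, wG in models_G:
        outputs.append(fG(fR(precip)))
        weights.append(wR * wG)
outputs, weights = np.array(outputs), np.array(weights)

total_mean = np.sum(weights[:, None] * outputs) / N
total_var = np.sum(weights[:, None] * (outputs - total_mean) ** 2) / N

# Recharge-process sensitivity: variance of the conditional mean taken over
# the recharge-model axis (geology models averaged out).
cond_means = []
for i, (fR, wR) in enumerate(models_R):
    rows, w = outputs[2 * i: 2 * i + 2], weights[2 * i: 2 * i + 2]
    cond_means.append(np.sum(w[:, None] * rows) / (w.sum() * N))
cond_means = np.array(cond_means)
wR = np.array([w for _, w in models_R])
ps_recharge = np.sum(wR * (cond_means - total_mean) ** 2) / total_var
print(f"process sensitivity index (recharge): {ps_recharge:.3f}")
```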
A State Space Modeling Approach to Mediation Analysis
ERIC Educational Resources Information Center
Gu, Fei; Preacher, Kristopher J.; Ferrer, Emilio
2014-01-01
Mediation is a causal process that evolves over time. Thus, a study of mediation requires data collected throughout the process. However, most applications of mediation analysis use cross-sectional rather than longitudinal data. Another implicit assumption commonly made in longitudinal designs for mediation analysis is that the same mediation…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-18
... technical analysis submitted for parallel-processing by DNREC on December 9, 2010, to address significant... technical analysis submitted by DNREC for parallel-processing on December 9, 2010, to satisfy the... consists of a technical analysis that provides detailed support for Delaware's position that it has...
Image Analysis, Microscopic, and Spectrochemical Study of the PVC Dry Blending Process,
The dry blending process used in the production of electrical grade PVC formulations has been studied using a combination of image analysis, microscopic...by image analysis techniques. Optical and scanning electron microscopy were used to assess morphological differences. Spectrochemical techniques were used to indicate chemical changes.
Processing Cones: A Computational Structure for Image Analysis.
1981-12-01
image analysis applications, referred to as a processing cone, is described and sample algorithms are presented. A fundamental characteristic of the structure is its hierarchical organization into two-dimensional arrays of decreasing resolution. In this architecture, a prototypical function is defined on a local window of data and applied uniformly to all windows in a parallel manner. Three basic modes of processing are supported in the cone: reduction operations (upward processing), horizontal operations (processing at a single level) and projection operations (downward
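The reduction (upward) mode is easy to sketch: each level averages 2x2 windows of the level below, producing arrays of decreasing resolution. A minimal NumPy version, assuming mean-reduction (the cone supports arbitrary window functions):

```python
import numpy as np

def reduce_level(img):
    """One upward (reduction) step of the cone: average each 2x2 window,
    halving the resolution."""
    h, w = img.shape
    return img[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def build_cone(img, levels):
    """Hierarchy of arrays of decreasing resolution (the 'cone')."""
    cone = [img]
    for _ in range(levels):
        cone.append(reduce_level(cone[-1]))
    return cone

rng = np.random.default_rng(5)
image = rng.random((64, 64))
for lvl, arr in enumerate(build_cone(image, 3)):
    print(f"level {lvl}: {arr.shape}")
```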
Lab-on-a-chip based total-phosphorus analysis device utilizing a photocatalytic reaction
NASA Astrophysics Data System (ADS)
Jung, Dong Geon; Jung, Daewoong; Kong, Seong Ho
2018-02-01
A lab-on-a-chip (LOC) device for total phosphorus (TP) analysis was fabricated for water quality monitoring. Many commercially available TP analysis systems used to estimate water quality have good sensitivity and accuracy. However, these systems also have many disadvantages such as bulky size, complex pretreatment processes, and high cost, which limit their application. In particular, conventional TP analysis systems require an indispensable pretreatment step, in which the fluidic analyte is heated to 120 °C for 30 min to release the dissolved phosphate, because many phosphates are soluble in water at a standard temperature and pressure. In addition, this pretreatment process requires elevated pressures of up to 1.1 kg cm-2 in order to prevent the evaporation of the heated analyte. Because of these limiting conditions required by the pretreatment processes used in conventional systems, it is difficult to miniaturize TP analysis systems. In this study, we employed a photocatalytic reaction in the pretreatment process. The reaction was carried out by illuminating a photocatalytic titanium dioxide (TiO2) surface formed in a microfluidic channel with ultraviolet (UV) light. This pretreatment process does not require elevated temperatures and pressures. By applying this simplified, photocatalytic-reaction-based pretreatment process to a TP analysis system, greater degrees of freedom are conferred to the design and fabrication of LOC devices for TP monitoring. The fabricated LOC device presented in this paper was characterized by measuring the TP concentration of an unknown sample, and comparing the results with those measured by a conventional TP analysis system. The TP concentrations of the unknown sample measured by the proposed LOC device and the conventional TP analysis system were 0.018 mgP/25 mL and 0.019 mgP/25 mL, respectively. The experimental results revealed that the proposed LOC device had a performance comparable to the conventional bulky TP analysis system. Therefore, our device could be directly employed in water quality monitoring as an alternative to conventional TP analysis systems.
Anima: Modular Workflow System for Comprehensive Image Data Analysis
Rantanen, Ville; Valori, Miko; Hautaniemi, Sampsa
2014-01-01
Modern microscopes produce vast amounts of image data, and computational methods are needed to analyze and interpret these data. Furthermore, a single image analysis project may require tens or hundreds of analysis steps, starting from data import and pre-processing to segmentation and statistical analysis, and ending with visualization and reporting. To manage such large-scale image data analysis projects, we present here a modular workflow system called Anima. Anima is designed for comprehensive and efficient image data analysis development, and it contains several features that are crucial in high-throughput image data analysis: programming language independence, batch processing, easily customized data processing, interoperability with other software via application programming interfaces, and advanced multivariate statistical analysis. The utility of Anima is shown with two case studies focusing on testing different algorithms developed in different imaging platforms and an automated prediction of alive/dead C. elegans worms by integrating several analysis environments. Anima is fully open source and available with documentation at www.anduril.org/anima. PMID:25126541
Spectroscopic analysis technique for arc-welding process control
NASA Astrophysics Data System (ADS)
Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel
2005-09-01
The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. Particularly, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding, using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
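A standard way to estimate the electronic (excitation) temperature from several emission peaks of one species is a Boltzmann plot, sketched below with invented line data (the wavelengths, intensities, statistical weights, transition probabilities and energies are illustrative, not real spectroscopic constants): ln(I*lambda/(g*A)) is regressed against the upper-level energy, and the temperature follows from the slope.

```python
import numpy as np

K_B_EV = 8.617333e-5  # Boltzmann constant, eV/K

# Hypothetical emission lines of one species: wavelength (nm), measured
# intensity (a.u.), statistical weight g, transition probability A (1/s),
# upper-level energy E_k (eV). Values are illustrative only.
lines = np.array([
    # lambda,  I,    g,  A,      E_k
    [420.0,  980.0,  5, 1.0e7, 14.50],
    [430.0,  290.0,  3, 0.8e7, 14.90],
    [440.0,  208.0,  5, 0.6e7, 15.35],
    [450.0,  254.0,  7, 0.9e7, 15.80],
])

lam, inten, g, a_ul, e_k = lines.T
y = np.log(inten * lam / (g * a_ul))   # Boltzmann-plot ordinate

# Linear fit: y = -E_k / (kT) + const  ->  T = -1 / (k * slope)
slope, intercept = np.polyfit(e_k, y, 1)
T = -1.0 / (K_B_EV * slope)
print(f"estimated electronic (excitation) temperature: {T:.0f} K")
```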
Emotion processing in the visual brain: a MEG analysis.
Peyk, Peter; Schupp, Harald T; Elbert, Thomas; Junghöfer, Markus
2008-06-01
Recent functional magnetic resonance imaging (fMRI) and event-related brain potential (ERP) studies provide empirical support for the notion that emotional cues guide selective attention. Extending this line of research, whole head magneto-encephalogram (MEG) was measured while participants viewed in separate experimental blocks a continuous stream of either pleasant and neutral or unpleasant and neutral pictures, presented for 330 ms each. Event-related magnetic fields (ERF) were analyzed after intersubject sensor coregistration, complemented by minimum norm estimates (MNE) to explore neural generator sources. Both streams of analysis converge by demonstrating the selective emotion processing in an early (120-170 ms) and a late time interval (220-310 ms). ERF analysis revealed that the polarity of the emotion difference fields was reversed across early and late intervals suggesting distinct patterns of activation in the visual processing stream. Source analysis revealed the amplified processing of emotional pictures in visual processing areas with more pronounced occipito-parieto-temporal activation in the early time interval, and a stronger engagement of more anterior, temporal, regions in the later interval. Confirming previous ERP studies showing facilitated emotion processing, the present data suggest that MEG provides a complementary look at the spread of activation in the visual processing stream.
Deng, Bo; Shi, Yaoyao; Yu, Tao; Kang, Chao; Zhao, Pan
2018-01-31
The composite tape winding process, which utilizes a tape winding machine and prepreg tapes, provides a promising way to improve the quality of composite products. Nevertheless, the process parameters of composite tape winding have crucial effects on the tensile strength and void content, which are closely related to the performances of the winding products. In this article, two different object values of winding products, including mechanical performance (tensile strength) and a physical property (void content), were respectively calculated. Thereafter, the paper presents an integrated methodology by combining multi-parameter relative sensitivity analysis and single-parameter sensitivity analysis to obtain the optimal intervals of the composite tape winding process. First, the global multi-parameter sensitivity analysis method was applied to investigate the sensitivity of each parameter in the tape winding processing. Then, the local single-parameter sensitivity analysis method was employed to calculate the sensitivity of a single parameter within the corresponding range. Finally, the stability and instability ranges of each parameter were distinguished. Meanwhile, the authors optimized the process parameter ranges and provided comprehensive optimized intervals of the winding parameters. The verification test validated that the optimized intervals of the process parameters were reliable and stable for winding products manufacturing.
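In the same spirit, a local single-parameter sensitivity scan can be sketched with finite differences, here on an invented strength response (the function, parameter names and ranges are placeholders, not the paper's winding model), marking the parameter interval where the response is insensitive, i.e. stable.

```python
import numpy as np

# Illustrative stand-in response: tensile strength as a function of winding
# tension (N) and roller pressure (MPa); not the paper's model.
def strength(tension, pressure):
    return 1500 + 40 * np.log(tension) - 0.8 * (pressure - 0.30) ** 2 * 1e4

def local_sensitivity(f, grid, other, h=1e-3):
    """Single-parameter sensitivity over a range: central finite differences
    of f along `grid` with the other parameter held fixed."""
    return (f(grid + h, other) - f(grid - h, other)) / (2 * h)

tension = np.linspace(20.0, 80.0, 61)
s = local_sensitivity(strength, tension, other=0.35)

# Call the parameter "stable" where the response is flat (small |dS/dT|).
stable = np.abs(s) < 1.0
print("stable tension interval:",
      (tension[stable].min(), tension[stable].max()) if stable.any() else "none")
```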
40 CFR 68.67 - Process hazard analysis.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... instrumentation with alarms, and detection hardware such as hydrocarbon sensors.); (4) Consequences of failure of...
40 CFR 68.67 - Process hazard analysis.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... instrumentation with alarms, and detection hardware such as hydrocarbon sensors.); (4) Consequences of failure of...
Janknegt, Robert; Scott, Mike; Mairs, Jill; Timoney, Mark; McElnay, James; Brenninkmeijer, Rob
2007-10-01
Drug selection should be a rational process that embraces the principles of evidence-based medicine. However, many factors may affect the choice of agent. It is against this background that the System of Objectified Judgement Analysis (SOJA) process for rational drug-selection was developed. This article describes how the information on which the SOJA process is based, was researched and processed.
NASA Astrophysics Data System (ADS)
Mehrpooya, Mehdi; Ansarinasab, Hojat; Moftakhari Sharifzadeh, Mohammad Mehdi; Rosen, Marc A.
2017-10-01
An integrated power plant with a net electrical power output of 3.71 × 10^5 kW is developed and investigated. The electrical efficiency of the process is found to be 60.1%. The process includes three main sub-systems: a molten carbonate fuel cell system, a heat recovery section and a cryogenic carbon dioxide capturing process. Conventional and advanced exergoeconomic methods are used for analyzing the process. Advanced exergoeconomic analysis is a comprehensive evaluation tool which combines an exergetic approach with economic analysis procedures. With this method, investment and exergy destruction costs of the process components are divided into endogenous/exogenous and avoidable/unavoidable parts. Results of the conventional exergoeconomic analyses demonstrate that the combustion chamber has the largest exergy destruction rate (182 MW) and cost rate (13,100 $/h). Also, the total process cost rate can be decreased by reducing the cost rate of the fuel cell and improving the efficiency of the combustion chamber and heat recovery steam generator. Based on the total avoidable endogenous cost rate, the priority for modification is the heat recovery steam generator, a compressor and a turbine of the power plant, in rank order. A sensitivity analysis is performed to investigate how the exergoeconomic factors respond to variations in the effective parameters.
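The component-level bookkeeping of such an analysis reduces to a few ratios. The sketch below uses placeholder cost rates (only the combustor's 13,100 $/h destruction cost rate echoes the text) to compute each component's exergoeconomic factor and to rank improvement priority by an assumed avoidable-endogenous share.

```python
# Illustrative component cost rates: Z = investment cost rate ($/h),
# CD = exergy-destruction cost rate ($/h), plus an assumed
# avoidable-endogenous share of the total. Placeholder values only.
components = {
    #              Z      CD       avoidable-endogenous share
    "HRSG":       (900.0, 2200.0,  0.35),
    "compressor": (700.0, 1500.0,  0.30),
    "turbine":    (800.0, 1300.0,  0.28),
    "combustor":  (600.0, 13100.0, 0.10),
}

rows = []
for name, (z, cd, av_en_share) in components.items():
    f = z / (z + cd)                           # exergoeconomic factor
    avoidable_endogenous = (z + cd) * av_en_share
    rows.append((avoidable_endogenous, name, f))

# Rank improvement priority by avoidable endogenous cost rate, as in the
# advanced exergoeconomic approach.
for cost, name, f in sorted(rows, reverse=True):
    print(f"{name:>10}: f={f:.2f}, avoidable endogenous cost ~ {cost:7.0f} $/h")
```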
Arvanitoyannis, Ioannis S; Varzakas, Theodoros H
2009-08-01
Failure Mode and Effect Analysis (FMEA) has been applied for the risk assessment of snail manufacturing. A tentative approach of FMEA application to the snail industry was attempted in conjunction with ISO 22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (snail processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes, upon which the system depends. Critical Control Points have been identified and implemented in the cause and effect diagram (also known as Ishikawa, tree diagram, and fishbone diagram). In this work a comparison of the ISO 22000 analysis with HACCP is carried out over snail processing and packaging. However, the main emphasis was put on the quantification of risk assessment by determining the RPN per identified processing hazard. Sterilization of tins, bioaccumulation of heavy metals, packaging of shells and poisonous mushrooms were the processes identified as the ones with the highest RPN (280, 240, 147, 144, respectively) and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out, leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of Ishikawa (cause and effect or tree diagram) led to converging results, thus corroborating the validity of conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO 22000 system of a snail processing industry is considered imperative.
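The RPN arithmetic is simply severity x occurrence x detection. The sketch below uses one factorization of S, O and D scores consistent with the RPNs reported above (the paper's actual scores may differ) and flags modes exceeding the 130 limit.

```python
# Minimal RPN bookkeeping for an FMEA like the one described above; S, O, D
# are illustrative 1-10 scores chosen so the products match the reported RPNs.
RPN_LIMIT = 130  # upper acceptable limit used in the study

failure_modes = [
    # (process step, severity, occurrence, detection)
    ("tin sterilization",           8, 7, 5),   # RPN 280
    ("heavy-metal bioaccumulation", 8, 6, 5),   # RPN 240
    ("shell packaging",             7, 7, 3),   # RPN 147
    ("poisonous mushrooms",         8, 6, 3),   # RPN 144
]

for step, s, o, d in failure_modes:
    rpn = s * o * d
    action = "corrective action required" if rpn > RPN_LIMIT else "acceptable"
    print(f"{step:<28} RPN={rpn:3d} -> {action}")
```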
Practical, transparent prospective risk analysis for the clinical laboratory.
Janssens, Pim Mw
2014-11-01
Prospective risk analysis (PRA) is an essential element in quality assurance for clinical laboratories. Practical approaches to conducting PRA in laboratories, however, are scarce. On the basis of the classical Failure Mode and Effect Analysis method, an approach to PRA was developed for application to key laboratory processes. First, the separate, major steps of the process under investigation are identified. Scores are then given for the Probability (P) and Consequence (C) of predefined types of failures and the chances of Detecting (D) these failures. Based on the P and C scores (on a 10-point scale), an overall Risk score (R) is calculated. The scores for each process were recorded in a matrix table. Based on predetermined criteria for R and D, it was determined whether a more detailed analysis was required for potential failures and, ultimately, where risk-reducing measures were necessary, if any. As an illustration, this paper presents the results of the application of PRA to our pre-analytical and analytical activities. The highest R scores were obtained in the stat processes, the most common failure type in the collective process steps was 'delayed processing or analysis', the failure type with the highest mean R score was 'inappropriate analysis', and the failure type most frequently rated as suboptimal was 'identification error'. The PRA approach designed here is a useful semi-objective tool for identifying process steps with potential failures rated as risky. Its systematic design and convenient output in matrix tables make it easy to perform, practical and transparent. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
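A small sketch of the matrix-table scoring described here; the combination rule R = P × C and both thresholds are assumptions for illustration, since the abstract does not give the exact formula:

```python
# Sketch of PRA scoring, assuming the overall risk score R is the
# product of probability (P) and consequence (C), each on a 10-point
# scale; the detection score (D) is kept alongside for the
# predetermined escalation criteria. Thresholds are hypothetical.

R_THRESHOLD = 25  # hypothetical criterion for detailed analysis
D_THRESHOLD = 7   # hypothetical: failures that are hard to detect

process_steps = [
    # (step, failure type, P, C, D)
    ("stat sample reception", "delayed processing or analysis", 6, 5, 4),
    ("order entry",           "identification error",           3, 9, 8),
    ("analysis",              "inappropriate analysis",         2, 9, 6),
]

for step, failure, p, c, d in process_steps:
    r = p * c
    flag = "detailed analysis required" if r >= R_THRESHOLD or d >= D_THRESHOLD else "ok"
    print(f"{step} / {failure}: R = {r}, D = {d} -> {flag}")
```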
Planning applications in image analysis
NASA Technical Reports Server (NTRS)
Boddy, Mark; White, Jim; Goldman, Robert; Short, Nick, Jr.
1994-01-01
We describe two interim results from an ongoing effort to automate the acquisition, analysis, archiving, and distribution of satellite earth science data. Both results are applications of Artificial Intelligence planning research to the automatic generation of processing steps for image analysis tasks. First, we have constructed a linear conditional planner (CPed), used to generate conditional processing plans. Second, we have extended an existing hierarchical planning system to make use of durations, resources, and deadlines, thus supporting the automatic generation of processing steps in time and resource-constrained environments.
1992-12-21
[Garbled reference-list and table-of-contents fragments from a report on testing process models using trace-based protocol analysis (TBPA); recoverable section headings include "Trace based protocol analysis," "Tools related to process model testing," and "Requirements for testing process models using trace based protocol analysis."]
Parallel Algorithms for Image Analysis.
1982-06-01
[Garbled report documentation page; recoverable details: technical report TR-1180, "Parallel Algorithms for Image Analysis," author Azriel Rosenfeld, grant AFOSR-77-3271; keywords: image processing, image analysis, parallel processing, cellular computers.]
USDA-ARS?s Scientific Manuscript database
Using five centimeter resolution images acquired with an unmanned aircraft system (UAS), we developed and evaluated an image processing workflow that included the integration of resolution-appropriate field sampling, feature selection, object-based image analysis, and processing approaches for UAS i...
The Strategic WAste Minimization Initiative (SWAMI) Software, Version 2.0 is a tool for using process analysis for identifying waste minimization opportunities within an industrial setting. The software requires user-supplied information for process definition, as well as materia...
Knowledge Reasoning with Semantic Data for Real-Time Data Processing in Smart Factory
Wang, Shiyong; Li, Di; Liu, Chengliang
2018-01-01
The application of high-bandwidth networks and cloud computing in manufacturing systems will generate massive volumes of data. Industrial data analysis plays important roles in condition monitoring, performance optimization, flexibility, and transparency of the manufacturing system. However, the currently existing architectures are mainly for offline data analysis, not suitable for real-time data processing. In this paper, we first define the smart factory as a cloud-assisted and self-organized manufacturing system in which physical entities such as machines, conveyors, and products organize production through intelligent negotiation and the cloud supervises this self-organized process for fault detection and troubleshooting based on data analysis. Then, we propose a scheme to integrate knowledge reasoning and semantic data in which the reasoning engine processes the ontology model with real-time semantic data coming from the production process. Based on these ideas, we build a benchmarking system for a smart candy packing application that supports direct consumer customization and flexible hybrid production, and the data are collected and processed in real time for fault diagnosis and statistical analysis. PMID:29415444
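The proposed scheme, an ontology model queried against real-time semantic data, can be illustrated with a small sketch using the rdflib library; the namespace, classes, sensor values, and fault rule below are hypothetical, not the authors' benchmark system:

```python
# Illustrative sketch (not the authors' system) of feeding real-time
# semantic data into an ontology model and querying it with rdflib.

from rdflib import Graph, Literal, Namespace, RDF
from rdflib.namespace import XSD

FACTORY = Namespace("http://example.org/smart-factory#")

g = Graph()
# Static ontology facts: machine M1 is a packing machine fed by conveyor C1.
g.add((FACTORY.M1, RDF.type, FACTORY.PackingMachine))
g.add((FACTORY.M1, FACTORY.feedsFrom, FACTORY.C1))

# Real-time semantic data arriving from the production process.
g.add((FACTORY.M1, FACTORY.hasTemperature, Literal(92.5, datatype=XSD.double)))

# A simple "reasoning" rule expressed as a SPARQL query: flag machines
# whose temperature exceeds a fault threshold.
query = """
PREFIX f: <http://example.org/smart-factory#>
SELECT ?machine ?temp WHERE {
    ?machine a f:PackingMachine ;
             f:hasTemperature ?temp .
    FILTER (?temp > 90.0)
}
"""
for machine, temp in g.query(query):
    print(f"fault candidate: {machine} at {temp} degrees")
```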
NASA Technical Reports Server (NTRS)
Goldman, H.; Wolf, M.
1978-01-01
Several experimental and projected Czochralski crystal growing process methods were studied and compared to available operations and cost data of recent production Cz-pulling, in order to elucidate the role of the dominant cost-contributing factors. From this analysis, it becomes apparent that the specific add-on costs of the Cz-process can be expected to be reduced by about a factor of three by 1982, and about a factor of five by 1986. A format to guide the accumulation of the data needed for thorough techno-economic analysis of solar cell production processes was developed.
NASA Technical Reports Server (NTRS)
Junkin, B. G. (Principal Investigator)
1979-01-01
A method is presented for the processing and analysis of digital topography data that can subsequently be entered in an interactive data base in the form of slope, slope length, elevation, and aspect angle. A discussion of the data source and specific descriptions of the data processing software programs are included. In addition, the mathematical considerations involved in the registration of raw digitized coordinate points to the UTM coordinate system are presented. Scale factor considerations are also included. Results of the processing and analysis are illustrated using the Shiprock and Gallup Quadrangle test data.
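The slope and aspect-angle derivation described here can be sketched with finite differences on an elevation grid; the array, cell size, and aspect convention below are illustrative assumptions:

```python
# Minimal sketch of deriving slope and aspect-angle grids from digital
# elevation data. The elevation array and cell size are synthetic.

import numpy as np

cell_size = 30.0  # grid spacing in metres (assumed)
elevation = np.array([[100., 101., 103.],
                      [ 99., 100., 102.],
                      [ 97.,  98., 100.]])

# Finite-difference gradients along rows (y) and columns (x).
dz_dy, dz_dx = np.gradient(elevation, cell_size)

slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
# One common aspect convention: degrees clockwise from north.
aspect_deg = np.degrees(np.arctan2(-dz_dx, dz_dy)) % 360.0

print("slope (degrees):\n", slope_deg)
print("aspect (degrees from north):\n", aspect_deg)
```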
Data processing for a cosmic ray experiment onboard the solar probes Helios 1 and 2: Experiment 6
NASA Technical Reports Server (NTRS)
Mueller-Mellin, R.; Green, G.; Iwers, B.; Kunow, H.; Wibberenz, G.; Fuckner, J.; Hempe, H.; Witte, M.
1982-01-01
The data processing system for the Helios experiment 6, measuring energetic charged particles of solar, planetary and galactic origin in the inner solar system, is described. The aim of this experiment is to extend knowledge on origin and propagation of cosmic rays. The different programs for data reduction, analysis, presentation, and scientific evaluation are described as well as hardware and software of the data processing equipment. A chronological presentation of the data processing operation is given. Procedures and methods for data analysis which were developed can be used with minor modifications for analysis of other space research experiments.
Standardization of pitch-range settings in voice acoustic analysis.
Vogel, Adam P; Maruff, Paul; Snyder, Peter J; Mundt, James C
2009-05-01
Voice acoustic analysis is typically a labor-intensive, time-consuming process that requires the application of idiosyncratic parameters tailored to individual aspects of the speech signal. Such processes limit the efficiency and utility of voice analysis in clinical practice as well as in applied research and development. In the present study, we analyzed 1,120 voice files, using standard techniques (case-by-case hand analysis), taking roughly 10 work weeks of personnel time to complete. The results were compared with the analytic output of several automated analysis scripts that made use of preset pitch-range parameters. After pitch windows were selected to appropriately account for sex differences, the automated analysis scripts reduced processing time of the 1,120 speech samples to less than 2.5 h and produced results comparable to those obtained with hand analysis. However, caution should be exercised when applying the suggested preset values to pathological voice populations.
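A sketch in the spirit of those automated scripts, using the parselmouth wrapper around Praat with sex-specific preset pitch windows; the window values and file name are illustrative, not the study's parameters:

```python
# Sketch of automated pitch extraction with preset pitch ranges chosen
# by sex. The original study used its own scripts; this uses the
# parselmouth Praat wrapper, and the Hz windows are common conventions.

import parselmouth

PITCH_WINDOWS = {"male": (75.0, 300.0), "female": (100.0, 500.0)}  # Hz

def mean_f0(wav_path: str, sex: str) -> float:
    floor, ceiling = PITCH_WINDOWS[sex]
    sound = parselmouth.Sound(wav_path)
    pitch = sound.to_pitch(pitch_floor=floor, pitch_ceiling=ceiling)
    f0 = pitch.selected_array["frequency"]
    voiced = f0[f0 > 0]  # unvoiced frames are reported as 0 Hz
    return float(voiced.mean())

print(mean_f0("sample.wav", "female"))  # hypothetical file
```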
Generalized Majority Logic Criterion to Analyze the Statistical Strength of S-Boxes
NASA Astrophysics Data System (ADS)
Hussain, Iqtadar; Shah, Tariq; Gondal, Muhammad Asif; Mahmood, Hasan
2012-05-01
The majority logic criterion is applicable in the evaluation process of substitution boxes used in the advanced encryption standard (AES). The performance of modified or advanced substitution boxes is predicted by processing the results of statistical analysis by the majority logic criteria. In this paper, we use the majority logic criteria to analyze some popular and prevailing substitution boxes used in encryption processes. In particular, the majority logic criterion is applied to AES, affine power affine (APA), Gray, Lui J, residue prime, S8 AES, Skipjack, and Xyi substitution boxes. The majority logic criterion is further extended into a generalized majority logic criterion which has a broader spectrum of analyzing the effectiveness of substitution boxes in image encryption applications. The integral components of the statistical analyses used for the generalized majority logic criterion are derived from results of entropy analysis, contrast analysis, correlation analysis, homogeneity analysis, energy analysis, and mean of absolute deviation (MAD) analysis.
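Most of the statistical components listed can be computed directly from an encrypted test image. A hedged sketch using scikit-image GLCM properties plus entropy and MAD (the random image stands in for cipher output; this illustrates the component analyses, not the criterion's exact voting rule):

```python
# Sketch of the statistical analyses feeding the generalized majority
# logic criterion: entropy, GLCM contrast/correlation/homogeneity/
# energy, and mean of absolute deviation (MAD).

import numpy as np
from skimage.feature import graycomatrix, graycoprops
from skimage.measure import shannon_entropy

rng = np.random.default_rng(0)
cipher_img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

glcm = graycomatrix(cipher_img, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)

results = {prop: float(graycoprops(glcm, prop)[0, 0])
           for prop in ("contrast", "correlation", "homogeneity", "energy")}
results["entropy"] = shannon_entropy(cipher_img)
results["MAD"] = float(np.mean(np.abs(cipher_img - cipher_img.mean())))

for name, value in results.items():
    print(f"{name}: {value:.4f}")
```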
Wang, Shengnan; Hua, Yujiao; Zou, Lisi; Liu, Xunhong; Yan, Ying; Zhao, Hui; Luo, Yiyuan; Liu, Juanxiu
2018-02-01
Scrophulariae Radix is one of the most popular traditional Chinese medicines (TCMs). Primary processing of Scrophulariae Radix is an important step that is closely related to the quality of products of this TCM. The aim of this study is to explore the influence of different processing methods on the chemical constituents of Scrophulariae Radix. The differences in chemical constituents of Scrophulariae Radix processed by different methods were analyzed using ultra-fast liquid chromatography-triple quadrupole-time of flight mass spectrometry coupled with principal component analysis and orthogonal partial least squares discriminant analysis. Furthermore, the contents of 12 index differential constituents in Scrophulariae Radix processed by different methods were simultaneously determined using ultra-fast liquid chromatography coupled with triple quadrupole-linear ion trap mass spectrometry. Gray relational analysis was performed to evaluate the different processed samples according to the contents of the 12 constituents. All of the results demonstrated that the quality of Scrophulariae Radix processed by the "sweating" method was better. This study provides basic information for revealing the pattern of change of chemical constituents in Scrophulariae Radix processed by different methods and facilitates selection of the suitable processing method for this TCM. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Horsch, Salome; Kopczynski, Dominik; Kuthe, Elias; Baumbach, Jörg Ingo; Rahmann, Sven
2017-01-01
Motivation: Disease classification from molecular measurements typically requires an analysis pipeline from raw noisy measurements to final classification results. Multi-capillary column ion mobility spectrometry (MCC-IMS) is a promising technology for the detection of volatile organic compounds in the air of exhaled breath. From raw measurements, the peak regions representing the compounds have to be identified, quantified, and clustered across different experiments. Currently, several steps of this analysis process require manual intervention of human experts. Our goal is to identify a fully automatic pipeline that yields competitive disease classification results compared to an established but subjective and tedious semi-manual process. Method: We combine a large number of modern methods for peak detection, peak clustering, and multivariate classification into analysis pipelines for raw MCC-IMS data. We evaluate all combinations on three different real datasets in an unbiased cross-validation setting. We determine which specific algorithmic combinations lead to high AUC values in disease classifications across the different medical application scenarios. Results: The best fully automated analysis process achieves even better classification results than the established manual process. The best algorithms for the three analysis steps are (i) SGLTR (Savitzky-Golay Laplace-operator filter thresholding regions) and LM (Local Maxima) for automated peak identification, (ii) EM clustering (Expectation Maximization) and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) for the clustering step and (iii) RF (Random Forest) for multivariate classification. Thus, automated methods can replace the manual steps in the analysis process to enable an unbiased high-throughput use of the technology. PMID:28910313
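The clustering and classification stages of the winning pipeline can be sketched with scikit-learn; the peak coordinates, intensity matrix, and labels below are synthetic stand-ins for MCC-IMS data:

```python
# Sketch of two pipeline stages named above: DBSCAN for peak clustering
# across experiments and a random forest for multivariate
# classification. All data are synthetic placeholders.

import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Peak clustering: (retention time, drift time) pairs scattered around
# two true peak locations.
c1 = rng.normal([5.0, 1.2], 0.05, size=(20, 2))
c2 = rng.normal([9.0, 2.3], 0.05, size=(20, 2))
peaks = np.vstack([c1, c2])
cluster_ids = DBSCAN(eps=0.3, min_samples=3).fit_predict(peaks)
print("peak clusters found:", len(set(cluster_ids) - {-1}))

# Multivariate classification: per-sample intensities of consensus peaks.
X = rng.random((30, 8))          # 30 samples, 8 peak-cluster features
y = rng.integers(0, 2, size=30)  # disease vs. control labels
rf = RandomForestClassifier(n_estimators=200, random_state=0)
auc = cross_val_score(rf, X, y, cv=5, scoring="roc_auc")
print("cross-validated AUC:", round(float(auc.mean()), 3))
```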
NASA Astrophysics Data System (ADS)
Mariajayaprakash, Arokiasamy; Senthilvelan, Thiyagarajan; Vivekananthan, Krishnapillai Ponnambal
2013-07-01
The various process parameters affecting the quality characteristics of the shock absorber during production were identified using the Ishikawa diagram and by failure mode and effect analysis. The identified process parameters are welding process parameters (squeeze, heat control, wheel speed, and air pressure), damper sealing process parameters (load, hydraulic pressure, air pressure, and fixture height), washing process parameters (total alkalinity, temperature, pH value of rinsing water, and timing), and painting process parameters (flowability, coating thickness, pointage, and temperature). In this paper, the painting and washing process parameters are optimized by the Taguchi method. Though the defects are reasonably minimized by the Taguchi method, in order to achieve zero defects during the processes, a genetic algorithm technique is applied to the optimized parameters obtained by the Taguchi method.
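The Taguchi step rests on signal-to-noise ratios; for defect counts, the smaller-the-better form applies. A minimal sketch with invented trial data:

```python
# Sketch of the Taguchi signal-to-noise (S/N) computation, using the
# smaller-the-better form appropriate for defect counts. The trial
# data and factor levels are invented for illustration.

import numpy as np

def sn_smaller_is_better(responses):
    y = np.asarray(responses, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Replicated defect counts for three hypothetical levels of one
# painting-process factor (e.g., coating thickness level).
trials = {"level 1": [4, 5, 6], "level 2": [2, 3, 2], "level 3": [7, 6, 8]}

sn = {level: sn_smaller_is_better(y) for level, y in trials.items()}
best = max(sn, key=sn.get)  # Taguchi: pick the level with the highest S/N
for level, value in sn.items():
    print(f"{level}: S/N = {value:.2f} dB")
print("preferred level:", best)
```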
NASA Technical Reports Server (NTRS)
Johnston, John D.; Parrish, Keith; Howard, Joseph M.; Mosier, Gary E.; McGinnis, Mark; Bluth, Marcel; Kim, Kevin; Ha, Hong Q.
2004-01-01
This is a continuation of a series of papers on modeling activities for JWST. The structural-thermal-optical, often referred to as "STOP", analysis process is used to predict the effect of thermal distortion on optical performance. The benchmark STOP analysis for JWST assesses the effect of an observatory slew on wavefront error. The paper begins with an overview of multi-disciplinary engineering analysis, or integrated modeling, which is a critical element of the JWST mission. The STOP analysis process is then described. This process consists of the following steps: thermal analysis, structural analysis, and optical analysis. Temperatures predicted using geometric and thermal math models are mapped to the structural finite element model in order to predict thermally-induced deformations. Motions and deformations at optical surfaces are input to optical models, and optical performance is predicted using either an optical ray trace or WFE estimation techniques based on prior ray traces or first-order optics. Following the discussion of the analysis process, results are presented based on models representing the design at the time of the System Requirements Review. In addition to baseline performance predictions, sensitivity studies are performed to assess modeling uncertainties. Of particular interest is the sensitivity of optical performance to uncertainties in temperature predictions and variations in metal properties. The paper concludes with a discussion of modeling uncertainty as it pertains to STOP analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palta, J.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, that they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are all about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Disclosures: Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
NASA Technical Reports Server (NTRS)
Shortle, John F.; Allocco, Michael
2005-01-01
This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.
Digital processing of mesoscale analysis and space sensor data
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.
1985-01-01
The mesoscale analysis and space sensor (MASS) data management and analysis system, implemented on the research computer system, is presented. The research computer system provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite-derived meteorological data. It consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The software, data base management, and display capabilities of the research computer system are described in terms of providing a very effective interactive research tool for the digital processing of mesoscale analysis and space sensor data.
In-Situ Molecular Vapor Composition Measurements During Lyophilization.
Liechty, Evan T; Strongrich, Andrew D; Moussa, Ehab M; Topp, Elizabeth; Alexeenko, Alina A
2018-04-11
Monitoring process conditions during lyophilization is essential to ensuring product quality for lyophilized pharmaceutical products. Residual gas analysis has been applied previously in lyophilization applications for leak detection, determination of endpoint in primary and secondary drying, monitoring sterilization processes, and measuring complex solvents. The purpose of this study is to investigate the temporal evolution of the process gas for various formulations during lyophilization to better understand the relative extraction rates of various molecular compounds over the course of primary drying. In this study, residual gas analysis is used to monitor molecular composition of gases in the product chamber during lyophilization of aqueous formulations typical for pharmaceuticals. Residual gas analysis is also used in the determination of the primary drying endpoint and compared to the results obtained using the comparative pressure measurement technique. The dynamics of solvent vapors, those species dissolved therein, and the ballast gas (the gas supplied to maintain a set-point pressure in the product chamber) are observed throughout the course of lyophilization. In addition to water vapor and nitrogen, the two most abundant gases for all considered aqueous formulations are oxygen and carbon dioxide. In particular, it is observed that the relative concentrations of carbon dioxide and oxygen vary depending on the formulation, an observation which stems from the varying solubility of these species. This result has implications on product shelf life and stability during the lyophilization process. Chamber process gas composition during lyophilization is quantified for several representative formulations using residual gas analysis. The advantages of the technique lie in its ability to measure the relative concentration of various species during the lyophilization process. This feature gives residual gas analysis utility in a host of applications from endpoint determination to quality assurance. In contrast to other methods, residual gas analysis is able to determine oxygen and water vapor content in the process gas. These compounds have been shown to directly influence product shelf life. With these results, residual gas analysis technique presents a potential new method for real-time lyophilization process control and improved understanding of formulation and processing effects for lyophilized pharmaceutical products.
Modeling and Analysis of Power Processing Systems (MAPPS). Volume 1: Technical report
NASA Technical Reports Server (NTRS)
Lee, F. C.; Rahman, S.; Carter, R. A.; Wu, C. H.; Yu, Y.; Chang, R.
1980-01-01
Computer aided design and analysis techniques were applied to power processing equipment. Topics covered include: (1) discrete time domain analysis of switching regulators for performance analysis; (2) design optimization of power converters using augmented Lagrangian penalty function technique; (3) investigation of current-injected multiloop controlled switching regulators; and (4) application of optimization for Navy VSTOL energy power system. The generation of the mathematical models and the development and application of computer aided design techniques to solve the different mathematical models are discussed. Recommendations are made for future work that would enhance the application of the computer aided design techniques for power processing systems.
Sun, Meng; Yan, Donghui; Yang, Xiaolu; Xue, Xingyang; Zhou, Sujuan; Liang, Shengwang; Wang, Shumei; Meng, Jiang
2017-05-01
Raw Arecae Semen, the seed of Areca catechu L., as well as Arecae Semen Tostum and Arecae semen carbonisata are traditionally processed by stir-baking for subsequent use in a variety of clinical applications. These three Arecae semen types, important Chinese herbal drugs, have been used in China and other Asian countries for thousands of years. In this study, the sensory technologies of a colorimeter and sensitive validated high-performance liquid chromatography with diode array detection were employed to discriminate raw Arecae semen and its processed drugs. The color parameters of the samples were determined by a colorimeter instrument CR-410. Moreover, the fingerprints of the four alkaloids of arecaidine, guvacine, arecoline and guvacoline were surveyed by high-performance liquid chromatography. Subsequently, Student's t test, the analysis of variance, fingerprint similarity analysis, hierarchical cluster analysis, principal component analysis, factor analysis and Pearson's correlation test were performed for final data analysis. The results obtained demonstrated a significant color change characteristic for components in raw Arecae semen and its processed drugs. Crude and processed Arecae semen could be determined based on colorimetry and high-performance liquid chromatography with a diode array detector coupled with chemometrics methods for a comprehensive quality evaluation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Glenn, Sigrid S.
1985-01-01
Behavior analysis and institutional economics are viewed as having common origins in the early 20th century effort to benefit from the conceptual revolution spurred by Darwin's synthesis. Institutional economics, initiated by Thorstein Veblen, appears to have failed to develop a progressive scientific technology, while behavior analysis has done so. It is suggested that institutional economics has been held back by lack of a synthesizing scientific mechanism that elucidates the relation between technological and ceremonial processes, the two cultural forces described by Veblen. The theory of institutional economist C. E. Ayres, built on Veblen's distinction, is used to clarify the concepts of technological and ceremonial processes for the reader. An analysis of the behavioral processes that might underlie the cultural processes described by Veblen/Ayres suggests that the experimental analysis of behavior has provided concepts that might function as a synthesizing mechanism for the social sciences and, in particular, institutional economics. The Veblen/Ayres dichotomy, now seen in terms of underlying behavioral processes, is used to examine the field of behavior analysis in terms of its origins, its relation to psychology and its current state. The paper concludes with a few practical suggestions as to how behavior analysts might work to enhance survival. PMID:22478617
Increasing Transparency Through a Multiverse Analysis.
Steegen, Sara; Tuerlinckx, Francis; Gelman, Andrew; Vanpaemel, Wolf
2016-09-01
Empirical research inevitably includes constructing a data set by processing raw data into a form ready for statistical analysis. Data processing often involves choices among several reasonable options for excluding, transforming, and coding data. We suggest that instead of performing only one analysis, researchers could perform a multiverse analysis, which involves performing all analyses across the whole set of alternatively processed data sets corresponding to a large set of reasonable scenarios. Using an example focusing on the effect of fertility on religiosity and political attitudes, we show that analyzing a single data set can be misleading and propose a multiverse analysis as an alternative practice. A multiverse analysis offers an idea of how much the conclusions change because of arbitrary choices in data construction and gives pointers as to which choices are most consequential in the fragility of the result. © The Author(s) 2016.
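A multiverse analysis is straightforward to mechanize: enumerate the processing choices, rerun the analysis on each resulting data set, and inspect the spread of results. A minimal sketch with synthetic data and hypothetical processing options:

```python
# Minimal sketch of a multiverse analysis: every combination of
# reasonable data-processing choices defines one "universe"; the same
# analysis is run in each. The data and options are synthetic.

import itertools
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = rng.normal(size=200)            # e.g., a fertility measure
y = 0.1 * x + rng.normal(size=200)  # e.g., religiosity

outlier_rules = {"keep all": np.inf, "drop |x|>2.5": 2.5, "drop |x|>2": 2.0}
transforms = {"raw": lambda v: v, "rank": lambda v: stats.rankdata(v)}

for (o_name, cut), (t_name, f) in itertools.product(outlier_rules.items(),
                                                    transforms.items()):
    keep = np.abs(x) <= cut
    r, p = stats.pearsonr(f(x[keep]), f(y[keep]))
    print(f"{o_name:14s} + {t_name:4s}: r = {r:+.3f}, p = {p:.4f}")
```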
Human Factors Operability Timeline Analysis to Improve the Processing Flow of the Orion Spacecraft
NASA Technical Reports Server (NTRS)
Schlierf, Roland; Stambolian, Damon B.; Miller, Darcy; Posanda, Juan; Haddock, Mike; Haddad, Mike; Tran, Donald; Henderson, Gena; Barth, Tim
2010-01-01
The Constellation Program (CxP) Orion vehicle goes through several areas and stages of processing before it is launched at the Kennedy Space Center. In order to have efficient and effective processing, all of the activities need to be analyzed. This was accomplished by first developing a timeline of events that included each activity; each activity was then analyzed by operability experts and human factors experts with spacecraft processing experience. This paper's focus is to explain the results and the process of developing this human factors operability timeline analysis to improve the processing flow of Orion.
Reduced product yield in chemical processes by second law effects
NASA Technical Reports Server (NTRS)
England, C.; Funk, J. E.
1980-01-01
An analysis of second law effects in chemical processes, where product yield is explicitly related to the individual irreversibilities within the process to indicate a maximum theoretical yield, is presented. Examples are given that indicate differences between first and second law approaches toward process efficiency and process yield. This analysis also expresses production capacity in terms of the heating value of a product. As a result, it is particularly convenient in analyzing fuel conversion plants and their potential for improvement. Relationships are also given for the effects of irreversibilities on requirements for process heat and for feedstocks.
Sensitivity analysis of the add-on price estimate for the edge-defined film-fed growth process
NASA Technical Reports Server (NTRS)
Mokashi, A. R.; Kachare, A. H.
1981-01-01
The analysis is in terms of cost parameters and production parameters. The cost parameters include equipment, space, direct labor, materials, and utilities. The production parameters include growth rate, process yield, and duty cycle. A computer program was developed specifically to do the sensitivity analysis.
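A hedged sketch of such a one-at-a-time sensitivity analysis; the add-on price model and baseline parameter values below are hypothetical stand-ins for the program's actual cost model:

```python
# Sketch of one-at-a-time sensitivity analysis: perturb each production
# parameter around its baseline and record the change in a simple
# add-on price model. Model form and values are illustrative only.

baseline = {"growth_rate": 1.0, "process_yield": 0.85, "duty_cycle": 0.80}

def add_on_price(p):
    # Illustrative model: price scales inversely with throughput terms.
    fixed_cost_per_hour = 50.0
    return fixed_cost_per_hour / (p["growth_rate"] * p["process_yield"]
                                  * p["duty_cycle"])

base_price = add_on_price(baseline)
for name in baseline:
    for delta in (-0.10, +0.10):  # +/-10% perturbations
        p = dict(baseline)
        p[name] *= 1.0 + delta
        change = 100.0 * (add_on_price(p) - base_price) / base_price
        print(f"{name} {delta:+.0%}: price change {change:+.1f}%")
```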
Global sensitivity analysis of DRAINMOD-FOREST, an integrated forest ecosystem model
Shiying Tian; Mohamed A. Youssef; Devendra M. Amatya; Eric D. Vance
2014-01-01
Global sensitivity analysis is a useful tool to understand process-based ecosystem models by identifying key parameters and processes controlling model predictions. This study reported a comprehensive global sensitivity analysis for DRAINMOD-FOREST, an integrated model for simulating water, carbon (C), and nitrogen (N) cycles and plant growth in lowland forests. The...
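A sketch of a variance-based global sensitivity analysis of this kind, using the SALib library on a toy function standing in for the model; the parameter names and bounds are invented, and DRAINMOD-FOREST itself is not wrapped here:

```python
# Sketch of Sobol global sensitivity analysis with SALib. The "model"
# is a placeholder function; names and bounds are hypothetical.

import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["drainage_conductivity", "n_mineralization_rate", "lai_max"],
    "bounds": [[0.1, 2.0], [0.001, 0.05], [2.0, 8.0]],
}

X = saltelli.sample(problem, 1024)

def toy_model(row):
    k, n, lai = row
    return k * 2.0 + 10.0 * n * lai  # stands in for a model run

Y = np.apply_along_axis(toy_model, 1, X)
Si = sobol.analyze(problem, Y)
for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name}: first-order {s1:.3f}, total-order {st:.3f}")
```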
State Analysis: A Control Architecture View of Systems Engineering
NASA Technical Reports Server (NTRS)
Rasmussen, Robert D.
2005-01-01
A viewgraph presentation on the state analysis process is shown. The topics include: 1) Issues with growing complexity; 2) Limits of common practice; 3) Exploiting a control point of view; 4) A glimpse at the State Analysis process; 5) Synergy with model-based systems engineering; and 6) Bridging the systems to software gap.
Strategic and Market Analysis | Bioenergy | NREL
NREL's strategic and market analysis work includes recent efforts in comparative techno-economic analysis, considering a wide range of conversion intermediates. NREL has developed first-of-its-kind process models and economic assessments of co-processing; this work strives to understand the economic incentives, technical risks, and key data gaps that need to be addressed.
2013-03-01
[Garbled table-of-contents fragments; recoverable headings: "Requirements Analysis Process," "Requirements Management and Analysis Plan," "Knowledge Point Reviews," "RMAP/CDD Process Analysis and Results," and "TD Phase Begins."]
ERIC Educational Resources Information Center
Coad, Jane; Evans, Ruth
2008-01-01
This article reflects on key methodological issues emerging from children and young people's involvement in data analysis processes. We outline a pragmatic framework illustrating different approaches to engaging children, using two case studies of children's experiences of participating in data analysis. The article highlights methods of…
ERIC Educational Resources Information Center
Bjerstedt, Ake
A three-volume series describes the construction of a self-instructional system as a work process with three main phases: system analysis, system synthesis, and system modification and evaluation. After an introductory discussion of some basic principles of instructional programing, this first volume focuses on the system analysis phase,…
Wójcicki, Tomasz; Nowicki, Michał
2016-01-01
The article presents a selected area of research and development concerning methods of material analysis based on automatic image recognition of the investigated metallographic sections. The objectives of the analyses of materials for gas nitriding technology are described. The methods of preparation of nitrided layers, the steps of the process, and the construction and operation of devices for gas nitriding are given. We discuss the possibility of using digital image processing methods in the analysis of the materials, as well as their essential task groups: improving the quality of the images, segmentation, morphological transformations, and image recognition. The developed analysis model of nitrided layer formation, covering image processing and analysis techniques as well as selected methods of artificial intelligence, is presented. The model is divided into stages, which are formalized in order to better reproduce their actions. Validation of the presented method is performed. The advantages and limitations of the developed solution, as well as the possibilities of its practical use, are listed. PMID:28773389
Processing and analysis of cardiac optical mapping data obtained with potentiometric dyes
Laughner, Jacob I.; Ng, Fu Siong; Sulkin, Matthew S.; Arthur, R. Martin
2012-01-01
Optical mapping has become an increasingly important tool to study cardiac electrophysiology in the past 20 years. Multiple methods are used to process and analyze cardiac optical mapping data, and no consensus currently exists regarding the optimum methods. The specific methods chosen to process optical mapping data are important because inappropriate data processing can affect the content of the data and thus alter the conclusions of the studies. Details of the different steps in processing optical imaging data, including image segmentation, spatial filtering, temporal filtering, and baseline drift removal, are provided in this review. We also provide descriptions of the common analyses performed on data obtained from cardiac optical imaging, including activation mapping, action potential duration mapping, repolarization mapping, conduction velocity measurements, and optical action potential upstroke analysis. Optical mapping is often used to study complex arrhythmias, and we also discuss dominant frequency analysis and phase mapping techniques used for the analysis of cardiac fibrillation. PMID:22821993
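Three of the processing steps reviewed here can be sketched on a synthetic image stack; the filter settings below are illustrative choices, not the review's recommendations:

```python
# Sketch of spatial filtering, temporal filtering, and baseline drift
# removal on a synthetic optical-mapping stack of shape
# (frames, rows, cols). All settings are illustrative.

import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.signal import butter, detrend, filtfilt

fs = 1000.0  # frames per second (assumed)
rng = np.random.default_rng(3)
stack = rng.random((2000, 32, 32))  # placeholder fluorescence data

# Spatial filtering: Gaussian blur within each frame only.
stack = gaussian_filter(stack, sigma=(0, 1.0, 1.0))

# Temporal filtering: zero-phase low-pass along the time axis.
b, a = butter(3, 100.0 / (fs / 2.0), btype="low")
stack = filtfilt(b, a, stack, axis=0)

# Baseline drift removal: subtract a linear trend per pixel over time.
stack = detrend(stack, axis=0, type="linear")

print(stack.shape, float(stack.mean()))
```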
GaiaGrid : Its Implications and Implementation
NASA Astrophysics Data System (ADS)
Ansari, S. G.; Lammers, U.; Ter Linden, M.
2005-12-01
Gaia is an ESA space mission to determine positions of 1 billion objects in the Galaxy at micro-arcsecond precision. The data analysis and processing requirements of the mission involve about 20 institutes across Europe, each providing specific algorithms for specific tasks, which range from relativistic effects on positional determination, classification, astrometric binary star detection, photometric analysis, spectroscopic analysis, etc. In an initial phase, a study has been ongoing over the past three years to determine the complexity of Gaia's data processing. Two processing categories have materialised: core and shell. While core deals with routine data processing, shell tasks are algorithms to carry out data analysis, which involves the Gaia community at large. For this latter category, we are currently experimenting with the use of Grid paradigms to allow access to the core data and to augment processing power to simulate and analyse the data in preparation for the actual mission. We present preliminary results and discuss the sociological impact of distributing the tasks amongst the community.
Ibrahim, Reham S; Fathy, Hoda
2018-03-30
Tracking the impact of commonly applied post-harvesting and industrial processing practices on the compositional integrity of ginger rhizome was implemented in this work. Untargeted metabolite profiling was performed using a digitally enhanced HPTLC method in which the chromatographic fingerprints were extracted using ImageJ software and then analysed with multivariate Principal Component Analysis (PCA) for pattern recognition. A targeted approach was applied using a new, validated, simple and fast HPTLC image analysis method for simultaneous quantification of the officially recognized markers 6-, 8-, and 10-gingerol and 6-shogaol, in conjunction with chemometric Hierarchical Clustering Analysis (HCA). The results of both targeted and untargeted metabolite profiling revealed that the peeling, drying, and storage employed during processing have a great influence on the ginger chemo-profile; the different forms of processed ginger should not be used interchangeably. Moreover, it is deemed necessary to consider the holistic metabolic profile for comprehensive evaluation of ginger during processing. Copyright © 2018. Published by Elsevier B.V.
Zimmermann, Hartmut F; Hentschel, Norbert
2011-01-01
With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has already become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. The application for such a biopharmaceutical process FMEA is widespread. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled-up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked for importance and important variables for process development, characterization, or validation can be identified. Health authorities around the world ask pharmaceutical companies to manage risk during development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in mechanical and electrical industries. The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. With the help of this guideline, different details of the manufacturing process can be ranked according to their potential risks, and this can help pharmaceutical companies to identify aspects with high potential risks and to react accordingly to improve the safety of medicines.
Tests of Spectral Cloud Classification Using DMSP Fine Mode Satellite Data.
1980-06-02
[Garbled report fragments; recoverable content: Fourier spectral analysis was identified as the most promising technique to upgrade automated processing of DMSP fine-mode satellite imagery (surface resolution of the measurements is 0.3 n mi); cited references include Pickett, R.M., and Blackman, E.S. (1976), "Automated Processing of Satellite Imagery Data at the Air Force Global Weather Central," and a 1977 follow-on report, "Demonstrations of Spectral Analysis."]
A Comparative Analysis of Extract, Transformation and Loading (ETL) Process
NASA Astrophysics Data System (ADS)
Runtuwene, J. P. A.; Tangkawarow, I. R. H. T.; Manoppo, C. T. M.; Salaki, R. J.
2018-02-01
The current growth of data and information occurs rapidly in varying amounts and media. This type of development will eventually produce large amounts of data, better known as Big Data. Business Intelligence (BI) utilizes large amounts of data and information for analysis so that one can obtain important information. This type of information can be used to support the decision-making process. In practice, a process integrating existing data and information into a data warehouse is needed. This data integration process is known as Extract, Transformation and Loading (ETL). In practice, many applications have been developed to carry out the ETL process, but selecting which application is more effective and efficient in terms of time, cost and effort can be a challenge. Therefore, the objective of the study was to provide a comparative analysis of the ETL process using Microsoft SQL Server Integration Services (SSIS) and the ETL process using Pentaho Data Integration (PDI).
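The ETL pattern under comparison can be illustrated independently of either tool with a minimal Python sketch (SQLite stands in for the warehouse; the file and column names are hypothetical):

```python
# Minimal end-to-end ETL sketch: extract rows from a CSV source,
# transform them, and load them into a warehouse table.

import csv
import sqlite3

def extract(path):
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(row):
    # Example transformations: cleanup, type casting, derived column.
    return (row["order_id"], row["region"].strip().upper(),
            float(row["amount"]) * 1.1)  # amount with an assumed 10% tax

conn = sqlite3.connect("warehouse.db")
conn.execute("""CREATE TABLE IF NOT EXISTS sales
                (order_id TEXT, region TEXT, gross_amount REAL)""")

conn.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                 (transform(r) for r in extract("orders.csv")))
conn.commit()
conn.close()
```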
Second-order analysis of semiparametric recurrent event processes.
Guan, Yongtao
2011-09-01
A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a follow-up period. Such data have become increasingly available in medical and epidemiological studies. In this article, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on meningococcal disease cases in Merseyside, United Kingdom to illustrate their practical value. © 2011, The International Biometric Society.
SDI-based business processes: A territorial analysis web information system in Spain
NASA Astrophysics Data System (ADS)
Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.
2012-09-01
Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to the Spanish citizens knowledge about their territory.
Process Feasibility Study in Support of Silicon Material Task 1
NASA Technical Reports Server (NTRS)
Li, K. Y.; Hansen, K. C.; Yaws, C. L.
1979-01-01
Analysis of process system properties was continued for silicon source materials under consideration for producing silicon. The following property data are reported for dichlorosilane, which is involved in processing operations for silicon: critical constants, vapor pressure, heat of vaporization, heat capacity, density, surface tension, thermal conductivity, heat of formation and Gibbs free energy of formation. The properties are reported as a function of temperature to permit rapid engineering usage. The preliminary economic analysis of the process is described. Cost analysis results for the process (case A: two deposition reactors and six electrolysis cells) are presented based on a preliminary process design of a plant to produce 1,000 metric tons/year of silicon. The fixed capital investment estimate for the plant is $12.47 million (1975 dollars) ($17.47 million, 1980 dollars). Product cost without profit is 8.63 $/kg of silicon (1975 dollars) (12.1 $/kg, 1980 dollars).
10 CFR 712.36 - Medical assessment process.
Code of Federal Regulations, 2010 CFR
2010-01-01
... assigned duties. (b) Employers must provide a job task analysis for those individuals involved in HRP... performed if a job task analysis has not been provided. (c) The medical process by the Designated Physician...
Thermo-Mechanical Analysis for John Deere Electronics Solutions
[Web-page residue; recoverable topics: impacts of alternative manufacturing processes; die, package, and interface material analysis for power module reliability; manufacturing process impacts versus thermal cycling impacts on power modules.]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lines, Amanda M.; Nelson, Gilbert L.; Casella, Amanda J.
Microfluidic devices are a growing field with significant potential for application to small-scale processing of solutions. Much like large-scale processing, fast, reliable, and cost-effective means of monitoring the streams during processing are needed. Here we apply a novel Micro-Raman probe to the on-line monitoring of streams within a microfluidic device. For either macro- or micro-scale process monitoring via spectroscopic response, there is the danger of interfering or confounded bands obfuscating results. By utilizing chemometric analysis, a form of multivariate analysis, species can be accurately quantified in solution despite the presence of overlapping or confounded spectroscopic bands. This is demonstrated on solutions of HNO3 and NaNO3 within micro-flow and microfluidic devices.
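A sketch of how a chemometric model quantifies species despite overlapping bands, using partial least squares regression on simulated spectra; PLS is a common multivariate choice for this task, not necessarily the authors' exact method:

```python
# Sketch of chemometric quantification from overlapping spectra:
# partial least squares regression maps spectra to concentrations even
# when bands overlap. The spectra here are simulated, not Raman data.

import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
wavenumbers = np.linspace(200, 1800, 400)

def band(center, width):
    return np.exp(-((wavenumbers - center) / width) ** 2)

# Two analytes with deliberately overlapping bands.
pure = np.vstack([band(1045, 30), band(1060, 35)])

conc = rng.uniform(0.1, 2.0, size=(40, 2))  # training concentrations
spectra = conc @ pure + 0.01 * rng.normal(size=(40, 400))

pls = PLSRegression(n_components=4).fit(spectra, conc)
test = np.array([[0.8, 1.5]]) @ pure  # noiseless test mixture
print("predicted concentrations:", pls.predict(test).round(2))
```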
Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants
NASA Astrophysics Data System (ADS)
Kulbjakina, A. V.; Dolotovskij, I. V.
2018-01-01
The article discusses the methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are major consumers of energy resources (ER) for their own needs. Reducing ER use, including fuel consumption, and developing a rational fuel system structure are complex and relevant scientific tasks that can only be accomplished using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, the block scheme for the synthesis of the most efficient alternative of the fuel system using mathematical models, and the set of performance criteria have been developed for the main stages of the study. The results from the introduction of specific engineering solutions to develop in-house energy supply sources for RH processing facilities are also provided.
Introduction of male circumcision for HIV prevention in Uganda: analysis of the policy process.
Odoch, Walter Denis; Kabali, Kenneth; Ankunda, Racheal; Zulu, Joseph Mumba; Tetui, Moses
2015-06-20
Health policy analysis is important for all health policies especially in fields with ever changing evidence-based interventions such as HIV prevention. However, there are few published reports of health policy analysis in sub-Saharan Africa in this field. This study explored the policy process of the introduction of male circumcision (MC) for HIV prevention in Uganda in order to inform the development processes of similar health policies. Desk review of relevant documents was conducted between March and May 2012. Thematic analysis was used to analyse the data. Conceptual frameworks that demonstrate the interrelationship within the policy development processes and influence of actors in the policy development processes guided the analysis. Following the introduction of MC on the national policy agenda in 2007, negotiation and policy formulation preceded its communication and implementation. Policy proponents included academic researchers in the early 2000s and development partners around 2007. Favourable contextual factors that supported the development of the policy included the rising HIV prevalence, adoption of MC for HIV prevention in other sub-Saharan African countries, and expertise on MC. Additionally, the networking capability of proponents facilitated the change in position of non-supportive or neutral actors. Non-supportive and neutral actors in the initial stages of the policy development process included the Ministry of Health, traditional and Muslim leaders, and the Republican President. Using political authority, legitimacy, and charisma, actors who opposed the policy tried to block the policy development process. Researchers' initial disregard of the Ministry of Health in the research process of MC and the missing civil society advocacy arm contributed to delays in the policy development process. This study underscores the importance of securing top political leadership as well as key implementing partners' support in policy development processes. Equally important is the appreciation of the various forms of actors' power and how such power shapes the policy agenda, development process, and content.
Bonan, Brigitte; Martelli, Nicolas; Berhoune, Malik; Maestroni, Marie-Laure; Havard, Laurent; Prognon, Patrice
2009-02-01
To apply the Hazard Analysis and Critical Control Points (HACCP) method to the preparation of anti-cancer drugs; to identify critical control points in our cancer chemotherapy process; and to propose control measures and corrective actions to manage these processes. The HACCP application began in January 2004 in our centralized chemotherapy compounding unit. From October 2004 to August 2005, monitoring of process nonconformities was performed to assess the method. According to the HACCP method, a multidisciplinary team was formed to describe and assess the cancer chemotherapy process. This team listed all of the critical points and calculated their risk indexes according to their frequency of occurrence, their severity, and their detectability. The team defined monitoring, control measures, and corrective actions for each identified risk. Finally, over a 10-month period, pharmacists reported each nonconformity of the process in a follow-up document. Our team described 11 steps in the cancer chemotherapy process. The team identified 39 critical control points, including 11 of higher importance with a high risk index. Over 10 months, 16,647 preparations were performed; 1,225 nonconformities were reported during this same period. The HACCP method is relevant when it is used to target a specific process such as the preparation of anti-cancer drugs. This method helped us to focus on the production steps that can have a critical influence on product quality and led us to improve our process.
NASA Astrophysics Data System (ADS)
Xie, Dongjin; Xu, Jing; Cheng, Haifeng; Wang, Nannan; Zhou, Qun
2018-06-01
The thermochromic compound [(C2H5)2NH2]2CuCl4 displays a solid-solid phase transition at 52 °C, apparent as a color change from green to yellow, induced by the geometry of the [CuCl4]2- anion (regarded as the chromophore of the compound) changing from a square-planar to a flattened tetrahedral structure. Fourier transform infrared (FTIR) spectroscopy and two-dimensional correlation (2D-COS) analysis have been applied to study the role played by the amine and ethyl groups of the ammonium cation during the phase transition in the heating and cooling processes. With increasing temperature, weakening of the N-H…Cl H-bond and thermal disordering of the alkyl chain both occur in the phase transition. 2D-COS analysis reveals that the N-H…Cl H-bond responds first to increasing temperature and may be the dominant driving force for the structure variation of the [CuCl4]2- anion. Although the thermochromism of [(C2H5)2NH2]2CuCl4 is reversible, the sequential orders of the variation of the NH2+ and alkyl groups derived by 2D-COS analysis during heating and cooling are reversed, indicating that the dynamic process of the phase transition is not perfectly reversible. The existence of an undercooling phenomenon in the cooling process has also been revealed by 2D-COS analysis.
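The 2D-COS machinery used in this study follows Noda's formalism: synchronous and asynchronous correlation maps are built from mean-centered dynamic spectra, the asynchronous map via the Hilbert-Noda transformation matrix. A minimal numpy sketch with synthetic data:

```python
# Sketch of generalized two-dimensional correlation analysis (2D-COS):
# synchronous and asynchronous spectra from mean-centered dynamic
# spectra. The spectral data are synthetic placeholders.

import numpy as np

rng = np.random.default_rng(5)
m, n = 12, 200           # m temperatures, n wavenumber points
dynamic = rng.random((m, n))
dynamic -= dynamic.mean(axis=0)  # subtract the reference (mean) spectrum

# Synchronous correlation spectrum.
sync = dynamic.T @ dynamic / (m - 1)

# Hilbert-Noda matrix: N[j, k] = 1 / (pi * (k - j)) for j != k, else 0.
j = np.arange(m)
diff = j[None, :] - j[:, None]
noda = np.where(diff == 0, 0.0, 1.0 / (np.pi * np.where(diff == 0, 1, diff)))

# Asynchronous correlation spectrum.
asyn = dynamic.T @ noda @ dynamic / (m - 1)

print(sync.shape, asyn.shape)  # both (n, n)
```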
Query-Driven Visualization and Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruebel, Oliver; Bethel, E. Wes; Prabhat, Mr.
2012-11-01
This report focuses on an approach to high performance visualization and analysis, termed query-driven visualization and analysis (QDV). QDV aims to reduce the amount of data that needs to be processed by the visualization, analysis, and rendering pipelines. The goal of the data reduction process is to separate out data that is "scientifically interesting" and to focus visualization, analysis, and rendering on that interesting subset. The premise is that for any given visualization or analysis task, the data subset of interest is much smaller than the larger, complete data set. This strategy of extracting smaller data subsets of interest and focusing the visualization processing on these subsets is complementary to the approach of increasing the capacity of the visualization, analysis, and rendering pipelines through parallelism. This report discusses the fundamental concepts in QDV, their relationship to different stages in the visualization and analysis pipelines, and presents QDV's application to problems in diverse areas, ranging from forensic cybersecurity to high energy physics.
Coal gasification systems engineering and analysis. Appendix A: Coal gasification catalog
NASA Technical Reports Server (NTRS)
1980-01-01
The scope of work in preparing the Coal Gasification Data Catalog included the following subtasks: (1) candidate system subsystem definition, (2) raw materials analysis, (3) market analysis for by-products, (4) alternate products analysis, (5) preliminary integrated facility requirements. Definition of candidate systems/subsystems includes the identity of and alternates for each process unit, raw material requirements, and the cost and design drivers for each process design.
Tao, Ling; Aden, Andy; Elander, Richard T; Pallapolu, Venkata Ramesh; Lee, Y Y; Garlock, Rebecca J; Balan, Venkatesh; Dale, Bruce E; Kim, Youngmi; Mosier, Nathan S; Ladisch, Michael R; Falls, Matthew; Holtzapple, Mark T; Sierra, Rocio; Shi, Jian; Ebrik, Mirvat A; Redmond, Tim; Yang, Bin; Wyman, Charles E; Hames, Bonnie; Thomas, Steve; Warner, Ryan E
2011-12-01
Six biomass pretreatment processes to convert switchgrass to fermentable sugars and ultimately to cellulosic ethanol are compared on a consistent basis in this technoeconomic analysis. The six pretreatment processes are ammonia fiber expansion (AFEX), dilute acid (DA), lime, liquid hot water (LHW), soaking in aqueous ammonia (SAA), and sulfur dioxide-impregnated steam explosion (SO(2)). Each pretreatment process is modeled in the framework of an existing biochemical design model so that systematic variations of process-related changes are consistently captured. The pretreatment area process design and simulation are based on the research data generated within the Biomass Refining Consortium for Applied Fundamentals and Innovation (CAFI) 3 project. Overall ethanol production, total capital investment, and minimum ethanol selling price (MESP) are reported along with selected sensitivity analysis. The results show limited differentiation between the projected economic performances of the pretreatment options, except for processes that exhibit significantly lower monomer sugar and resulting ethanol yields. Copyright © 2011 Elsevier Ltd. All rights reserved.
Xue, Xiu-Juan; Gao, Qing; Qiao, Jian-Hong; Zhang, Jie; Xu, Cui-Ping; Liu, Ju
2014-01-01
This meta-analysis summarized the published studies on the association between red/processed meat consumption and the risk of lung cancer. Five databases were systematically reviewed, and a random-effects model was used to pool the study results and to assess dose-response relationships. Six cohort studies and twenty-eight case-control studies were included in this meta-analysis. The pooled Risk Ratios (RR) for total red meat and processed meat were 1.44 (95% CI, 1.29-1.61) and 1.23 (95% CI, 1.10-1.37), respectively. Dose-response analysis revealed that for every increment of 120 grams of red meat per day the risk of lung cancer increases by 35%, and for every increment of 50 grams of processed meat per day the risk increases by 20%. The present dose-response meta-analysis suggests that both red and processed meat consumption are positively associated with lung cancer risk. PMID:25035778
Retinal imaging analysis based on vessel detection.
Jamal, Arshad; Hazim Alkawaz, Mohammed; Rehman, Amjad; Saba, Tanzila
2017-07-01
With the increasing advancement of digital imaging and computing power, computationally intelligent technologies are in high demand for use in ophthalmologic care and treatment. In the current research, Retina Image Analysis (RIA) is developed for optometrists at the Eye Care Center in Management and Science University. This research aims to analyze the retina through vessel detection. The RIA assists in the analysis of retinal images, and specialists are served with various options such as saving, processing, and analyzing retinal images through its advanced interface layout. Additionally, RIA assists in the selection of vessel segments, processing these vessels by calculating their diameter, standard deviation, and length, and displaying detected vessels on the retina. The Agile Unified Process was adopted as the development methodology. In conclusion, Retina Image Analysis may help optometrists gain a better understanding when analyzing a patient's retina. The Retina Image Analysis procedure was developed using MATLAB (R2011b). Promising results are attained that are comparable to the state of the art. © 2017 Wiley Periodicals, Inc.
FPGA-Based Filterbank Implementation for Parallel Digital Signal Processing
NASA Technical Reports Server (NTRS)
Berner, Stephan; DeLeon, Phillip
1999-01-01
One approach to parallel digital signal processing decomposes a high-bandwidth signal into multiple lower-bandwidth (rate) signals using an analysis bank. After processing, the subband signals are recombined into a fullband output signal by a synthesis bank. This paper describes an implementation of the analysis and synthesis banks using Field Programmable Gate Arrays (FPGAs).
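To make the analysis/synthesis structure concrete, here is a minimal two-band sketch in Python rather than on an FPGA; the complementary filter pair and all parameters are illustrative assumptions, not the paper's design. Splitting with a half-band lowpass and its complement lets the summed subbands reconstruct a delayed copy of the input.

```python
import numpy as np
from scipy import signal

# Two-band analysis/synthesis sketch (no decimation, for clarity).
fs = 48_000
t = np.arange(4096) / fs
x = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 9000 * t)

ntaps = 101
h_lp = signal.firwin(ntaps, cutoff=0.5)   # half-band lowpass
h_hp = -h_lp
h_hp[ntaps // 2] += 1.0                   # complementary highpass: delta - h_lp

low = signal.lfilter(h_lp, 1.0, x)        # analysis bank: low subband
high = signal.lfilter(h_hp, 1.0, x)       # analysis bank: high subband

y = low + high                            # synthesis: recombine subbands
delay = ntaps // 2                        # the filters introduce a group delay
err = np.max(np.abs(y[delay:] - x[:-delay]))
print(f"max reconstruction error: {err:.2e}")
```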
ERIC Educational Resources Information Center
McCormick, Joe Lew
This study examined major stakeholders' perceptions of their involvement and role in the legislative process surrounding the introduction, deliberation, and ultimate passage of the Direct Loan Demonstration Program (DLDP), a federal pilot student loan program. Data analysis was based on a detailed description of the legislative process surrounding…
DEP : a computer program for evaluating lumber drying costs and investments
Stewart Holmes; George B. Harpole; Edward Bilek
1983-01-01
The DEP computer program is a modified discounted cash flow computer program designed for analysis of problems involving economic analysis of wood drying processes. Wood drying processes are different from other processes because of the large amounts of working capital required to finance inventories, and because of relatively large shares of costs charged to inventory...
Understanding Processes and Timelines for Distributed Photovoltaic
This study draws on data from more than 30,000 PV systems across 87 utilities in 16 states to better understand solar photovoltaic (PV) interconnection process time frames in the United States, including an analysis of the four steps involved in the utility interconnection process for solar.
Information Acquisition, Analysis and Integration
2016-08-03
Topics include sensing and processing theory and applications, signal processing, image and video processing, machine learning, and technology transfer. Contributions include new approaches to classic problems such as image and video deblurring. Related publication: G. Polatkan, G. Sapiro, D. Blei, D. B. Dunson, and L. Carin, "Deep learning with hierarchical convolutional factor analysis," IEEE.
ERIC Educational Resources Information Center
Duffy, Melissa C.; Azevedo, Roger; Sun, Ning-Zi; Griscom, Sophia E.; Stead, Victoria; Crelinsten, Linda; Wiseman, Jeffrey; Maniatis, Thomas; Lachapelle, Kevin
2015-01-01
This study examined the nature of cognitive, metacognitive, and affective processes among a medical team experiencing difficulty managing a challenging simulated medical emergency case by conducting in-depth analysis of process data. Medical residents participated in a simulation exercise designed to help trainees to develop medical expertise,…
Code of Federal Regulations, 2010 CFR
2010-07-01
..., processing, or interpretation of any geological data and information. Initial analysis and processing are the stages of analysis or processing where the data and information first become available for in-house... information are available for submission, inspection, and selection? 280.40 Section 280.40 Mineral Resources...
Study of process variables associated with manufacturing hermetically-sealed nickel-cadmium cells
NASA Technical Reports Server (NTRS)
Miller, L.; Doan, D. J.; Carr, E. S.
1971-01-01
A program to determine and study the critical process variables associated with the manufacture of aerospace, hermetically-sealed, nickel-cadmium cells is described. The determination and study of the process variables associated with the positive and negative plaque impregnation/polarization process are emphasized. The experimental data resulting from the implementation of fractional factorial design experiments are analyzed by means of a linear multiple regression analysis technique. This analysis permits the selection of preferred levels for certain process variables to achieve desirable impregnated plaque characteristics.
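The regression step described above can be illustrated with a minimal sketch; the coded factor levels, factor meanings, and response values below are invented for illustration, not the program's data.

```python
import numpy as np

# Hypothetical fractional-factorial results: coded levels (-1/+1) for three
# assumed impregnation factors (e.g., bath temperature, current density,
# immersion time) and an invented plaque-loading response.
X = np.array([
    [-1, -1, -1],
    [+1, -1, +1],
    [-1, +1, +1],
    [+1, +1, -1],
])
y = np.array([1.42, 1.61, 1.55, 1.48])

# Fit y = b0 + b1*x1 + b2*x2 + b3*x3 by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("intercept and main effects:", np.round(coef, 4))
# The signs and magnitudes of b1..b3 suggest which factor levels to prefer.
```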
NASA Astrophysics Data System (ADS)
Syafrina, R.; Rohman, I.; Yuliani, G.
2018-05-01
This study aims to analyze the characteristics of the concepts of solubility and solubility product that will serve as the basis for the development of a virtual laboratory and students' science process skills. The characteristics analyzed include concept definitions, concept attributes, and types of concepts. The concept analysis method follows Herron. The results show that there are twelve chemical concepts that are prerequisites before studying solubility and solubility product, and five core concepts that students must understand. As many as 58.3% of the concept definitions contained in high school textbooks support students' science process skills; the remaining definitions are rote. Concept attributes that meet the three levels of chemical representation and can be incorporated into a virtual laboratory account for 66.6%. By type of concept, 83.3% are concepts based on principles and 16.6% are concepts that describe processes. Meanwhile, the science process skills that can be developed based on the concept analysis are the abilities to observe, calculate, measure, predict, interpret, hypothesize, apply, classify, and infer.
The effects of pre-processing strategies in sentiment analysis of online movie reviews
NASA Astrophysics Data System (ADS)
Zin, Harnani Mat; Mustapha, Norwati; Murad, Masrah Azrifah Azmi; Sharef, Nurfadhlina Mohd
2017-10-01
With the ever-increasing number of internet applications and social networking sites, people nowadays can easily express their feelings towards any product or service. These online reviews act as an important source for further analysis and improved decision making. The reviews are mostly unstructured by nature and thus need processing, such as sentiment analysis and classification, to provide meaningful information for future uses. In text analysis tasks, the appropriate selection of words/features has a huge impact on the effectiveness of the classifier. Thus, this paper explores the effect of pre-processing strategies in the sentiment analysis of online movie reviews. A supervised machine learning method was used to classify the reviews: a support vector machine (SVM) with linear and non-linear kernels. The performance of the classifier is examined based on precision, recall, F-measure, and accuracy. Two different feature representations were used: term frequency and term frequency-inverse document frequency. Results show that the pre-processing strategies have a significant impact on the classification process.
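A minimal sketch of such an experiment, assuming scikit-learn and an invented toy corpus (the paper's dataset and exact pipeline are not shown here), compares the two feature representations feeding a linear SVM:

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Toy corpus standing in for movie reviews (labels: 1 = positive, 0 = negative).
reviews = ["a wonderful, moving film", "dull plot and wooden acting",
           "brilliant performances throughout", "a tedious, forgettable mess",
           "I loved every minute", "complete waste of time"] * 5
labels = [1, 0, 1, 0, 1, 0] * 5

# Compare raw term frequency vs. TF-IDF, each feeding a linear SVM.
for name, vec in [("TF", CountVectorizer(stop_words="english")),
                  ("TF-IDF", TfidfVectorizer(stop_words="english"))]:
    clf = make_pipeline(vec, LinearSVC())
    scores = cross_val_score(clf, reviews, labels, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```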
TU-AB-BRD-04: Development of Quality Management Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomadsen, B.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Disclosures: Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
Full Life Cycle of Data Analysis with Climate Model Diagnostic Analyzer (CMDA)
NASA Astrophysics Data System (ADS)
Lee, S.; Zhai, C.; Pan, L.; Tang, B.; Zhang, J.; Bao, Q.; Malarout, N.
2017-12-01
We have developed a system that supports the full life cycle of a data analysis process, from data discovery, to data customization, to analysis, to reanalysis, to publication, and to reproduction. The system, called Climate Model Diagnostic Analyzer (CMDA), is designed to demonstrate that the full life cycle of data analysis can be supported within one integrated system for climate model diagnostic evaluation with global observational and reanalysis datasets. CMDA has four subsystems that are highly integrated to support the analysis life cycle. The Data System manages datasets used by CMDA analysis tools, the Analysis System manages the CMDA analysis tools, which are all web services, the Provenance System manages the metadata of CMDA datasets and the provenance of CMDA analysis history, and the Recommendation System extracts knowledge from CMDA usage history and recommends datasets and analysis tools to users. These four subsystems are not only highly integrated but also easily expandable. New datasets can be easily added to the Data System and scanned to be visible to the other subsystems. New analysis tools can be easily registered to be available in the Analysis System and Provenance System. With CMDA, a user can start a data analysis process by discovering datasets of relevance to their research topic using the Recommendation System. Next, the user can customize the discovered datasets for their scientific use (e.g., anomaly calculation, regridding) with tools in the Analysis System. Next, the user can perform their analysis with the tools (e.g., conditional sampling, time averaging, spatial averaging) in the Analysis System. Next, the user can reanalyze the datasets based on previously stored analysis provenance in the Provenance System. Further, they can publish their analysis process and results to the Provenance System to share with other users. Finally, any user can reproduce the published analysis process and results. By supporting the full life cycle of climate data analysis, CMDA improves the research productivity and collaboration level of its users.
Magnesium-Aluminum-Zirconium Oxide Amorphous Ternary Composite: A Dense and Stable Optical Coating
NASA Technical Reports Server (NTRS)
Sahoo, N. K.; Shapiro, A. P.
1998-01-01
In the present work, the process-parameter-dependent optical and structural properties of MgO-Al2O3-ZrO2 ternary mixed-composite material have been investigated. Optical properties were derived from spectrophotometric measurements. The surface morphology, grain size distributions, crystallographic phases, and process-dependent material composition of the films have been investigated through the use of Atomic Force Microscopy (AFM), X-ray diffraction analysis, and Energy Dispersive X-ray (EDX) analysis. EDX analysis made evident the correlation between the optical constants and the process-dependent compositions in the films. It is possible to achieve environmentally stable amorphous films with high packing density under certain optimized process conditions.
NASA Technical Reports Server (NTRS)
Thomson, F.
1972-01-01
The additional processing performed on data collected over the Rhode River Test Site and Forestry Site in November 1970 is reported. The techniques and procedures used to obtain the processed results are described. Thermal data collected over three approximately parallel lines of the site were contoured and the results color-coded to delineate important scene constituents and to identify trees attacked by pine bark beetles. Contouring work and histogram preparation are reviewed, and the important conclusions from the spectral analysis and recognition computer (SPARC) signature extension work are summarized. The SPARC setup and processing records are presented, and recommendations are made for future data collection over the site.
Parallel processing in finite element structural analysis
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
1987-01-01
A brief review is made of the fundamental concepts and basic issues of parallel processing. Discussion focuses on parallel numerical algorithms, performance evaluation of machines and algorithms, and parallelism in finite element computations. A computational strategy is proposed for maximizing the degree of parallelism at different levels of the finite element analysis process including: 1) formulation level (through the use of mixed finite element models); 2) analysis level (through additive decomposition of the different arrays in the governing equations into the contributions to a symmetrized response plus correction terms); 3) numerical algorithm level (through the use of operator splitting techniques and application of iterative processes); and 4) implementation level (through the effective combination of vectorization, multitasking and microtasking, whenever available).
MgO-Al2O3-ZrO2 Amorphous Ternary Composite: A Dense and Stable Optical Coating
NASA Technical Reports Server (NTRS)
Sahoo, Naba K.; Shapiro, Alan P.
1998-01-01
The process-parameter-dependent optical and structural properties of MgO-Al2O3-ZrO2 ternary mixed-composite material were investigated. Optical properties were derived from spectrophotometric measurements. The surface morphology, grain size distributions, crystallographic phases, and process- dependent material composition of films were investigated through the use of atomic force microscopy, x-ray diffraction analysis, and energy-dispersive x-ray analysis. Energy-dispersive x-ray analysis made evident the correlation between the optical constants and the process-dependent compositions in the films. It is possible to achieve environmentally stable amorphous films with high packing density under certain optimized process conditions.
NASA Technical Reports Server (NTRS)
Nagy, S.
1988-01-01
Because modern telescopes scan extraordinary distances, their optical surfaces must be manufactured to exacting standards of perfection, within a few thousandths of a centimeter. The detection of imperfections of less than 1/20 of a wavelength of light was undertaken for application in building the mirror for the Space Infrared Telescope Facility. Because the mirror must be kept very cold while in space, another factor comes into play: cryogenics. The process of testing a specific mirror under cryogenic conditions is described, including the follow-up analysis accomplished through computer work. To better illustrate the process and analysis, a Pyrex Hex-Core mirror is followed through the process, from laser interferometry in the lab to computer analysis via a program called FRINGE. This analysis via FRINGE is detailed.
dada - a web-based 2D detector analysis tool
NASA Astrophysics Data System (ADS)
Osterhoff, Markus
2017-06-01
The data daemon, dada, is a server backend for unified access to 2D pixel detector image data stored by different detectors, in different file formats, and saved under varying naming conventions and folder structures across instruments. Furthermore, dada implements basic pre-processing and analysis routines, ranging from pixel binning and azimuthal integration to raster scan processing. Users commonly interact with dada through a web frontend, but all parameters for an analysis are encoded into a Uniform Resource Identifier (URI), which can also be written by hand or by scripts for batch processing.
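As an illustration of one of the named routines, here is a minimal azimuthal integration in numpy; the function and the synthetic ring image are a sketch under assumed conventions, not dada's actual implementation.

```python
import numpy as np

def azimuthal_average(image, center, nbins=100):
    """Mean intensity vs. radius about `center` for a 2D detector image."""
    ny, nx = image.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(x - center[0], y - center[1])
    bins = np.linspace(0.0, r.max(), nbins + 1)
    which = np.clip(np.digitize(r.ravel(), bins) - 1, 0, nbins - 1)
    sums = np.bincount(which, weights=image.ravel(), minlength=nbins)
    counts = np.bincount(which, minlength=nbins)
    return bins[:-1], sums / np.maximum(counts, 1)

# Synthetic ring pattern as a stand-in for detector data.
yy, xx = np.indices((512, 512))
rr = np.hypot(xx - 256, yy - 256)
img = np.exp(-((rr - 120.0) ** 2) / 50.0)

radii, profile = azimuthal_average(img, center=(256, 256))
print("peak of radial profile near r =", radii[np.argmax(profile)])
```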
Analysis of launch site processing effectiveness for the Space Shuttle 26R payload
NASA Technical Reports Server (NTRS)
Flores, Carlos A.; Heuser, Robert E.; Pepper, Richard E., Jr.; Smith, Anthony M.
1991-01-01
A trend analysis study has been performed on problem reports recorded during the Space Shuttle 26R payload's processing cycle at NASA-Kennedy, using the defect-flow analysis (DFA) methodology; DFA gives attention to the characteristics of the problem-report 'population' as a whole. It is established that the problem reports contain data which distract from pressing problems, and that fully 60 percent of such reports were caused during processing at NASA-Kennedy. The second major cause of problem reports was design defects.
NASA Technical Reports Server (NTRS)
Brown, R. A.
1986-01-01
This research program focuses on analysis of the transport mechanisms in solidification processes, especially one of interest to the Microgravity Sciences and Applications Program of NASA. Research during the last year has focused on analysis of the dynamics of the floating zone process for growth of small-scale crystals, on studies of the effect of applied magnetic fields on convection and solute segregation in directional solidification, and on the dynamics of microscopic cell formation in two-dimensional solidification of binary alloys. Significant findings are given.
Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti
2013-01-01
Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can also be integrated with other tools, such as the fishbone diagram and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, to propose how to conduct the necessary steps, and to provide data templates necessary for documentation and useful for following current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand choices that have been made. This article shows how hazard analysis and critical control points can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.
From perceptual to lexico-semantic analysis--cortical plasticity enabling new levels of processing.
Schlaffke, Lara; Rüther, Naima N; Heba, Stefanie; Haag, Lauren M; Schultz, Thomas; Rosengarth, Katharina; Tegenthoff, Martin; Bellebaum, Christian; Schmidt-Wilcke, Tobias
2015-11-01
Certain kinds of stimuli can be processed on multiple levels. While the neural correlates of different levels of processing (LOPs) have been investigated to some extent, most of the studies involve skills and/or knowledge already present when performing the task. In this study we specifically sought to identify neural correlates of an evolving skill that allows the transition from perceptual to lexico-semantic stimulus analysis. Eighteen participants were trained to decode 12 letters of Morse code that were presented acoustically inside and outside of the scanner environment. Morse code was presented in trains of three letters while brain activity was assessed with fMRI. Participants either attended to the stimulus length (perceptual analysis) or evaluated its meaning, distinguishing words from nonwords (lexico-semantic analysis). Perceptual and lexico-semantic analyses shared a mutual network comprising the left premotor cortex, the supplementary motor area (SMA) and the inferior parietal lobule (IPL). Perceptual analysis was associated with strong brain activation in the SMA and the superior temporal gyrus (STG) bilaterally, which remained unaltered from pre to post training. In the lexico-semantic analysis post learning, study participants showed additional activation in the left inferior frontal cortex (IFC) and in the left occipitotemporal cortex (OTC), regions known to be critically involved in lexical processing. Our data provide evidence for cortical plasticity evolving with a learning process, enabling the transition from perceptual to lexico-semantic stimulus analysis. Importantly, the activation pattern remains related to the task-defined LOP and is thus the result of a decision process as to which LOP to engage in. © 2015 The Authors. Human Brain Mapping Published by Wiley Periodicals, Inc.
The road to smoke-free legislation in Ireland.
Currie, Laura M; Clancy, Luke
2011-01-01
To describe the process through which Ireland changed its policies towards smoking in workplaces and to distil lessons for others implementing or extending smoke-free laws. This analysis is informed by a review of secondary sources, including a commissioned media analysis, documentary analysis, and key informant interviews with policy actors who provide insight into the process of smoke-free policy development. The policy analysis techniques used include the development of a timeline for policy reform, stakeholder analysis, policy mapping, impact analysis through use of secondary data, and a review process. The policy analysis triangle, which highlights the importance of examining policy content, context, actors, and processes, is used as the analytical framework. The importance of the political, economic, social, and cultural context emerged clearly. The interaction of the context with the policy process, both in identifying the need for policy and in its formulation, demonstrated the opportunity for advocates to exert influence at all points of the process. The campaign to support the legislation had the following characteristics: a sustained, consistent, simple health message; sustained political leadership and commitment; and a strong coalition between the Health Alliance, the Office of Tobacco Control and the Department of Health and Children, with cross-party political support and trade union support. Public and media support clearly demonstrated the benefit of deliberate and consistent planning and organization of a communication strategy. The Irish smoke-free legislation was a success as a policy initiative because of timing, dedication, planning, implementation, and the existence of strong leadership and a powerful, convinced, credible political champion. © 2010 The Authors, Addiction © 2010 Society for the Study of Addiction.
Contributions to systemic analysis for worm screw production using thread whirling devices
NASA Astrophysics Data System (ADS)
Cretu, G.
2017-08-01
This paper presents a systemic analysis of worm screw machining using thread whirling devices, highlighting all the factors involved in this system. It also analyzes these factors as a function of specific machining conditions. The stages of the experimentation program and the methods used to process the data obtained are also presented.
The Controlling Function of the Agent in the Analysis of Question-Response Relationships.
ERIC Educational Resources Information Center
Bierschenk, Inger
In contrast to traditional linguistic analysis, a model based on the empirical agent is presented and tested. A text is regarded as an intentionally produced cognitive process. The analysis has to take the agent (perspective) into account to facilitate an adequate processing of its objectives (viewpoints). Moreover, the model is surface-oriented…
Code of Federal Regulations, 2014 CFR
2014-10-01
... over the lands covered by your application a written analysis of those factors applicable to your... actual costs (see § 2804.14(f) of this subpart). Submitting your analysis with the application will.... While we consider your written analysis, BLM will not process your Category 6 application. (a) FLPMA...
Code of Federal Regulations, 2013 CFR
2013-10-01
... over the lands covered by your application a written analysis of those factors applicable to your... actual costs (see § 2804.14(f) of this subpart). Submitting your analysis with the application will.... While we consider your written analysis, BLM will not process your Category 6 application. (a) FLPMA...
Code of Federal Regulations, 2011 CFR
2011-10-01
... over the lands covered by your application a written analysis of those factors applicable to your... actual costs (see § 2804.14(f) of this subpart). Submitting your analysis with the application will.... While we consider your written analysis, BLM will not process your Category 6 application. (a) FLPMA...
Code of Federal Regulations, 2012 CFR
2012-10-01
... over the lands covered by your application a written analysis of those factors applicable to your... actual costs (see § 2804.14(f) of this subpart). Submitting your analysis with the application will.... While we consider your written analysis, BLM will not process your Category 6 application. (a) FLPMA...
Linear circuit analysis program for IBM 1620 Monitor 2, 1311/1443 data processing system /CIRCS/
NASA Technical Reports Server (NTRS)
Hatfield, J.
1967-01-01
CIRCS is a modification of the IBSNAP Circuit Analysis Program for use on smaller systems. This data processing system retains the basic dc and transient analysis capabilities and the FORTRAN 2 formats. It can be used on the IBM 1620/1311 Monitor I Mod 5 system and solves a linear network containing 15 nodes and 45 branches.
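For readers unfamiliar with what such a program computes, a minimal nodal-analysis sketch follows; the three-resistor example circuit is invented, and this is modern numpy rather than anything resembling the FORTRAN 2 implementation.

```python
import numpy as np

# Nodal analysis of a small linear resistive network, the kind of dc
# solution such a program performs (illustrative sketch, not CIRCS itself).
# Circuit: 1 A source into node 1; R1 = 2 ohm node1-ground,
# R2 = 4 ohm node1-node2, R3 = 8 ohm node2-ground.
G1, G2, G3 = 1/2, 1/4, 1/8            # conductances in siemens

# Conductance matrix G and current-injection vector i:  G @ v = i
G = np.array([[G1 + G2, -G2],
              [-G2,      G2 + G3]])
i = np.array([1.0, 0.0])

v = np.linalg.solve(G, i)
print("node voltages [V]:", np.round(v, 4))
```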
The SeaDAS Processing and Analysis System: SeaWiFS, MODIS, and Beyond
NASA Astrophysics Data System (ADS)
MacDonald, M. D.; Ruebens, M.; Wang, L.; Franz, B. A.
2005-12-01
The SeaWiFS Data Analysis System (SeaDAS) is a comprehensive software package for the processing, display, and analysis of ocean data from a variety of satellite sensors. Continuous development and user support by programmers and scientists for more than a decade has helped to make SeaDAS the most widely used software package in the world for ocean color applications, with a growing base of users from the land and sea surface temperature community. Full processing support for past (CZCS, OCTS, MOS) and present (SeaWiFS, MODIS) sensors, and anticipated support for future missions such as NPP/VIIRS, enables end users to reproduce the standard ocean archive product suite distributed by NASA's Ocean Biology Processing Group (OBPG), as well as a variety of evaluation and intermediate ocean, land, and atmospheric products. Availability of the processing algorithm source codes and a software build environment also provide users with the tools to implement custom algorithms. Recent SeaDAS enhancements include synchronization of MODIS processing with the latest code and calibration updates from the MODIS Calibration Support Team (MCST), support for all levels of MODIS processing including Direct Broadcast, a port to the Macintosh OS X operating system, release of the display/analysis-only SeaDAS-Lite, and an extremely active web-based user support forum.
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to conduct a study that assesses the process improvement, quality management, and analytical techniques taught to students in U.S. college and university undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs during their academic training that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
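A minimal sketch of such a Monte Carlo baseline appears below; the driver names, weights, and distributions are invented placeholders, not the ACSI model's actual parameters.

```python
import numpy as np

# Monte Carlo baseline for a satisfaction index score (illustrative only).
rng = np.random.default_rng(42)
n = 100_000

quality = rng.normal(80, 5, n)          # perceived quality score
expectations = rng.normal(75, 7, n)     # customer expectations score
value = rng.normal(78, 6, n)            # perceived value score

# Hypothetical linear index model with noise; the weights are assumptions.
index = 0.5 * quality + 0.2 * expectations + 0.3 * value + rng.normal(0, 2, n)

print(f"predicted index: mean {index.mean():.1f}, "
      f"90% interval [{np.percentile(index, 5):.1f}, "
      f"{np.percentile(index, 95):.1f}]")
```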
Nott, Melissa T; Chapparo, Christine
2008-09-01
Agitation following traumatic brain injury is characterised by a heightened state of activity with disorganised information processing that interferes with learning and achieving functional goals. This study aimed to identify information processing problems during task performance of a severely agitated adult using the Perceive, Recall, Plan and Perform (PRPP) System of Task Analysis. Second, this study aimed to examine the sensitivity of the PRPP System to changes in task performance over a short period of rehabilitation, and third, to evaluate the guidance provided by the PRPP in directing intervention. A case study research design was employed. The PRPP System of Task Analysis was used to assess changes in task embedded information processing capacity during occupational therapy intervention with a severely agitated adult in a rehabilitation context. Performance is assessed on three selected tasks over a one-month period. Information processing difficulties during task performance can be clearly identified when observing a severely agitated adult following a traumatic brain injury. Processing skills involving attention, sensory processing and planning were most affected at this stage of rehabilitation. These processing difficulties are linked to established descriptions of agitated behaviour. Fluctuations in performance across three tasks of differing processing complexity were evident, leading to hypothesised relationships between task complexity, environment and novelty with information processing errors. Changes in specific information processing capacity over time were evident based on repeated measures using the PRPP System of Task Analysis. This lends preliminary support for its utility as an outcome measure, and raises hypotheses about the type of therapy required to enhance information processing in people with severe agitation. The PRPP System is sensitive to information processing changes in severely agitated adults when used to reassess performance over short intervals and can provide direct guidance to occupational therapy intervention to improve task embedded information processing by categorising errors under four stages of an information processing model: Perceive, Recall, Plan and Perform.
Preliminary Thermal-Mechanical Sizing of Metallic TPS: Process Development and Sensitivity Studies
NASA Technical Reports Server (NTRS)
Poteet, Carl C.; Abu-Khajeel, Hasan; Hsu, Su-Yuen
2002-01-01
The purpose of this research was to perform sensitivity studies and develop a process to perform thermal and structural analysis and sizing of the latest Metallic Thermal Protection System (TPS) developed at NASA LaRC (Langley Research Center). Metallic TPS is a key technology for reducing the cost of reusable launch vehicles (RLV), offering the combination of increased durability and competitive weights when compared to other systems. Accurate sizing of metallic TPS requires combined thermal and structural analysis. Initial sensitivity studies were conducted using transient one-dimensional finite element thermal analysis to determine the influence of various TPS and analysis parameters on TPS weight. The thermal analysis model was then used in combination with static deflection and failure mode analysis of the sandwich panel outer surface of the TPS to obtain minimum weight TPS configurations at three vehicle stations on the windward centerline of a representative RLV. The coupled nature of the analysis requires an iterative analysis process, which will be described herein. Findings from the sensitivity analysis are reported, along with TPS designs at the three RLV vehicle stations considered.
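To illustrate the thermal half of the sizing loop, here is a minimal explicit finite-difference stand-in for the one-dimensional transient analysis; the geometry, material properties, and boundary temperatures are assumed placeholder values, not the LaRC TPS data.

```python
import numpy as np

# Explicit 1D transient conduction through an insulation layer, a
# simplified stand-in for the finite element thermal sizing above.
L, n = 0.05, 51                 # 5 cm thickness, grid points (assumed)
dx = L / (n - 1)
alpha = 5e-7                    # thermal diffusivity, m^2/s (placeholder)
dt = 0.4 * dx**2 / alpha        # stable explicit time step

T = np.full(n, 300.0)           # initial temperature, K
T_hot = 1200.0                  # assumed re-entry surface temperature, K

t, t_end = 0.0, 600.0           # simulate 10 minutes of heating
while t < t_end:
    T[0] = T_hot                                            # heated surface
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2*T[1:-1] + T[:-2])
    T[-1] = T[-2]                                           # adiabatic back face
    t += dt

print(f"back-face temperature after {t_end:.0f} s: {T[-1]:.1f} K")
```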
Xu, Jia-Min; Wang, Ce-Qun; Lin, Long-Nian
2014-06-25
Multi-channel in vivo recording techniques are used to record ensemble neuronal activity and local field potentials (LFPs) simultaneously. One of the key points of the technique is how to process these two sets of recorded neural signals properly so that data accuracy can be assured. We introduce data processing approaches for action potentials and LFPs based on the original data collected through a multi-channel recording system. Action potentials are high-frequency signals, hence a high sampling rate of 40 kHz is normally chosen for recording. Based on the waveforms of extracellularly recorded action potentials, tetrode technology combined with principal component analysis can be used to discriminate spiking signals from neurons at different spatial locations, in order to obtain accurate single-neuron spiking activity. LFPs are low-frequency signals (below 300 Hz), hence a sampling rate of 1 kHz is used. Digital filtering is required for LFP analysis to isolate different frequency oscillations, including theta oscillation (4-12 Hz), which is dominant during active exploration and rapid-eye-movement (REM) sleep; gamma oscillation (30-80 Hz), which accompanies theta oscillation during cognitive processing; and high-frequency ripple oscillation (100-250 Hz), which occurs during awake immobility and slow-wave sleep (SWS) in the rodent hippocampus. For the obtained signals, common data post-processing methods include inter-spike interval analysis, spike auto-correlation analysis, spike cross-correlation analysis, power spectral density analysis, and spectrogram analysis.
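The band-isolation step can be sketched in a few lines; the filter design below (zero-phase Butterworth via scipy) is an assumed choice for illustration, not necessarily the authors' filter.

```python
import numpy as np
from scipy import signal

# Band-isolating filters for the LFP bands named above:
# theta 4-12 Hz, gamma 30-80 Hz, ripple 100-250 Hz.
fs = 1000.0  # LFP sampling rate, Hz

def bandpass(data, low, high, fs, order=4):
    sos = signal.butter(order, [low, high], btype="bandpass", fs=fs,
                        output="sos")
    return signal.sosfiltfilt(sos, data)   # zero-phase filtering

# Synthetic LFP: theta + gamma components plus noise.
t = np.arange(0, 10, 1 / fs)
lfp = (np.sin(2*np.pi*8*t) + 0.3*np.sin(2*np.pi*50*t)
       + 0.2*np.random.randn(t.size))

theta  = bandpass(lfp, 4, 12, fs)
gamma  = bandpass(lfp, 30, 80, fs)
ripple = bandpass(lfp, 100, 250, fs)
print("RMS  theta: %.3f  gamma: %.3f  ripple: %.3f"
      % (np.sqrt(np.mean(theta**2)), np.sqrt(np.mean(gamma**2)),
         np.sqrt(np.mean(ripple**2))))
```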
The initial design of LAPAN's IR micro bolometer using mission analysis process
NASA Astrophysics Data System (ADS)
Bustanul, A.; Irwan, P.; M. T., Andi; Firman, B.
2016-11-01
As a new player in the infrared (IR) sector, an uncooled, small, and lightweight IR Micro Bolometer has been chosen as one of the payloads for LAPAN's next micro satellite project. Driven by the desire to create our own IR Micro Bolometer, a mission analysis design procedure has been applied. After tracing all possible missions, Planck's and Wien's laws for black bodies, Temperature Responsivity (TR), and sub-pixel response were utilized to determine the appropriate spectral radiance. The 3.8-4 μm band was selected to detect wildfires (forest fires) and active volcanoes, two major problems faced by Indonesia. To strengthen and broaden the results, an iterative process was used throughout. The analysis then continued with calculation of the ground pixel size, pixel IFOV, swath width, and focal length; the resulting ground resolution is at least 400 m. The further procedure covered integrated optical design, combining optical design software (Zemax) with mechanical analysis software for structural and thermal analysis, such as Nastran and Thermal Desktop/SINDA-Fluint. The integration process was intended to produce a high-performance optical system for our IR Micro Bolometer that can be used under extreme environments. The results of all these analyses, in both graphs and measurements, show that the initial design of LAPAN's IR Micro Bolometer meets the determined requirements, although it needs further evaluation (iteration). This paper describes the initial design of LAPAN's IR Micro Bolometer using a mission analysis process.
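The radiometric starting point is Planck's law; the short sketch below evaluates it over the selected band, with an assumed 500 K hot-spot temperature chosen only for illustration.

```python
import numpy as np

# Planck spectral radiance over the 3.8-4 um band (constants are physical;
# the 500 K fire temperature is an assumed example, not a mission value).
h = 6.626e-34   # Planck constant, J s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

def planck_radiance(wavelength_m, T):
    """Blackbody spectral radiance, W m^-2 sr^-1 m^-1."""
    a = 2.0 * h * c**2 / wavelength_m**5
    return a / (np.exp(h * c / (wavelength_m * k * T)) - 1.0)

wl = np.linspace(3.8e-6, 4.0e-6, 50)
L_fire = planck_radiance(wl, 500.0)    # assumed hot-spot temperature
L_bg   = planck_radiance(wl, 300.0)    # ambient background
print(f"band-mean radiance ratio fire/background: "
      f"{L_fire.mean() / L_bg.mean():.1f}")
# Wien's law check: peak wavelength at 500 K is 2.898e-3 / 500 ~ 5.8 um.
```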
Advanced signal processing analysis of laser-induced breakdown spectroscopy data for the discrimination of obsidian sources
2012-02-09
Obsidian can be attributed to different sources [12,13], but the analytical techniques traditionally needed for such analysis (XRD, INAA, and ICP-MS) are time consuming and require expensive instrumentation. The reported approach applied partial least-squares discriminant analysis (PLSDA) using the SIMPLS solving method [33], with a leave-one-sample-out (LOSO) experiment design.
General RMP Guidance - Appendix D: OSHA Guidance on PSM
OSHA's Process Safety Management (PSM) Guidance on providing complete and accurate written information concerning process chemicals, process technology, and process equipment; including process hazard analysis and material safety data sheets.
CPAS Preflight Drop Test Analysis Process
NASA Technical Reports Server (NTRS)
Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.
2015-01-01
Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.
Conducting a narrative analysis.
Emden, C
1998-07-01
This paper describes the process of narrative analysis as undertaken within a nursing study on scholars and scholarship. It follows an earlier paper, titled 'Theoretical perspectives on narrative inquiry', that described the influencing ideas of Bruner (1987) and Roof (1994) upon the same study. Analysis procedures are described here in sufficient detail for other researchers wishing to implement a similar approach to do so. The process as described has two main components: (A) strategies of 'core story creation' and 'emplotment'; and (B) issues and dilemmas of narrative analysis, especially relating to rigour. The ideas of Polkinghorne (1988), Mishler (1986), and Labov (in Mishler 1986a) are introduced insofar as they impinge upon the analysis process. These relate especially to the development of key terms and to the analysis strategies of core story creation and emplotment. Outcomes of the study in question are termed 'Signposting the lived-world of scholarship'.
The detection and analysis of point processes in biological signals
NASA Technical Reports Server (NTRS)
Anderson, D. J.; Correia, M. J.
1977-01-01
A pragmatic approach to the detection and analysis of discrete events in biomedical signals is taken. Examples from both clinical and basic research are provided. Introductory sections discuss not only discrete events which are easily extracted from recordings by conventional threshold detectors but also events embedded in other information carrying signals. The primary considerations are factors governing event-time resolution and the effects limits to this resolution have on the subsequent analysis of the underlying process. The analysis portion describes tests for qualifying the records as stationary point processes and procedures for providing meaningful information about the biological signals under investigation. All of these procedures are designed to be implemented on laboratory computers of modest computational capacity.
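A minimal sketch of the basic workflow, threshold detection followed by inter-event-interval statistics, is given below on synthetic data; the threshold rule and all numbers are illustrative assumptions.

```python
import numpy as np

# Threshold-based event detection and inter-event-interval analysis.
fs = 10_000.0                        # samples per second (assumed)
rng = np.random.default_rng(1)
x = rng.normal(0, 1, int(fs * 5))    # 5 s of noisy baseline
spike_idx = np.sort(rng.choice(x.size, 40, replace=False))
x[spike_idx] += 8.0                  # embed discrete events

threshold = 5.0 * x.std()            # conventional threshold detector
crossings = np.flatnonzero((x[1:] >= threshold) & (x[:-1] < threshold))
event_times_s = crossings / fs

isi = np.diff(event_times_s)         # inter-event intervals
print(f"{event_times_s.size} events; mean ISI {isi.mean()*1e3:.1f} ms, "
      f"CV {isi.std()/isi.mean():.2f}")
```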
Processing and Analysis of Mars Pathfinder Science Data at JPL's Science Data Processing Section
NASA Technical Reports Server (NTRS)
LaVoie, S.; Green, W.; Runkle, A.; Alexander, D.; Andres, P.; DeJong, E.; Duxbury, E.; Freda, D.; Gorjian, Z.; Hall, J.;
1998-01-01
The Mars Pathfinder mission required new capabilities and adaptation of existing capabilities in order to support science analysis and flight operations requirements imposed by the in-situ nature of the mission.
Sustainability Analysis for Products and Processes
Sustainability Analysis for Products and Processes. Subhas K. Sikdar, National Risk Management Research Laboratory, United States Environmental Protection Agency, 26 W. M.L. King Dr., Cincinnati, OH 45237, Sikdar.subhas@epa.gov. ABSTRACT: Claims of both sustainable and unsu...
ERIC Educational Resources Information Center
Burns, Daniel J.; Martens, Nicholas J.; Bertoni, Alicia A.; Sweeney, Emily J.; Lividini, Michelle D.
2006-01-01
In a repeated testing paradigm, list items receiving item-specific processing are more likely to be recovered across successive tests (item gains), whereas items receiving relational processing are likely to be forgotten progressively less on successive tests. Moreover, analysis of cumulative-recall curves has shown that item-specific processing…
ERIC Educational Resources Information Center
Bohrn, Isabel C.; Altmann, Ulrike; Jacobs, Arthur M.
2012-01-01
A quantitative, coordinate-based meta-analysis combined data from 354 participants across 22 fMRI studies and one positron emission tomography (PET) study to identify the differences in neural correlates of figurative and literal language processing, and to investigate the role of the right hemisphere (RH) in figurative language processing.…
MSEE: Stochastic Cognitive Linguistic Behavior Models for Semantic Sensing
2013-09-01
Developments include, for activity recognition, a Gaussian Process Dynamic Model with Social Network Analysis (GPDM-SNA) for small human group action recognition, and an extended GPDM-SNA for small human group activity modeling based on the Gaussian Process Dynamical Model and Social Network Analysis (SN-GPDM).
PSP, TSP, XP, CMMI...Eating the Alphabet Soup!
2011-05-19
Recoverable slide content covers continuous process improvement and the CMMI maturity levels with their focus and process areas, including Organizational Performance Management, Causal Analysis and Resolution, Requirements Management, process standardization, Risk Management, Decision Analysis and Resolution, Product Integration, and basic project management.
A cost analysis: processing maple syrup products
Neil K. Huyler; Lawrence D. Garrett
1979-01-01
A cost analysis of processing maple sap to syrup for three fuel types, oil-, wood-, and LP gas-fired evaporators, indicates that: (1) fuel, capital, and labor are the major cost components of processing sap to syrup; (2) wood-fired evaporators show a slight cost advantage over oil- and LP gas-fired evaporators; however, as the cost of wood approaches $50 per cord, wood...
2011-12-01
The study examined systems engineering technical and technical management processes; Technical Planning, Stakeholder Requirements Definition, and Architecture Design were among those considered. A purposive sampling of AFRL rapid development program managers and engineers did not emphasize one process over another; however, Architecture Design and Implementation scored higher among the Technical Processes, as did Decision Analysis and Technical Planning among the technical management processes.
The instrumental genesis process in future primary teachers using Dynamic Geometry Software
NASA Astrophysics Data System (ADS)
Ruiz-López, Natalia
2018-05-01
This paper, which describes a study undertaken with pairs of future primary teachers using GeoGebra software to solve geometry problems, includes a brief literature review, the theoretical framework and methodology used. An analysis of the instrumental genesis process for a pair participating in the case study is also provided. This analysis addresses the techniques and types of dragging used, the obstacles to learning encountered, a description of the interaction between the pair and their interaction with the teacher, and the type of language used. Based on this analysis, possibilities and limitations of the instrumental genesis process are identified for the development of geometric competencies such as conjecture creation, property checking and problem researching. It is also suggested that the methodology used in the analysis of the problem solving process may be useful for those teachers and researchers who want to integrate Dynamic Geometry Software (DGS) in their classrooms.
Closed Loop Requirements and Analysis Management
NASA Technical Reports Server (NTRS)
Lamoreaux, Michael; Verhoef, Brett
2015-01-01
Effective systems engineering involves the use of analysis in the derivation of requirements and the verification of designs against those requirements. The initial development of requirements often depends on analysis for the technical definition of specific aspects of a product. Following the allocation of system-level requirements to a product's components, the closure of those requirements often involves analytical approaches to verify that the requirement criteria have been satisfied. Meanwhile, changes that occur between these two processes need to be managed in order to achieve a closed-loop requirement derivation/verification process. Herein are presented concepts for employing emerging Teamcenter capabilities to jointly manage requirements and analysis data such that analytical techniques are utilized to effectively derive and allocate requirements, analyses are consulted and updated during change evaluation, and analyses are leveraged during design verification. Recommendations on concept validation case studies are also discussed.
Development of economic consequence methodology for process risk analysis.
Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed
2015-04-01
A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
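The loss-function idea can be made concrete with a quadratic (Taguchi-style) form; the target, limits, and cost constant below are invented for illustration and are not the paper's revised formulation.

```python
import numpy as np

# Quadratic (Taguchi-style) loss mapping a process deviation to an
# economic loss.  All numbers are illustrative assumptions.
def taguchi_loss(y, target, cost_at_limit, limit_halfwidth):
    """L(y) = k (y - target)^2, with k set so L = cost_at_limit at the limit."""
    k = cost_at_limit / limit_halfwidth**2
    return k * (y - target) ** 2

# Example: reactor temperature, target 350 C, $50,000 loss at +/- 15 C.
temps = np.array([348.0, 352.0, 360.0, 365.0])
print(np.round(taguchi_loss(temps, 350.0, 50_000.0, 15.0), 0))
```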
NASA Astrophysics Data System (ADS)
Pochampally, Kishore K.; Gupta, Surendra M.; Cullinane, Thomas P.
2004-02-01
The cost-benefit analysis of data associated with re-processing of used products often involves the uncertainty feature of cash-flow modeling. The data is not objective because of uncertainties in supply, quality and disassembly times of used products. Hence, decision-makers must rely on "fuzzy" data for analysis. The same parties that are involved in the forward supply chain often carry out the collection and re-processing of used products. It is therefore important that the cost-benefit analysis takes the data of both new products and used products into account. In this paper, a fuzzy cost-benefit function is proposed that is used to perform a multi-criteria economic analysis to select the most economical products to process in a closed-loop supply chain. Application of the function is detailed through an illustrative example.
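The arithmetic underlying such a fuzzy analysis can be sketched with triangular fuzzy numbers; the class below and the per-product figures are illustrative assumptions, not the paper's cost-benefit function.

```python
from dataclasses import dataclass

@dataclass
class TFN:
    """Triangular fuzzy number (low, mode, high)."""
    a: float
    b: float
    c: float
    def __add__(self, o):
        return TFN(self.a + o.a, self.b + o.b, self.c + o.c)
    def __sub__(self, o):
        # Fuzzy subtraction widens the spread: (a1-c2, b1-b2, c1-a2).
        return TFN(self.a - o.c, self.b - o.b, self.c - o.a)
    def defuzzify(self):
        return (self.a + self.b + self.c) / 3.0   # centroid of the triangle

# Hypothetical per-product figures ($/unit): uncertain benefit and cost.
benefit = TFN(8.0, 10.0, 13.0)   # resale value of recovered components
cost = TFN(4.0, 5.0, 7.0)        # collection + disassembly + re-processing

net = benefit - cost
print(f"fuzzy net benefit: ({net.a}, {net.b}, {net.c}); "
      f"crisp estimate = {net.defuzzify():.2f}")
```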
NASA Astrophysics Data System (ADS)
Peer, Regina; Peer, Siegfried; Sander, Heike; Marsolek, Ingo; Koller, Wolfgang; Pappert, Dirk; Hierholzer, Johannes
2002-05-01
If new technology is introduced into medical practice, it must prove to make a difference. However, traditional approaches to outcome analysis have failed to show a direct benefit of PACS on patient care, and economic benefits are still in debate. A participatory process analysis was performed to compare workflow in a film-based hospital and a PACS environment. This included direct observation of work processes, interviews of involved staff, structural analysis, and discussion of observations with staff members. After definition of common structures, strong and weak workflow steps were evaluated. With a common workflow structure in both hospitals, benefits of PACS were revealed in workflow steps related to image reporting, with simultaneous image access for ICU physicians and radiologists, archiving of images, and image and report distribution. However, PACS alone is not able to cover the complete process of 'radiography for intensive care', from ordering of an image until provision of the final product (image + report). Interference of electronic workflow with analogue process steps, such as paper-based ordering, reduces the potential benefits of PACS. In this regard, workflow modeling proved to be very helpful for the evaluation of complex work processes linking radiology and the ICU.
Determinants of job stress in chemical process industry: A factor analysis approach.
Menon, Balagopal G; Praveensal, C J; Madhu, G
2015-01-01
Job stress is one of the active research domains in industrial safety research. Job stress can result in accidents and health-related issues for workers in chemical process industries, so it is important to measure the level of job stress in workers in order to mitigate it and avoid safety-related problems. The objective of this study is to determine the job stress factors in the chemical process industry in Kerala state, India. The study also aims to propose a comprehensive model and an instrument framework for measuring job stress levels in the chemical process industries in Kerala, India. Data were collected through a questionnaire survey conducted in chemical process industries in Kerala. The data from 1197 completed surveys were subjected to principal component and confirmatory factor analysis to develop the job stress factor structure. The factor analysis revealed eight factors that influence job stress in process industries. It was also found that job stress in employees is influenced most by role ambiguity and least by the work environment. The study developed an instrument framework for measuring job stress using exploratory factor analysis and structural equation modeling.
Ibarra-Taquez, Harold N; GilPavas, Edison; Blatchley, Ernest R; Gómez-García, Miguel-Ángel; Dobrosz-Gómez, Izabela
2017-09-15
Soluble coffee production generates wastewater containing complex mixtures of organic macromolecules. In this work, a sequential Electrocoagulation-Electrooxidation (EC-EO) process, using aluminum and graphite electrodes, was proposed as an alternative way for the treatment of soluble coffee effluent. Process operational parameters were optimized, achieving total decolorization, as well as 74% and 63.5% of COD and TOC removal, respectively. The integrated EC-EO process yielded a highly oxidized (AOS = 1.629) and biodegradable (BOD5/COD ≈ 0.6) effluent. The Molecular Weight Distribution (MWD) analysis showed that during the EC-EO process, EC effectively decomposed contaminants with molecular weight in the range of 10-30 kDa. In contrast, EO was quite efficient in mineralization of contaminants with molecular weight higher than 30 kDa. A kinetic analysis allowed determination of the time required to meet Colombian permissible discharge limits. Finally, a comprehensive operational cost analysis was performed. The integrated EC-EO process was demonstrated as an efficient alternative for the treatment of industrial effluents resulting from soluble coffee production. Copyright © 2017 Elsevier Ltd. All rights reserved.
He, Qing; Hao, Yinping; Liu, Hui; Liu, Wenyi
2018-01-01
Super-critical carbon dioxide energy storage (SC-CCES) is a new type of gas energy-storage technology. This paper used the orthogonal experimental design method and variance analysis to assess the significance of the factors affecting the thermodynamic characteristics of the SC-CCES system, and identified the significant factors and interactions in the energy-storage process, the energy-release process, and the whole energy-storage system. The results show that interactions between components have little influence on the energy-storage process, the energy-release process, and the system as a whole; the significant factors relate mainly to the characteristics of the individual system components. These findings provide a reference for optimizing the thermal properties of the energy-storage system.
Artificial intelligence applied to process signal analysis
NASA Technical Reports Server (NTRS)
Corsberg, Dan
1988-01-01
Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. By applying artificial intelligence techniques and a knowledge-based approach to this problem, the computer can be used to filter and analyze plant sensor data, providing operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and the proper system response monitored.
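As a toy illustration of the knowledge-based alarm filtering the abstract describes, the sketch below suppresses consequence alarms that are explained by an active root-cause alarm. The alarm names and suppression rules are hypothetical, not drawn from the space station work.

```python
# Each rule maps a root-cause alarm to the consequence alarms it explains,
# so operators see the causal alarm rather than the whole cluster.
SUPPRESSION_RULES = {
    "COOLANT_PUMP_TRIP": {"COOLANT_FLOW_LOW", "CORE_TEMP_HIGH"},
    "POWER_BUS_FAULT": {"COOLANT_PUMP_TRIP", "FAN_STOPPED"},
}

def filter_alarms(active):
    """Return the reduced alarm set after removing alarms that are
    explained by another alarm which is itself active."""
    explained = set()
    for alarm in active:
        explained |= SUPPRESSION_RULES.get(alarm, set())
    return [a for a in active if a not in explained]

print(filter_alarms(["POWER_BUS_FAULT", "COOLANT_PUMP_TRIP",
                     "COOLANT_FLOW_LOW", "CORE_TEMP_HIGH"]))
# -> ['POWER_BUS_FAULT']
```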
Cost analysis of advanced turbine blade manufacturing processes
NASA Technical Reports Server (NTRS)
Barth, C. F.; Blake, D. E.; Stelson, T. S.
1977-01-01
A rigorous analysis was conducted to estimate relative manufacturing costs for high technology gas turbine blades prepared by three candidate materials process systems. The manufacturing costs for the same turbine blade configuration of directionally solidified eutectic alloy, an oxide dispersion strengthened superalloy, and a fiber reinforced superalloy were compared on a relative basis to the costs of the same blade currently in production utilizing the directional solidification process. An analytical process cost model was developed to quantitatively perform the cost comparisons. The impact of individual process yield factors on costs was also assessed as well as effects of process parameters, raw materials, labor rates and consumable items.
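A minimal sketch of one common form of such a process cost model, in which scrapped parts inflate the cost carried by surviving parts at each yielded step. The step costs and yields below are invented for illustration and are not the report's data.

```python
def cost_per_good_blade(steps):
    """steps: list of (step_cost, step_yield) pairs in process order.
    The cost carried by each surviving part grows as scrapped parts
    absorb the cost already invested in them."""
    cost = 0.0
    for step_cost, step_yield in steps:
        cost = (cost + step_cost) / step_yield
    return cost

# Hypothetical step costs ($/part) and yields for two casting routes.
ds_eutectic = [(40.0, 0.90), (120.0, 0.70), (30.0, 0.95), (25.0, 0.98)]
baseline_ds = [(35.0, 0.95), (90.0, 0.85), (30.0, 0.95), (25.0, 0.98)]

rel = cost_per_good_blade(ds_eutectic) / cost_per_good_blade(baseline_ds)
print(f"relative cost vs. baseline: {rel:.2f}")
```

This also shows why individual step yields have the outsized cost impact the report assesses: a low yield late in the sequence multiplies all upstream costs.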
A framework of knowledge creation processes in participatory simulation of hospital work systems.
Andersen, Simone Nyholm; Broberg, Ole
2017-04-01
Participatory simulation (PS) is a method to involve workers in simulating and designing their own future work system. Existing PS studies have focused on analysing the outcome, and minimal attention has been devoted to the process of creating this outcome. In order to study this process, we suggest applying a knowledge creation perspective. The aim of this study was to develop a framework describing the process of how ergonomics knowledge is created in PS. Video recordings from three projects applying PS of hospital work systems constituted the foundation of process mining analysis. The analysis resulted in a framework revealing the sources of ergonomics knowledge creation as sequential relationships between the activities of simulation participants sharing work experiences; experimenting with scenarios; and reflecting on ergonomics consequences. We argue that this framework reveals the hidden steps of PS that are essential when planning and facilitating PS that aims at designing work systems. Practitioner Summary: When facilitating participatory simulation (PS) in work system design, achieving an understanding of the PS process is essential. By applying a knowledge creation perspective and process mining, we investigated the knowledge-creating activities constituting the PS process. The analysis resulted in a framework of the knowledge-creating process in PS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, J.; Moon, T.J.; Howell, J.R.
This paper presents an analysis of the heat transfer occurring during an in-situ curing process in which infrared energy is applied to the surface of a polymer composite during winding. The material system is Hercules prepreg AS4/3501-6. Thermoset composites undergo an exothermic chemical reaction during the curing process. An Eulerian thermochemical model is developed for the heat transfer analysis of helical winding. The model incorporates heat generation due to the chemical reaction. Several assumptions are made, leading to a two-dimensional thermochemical model. For simplicity, 360° heating around the mandrel is considered. In order to generate the appropriate process windows, the developed heat transfer model is combined with a simple winding-time model. The process windows allow a proper selection of process variables, such as infrared energy input and winding velocity, to give a desired end-product state. Steady-state temperatures are found for each combination of the process variables. A regression analysis is carried out to relate the process variables to the resulting steady-state temperatures. Using the regression equations, process windows for a wide range of cylinder diameters are found. A general procedure to find process windows for Hercules AS4/3501-6 prepreg tape is coded in a FORTRAN program.
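A rough sketch of the regression-then-process-window idea, assuming a simple linear fit of steady-state temperature to infrared power and winding velocity. The run data and the 175-195 °C target band are invented placeholders, not the paper's values.

```python
import numpy as np

# Hypothetical (infrared power [W], winding velocity [m/min], steady-state temp [C]) runs.
runs = np.array([
    [400, 1.0, 168], [400, 2.0, 141], [600, 1.0, 204],
    [600, 2.0, 172], [800, 1.0, 239], [800, 2.0, 205],
])
P, v, T = runs[:, 0], runs[:, 1], runs[:, 2]

# Fit T = b0 + b1*P + b2*v by least squares (analogous to the paper's regression step).
A = np.column_stack([np.ones_like(P), P, v])
b, *_ = np.linalg.lstsq(A, T, rcond=None)

# Process window: variable combinations predicted to land in a desired cure band.
for power in range(400, 801, 100):
    for vel in (1.0, 1.5, 2.0):
        t_pred = b[0] + b[1] * power + b[2] * vel
        if 175 <= t_pred <= 195:
            print(f"P={power} W, v={vel} m/min -> {t_pred:.0f} C (in window)")
```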
A Framework for Business Process Change Requirements Analysis
NASA Astrophysics Data System (ADS)
Grover, Varun; Otim, Samuel
The ability to quickly and continually adapt business processes to accommodate evolving requirements and opportunities is critical for success in competitive environments. Without appropriate linkage between redesign decisions and strategic inputs, identifying processes that need to be modified will be difficult. In this paper, we draw attention to the analysis of business process change requirements in support of process change initiatives. Business process redesign is a multifaceted phenomenon involving processes, organizational structure, management systems, human resource architecture, and many other aspects of organizational life. To be successful, the business process initiative should focus not only on identifying the processes to be redesigned, but also pay attention to various enablers of change. Above all, a framework is just a blueprint; management must lead change. We hope our modest contribution will draw attention to the broader framing of requirements for business process change.
Data near processing support for climate data analysis
NASA Astrophysics Data System (ADS)
Kindermann, Stephan; Ehbrecht, Carsten; Hempelmann, Nils
2016-04-01
Climate data repositories grow exponentially in size. Scalable data-near processing capabilities are required to meet future data analysis requirements and to replace current "download and process at home" workflows and approaches. On the one hand, these processing capabilities should be accessible via standardized interfaces (e.g., OGC WPS); on the other hand, a large variety of processing tools, toolboxes, and deployment alternatives have to be supported and maintained at the data/processing center. We present a community approach of a modular and flexible system supporting the development, deployment, and maintenance of OGC-WPS-based web processing services. This approach is organized in an open-source GitHub project (called "bird-house") supporting individual processing services ("birds", e.g., climate index calculations, model data ensemble calculations), which rely on basic common infrastructural components (e.g., installation and deployment recipes, management of analysis code dependencies). To support easy deployment at data centers as well as at home institutes (e.g., for testing and development), the system supports the management of the often very complex package dependency chains of climate data analysis packages, as well as docker-based packaging and installation. We present a concrete deployment scenario at the German Climate Computing Center (DKRZ). DKRZ hosts, on the one hand, a multi-petabyte climate archive that is integrated, e.g., into the European ENES and worldwide ESGF data infrastructures, and on the other hand an HPC center supporting (model) data production and data analysis. The deployment scenario also includes OpenStack-based data cloud services to support data import and data distribution for bird-house-based WPS web processing services. Current challenges for inter-institutional deployments of web processing services supporting the European and international climate modeling community, as well as the climate impact community, are highlighted. Aspects supporting future WPS-based cross-community usage scenarios, such as data reuse and data provenance, are also discussed.
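For readers unfamiliar with OGC WPS, here is a minimal client-side sketch using the OWSLib package. The endpoint URL, the process identifier ("ice_days"), and the input dataset are hypothetical stand-ins for whatever a particular birdhouse deployment actually exposes, and input handling may differ by service.

```python
from owslib.wps import WebProcessingService

# Hypothetical birdhouse WPS endpoint; real deployments publish their own URLs.
wps = WebProcessingService("https://birdhouse.example.org/wps")

# List the climate-analysis processes this "bird" offers.
for proc in wps.processes:
    print(proc.identifier, "-", proc.title)

# Execute a (hypothetical) climate-index process on a served NetCDF file.
execution = wps.execute(
    "ice_days",
    inputs=[("tasmax", "https://data.example.org/tasmax_day_historical.nc")],
)
print(execution.status)
```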
Foveal analysis and peripheral selection during active visual sampling
Ludwig, Casimir J. H.; Davies, J. Rhys; Eckstein, Miguel P.
2014-01-01
Human vision is an active process in which information is sampled during brief periods of stable fixation in between gaze shifts. Foveal analysis serves to identify the currently fixated object and has to be coordinated with a peripheral selection process of the next fixation location. Models of visual search and scene perception typically focus on the latter, without considering foveal processing requirements. We developed a dual-task noise classification technique that enables identification of the information uptake for foveal analysis and peripheral selection within a single fixation. Human observers had to use foveal vision to extract visual feature information (orientation) from different locations for a psychophysical comparison. The selection of to-be-fixated locations was guided by a different feature (luminance contrast). We inserted noise in both visual features and identified the uptake of information by looking at correlations between the noise at different points in time and behavior. Our data show that foveal analysis and peripheral selection proceeded completely in parallel. Peripheral processing stopped some time before the onset of an eye movement, but foveal analysis continued during this period. Variations in the difficulty of foveal processing did not influence the uptake of peripheral information and the efficacy of peripheral selection, suggesting that foveal analysis and peripheral selection operated independently. These results provide important theoretical constraints on how to model target selection in conjunction with foveal object identification: in parallel and independently. PMID:24385588
Parallel ICA and its hardware implementation in hyperspectral image analysis
NASA Astrophysics Data System (ADS)
Du, Hongtao; Qi, Hairong; Peterson, Gregory D.
2004-04-01
Advances in hyperspectral imaging have dramatically boosted remote sensing applications by providing abundant information using hundreds of contiguous spectral bands. However, the high volume of information also results in an excessive computation burden. Since most materials have specific characteristics only at certain bands, much of this information is redundant. This property of hyperspectral images has motivated many researchers to study various dimensionality reduction algorithms, including Projection Pursuit (PP), Principal Component Analysis (PCA), wavelet transform, and Independent Component Analysis (ICA), of which ICA is one of the most popular techniques. It searches for a linear or nonlinear transformation that minimizes the statistical dependence between spectral bands. Through this process, ICA can eliminate superfluous information while retaining practical information, given only the observations of hyperspectral images. One hurdle in applying ICA to hyperspectral image (HSI) analysis, however, is its long computation time, especially for high-volume hyperspectral data sets. Even the most efficient method, FastICA, is very time-consuming. In this paper, we present a parallel ICA (pICA) algorithm derived from FastICA. During the unmixing process, pICA divides the estimation of the weight matrix into sub-processes that can be conducted in parallel on multiple processors. The decorrelation process is decomposed into internal decorrelation and external decorrelation, which perform weight vector decorrelations within individual processors and between cooperating processors, respectively. In order to further improve the performance of pICA, we seek hardware solutions for the implementation of pICA. Until now, there have been very few hardware designs for ICA-related processes, due to the complicated and iterative computation. This paper discusses the capacity limitations of FPGA implementations of pICA in HSI analysis. An Application-Specific Integrated Circuit (ASIC) synthesis is designed for pICA-based dimensionality reduction in HSI analysis. The pICA design is implemented using standard-height cells and aimed at a TSMC 0.18 micron process. During the synthesis procedure, three ICA-related reconfigurable components are developed for reuse and retargeting purposes. Preliminary results show that the standard-height-cell-based ASIC synthesis provides an effective solution for pICA and ICA-related processes in HSI analysis.
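As a point of reference for the serial baseline, here is a minimal FastICA dimensionality-reduction sketch using scikit-learn on a synthetic stand-in for a hyperspectral cube. This is the off-the-shelf serial algorithm, not the paper's pICA or its hardware implementation; the cube dimensions and component count are arbitrary.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic stand-in for a hyperspectral cube: 100x100 pixels, 200 bands.
rng = np.random.default_rng(0)
cube = rng.random((100, 100, 200))
pixels = cube.reshape(-1, 200)           # one spectrum per pixel

# Reduce 200 correlated bands to 10 statistically independent components.
ica = FastICA(n_components=10, random_state=0, max_iter=500)
components = ica.fit_transform(pixels)   # shape: (10000, 10)
print(components.shape)
```

The weight-matrix estimation inside `fit_transform` is exactly the part that pICA splits across processors.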
Designing a ticket to ride with the Cognitive Work Analysis Design Toolkit.
Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Jenkins, Daniel P
2015-01-01
Cognitive work analysis has been applied in the design of numerous sociotechnical systems. The process used to translate analysis outputs into design concepts, however, is not always clear. Moreover, structured processes for translating the outputs of ergonomics methods into concrete designs are lacking. This paper introduces the Cognitive Work Analysis Design Toolkit (CWA-DT), a design approach which has been developed specifically to provide a structured means of incorporating cognitive work analysis outputs in design using design principles and values derived from sociotechnical systems theory. This paper outlines the CWA-DT and describes its application in a public transport ticketing design case study. Qualitative and quantitative evaluations of the process provide promising early evidence that the toolkit fulfils the evaluation criteria identified for its success, with opportunities for improvement also highlighted. The Cognitive Work Analysis Design Toolkit has been developed to provide ergonomics practitioners with a structured approach for translating the outputs of cognitive work analysis into design solutions. This paper demonstrates an application of the toolkit and provides evaluation findings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Johnson, Kenneth I.; Khaleel, Mohammad A.
2003-04-01
This paper employs an inverse approach (IA) formulation for the analysis of tubes under free hydroforming conditions. The IA formulation is derived from that of Guo et al. established for flat sheet hydroforming analysis using constant strain triangular membrane elements. At first, an incremental analysis of free hydroforming for a hot-dip galvanized (HG/Z140) DP600 tube is performed using the finite element Marc code. The deformed geometry obtained at the last converged increment is then used as the final configuration in the inverse analysis. This comparative study allows us to assess the predicting capability of the inverse analysis. The results willmore » be compared with the experimental values determined by Asnafi and Skogsgardh. After that, a procedure based on a forming limit diagram (FLD) is proposed to adjust the process parameters such as the axial feed and internal pressure. Finally, the adjustment process is illustrated through a re-analysis of the same tube using the inverse approach« less
SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool
Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda
2008-01-01
Background It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. The systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. Results This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady-state analysis, robustness analysis, and local and global sensitivity analysis of SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficients, Sobol's method, and the weighted average of local sensitivity analyses, as well as through its ability to handle systems with discontinuous events and its intuitive graphical user interface. Conclusion SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes. PMID:18706080
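To illustrate what a global sensitivity analysis of this kind computes, here is a small sketch of Sobol indices using the SALib package on a toy three-parameter model. SALib and the toy model are illustrative substitutes, not SBML-SAT's own implementation.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Toy problem: three rate-like parameters with identical bounds.
problem = {"num_vars": 3, "names": ["k1", "k2", "k3"], "bounds": [[0.1, 1.0]] * 3}
param_values = saltelli.sample(problem, 1024)

# Stand-in model output; in SBML-SAT this would be a simulated model observable.
Y = np.array([p[0] * p[1] + np.sin(p[2]) for p in param_values])

Si = sobol.analyze(problem, Y)
print(Si["S1"])   # first-order sensitivity indices
print(Si["ST"])   # total-order indices (include interaction effects)
```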
The Pan-STARRS PS1 Image Processing Pipeline
NASA Astrophysics Data System (ADS)
Magnier, E.
The Pan-STARRS PS1 Image Processing Pipeline (IPP) performs the image processing and data analysis tasks needed to enable the scientific use of the images obtained by the Pan-STARRS PS1 prototype telescope. The primary goals of the IPP are to process the science images from the Pan-STARRS telescopes and make the results available to other systems within Pan-STARRS. It is also responsible for combining all of the science images in a given filter into a single representation of the non-variable component of the night sky, defined as the "Static Sky". To achieve these goals, the IPP also performs other analysis functions to generate the calibrations needed in the science image processing, and occasionally uses the derived data to generate improved astrometric and photometric reference catalogs. It also provides the infrastructure needed to store the incoming data and the resulting data products. The IPP inherits lessons learned, and in some cases code and prototype code, from several other astronomy image analysis systems, including Imcat (Kaiser), the Sloan Digital Sky Survey (REF), the Elixir system (Magnier & Cuillandre), and Vista (Tonry). Imcat and Vista have a large number of robust image processing functions. SDSS has demonstrated a working analysis pipeline and large-scale database system for a dedicated project. The Elixir system has demonstrated an automatic image processing system and an object database system for operational usage. This talk will present an overview of the IPP architecture, functional flow, code development structure, and selected analysis algorithms. Also discussed is the highly parallel hardware configuration necessary to support PS1 operational requirements. Finally, results are presented of the processing of images collected during PS1 early commissioning tasks utilizing the Pan-STARRS Test Camera #3.
Varzakas, Theodoros H
2011-09-01
The Failure Mode and Effect Analysis (FMEA) model has been applied for the risk assessment of pastry processing. A tentative approach of FMEA application to the pastry industry was attempted in conjunction with ISO 22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (pastry processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes upon which the system depends. Critical Control Points have been identified and implemented in the cause-and-effect diagram (also known as the Ishikawa, tree, or fishbone diagram). In this work a comparison of ISO 22000 analysis with HACCP is carried out over pastry processing and packaging. However, the main emphasis was put on the quantification of risk assessment by determining the Risk Priority Number (RPN) per identified processing hazard. Storage of raw materials and storage of final products at -18°C, followed by freezing, were the processes identified as the ones with the highest RPN (225, 225, and 144, respectively) and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out, leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of Ishikawa (Cause-and-Effect or Tree diagram) analysis led to converging results, thus corroborating the validity of conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO 22000 system of a pastry processing industry is considered imperative.
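The RPN arithmetic itself is simple: RPN = severity x occurrence x detection, each scored on a 1-10 scale. A sketch follows, with hypothetical scores rather than the paper's actual ratings.

```python
def rpn(severity, occurrence, detection):
    """FMEA Risk Priority Number: each factor is scored on a 1-10 scale."""
    return severity * occurrence * detection

# Hypothetical severity/occurrence/detection scores for two processing hazards.
hazards = {
    "raw material storage": (9, 5, 5),   # RPN 225, above the acceptance limit of 130
    "baking": (6, 3, 4),                 # RPN 72, acceptable
}
for name, scores in hazards.items():
    value = rpn(*scores)
    flag = "-> corrective action" if value > 130 else "ok"
    print(f"{name}: RPN={value} {flag}")
```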
Unbiased plasma metabolomics reveal the correlation of metabolic pathways and Prakritis of humans.
Shirolkar, Amey; Chakraborty, Sutapa; Mandal, Tusharkanti; Dabur, Rajesh
2017-11-25
Ayurveda, an ancient Indian medicinal system, has categorized human body constitutions into three broad constitutional types (Prakritis), i.e., Vata, Pitta, and Kapha. The objective was to analyze plasma metabolites and related pathways in order to classify Prakriti-specific dominant marker metabolites and metabolic pathways. 38 healthy male individuals were assessed for dominant Prakritis and their fasting blood samples were collected. The processed plasma samples were subjected to rapid resolution liquid chromatography-electrospray ionization-quadrupole time-of-flight mass spectrometry (RRLC-ESI-QTOFMS). Mass profiles were aligned and subjected to multivariate analysis. A partial least squares discriminant analysis (PLS-DA) model showed 97.87% recognition capability. The list of PLS-DA metabolites was subjected to permutative Benjamini-Hochberg false discovery rate (FDR) correction, and a final list of 76 metabolites with p < 0.05 and fold-change > 2.0 was identified. Pathway analysis using the Metascape and JEPETTO plugins in Cytoscape revealed that steroidal hormone biosynthesis, amino acid metabolism, and arachidonic acid metabolism are the major pathways varying with constitution. Biological GO process analysis showed that aromatic amino acid, sphingolipid, and pyrimidine nucleotide metabolic processes were dominant in the Kapha type of body constitution. Fat-soluble vitamin, cellular amino acid, and androgen biosynthesis processes, along with branched-chain amino acid and glycerolipid catabolic processes, were dominant in Pitta-type individuals. Vata Prakriti was found to have dominant catecholamine, arachidonic acid, and hydrogen peroxide metabolic processes. Neurotransmission and oxidative stress in Vata; BCAA catabolic, androgen, and xenobiotic metabolic processes in Pitta; and aromatic amino acid, sphingolipid, and pyrimidine metabolic processes in Kapha Prakriti were the dominant marker pathways. Copyright © 2017 Transdisciplinary University, Bangalore and World Ayurveda Foundation. Published by Elsevier B.V. All rights reserved.
2005-06-01
cognitive task analysis, organizational information dissemination and interaction, systems engineering, collaboration and communications processes, decision-making processes, and data collection and organization. By blending these diverse disciplines, command centers can be designed to support decision-making, cognitive analysis, information technology, and the human factors engineering aspects of Command and Control (C2). This model can then be used as a baseline when dealing with work in areas of business processes, workflow engineering, information management,
Development of Low-cost, High Energy-per-unit-area Solar Cell Modules
NASA Technical Reports Server (NTRS)
Jones, G. T.; Chitre, S.; Rhee, S. S.
1978-01-01
The development of two hexagonal solar cell process sequences, a laser-scribing technique for scribing hexagonal and modified hexagonal solar cells, a high-throughput diffusion process, and two surface macrostructure processes suitable for large-scale production is reported. Experimental analyses were made of automated spin-on anti-reflective coating equipment and high-pressure wafer cleaning equipment. Six hexagonal solar cell modules were fabricated. Also covered is a detailed theoretical analysis of optimum silicon utilization by modified hexagonal solar cells.
Honda, Masayuki; Matsumoto, Takehiro
2017-01-01
Several kinds of event-log data produced in daily clinical activities have yet to be used for the secure and efficient improvement of hospital activities. Data warehouse systems in hospital information systems, used for the analysis of structured data such as diseases, lab tests, and medications, have shown efficient outcomes. This article focuses on two kinds of essential functions: process mining using log data, and non-structured data analysis via Natural Language Processing.
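As a sketch of the process-mining function mentioned, here is a minimal directly-follows-graph discovery using the pm4py library on an invented three-activity clinical log. The library choice and the toy log are assumptions, not the article's system.

```python
import pandas as pd
import pm4py

# Minimal stand-in for a clinical event log (order -> exam -> report).
df = pd.DataFrame({
    "case_id": ["p1", "p1", "p1", "p2", "p2"],
    "activity": ["order", "exam", "report", "order", "exam"],
    "timestamp": pd.to_datetime([
        "2017-01-01 08:00", "2017-01-01 09:00", "2017-01-01 12:00",
        "2017-01-02 08:30", "2017-01-02 10:00",
    ]),
})
log = pm4py.format_dataframe(df, case_id="case_id",
                             activity_key="activity", timestamp_key="timestamp")

# Discover the directly-follows graph underlying the daily workflow.
dfg, start_activities, end_activities = pm4py.discover_dfg(log)
print(dfg)
```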
Guidebook for solar process-heat applications
NASA Astrophysics Data System (ADS)
Fazzolare, R.; Mignon, G.; Campoy, L.; Luttmann, F.
1981-01-01
The potential for solar process heat in Arizona and some of the general technical aspects of solar energy, such as insolation, siting, and process analysis, are explored. Major aspects of a solar plant design are presented. Collectors, storage, and heat exchange are discussed. Relating hardware costs to annual dollar benefits is also discussed. Rate of return, cash flow, and payback are discussed as they relate to solar systems. Design analysis procedures are presented. Design cost optimization techniques using a yearly computer simulation of a solar process operation are demonstrated.
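A minimal sketch of the payback and discounted cash-flow arithmetic such a guidebook walks through; the installed cost, annual savings, discount rate, and horizon below are invented example numbers.

```python
def simple_payback(installed_cost, annual_savings):
    """Years of savings needed to recover the installed cost (undiscounted)."""
    return installed_cost / annual_savings

def net_present_value(installed_cost, annual_savings, discount_rate, years):
    """NPV of constant annual fuel savings against the up-front cost."""
    pv = sum(annual_savings / (1 + discount_rate) ** t for t in range(1, years + 1))
    return pv - installed_cost

# Hypothetical solar process-heat plant: $250k installed, $30k/yr fuel savings.
print(f"payback: {simple_payback(250_000, 30_000):.1f} years")
print(f"NPV over 20 yr at 8%: ${net_present_value(250_000, 30_000, 0.08, 20):,.0f}")
```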
NASA Astrophysics Data System (ADS)
Bianchetti, Raechel Anne
Remotely sensed images have become a ubiquitous part of our daily lives. From novice users aiding in search and rescue missions using tools such as TomNod, to trained analysts synthesizing disparate data to address complex problems like climate change, imagery has become central to geospatial problem solving. Expert image analysts are continually faced with rapidly developing sensor technologies and software systems. In response to these cognitively demanding environments, expert analysts develop specialized knowledge and analytic skills to address increasingly complex problems. This study identifies the knowledge, skills, and analytic goals of expert image analysts tasked with the identification of land cover and land use change. The analysts participating in this research are currently working as part of a national-level analysis of land use change and are well versed in the use of TimeSync, forest science, and image analysis. The results of this study benefit current analysts by improving their awareness of the mental processes they use during image interpretation. The study can also be generalized to understand the types of knowledge and visual cues that analysts use when reasoning with imagery for purposes beyond land use change studies. Here, a Cognitive Task Analysis framework is used to organize evidence from qualitative knowledge elicitation methods for characterizing the cognitive aspects of the TimeSync image analysis process. Using a combination of content analysis, diagramming, semi-structured interviews, and observation, the study highlights the perceptual and cognitive elements of expert remote sensing interpretation. The results show that image analysts perform several standard cognitive processes but flexibly employ these processes in response to various contextual cues. Expert image analysts' ability to think flexibly during analysis was directly related to their amount of image analysis experience. Additionally, the results show that the basic Image Interpretation Elements continue to be important despite technological augmentation of the interpretation process. These results are used to derive a set of design guidelines for developing geovisual analytic tools and training to support image analysis.
Funding for the 2ND IAEA technical meeting on fusion data processing, validation and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greenwald, Martin
The International Atomic Energy Agency (IAEA) will organize the second Technical Meeting on Fusion Data Processing, Validation and Analysis from 30 May to 02 June, 2017, in Cambridge, MA, USA. The meeting will be hosted by the MIT Plasma Science and Fusion Center (PSFC). The objective of the meeting is to provide a platform where a set of topics relevant to fusion data processing, validation and analysis are discussed with a view toward the extrapolation needs of next-step fusion devices such as ITER. The validation and analysis of experimental data obtained from diagnostics used to characterize fusion plasmas are crucial for a knowledge-based understanding of the physical processes governing the dynamics of these plasmas. The meeting will aim at fostering, in particular, discussions of research and development results that set out or underline trends observed in the current major fusion confinement devices. General information on the IAEA, including its mission and organization, can be found at the IAEA website. Topics include: uncertainty quantification (UQ); model selection, validation, and verification (V&V); probability theory and statistical analysis; inverse problems and equilibrium reconstruction; integrated data analysis; real-time data analysis; machine learning; signal/image processing and pattern recognition; experimental design and synthetic diagnostics; and data management.
NASA Astrophysics Data System (ADS)
Naik, Deepak kumar; Maity, K. P.
2018-03-01
Plasma arc cutting (PAC) is a high-temperature thermal cutting process employed for cutting extensively high-strength materials that are difficult to cut by any other manufacturing process. The process uses a highly energized plasma arc to cut any conducting material with better dimensional accuracy in less time. This research work presents the effect of process parameters on the dimensional accuracy of the PAC process. The input process parameters selected were arc voltage, standoff distance, and cutting speed. A rectangular plate of 304L stainless steel of 10 mm thickness was taken as the workpiece; stainless steel is a very extensively used material in manufacturing industries. Linear dimensions were measured following Taguchi's L16 orthogonal array design approach. Three levels were selected for each process parameter. In all experiments, a clockwise cut direction was followed. The measurement results were then analyzed: analysis of variance (ANOVA) and analysis of means (ANOM) were performed to evaluate the effect of each process parameter. The ANOVA reveals the effect of each input process parameter on the linear dimension along the X axis, and the results give the optimal settings of the process parameter values for that dimension. The investigation clearly shows that a specific range of the input process parameters achieves improved machinability.
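To show the shape of the ANOM/ANOVA step, here is a small sketch on an invented two-factor slice of such a campaign; the factor levels and measured errors are placeholders, not the paper's L16 data.

```python
import numpy as np
from scipy.stats import f_oneway

# Hypothetical runs: factor levels and measured X-dimension error (mm).
voltage  = np.array([120, 120, 140, 140, 120, 120, 140, 140])
standoff = np.array([2.0, 3.0, 2.0, 3.0, 2.0, 3.0, 2.0, 3.0])
error    = np.array([0.42, 0.55, 0.30, 0.38, 0.44, 0.52, 0.28, 0.40])

# ANOM: mean response at each factor level.
for name, factor in [("voltage", voltage), ("standoff", standoff)]:
    for level in np.unique(factor):
        print(f"{name}={level}: mean error {error[factor == level].mean():.3f}")

# One-way ANOVA per factor: does the level significantly shift the error?
for name, factor in [("voltage", voltage), ("standoff", standoff)]:
    groups = [error[factor == lv] for lv in np.unique(factor)]
    print(name, f_oneway(*groups))
```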
Best Manufacturing Practices Survey Conducted at Litton Data Systems Division, Van Nuys, California
1988-10-01
Hardware and Software. Design Release: Engineering Change Order Processing and Analysis. ...structured using bridges to isolate local traffic. Long-term plans call for a wide-band network.
Karabatsos, George
2017-02-01
Most applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that makes minimal assumptions and whose assumptions are not violated by the data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone, menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed with the MATLAB Compiler. Currently, this package gives the user a choice of 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis) and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, the Pitman-Yor process (including the normalized stable process), the beta (two-parameter) process, the normalized inverse-Gaussian process, geometric weights priors, the dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and of the model's predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options, including output of MCMC convergence analyses and estimates of the model's posterior predictive distribution for selected functionals and values of covariates. The software is illustrated through a BNP regression analysis of real data.
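As a glimpse of the machinery behind the BNP mixture priors listed, here is a sketch of truncated stick-breaking weights for a Dirichlet process prior; it illustrates the prior construction only and is not code from the package.

```python
import numpy as np

def stick_breaking(alpha, n_atoms, rng):
    """Draw truncated Dirichlet-process mixture weights by stick-breaking:
    w_k = v_k * prod_{j<k} (1 - v_j), with v_k ~ Beta(1, alpha)."""
    v = rng.beta(1.0, alpha, size=n_atoms)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v[:-1])])
    return v * remaining

rng = np.random.default_rng(1)
w = stick_breaking(alpha=2.0, n_atoms=20, rng=rng)
print(w.round(3), w.sum())   # weights decay; the sum approaches 1 as truncation grows
```

Smaller concentration values of alpha put most weight on a few mixture components; larger values spread the weight, which is why such priors adapt the effective number of clusters to the data.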
Planar Inlet Design and Analysis Process (PINDAP)
NASA Technical Reports Server (NTRS)
Slater, John W.; Gruber, Christopher R.
2005-01-01
The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element in PINDAP is a Fortran 90 code named PINDAP that can establish the parametric design of the inlet and efficiently model the geometry and generate the grid for CFD analysis with design changes to those parameters. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.
Process Improvement Through Tool Integration in Aero-Mechanical Design
NASA Technical Reports Server (NTRS)
Briggs, Clark
2010-01-01
Emerging capabilities in commercial design tools promise to significantly improve the multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems of a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies and structural analysis. Elements of analysis data management, data visualization and collaboration are also included.
2009-12-18
cannot be detected with univariate techniques, but require multivariate analysis instead (Kamitani and Tong [2005]). Two other time series analysis ...learning for time series analysis. The historical record of DBNs can be traced back to Dean and Kanazawa [1988] and Dean and Wellman [1991], with... Keywords: Hidden Process Models, probabilistic time series modeling, functional Magnetic Resonance Imaging
Organization Domain Modeling. Volume 1. Conceptual Foundations, Process and Workproduct Description
1993-07-31
J.A. Hess, W.E. Novak, and A.S. Peterson. Feature-Oriented Domain Analysis (FODA) Feasibility Study. Technical Report CMU/SEI-90-TR-21, Software... domain analysis (DA) and modeling, including a structured set of workproducts, a tailorable process model and a set of modeling techniques and guidelines... Usability Analysis (Rescoping)
Formal Foundations for the Specification of Software Architecture.
1995-03-01
Architectures Formally: A Case-Study Using KWIC." Kestrel Institute, Palo Alto, CA 94304, April 1994. 58. Kang, Kyo C. Feature-Oriented Domain Analysis (FODA ... Constraint-Based Architectures ... Summary ... Analysis of Process-Based... between these architecture theories were investigated. A feasibility analysis on an image processing application demonstrated that architecture theories
Following the Part I paper that described an application of the U.S. EPA Models-3/Community Multiscale Air Quality (CMAQ) modeling system to the 1999 Southern Oxidants Study episode, this paper presents results from process analysis (PA) using the PA tool embedded in CMAQ and s...
Teaching Tip: Using Activity Diagrams to Model Systems Analysis Techniques: Teaching What We Preach
ERIC Educational Resources Information Center
Lending, Diane; May, Jeffrey
2013-01-01
Activity diagrams are used in Systems Analysis and Design classes as a visual tool to model the business processes of "as-is" and "to-be" systems. This paper presents the idea of using these same activity diagrams in the classroom to model the actual processes (practices and techniques) of Systems Analysis and Design. This tip…
Analysis of Flue Gas Desulfurization (FGD) Processes for Potential Use on Army Coal-Fired Boilers
1980-09-01
From scenarios to domain models: processes and representations
NASA Astrophysics Data System (ADS)
Haddock, Gail; Harbison, Karan
1994-03-01
The domain specific software architectures (DSSA) community has defined a philosophy for the development of complex systems. This philosophy improves productivity and efficiency by increasing the user's role in the definition of requirements, increasing the systems engineer's role in the reuse of components, and decreasing the software engineer's role to the development of new components and component modifications only. The scenario-based engineering process (SEP), the first instantiation of the DSSA philosophy, has been adopted by the next generation controller project. It is also the chosen methodology of the trauma care information management system project, and the surrogate semi-autonomous vehicle project. SEP uses scenarios from the user to create domain models and define the system's requirements. Domain knowledge is obtained from a variety of sources including experts, documents, and videos. This knowledge is analyzed using three techniques: scenario analysis, task analysis, and object-oriented analysis. Scenario analysis results in formal representations of selected scenarios. Task analysis of the scenario representations results in descriptions of tasks necessary for object-oriented analysis and also subtasks necessary for functional system analysis. Object-oriented analysis of task descriptions produces domain models and system requirements. This paper examines the representations that support the DSSA philosophy, including reference requirements, reference architectures, and domain models. The processes used to create and use the representations are explained through use of the scenario-based engineering process. Selected examples are taken from the next generation controller project.
NASA Technical Reports Server (NTRS)
Amundsen, R. M.; Feldhaus, W. S.; Little, A. D.; Mitchum, M. V.
1995-01-01
Electronic integration of design and analysis processes was achieved and refined at Langley Research Center (LaRC) during the development of an optical bench for a laser-based aerospace experiment. Mechanical design has been integrated with thermal, structural, and optical analyses. Electronic import of the model geometry eliminates the repetitive steps of geometry input needed to develop each analysis model, leading to faster and more accurate analyses. Guidelines for integrated model development are given. This integrated analysis process has been built around software that was already in use by designers and analysts at LaRC. The process as currently implemented uses Pro/Engineer for design, Pro/Manufacturing for fabrication, PATRAN for solid modeling, NASTRAN for structural analysis, SINDA-85 and P/Thermal for thermal analysis, and Code V for optical analysis. Currently, the only analysis model that must be built manually is the Code V model; all others can be imported from the Pro/E geometry. A translator from PATRAN results to Code V optical analysis (PATCOD) was developed and tested at LaRC. Directions for use of the translator and of the other models are given.
Discovery of Information Diffusion Process in Social Networks
NASA Astrophysics Data System (ADS)
Kim, Kwanho; Jung, Jae-Yoon; Park, Jonghun
Information diffusion analysis in social networks is significant because it enables us to deeply understand dynamic social interactions among users. In this paper, we introduce approaches to discovering the information diffusion process in social networks based on process mining. Process mining techniques are applied from three perspectives: social network analysis, process discovery, and community recognition. We then present experimental results using real-life social network data. The proposed techniques are expected to serve as new analytical tools for online social networks, such as blogs and wikis, for company marketers, politicians, news reporters, and online writers.
Structural Optimization in automotive design
NASA Technical Reports Server (NTRS)
Bennett, J. A.; Botkin, M. E.
1984-01-01
Although mathematical structural optimization has been an active research area for twenty years, there has been relatively little penetration into the design process. Experience indicates that often this is due to the traditional layout-analysis design process. In many cases, optimization efforts have been outgrowths of analysis groups which are themselves appendages to the traditional design process. As a result, optimization is often introduced into the design process too late to have a significant effect because many potential design variables have already been fixed. A series of examples are given to indicate how structural optimization has been effectively integrated into the design process.
Laboratory plant study on the melting process of asbestos waste
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sakai, Shinichi; Terazono, Atsushi; Takatsuki, Hiroshi
The melting process was studied as a method of changing asbestos into non-hazardous waste and recovering it as a reusable resource. In an initial effort, the thermal behaviors of asbestos waste in terms of physical and chemical structure were studied. Then, 10 kg/h-scale laboratory plant experiments were carried out. X-ray diffraction analysis of the thermal behaviors of sprayed-on asbestos waste revealed that chrysotile asbestos waste changes in crystal structure at around 800°C and becomes melted slag, composed mainly of magnesium silicate, at around 1,500°C. Laboratory plant experiments on the melting process of sprayed-on asbestos have shown that melted slag can be obtained. X-ray diffraction analysis of the melted slag revealed the crystal structure change, and SEM analysis showed the slag to have a non-fibrous form. Moreover, TEM analysis demonstrated the very high treatment efficiency of the process, that is, reduction of the asbestos content to one-millionth (1/10^6) on a weight basis. These analytical results indicate the effectiveness of the melting process for asbestos waste treatment.
NASA Astrophysics Data System (ADS)
Kurniati, D. R.; Rohman, I.
2018-05-01
This study aims to analyze the concepts and science process skills in a bomb calorimeter experiment as a basis for developing a virtual bomb calorimeter laboratory. The study employed the research and development (R&D) method to address the proposed problems; this paper discusses the concept and process skills analysis. The essential concepts and process skills associated with the bomb calorimeter were analyzed by optimizing the bomb calorimeter experiment. The concept analysis identified seven fundamental concepts to be addressed in developing the virtual laboratory: internal energy, heat of combustion, complete combustion, incomplete combustion, the calorimeter constant, the bomb calorimeter, and Black's principle. Since the concepts of the bomb calorimeter and of complete and incomplete combustion are meant to represent the real experimental situation and contain controllable variables, they are displayed in the virtual laboratory as simulations; the remaining four concepts are presented as animations because they involve no variables to be controlled. The process skills analysis identified four notable skills to be developed: observation, experimental design, interpretation, and communication.
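The underlying calorimetry arithmetic that such a virtual laboratory simulates can be sketched in a few lines; the calibration heat, temperature rises, and sample mass below are invented example values.

```python
def calorimeter_constant(q_standard, delta_t):
    """Calibrate: C = q / dT using a standard (e.g., benzoic acid) burn."""
    return q_standard / delta_t

def heat_of_combustion(c_cal, delta_t, sample_mass):
    """Heat released per gram of sample in the bomb (constant volume)."""
    return c_cal * delta_t / sample_mass

# Hypothetical run: 26.43 kJ released by the standard raises the bath by 2.79 K.
C = calorimeter_constant(26.43, 2.79)          # kJ/K
print(f"C = {C:.2f} kJ/K")
# Unknown sample: 1.05 g raises the bath by 3.12 K.
print(f"heat of combustion ~ {heat_of_combustion(C, 3.12, 1.05):.1f} kJ/g")
```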
Nonlinear breakup of liquid sheets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jazayeri, S.A.; Li, X.
1997-07-01
Sprays formed from the disintegration of liquid sheets have extensive practical applications, ranging from chemical and pharmaceutical processes to power generation and propulsion systems. A knowledge of the liquid sheet breakup process is essential to understanding the fundamental mechanisms of liquid atomization and spray formation. The breakup of liquid sheets has been investigated in terms of hydrodynamic stability via linear analysis by Squire, Hagerty and Shea, Li, and others. The nonlinear effect has been studied by Clark and Dombrowski up to the second order, and by Rangel and Sirignano through numerical simulation employing a vortex discretization method. As shown by Taub for the breakup of circular liquid jets, the closer to the breakup region, the higher the order of nonlinear analysis has to be for an adequate description of the breakup behavior. As pointed out by Bogy, a nonlinear analysis up to the third order is generally sufficient to account for the inherent nonlinear nature of the breakup process. Therefore, a third-order nonlinear analysis has been carried out in this study to investigate the process of liquid sheet disruption preceding spray formation.
Signal Processing in Periodically Forced Gradient Frequency Neural Networks
Kim, Ji Chul; Large, Edward W.
2015-01-01
Oscillatory instability at the Hopf bifurcation is a dynamical phenomenon that has been suggested to characterize active non-linear processes observed in the auditory system. Networks of oscillators poised near Hopf bifurcation points and tuned to tonotopically distributed frequencies have been used as models of auditory processing at various levels, but systematic investigation of the dynamical properties of such oscillatory networks is still lacking. Here we provide a dynamical systems analysis of a canonical model for gradient frequency neural networks driven by a periodic signal. We use linear stability analysis to identify various driven behaviors of canonical oscillators for all possible ranges of model and forcing parameters. The analysis shows that canonical oscillators exhibit qualitatively different sets of driven states and transitions for different regimes of model parameters. We classify the parameter regimes into four main categories based on their distinct signal processing capabilities. This analysis will lead to deeper understanding of the diverse behaviors of neural systems under periodic forcing and can inform the design of oscillatory network models of auditory signal processing. PMID:26733858
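A minimal numerical sketch of a canonical oscillator under periodic forcing, dz/dt = z(alpha + i*omega + beta*|z|^2) + F*exp(i*omega0*t), poised at the Hopf bifurcation point. The parameter values are illustrative, and forward-Euler integration is used for brevity; this is a simplified reading of the model class, not the paper's full canonical model.

```python
import numpy as np

# Canonical oscillator near a Hopf bifurcation, with periodic forcing.
alpha, omega, beta = 0.0, 2 * np.pi * 1.0, -1.0   # poised at the bifurcation point
F, omega0 = 0.2, 2 * np.pi * 1.1                  # forcing slightly off-frequency

dt, n = 1e-3, 60_000
z = 0.01 + 0.0j
amps = np.empty(n)
for k in range(n):
    t = k * dt
    dz = z * (alpha + 1j * omega + beta * abs(z) ** 2) + F * np.exp(1j * omega0 * t)
    z += dt * dz
    amps[k] = abs(z)

print(f"steady-state amplitude ~ {amps[-5000:].mean():.3f}")
```

Sweeping F and the frequency detuning in such a simulation traces out the driven states (entrained vs. beating) whose boundaries the paper derives analytically.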
Wang, Monan; Zhang, Kai; Yang, Ning
2018-04-09
To help doctors decide on treatment from the standpoint of mechanical analysis, this work built a computer-assisted optimization system for the treatment of femoral neck fracture, oriented to clinical application. The system encompasses three parts: a preprocessing module, a finite element mechanical analysis module, and a post-processing module. The preprocessing module includes parametric modeling of the bone, parametric modeling of the fracture surface, parametric modeling of the fixation screws and their positions, and the input and transmission of model parameters. The finite element mechanical analysis module includes mesh generation, element type setting, material property setting, contact setting, constraint and load setting, analysis method setting, and batch processing. The post-processing module includes extraction and display of the batch processing results, image generation from the batch processing runs, execution of the optimization program, and display of the optimal result. The system implements the whole operation, from the input of fracture parameters to the output of the optimal fixation plan, according to the specific patient's real fracture parameters and the optimization rules, which demonstrates the effectiveness of the system. The system also has a friendly interface and simple operation, and its functionality can be improved quickly by modifying individual modules.
NASA Astrophysics Data System (ADS)
Huang, J. C.; Wright, W. V.
1982-04-01
The Defense Waste Processing Facility (DWPF) for immobilizing nuclear high level waste (HLW) is scheduled to be built. High level waste is produced when reactor components are subjected to chemical separation operations. Two candidates for immobilizing this HLW are borosilicate glass and crystalline ceramic, either being contained in weld sealed stainless steel canisters. A number of technical analyses are being conducted to support a selection between these two waste forms. The risks associated with the manufacture and interim storage of these two forms in the DWPF are compared. Process information used in the risk analysis was taken primarily from a DWPF processibility analysis. The DWPF environmental analysis provided much of the necessary environmental information.
New technologies for solar energy silicon - Cost analysis of BCL process
NASA Technical Reports Server (NTRS)
Yaws, C. L.; Li, K.-Y.; Fang, C. S.; Lutwack, R.; Hsu, G.; Leven, H.
1980-01-01
New technologies for producing polysilicon are being developed to provide lower cost material for solar cells which convert sunlight into electricity. This article presents results for the BCL Process, which produces the solar-cell silicon by reduction of silicon tetrachloride with zinc vapor. Cost, sensitivity, and profitability analysis results are presented based on a preliminary process design of a plant to produce 1000 metric tons/year of silicon by the BCL Process. Profitability analysis indicates a sales price of $12.1-19.4 per kg of silicon (1980 dollars) at a 0-25 per cent DCF rate of return on investment after taxes. These results indicate good potential for meeting the goal of providing lower cost material for silicon solar cells.
An Evidential Reasoning-Based CREAM to Human Reliability Analysis in Maritime Accident Process.
Wu, Bing; Yan, Xinping; Wang, Yang; Soares, C Guedes
2017-10-01
This article proposes a modified cognitive reliability and error analysis method (CREAM) for estimating the human error probability in the maritime accident process on the basis of an evidential reasoning approach. This modified CREAM is developed to precisely quantify the linguistic variables of the common performance conditions and to overcome the problem of ignoring the uncertainty caused by incomplete information in the existing CREAM models. Moreover, this article views maritime accident development from the sequential perspective, where a scenario- and barrier-based framework is proposed to describe the maritime accident process. This evidential reasoning-based CREAM approach together with the proposed accident development framework are applied to human reliability analysis of a ship capsizing accident. It will facilitate subjective human reliability analysis in different engineering systems where uncertainty exists in practice. © 2017 Society for Risk Analysis.
Arvanitoyannis, Ioannis S; Varzakas, Theodoros H
2008-05-01
The Failure Mode and Effect Analysis (FMEA) model was applied for risk assessment of salmon manufacturing. A tentative approach of FMEA application to the salmon industry was attempted in conjunction with ISO 22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (a salmon processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes upon which the system depends. Critical control points were identified and implemented in the cause-and-effect diagram (also known as the Ishikawa, tree, or fishbone diagram). In this work, a comparison of ISO 22000 analysis with HACCP is carried out over salmon processing and packaging. However, the main emphasis was put on the quantification of risk assessment by determining the RPN per identified processing hazard. Fish receiving, casing/marking, blood removal, evisceration, fillet-making, cooling/freezing, and distribution were identified as the processes with the highest RPN (252, 240, 210, 210, 210, 210, and 200, respectively), and corrective actions were undertaken. After the application of corrective actions, a second calculation of RPN values was carried out, resulting in substantially lower values (below the upper acceptable limit of 130). It is noteworthy that the application of the Ishikawa (cause-and-effect or tree) diagram led to converging results, thus corroborating the validity of conclusions derived from the risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO 22000 system of a salmon processing industry is anticipated to prove advantageous to industrialists, state food inspectors, and consumers.
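The RPN bookkeeping underlying such a study is simple arithmetic: each failure mode is scored for severity (S), occurrence (O) and detection (D), typically on 1-10 scales, and RPN = S x O x D is compared against the acceptability limit. The sketch below reproduces that calculation; the limit of 130 comes from the abstract, while the individual S/O/D scores are invented so that the rows land on the reported RPNs.

```python
# (process step, severity, occurrence, detection) -- scores are invented examples
failure_modes = [
    ("fish receiving", 7, 6, 6),   # RPN 252
    ("casing/marking", 8, 6, 5),   # RPN 240
    ("blood removal",  7, 6, 5),   # RPN 210
    ("evisceration",   7, 5, 6),   # RPN 210
]

RPN_LIMIT = 130  # upper acceptable limit used in the study

for step, s, o, d in failure_modes:
    rpn = s * o * d
    flag = "corrective action needed" if rpn > RPN_LIMIT else "acceptable"
    print(f"{step:15s} RPN = {s}x{o}x{d} = {rpn:3d} -> {flag}")
```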
Comparison of approaches for mobile document image analysis using server supported smartphones
NASA Astrophysics Data System (ADS)
Ozarslan, Suleyman; Eren, P. Erhan
2014-03-01
With the recent advances in mobile technologies, new capabilities are emerging, such as mobile document image analysis. However, mobile phones are still less powerful than servers, and they have some resource limitations. One approach to overcoming these limitations is performing the resource-intensive processes of the application on remote servers. In mobile document image analysis, the most resource-consuming process is Optical Character Recognition (OCR), which is used to extract text from images captured with mobile phones. In this study, our goal is to compare the in-phone and remote-server processing approaches for mobile document image analysis in order to explore their trade-offs. In the in-phone approach, all processes required for mobile document image analysis run on the mobile phone. In the remote-server approach, the core OCR process runs on the remote server and the other processes run on the mobile phone. Results of the experiments show that the remote-server approach is considerably faster than the in-phone approach in terms of OCR time, but adds extra delays such as network delay. Since compression and downscaling of images significantly reduce file sizes and extra delays, the remote-server approach overall outperforms the in-phone approach in terms of the selected speed and correct-recognition metrics, provided the gain in OCR time compensates for the extra delays. According to the results of the experiments, using the most preferable settings, the remote-server approach performs better than the in-phone approach in terms of speed and acceptable correct-recognition metrics.
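The trade-off measured here can be captured in a one-line latency model: the remote-server approach wins whenever the server's OCR speed-up outweighs the compression, upload and download delays. The sketch below encodes that comparison; all timings, file sizes and bandwidths are invented for illustration.

```python
def in_phone_time(ocr_phone_s):
    return ocr_phone_s

def remote_time(image_mb, uplink_mbps, ocr_server_s, compress_s=0.3,
                downlink_kb=50, downlink_mbps=4.0):
    upload_s = image_mb * 8 / uplink_mbps                 # image up to the server
    download_s = downlink_kb * 8 / 1000 / downlink_mbps   # recognized text back
    return compress_s + upload_s + ocr_server_s + download_s

# Example: a 0.8 MB compressed page image, invented timings
t_phone = in_phone_time(ocr_phone_s=9.0)
t_remote = remote_time(image_mb=0.8, uplink_mbps=2.0, ocr_server_s=1.2)
print(f"in-phone {t_phone:.1f}s vs remote {t_remote:.1f}s -> "
      f"{'remote' if t_remote < t_phone else 'in-phone'} wins")
```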
SensePath: Understanding the Sensemaking Process Through Analytic Provenance.
Nguyen, Phong H; Xu, Kai; Wheat, Ashley; Wong, B L William; Attfield, Simon; Fields, Bob
2016-01-01
Sensemaking is described as the process of comprehension, finding meaning and gaining insight from information, producing new knowledge and informing further action. Understanding the sensemaking process allows building effective visual analytics tools to make sense of large and complex datasets. Currently, gaining this understanding is often a manual and time-consuming undertaking: researchers collect observation data, transcribe screen capture videos and think-aloud recordings, identify recurring patterns, and eventually abstract the sensemaking process into a general model. In this paper, we propose a general approach to facilitate such a qualitative analysis process, and introduce a prototype, SensePath, to demonstrate the application of this approach with a focus on browser-based online sensemaking. The approach is based on a study of a number of qualitative research sessions, including observations of users performing sensemaking tasks and post hoc analyses to uncover their sensemaking processes. Based on the study results and a follow-up participatory design session with HCI researchers, we decided to focus on the transcription and coding stages of thematic analysis. SensePath automatically captures the user's sensemaking actions, i.e., analytic provenance, and provides multi-linked views to support their further analysis. A number of other requirements elicited from the design session are also implemented in SensePath, such as easy integration with the existing qualitative analysis workflow and non-intrusiveness for participants. The tool was used by an experienced HCI researcher to analyze two sensemaking sessions. The researcher found the tool intuitive, and it considerably reduced analysis time, allowing a better understanding of the sensemaking process.
Medical Image Analysis by Cognitive Information Systems - a Review.
Ogiela, Lidia; Takizawa, Makoto
2016-10-01
This publication presents a review of medical image analysis systems. The paradigms of cognitive information systems are presented through examples of medical image analysis systems, and the semantic processes involved are described as they apply to different types of medical images. Cognitive information systems are defined on the basis of methods for the semantic analysis and interpretation of information - here, medical images - applied to the cognitive meaning of the medical images contained in the analyzed data sets. Semantic analysis is proposed to analyze the meaning of the data, which is carried by the information, for example by medical images. Medical image analysis is presented and discussed as applied to various types of medical images showing selected human organs with different pathologies, analyzed using different classes of cognitive information systems. Cognitive information systems dedicated to medical image analysis are also defined for decision-support tasks, a process that is very important, for example, in diagnostic and therapeutic processes and in the selection of semantic aspects/features from the analyzed data sets. Those features allow new methods of analysis to be created.
1982-04-01
processes requiring systematic experimental analysis. Accordingly, group performance effectiveness studies were initiated to assess the effects on...the experiment. Active processes associated with joining the respective established groups, but the absence of baseline levels precludes such an...novitiate in comparison to such values observed during baseline days suggested an active process associated with the joining of the group and emphasized the
Process-based organization design and hospital efficiency.
Vera, Antonio; Kuntz, Ludwig
2007-01-01
The central idea of process-based organization design is that organizing a firm around core business processes leads to cost reductions and quality improvements. We investigated theoretically and empirically whether the implementation of a process-based organization design is advisable in hospitals. The data came from a database compiled by the Statistical Office of the German federal state of Rheinland-Pfalz and from a written questionnaire, which was sent to the chief executive officers (CEOs) of all 92 hospitals in this federal state. We used data envelopment analysis (DEA) to measure hospital efficiency, and factor analysis and regression analysis to test our hypothesis. Our principal finding is that a high degree of process-based organization has a moderate but significant positive effect on the efficiency of hospitals. The main implication is that hospitals should implement a process-based organization to improve their efficiency. However, to actually achieve positive effects on efficiency, it is of paramount importance to observe some implementation rules, in particular to mobilize physician participation and to create an adequate organizational culture.
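For readers unfamiliar with DEA, each hospital's efficiency score is the optimum of a small linear program: shrink the unit's inputs by a factor theta while a nonnegative combination of all units still matches its outputs. A minimal input-oriented CCR sketch with scipy follows; the four-hospital data set is invented, and the study's actual DEA specification may differ.

```python
import numpy as np
from scipy.optimize import linprog

# Invented data: rows = hospitals, inputs (beds, staff), outputs (cases treated)
X = np.array([[120.0, 300.0], [200.0, 500.0], [150.0, 350.0], [90.0, 260.0]])
Y = np.array([[5000.0], [7000.0], [6500.0], [3000.0]])
n = X.shape[0]

def ccr_efficiency(k):
    """Input-oriented CCR efficiency of DMU k. Decision vars: [theta, lambda_1..n]."""
    c = np.r_[1.0, np.zeros(n)]                    # minimize theta
    # inputs: sum_j lam_j * x_ij - theta * x_ik <= 0
    A_in = np.c_[-X[k].reshape(-1, 1), X.T]
    # outputs: -sum_j lam_j * y_rj <= -y_rk
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]
    A = np.vstack([A_in, A_out])
    b = np.r_[np.zeros(X.shape[1]), -Y[k]]
    res = linprog(c, A_ub=A, b_ub=b,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

for k in range(n):
    print(f"hospital {k}: efficiency = {ccr_efficiency(k):.3f}")
```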
Modeling and flow analysis of pure nylon polymer for injection molding process
NASA Astrophysics Data System (ADS)
Nuruzzaman, D. M.; Kusaseh, N.; Basri, S.; Oumer, A. N.; Hamedon, Z.
2016-02-01
In the production of complex plastic parts, injection molding is one of the most popular industrial processes. This paper addresses the modeling and analysis of the flow process of nylon (polyamide) polymer for the injection molding process. To determine the best molding conditions, a series of simulations is carried out using Autodesk Moldflow Insight software, and the processing parameters are adjusted. This commercial mold-filling software simulates the cavity filling pattern along with the temperature and pressure distributions in the mold cavity. In the modeling, different flow parameters during the flow of the plastic inside the mold cavity, such as fill time, pressure, temperature, shear rate, and warp at different locations in the cavity, are analyzed. Overall, Moldflow is able to perform a relatively sophisticated analysis of the flow process of pure nylon. The prediction of the filling of a mold cavity is thus very important, and it is useful before a nylon plastic part is manufactured.
Statistical Analysis of the First Passage Path Ensemble of Jump Processes
NASA Astrophysics Data System (ADS)
von Kleist, Max; Schütte, Christof; Zhang, Wei
2018-02-01
The transition mechanism of jump processes between two different subsets in state space reveals important dynamical information of the processes and therefore has attracted considerable attention in the past years. In this paper, we study the first passage path ensemble of both discrete-time and continuous-time jump processes on a finite state space. The main approach is to divide each first passage path into nonreactive and reactive segments and to study them separately. The analysis can be applied to jump processes which are non-ergodic, as well as continuous-time jump processes where the waiting time distributions are non-exponential. In the particular case that the jump processes are both Markovian and ergodic, our analysis elucidates the relations between the study of the first passage paths and the study of the transition paths in transition path theory. We provide algorithms to numerically compute statistics of the first passage path ensemble. The computational complexity of these algorithms scales with the complexity of solving a linear system, for which efficient methods are available. Several examples demonstrate the wide applicability of the derived results across research areas.
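As the abstract notes, these path statistics reduce to linear solves. In the simplest case, a discrete-time Markov chain, the mean first passage time m into a target set B satisfies m_i = 0 on B and m_i = 1 + sum_j P_ij m_j elsewhere, i.e. (I - P_AA) m_A = 1 on the complement A. The numpy sketch below implements exactly that; the 4-state chain is invented, and the paper's treatment of non-ergodic and non-exponential cases goes beyond this.

```python
import numpy as np

# Invented 4-state transition matrix (rows sum to 1)
P = np.array([[0.5, 0.3, 0.2, 0.0],
              [0.2, 0.5, 0.2, 0.1],
              [0.1, 0.2, 0.5, 0.2],
              [0.0, 0.1, 0.3, 0.6]])

def mean_first_passage(P, target):
    """Mean number of steps to reach `target` from every state."""
    n = P.shape[0]
    A = [i for i in range(n) if i not in target]     # non-target indices
    P_AA = P[np.ix_(A, A)]
    m_A = np.linalg.solve(np.eye(len(A)) - P_AA, np.ones(len(A)))
    m = np.zeros(n)
    m[A] = m_A
    return m

print(mean_first_passage(P, target={3}))
```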
Qualitative Analysis for Maintenance Process Assessment
NASA Technical Reports Server (NTRS)
Brand, Lionel; Kim, Yong-Mi; Melo, Walcelio; Seaman, Carolyn; Basili, Victor
1996-01-01
In order to improve software maintenance processes, we first need to be able to characterize and assess them. These tasks must be performed in depth and with objectivity since the problems are complex. One approach is to set up a measurement-based software process improvement program specifically aimed at maintenance. However, establishing a measurement program requires that one understands the problems to be addressed by the measurement program and is able to characterize the maintenance environment and processes in order to collect suitable and cost-effective data. Also, enacting such a program and getting usable data sets takes time. A short term substitute is therefore needed. We propose in this paper a characterization process aimed specifically at maintenance and based on a general qualitative analysis methodology. This process is rigorously defined in order to be repeatable and usable by people who are not acquainted with such analysis procedures. A basic feature of our approach is that actual implemented software changes are analyzed in order to understand the flaws in the maintenance process. Guidelines are provided and a case study is shown that demonstrates the usefulness of the approach.
Binnicker, M J; Jespersen, D J; Harring, J A; Rollins, L O; Bryant, S C; Beito, E M
2008-07-01
The diagnosis of Lyme borreliosis (LB) is commonly made by serologic testing with Western blot (WB) analysis serving as an important supplemental assay. Although specific, the interpretation of WBs for diagnosis of LB (i.e., Lyme WBs) is subjective, with considerable variability in results. In addition, the processing, reading, and interpretation of Lyme WBs are laborious and time-consuming procedures. With the need for rapid processing and more objective interpretation of Lyme WBs, we evaluated the performances of two automated interpretive systems, TrinBlot/BLOTrix (Trinity Biotech, Carlsbad, CA) and BeeBlot/ViraScan (Viramed Biotech AG, Munich, Germany), using 518 serum specimens submitted to our laboratory for Lyme WB analysis. The results of routine testing with visual interpretation were compared to those obtained by BLOTrix analysis of MarBlot immunoglobulin M (IgM) and IgG and by ViraScan analysis of ViraBlot and ViraStripe IgM and IgG assays. BLOTrix analysis demonstrated an agreement of 84.7% for IgM and 87.3% for IgG compared to visual reading and interpretation. ViraScan analysis of the ViraBlot assays demonstrated agreements of 85.7% for IgM and 94.2% for IgG, while ViraScan analysis of the ViraStripe IgM and IgG assays showed agreements of 87.1 and 93.1%, respectively. Testing by the automated systems yielded an average time savings of 64 min/run compared to processing, reading, and interpretation by our current procedure. Our findings demonstrated that automated processing and interpretive systems yield results comparable to those of visual interpretation, while reducing the subjectivity and time required for Lyme WB analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wahanani, Nursinta Adi, E-mail: sintaadi@batan.go.id; Natsir, Khairina; Hartini, Entin
Data processing software packages such as VSOP and MCNPX are scientifically proven and complete. The outputs of VSOP and MCNPX are huge and complex text files, so in the analysis process users need additional processing, for example in Microsoft Excel, to present informative results. This research developed a user interface for the output of VSOP and MCNPX. The VSOP program output is used to support neutronic analysis, and the MCNPX program output is used to support burn-up analysis. Software development followed an iterative development method, which allows revision and addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. Python is used as the programming language because it is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, and Amiga, among others. The values that support neutronic analysis are k-eff, burn-up, and the masses of Pu-239 and Pu-241. Burn-up analysis used the mass inventory values of the actinides (thorium, plutonium, neptunium and uranium). Values are visualized graphically to support the analysis.
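At its core, a post-processing layer like this is text extraction plus visualization. The Python sketch below shows the general pattern of pulling k-eff values out of a solver listing and plotting them against burn-up; the line format in SAMPLE is invented, since the actual VSOP and MCNPX output layouts are not described in the abstract.

```python
import re
import matplotlib.pyplot as plt

SAMPLE = """\
step 1  burnup   500.0 MWd/t  k-eff 1.10234
step 2  burnup  1500.0 MWd/t  k-eff 1.08711
step 3  burnup  3000.0 MWd/t  k-eff 1.06502
"""  # invented layout; real VSOP/MCNPX listings differ

pattern = re.compile(r"burnup\s+([\d.]+)\s+MWd/t\s+k-eff\s+([\d.]+)")
burnup, keff = zip(*((float(b), float(k))
                     for b, k in pattern.findall(SAMPLE)))

plt.plot(burnup, keff, marker="o")
plt.xlabel("burn-up (MWd/t)")
plt.ylabel("k-eff")
plt.title("k-eff vs burn-up (parsed from solver output)")
plt.show()
```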
Analytic Steering: Inserting Context into the Information Dialog
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bohn, Shawn J.; Calapristi, Augustin J.; Brown, Shyretha D.
2011-10-23
An analyst's intrinsic domain knowledge is a primary asset in almost any analysis task. Unstructured text analysis systems that apply unsupervised content analysis approaches can be more effective if they can leverage this domain knowledge in a manner that augments the information discovery process without obfuscating new or unexpected content. Current unsupervised approaches rely upon the prowess of the analyst to submit the right queries or to observe generalized document and term relationships from ranked or visual results. We propose a new approach that allows the user to control or steer the analytic view within the unsupervised space. This process is controlled through the data characterization process via user-supplied context in the form of a collection of key terms. We show that steering with an appropriate choice of key terms can provide better relevance to the analytic domain and still enable the analyst to uncover unexpected relationships. This paper discusses cases where various analytic steering approaches can provide enhanced analysis results and cases where analytic steering can have a negative impact on the analysis process.
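One simple way to realize this kind of steering in an unsupervised pipeline is to boost the weights of the analyst's key terms before dimensionality reduction, so the derived view organizes itself around the domain vocabulary. The scikit-learn sketch below is an illustrative analogue of that idea, not the data characterization algorithm the paper actually used; the corpus, steer terms and boost factor are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

docs = ["reactor coolant pump failure report",
        "quarterly budget and staffing report",
        "coolant leak traced to pump seal wear",
        "staffing plan for the next budget cycle"]
steer_terms = {"coolant", "pump"}      # analyst-supplied context
BOOST = 3.0                            # assumed steering strength

vec = TfidfVectorizer()
X = vec.fit_transform(docs).toarray()
for term in steer_terms & set(vec.vocabulary_):
    X[:, vec.vocabulary_[term]] *= BOOST   # steer the space toward key terms

coords = TruncatedSVD(n_components=2).fit_transform(X)
for doc, (x, y) in zip(docs, coords):
    print(f"({x:+.2f}, {y:+.2f})  {doc}")
```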
Hatz, F; Hardmeier, M; Bousleiman, H; Rüegg, S; Schindler, C; Fuhr, P
2015-02-01
To compare the reliability of a newly developed Matlab® toolbox for the fully automated pre- and post-processing of resting-state EEG (automated analysis, AA) with the reliability of an analysis involving visually controlled pre- and post-processing (VA), 34 healthy volunteers (age: median 38.2 (20-49), 82% female) had three consecutive 256-channel resting-state EEGs at one-year intervals. Results of the frequency analysis of AA and VA were compared with Pearson correlation coefficients, and reliability over time was assessed with intraclass correlation coefficients (ICC). The mean correlation coefficient between AA and VA was 0.94±0.07; the mean ICC was 0.83±0.05 for AA and 0.84±0.07 for VA. AA and VA yield very similar results for spectral EEG analysis and are equally reliable. AA is less time-consuming, completely standardized, and independent of raters and their training. Automated processing of EEG facilitates workflow in quantitative EEG analysis. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
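Both statistics used here are easy to reproduce: Pearson's r compares AA with VA on the same recordings, and an intraclass correlation captures stability across the three yearly sessions. The numpy sketch below computes both on simulated data, using the classical two-way ICC(2,1) formula; the abstract does not state which ICC variant the study used, so that choice is an assumption.

```python
import numpy as np

def pearson(a, b):
    return np.corrcoef(a, b)[0, 1]

def icc_2_1(X):
    """ICC(2,1), two-way random effects, absolute agreement.
    X: subjects x sessions matrix."""
    n, k = X.shape
    grand = X.mean()
    ms_r = k * ((X.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # rows
    ms_c = n * ((X.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # columns
    sse = ((X - X.mean(axis=1, keepdims=True)
              - X.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_e = sse / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

rng = np.random.default_rng(0)
true = rng.normal(10, 2, size=34)                     # 34 subjects
X = true[:, None] + rng.normal(0, 0.5, size=(34, 3))  # 3 yearly sessions
print(f"ICC(2,1) over sessions: {icc_2_1(X):.2f}")
print(f"Pearson r (session 1 vs 2): {pearson(X[:, 0], X[:, 1]):.2f}")
```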
NASA Astrophysics Data System (ADS)
Sethuramalingam, Prabhu; Vinayagam, Babu Kupusamy
2016-07-01
A carbon nanotube-mixed grinding wheel is used in the grinding process to analyze the surface characteristics of AISI D2 tool steel. To date, no work had been carried out using a carbon nanotube-based grinding wheel. Such a wheel has excellent thermal conductivity and good mechanical properties, which are used to improve the surface finish of the workpiece. In the present study, multi-response optimization of process parameters, namely surface roughness and metal removal rate, in the grinding process with single-wall carbon nanotube (CNT) mixed cutting fluids is undertaken using an orthogonal array with grey relational analysis. Experiments are performed under the grinding conditions designated by the L9 orthogonal array. Based on the results of the grey relational analysis, a set of optimum grinding parameters is obtained. The significant machining parameters are found using the analysis of variance approach. An empirical model for the prediction of the output parameters has been developed using regression analysis, and the results are compared empirically for grinding with and without the CNT grinding wheel.
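Grey relational analysis itself is a short computation: normalize each response (smaller-the-better for roughness, larger-the-better for removal rate), form grey relational coefficients with a distinguishing coefficient, and average them into a grade that ranks the runs. The numpy sketch below walks through those steps; the nine-run response table is invented and the distinguishing coefficient of 0.5 is the conventional choice, not necessarily the study's data.

```python
import numpy as np

# Invented responses for 9 runs: [surface roughness Ra (um), MRR (mm^3/min)]
Y = np.array([[0.82, 110], [0.75, 95], [0.90, 140],
              [0.70, 100], [0.95, 150], [0.65, 90],
              [0.88, 130], [0.72, 105], [0.78, 120]], float)

def normalize(col, larger_better):
    lo, hi = col.min(), col.max()
    return (col - lo) / (hi - lo) if larger_better else (hi - col) / (hi - lo)

Z = np.column_stack([normalize(Y[:, 0], False),   # Ra: smaller-the-better
                     normalize(Y[:, 1], True)])   # MRR: larger-the-better

zeta = 0.5                                        # distinguishing coefficient
delta = 1.0 - Z                                   # deviation from the ideal (=1)
gamma = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
grade = gamma.mean(axis=1)                        # grey relational grade

print("best run:", int(grade.argmax()) + 1, "grades:", grade.round(3))
```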
40 CFR 68.67 - Process hazard analysis.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... potential for catastrophic consequences. (3) Engineering and administrative controls applicable to the... engineering and administrative controls; (5) Stationary source siting; (6) Human factors; and (7) A...
40 CFR 68.67 - Process hazard analysis.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... potential for catastrophic consequences. (3) Engineering and administrative controls applicable to the... engineering and administrative controls; (5) Stationary source siting; (6) Human factors; and (7) A...
40 CFR 68.67 - Process hazard analysis.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... potential for catastrophic consequences. (3) Engineering and administrative controls applicable to the... engineering and administrative controls; (5) Stationary source siting; (6) Human factors; and (7) A...
Vehicle Infrastructure Integration (VII) data use analysis and processing : project summary report.
DOT National Transportation Integrated Search
2012-03-01
The purpose of the Data Use Analysis and Processing (DUAP) project is to support the Michigan Department of Transportation (MDOT) and its partners in evaluating uses and benefits of connected vehicle data in transportation agency management and...
Analysis of delamination related fracture processes in composites
NASA Technical Reports Server (NTRS)
Armanios, Erian A.
1988-01-01
Delamination related fracture processes in composite materials are discussed. Thermal and moisture influences on the free-edge delamination of laminated composites, fracture analysis of local delaminations in laminated composites, and strain energy release rates in belts are among the topics covered.
Abstracts of Research. July 1974-June 1975.
ERIC Educational Resources Information Center
Ohio State Univ., Columbus. Computer and Information Science Research Center.
Abstracts of research papers in computer and information science are given for 68 papers in the areas of information storage and retrieval; human information processing; information analysis; linguistic analysis; artificial intelligence; information processes in physical, biological, and social systems; mathematical techniques; systems…
Allchin's Shoehorn, or Why Science Is Hypothetico-Deductive.
ERIC Educational Resources Information Center
Lawson, Anton E.
2003-01-01
Criticizes Allchin's article about Lawson's analysis of Galileo's discovery of Jupiter's moons. Suggests that a careful analysis of the way humans spontaneously process information and reason supports a general hypothetico-deductive theory of human information processing, reasoning, and scientific discovery. (SOE)
Lessons learned from trend analysis of Shuttle Payload Processing problem reports
NASA Technical Reports Server (NTRS)
Heuser, Robert E.; Pepper, Richard E., Jr.; Smith, Anthony M.
1989-01-01
In the wake of the Challenger accident, NASA has placed an increasing emphasis on trend analysis techniques. These analyses provide meaningful insights into system and hardware status, and also develop additional lessons learned from historical data to aid in the design and operation of future space systems. This paper presents selected results from such a trend analysis study that was conducted on the problem report data files for the Shuttle Payload Processing activities. Specifically, the results shown are for the payload canister system which interfaces with and transfers payloads from their processing facilities to the orbiter.
Puchades, R.; Maquieira, A.; Atienza, J.; Herrero, M. A.
1990-01-01
Flow injection analysis (FIA) has emerged as an increasingly used laboratory tool in chemical analysis. Employment of the technique for on-line sample treatment and on-line measurement in chemical process control is a growing trend. This article reviews the recent applications of FIA. Most papers refer to on-line sample treatment. Although FIA is very well suited to continuous on-line process monitoring, few examples have been found in this area; most of them have been applied to water treatment or fermentation processes.
Processing and analysis techniques involving in-vessel material generation
Schabron, John F [Laramie, WY; Rovani, Jr., Joseph F.
2011-01-25
In at least one embodiment, the inventive technology relates to in-vessel generation of a material from a solution of interest as part of a processing and/or analysis operation. Preferred embodiments of the in-vessel material generation (e.g., in-vessel solid material generation) include precipitation; in certain embodiments, analysis and/or processing of the solution of interest may include dissolution of the material, perhaps as part of a successive dissolution protocol using solvents of increasing ability to dissolve. Applications include, but are by no means limited to estimation of a coking onset and solution (e.g., oil) fractionating.
Application of a neural network to simulate analysis in an optimization process
NASA Technical Reports Server (NTRS)
Rogers, James L.; Lamarsh, William J., II
1992-01-01
A new experimental software package called NETS/PROSSS aimed at reducing the computing time required to solve a complex design problem is described. The software combines a neural network for simulating the analysis program with an optimization program. The neural network is applied to approximate results of a finite element analysis program to quickly obtain a near-optimal solution. Results of the NETS/PROSSS optimization process can also be used as an initial design in a normal optimization process and make it possible to converge to an optimum solution with significantly fewer iterations.
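The NETS/PROSSS idea of replacing an expensive analysis with a trained network inside the optimization loop can be sketched in a few lines with modern libraries. Below, an MLP is fit to samples of an invented stand-in for the finite element response and then minimized with scipy; the original system coupled the NETS neural network code with the PROSSS optimizer rather than these libraries, so this is an analogue, not a reconstruction.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from scipy.optimize import minimize

def expensive_analysis(x):
    """Stand-in for the finite element analysis (invented response)."""
    return (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2 + 0.1 * np.sin(5 * x[0])

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(300, 2))              # sampled designs
y = np.array([expensive_analysis(x) for x in X])

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                         random_state=0).fit(X, y)

# Optimize on the cheap surrogate, then check against the true analysis
res = minimize(lambda x: surrogate.predict(x.reshape(1, -1))[0],
               x0=np.zeros(2), method="Nelder-Mead")
print("surrogate optimum:", res.x.round(3),
      "true value there:", round(expensive_analysis(res.x), 4))
```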
Processing and analysis techniques involving in-vessel material generation
Schabron, John F [Laramie, WY; Rovani, Jr., Joseph F.
2012-09-25
In at least one embodiment, the inventive technology relates to in-vessel generation of a material from a solution of interest as part of a processing and/or analysis operation. Preferred embodiments of the in-vessel material generation (e.g., in-vessel solid material generation) include precipitation; in certain embodiments, analysis and/or processing of the solution of interest may include dissolution of the material, perhaps as part of a successive dissolution protocol using solvents of increasing ability to dissolve. Applications include, but are by no means limited to estimation of a coking onset and solution (e.g., oil) fractionating.
NASA Astrophysics Data System (ADS)
Prabhakaran, A.; Jawahar Raj, N.
2018-03-01
The present study attempts to understand the form and the geomorphic/hydrologic processes of 20 watersheds of the Pachamalai hills and their adjoining areas, located in Tamil Nadu State in southern India, from an analysis of their drainage morphometric characteristics. Survey of India topographic sheets at 1:50,000 scale are the data source from which the stream networks and watersheds of the study area were demarcated, followed by analysis of their morphometric characteristics using ArcGIS software. The results of the analysis formed the basis for deducing the form and processes of the watersheds of the study area. The form of the watersheds inferred from the analysis includes shape, length, slope steepness and length, degree of branching of streams, and dissection and elongation of watersheds. The geomorphic/hydrologic processes inferred include denudation rate, potential energy, intensity of erosion, mean annual runoff, mean discharge, discharge rate, rock resistivity and infiltration potential, amount of sediment transported, mean annual rainfall, rainfall intensity, lag time, flash flood potential, flood discharge per unit area, sediment yield, and speed of the water flow in the streams. This understanding of the variations in form and processes can be used for prioritizing the watersheds for development, management and conservation planning.
Uhrenfeldt, Lisbeth; Lakanmaa, Riitta-Liisa; Flinkman, Mervi; Basto, Marta Lima; Attree, Moira
2014-05-01
This paper critically reviews the literature on international collaboration and analyses the collaborative process involved in producing a nursing workforce policy analysis. Collaboration is increasingly promoted as a means of solving shared problems and achieving common goals; however, collaboration creates its own opportunities and challenges. Evidence about the collaboration process, its outcomes and critical success factors is lacking. The study comprised a literature review and a content analysis of data collected from six participants (from five European countries), members of the European Academy of Nursing Science Scholar Collaborative Workforce Workgroup, using a SWOT (Strengths, Weaknesses, Opportunities and Threats) analysis template. Two major factors affecting scholarly collaboration were identified. The first, Facilitators, incorporated personal attributes and enabling contexts/mechanisms, including individual commitment, responsibility and teamwork, and facilitative supportive structures and processes. The second, Barriers, incorporated unmet needs for funding, time and communication, and impeding contexts/mechanisms, including workload and insufficient support/mentorship. The literature review identified a low level of evidence on collaboration processes, outcomes, opportunities and challenges. The SWOT analysis identified critical success factors, planning strategies and resources of effective international collaboration. Collaboration is an important concept for management. Evidence-based knowledge of the critical success factors facilitating and impeding collaboration could help managers make collaboration more effective. © 2012 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Yidana, Sandow Mark; Bawoyobie, Patrick; Sakyi, Patrick; Fynn, Obed Fiifi
2018-02-01
An evolutionary trend has been postulated through the analysis of hydrochemical data from a crystalline rock aquifer system in the Densu Basin, Southern Ghana. Hydrochemical data from 63 groundwater samples, taken from two main groundwater outlets (boreholes and hand-dug wells), were used to postulate an evolutionary theory for the basin. Sequential factor analysis and hierarchical cluster analysis were used to disaggregate the data into three factors and five clusters (spatial associations). These were used to characterize the controls on groundwater hydrochemistry and its evolution in the terrain. The dissolution of soluble salts and cation exchange processes are the dominant processes controlling groundwater hydrochemistry in the terrain. The trend of evolution of this set of processes follows the pattern of groundwater flow predicted by a calibrated transient groundwater model of the area. The data suggest that anthropogenic activities represent the second most important process in the hydrochemistry; silicate mineral weathering is the third most important set of processes. Groundwater associations resulting from Q-mode hierarchical cluster analysis indicate an evolutionary pattern consistent with the general groundwater flow pattern in the basin. These key findings are at variance with the results of previous investigations and indicate that, when carefully done, groundwater hydrochemical data can be very useful for conceptualizing groundwater flow in basins.
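The two multivariate steps named here follow a standard recipe: standardize the ion concentrations, extract factors from the correlation structure, and cluster the samples (Q-mode) on the standardized rows. The sketch below shows that recipe with scipy and scikit-learn on a small invented hydrochemical table; the real study used 63 samples and a sequential factor procedure.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

# Invented concentrations (mg/L): columns Na, Ca, Mg, Cl, HCO3, NO3
samples = np.array([[40, 60, 12, 55, 210, 4],
                    [85, 40, 20, 120, 150, 28],
                    [35, 70, 10, 50, 230, 3],
                    [90, 35, 22, 130, 140, 35],
                    [60, 55, 15, 80, 190, 10]], float)

Z = StandardScaler().fit_transform(samples)

fa = FactorAnalysis(n_components=3, random_state=0).fit(Z)
print("factor loadings (3 factors x 6 ions):")
print(fa.components_.round(2))

# Q-mode clustering: group the samples themselves
clusters = fcluster(linkage(Z, method="ward"), t=2, criterion="maxclust")
print("sample cluster labels:", clusters)
```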
Weber, R; Knaup, P; Knietitg, R; Haux, R; Merzweiler, A; Mludek, V; Schilling, F H; Wiedemann, T
2001-01-01
The German Society for Paediatric Oncology and Haematology (GPOH) runs nation-wide multicentre clinical trials to improve the treatment of children suffering from malignant diseases. We want to provide methods and tools to support the centres of these trials in developing trial-specific modules for the computer-based DOcumentation System for Paediatric Oncology (DOSPO). To this end, we carried out an object-oriented business process analysis for the Cooperative Soft Tissue Sarcoma Trial at the Olgahospital Stuttgart for Child and Adolescent Medicine. The result is a comprehensive business process model consisting of UML diagrams and use case specifications. We recommend object-oriented business process analysis as a method for the definition of requirements in information processing projects in the field of clinical trials in general. Our model can serve as a basis for this, because it can easily be adjusted to each type of clinical trial.
Crystallographic data processing for free-electron laser sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Thomas A., E-mail: taw@physics.org; Barty, Anton; Stellato, Francesco
2013-07-01
A processing pipeline for diffraction data acquired using the 'serial crystallography' methodology with a free-electron laser source is described with reference to the crystallographic analysis suite CrystFEL and the pre-processing program Cheetah. A detailed analysis of the nature and impact of indexing ambiguities is presented. Simulations of the Monte Carlo integration scheme, which accounts for the partially recorded nature of the diffraction intensities, are presented and show that the integration of partial reflections could be made to converge more quickly if the bandwidth of the X-rays were to be increased by a small amount or if a slight convergence angle were introduced into the incident beam.
Analysis of complex decisionmaking processes. [with application to jet engine development
NASA Technical Reports Server (NTRS)
Hill, J. D.; Ollila, R. G.
1978-01-01
The analysis of corporate decisionmaking processes related to major system developments is unusually difficult because of the number of decisionmakers involved in the process and the long development cycle. A method for analyzing such decision processes is developed and illustrated through its application to the analysis of the commercial jet engine development process. The method uses interaction matrices as the key tool for structuring the problem, recording data, and analyzing the data to establish the rank order of the major factors affecting development decisions. In the example, the use of interaction matrices permitted analysts to collect and analyze approximately 50 factors that influenced decisions during the four phases of the development cycle, and to determine the key influencers of decisions at each development phase. The results of this study indicate that the cost of new technology installed on an aircraft is the prime concern of the engine manufacturer.
An analytical approach to customer requirement information processing
NASA Astrophysics Data System (ADS)
Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong
2013-11-01
'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and Kano model, have been applied to many fields by many enterprises in the past several decades, the limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.
NASA Technical Reports Server (NTRS)
Goldman, H.; Wolf, M.
1978-01-01
Several experimental and projected Czochralski crystal growing process methods were studied and compared to available operations and cost-data of recent production Cz-pulling, in order to elucidate the role of the dominant cost contributing factors. From this analysis, it becomes apparent that substantial cost reductions can be realized from technical advancements which fall into four categories: an increase in furnace productivity; the reduction of crucible cost through use of the crucible for the equivalent of multiple state-of-the-art crystals; the combined effect of several smaller technical improvements; and a carry over effect of the expected availability of semiconductor grade polysilicon at greatly reduced prices. A format for techno-economic analysis of solar cell production processes was developed, called the University of Pennsylvania Process Characterization (UPPC) format. The accumulated Cz process data are presented.
Usability engineering: domain analysis activities for augmented-reality systems
NASA Astrophysics Data System (ADS)
Gabbard, Joseph; Swan, J. E., II; Hix, Deborah; Lanzagorta, Marco O.; Livingston, Mark; Brown, Dennis B.; Julier, Simon J.
2002-05-01
This paper discusses our usability engineering process for the Battlefield Augmented Reality System (BARS). Usability engineering is a structured, iterative, stepwise development process. Like the related disciplines of software and systems engineering, usability engineering is a combination of management principles and techniques, formal and semi-formal evaluation techniques, and computerized tools. BARS is an outdoor augmented reality system that displays heads-up battlefield intelligence information to a dismounted warrior. The paper discusses our general usability engineering process. We originally developed the process in the context of virtual reality applications, but in this work we are adapting the procedures to an augmented reality system. The focus of this paper is our work on domain analysis, the first activity of the usability engineering process. We describe our plans for, and our progress to date on, our domain analysis for BARS. We give results in terms of a specific urban battlefield use case we have designed.
Shreve, Elizabeth A.; Downs, Aimee C.
2005-01-01
This report describes laboratory procedures used by the U.S. Geological Survey Kentucky Water Science Center Sediment Laboratory for the processing and analysis of fluvial-sediment samples for concentration of sand and finer material. The report details the processing of a sediment sample through the laboratory from receiving the sediment sample, through the analytical process, to compiling results of the requested analysis. Procedures for preserving sample integrity, calibrating and maintaining of laboratory and field instruments and equipment, analyzing samples, internal quality assurance and quality control, and validity of the sediment-analysis results also are described. The report includes a list of references cited and a glossary of sediment and quality-assurance terms.
Application of a High-Fidelity Icing Analysis Method to a Model-Scale Rotor in Forward Flight
NASA Technical Reports Server (NTRS)
Narducci, Robert; Orr, Stanley; Kreeger, Richard E.
2012-01-01
An icing analysis process involving the loose coupling of OVERFLOW-RCAS for rotor performance prediction with LEWICE3D for thermal analysis and ice accretion is applied to a model-scale rotor for validation. The process offers high-fidelity rotor analysis for non-iced and iced rotor performance evaluation that accounts for the interaction of nonlinear aerodynamics with blade elastic deformations. Ice accumulation prediction also involves loosely coupled data exchanges between OVERFLOW and LEWICE3D to produce accurate ice shapes. Validation of the process uses data collected in the 1993 icing test involving Sikorsky's Powered Force Model. Non-iced and iced rotor performance predictions are compared to experimental measurements, as are predicted ice shapes.
Spatial recurrence analysis: A sensitive and fast detection tool in digital mammography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prado, T. L.; Galuzio, P. P.; Lopes, S. R.
Efficient diagnostics of breast cancer requires fast digital mammographic image processing. Many breast lesions, both benign and malignant, are barely visible to the untrained eye and require accurate and reliable methods of image processing. We propose a new method of digital mammographic image analysis that meets both needs. It uses the concept of spatial recurrence as the basis of a spatial recurrence quantification analysis, which is the spatial extension of the well-known time recurrence analysis. The recurrence-based quantifiers are able to evidence breast lesions as well as the best standard image processing methods available, but with better control over the spurious fragments in the image.
NASA Astrophysics Data System (ADS)
Venkata Subbaiah, K.; Raju, Ch.; Suresh, Ch.
2017-08-01
The present study aims to compare conventional cutting inserts with wiper cutting inserts during the hard turning of AISI 4340 steel at different workpiece hardnesses. Type of insert, hardness, cutting speed, feed, and depth of cut are taken as process parameters. Taguchi's L18 orthogonal array was used to conduct the experimental tests. A parametric analysis was carried out to determine the influence of each process parameter on three important surface roughness characteristics (Ra, Rz, and Rt) and on the material removal rate. Taguchi-based Grey Relational Analysis (GRA) was used to optimize the process parameters for individual-response and multi-response outputs. Additionally, the analysis of variance (ANOVA) was applied to identify the most significant factor.
Postoptimality Analysis in the Selection of Technology Portfolios
NASA Technical Reports Server (NTRS)
Adumitroaie, Virgil; Shelton, Kacie; Elfes, Alberto; Weisbin, Charles R.
2006-01-01
This slide presentation reviews a process for postoptimality analysis in the selection of technology portfolios. The rationale for the analysis stems from the need for consistent, transparent and auditable decision-making processes and tools. The methodology is used to assure that project investments are selected through an optimization of net mission value. The main intent of the analysis is to gauge the degree of confidence in the optimal solution and to provide the decision maker with an array of viable selection alternatives that take into account input uncertainties and possibly satisfy non-technical constraints. A few examples of the analysis are reviewed. The goal of the postoptimality study is to enhance and improve the decision-making process by providing additional qualifications of, and substitutes for, the optimal solution.
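A concrete way to gauge confidence in an optimal portfolio is to re-solve the selection under perturbed inputs and count how often the nominal choice survives. The brute-force sketch below does this for a tiny budget-constrained portfolio; the technology names, values, costs, budget and noise level are all invented, and the actual methodology in the presentation may differ.

```python
import numpy as np
from itertools import combinations

names = ["tech A", "tech B", "tech C", "tech D", "tech E"]
value = np.array([8.0, 6.0, 5.5, 4.0, 3.5])   # invented mission value
cost = np.array([5.0, 4.0, 3.0, 2.5, 2.0])
BUDGET = 9.0

def best_portfolio(v):
    """Exhaustively pick the feasible subset with the highest total value."""
    feasible = (s for r in range(1, len(names) + 1)
                for s in combinations(range(len(names)), r)
                if cost[list(s)].sum() <= BUDGET)
    return max(feasible, key=lambda s: v[list(s)].sum())

nominal = best_portfolio(value)
rng = np.random.default_rng(0)
same = sum(best_portfolio(value + rng.normal(0, 0.5, len(value))) == nominal
           for _ in range(1000))
print("nominal pick:", [names[i] for i in nominal])
print(f"same portfolio in {same / 10:.1f}% of 1000 noisy re-solves")
```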
Peigneux, P; Salmon, E; van der Linden, M; Garraux, G; Aerts, J; Delfiore, G; Degueldre, C; Luxen, A; Orban, G; Franck, G
2000-06-01
Humans, like numerous other species, strongly rely on the observation of gestures of other individuals in their everyday life. It is hypothesized that the visual processing of human gestures is sustained by a specific functional architecture, even at an early prelexical cognitive stage, different from that required for the processing of other visual entities. In the present PET study, the neural basis of visual gesture analysis was investigated with functional neuroimaging of brain activity during naming and orientation tasks performed on pictures of either static gestures (upper-limb postures) or tridimensional objects. To prevent automatic object-related cerebral activation during the visual processing of postures, only intransitive postures were selected, i. e., symbolic or meaningless postures which do not imply the handling of objects. Conversely, only intransitive objects which cannot be handled were selected to prevent gesture-related activation during their visual processing. Results clearly demonstrate a significant functional segregation between the processing of static intransitive postures and the processing of intransitive tridimensional objects. Visual processing of objects elicited mainly occipital and fusiform gyrus activity, while visual processing of postures strongly activated the lateral occipitotemporal junction, encroaching upon area MT/V5, involved in motion analysis. These findings suggest that the lateral occipitotemporal junction, working in association with area MT/V5, plays a prominent role in the high-level perceptual analysis of gesture, namely the construction of its visual representation, available for subsequent recognition or imitation. Copyright 2000 Academic Press.
The Vehicle Integrated Performance Analysis Experience: Reconnecting With Technical Integration
NASA Technical Reports Server (NTRS)
McGhee, D. S.
2006-01-01
Very early in the Space Launch Initiative program, a small team of engineers at MSFC proposed a process for performing system-level assessments of a launch vehicle. Aimed primarily at providing insight and making NASA a smart buyer, the Vehicle Integrated Performance Analysis (VIPA) team was created. The difference between the VIPA effort and previous integration attempts is that VIPA is a process using experienced people from various disciplines, which focuses them on a technically integrated assessment. The foundations of VIPA's process are described. The VIPA team also recognized the need to target early detailed analysis toward identifying significant systems issues. This process is driven by the T-model for technical integration. VIPA's approach to performing system-level technical integration is discussed in detail. The VIPA process significantly enhances the development and monitoring of realizable project requirements. VIPA's assessment validates the concept's stated performance, identifies significant issues either with the concept or the requirements, and then reintegrates these issues to determine impacts. This process is discussed along with a description of how it may be integrated into a program's insight and review process. The VIPA process has gained favor with both engineering and project organizations for being responsive and insightful.
Metacognitive Analysis of Pre-Service Teachers of Chemistry in Posting Questions
NASA Astrophysics Data System (ADS)
Santoso, T.; Yuanita, L.
2017-04-01
Questions addressed to something can induce the metacognitive function of monitoring a person's thinking process. This study aims to describe the structure of the levels of student questions based on thinking level and chemistry understanding level, and to describe how students use their metacognitive knowledge in asking questions. This research is a case study in chemistry learning involving 87 students. Results of the analysis revealed that the thinking-level structure of student questions consists of knowledge questions, understanding and application questions, and higher-order thinking questions; the chemistry-understanding levels of student questions are symbol, macro, macro-micro, macro-process, micro-process, and macro-micro-process. Students' questions about scientific articles were of higher quality than their questions about the teaching materials. The analysis of six student interviews showed that student questions demonstrate metacognitive processes in three categories: (1) low-level metacognitive processes, in which questions are composed by focusing on a particular phrase or changing the words; (2) intermediate-level metacognitive processes, in which posing questions requires knowledge and understanding; and (3) high-level metacognitive processes, in which student questions are posed by identifying the central topic or abstracting the essence of scientific articles.
The MSFC Collaborative Engineering Process for Preliminary Design and Concept Definition Studies
NASA Technical Reports Server (NTRS)
Mulqueen, Jack; Jones, David; Hopkins, Randy
2011-01-01
This paper describes a collaborative engineering process developed by the Marshall Space Flight Center's Advanced Concepts Office for performing rapid preliminary design and mission concept definition studies for potential future NASA missions. The process has been developed and demonstrated for a broad range of mission studies, including human space exploration missions, space transportation system studies and in-space science missions. The paper will describe the design team structure and the specialized analytical tools that have been developed to enable a uniquely rapid design process. The collaborative engineering process consists of an integrated analysis approach for mission definition, vehicle definition and system engineering. The relevance of the collaborative process elements to the standard NASA NPR 7120.1 system engineering process will be demonstrated. The study definition process flow for each study discipline will be outlined, beginning with the study planning process, followed by definition of ground rules and assumptions, definition of study trades, mission analysis, and subsystem analyses leading to a standardized set of mission concept study products. The flexibility of the collaborative engineering design process to accommodate a wide range of study objectives, from technology definition and requirements definition to preliminary design studies, will be addressed. The paper will also describe the applicability of the collaborative engineering process to an integrated systems analysis approach for evaluating the functional requirements of evolving system technologies and capabilities needed to meet the needs of future NASA programs.
Santhanam, Manikandan; Selvaraj, Rajeswari; Annamalai, Sivasankar; Sundaram, Maruthamuthu
2017-11-01
This study presents a combined electrochemical oxidation (EO), sunlight-induced oxidation and biological process for the treatment of textile effluent. In the first step, RuO2-TiO2/Ti and titanium were used as the electrodes in the EO process, and color removal was achieved in 40 min at an applied current density of 20 mA cm-2. The EO process generated about 250 mg L-1 of active chlorine, which hampered the subsequent biological treatment process. Thus, in the second step, sunlight-induced photolysis (SLIP) was explored to remove the hypochlorite present in the EO-treated effluent. In the third step, the SLIP-treated effluent was fed to a laccase-positive bacterial consortium for the biological process. To assess the effect of SLIP on the overall process, experiments were carried out with and without the SLIP step. In the experiments without SLIP, sodium thiosulfate was used to remove the active chlorine. HPLC analysis showed that the SLIP-integrated experiments achieved an overall dye component degradation of 71%, whereas only 22% degradation was achieved in the absence of the SLIP process. The improvement in degradation with the SLIP process is attributed to the presence of ClO radicals, which were detected by EPR analysis. The oxidation of organic molecules during the process was confirmed by FT-IR and GC-MS analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
An Example for Integrated Gas Turbine Engine Testing and Analysis Using Modeling and Simulation
2006-12-01
...USAF Academy in a joint test and analysis effort of the F109 turbofan engine, an effort which uses a swirl investigation as a vehicle to exercise and demonstrate the approach.
An Approach to Knowledge-Directed Image Analysis,
1977-09-01
An Approach to Knowledge-Directed Image Analysis. D.H. Ballard, C.M. Brown, J.A. Feldman, Computer Science Department, The University of Rochester, Rochester, New York 14627. ...semantic network model and a distributed control structure to accomplish the image analysis process. The process of "understanding an image" leads to
Modernization of the Transonic Axial Compressor Test Rig
2017-12-01
This work presents the design and simulation process of modernizing the Naval Postgraduate School's transonic compressor test rig (TCR). Stiffness tests and modal analysis were conducted via Finite Element Analysis (FEA) software, and this analysis was used to design and fabricate the materials. The TCR, which...
Natural Resource Information System, design analysis
NASA Technical Reports Server (NTRS)
1972-01-01
The computer-based system stores, processes, and displays map data relating to natural resources. The system was designed on the basis of requirements established in a user survey and an analysis of decision flow. The design analysis effort is described, and the rationale behind major design decisions, including map processing, cell vs. polygon, choice of classification systems, mapping accuracy, system hardware, and software language is summarized.
Process correlation analysis model for process improvement identification.
Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong
2014-01-01
Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
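One can picture the core of such a model as a correlation matrix over process elements estimated from assessment data: strongly correlated practices are candidates for joint improvement actions. The pandas sketch below computes and filters such a matrix from invented ratings; the paper's model is built from CMMI structure and empirical improvement data, so this is only an illustrative analogue.

```python
import pandas as pd

# Invented assessment ratings (0-10) of four practices across six projects
ratings = pd.DataFrame({
    "requirements mgmt": [3, 5, 6, 4, 8, 7],
    "project planning":  [4, 5, 7, 4, 9, 8],
    "config management": [8, 2, 6, 5, 4, 7],
    "verification":      [3, 6, 6, 5, 8, 8],
}, index=[f"project {i}" for i in range(1, 7)])

corr = ratings.corr()
print(corr.round(2))

# Flag practice pairs correlated strongly enough to plan improvements jointly
strong = [(a, b, corr.loc[a, b])
          for i, a in enumerate(corr.columns)
          for b in corr.columns[i + 1:]
          if abs(corr.loc[a, b]) >= 0.8]
for a, b, r in strong:
    print(f"improve together: {a} <-> {b} (r = {r:.2f})")
```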
Cost analysis and environmental impact of nonthermal technologies
USDA-ARS?s Scientific Manuscript database
The cost of high pressure processing (HPP) orange juice and its environmental impact were estimated. In addition, the environmental impact of pulsed electric fields (PEF) and thermal pasteurization were assessed for comparison. The cost analysis was based on commercial processing conditions that wer...
77 FR 47337 - Project-Level Predecisional Administrative Review Process
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-08
... upholding environmental standards and encouraging early public input during planning processes. One of the... when they are categorically excluded from analysis under the National Environmental Policy Act (NEPA... applicable to projects and activities evaluated in an environmental analysis (EA) or environmental impact...
Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger
2014-01-01
An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics. PMID:24950184
Book: Marine Bioacoustic Signal Processing and Analysis
2011-09-30
physicists, and mathematicians. However, more and more biologists and psychologists are starting to use advanced signal processing techniques and... chapters than it should be, since the project must be finished by Dec. 31. I have started setting aside 2 hours of uninterrupted time per workday to work...
Automatic Data Processing Equipment (ADPE) acquisition plan for the medical sciences
NASA Technical Reports Server (NTRS)
1979-01-01
An effective mechanism for meeting the SLSD/MSD data handling/processing requirements for Shuttle is discussed. The ability to meet these requirements depends upon the availability of a general purpose high speed digital computer system. This system is expected to implement those data base management and processing functions required across all SLSD/MSD programs during training, laboratory operations/analysis, simulations, mission operations, and post mission analysis/reporting.
1987-09-01
a useful average for population studies, do not delay data processing, and is relatively inexpensive. Using HVEM and observing recipe preparation procedures improve the... extensive review of the procedures and problems in design, collection, analysis, processing and interpretation of dietary survey data for individuals
Smartphone Analytics: Mobilizing the Lab into the Cloud for Omic-Scale Analyses.
Montenegro-Burke, J Rafael; Phommavongsay, Thiery; Aisporna, Aries E; Huan, Tao; Rinehart, Duane; Forsberg, Erica; Poole, Farris L; Thorgersen, Michael P; Adams, Michael W W; Krantz, Gregory; Fields, Matthew W; Northen, Trent R; Robbins, Paul D; Niedernhofer, Laura J; Lairson, Luke; Benton, H Paul; Siuzdak, Gary
2016-10-04
Active data screening is an integral part of many scientific activities, and mobile technologies have greatly facilitated this process by minimizing the reliance on large hardware instrumentation. In order to keep pace with the rapidly growing field of metabolomics and the heavy workload of data processing, we designed the first remote metabolomic data screening platform for mobile devices. Two mobile applications (apps), XCMS Mobile and METLIN Mobile, facilitate access to XCMS and METLIN, which are the most important components in the computer-based XCMS Online platforms. These mobile apps allow for the visualization and analysis of metabolic data throughout the entire analytical process. Specifically, XCMS Mobile and METLIN Mobile provide the capabilities for remote monitoring of data processing, real-time notifications for the data processing, visualization and interactive analysis of processed data (e.g., cloud plots, principal component analysis, box plots, extracted ion chromatograms, and hierarchical cluster analysis), and database searching for metabolite identification. These apps, available on Apple iOS and Google Android operating systems, allow for the migration of metabolomic research onto mobile devices for better accessibility beyond direct instrument operation. The utility of XCMS Mobile and METLIN Mobile functionalities is demonstrated here through the metabolomic LC-MS analyses of stem cells, colon cancer, aging, and bacterial metabolism.
NASA Astrophysics Data System (ADS)
Papanikolaou, Xanthos; Anastasiou, Demitris; Marinou, Aggeliki; Zacharis, Vangelis; Paradissis, Demitris
2015-04-01
Dionysos Satellite Observatory and the Higher Geodesy Laboratory of the National Technical University of Athens have developed an automated processing scheme to accommodate the daily analysis of all available continuous GNSS stations in Greece. At the moment, a total of approximately 150 regional stations are processed, divided into 4 subnetworks. GNSS data are processed routinely on a daily basis via the Bernese GNSS Software v5.0, developed by AIUB. Each network is solved twice within a period of 20 days, first using ultra-rapid products (with a latency of ~10 hours) and then using final products (with a latency of ~20 days). Observations are processed using carrier phase, modelled as double differences in the ionosphere-free linear combination. Analysis results include coordinate estimates, ionospheric corrections (TEC maps) and hourly tropospheric parameters (zenith delay). This processing scheme has proved helpful in investigating abrupt geophysical phenomena in near real time, as in the 2011 Santorini inflation episode and the 2014 Kephalonia earthquake events. All analysis results and products are made available via a dedicated webpage. Additionally, most of the GNSS data are hosted in a GSAC web platform, available to all interested parties. Data and results are made available through the laboratory's dedicated website: http://dionysos.survey.ntua.gr/.
Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I
2015-01-01
High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone was significantly (P < 0.05) less firm and chewy than pre-rigor processed irrespective of muscle type, processing time, or pressure. L values increased with pressure to 68.9 at 300 MPa for pre-rigor processed foot, 73.8 for post-rigor processed foot, 90.9 for pre-rigor processed adductor, and 89.0 for post-rigor processed adductor. Scanning electron microscopy images showed fraying of collagen fibers in processed adductor, but did not show pressure-induced compaction of the foot myofibrils. Post-rigor processed abalone meat was more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®
From perceptual to lexico‐semantic analysis—cortical plasticity enabling new levels of processing
Schlaffke, Lara; Rüther, Naima N.; Heba, Stefanie; Haag, Lauren M.; Schultz, Thomas; Rosengarth, Katharina; Tegenthoff, Martin; Bellebaum, Christian
2015-01-01
Certain kinds of stimuli can be processed on multiple levels. While the neural correlates of different levels of processing (LOPs) have been investigated to some extent, most of the studies involve skills and/or knowledge already present when performing the task. In this study we specifically sought to identify neural correlates of an evolving skill that allows the transition from perceptual to a lexico-semantic stimulus analysis. Eighteen participants were trained to decode 12 letters of Morse code that were presented acoustically inside and outside of the scanner environment. Morse code was presented in trains of three letters while brain activity was assessed with fMRI. Participants either attended to the stimulus length (perceptual analysis), or evaluated its meaning distinguishing words from nonwords (lexico-semantic analysis). Perceptual and lexico-semantic analyses shared a mutual network comprising the left premotor cortex, the supplementary motor area (SMA) and the inferior parietal lobule (IPL). Perceptual analysis was associated with a strong brain activation in the SMA and the superior temporal gyrus bilaterally (STG), which remained unaltered from pre to post training. In the lexico-semantic analysis post learning, study participants showed additional activation in the left inferior frontal cortex (IFC) and in the left occipitotemporal cortex (OTC), regions known to be critically involved in lexical processing. Our data provide evidence for cortical plasticity evolving with a learning process, enabling the transition from perceptual to lexico-semantic stimulus analysis. Importantly, the activation pattern remains related to the task-relevant LOP and is thus the result of a decision process as to which LOP to engage in. Hum Brain Mapp 36:4512-4528, 2015. © 2015 The Authors. Human Brain Mapping published by Wiley Periodicals, Inc. PMID:26304153
Huang, Norden E.; Hu, Kun; Yang, Albert C. C.; Chang, Hsing-Chih; Jia, Deng; Liang, Wei-Kuang; Yeh, Jia Rong; Kao, Chu-Lan; Juan, Chi-Hung; Peng, Chung Kang; Meijer, Johanna H.; Wang, Yung-Hung; Long, Steven R.; Wu, Zhaohua
2016-01-01
The Holo-Hilbert spectral analysis (HHSA) method is introduced to cure the deficiencies of traditional spectral analysis and to give a full informational representation of nonlinear and non-stationary data. It uses a nested empirical mode decomposition and Hilbert–Huang transform (HHT) approach to identify intrinsic amplitude and frequency modulations often present in nonlinear systems. Comparisons are first made with traditional spectrum analysis, which usually achieved its results through convolutional integral transforms based on additive expansions of an a priori determined basis, mostly under linear and stationary assumptions. Thus, for non-stationary processes, the best one could do historically was to use the time–frequency representations, in which the amplitude (or energy density) variation is still represented in terms of time. For nonlinear processes, the data can have both amplitude and frequency modulations (intra-mode and inter-mode) generated by two different mechanisms: linear additive or nonlinear multiplicative processes. As all existing spectral analysis methods are based on additive expansions, either a priori or adaptive, none of them could possibly represent the multiplicative processes. While the earlier adaptive HHT spectral analysis approach could accommodate the intra-wave nonlinearity quite remarkably, it remained that any inter-wave nonlinear multiplicative mechanisms that include cross-scale coupling and phase-lock modulations were left untreated. To resolve the multiplicative processes issue, additional dimensions in the spectrum result are needed to account for the variations in both the amplitude and frequency modulations simultaneously. HHSA accommodates all the processes: additive and multiplicative, intra-mode and inter-mode, stationary and non-stationary, linear and nonlinear interactions. The Holo prefix in HHSA denotes a multiple dimensional representation with both additive and multiplicative capabilities. PMID:26953180
Zhu, Hongbin; Wang, Chunyan; Qi, Yao; Song, Fengrui; Liu, Zhiqiang; Liu, Shuying
2012-11-08
This study presents a novel and rapid method to identify chemical markers for the quality control of Radix Aconiti Preparata, a traditional herbal medicine widely used around the world. In the method, samples prepared with a fast extraction procedure were analyzed using direct analysis in real time mass spectrometry (DART MS) combined with multivariate data analysis. At present, the quality assessment approach for Radix Aconiti Preparata is based on the two processing methods recorded in the Chinese Pharmacopoeia for the purpose of reducing the toxicity of Radix Aconiti and ensuring its clinical therapeutic efficacy. In order to ensure safety and effectiveness in clinical use, the degree of processing of Radix Aconiti should be well controlled and assessed. In this paper, hierarchical cluster analysis and principal component analysis were performed to evaluate the DART MS data of Radix Aconiti Preparata samples at different processing times. The results showed that well-processed Radix Aconiti Preparata, unqualified processed samples and the raw Radix Aconiti could be clustered reasonably according to their constituents. The loading plot shows that the main chemical markers having the most influence on the discrimination between the qualified and unqualified samples were mainly monoester diterpenoid aconitines and diester diterpenoid aconitines, i.e. benzoylmesaconine, hypaconitine, mesaconitine, neoline, benzoylhypaconine, benzoylaconine, fuziline, aconitine and 10-OH-mesaconitine. The established DART MS approach in combination with multivariate data analysis provides a very flexible and reliable method for the quality assessment of toxic herbal medicines. Copyright © 2012 Elsevier B.V. All rights reserved.
Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul
2013-11-01
This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter forms a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation. Copyright © 2013 Elsevier B.V. All rights reserved.
Research on the Intensity Analysis and Result Visualization of Construction Land in Urban Planning
NASA Astrophysics Data System (ADS)
Cui, J.; Dong, B.; Li, J.; Li, L.
2017-09-01
As a fundamental work of urban planning, the intensity analysis of construction land involves many repetitive data processing steps that are prone to errors or loss of data precision, and current urban planning lacks efficient methods and tools for visualizing the analysis results. In this research a portable tool is developed using the Model Builder technique embedded in ArcGIS to provide automatic data processing and rapid result visualization for such work. A series of basic modules provided by ArcGIS are linked together to form a complete data processing chain in the tool. Once the required data are imported, the analysis results and related maps and graphs, including the intensity values and zoning map, the skyline analysis map, etc., are produced automatically. Finally, the tool is installation-free and can be distributed quickly between planning teams.
Inferring Group Processes from Computer-Mediated Affective Text Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schryver, Jack C; Begoli, Edmon; Jose, Ajith
2011-02-01
Political communications in the form of unstructured text convey rich connotative meaning that can reveal underlying group social processes. Previous research has focused on sentiment analysis at the document level, but we extend this analysis to sub-document levels through a detailed analysis of affective relationships between entities extracted from a document. Instead of pure sentiment analysis, which is just positive or negative, we explore nuances of affective meaning in 22 affect categories. Our affect propagation algorithm automatically calculates and displays extracted affective relationships among entities in graphical form in our prototype (TEAMSTER), starting with seed lists of affect terms. Several useful metrics are defined to infer underlying group processes by aggregating affective relationships discovered in a text. Our approach has been validated with annotated documents from the MPQA corpus, achieving a performance gain of 74% over comparable random guessers.
NASA Technical Reports Server (NTRS)
Ulbrich, N.; Volden, T.
2018-01-01
Analysis and use of temperature-dependent wind tunnel strain-gage balance calibration data are discussed in the paper. First, three different methods are presented and compared that may be used to process temperature-dependent strain-gage balance data. The first method uses an extended set of independent variables in order to process the data and predict balance loads. The second method applies an extended load iteration equation during the analysis of balance calibration data. The third method uses temperature-dependent sensitivities for the data analysis. Physical interpretations of the most important temperature-dependent regression model terms are provided that relate temperature compensation imperfections and the temperature-dependent nature of the gage factor to sets of regression model terms. Finally, balance calibration recommendations are listed so that temperature-dependent calibration data can be obtained and successfully processed using the reviewed analysis methods.
LANDSAT-4 image data quality analysis for energy related applications. [nuclear power plant sites
NASA Technical Reports Server (NTRS)
Wukelic, G. E. (Principal Investigator)
1983-01-01
No usable LANDSAT 4 TM data were obtained for the Hanford site in the Columbia Plateau region, but TM simulator data for a Virginia Electric Company nuclear power plant were used to test image processing algorithms. Principal component analyses of this data set clearly indicated that thermal plumes in surface waters used for reactor cooling would be discernible. Image processing and analysis programs were successfully tested using the 7 band Arkansas test scene, and preliminary analysis of TM data for the Savannah River Plant shows that current interactive image enhancement, analysis, and integration techniques can be effectively used for LANDSAT 4 data. Thermal band data appear adequate for gross estimates of thermal changes occurring near operating nuclear facilities, especially in surface water bodies being used for reactor cooling purposes. Additional image processing software was written and tested which provides for more rapid and effective analysis of the 7 band TM data.
TOOKUIL: A case study in user interface development for safety code application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, D.L.; Harkins, C.K.; Hoole, J.G.
1997-07-01
Traditionally, there has been a very high learning curve associated with using nuclear power plant (NPP) analysis codes. Even for seasoned plant analysts and engineers, the process of building or modifying an input model for present day NPP analysis codes is tedious, error prone, and time consuming. Current cost constraints and performance demands place an additional burden on today's safety analysis community. Advances in graphical user interface (GUI) technology have been applied to obtain significant productivity and quality assurance improvements for the Transient Reactor Analysis Code (TRAC) input model development. KAPL Inc. has developed an X Windows-based graphical user interface named TOOKUIL which supports the design and analysis process, acting as a preprocessor, runtime editor, help system, and post processor for TRAC. This paper summarizes the objectives of the project, the GUI development process and experiences, and the resulting end product, TOOKUIL.
NASA Astrophysics Data System (ADS)
Cerchiari, G.; Croccolo, F.; Cardinaux, F.; Scheffold, F.
2012-10-01
We present an implementation of the analysis of dynamic near field scattering (NFS) data using a graphics processing unit. We introduce an optimized data management scheme thereby limiting the number of operations required. Overall, we reduce the processing time from hours to minutes, for typical experimental conditions. Previously the limiting step in such experiments, the processing time is now comparable to the data acquisition time. Our approach is applicable to various dynamic NFS methods, including shadowgraph, Schlieren and differential dynamic microscopy.
NASA Technical Reports Server (NTRS)
Bernstein, R.; Lotspiech, J. B.
1985-01-01
The MSS and TM sensor performances were evaluated by studying both the sensors and the characteristics of the data. Information content analysis, image statistics, band-to-band registration, the presence of failed or failing detectors, and sensor resolution are discussed. The TM data were explored from the point of view of adequacy of the ground processing and improvements that could be made to compensate for sensor problems and deficiencies. Radiometric correction processing, compensation for a failed detector, and geometric correction processing are also considered.
Kim, Jungkyu; Jensen, Erik C; Stockton, Amanda M; Mathies, Richard A
2013-08-20
A fully integrated multilayer microfluidic chemical analyzer for automated sample processing and labeling, as well as analysis using capillary zone electrophoresis is developed and characterized. Using lifting gate microfluidic control valve technology, a microfluidic automaton consisting of a two-dimensional microvalve cellular array is fabricated with soft lithography in a format that enables facile integration with a microfluidic capillary electrophoresis device. The programmable sample processor performs precise mixing, metering, and routing operations that can be combined to achieve automation of complex and diverse assay protocols. Sample labeling protocols for amino acid, aldehyde/ketone and carboxylic acid analysis are performed automatically followed by automated transfer and analysis by the integrated microfluidic capillary electrophoresis chip. Equivalent performance to off-chip sample processing is demonstrated for each compound class; the automated analysis resulted in a limit of detection of ~16 nM for amino acids. Our microfluidic automaton provides a fully automated, portable microfluidic analysis system capable of autonomous analysis of diverse compound classes in challenging environments.
Modeling Constellation Virtual Missions Using the Vdot(Trademark) Process Management Tool
NASA Technical Reports Server (NTRS)
Hardy, Roger; ONeil, Daniel; Sturken, Ian; Nix, Michael; Yanez, Damian
2011-01-01
The authors have identified a software tool suite that will support NASA's Virtual Mission (VM) effort. This is accomplished by transforming a spreadsheet database of mission events, task inputs and outputs, timelines, and organizations into process visualization tools and a Vdot process management model that includes embedded analysis software as well as requirements and information related to data manipulation and transfer. This paper describes the progress to date, the application of the Virtual Mission not only to Constellation but to other architectures, and the pertinence to other aerospace applications. Vdot's intuitive visual interface brings VMs to life by turning static, paper-based processes into active, electronic processes that can be deployed, executed, managed, verified, and continuously improved. A VM can be executed using a computer-based, human-in-the-loop, real-time format, under the direction and control of the NASA VM Manager. Engineers in the various disciplines will not have to be Vdot-proficient but rather can fill out on-line, Excel-type databases with the mission information discussed above. The authors' tool suite converts this database into several process visualization tools for review and into Microsoft Project, which can be imported directly into Vdot. Many tools can be embedded directly into Vdot, and when the necessary data/information is received from a preceding task, the analysis can be initiated automatically. Other NASA analysis tools are too complex for this process, but Vdot automatically notifies the tool user that the data have been received and analysis can begin. The VM can be simulated from end to end using the authors' tool suite. The planned approach for the Vdot-based process simulation is to generate the process model from a database; other advantages of this semi-automated approach are that the participants can be geographically remote and that, after refining the process models via the human-in-the-loop simulation, the system can evolve into a process management server for the actual process.
OIPAV: an integrated software system for ophthalmic image processing, analysis and visualization
NASA Astrophysics Data System (ADS)
Zhang, Lichun; Xiang, Dehui; Jin, Chao; Shi, Fei; Yu, Kai; Chen, Xinjian
2018-03-01
OIPAV (Ophthalmic Images Processing, Analysis and Visualization) is a cross-platform software package specially oriented to ophthalmic images. It provides a wide range of functionalities, including data I/O, image processing, interaction, ophthalmic disease detection, data analysis and visualization, to help researchers and clinicians deal with various ophthalmic images such as optical coherence tomography (OCT) images and color fundus photographs. It enables users to easily access ophthalmic image data produced by different imaging devices, facilitates workflows of processing ophthalmic images and improves quantitative evaluations. In this paper, we present the system design and functional modules of the platform and demonstrate various applications. With satisfying function scalability and expandability, we believe that the software can be widely applied in the ophthalmology field.
Exploiting parallel computing with limited program changes using a network of microcomputers
NASA Technical Reports Server (NTRS)
Rogers, J. L., Jr.; Sobieszczanski-Sobieski, J.
1985-01-01
Network computing and multiprocessor computers are two discernible trends in parallel processing. The computational behavior of an iterative distributed process in which some subtasks are completed later than others because of an imbalance in computational requirements is of significant interest. The effects of asynchronous processing were studied. A small existing program was converted to perform finite element analysis by distributing substructure analysis over a network of four Apple IIe microcomputers connected to a shared disk, simulating a parallel computer. The substructure analysis uses an iterative, fully stressed, structural resizing procedure. A framework of beams divided into three substructures is used as the finite element model. The effects of asynchronous processing on the convergence of the design variables are determined by not resizing particular substructures on various iterations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riddle, F. J.
2003-06-26
The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.
Wang, Hai-Xia; Suo, Tong-Chuan; Yu, He-Shui; Li, Zheng
2016-10-01
The manufacture of traditional Chinese medicine (TCM) products is always accompanied by the processing of complex raw materials and real-time monitoring of the manufacturing process. In this study, we investigated different modeling strategies for the extraction process of licorice. Near-infrared spectra associated with the extraction time were used to determine the states of the extraction processes. Three modeling approaches, i.e., principal component analysis (PCA), partial least squares regression (PLSR) and parallel factor analysis-PLSR (PARAFAC-PLSR), were adopted for the prediction of the real-time status of the process. The overall results indicated that PCA, PLSR and PARAFAC-PLSR can effectively detect errors in the extraction procedure and predict the process trajectories, which has important significance for the monitoring and control of extraction processes. Copyright© by the Chinese Pharmaceutical Association.
Conjecturing and Generalization Process on The Structural Development
NASA Astrophysics Data System (ADS)
Ni'mah, Khomsatun; Purwanto; Bambang Irawan, Edy; Hidayanto, Erry
2017-06-01
This study aims to describe the conjecturing and generalization processes in the structural development of thirty grade-8 middle school children solving pattern problems. The data in this study were processed using qualitative data analysis techniques. The analyzed data were obtained through direct observation, documentation, and interviews. This study builds on the research of Mulligan et al (2012), which resulted in five stages of structural development, namely prestructural, emergent, partial, structural, and advanced. From the analysis of the data in this study, two related phenomena were found: the conjecturing process and the generalization process. During the conjecturing process, the children appropriately made hypotheses about the pattern problems through two phases, numerical and symbolic. During the generalization process, the children were able to relate the rule of the pattern found in the conjecturing process to other contexts.
Peres Penteado, Alissa; Fábio Maciel, Rafael; Erbs, João; Feijó Ortolani, Cristina Lucia; Aguiar Roza, Bartira; Torres Pisa, Ivan
2015-01-01
The entire kidney transplantation process in Brazil is defined through laws, decrees, ordinances, and resolutions, but there is no defined theoretical map describing this process. From such a representation it is possible to perform analyses, such as the identification of bottlenecks and of the information and communication technologies (ICTs) that support the process. The aim of this study was to analyze and represent the kidney transplantation workflow using business process modeling notation (BPMN) and then to identify the ICTs involved in the process. This study was conducted in eight steps, including document analysis and professional evaluation. The results include the BPMN model of the kidney transplantation process in Brazil and the identification of ICTs. We discovered that there are great delays in the process because many different ICTs are involved, which can cause information to be poorly integrated.
OCLC-MARC Tape Processing: A Functional Analysis.
ERIC Educational Resources Information Center
Miller, Bruce Cummings
1984-01-01
Analyzes structure of, and data in, the OCLC-MARC record in the form delivered via OCLC's Tape Subscription Service, and outlines important processing functions involved: "unreadable tapes," duplicate records and deduping, match processing, choice processing, locations processing, "automatic" and "input" stamps,…
7 Processes that Enable NASA Software Engineering Technologies: Value-Added Process Engineering
NASA Technical Reports Server (NTRS)
Housch, Helen; Godfrey, Sally
2011-01-01
The presentation reviews Agency process requirements and the purpose, benefits, and experiences of seven software engineering processes. The processes include: product integration, configuration management, verification, software assurance, measurement and analysis, requirements management, and planning and monitoring.
Illeghems, Koen; De Vuyst, Luc; Weckx, Stefan
2015-10-12
Lactobacillus fermentum 222 and Lactobacillus plantarum 80, isolates from a spontaneous Ghanaian cocoa bean fermentation process, proved to be interesting functional starter culture strains for cocoa bean fermentations. Lactobacillus fermentum 222 is a thermotolerant strain, able to dominate the fermentation process, thereby converting citrate and producing mannitol. Lactobacillus plantarum 80 is an acid-tolerant and facultatively heterofermentative strain that is competitive during cocoa bean fermentation processes. In this study, whole-genome sequencing and comparative genome analysis were used to investigate the mechanisms by which these strains dominate the cocoa bean fermentation process. Through functional annotation and analysis of the high-coverage contigs obtained through 454 pyrosequencing, plantaricin production was predicted for L. plantarum 80. For L. fermentum 222, genes encoding a complete arginine deiminase pathway were identified. Further, in-depth functional analysis revealed the capacities of these strains associated with carbohydrate and amino acid metabolism, such as the ability to use alternative external electron acceptors, the presence of an extended pyruvate metabolism, and the occurrence of several amino acid conversion pathways. A comparative genome sequence analysis using publicly available genome sequences of strains of the species L. plantarum and L. fermentum revealed unique features of both strains studied. Indeed, L. fermentum 222 possessed genes encoding additional citrate transporters and enzymes involved in amino acid conversions, whereas L. plantarum 80 is the only member of this species that harboured a gene cluster involved in the uptake and consumption of fructose and/or sorbose. In-depth genome sequence analysis of the candidate functional starter culture strains L. fermentum 222 and L. plantarum 80 revealed their metabolic capacities, niche adaptations and functionalities that enable them to dominate the cocoa bean fermentation process. Further, these results offered insights into the cocoa bean fermentation ecosystem as a whole and will facilitate the selection of appropriate starter culture strains for controlled cocoa bean fermentation processes.
NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.
Zhang, Bo; Dai, Ji; Zhang, Tao
2017-11-13
In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python-based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train and local field potential analysis, behavioral response (e.g., saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided by NeoAnalysis, users can easily obtain publication-quality figures without writing complex code. NeoAnalysis is a powerful and valuable toolbox for users conducting electrophysiological experiments.
INTEGRATED ENVIRONMENTAL ASSESSMENT OF THE MID-ATLANTIC REGION WITH ANALYTICAL NETWORK PROCESS
A decision analysis method for integrating environmental indicators was developed. This was a combination of Principal Component Analysis (PCA) and the Analytic Network Process (ANP). Being able to take into account interdependency among variables, the method was capable of ran...
Analysis of 3D printing parameters of gears for hybrid manufacturing
NASA Astrophysics Data System (ADS)
Budzik, Grzegorz; Przeszlowski, Łukasz; Wieczorowski, Michal; Rzucidlo, Arkadiusz; Gapinski, Bartosz; Krolczyk, Grzegorz
2018-05-01
The paper deals with the analysis and selection of parameters for rapid prototyping of gears by selective sintering of metal powders. The presented results show the wide spectrum of application of RP systems in manufacturing processes of machine elements, based on an analysis of the market in terms of the application of additive manufacturing technology in different sectors of industry. Considerable growth of these methods over the past years can be observed. The characteristic errors of the printed model with respect to the ideal one were pointed out for each technique. Special attention was paid to the method of preparation of numerical data CAD/STL/RP. Moreover, the analysis of manufacturing processes of gear-type elements was presented. The tested gears were modeled with different allowances for final machining and made by DMLS. Metallographic analysis and strength tests on prepared specimens were performed. The above-mentioned analysis and tests were used to compare the real properties of the material with the nominal ones. To improve the quality of the surface after sintering, the gears were subjected to final machining. The analysis of the geometry of gears after the hybrid manufacturing method was performed (fig. 1). The manufacturing process was defined in a traditional way as well as with the aid of modern manufacturing techniques. The methodology and obtained results can be used for machine elements other than gears and constitute a general approach to production processes in rapid prototyping methods as well as in the design and implementation of production.
Digital image processing and analysis for activated sludge wastewater treatment.
Khan, Muhammad Burhan; Lee, Xue Yong; Nisar, Humaira; Ng, Choon Aun; Yeap, Kim Ho; Malik, Aamir Saeed
2015-01-01
The activated sludge system is generally used in wastewater treatment plants for processing domestic influent. Conventionally, activated sludge wastewater treatment is monitored by measuring physico-chemical parameters like total suspended solids (TSSol), sludge volume index (SVI) and chemical oxygen demand (COD). For the measurement, tests are conducted in the laboratory, which take many hours to give the final result. Digital image processing and analysis offers a better alternative, not only to monitor and characterize the current state of activated sludge but also to predict its future state. Characterization by image processing and analysis is done by correlating the time evolution of parameters extracted by image analysis of flocs and filaments with the physico-chemical parameters. This chapter briefly reviews activated sludge wastewater treatment and the procedures of image acquisition, preprocessing, segmentation and analysis in the specific context of activated sludge wastewater treatment. In the latter part, additional procedures like z-stacking and image stitching are introduced for wastewater image preprocessing, which have not previously been used in the context of activated sludge. Different preprocessing and segmentation techniques are proposed, along with a survey of the imaging procedures reported in the literature. Finally, the image analysis based morphological parameters and their correlation with regard to monitoring and prediction of activated sludge are discussed. Hence it is observed that image analysis can play a very useful role in the monitoring of activated sludge wastewater treatment plants.
An Improved Spectral Analysis Method for Fatigue Damage Assessment of Details in Liquid Cargo Tanks
NASA Astrophysics Data System (ADS)
Zhao, Peng-yuan; Huang, Xiao-ping
2018-03-01
Errors arise when the fatigue damage of details in liquid cargo tanks is calculated with the traditional spectral analysis method, which assumes a linear system, because the relationship between the dynamic stress and the ship acceleration is nonlinear. An improved spectral analysis method for the assessment of the fatigue damage in a detail of a liquid cargo tank is proposed in this paper. Based on the assumptions that the wave process can be simulated by summing sinusoidal waves of different frequencies and that the stress process can be simulated by summing the stress processes induced by these sinusoidal waves, the stress power spectral density (PSD) is calculated by expanding the stress processes induced by the sinusoidal waves into Fourier series and adding the amplitudes of each harmonic component with the same frequency. This analysis method can take the nonlinear relationship into consideration, and the fatigue damage is then calculated based on the PSD of stress. Taking an independent tank in an LNG carrier as an example, the accuracy of the improved spectral analysis method is shown to be much better than that of the traditional spectral analysis method by comparing the calculated damage results with those calculated by the time domain method. The proposed spectral analysis method is more accurate in calculating the fatigue damage of details of ship liquid cargo tanks.
Forensic Analysis of Compromised Computers
NASA Technical Reports Server (NTRS)
Wolfe, Thomas
2004-01-01
Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
Hierarchical Processing of Auditory Objects in Humans
Kumar, Sukhbinder; Stephan, Klaas E; Warren, Jason D; Friston, Karl J; Griffiths, Timothy D
2007-01-01
This work examines the computational architecture used by the brain during the analysis of the spectral envelope of sounds, an important acoustic feature for defining auditory objects. Dynamic causal modelling and Bayesian model selection were used to evaluate a family of 16 network models explaining functional magnetic resonance imaging responses in the right temporal lobe during spectral envelope analysis. The models encode different hypotheses about the effective connectivity between Heschl's Gyrus (HG), containing the primary auditory cortex, planum temporale (PT), and superior temporal sulcus (STS), and the modulation of that coupling during spectral envelope analysis. In particular, we aimed to determine whether information processing during spectral envelope analysis takes place in a serial or parallel fashion. The analysis provides strong support for a serial architecture with connections from HG to PT and from PT to STS and an increase of the HG to PT connection during spectral envelope analysis. The work supports a computational model of auditory object processing, based on the abstraction of spectro-temporal “templates” in the PT before further analysis of the abstracted form in anterior temporal lobe areas. PMID:17542641
Engine Icing Data - An Analytics Approach
NASA Technical Reports Server (NTRS)
Fitzgerald, Brooke A.; Flegel, Ashlie B.
2017-01-01
Engine icing researchers at the NASA Glenn Research Center use the Escort data acquisition system in the Propulsion Systems Laboratory (PSL) to generate and collect a tremendous amount of data every day. Currently these researchers spend countless hours processing and formatting their data, selecting important variables, and plotting relationships between variables, all by hand, generally analyzing data in a spreadsheet-style program (such as Microsoft Excel). Though spreadsheet-style analysis is familiar and intuitive to many, processing data in spreadsheets is often unreproducible and small mistakes are easily overlooked. Spreadsheet-style analysis is also time inefficient. The same formatting, processing, and plotting procedure has to be repeated for every dataset, which leads to researchers performing the same tedious data munging process over and over instead of making discoveries within their data. This paper documents a data analysis tool written in Python hosted in a Jupyter notebook that vastly simplifies the analysis process. From the file path of any folder containing time series datasets, this tool batch loads every dataset in the folder, processes the datasets in parallel, and ingests them into a widget where users can search for and interactively plot subsets of columns in a number of ways with a click of a button, easily and intuitively comparing their data and discovering interesting dynamics. Furthermore, comparing variables across data sets and integrating video data (while extremely difficult with spreadsheet-style programs) is quite simplified in this tool. This tool has also gathered interest outside the engine icing branch, and will be used by researchers across NASA Glenn Research Center. This project exemplifies the enormous benefit of automating data processing, analysis, and visualization, and will help researchers move from raw data to insight in a much smaller time frame.
Platform for Post-Processing Waveform-Based NDE
NASA Technical Reports Server (NTRS)
Roth, Don J.
2010-01-01
Signal- and image-processing methods are commonly needed to extract information from waveforms, improve image resolution, and highlight defects in an image. Since all waveform-based nondestructive evaluation (NDE) methods share some similarity, a common software platform containing multiple signal- and image-processing techniques makes sense where multiple techniques, scientists, engineers, and organizations are involved. NDE Wave & Image Processor Version 2.0 software provides a single, integrated signal- and image-processing and analysis environment for total NDE data processing and analysis. It brings some of the most useful algorithms developed for NDE over the past 20 years into a commercial-grade product. The software can import signal/spectroscopic data, image data, and image-series data. It offers the user hundreds of basic and advanced signal- and image-processing capabilities, including 1D and 2D wavelet-based de-noising, de-trending, and filtering. Batch processing is included for both signals and images, so that an optimized sequence of processing operations can be applied to entire folders of signals, spectra, and images. Additionally, an extensive interactive model-based curve-fitting facility allows fitting of spectroscopy data, such as that from Raman spectroscopy. An extensive joint time-frequency module is included for analysis of non-stationary or transient data, such as acoustic emission, vibration, or earthquake data.
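One capability named above, 1D wavelet-based de-noising, can be sketched with PyWavelets; the wavelet choice and thresholding rule here are illustrative assumptions, not the product's internals.

```python
# Soft-threshold the detail coefficients of a wavelet decomposition, using a
# universal threshold estimated from the finest-scale coefficients.
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # noise estimate
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)

t = np.linspace(0, 1, 1024)
noisy = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.randn(t.size)
clean = wavelet_denoise(noisy)
```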
Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB.
Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N
2009-10-27
The microarray data analysis realm is ever growing through the development of various tools, open source and commercial. However, there is an absence of predefined, rational algorithmic analysis workflows or standardized batch processing that incorporates all steps, from raw data import up to the derivation of significantly differentially expressed gene lists. This absence obscures the analytical procedure and obstructs the massive comparative processing of genomic microarray datasets. Moreover, the solutions provided depend heavily on the programming skills of the user, whereas GUI-embedded solutions do not provide direct support for various raw image analysis formats or a versatile yet flexible combination of signal-processing methods. We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB-implemented platform with a graphical user interface. This suite integrates all steps of microarray data analysis, including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification, and annotation. In its current version, Gene ARMADA fully supports two-colour cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma-separated values, or tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools, plus customizable export data formats for seamless integration with other analysis tools or MATLAB for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with the MATLAB Component Runtime. Gene ARMADA provides a highly adaptable, integrative, yet flexible tool for automated quality control, analysis, annotation, and visualization of microarray data, constituting a starting point for further data interpretation and integration with numerous other tools.
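For flavor, here is one representative normalization step from such a pipeline, quantile normalization, sketched in Python/NumPy (Gene ARMADA itself is MATLAB, and its exact method choices are not reproduced here; this is an illustrative stand-in).

```python
# Quantile normalization of a genes-x-arrays expression matrix: force every
# column (array) to share the same empirical distribution.
import numpy as np

def quantile_normalize(x):
    ranks = np.argsort(np.argsort(x, axis=0), axis=0)   # per-column ranks
    mean_sorted = np.sort(x, axis=0).mean(axis=1)       # reference distribution
    return mean_sorted[ranks]                           # map rank -> reference

expr = np.random.lognormal(size=(1000, 6))              # 1000 genes, 6 arrays
norm = quantile_normalize(expr)
```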
Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li
2017-10-01
This study aimed to establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, a constituent medicinal material of Yiqi Fumai lyophilized injection, by combining near-infrared spectroscopy with multivariate data analysis. A multivariate statistical process control (MSPC) model was established from 5 normal production batches, and 2 test batches were monitored using PC-score, DModX, and Hotelling T2 control charts. The results showed that the MSPC model had good monitoring ability for the extraction process. Applying the MSPC model to the actual production process can effectively achieve on-line monitoring of the extraction of Schisandrae Chinensis Fructus and reflect changes in material properties in real time. The established process monitoring method could serve as a reference for applying process analytical technology to the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
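A minimal sketch of the Hotelling T2 portion of such MSPC monitoring, assuming PCA scores from in-control spectra and a 99% F-distribution control limit; the data shapes are made up, and the DModX chart is omitted.

```python
# Fit PCA on spectra from normal batches, then flag test spectra whose
# Hotelling T^2 exceeds the control limit.
import numpy as np
from scipy import stats
from sklearn.decomposition import PCA

normal = np.random.randn(50, 200)       # 50 in-control spectra x 200 wavelengths
test = np.random.randn(10, 200) + 0.5   # new batch to monitor

pca = PCA(n_components=3).fit(normal)

def hotelling_t2(model, x):
    scores = model.transform(x)
    return np.sum(scores**2 / model.explained_variance_, axis=1)

n, a = normal.shape[0], pca.n_components_
# F-distribution based 99% control limit for T^2 with a PCs and n samples.
limit = a * (n - 1) * (n + 1) / (n * (n - a)) * stats.f.ppf(0.99, a, n - a)
alarms = hotelling_t2(pca, test) > limit
```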
Ivezic, Nenad; Potok, Thomas E.
2003-09-30
A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
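A minimal sketch of the metric-compilation step, under assumed field names (the patent's actual mathematical model is not spelled out here): sum per-step data into overall metrics such as total lead time and value-added ratio.

```python
# Compile process metrics from a summation of per-step data.
from dataclasses import dataclass

@dataclass
class ProcessStep:
    name: str
    cycle_time_min: float
    queue_time_min: float
    value_added: bool

def process_metrics(steps):
    total = sum(s.cycle_time_min + s.queue_time_min for s in steps)
    value_added = sum(s.cycle_time_min for s in steps if s.value_added)
    return {"total_lead_time_min": total,
            "value_added_ratio": value_added / total}

steps = [ProcessStep("machining", 12.0, 30.0, True),
         ProcessStep("inspection", 5.0, 45.0, False)]
print(process_metrics(steps))
```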
Process yield improvements with process control terminal for varian serial ion implanters
NASA Astrophysics Data System (ADS)
Higashi, Harry; Soni, Ameeta; Martinez, Larry; Week, Ken
Implant processes in a modern wafer production fab are extremely complex. Several types of misprocessing can occur, i.e. wrong dose or species, double implants, and missed implants. Process Control Terminals (PCTs) for Varian 350Ds installed at Intel fabs were found to substantially reduce such misprocessing. This paper describes these misprocessing modes and their reduction through use of PCTs. Reliable and simple process control for serial-process ion implanters has been in increasing demand. A well-designed process control terminal greatly increases device yield by monitoring all pertinent implanter functions and enabling process engineering personnel to set up process recipes for simple and accurate system operation. By programming user-selectable interlocks, implant errors are reduced, and those that occur are logged for further analysis and prevention. A process control terminal should also be compatible with office personal computers for greater flexibility in system use and data analysis. The impact of a process control terminal is increased productivity and, thus, higher device yield.
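A minimal sketch of the user-selectable interlock idea, with hypothetical recipe fields and tolerances (not the Varian PCT's actual interface): compare requested settings against the recipe before implanting, and log mismatches for later analysis.

```python
# Check an implant request against the active recipe; any violation is
# logged and blocks the implant.
import logging

logging.basicConfig(level=logging.WARNING)

RECIPE = {"species": "B+", "energy_keV": 80.0, "dose_cm2": 1.0e15}  # hypothetical

def interlock_check(species, energy_keV, dose_cm2, tol=0.01):
    errors = []
    if species != RECIPE["species"]:
        errors.append(f"wrong species: {species}")
    if abs(energy_keV - RECIPE["energy_keV"]) / RECIPE["energy_keV"] > tol:
        errors.append(f"energy out of range: {energy_keV} keV")
    if abs(dose_cm2 - RECIPE["dose_cm2"]) / RECIPE["dose_cm2"] > tol:
        errors.append(f"dose out of range: {dose_cm2:.2e} cm^-2")
    for e in errors:
        logging.warning("implant interlock: %s", e)   # logged for prevention
    return not errors                                  # True means OK to implant

assert interlock_check("B+", 80.0, 1.0e15)
```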
NASA Technical Reports Server (NTRS)
Gaston, S.; Wertheim, M.; Orourke, J. A.
1973-01-01
A summary, consolidation, and analysis of specifications, manufacturing process and test controls, and performance results for OAO-2 and OAO-3 lot 20 Amp-Hr sealed nickel-cadmium cells and batteries are reported. Correlation of improvements in control requirements with performance is a key feature. Updates for a cell/battery computer model to improve performance-prediction capability are included. The applicability of regression-analysis computer techniques for relating process controls to performance is assessed.
An analysis of hydrogen production via closed-cycle schemes. [thermochemical processings from water
NASA Technical Reports Server (NTRS)
Chao, R. E.; Cox, K. E.
1975-01-01
A thermodynamic analysis and state-of-the-art review of three basic schemes for producing hydrogen from water (electrolysis, thermal water-splitting, and multi-step thermochemical closed cycles) is presented. Criteria for work-saving thermochemical closed-cycle processes are established, and several schemes are reviewed in light of these criteria. An economic analysis is also presented in the context of energy costs.
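The criteria for such schemes rest on standard relations; as a brief reminder (textbook thermodynamics, not the paper's specific derivation):

```latex
% Minimum work to split water at temperature T:
\[
  W_{\min} = \Delta G = \Delta H - T\,\Delta S
\]
% For liquid water at 298 K: $\Delta H^\circ \approx 286$ kJ/mol and
% $\Delta G^\circ \approx 237$ kJ/mol, so some of the input can be heat.
% A purely heat-driven thermochemical cycle operating between reservoirs
% $T_H$ and $T_L$ is bounded by the Carnot efficiency:
\[
  \eta_{\max} = \frac{W_{\min}}{Q_{\mathrm{in}}} \le 1 - \frac{T_L}{T_H}
\]
```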
Analysis And Control System For Automated Welding
NASA Technical Reports Server (NTRS)
Powell, Bradley W.; Burroughs, Ivan A.; Kennedy, Larry Z.; Rodgers, Michael H.; Goode, K. Wayne
1994-01-01
Automated variable-polarity plasma arc (VPPA) welding apparatus operates under electronic supervision by welding analysis and control system. System performs all major monitoring and controlling functions. It acquires, analyzes, and displays weld-quality data in real time and adjusts process parameters accordingly. Also records pertinent data for use in post-weld analysis and documentation of quality. System includes optoelectronic sensors and data processors that provide feedback control of welding process.
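A minimal sketch of the feedback idea, with a hypothetical sensor quantity, gain, and controlled parameter (not the VPPA system's actual control law): proportional correction of arc current from a measured weld-pool width.

```python
# Read a weld-quality signal each cycle and nudge a process parameter
# toward its setpoint.
def control_step(measured_width_mm, setpoint_mm, current_A, gain=2.0):
    """Proportional correction of arc current from weld-pool width error."""
    error = setpoint_mm - measured_width_mm
    return current_A + gain * error          # new current command

current = 150.0
for width in [5.8, 5.9, 6.1, 6.0]:           # simulated sensor readings
    current = control_step(width, setpoint_mm=6.0, current_A=current)
```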
Combination of PCA and LORETA for sources analysis of ERP data: an emotional processing study
NASA Astrophysics Data System (ADS)
Hu, Jin; Tian, Jie; Yang, Lei; Pan, Xiaohong; Liu, Jiangang
2006-03-01
The purpose of this paper is to study the spatiotemporal patterns of neuronal activity in emotional processing by analysis of ERP data. 108 pictures (categorized as positive, negative, and neutral) were presented to 24 healthy, right-handed subjects while 128-channel EEG data were recorded. A two-step analysis was applied to the ERP data. First, principal component analysis (PCA) was performed to obtain significant ERP components. Then LORETA was applied to each component to localize its brain sources. The first six principal components were extracted, each showing a different spatiotemporal pattern of neuronal activity. The results agree with other studies of emotional processing using fMRI or PET. The combination of PCA and LORETA can be used to analyze spatiotemporal patterns of ERP data in emotional processing.
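A minimal sketch of the PCA step via SVD, with assumed array shapes; LORETA itself is a separate inverse solver and appears below only as a placeholder comment.

```python
# Extract principal spatial components from channels-x-time ERP data.
import numpy as np

erp = np.random.randn(128, 500)           # 128 channels x 500 time samples
erp -= erp.mean(axis=1, keepdims=True)    # remove each channel's mean

# SVD gives spatial patterns (columns of U) and time courses (rows of Vt).
U, s, Vt = np.linalg.svd(erp, full_matrices=False)
explained = s**2 / np.sum(s**2)
n_keep = np.searchsorted(np.cumsum(explained), 0.90) + 1
components = U[:, :n_keep]                # significant spatial components
# Each column of `components` would then be passed to LORETA for localization.
```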
LANL Institutional Decision Support By Process Modeling and Analysis Group (AET-2)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Booth, Steven Richard
2016-04-04
AET-2 has expertise in process modeling, economics, business case analysis, risk assessment, Lean/Six Sigma tools, and decision analysis to provide timely decision support to LANS leading to continuous improvement. This capability is critical during the current tight budgetary environment as LANS pushes to identify potential areas of cost savings and efficiencies. An important arena is business systems and operations, where processes can impact most or all laboratory employees. Lab-wide efforts are needed to identify and eliminate inefficiencies to accomplish Director McMillan’s charge of “doing more with less.” LANS faces many critical and potentially expensive choices that require sound decision support to ensure success. AET-2 is available to provide this analysis support to expedite the decisions at hand.
Kim, Heung-Kyu; Lee, Seong Hyeon; Choi, Hyunjoo
2015-01-01
Using an inverse analysis technique, the heat transfer coefficient on the die-workpiece contact surface of a hot stamping process was evaluated as a power-law function of contact pressure. The aim was to determine whether this contact-surface heat transfer coefficient could be used in finite element analysis of the entire hot stamping process. By comparing finite element results with experimental measurements of the phase transformation, it was evaluated whether the obtained heat transfer coefficient function could provide reasonable finite element predictions of the workpiece properties affected by hot stamping. PMID:28788046
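A minimal sketch of fitting the reported power-law form h = a * p^b from inverse-analysis samples, via least squares in log-log space; the data values below are made up for illustration.

```python
# Fit h(p) = a * p^b by linear regression on log-transformed data.
import numpy as np

pressure = np.array([5.0, 10.0, 20.0, 40.0])     # contact pressure [MPa]
h = np.array([1.2e3, 2.0e3, 3.3e3, 5.5e3])       # heat transfer coeff [W/m^2K]

b, log_a = np.polyfit(np.log(pressure), np.log(h), 1)
a = np.exp(log_a)
h_model = lambda p: a * p**b                      # fitted power law
print(f"h(p) = {a:.1f} * p^{b:.2f}")
```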
MO-E-9A-01: Risk Based Quality Management: TG100 In Action
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huq, M; Palta, J; Dunscombe, P
2014-06-15
One of the goals of quality management in radiation therapy is to gain high confidence that patients will receive the prescribed treatment correctly. To accomplish this goal, professional societies such as the American Association of Physicists in Medicine (AAPM) have published many quality assurance (QA), quality control (QC), and quality management (QM) guidance documents. In general, the recommendations in these documents have emphasized performing device-specific QA at the expense of process flow and protection of the patient against catastrophic errors. Analyses of radiation therapy incidents find that they are more often caused by flaws in the overall therapy process, from initial consult through final treatment, than by isolated hardware or computer failures detectable by traditional physics QA. This challenge is shared by many intrinsically hazardous industries. Risk assessment tools and analysis techniques have been developed to define, identify, and eliminate known and/or potential failures, problems, or errors from a system, process, and/or service before they reach the customer. These include, but are not limited to, process mapping, failure modes and effects analysis (FMEA), fault tree analysis (FTA), and establishment of a quality management program that best avoids the faults and risks identified in the overall process. These tools can be easily adapted to radiation therapy practices because of their simplicity and effectiveness in providing efficient ways to enhance the safety and quality of treatment processes. Task Group 100 (TG100) of the AAPM has developed a risk-based quality management program that uses these tools. This session will be devoted to a discussion of these tools and how they can be used in a given radiotherapy clinic to develop a risk-based QM program. Learning Objectives: Learn how to design a process map for a radiotherapy process. Learn how to perform an FMEA analysis for a given process. Learn what fault tree analysis is all about. Learn how to design a quality management program based upon the information obtained from process mapping, FMEA, and FTA.
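A minimal sketch of the FMEA scoring step, ranking failure modes by risk priority number (RPN = severity x occurrence x detectability); the failure modes and 1-10 ratings below are illustrative, not TG100 values.

```python
# Score and rank failure modes by risk priority number.
failure_modes = [
    {"step": "treatment planning", "mode": "wrong CT dataset",
     "severity": 9, "occurrence": 3, "detectability": 4},
    {"step": "patient setup", "mode": "incorrect isocenter shift",
     "severity": 7, "occurrence": 5, "detectability": 3},
]

for fm in failure_modes:
    fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detectability"]

# Highest-RPN modes get quality-management attention first.
for fm in sorted(failure_modes, key=lambda f: f["rpn"], reverse=True):
    print(f'{fm["rpn"]:4d}  {fm["step"]}: {fm["mode"]}')
```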
State machine analysis of sensor data from dynamic processes
Cook, William R.; Brabson, John M.; Deland, Sharon M.
2003-12-23
A state machine model analyzes sensor data from dynamic processes at a facility to identify the actual processes that were performed at the facility during a period of interest, for the purpose of remote facility inspection. An inspector can further input the expected operations into the state machine model and compare the expected, or declared, processes to the actual processes to identify undeclared processes at the facility. The state machine analysis enables the generation of knowledge about the state of the facility at all levels, from the location of physical objects to complex operational concepts. Therefore, the state machine method and apparatus may benefit any agency or business with sensor-equipped facilities that store or manipulate expensive, dangerous, or controlled materials or information.
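A minimal sketch of the core idea, with hypothetical states and sensor events: declared processes define the allowed transitions, and any event that matches none of them flags undeclared activity.

```python
# Replay sensor events through a table of declared transitions; unexplained
# events are recorded as potential undeclared activity.
ALLOWED = {
    ("idle", "door_open"): "loading",
    ("loading", "door_closed"): "processing",
    ("processing", "power_off"): "idle",
}

def replay(events, state="idle"):
    undeclared = []
    for ev in events:
        nxt = ALLOWED.get((state, ev))
        if nxt is None:
            undeclared.append((state, ev))   # no declared process explains this
        else:
            state = nxt
    return state, undeclared

state, anomalies = replay(["door_open", "door_closed", "door_open", "power_off"])
print(anomalies)   # [('processing', 'door_open')] -> flag for the inspector
```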
ERIC Educational Resources Information Center
Indefrey, Peter
2006-01-01
This article presents the results of a meta-analysis of 30 hemodynamic experiments comparing first language (L1) and second language (L2) processing in a range of tasks. The results suggest that reliably stronger activation during L2 processing is found (a) only for task-specific subgroups of L2 speakers and (b) within some, but not all regions…
The Use of Multi-Criteria Evaluation and Network Analysis in the Area Development Planning Process
2013-03-01
The purpose of this research was to develop improvements to the area development planning process. These plans are used to improve operations within an installation sub-section by altering the physical layout of facilities. One methodology was developed based on applying network analysis concepts to ... layouts. The alternative-layout scoring process, based in multi-criteria evaluation, returns a quantitative score for each alternative layout and a...
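A minimal sketch of a weighted-sum multi-criteria score for alternative layouts; the criteria, weights, and ratings are illustrative assumptions, since the report's actual scoring model is not reproduced here.

```python
# Score each alternative layout as a weighted sum of normalized criterion
# ratings (0-1, higher is better), then pick the best.
WEIGHTS = {"travel_distance": 0.5, "adjacency": 0.3, "expansion_room": 0.2}

def score(ratings):
    return sum(WEIGHTS[c] * r for c, r in ratings.items())

layouts = {
    "layout_A": {"travel_distance": 0.8, "adjacency": 0.6, "expansion_room": 0.4},
    "layout_B": {"travel_distance": 0.5, "adjacency": 0.9, "expansion_room": 0.7},
}
best = max(layouts, key=lambda k: score(layouts[k]))
print(best, score(layouts[best]))
```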
Global analysis of bacterial transcription factors to predict cellular target processes.
Doerks, Tobias; Andrade, Miguel A; Lathe, Warren; von Mering, Christian; Bork, Peer
2004-03-01
Whole-genome sequences are now available for >100 bacterial species, giving unprecedented power to comparative genomics approaches. We have applied genome-context methods to predict target processes that are regulated by transcription factors (TFs). Of 128 orthologous groups of proteins annotated as TFs, to date, 36 are functionally uncharacterized; in our analysis we predict a probable cellular target process or biochemical pathway for half of these functionally uncharacterized TFs.
Contractor relationships and inter-organizational strategies in NASA's R and D acquisition process
NASA Technical Reports Server (NTRS)
Guiltinan, J.
1976-01-01
Interorganizational analysis of NASA's acquisition process for research and development systems is discussed. The importance of understanding the contractor environment, constraints, and motives in selecting an acquisition strategy is demonstrated. By articulating clear project goals, by utilizing information about the contractor and his needs at each stage in the acquisition process, and by thorough analysis of the inter-organizational relationship, improved selection of acquisition strategies and business practices is possible.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Yunhua; Jones, Susanne B.; Biddy, Mary J.
2012-08-01
This study reports a comparison of biomass gasification-based syngas-to-distillate (S2D) systems using techno-economic analysis (TEA). Three cases, a state-of-technology (SOT) case, a goal case, and a conventional case, were compared in terms of performance and cost. The SOT and goal cases represent technology being developed at Pacific Northwest National Laboratory for a process starting with syngas and using a single-step dual-catalyst reactor for distillate generation (the S2D process). The conventional case mirrors the two-step S2D process previously utilized and reported by Mobil, using natural gas feedstock and consisting of separate syngas-to-methanol and methanol-to-gasoline (MTG) processes. Analysis of the three cases revealed that the goal case could indeed reduce fuel production cost relative to the conventional case, but that the SOT case was still more expensive than the conventional one. The SOT case suffers from low one-pass yield and high selectivity to light hydrocarbons, both of which drive up production cost. Sensitivity analysis indicated that light hydrocarbon yield, single-pass conversion efficiency, and reactor space velocity are the key factors driving the high cost of the SOT case.
Kryklywy, James H; Macpherson, Ewan A; Mitchell, Derek G V
2018-04-01
Emotion can have diverse effects on behaviour and perception, modulating function in some circumstances and sometimes having little effect. Recently, it was identified that part of the heterogeneity of emotional effects could be due to a dissociable representation of emotion in dual-pathway models of sensory processing. Our previous fMRI experiment using traditional univariate analyses showed that emotion modulated processing in the auditory 'what' but not 'where' processing pathway. The current study aims to further investigate this dissociation using the more recently emerging multi-voxel pattern analysis (MVPA) searchlight approach. While undergoing fMRI, participants localized sounds of varying emotional content. A searchlight MVPA was conducted to identify activity patterns predictive of sound location and/or emotion. Relative to the prior univariate analysis, MVPA indicated larger overlapping spatial and emotional representations of sound within early secondary regions associated with auditory localization. However, consistent with the univariate analysis, these two dimensions were increasingly segregated in late secondary and tertiary regions of the auditory processing streams. These results, while complementary to our original univariate analyses, highlight the utility of multiple analytic approaches for neuroimaging, particularly for neural processes with representations known to depend on population coding.
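A minimal sketch of the searchlight idea on a toy one-dimensional "brain" (real analyses use 3D spheres over fMRI volumes, often via packages such as nilearn); the data, labels, and radius below are simplified for illustration.

```python
# At each center voxel, train/test a classifier on the voxels inside a small
# neighborhood and record cross-validated accuracy; peaks mark informative
# local activity patterns.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_trials, n_voxels, radius = 80, 50, 3
X = rng.standard_normal((n_trials, n_voxels))
y = rng.integers(0, 2, n_trials)              # e.g., sound-location label
X[y == 1, 20:26] += 0.8                       # make voxels 20-25 informative

accuracy = np.zeros(n_voxels)
for center in range(n_voxels):
    lo, hi = max(0, center - radius), min(n_voxels, center + radius + 1)
    accuracy[center] = cross_val_score(LinearSVC(), X[:, lo:hi], y, cv=5).mean()
# Neighborhoods whose patterns predict the label stand out in `accuracy`.
```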
The paper discusses the analysis of trace-level organic combustion process emissions using novel multidimensional gas chromatography-mass spectrometry (MDGC-MS) procedures. It outlines the application of the technique through the analyses of various incinerator effluent and produ...
2009-09-01
SAS: Statistical Analysis Software; SE: Systems Engineering; SEP: Systems Engineering Process; SHP: Shaft Horsepower; SIGINT: Signals Intelligence. ...management occurs (OSD 2002). The Systems Engineering Process (SEP), displayed in Figure 2, is a comprehensive, iterative and recursive problem...
Kashiha, Mohammad Amin; Green, Angela R; Sales, Tatiana Glogerley; Bahr, Claudia; Berckmans, Daniel; Gates, Richard S
2014-10-01
Image processing systems have been widely used in monitoring livestock for many applications, including identification, tracking, behavior analysis, occupancy rates, and activity calculations. The primary goal of this work was to quantify image processing performance when monitoring laying hens by comparing length of stay in each compartment as detected by the image processing system with the actual occurrences registered by human observations. In this work, an image processing system was implemented and evaluated for use in an environmental animal preference chamber to detect hen navigation between 4 compartments of the chamber. One camera was installed above each compartment to produce top-view images of the whole compartment. An ellipse-fitting model was applied to captured images to detect whether the hen was present in a compartment. During a choice-test study, mean ± SD success detection rates of 95.9 ± 2.6% were achieved when considering total duration of compartment occupancy. These results suggest that the image processing system is currently suitable for determining the response measures for assessing environmental choices. Moreover, the image processing system offered a comprehensive analysis of occupancy while substantially reducing data processing time compared with the time-intensive alternative of manual video analysis. The above technique was used to monitor ammonia aversion in the chamber. As a preliminary pilot study, different levels of ammonia were applied to different compartments while hens were allowed to navigate between compartments. Using the automated monitor tool to assess occupancy, a negative trend of compartment occupancy with ammonia level was revealed, though further examination is needed. ©2014 Poultry Science Association Inc.
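A minimal sketch of the detection step with OpenCV, under assumed thresholds and a hypothetical file name: segment a top-view grayscale frame and fit an ellipse to the largest blob.

```python
# Threshold the frame, find the largest contour, and fit an ellipse; a hen is
# treated as "present" when a sufficiently large ellipse can be fitted.
import cv2

def hen_present(frame_gray, min_area=500):
    _, mask = cv2.threshold(frame_gray, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    big = [c for c in contours if cv2.contourArea(c) > min_area]
    if not big:
        return False, None
    largest = max(big, key=cv2.contourArea)
    if len(largest) < 5:                 # fitEllipse needs at least 5 points
        return False, None
    return True, cv2.fitEllipse(largest)

frame = cv2.imread("compartment_cam.png", cv2.IMREAD_GRAYSCALE)  # hypothetical
if frame is not None:
    present, ellipse = hen_present(frame)
```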
Develop Advanced Nonlinear Signal Analysis Topographical Mapping System
NASA Technical Reports Server (NTRS)
Jong, Jen-Yi
1997-01-01
During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis was developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability of Space Shuttle operations. These techniques can process and identify intelligent information hidden in a measured signal that is often unidentifiable using conventional signal analysis methods. Currently, due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis is performed manually, which requires immense man-hours and extensive human interaction. To overcome this manual process, NASA implemented this program to develop an Advanced nonlinear signal Analysis Topographical Mapping System (ATMS) to provide automatic, unsupervised engine diagnostic capabilities. The ATMS utilizes a rule-based CLIPS expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL performs automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS has been successfully developed under this contract. In summary, the program objectives to design, develop, test, and evaluate the performance of an automated engine diagnostic system have been achieved. Software implementation of the entire ATMS on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program lies in its fully automated coherence analysis capability for anomaly detection and identification, which can greatly enhance the power and reliability of engine diagnostic evaluation. The results have demonstrated that the ATMS can significantly save time and man-hours in performing engine test/flight data analysis and performance evaluation of large volumes of dynamic test data.
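One technique named above, coherence analysis between two sensor channels, can be sketched with scipy.signal.coherence; the signals and the 0.8 cutoff below are synthetic stand-ins for engine test data.

```python
# Magnitude-squared coherence between two channels sharing a common component.
import numpy as np
from scipy.signal import coherence

fs = 2000.0
t = np.arange(0, 4, 1 / fs)
shared = np.sin(2 * np.pi * 120 * t)               # common 120 Hz component
x = shared + 0.5 * np.random.randn(t.size)
y = shared + 0.5 * np.random.randn(t.size)

f, Cxy = coherence(x, y, fs=fs, nperseg=1024)
peaks = f[Cxy > 0.8]       # frequencies where the channels are strongly coupled
```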
Inauen, A; Jenny, G J; Bauer, G F
2012-06-01
This article focuses on organizational analysis in workplace health promotion (WHP) projects. It shows how this analysis can be designed so that it provides rational data relevant to further context-specific and goal-oriented planning of WHP, while equally supporting the individual and organizational change processes implied by WHP. Design principles for organizational analysis were developed on the basis of a narrative review of the guiding principles of WHP interventions and organizational change, as well as the scientific principles of data collection. Further, the practical experience of WHP consultants who routinely conduct organizational analyses was considered. This resulted in a framework of data-oriented and change-oriented design principles addressing the following elements of organizational analysis in WHP: planning the overall procedure, data content, data-collection methods, and information processing. Overall, the data-oriented design principles aim to produce valid, reliable, and representative data, whereas the change-oriented design principles aim to promote motivation, coherence, and a capacity for self-analysis. We expect that the simultaneous consideration of data- and change-oriented design principles for organizational analysis will strongly support the WHP process. We finally illustrate the applicability of the design principles to health promotion within a WHP case study.