NASA Astrophysics Data System (ADS)
Yussup, N.; Rahman, N. A. A.; Ibrahim, M. M.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.
2017-01-01
The Neutron Activation Analysis (NAA) process has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s. Most of the established procedures, especially from sample registration to sample analysis, are performed manually. These manual procedures carried out by the NAA laboratory personnel are time consuming and inefficient. Hence, software to support system automation was developed to provide an effective method to replace redundant manual data entries and produce a faster sample analysis and calculation process. This paper describes the design and development of automation software for the NAA process, which consists of three sub-programs: sample registration; hardware control and data acquisition; and sample analysis. The data flow and connections between the sub-programs are explained. The software was developed using the National Instruments LabVIEW development package.
An IBM PC/AT-Based Image Acquisition and Processing System for Quantitative Image Analysis
NASA Astrophysics Data System (ADS)
Kim, Yongmin; Alexander, Thomas
1986-06-01
In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.
Analysis of complex decisionmaking processes. [with application to jet engine development
NASA Technical Reports Server (NTRS)
Hill, J. D.; Ollila, R. G.
1978-01-01
The analysis of corporate decisionmaking processes related to major system developments is unusually difficult because of the number of decisionmakers involved in the process and the long development cycle. A method for analyzing such decision processes is developed and illustrated through its application to the analysis of the commercial jet engine development process. The method uses interaction matrices as the key tool for structuring the problem, recording data, and analyzing the data to establish the rank order of the major factors affecting development decisions. In the example, the use of interaction matrices permitted analysts to collect and analyze approximately 50 factors that influenced decisions during the four phases of the development cycle, and to determine the key influencers of decisions at each development phase. The results of this study indicate that the cost of new technology installed on an aircraft is the prime concern of the engine manufacturer.
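The interaction-matrix technique lends itself to a compact illustration. Below is a minimal sketch in Python, with an invented five-factor influence matrix standing in for the roughly 50 factors of the study; ranking factors by row sums is one simple way to surface the "key influencers" the abstract mentions.

```python
import numpy as np

# Hypothetical 5-factor influence matrix: entry [i, j] = 1 if factor i
# influences factor j during a development phase, else 0.
factors = ["unit cost", "fuel burn", "noise", "reliability", "schedule"]
M = np.array([
    [0, 1, 0, 1, 1],
    [1, 0, 0, 0, 0],
    [0, 0, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [1, 0, 0, 0, 0],
])

# Rank factors by how many others they influence (row sums),
# a crude proxy for identifying key influencers of decisions.
scores = M.sum(axis=1)
for name, s in sorted(zip(factors, scores), key=lambda t: -t[1]):
    print(f"{name}: influences {s} other factor(s)")
```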
Introduction of male circumcision for HIV prevention in Uganda: analysis of the policy process.
Odoch, Walter Denis; Kabali, Kenneth; Ankunda, Racheal; Zulu, Joseph Mumba; Tetui, Moses
2015-06-20
Health policy analysis is important for all health policies especially in fields with ever changing evidence-based interventions such as HIV prevention. However, there are few published reports of health policy analysis in sub-Saharan Africa in this field. This study explored the policy process of the introduction of male circumcision (MC) for HIV prevention in Uganda in order to inform the development processes of similar health policies. Desk review of relevant documents was conducted between March and May 2012. Thematic analysis was used to analyse the data. Conceptual frameworks that demonstrate the interrelationship within the policy development processes and influence of actors in the policy development processes guided the analysis. Following the introduction of MC on the national policy agenda in 2007, negotiation and policy formulation preceded its communication and implementation. Policy proponents included academic researchers in the early 2000s and development partners around 2007. Favourable contextual factors that supported the development of the policy included the rising HIV prevalence, adoption of MC for HIV prevention in other sub-Saharan African countries, and expertise on MC. Additionally, the networking capability of proponents facilitated the change in position of non-supportive or neutral actors. Non-supportive and neutral actors in the initial stages of the policy development process included the Ministry of Health, traditional and Muslim leaders, and the Republican President. Using political authority, legitimacy, and charisma, actors who opposed the policy tried to block the policy development process. Researchers' initial disregard of the Ministry of Health in the research process of MC and the missing civil society advocacy arm contributed to delays in the policy development process. This study underscores the importance of securing top political leadership as well as key implementing partners' support in policy development processes. Equally important is the appreciation of the various forms of actors' power and how such power shapes the policy agenda, development process, and content.
STATISTICAL ANALYSIS OF SNAP 10A THERMOELECTRIC CONVERTER ELEMENT PROCESS DEVELOPMENT VARIABLES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitch, S.H.; Morris, J.W.
1962-12-15
Statistical analysis, primarily analysis of variance, was applied to evaluate several factors involved in the development of suitable fabrication and processing techniques for the production of lead telluride thermoelectric elements for the SNAP 10A energy conversion system. The analysis methods are described as to their application for determining the effects of various processing steps, establishing the value of individual operations, and evaluating the significance of test results. The elimination of unnecessary or detrimental processing steps was accomplished and the number of required tests was substantially reduced by application of these statistical methods to the SNAP 10A production development effort. (auth)
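As an illustration of the kind of variance analysis described, here is a minimal one-way ANOVA sketch; the batch data and the single processing-step factor are invented, not values from the SNAP 10A record.

```python
from scipy import stats

# Hypothetical Seebeck-coefficient measurements (uV/K) from elements
# fabricated with three different settings of one processing step.
batch_a = [168, 172, 170, 169, 171]
batch_b = [175, 178, 176, 177, 174]
batch_c = [169, 170, 168, 171, 170]

# One-way ANOVA: does the processing step significantly affect output?
f_stat, p_value = stats.f_oneway(batch_a, batch_b, batch_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Processing step has a statistically significant effect.")
else:
    print("No significant effect; the step may be unnecessary.")
```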
Self-conscious robotic system design process--from analysis to implementation.
Chella, Antonio; Cossentino, Massimo; Seidita, Valeria
2011-01-01
Developing robotic systems endowed with self-conscious capabilities means realizing complex sub-systems that need ad-hoc software engineering techniques for their modelling, analysis, and implementation. In this chapter, the whole process (from analysis to implementation) used to model the development of self-conscious robotic systems is presented, and the newly created design process, PASSIC, which supports each part of it, is fully illustrated.
2011-12-01
systems engineering technical and technical management processes. Technical Planning, Stakeholders Requirements Development, and Architecture Design were... Stakeholder Requirements Definition, Architecture Design, and Technical Planning. A purposive sampling of AFRL rapid development program managers and engineers... emphasize one process over another; however, Architecture Design and Implementation scored higher among Technical Processes. Decision Analysis, Technical
Schaub, Jochen; Clemens, Christoph; Kaufmann, Hitto; Schulz, Torsten W
2012-01-01
Development of efficient bioprocesses is essential for cost-effective manufacturing of recombinant therapeutic proteins. To achieve further process improvement and process rationalization, comprehensive analysis of both process data and phenotypic cell-level data is essential. Here, we present a framework for advanced bioprocess data analysis consisting of multivariate data analysis (MVDA), metabolic flux analysis (MFA), and pathway analysis for mapping of large-scale gene expression data sets. This data analysis platform was applied in a process development project with an IgG-producing Chinese hamster ovary (CHO) cell line in which the maximal product titer could be increased from about 5 to 8 g/L. Principal component analysis (PCA), k-means clustering, and partial least-squares (PLS) models were applied to analyze the macroscopic bioprocess data. MFA and gene expression analysis revealed intracellular information on the characteristics of high-performance cell cultivations. By MVDA, for example, correlations between several essential amino acids and the product concentration were observed. Also, a grouping into cell-specific-productivity-driven and process-control-driven processes could be unraveled. By MFA, phenotypic characteristics in glycolysis, glutaminolysis, the pentose phosphate pathway, the citrate cycle, the coupling of amino acid metabolism to the citrate cycle, and the energy yield could be identified. By gene expression analysis, 247 deregulated metabolic genes were identified, which are involved, inter alia, in amino acid metabolism, transport, and protein synthesis.
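A minimal sketch of the MVDA portion of such a framework, using scikit-learn on synthetic stand-in data; the variable count, titer model, and cluster count are assumptions, not values from the study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for macroscopic bioprocess data:
# 30 cultivation runs x 8 process variables (e.g., amino acid levels),
# with product titer correlated to the first two variables.
X = rng.normal(size=(30, 8))
titer = 5.0 + 1.5 * X[:, 0] + 0.8 * X[:, 1] + rng.normal(scale=0.3, size=30)

# PCA for an overview of run-to-run variability.
scores = PCA(n_components=2).fit_transform(X)

# k-means to group runs (e.g., productivity-driven vs control-driven).
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)

# PLS to relate process variables to product titer.
pls = PLSRegression(n_components=2).fit(X, titer)
print("cluster labels:", labels)
print("PLS R^2:", pls.score(X, titer).round(3))
```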
The Use of Multi-Criteria Evaluation and Network Analysis in the Area Development Planning Process
2013-03-01
layouts. The alternative layout scoring process, based on multi-criteria evaluation, returns a quantitative score for each alternative layout and a... The purpose of this research was to develop improvements to the area development planning process. These plans are used to improve operations within... an installation sub-section by altering the physical layout of facilities. One methodology was developed based on applying network analysis concepts to
Huang, Jun; Kaul, Goldi; Cai, Chunsheng; Chatlapalli, Ramarao; Hernandez-Abad, Pedro; Ghosh, Krishnendu; Nagi, Arwinder
2009-12-01
To facilitate an in-depth process understanding, and offer opportunities for developing control strategies to ensure product quality, a combination of experimental design, optimization and multivariate techniques was integrated into the process development of a drug product. A process DOE was used to evaluate effects of the design factors on manufacturability and final product CQAs, and establish design space to ensure desired CQAs. Two types of analyses were performed to extract maximal information, DOE effect & response surface analysis and multivariate analysis (PCA and PLS). The DOE effect analysis was used to evaluate the interactions and effects of three design factors (water amount, wet massing time and lubrication time), on response variables (blend flow, compressibility and tablet dissolution). The design space was established by the combined use of DOE, optimization and multivariate analysis to ensure desired CQAs. Multivariate analysis of all variables from the DOE batches was conducted to study relationships between the variables and to evaluate the impact of material attributes/process parameters on manufacturability and final product CQAs. The integrated multivariate approach exemplifies application of QbD principles and tools to drug product and process development.
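For readers unfamiliar with DOE effect analysis, the sketch below fits main effects and two-factor interactions for the three named design factors; the factorial runs and dissolution values are invented for illustration.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical 2^3 full-factorial DOE in coded units (-1/+1) for the three
# design factors from the abstract; 'dissolution' is a made-up response.
runs = pd.DataFrame({
    "water":   [-1, 1, -1, 1, -1, 1, -1, 1],
    "massing": [-1, -1, 1, 1, -1, -1, 1, 1],
    "lube":    [-1, -1, -1, -1, 1, 1, 1, 1],
    "dissolution": [82, 88, 80, 91, 79, 86, 77, 90],
})

# Main effects plus two-factor interactions, as in a DOE effect analysis.
model = smf.ols(
    "dissolution ~ water * massing + water * lube + massing * lube",
    data=runs,
).fit()
print(model.params.round(2))  # for coded factors, effect = 2 * coefficient
```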
NASA Astrophysics Data System (ADS)
Kurniati, D. R.; Rohman, I.
2018-05-01
This study aims to analyze the concepts and science process skills in the bomb calorimeter experiment as a basis for developing a virtual bomb calorimeter laboratory. The study employed the research and development (R&D) method to answer the proposed problems. This paper discusses the concept and process skills analysis. The essential concepts and process skills associated with the bomb calorimeter were analyzed by optimizing the bomb calorimeter experiment. The concept analysis found seven fundamental concepts to be considered in developing the virtual laboratory: internal energy, heat of combustion, complete combustion, incomplete combustion, calorimeter constant, bomb calorimeter, and Black's principle. Since the concepts of the bomb calorimeter and of complete and incomplete combustion describe real situations and contain controllable variables, they are displayed in the virtual laboratory as simulations. Meanwhile, the remaining four concepts are presented as animations because they contain no controllable variables. The process skills analysis identified four notable skills to be developed: observation, experiment design, interpretation, and communication.
CPAS Preflight Drop Test Analysis Process
NASA Technical Reports Server (NTRS)
Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.
2015-01-01
Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.
Process correlation analysis model for process improvement identification.
Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong
2014-01-01
Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used as the base throughout the process of software process improvement. CMMI defines a set of process areas involved in software development and what is to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
Analysis of the Technical Writing Profession through the DACUM Process.
ERIC Educational Resources Information Center
Nolan, Timothy; Green, Marc
To help develop a curriculum program for technical writers, Cincinnati Technical College used the Developing a Curriculum (DACUM) method to produce a technical writing skills profile. DACUM develops an occupation analysis through a modified brainstorming process by a panel of expert workers under the direction of a qualified coordinator. This…
Develop Advanced Nonlinear Signal Analysis Topographical Mapping System
NASA Technical Reports Server (NTRS)
Jong, Jen-Yi
1997-01-01
During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis was developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability of Space Shuttle operations. These techniques can process and identify intelligent information hidden in a measured signal which is often unidentifiable using conventional signal analysis methods. Currently, due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis is performed manually, which requires immense man-hours and extensive human interaction. To overcome this manual process, NASA implemented this program to develop an Advanced nonlinear signal Analysis Topographical Mapping System (ATMS) to provide automatic/unsupervised engine diagnostic capabilities. The ATMS utilizes a rule-based CLIPS expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL performs automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS has been successfully developed under this contract. In summary, the program objectives to design, develop, test, and conduct performance evaluation for an automated engine diagnostic system have been successfully achieved. Software implementation of the entire ATMS on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program is attributed to its fully automated coherence analysis capability for anomaly detection and identification, which can greatly enhance the power and reliability of engine diagnostic evaluation. The results have demonstrated that ATMS can significantly save time and man-hours in performing engine test/flight data analysis and performance evaluation of large volumes of dynamic test data.
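Coherence analysis, the capability this record highlights, can be illustrated in a few lines. This sketch uses synthetic two-channel vibration data; the frequencies, noise levels, and channel model are assumptions, not SSME measurements.

```python
import numpy as np
from scipy.signal import coherence

fs = 10_000.0                      # sample rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(1)

# Two synthetic accelerometer channels sharing a 600 Hz component,
# standing in for engine vibration measurements.
common = np.sin(2 * np.pi * 600 * t)
x = common + 0.5 * rng.normal(size=t.size)
y = 0.8 * common + 0.5 * rng.normal(size=t.size)

# Magnitude-squared coherence; values near 1 flag a shared (possibly
# anomalous) source, the quantity ATMS-style diagnostics screen for.
f, Cxy = coherence(x, y, fs=fs, nperseg=1024)
peak = f[np.argmax(Cxy)]
print(f"peak coherence {Cxy.max():.2f} at {peak:.0f} Hz")
```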
Techno-economic analysis: process model development for existing and conceptual processes; detailed heat integration; economic analysis of integrated processes; integration of process simulation learnings into control. Conceptual Process Design and Techno-Economic Assessment of Ex Situ Catalytic Fast Pyrolysis of Biomass: A
NASA Technical Reports Server (NTRS)
Griesel, Martha Ann
1988-01-01
Several Laboratory software development projects that followed nonstandard development processes, which were hybrids of incremental development and prototyping, are being studied. Factors in the project environment leading to the decision to use a nonstandard development process and affecting its success are analyzed. A simple characterization of project environment based on this analysis is proposed, together with software development approaches which have been found effective for each category. These approaches include both documentation and review requirements.
TOOKUIL: A case study in user interface development for safety code application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, D.L.; Harkins, C.K.; Hoole, J.G.
1997-07-01
Traditionally, there has been a very high learning curve associated with using nuclear power plant (NPP) analysis codes. Even for seasoned plant analysts and engineers, the process of building or modifying an input model for present day NPP analysis codes is tedious, error prone, and time consuming. Current cost constraints and performance demands place an additional burden on today's safety analysis community. Advances in graphical user interface (GUI) technology have been applied to obtain significant productivity and quality assurance improvements for Transient Reactor Analysis Code (TRAC) input model development. KAPL Inc. has developed an X Windows-based graphical user interface named TOOKUIL which supports the design and analysis process, acting as a preprocessor, runtime editor, help system, and post processor for TRAC. This paper summarizes the objectives of the project, the GUI development process and experiences, and the resulting end product, TOOKUIL.
Modeling and Analysis of the Reverse Water Gas Shift Process for In-Situ Propellant Production
NASA Technical Reports Server (NTRS)
Whitlow, Jonathan E.
2000-01-01
This report focuses on the mathematical models and simulation tools developed for the Reverse Water Gas Shift (RWGS) process, a candidate technology for oxygen production on Mars under the In-Situ Propellant Production (ISPP) project. An analysis of the RWGS process was performed using a material balance for the system. The material balance is very complex due to the downstream separations and subsequent recycle inherent in the process. A numerical simulation was developed for the RWGS process to provide a tool for analysis and optimization of experimental hardware, which will be constructed later this year at Kennedy Space Center (KSC). Attempts to solve the material balance for the system, which can be defined by 27 nonlinear equations, initially failed. A convergence scheme was developed which led to a successful solution of the material balance; however, the simplified equations used for the gas separation membrane were found insufficient. Additional, more rigorous models were successfully developed and solved for the membrane separation. Sample results from these models are included in this report, with recommendations for the experimental work needed for model validation.
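A heavily reduced illustration of solving such a balance numerically: a single equilibrium relation for the RWGS reaction solved with scipy's fsolve. The equilibrium constant and the equimolar feed are assumed values, and the full 27-equation balance with separations and recycle is not reproduced.

```python
from scipy.optimize import fsolve

# Reverse water-gas shift: CO2 + H2 <-> CO + H2O.
# K_eq below is an assumed illustrative value, not measured data.
K_eq = 0.1
n_co2_in, n_h2_in = 1.0, 1.0   # mol, equimolar feed (assumption)

def residual(x):
    """Equilibrium residual as a function of reaction extent x (mol)."""
    n_co2, n_h2 = n_co2_in - x, n_h2_in - x
    n_co, n_h2o = x, x
    # Total moles are constant (delta n = 0), so mole terms can be used.
    return n_co * n_h2o - K_eq * n_co2 * n_h2

x_eq, = fsolve(residual, x0=0.2)
print(f"equilibrium extent: {x_eq:.3f} mol "
      f"({100 * x_eq / n_co2_in:.1f}% CO2 conversion)")
```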
Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Nagpal, Vinod K.
2007-01-01
An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.
Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants
NASA Astrophysics Data System (ADS)
Kulbjakina, A. V.; Dolotovskij, I. V.
2018-01-01
The article discusses the methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are major consumers of energy resources (ER) for their own needs. Reducing ER consumption, including fuel, and developing a rational fuel system structure are complex and relevant scientific tasks that can only be accomplished using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, the block scheme for the synthesis of the most efficient alternative of the fuel system using mathematical models, and the set of performance criteria have been developed for the main stages of the study. Results from the introduction of specific engineering solutions to develop in-house energy supply sources for RH processing facilities are provided.
NASA Technical Reports Server (NTRS)
Amundsen, R. M.; Feldhaus, W. S.; Little, A. D.; Mitchum, M. V.
1995-01-01
Electronic integration of design and analysis processes was achieved and refined at Langley Research Center (LaRC) during the development of an optical bench for a laser-based aerospace experiment. Mechanical design has been integrated with thermal, structural, and optical analyses. Electronic import of the model geometry eliminates the repetitive steps of geometry input to develop each analysis model, leading to faster and more accurate analyses. Guidelines for integrated model development are given. This integrated analysis process has been built around software that was already in use by designers and analysts at LaRC. The process as currently implemented uses Pro/Engineer for design, Pro/Manufacturing for fabrication, PATRAN for solid modeling, NASTRAN for structural analysis, SINDA-85 and P/Thermal for thermal analysis, and Code V for optical analysis. Currently, the only analysis model to be built manually is the Code V model; all others can be imported from the Pro/E geometry. The translator from PATRAN results to Code V optical analysis (PATCOD) was developed and tested at LaRC. Directions for use of the translator and the other models are given.
Model prototype utilization in the analysis of fault tolerant control and data processing systems
NASA Astrophysics Data System (ADS)
Kovalev, I. V.; Tsarev, R. Yu; Gruzenkin, D. V.; Prokopenko, A. V.; Knyazkov, A. N.; Laptenok, V. D.
2016-04-01
A procedure for assessing the profit of implementing a control and data processing system is presented in the paper. The reasonability of creating and analyzing a model prototype follows from implementing fault tolerance through the inclusion of structural and software redundancy. The developed procedure allows finding the best ratio between the cost of developing and analyzing the model prototype and the earnings from its utilization and the information produced. The suggested approach is illustrated by a model example of profit assessment and analysis for a control and data processing system.
Risk analysis within environmental impact assessment of proposed construction activity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeleňáková, Martina; Zvijáková, Lenka
Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of environmental impacts of selected constructions – flood protection structures using risk analysis methods. The application of methodology designed for the process of environmental impact assessment will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process. Through the use of risk analysis methods in the environmental impact assessment process, the set objective has been achieved. Highlights: This paper is informed by an effort to develop research with the aim of:
• Improving existing qualitative and quantitative methods for assessing the impacts
• A better understanding of relations between probabilities and consequences
• Methodology for the EIA of flood protection constructions based on risk analysis
• Creative approaches in the search for environmentally friendly proposed activities
Retinal imaging analysis based on vessel detection.
Jamal, Arshad; Hazim Alkawaz, Mohammed; Rehman, Amjad; Saba, Tanzila
2017-07-01
With the increasing advancement of digital imaging and computing power, computationally intelligent technologies are in high demand in ophthalmology care and treatment. In the current research, Retina Image Analysis (RIA) is developed for optometrists at the Eye Care Center of Management and Science University. This research aims to analyze the retina through vessel detection. The RIA assists in the analysis of retinal images, and specialists are served with various options like saving, processing, and analyzing retinal images through its advanced interface layout. Additionally, RIA assists in selecting vessel segments, processing these vessels by calculating their diameter, standard deviation, and length, and displaying the detected vessels on the retina. The Agile Unified Process is adopted as the methodology in developing this research. To conclude, Retina Image Analysis might help the optometrist to gain a better understanding when analyzing the patient's retina. Finally, the Retina Image Analysis procedure is developed using MATLAB (R2011b). Promising results are attained that are comparable to the state of the art. © 2017 Wiley Periodicals, Inc.
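A rough open-source analogue of the vessel-detection steps described (the record's tool is MATLAB-based; this sketch uses scikit-image, whose sample retina image is fetched on first use, and the filter choice and thresholds are assumptions, not the tool's actual pipeline):

```python
from skimage import data, filters, morphology
from scipy import ndimage

# scikit-image's sample retina image stands in for a patient fundus photo;
# the green channel shows vessels with the best contrast.
retina = data.retina()[..., 1] / 255.0

# Frangi vesselness filter -> threshold -> remove small specks.
vesselness = filters.frangi(retina)
mask = vesselness > filters.threshold_otsu(vesselness)
mask = morphology.remove_small_objects(mask, min_size=64)

# Skeleton length approximates total vessel length (in pixels);
# the distance transform sampled along the skeleton gives local radius.
skeleton = morphology.skeletonize(mask)
radius = ndimage.distance_transform_edt(mask)
length_px = int(skeleton.sum())
mean_diameter_px = 2.0 * radius[skeleton].mean()
print(f"vessel length ~{length_px} px, mean diameter ~{mean_diameter_px:.1f} px")
```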
An Evidential Reasoning-Based CREAM to Human Reliability Analysis in Maritime Accident Process.
Wu, Bing; Yan, Xinping; Wang, Yang; Soares, C Guedes
2017-10-01
This article proposes a modified cognitive reliability and error analysis method (CREAM) for estimating the human error probability in the maritime accident process on the basis of an evidential reasoning approach. This modified CREAM is developed to precisely quantify the linguistic variables of the common performance conditions and to overcome the problem of ignoring the uncertainty caused by incomplete information in the existing CREAM models. Moreover, this article views maritime accident development from the sequential perspective, where a scenario- and barrier-based framework is proposed to describe the maritime accident process. This evidential reasoning-based CREAM approach together with the proposed accident development framework are applied to human reliability analysis of a ship capsizing accident. It will facilitate subjective human reliability analysis in different engineering systems where uncertainty exists in practice. © 2017 Society for Risk Analysis.
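A simplified sketch of the basic CREAM screening logic that the article modifies; the CPC ratings below are invented, and the control-mode assignment is a coarse stand-in for the published two-dimensional lookup diagram, though the generic probability intervals follow the basic method.

```python
# Simplified screening version of basic CREAM: rate each common
# performance condition (CPC) as improving (+1), insignificant (0),
# or reducing (-1) reliability, then map the balance to a control mode
# and its generic failure-probability interval. Ratings are invented.
cpc_ratings = {
    "adequacy of organisation": 0,
    "working conditions": -1,
    "adequacy of MMI": -1,
    "availability of procedures": 0,
    "number of simultaneous goals": -1,
    "available time": -1,
    "time of day": 0,
    "adequacy of training": +1,
    "crew collaboration quality": 0,
}

reduced = sum(1 for v in cpc_ratings.values() if v < 0)
improved = sum(1 for v in cpc_ratings.values() if v > 0)

# Very coarse mode assignment on the reducing-CPC count alone;
# the published method uses a lookup over both counts.
if reduced >= 7:
    mode, interval = "scrambled", (1e-1, 1.0)
elif reduced >= 4:
    mode, interval = "opportunistic", (1e-2, 0.5)
elif reduced >= 1:
    mode, interval = "tactical", (1e-3, 1e-1)
else:
    mode, interval = "strategic", (0.5e-5, 1e-2)

print(f"{reduced} reducing / {improved} improving CPCs -> {mode} mode,")
print(f"generic failure probability in [{interval[0]:g}, {interval[1]:g}]")
```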
Welding process modelling and control
NASA Technical Reports Server (NTRS)
Romine, Peter L.; Adenwala, Jinen A.
1993-01-01
The research and analysis performed, the software developed, and the hardware/software recommendations made during 1992 in developing the PC-based data acquisition system for support of Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not the intention for this system to be used for welding process control.
Thermochemical Conversion Techno-Economic Analysis
NREL's Thermochemical Conversion Analysis team focuses on conceptual process design and techno-economic analysis. Detailed process models and TEA developed under this project provide insights into the potential economic
Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.
ERIC Educational Resources Information Center
Carlson, David H.
1986-01-01
This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shekiro, Joe; Elander, Richard
2015-12-01
The purpose of this cooperative work agreement between General Mills Inc. (GMI) and NREL is to determine the feasibility of producing a valuable food ingredient (xylo-oligosaccharides or XOS), a highly soluble fiber material, from agricultural waste streams, at an advantaged cost level relative to similar existing ingredients. The scope of the project includes pilot-scale process development (Task 1), compositional analysis (Task 2), and techno-economic analysis (Task 3).
System enhancements of Mesoscale Analysis and Space Sensor (MASS) computer system
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.
1985-01-01
The interactive information processing for the Mesoscale Analysis and Space Sensor (MASS) program is reported, including the development and implementation of new spaceborne remote sensing technology to observe and measure atmospheric processes. The space measurements and conventional observational data are processed together to gain an improved understanding of the mesoscale structure and dynamical evolution of the atmosphere relative to cloud development and precipitation processes. A Research Computer System consisting of three primary computers (HP-1000F, Perkin-Elmer 3250, and Harris/6) was developed, which provides a wide range of capabilities for interactively processing and displaying large volumes of remote sensing data. The development of a MASS database management and analysis system on the HP-1000F computer, and the extension of these capabilities through integration with the Perkin-Elmer and Harris/6 computers using MSFC's Apple III microcomputer workstations, are described. The objectives are to design hardware enhancements for computer integration and to provide data conversion and transfer between machines.
The impact of distributed computing on education
NASA Technical Reports Server (NTRS)
Utku, S.; Lestingi, J.; Salama, M.
1982-01-01
In this paper, developments in digital computer technology since the early Fifties are reviewed briefly, and the parallelism which exists between these developments and developments in analysis and design procedures of structural engineering is identified. The recent trends in digital computer technology are examined in order to establish the fact that distributed processing is now an accepted philosophy for further developments. The impact of this on the analysis and design practices of structural engineering is assessed by first examining these practices from a data processing standpoint to identify the key operations and data bases, and then fitting them to the characteristics of distributed processing. The merits and drawbacks of the present philosophy in educating structural engineers are discussed and projections are made for the industry-academia relations in the distributed processing environment of structural analysis and design. An ongoing experiment of distributed computing in a university environment is described.
The project office of the Gaia Data Processing and Analysis Consortium
NASA Astrophysics Data System (ADS)
Mercier, E.; Els, S.; Gracia, G.; O'Mullane, W.; Lock, T.; Comoretto, G.
2010-07-01
Gaia is Europe's future astrometry satellite which is currently under development. The data collected by Gaia will be treated and analyzed by the "Data Processing and Analysis Consortium" (DPAC). DPAC consists of over 400 scientists in more than 22 countries, which are currently developing the required data reduction, analysis and handling algorithms and routines. DPAC is organized in Coordination Units (CU's) and Data Processing Centres (DPCs). Each of these entities is individually responsible for the development of software for the processing of the different data. In 2008, the DPAC Project Office (PO) has been set-up with the task to manage the day-to-day activities of the consortium including implementation, development and operations. This paper describes the tasks DPAC faces and the role of the DPAC PO in the Gaia framework and how it supports the DPAC entities in their effort to fulfill the Gaia promise.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zitney, S.E.; McCorkle, D.; Yang, C.
Process modeling and simulation tools are widely used for the design and operation of advanced power generation systems. These tools enable engineers to solve the critical process systems engineering problems that arise throughout the lifecycle of a power plant, such as designing a new process, troubleshooting a process unit or optimizing operations of the full process. To analyze the impact of complex thermal and fluid flow phenomena on overall power plant performance, the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) has developed the Advanced Process Engineering Co-Simulator (APECS). The APECS system is an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulations such as those based on computational fluid dynamics (CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper we discuss the initial phases of the integration of the APECS system with the immersive and interactive virtual engineering software, VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite uses the ActiveX (OLE Automation) controls in the Aspen Plus process simulator wrapped by the CASI library developed by Reaction Engineering International to run process/CFD co-simulations and query for results. This integration represents a necessary step in the development of virtual power plant co-simulations that will ultimately reduce the time, cost, and technical risk of developing advanced power generation systems.
Wójcicki, Tomasz; Nowicki, Michał
2016-01-01
The article presents a selected area of research and development concerning methods of material analysis based on automatic image recognition of the investigated metallographic sections. The objectives of the analyses of materials for gas nitriding technology are described. The methods of preparation of nitrided layers, the steps of the process, and the construction and operation of devices for gas nitriding are given. We discuss the possibility of using digital image processing methods in the analysis of the materials, as well as their essential task groups: improving the quality of the images, segmentation, morphological transformations, and image recognition. The developed analysis model of nitrided layer formation, covering image processing and analysis techniques as well as selected methods of artificial intelligence, is presented. The model is divided into stages, which are formalized in order to better reproduce their actions. Validation of the presented method is performed. The advantages and limitations of the developed solution, as well as the possibilities of its practical use, are listed. PMID:28773389
Logistics Process Analysis Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
2008-03-31
LPAT is the resulting integrated system between the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.
Tornado detection data reduction and analysis
NASA Technical Reports Server (NTRS)
Davisson, L. D.
1977-01-01
Data processing and analysis were provided in support of tornado detection by analysis of radio frequency interference in various frequency bands. Sea state determination data from short pulse radar measurements were also processed and analyzed. A backscatter simulation was implemented to predict radar performance as a function of wind velocity. Computer programs were developed for the various data processing and analysis goals of the effort.
Human Factors Operability Timeline Analysis to Improve the Processing Flow of the Orion Spacecraft
NASA Technical Reports Server (NTRS)
Schlierf, Roland; Stambolian, Damon B.; Miller, Darcy; Posanda, Juan; Haddock, Mike; Haddad, Mike; Tran, Donald; Henderson, Gena; Barth, Tim
2010-01-01
The Constellation Program (CxP) Orion vehicle goes through several areas and stages of processing before it is launched at the Kennedy Space Center. In order to achieve efficient and effective processing, all of the activities need to be analyzed. This was accomplished by first developing a timeline of events that included each activity; each activity was then analyzed by operability experts and human factors experts with spacecraft processing experience. This paper's focus is to explain the results and the process for developing this human factors operability timeline analysis to improve the processing flow of Orion.
Coal liquefaction processes and development requirements analysis for synthetic fuels production
NASA Technical Reports Server (NTRS)
1980-01-01
Focus of the study is on: (1) developing a technical and programmatic data base on direct and indirect liquefaction processes which have potential for commercialization during the 1980's and beyond, and (2) performing analyses to assess technology readiness and development trends, development requirements, commercial plant costs, and projected synthetic fuel costs. Numerous data sources and references were used as the basis for the analysis results and information presented.
Development of Low-cost, High Energy-per-unit-area Solar Cell Modules
NASA Technical Reports Server (NTRS)
Jones, G. T.; Chitre, S.; Rhee, S. S.
1978-01-01
The development of two hexagonal solar cell process sequences, a laser-scribing technique for scribing hexagonal and modified hexagonal solar cells, a large-throughput diffusion process, and two surface macrostructure processes suitable for large-scale production is reported. Experimental analyses were performed on automated spin-on anti-reflective coating equipment and high-pressure wafer cleaning equipment. Six hexagonal solar cell modules were fabricated. Also covered is a detailed theoretical analysis of optimum silicon utilization by modified hexagonal solar cells.
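The silicon-utilization argument for hexagonal cells reduces to simple geometry: comparing the largest square and largest regular hexagon inscribed in a circular wafer. A quick check with a normalized radius (not figures from the report):

```python
import math

r = 1.0  # wafer radius (normalized)

wafer_area = math.pi * r**2
square = 2 * r**2                        # largest inscribed square
hexagon = (3 * math.sqrt(3) / 2) * r**2  # largest inscribed regular hexagon

print(f"square cell uses    {square / wafer_area:.1%} of the wafer")
print(f"hexagonal cell uses {hexagon / wafer_area:.1%} of the wafer")
```

This yields roughly 63.7% utilization for a square cell versus 82.7% for a hexagonal one, which is the kind of gain that motivates hexagonal geometries.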
Conjecturing and Generalization Process on The Structural Development
NASA Astrophysics Data System (ADS)
Ni'mah, Khomsatun; Purwanto; Bambang Irawan, Edy; Hidayanto, Erry
2017-06-01
This study aims to describe the conjecturing and generalization processes of structural development in thirty grade-8 middle school children solving pattern problems. The data were processed using qualitative analysis techniques and were obtained through direct observation, documentation, and interviews. This study builds on the research of Mulligan et al. (2012), which resulted in five structural development stages: prestructural, emergent, partial, structural, and advanced. The data analysis found two related phenomena: the conjecturing process and the generalization process. During the conjecturing process, the children appropriately formed hypotheses about the pattern problems in two phases: numerically and symbolically. During the generalization process, the children were able to relate the pattern rule from the conjecturing process to other contexts.
Airport Planning and Development Process: Analysis and Documentation Report
DOT National Transportation Integrated Search
1997-01-01
The Federal Aviation Administration (FAA) is facing extreme resource constraints : and increasing demands on the aviation system. The Airport Planning and : Development Process (APDP) links organizations, people, and processes together : to provide c...
NASA Astrophysics Data System (ADS)
McNeese, L. E.
1981-01-01
Increased utilization of coal and other fossil fuel alternatives as sources of clean energy is reported. The following topics are discussed: coal conversion development, chemical research and development, materials technology, component development and process evaluation studies, technical support to major liquefaction projects, process analysis and engineering evaluations, fossil energy environmental analysis, flue gas desulfurization, solid waste disposal, coal preparation waste utilization, plant control development, atmospheric fluidized bed coal combustor for cogeneration, TVA FBC demonstration plant program technical support, PFBC systems analysis, fossil fuel applications assessments, performance assurance system support for fossil energy projects, international energy technology assessment, and general equilibrium models of liquid and gaseous fuel supplies.
Preliminary Thermal-Mechanical Sizing of Metallic TPS: Process Development and Sensitivity Studies
NASA Technical Reports Server (NTRS)
Poteet, Carl C.; Abu-Khajeel, Hasan; Hsu, Su-Yuen
2002-01-01
The purpose of this research was to perform sensitivity studies and develop a process to perform thermal and structural analysis and sizing of the latest Metallic Thermal Protection System (TPS) developed at NASA LaRC (Langley Research Center). Metallic TPS is a key technology for reducing the cost of reusable launch vehicles (RLV), offering the combination of increased durability and competitive weights when compared to other systems. Accurate sizing of metallic TPS requires combined thermal and structural analysis. Initial sensitivity studies were conducted using transient one-dimensional finite element thermal analysis to determine the influence of various TPS and analysis parameters on TPS weight. The thermal analysis model was then used in combination with static deflection and failure mode analysis of the sandwich panel outer surface of the TPS to obtain minimum weight TPS configurations at three vehicle stations on the windward centerline of a representative RLV. The coupled nature of the analysis requires an iterative analysis process, which will be described herein. Findings from the sensitivity analysis are reported, along with TPS designs at the three RLV vehicle stations considered.
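The transient one-dimensional thermal analysis at the core of such sizing can be sketched with an explicit finite-difference scheme. All material properties, boundary temperatures, and dimensions below are invented placeholders, not TPS design values.

```python
import numpy as np

# One illustrative layer with made-up, roughly insulation-like properties.
k, rho, cp = 0.05, 100.0, 1000.0       # W/m-K, kg/m^3, J/kg-K (assumptions)
alpha = k / (rho * cp)                 # thermal diffusivity, m^2/s

L, n = 0.05, 51                        # 5 cm layer, 51 nodes (assumed)
dx = L / (n - 1)
dt = 0.4 * dx**2 / alpha               # explicit stability: Fo <= 0.5

T = np.full(n, 300.0)                  # initial temperature, K (assumed)
T_hot = 1200.0                         # hot-wall surface temperature (assumed)
T[0] = T_hot                           # fixed surface-temperature BC

t_end, t = 600.0, 0.0
while t < t_end:
    # Evaluate the back-face update before the interior nodes change.
    back = 2 * alpha * dt / dx**2 * (T[-2] - T[-1])   # adiabatic back face
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    T[-1] += back
    t += dt

print(f"back-face temperature after {t_end:.0f} s: {T[-1]:.0f} K")
```

In a sizing loop like the one the abstract describes, the layer thickness would be iterated until the back-face temperature limit is just met, with the structural checks run on each candidate.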
On-road anomaly detection by multimodal sensor analysis and multimedia processing
NASA Astrophysics Data System (ADS)
Orhan, Fatih; Eren, P. E.
2014-03-01
The use of smartphones in Intelligent Transportation Systems is gaining popularity, yet many challenges exist in developing functional applications. Due to the dynamic nature of transportation, vehicular social applications face complexities such as developing robust sensor management, performing signal and image processing tasks, and sharing information among users. This study utilizes a multimodal sensor analysis framework which enables the analysis of sensors in multimodal aspect. It also provides plugin-based analyzing interfaces to develop sensor and image processing based applications, and connects its users via a centralized application as well as to social networks to facilitate communication and socialization. With the usage of this framework, an on-road anomaly detector is being developed and tested. The detector utilizes the sensors of a mobile device and is able to identify anomalies such as hard brake, pothole crossing, and speed bump crossing. Upon such detection, the video portion containing the anomaly is automatically extracted in order to enable further image processing analysis. The detection results are shared on a central portal application for online traffic condition monitoring.
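A minimal sketch of the sensor-analysis idea: threshold detection on smoothed vertical acceleration, with event duration separating pothole-like spikes from speed-bump-like humps. The sample rate, amplitudes, and thresholds are assumptions, not the framework's actual algorithm.

```python
import numpy as np

fs = 50.0                                  # sample rate, Hz (assumption)
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(2)

# Synthetic vertical acceleration (m/s^2): road noise plus one
# pothole-like spike and one slower speed-bump-like hump.
a_z = 0.2 * rng.normal(size=t.size)
a_z[500:505] -= 8.0                        # sharp pothole impact
a_z[800:830] += 3.5 * np.hanning(30)       # gradual speed-bump profile

# Short moving average suppresses sensor noise before thresholding.
smooth = np.convolve(a_z, np.ones(5) / 5, mode="same")

def events(signal, threshold):
    """Group consecutive super-threshold samples into (start, end) spans."""
    above = np.flatnonzero(np.abs(signal) > threshold)
    spans, start = [], None
    for i in above:
        if start is None:
            start, prev = i, i
        elif i == prev + 1:
            prev = i
        else:
            spans.append((start, prev))
            start, prev = i, i
    if start is not None:
        spans.append((start, prev))
    return spans

for s, e in events(smooth, threshold=1.5):
    kind = "pothole" if e - s < 10 else "speed bump"   # duration heuristic
    print(f"{kind} near t = {t[s]:.2f} s")
```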
Automated data acquisition technology development:Automated modeling and control development
NASA Technical Reports Server (NTRS)
Romine, Peter L.
1995-01-01
This report documents the completion of, and improvements made to, the software developed for automated data acquisition and automated modeling and control development on the Texas Micro rackmounted PCs. This research was initiated because the Metals Processing Branch of NASA Marshall Space Flight Center identified a need for a mobile data acquisition and data analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument strictly for data acquisition and data analysis. In addition to the data acquisition functions described in this thesis, WMS also supports many functions associated with process control. The hardware and software requirements for an automated acquisition system for welding process parameters, welding equipment checkout, and welding process modeling were determined in 1992. From these recommendations, NASA purchased the necessary hardware and software. The new welding acquisition system is designed to collect welding parameter data and perform analysis to determine the voltage versus current arc-length relationship for VPPA welding. Once the results of this analysis are obtained, they can then be used to develop a RAIL function to control welding startup and shutdown without torch crashing.
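The voltage versus current arc-length relationship mentioned can be estimated by ordinary least squares; the weld-log numbers below are fabricated solely to show the fit.

```python
import numpy as np

# Hypothetical VPPA weld log: arc current (A), arc length (in), voltage (V).
I = np.array([100, 120, 140, 160, 180, 200], dtype=float)
L = np.array([0.20, 0.20, 0.25, 0.25, 0.30, 0.30])
V = np.array([21.8, 22.6, 24.1, 24.9, 26.6, 27.3])

# Fit V = c0 + c1*I + c2*L by least squares, the kind of empirical
# arc relationship the abstract derives for startup/shutdown control.
A = np.column_stack([np.ones_like(I), I, L])
(c0, c1, c2), *_ = np.linalg.lstsq(A, V, rcond=None)
print(f"V = {c0:.2f} + {c1:.4f}*I + {c2:.2f}*L")
print("predicted V at 150 A, 0.25 in:", round(c0 + c1 * 150 + c2 * 0.25, 2))
```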
NASA Technical Reports Server (NTRS)
Lieberman, S. L.
1974-01-01
Tables are presented which include: material properties; elemental analysis; silicone RTV formulations; polyester systems and processing; epoxy preblends and processing; urethane materials and processing; epoxy-urethane elemental analysis; flammability test results; and vacuum effects.
NASA Astrophysics Data System (ADS)
Syafrina, R.; Rohman, I.; Yuliani, G.
2018-05-01
This study aims to analyze the concept characteristics of solubility and the solubility product that will serve as the basis for the development of a virtual laboratory and students' science process skills. Characteristics of the analyzed concepts include concept definitions, concept attributes, and concept types. The concept analysis follows Herron's method. The results show that there are twelve chemical concepts that are prerequisites for studying solubility and the solubility product, and five core concepts that students must understand within the topic. As many as 58.3% of the concept definitions contained in high school textbooks support students' science process skills; the remaining definitions are rote. Concept attributes that span the three levels of chemical representation and can be implemented in a virtual laboratory account for 66.6%. By concept type, 83.3% are principle-based concepts and 16.6% are process concepts. Meanwhile, the science process skills that can be developed based on the concept analysis are the abilities to observe, calculate, measure, predict, interpret, hypothesize, apply, classify, and infer.
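One of the core relationships such a virtual laboratory would simulate is the solubility-product expression. A worked example for a hypothetical 1:2 salt, with an illustrative Ksp value:

```python
# For a salt MX2 dissolving as M^2+ + 2 X^-, with molar solubility s:
# Ksp = [M][X]^2 = s * (2s)^2 = 4 s^3. Ksp below is illustrative only.
ksp = 4.0e-11

s = (ksp / 4) ** (1 / 3)
print(f"molar solubility s = {s:.2e} mol/L")
print(f"[M2+] = {s:.2e} M, [X-] = {2 * s:.2e} M")
```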
TU-AB-BRD-04: Development of Quality Management Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomadsen, B.
2015-06-15
Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives:
• Learn how to design a process map for a radiotherapy process
• Learn how to perform failure modes and effects analysis for a given process
• Learn what fault trees are all about
• Learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis
Disclosures: Dunscombe: Director, TreatSafely, LLC and Center for the Assessment of Radiological Sciences; Consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
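The ranking step of failure modes and effects analysis is easy to illustrate. A minimal sketch with invented radiotherapy failure modes and scores (TG-100 itself uses a richer scale and full process-map context):

```python
# Minimal FMEA-style ranking sketch: severity (S), occurrence (O), and
# detectability (D) are scored 1-10 and combined into a risk priority
# number RPN = S * O * D. Failure modes and scores are invented examples.
failure_modes = [
    ("wrong patient plan loaded",       9, 2, 3),
    ("MLC leaf position miscalibrated", 7, 4, 4),
    ("dose recalculation step skipped", 8, 3, 2),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"RPN {s * o * d:3d}  {name}")
```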
Child versus adult psychoanalysis: two processes or one?
Sugarman, Alan
2009-12-01
Child analysis continues to be seen as a different technique from adult analysis because children are still involved in a developmental process and because the primary objects continue to play active roles in their lives. This paper argues that this is a false dichotomy. An extended vignette of the analysis of a latency-aged girl is used to demonstrate that the psychoanalytic process that develops in child analysis is structurally the same as that in adult analysis. Both revolve around the analysis of resistance and transference and use both to promote knowledge of the patient's mind at work. And both techniques formulate interventions based on the analyst's appraisal of the patient's mental organization. It is hoped that stressing the essential commonality of both techniques will promote the development of an overarching theory of psychoanalytic technique.
Applications of High-speed motion analysis system on Solid Rocket Motor (SRM)
NASA Astrophysics Data System (ADS)
Liu, Yang; He, Guo-qiang; Li, Jiang; Liu, Pei-jin; Chen, Jian
2007-01-01
The high-speed motion analysis system can record images at up to 12,000 fps, which are then analyzed with the image processing system. The system stores data and images directly in electronic memory, convenient for management and analysis. The high-speed motion analysis system and the X-ray radiography system together form a high-speed real-time X-ray radiography system, which can diagnose and measure dynamic, high-speed processes inside opaque structures. Image processing software was developed to improve the quality of the original images and acquire more precise information. Typical applications of the high-speed motion analysis system on solid rocket motors (SRM) are introduced in the paper. Studies of anomalous combustion of solid propellant grains with defects, real-time measurement of insulator erosion, the explosion incision process of the motor, the structure and wave character of the plume during ignition and flameout, measurement of end burning of solid propellant, and compatibility between airplane and missile during missile launch were carried out using the high-speed motion analysis system, and significant results were achieved. For application to solid rocket motors, key problems that degraded image quality, such as motor vibration, power source instability, geometric distortion, and noise disturbance, were solved, and the image processing software improved the capability of measuring image characteristics. The experimental results showed that the system is a powerful facility for studying instantaneous, high-speed processes in solid rocket motors. With the development of image processing techniques, the capability of the high-speed motion analysis system has been enhanced.
Harbison, K; Kelly, J; Burnell, L; Silva, J
1995-01-01
The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wahanani, Nursinta Adi, E-mail: sintaadi@batan.go.id; Natsir, Khairina, E-mail: sintaadi@batan.go.id; Hartini, Entin, E-mail: sintaadi@batan.go.id
Data processing software packages such as VSOP and MCNPX are scientifically proven and complete. However, the results of VSOP and MCNPX are huge, complex text files, so in the analysis process users need additional tools such as Microsoft Excel to present informative results. This research develops user interface software for the output of VSOP and MCNPX: the VSOP program output is used to support neutronic analysis, and the MCNPX program output is used to support burn-up analysis. The software was developed using an iterative development method, which allows revision and the addition of features according to user needs. Processing time with this software is 500 times faster than with the conventional method using Microsoft Excel. Python is used as the programming language because it is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, and Amiga, among others. The values that support neutronic analysis are k-eff, burn-up, and the masses of Pu-239 and Pu-241. The burn-up analysis uses the mass inventory values of the actinides (thorium, plutonium, neptunium, and uranium). The values are visualized graphically to support the analysis.
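As an illustration of the kind of post-processing such an interface automates, the following Python sketch parses a two-column summary and plots k-eff against burn-up. The file name, column layout, and unit are hypothetical, not the actual VSOP or MCNPX output format.

    import matplotlib.pyplot as plt

    burnup, keff = [], []
    with open("vsop_summary.txt") as f:        # hypothetical extracted summary file
        for line in f:
            parts = line.split()
            if len(parts) == 2:                # assumed layout: burn-up, k-eff
                burnup.append(float(parts[0]))
                keff.append(float(parts[1]))

    plt.plot(burnup, keff, marker="o")
    plt.xlabel("Burn-up (MWd/t)")              # assumed unit
    plt.ylabel("k-eff")
    plt.show()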
NASA Technical Reports Server (NTRS)
Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.
1974-01-01
A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.
Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0
Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...
2008-01-01
The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge to managing and processing the information. Simply characterizing the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering, and correlation analysis of individual trials of large dimensions, and which can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We give examples of large-scale analysis results and discuss the future development of the framework, including the encoding and processing of expert performance rules and the increasing use of performance metadata.
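The scripts referred to here use PerfExplorer's own scripting interface. The sketch below is not that API; it is a generic Python illustration of the kind of repeatable analysis being automated, namely dimension reduction followed by clustering of per-thread profiles, on synthetic data with scikit-learn.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    profiles = rng.random((1024, 50))      # 1024 threads x 50 timer metrics (synthetic)

    reduced = PCA(n_components=3).fit_transform(profiles)          # dimension reduction
    labels = KMeans(n_clusters=4, n_init=10).fit_predict(reduced)  # behavioral clusters

    for c in range(4):
        print(f"cluster {c}: {np.sum(labels == c)} threads")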
TU-AB-BRD-03: Fault Tree Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunscombe, P.
2015-06-15
TU-AB-BRD-02: Failure Modes and Effects Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huq, M.
2015-06-15
NASA Technical Reports Server (NTRS)
1975-01-01
Preliminary development plans, analysis of required R and D and production resources, the costs of such resources, and, finally, the potential profitability of a commercial space processing opportunity for the production of very high frequency surface acoustic wave devices are presented.
Analysis of a bubble deformation process in a microcapsule by shock waves for developing DDS
NASA Astrophysics Data System (ADS)
Tamagawa, Masaaki; Morimoto, Kenshi
2012-09-01
This paper describes the development of a DDS (drug delivery system) microcapsule using underwater shock waves, in particular (1) making polymer microcapsules that include a bubble, and analyzing the bubble deformation process in a polymer capsule under a pressure wave; and (2) making liposome microcapsules with different elastic membranes, and performing disintegration tests with ultrasonic waves.
Global Persistent Attack: A Systems Architecture, Process Modeling, and Risk Analysis Approach
2008-06-01
develop an analysis process for quantifying risk associated with the limitations presented by a fiscally constrained environment. The second step...previous independent analysis of each force structure provided information for quantifying risk associated with the given force presentations, the
Wet weather highway accident analysis and skid resistance data management system (volume I).
DOT National Transportation Integrated Search
1992-06-01
The objectives and scope of this research are to establish an effective methodology for wet weather accident analysis and to develop a database management system to facilitate information processing and storage for the accident analysis process, skid...
Big Data Analysis of Manufacturing Processes
NASA Astrophysics Data System (ADS)
Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert
2015-11-01
The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results.
ERIC Educational Resources Information Center
Ho, Hsuan-Fu; Hung, Chia-Chi
2008-01-01
Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, B.E.
1995-04-01
A cross-functional team of process, product, quality, material, and design lab engineers was assembled to develop an environmentally friendly cleaning process for leadless chip carrier assemblies (LCCAs). Using flush and filter testing, Auger surface analysis, GC-mass spectrometry, production yield results, and electrical testing results over an extended testing period, the team developed an aqueous cleaning process for LCCAs. The aqueous process replaced the Freon vapor degreasing/ultrasonic rinse process.
The Development of Reading for Comprehension: An Information Processing Analysis. Final Report.
ERIC Educational Resources Information Center
Schadler, Margaret; Juola, James F.
This report summarizes research performed at the University of Kansas that involved several topics related to reading and learning to read, including the development of automatic word recognition processes, reading for comprehension, and the development of new computer technologies designed to facilitate the reading process. The first section…
NASA Technical Reports Server (NTRS)
Smiles, Michael D.; Blythe, Michael P.; Bejmuk, Bohdan; Currie, Nancy J.; Doremus, Robert C.; Franzo, Jennifer C.; Gordon, Mark W.; Johnson, Tracy D.; Kowaleski, Mark M.; Laube, Jeffrey R.
2015-01-01
The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g., missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.
Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin
2014-06-01
The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time involved have created some resistance to QbD implementation. To show a possible solution, this work proposes a rapid process development method that uses direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool, taking the chromatographic process of Ginkgo biloba L. as an example. The breakthrough curves were determined rapidly by DART-MS at-line. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and by HPLC. Based on the PAT tool, the impacts of process parameters on the adsorption capacity were discovered rapidly; the adsorption capacity decreased as the flow rate increased. This work demonstrates the feasibility and advantages of integrating PAT into QbD implementation for rapid process development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreau, J.W.
1980-12-01
This engineering and economic study evaluated the potential for developing a geothermal industrial park in the Puna District near Pahoa on the Island of Hawaii. Direct heat industrial applications were analyzed from a marketing, engineering, economic, environmental, and sociological standpoint to determine the most viable industries for the park. An extensive literature search produced 31 existing processes currently using geothermal heat. An additional list was compiled indicating industrial processes that require heat that could be provided by geothermal energy. From this information, 17 possible processes were selected for consideration. Careful scrutiny and analysis of these 17 processes revealed three that justified detailed economic workups. The three processes chosen for detailed analysis were: an ethanol plant using bagasse and wood as feedstock; a cattle feed mill using sugar cane leaf trash as feedstock; and a papaya processing facility providing both fresh and processed fruit. In addition, a research facility to assess and develop other processes was treated as a concept. Consideration was given to the impediments to development, the engineering process requirements and the governmental support for each process. The study describes the geothermal well site chosen, the pipeline to transmit the hydrothermal fluid, and the infrastructure required for the industrial park. A conceptual development plan for the ethanol plant, the feedmill and the papaya processing facility was prepared. The study concluded that a direct heat industrial park in Pahoa, Hawaii, involves considerable risks.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-06
... analyses and the development of other elements of the standard; developing a written action plan for ...; revalidating and retaining the process hazard analysis; developing and implementing written operating ...
ERIC Educational Resources Information Center
Tutlys, Vidmantas; Spöttl, Georg
2017-01-01
Purpose: This paper aims to explore the methodological and institutional challenges in applying the work-process analysis approach to the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…
Evaluation of interaction dynamics of concurrent processes
NASA Astrophysics Data System (ADS)
Sobecki, Piotr; Białasiewicz, Jan T.; Gross, Nicholas
2017-03-01
The purpose of this paper is to present wavelet tools that enable the detection of temporal interactions between concurrent processes. In particular, the interaction coherence of time-varying signals is determined using a complex continuous wavelet transform. The paper uses an electrocardiogram (ECG) and seismocardiogram (SCG) data set to demonstrate multiple continuous wavelet analysis techniques based on the Morlet wavelet transform. A MATLAB Graphical User Interface (GUI), developed in the reported research to assist in quick and simple data analysis, is presented. These software tools can discover the interaction dynamics of time-varying signals, and hence can reveal their correlation in phase and amplitude, as well as their nonlinear interconnections. The user-friendly MATLAB GUI makes the developed software easy to use: the user loads the two processes under investigation, chooses the required processing parameters, and then performs the analysis. The software is a useful tool for researchers who need to investigate the interaction dynamics of concurrent processes.
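A minimal sketch of the underlying computation, assuming the PyWavelets package rather than the authors' MATLAB tools: a complex Morlet CWT of each signal and their cross-wavelet spectrum, whose magnitude and angle give the amplitude co-variation and relative phase discussed above. The signals and sampling rate are synthetic stand-ins.

    import numpy as np
    import pywt

    fs = 250.0                                    # assumed sampling rate (Hz)
    t = np.arange(0, 10, 1 / fs)
    x = np.sin(2 * np.pi * 1.2 * t)                                       # first process
    y = np.sin(2 * np.pi * 1.2 * t + 0.8) + 0.1 * np.random.randn(t.size) # second process

    scales = np.arange(1, 128)
    wx, freqs = pywt.cwt(x, scales, "cmor1.5-1.0", sampling_period=1 / fs)
    wy, _ = pywt.cwt(y, scales, "cmor1.5-1.0", sampling_period=1 / fs)

    xwt = wx * np.conj(wy)         # cross-wavelet spectrum
    amplitude = np.abs(xwt)        # co-varying power per scale and time
    phase = np.angle(xwt)          # relative phase (lead/lag) per scale and time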
Odean, Rosalie; Nazareth, Alina; Pruden, Shannon M.
2015-01-01
Developmental systems theory posits that development cannot be segmented by influences acting in isolation, but should be studied through a scientific lens that highlights the complex interactions between these forces over time (Overton, 2013a). This poses a unique challenge for developmental psychologists studying complex processes like language development. In this paper, we advocate for the combining of highly sophisticated data collection technologies in an effort to move toward a more systemic approach to studying language development. We investigate the efficiency and appropriateness of combining eye-tracking technology and the LENA (Language Environment Analysis) system, an automated language analysis tool, in an effort to explore the relation between language processing in early development, and external dynamic influences like parent and educator language input in the home and school environments. Eye-tracking allows us to study language processing via eye movement analysis; these eye movements have been linked to both conscious and unconscious cognitive processing, and thus provide one means of evaluating cognitive processes underlying language development that does not require the use of subjective parent reports or checklists. The LENA system, on the other hand, provides automated language output that describes a child’s language-rich environment. In combination, these technologies provide critical information not only about a child’s language processing abilities but also about the complexity of the child’s language environment. Thus, when used in conjunction these technologies allow researchers to explore the nature of interacting systems involved in language development. PMID:26379591
A Reference Model for Software and System Inspections. White Paper
NASA Technical Reports Server (NTRS)
He, Lulu; Shull, Forrest
2009-01-01
Software Quality Assurance (SQA) is an important component of the software development process. SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities that provide adequate confidence that quality is being built into the software. Typical techniques include: (1) testing, (2) simulation, (3) model checking, (4) symbolic execution, (5) management reviews, (6) technical reviews, (7) inspections, (8) walk-throughs, (9) audits, (10) analysis (complexity analysis, control flow analysis, algorithmic analysis), and (11) formal methods. Our work over the last few years has resulted in substantial knowledge about SQA techniques, especially in the areas of technical reviews and inspections. But can we apply the same QA techniques to the system development process? If yes, what kind of tailoring do we need before applying them in the system engineering context? If not, what types of QA techniques are actually used at the system level? And is there any room for improvement? After a brief examination of the system engineering literature (especially focused on NASA and DoD guidance) we found that: (1) the system and software development processes interact with each other at different phases throughout the development life cycle; (2) reviews are emphasized in both system and software development, and for some reviews (e.g., SRR, PDR, CDR) there are both system versions and software versions; (3) analysis techniques are emphasized (e.g., Fault Tree Analysis, Preliminary Hazard Analysis) and some details are given about how to apply them; (4) reviews are expected to use the outputs of the analysis techniques; in other words, these particular analyses are usually conducted in preparation for (before) reviews. The goal of our work is to explore the interaction between the Quality Assurance (QA) techniques at the system level and the software level.
Interactive information processing for NASA's mesoscale analysis and space sensor program
NASA Technical Reports Server (NTRS)
Parker, K. G.; Maclean, L.; Reavis, N.; Wilson, G.; Hickey, J. S.; Dickerson, M.; Karitani, S.; Keller, D.
1985-01-01
The Atmospheric Sciences Division (ASD) of the Systems Dynamics Laboratory at NASA's Marshall Space Flight Center (MSFC) is currently involved in interactive information processing for the Mesoscale Analysis and Space Sensor (MASS) program. Specifically, the ASD is engaged in the development and implementation of new space-borne remote sensing technology to observe and measure mesoscale atmospheric processes. These space measurements and conventional observational data are being processed together to gain an improved understanding of the mesoscale structure and the dynamical evolution of the atmosphere relative to cloud development and precipitation processes. To satisfy its vast data processing requirements, the ASD has developed a Researcher Computer System consisting of three primary computer systems, which provides over 20 scientists with a wide range of capabilities for processing and displaying large volumes of remote sensing data. Each of the computers performs a specific function according to its unique capabilities.
Ferreira, Ana P; Tobyn, Mike
2015-01-01
In the pharmaceutical industry, chemometrics is rapidly establishing itself as a tool that can be used at every step of product development and beyond: from early development to commercialization. This set of multivariate analysis methods allows the extraction of the information contained in large, complex data sets, thus helping to increase product and process understanding, which is at the core of the Food and Drug Administration's Process Analytical Technology (PAT) Guidance for Industry and the International Conference on Harmonisation's Pharmaceutical Development guideline (Q8). This review is aimed at providing pharmaceutical industry professionals with an introduction to multivariate analysis and how it is being adopted and implemented by companies in the transition from "quality-by-testing" to "quality-by-design". It starts with an introduction to multivariate analysis and the two methods most commonly used: principal component analysis and partial least squares regression, their advantages, common pitfalls, and the requirements for their effective use. That is followed by an overview of the diverse areas of application of multivariate analysis in the pharmaceutical industry: from the development of real-time analytical methods to the definition of the design space and control strategy, and from formulation optimization during development to the application of quality-by-design principles to improve the manufacture of existing commercial products.
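To make the two workhorse methods concrete, here is a small scikit-learn sketch on synthetic "spectra" (not any particular pharmaceutical data set): PCA for an exploratory view of the samples, and PLS regression to calibrate a property of interest against them.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)
    concentration = rng.uniform(0, 1, 100)           # property to be predicted
    spectra = np.outer(concentration, rng.random(200)) \
              + 0.05 * rng.random((100, 200))        # synthetic spectral matrix

    scores = PCA(n_components=2).fit_transform(spectra)   # exploratory sample map
    pls = PLSRegression(n_components=2).fit(spectra, concentration)
    print("calibration R^2:", pls.score(spectra, concentration))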
Post-test navigation data analysis techniques for the shuttle ALT
NASA Technical Reports Server (NTRS)
1975-01-01
Postflight test analysis data processing techniques for shuttle approach and landing tests (ALT) navigation data are defined. Postflight test processor requirements are described along with operational and design requirements, data input requirements, and software test requirements. The postflight test data processing is described based on the natural test sequence: quick-look analysis, postflight navigation processing, and error isolation processing. Emphasis is placed on the tradeoffs that must remain open and subject to analysis until final definition is achieved in the shuttle data processing system and the overall ALT plan. A development plan for the implementation of the ALT postflight test navigation data processing system is presented, and conclusions are given.
NASA Technical Reports Server (NTRS)
Singh, S. P.
1979-01-01
The computer software developed to set up a method for Wiener spectrum analysis of photographic films is presented. This method is used for the quantitative analysis of the autoradiographic enhancement process. The software requirements and design for the autoradiographic enhancement process are given along with the program listings and the user's manual. A software description and the program listing modifications for the data analysis software are included.
Comprehensive Mass Analysis for Chemical Processes, a Case Study on L-Dopa Manufacture
To evaluate the “greenness” of chemical processes in route selection and process development, we propose a comprehensive mass analysis to inform the stakeholders from different fields. This is carried out by characterizing the mass intensity for each contributing chemical or wast...
NASA Astrophysics Data System (ADS)
Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.
2018-01-01
Most of the procedures in the neutron activation analysis (NAA) process that has been established in the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s are performed manually. These manual procedures carried out by the NAA laboratory personnel are time consuming and inefficient, especially for the sample counting and measurement process: the sample needs to be changed and the measurement software needs to be set up for every one-hour counting time, and both of these steps are performed manually for every sample. Hence, an automatic sample changer (ASC) system consisting of hardware and software was developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and invoke the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.
NASA Technical Reports Server (NTRS)
Wanthal, Steven; Schaefer, Joseph; Justusson, Brian; Hyder, Imran; Engelstad, Stephen; Rose, Cheryl
2017-01-01
The Advanced Composites Consortium is a US Government/Industry partnership supporting technologies to enable timeline and cost reduction in the development of certified composite aerospace structures. A key component of the consortium's approach is the development and validation of improved progressive damage and failure analysis methods for composite structures. These methods will enable increased use of simulations in design trade studies and detailed design development, and thereby enable more targeted physical test programs to validate designs. To accomplish this goal with confidence, a rigorous verification and validation process was developed. The process was used to evaluate analysis methods and associated implementation requirements to ensure calculation accuracy and to gage predictability for composite failure modes of interest. This paper introduces the verification and validation process developed by the consortium during the Phase I effort of the Advanced Composites Project. Specific structural failure modes of interest are first identified, and a subset of standard composite test articles are proposed to interrogate a progressive damage analysis method's ability to predict each failure mode of interest. Test articles are designed to capture the underlying composite material constitutive response as well as the interaction of failure modes representing typical failure patterns observed in aerospace structures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palta, J.
2015-06-15
Fourier analysis and signal processing by use of the Moebius inversion formula
NASA Technical Reports Server (NTRS)
Reed, Irving S.; Yu, Xiaoli; Shih, Ming-Tang; Tufts, Donald W.; Truong, T. K.
1990-01-01
A novel Fourier technique for digital signal processing is developed. This approach to Fourier analysis is based on the number-theoretic method of the Moebius inversion of series. The Fourier transform method developed is shown also to yield the convolution of two signals. A computer simulation shows that this method for finding Fourier coefficients is quite suitable for digital signal processing. It competes with the classical FFT (fast Fourier transform) approach in terms of accuracy, complexity, and speed.
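For reference, the number-theoretic identity underlying this approach is the classical Moebius inversion formula, shown here in its divisor-sum form and in the series form used for function expansions (the latter valid under suitable absolute-convergence conditions):

    g(n) = \sum_{d \mid n} f(d) \quad \Longleftrightarrow \quad f(n) = \sum_{d \mid n} \mu\!\left(\frac{n}{d}\right) g(d),
    \qquad
    G(x) = \sum_{k=1}^{\infty} F(kx) \quad \Longrightarrow \quad F(x) = \sum_{k=1}^{\infty} \mu(k)\, G(kx).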
Using task analysis to understand the Data System Operations Team
NASA Technical Reports Server (NTRS)
Holder, Barbara E.
1994-01-01
The Data Systems Operations Team (DSOT) currently monitors the Multimission Ground Data System (MGDS) at JPL. The MGDS currently supports five spacecraft and within the next five years, it will support ten spacecraft simultaneously. The ground processing element of the MGDS consists of a distributed UNIX-based system of over 40 nodes and 100 processes. The MGDS system provides operators with little or no information about the system's end-to-end processing status or end-to-end configuration. The lack of system visibility has become a critical issue in the daily operation of the MGDS. A task analysis was conducted to determine what kinds of tools were needed to provide DSOT with useful status information and to prioritize the tool development. The analysis provided the formality and structure needed to get the right information exchange between development and operations. How even a small task analysis can improve developer-operator communications is described, and the challenges associated with conducting a task analysis in a real-time mission operations environment are examined.
A Comparative Analysis of Extract, Transformation and Loading (ETL) Process
NASA Astrophysics Data System (ADS)
Runtuwene, J. P. A.; Tangkawarow, I. R. H. T.; Manoppo, C. T. M.; Salaki, R. J.
2018-02-01
Data and information are currently growing rapidly, in varying amounts and media. This growth will eventually produce the very large data collections known as Big Data. Business Intelligence (BI) utilizes large amounts of data and information for analysis so that important information can be obtained and used to support the decision-making process. In practice, a process that integrates existing data and information into a data warehouse is needed. This data integration process is known as Extract, Transformation and Loading (ETL). Many applications have been developed to carry out the ETL process, but selecting which application is more time-, cost-, and power-efficient can be a challenge. Therefore, the objective of the study was to provide a comparative analysis between the ETL process using Microsoft SQL Server Integration Services (SSIS) and one using Pentaho Data Integration (PDI).
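As a reference point for the comparison, the three stages can be made concrete with a minimal hand-rolled Python sketch (a hypothetical sales.csv with date and amount columns); SSIS and PDI implement these same stages as configurable graphical pipelines.

    import csv
    import sqlite3

    # Extract: read raw rows from the source file.
    with open("sales.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    # Transform: fix types and derive a field.
    for r in rows:
        r["amount"] = float(r["amount"])
        r["year"] = r["date"][:4]

    # Load: write the cleaned rows into a warehouse table.
    con = sqlite3.connect("warehouse.db")
    con.execute("CREATE TABLE IF NOT EXISTS sales (date TEXT, year TEXT, amount REAL)")
    con.executemany("INSERT INTO sales VALUES (:date, :year, :amount)", rows)
    con.commit()
    con.close()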
SDI-based business processes: A territorial analysis web information system in Spain
NASA Astrophysics Data System (ADS)
Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.
2012-09-01
Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to the Spanish citizens' knowledge about their territory.
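For a flavor of how such an operation is invoked, a WPS 1.0.0 Execute request can be encoded as a single key-value-pair URL. The endpoint, process identifier, and inputs below are hypothetical; only the parameter structure (service, version, request, identifier, datainputs) follows the OGC WPS specification.

    http://example.org/wps?service=WPS&version=1.0.0&request=Execute
        &identifier=SlopeAnalysis
        &datainputs=elevation=dem_layer;cellsize=25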
2013-04-01
project was to provide the Royal Canadian Navy (RCN) with a set of guidelines on analysis, design, and verification processes for effective room...design, and verification processes that should be used in the development of effective room layouts for Royal Canadian Navy (RCN) ships. The primary...designed CSC; however, the guidelines could be applied to the design of any multiple-operator space in any RCN vessel. Results: The development of
Research Status and Development Trend of Remote Sensing in China Using Bibliometric Analysis
NASA Astrophysics Data System (ADS)
Zeng, Y.; Zhang, J.; Niu, R.
2015-06-01
Remote sensing was introduced into China in the 1970s and then began to flourish. At present, China has developed into a major remote sensing country, and remote sensing plays an increasingly important role in various fields of national economic construction and social development. Based on the China Academic Journals Full-text Database and the China Citation Database published by the China National Knowledge Infrastructure, this paper analyzed the academic characteristics of 963 highly cited papers published by 16 professional and academic journals in the field of surveying and mapping from January 2010 to December 2014 in China, including hot topics, authors, research institutions, and funding sources. At the same time, it studied a total of 51,149 keywords published by these 16 journals during the same period. First through keyword selection, keyword normalization, keyword consistency, and keyword incorporation, and then through the analysis of high-frequency keywords, the progress and prospects of China's remote sensing technology in data acquisition, data processing, and applications during the past five years were further explored and revealed. It can be seen that highly cited paper analysis and word frequency analysis are complementary for analyzing the progress of a subject; in the data acquisition phase, the research focus is new civilian remote sensing satellite systems and UAV remote sensing systems; the research focus of data processing and analysis is multi-source information extraction and classification, laser point cloud data processing, object-oriented high resolution image analysis, SAR data and hyper-spectral image processing, etc.; the development trend of remote sensing data processing is quantitative, intelligent, automated, and real-time, and the breadth and depth of remote sensing applications are gradually increasing; and parallel computing, cloud computing, and geographic conditions monitoring and census are new research focuses deserving attention.
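The high-frequency keyword step is essentially a counting problem; a toy Python illustration (a hypothetical keyword list standing in for the ~51,000 normalized journal keywords):

    from collections import Counter

    keywords = ["remote sensing", "UAV", "point cloud", "remote sensing",
                "hyperspectral", "UAV", "remote sensing"]   # hypothetical sample
    for term, count in Counter(keywords).most_common(3):
        print(term, count)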
Encapsulation Processing and Manufacturing Yield Analysis
NASA Technical Reports Server (NTRS)
Willis, P. B.
1984-01-01
The development of encapsulation processing and a manufacturing productivity analysis for photovoltaic cells are discussed. The goals were: (1) to understand the relationships between both formulation variables and process variables; (2) to define conditions required for optimum performance; (3) to predict manufacturing yield; and (4) to provide documentation to industry.
Vygotsky's Analysis of Children's Meaning Making Processes
ERIC Educational Resources Information Center
Mahn, Holbrook
2012-01-01
Vygotsky's work is extensive and covers many aspects of the development of children's meaning-making processes in social and cultural contexts. However, his main focus is on the examination of the unification of speaking and thinking processes. His investigation centers on the analysis of the entity created by this unification--an internal…
Zimmermann, Hartmut F; Hentschel, Norbert
2011-01-01
With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has already become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. The application for such a biopharmaceutical process FMEA is widespread. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled-up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked for importance and important variables for process development, characterization, or validation can be identified. Health authorities around the world ask pharmaceutical companies to manage risk during development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in mechanical and electrical industries. The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. With the help of this guideline, different details of the manufacturing process can be ranked according to their potential risks, and this can help pharmaceutical companies to identify aspects with high potential risks and to react accordingly to improve the safety of medicines.
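The arithmetic behind such a rating table is the standard FMEA risk priority number, the product of the three 1-to-10 ratings; the failure modes and ratings below are hypothetical, not taken from the guideline.

    # Risk priority number (RPN) = occurrence x severity x detectability.
    failure_modes = {
        "wrong buffer concentration": (4, 7, 3),   # hypothetical ratings (O, S, D)
        "filter integrity failure":   (2, 9, 2),
        "column temperature drift":   (6, 5, 8),
    }
    ranked = sorted(failure_modes.items(),
                    key=lambda kv: kv[1][0] * kv[1][1] * kv[1][2], reverse=True)
    for name, (occ, sev, det) in ranked:
        print(f"RPN {occ * sev * det:4d}  {name}")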
A survey of compiler development aids. [concerning lexical, syntax, and semantic analysis
NASA Technical Reports Server (NTRS)
Buckles, B. P.; Hodges, B. C.; Hsia, P.
1977-01-01
A theoretical background was established for the compilation process by dividing it into five phases and explaining the concepts and algorithms that underpin each. The five selected phases were lexical analysis, syntax analysis, semantic analysis, optimization, and code generation. Graph-theoretical optimization techniques were presented, and approaches to code generation were described for both one-pass and multipass compilation environments. Following the initial tutorial sections, more than 20 tools that were developed to aid in the process of writing compilers were surveyed. Eight of the more recent compiler development aids were selected for special attention: SIMCMP/STAGE2, LANG-PAK, COGENT, XPL, AED, CWIC, LIS, and JOCIT. The impact of compiler development aids was assessed, some of their shortcomings were noted, and some of the areas of research currently in progress were inspected.
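As a small illustration of the first of the five phases, a regex-based tokenizer in Python (a toy example, not drawn from any of the surveyed tools):

    import re

    TOKEN_SPEC = [("NUMBER", r"\d+"), ("IDENT", r"[A-Za-z_]\w*"),
                  ("OP", r"[+\-*/=]"), ("SKIP", r"\s+")]
    TOKEN_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

    def tokenize(source):
        """Lexical analysis: turn source text into (kind, lexeme) tokens."""
        for m in TOKEN_RE.finditer(source):
            if m.lastgroup != "SKIP":
                yield (m.lastgroup, m.group())

    print(list(tokenize("x = y + 42")))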
Anima: Modular Workflow System for Comprehensive Image Data Analysis
Rantanen, Ville; Valori, Miko; Hautaniemi, Sampsa
2014-01-01
Modern microscopes produce vast amounts of image data, and computational methods are needed to analyze and interpret these data. Furthermore, a single image analysis project may require tens or hundreds of analysis steps, starting from data import and pre-processing, through segmentation and statistical analysis, and ending with visualization and reporting. To manage such large-scale image data analysis projects, we present here a modular workflow system called Anima. Anima is designed for comprehensive and efficient image data analysis development, and it contains several features that are crucial in high-throughput image data analysis: programming language independence, batch processing, easily customized data processing, interoperability with other software via application programming interfaces, and advanced multivariate statistical analysis. The utility of Anima is shown with two case studies: testing different algorithms developed on different imaging platforms, and automated prediction of alive/dead C. elegans worms by integrating several analysis environments. Anima is fully open source and available with documentation at www.anduril.org/anima. PMID:25126541
ERIC Educational Resources Information Center
Hampton, Greg
2011-01-01
Narrative policy analysis is examined for its contribution to participatory policy development within higher education. Within narrative policy analysis the meta-narrative is developed by the policy analyst in order to find a way to bridge opposing narratives. This development can be combined with participants deliberating in a policy process,…
NASA Supportability Engineering Implementation Utilizing DoD Practices and Processes
NASA Technical Reports Server (NTRS)
Smith, David A.; Smith, John V.
2010-01-01
The Ares I design and development program determined early in the System Design Review phase to utilize the DoD Integrated Logistics Support (ILS) and Logistics Support Analysis (LSA) approach for supportability engineering as an integral part of the system engineering process. This paper provides a review of the overall approach to designing Ares I with an emphasis on a more affordable, supportable, and sustainable launch vehicle. Discussions include requirements development, design influence, support concept alternatives, ILS and LSA planning, logistics support analyses and trades performed, LSA tailoring for the NASA Ares program, support system infrastructure identification, ILS design review documentation, working group coordination, and overall ILS implementation. At the outset, the Ares I Project initiated the development of the Integrated Logistics Support Plan (ILSP) and a Logistics Support Analysis process to provide a path forward for the management of the Ares I ILS program and supportability analysis activities. The ILSP provided the initial planning and coordination between the Ares I Project Elements and the Ground Operations Project. The LSA process provided a system engineering approach to the development of the Ares I supportability requirements, influencing the design for supportability and developing alternative support concepts that satisfy the program operability requirements. The LSA planning and analysis results are documented in the Logistics Support Analysis Report; this document was required during the Ares I System Design Review (SDR) and Preliminary Design Review (PDR) cycles. To help coordinate the LSA process across the Ares I project and between programs, the LSA Report is updated and released quarterly. A system requirements analysis was performed to determine the supportability requirements and technical performance measurements (TPMs). Two working groups were established to support the management and implementation of the Ares I ILS program: the Integrated Logistics Support Working Group (ILSWG) and the Logistics Support Analysis Record Working Group (LSARWG). The Ares I ILSWG was established to assess requirements, to conduct and evaluate analyses and trade studies associated with acquisition logistics and supportability processes, and to resolve Ares I integrated logistics and supportability issues. It established a strategic collaborative alliance for coordinating Logistics Support Analysis activities in support of the integrated Ares I vehicle design and the development of the logistics support infrastructure. A joint Ares I-Orion LSAR Working Group was established to: 1) guide the development of Ares I and Orion LSAR data and serve as a model for future Constellation programs, 2) develop rules and assumptions that will apply across the Constellation program with regard to the program's LSAR development, and 3) maintain the Constellation LSAR Style Guide.
Interdisciplinary Investigations in Support of Project DI-MOD
NASA Technical Reports Server (NTRS)
Starks, Scott A. (Principal Investigator)
1996-01-01
Various concepts from time series analysis are used as the basis for the development of algorithms to assist in the analysis and interpretation of remotely sensed imagery. An approach to trend detection that is based upon the fractal analysis of power spectrum estimates is presented. Additionally, research was conducted toward the development of a software architecture to support processing tasks associated with databases housing a variety of data. An algorithmic approach that provides for the automation of the state monitoring process is presented.
Dynamic Systems Analysis for Turbine Based Aero Propulsion Systems
NASA Technical Reports Server (NTRS)
Csank, Jeffrey T.
2016-01-01
The aircraft engine design process seeks to optimize the overall system-level performance, weight, and cost for a given concept. Steady-state simulations and data are used to identify trade-offs that should be balanced to optimize the system in a process known as systems analysis. These systems analysis simulations and data may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic systems analysis provides the capability for assessing the dynamic trade-offs at an earlier stage of the engine design process. The dynamic systems analysis concept, the developed tools, and their potential benefit are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed to provide the user with an estimate of the closed-loop performance (response time) and operability (high pressure compressor surge margin) for a given engine design and set of control design requirements. TTECTrA, along with engine deterioration information, can be used to develop a more generic relationship between performance and operability that can impact the engine design constraints and potentially lead to a more efficient engine.
On the Hilbert-Huang Transform Theoretical Developments
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Blank, Karin; Flatley, Thomas; Huang, Norden E.; Patrick, David; Hestnes, Phyllis
2005-01-01
One of the main heritage tools used in scientific and engineering data spectrum analysis is the Fourier integral transform and its high-performance digital equivalent, the Fast Fourier Transform (FFT). Both carry strong a priori assumptions about the source data, such as linearity, stationarity, and satisfaction of the Dirichlet conditions. A recent development at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC), known as the Hilbert-Huang Transform (HHT), proposes a novel approach to the solution of the nonlinear class of spectrum analysis problems. Using a posteriori data processing based on the Empirical Mode Decomposition (EMD) sifting process (algorithm), followed by the normalized Hilbert transform of the decomposition data, the HHT allows spectrum analysis of nonlinear and nonstationary data. The EMD sifting process results in a non-constrained decomposition of a real-valued source data vector into a finite set of Intrinsic Mode Functions (IMFs). These functions form a nearly orthogonal adaptive basis, a basis that is derived from the data. The IMFs can be further analyzed for spectrum interpretation by the classical Hilbert transform. A new engineering spectrum analysis tool using the HHT, the HHT Data Processing System (HHT-DPS), has been developed at NASA GSFC. As the HHT-DPS has been successfully used and commercialized, new applications pose additional questions about the theoretical basis behind the HHT and EMD algorithms. Why is the fastest-changing component of a composite signal sifted out first in the EMD sifting process? Why does the EMD sifting process seemingly converge, and why does it converge rapidly? Does an IMF have a distinctive structure? Why are the IMFs nearly orthogonal? We address these questions and develop the initial theoretical background for the HHT. This will contribute to the development of new HHT processing options, such as real-time and 2-D processing using Field Programmable Gate Array (FPGA) computational resources, enhanced HHT synthesis, and a broader scope of HHT applications for signal processing.
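A bare-bones sketch of one EMD sifting step, assuming SciPy for extrema detection and spline envelopes (production EMD adds boundary handling, stopping criteria, and iteration until each IMF is fully extracted). After subtracting the mean of the upper and lower envelopes, the residual is dominated by the fastest local oscillation, which is why that component is extracted first.

    import numpy as np
    from scipy.signal import argrelextrema
    from scipy.interpolate import CubicSpline

    def sift_once(t, x):
        imax = argrelextrema(x, np.greater)[0]      # indices of local maxima
        imin = argrelextrema(x, np.less)[0]         # indices of local minima
        upper = CubicSpline(t[imax], x[imax])(t)    # upper envelope
        lower = CubicSpline(t[imin], x[imin])(t)    # lower envelope
        return x - (upper + lower) / 2.0            # candidate IMF after one sift

    t = np.linspace(0, 1, 1000)
    x = np.sin(2 * np.pi * 30 * t) + np.sin(2 * np.pi * 4 * t)  # fast + slow modes
    h = sift_once(t, x)   # dominated by the 30 Hz component after one pass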
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2015-06-15
The MSFC Collaborative Engineering Process for Preliminary Design and Concept Definition Studies
NASA Technical Reports Server (NTRS)
Mulqueen, Jack; Jones, David; Hopkins, Randy
2011-01-01
This paper describes a collaborative engineering process developed by the Marshall Space Flight Center's Advanced Concepts Office for performing rapid preliminary design and mission concept definition studies for potential future NASA missions. The process has been developed and demonstrated for a broad range of mission studies, including human space exploration missions, space transportation system studies, and in-space science missions. The paper describes the design team structure and specialized analytical tools that have been developed to enable a unique rapid design process. The collaborative engineering process consists of an integrated analysis approach for mission definition, vehicle definition, and system engineering. The relevance of the collaborative process elements to the standard NASA NPR 7120.1 system engineering process will be demonstrated. The study definition process flow for each study discipline will be outlined, beginning with the study planning process, followed by definition of ground rules and assumptions, definition of study trades, mission analysis, and subsystem analyses, leading to a standardized set of mission concept study products. The flexibility of the collaborative engineering design process to accommodate a wide range of study objectives, from technology definition and requirements definition to preliminary design studies, will be addressed. The paper will also describe the applicability of the collaborative engineering process to an integrated systems analysis approach for evaluating the functional requirements of evolving system technologies and capabilities needed to meet the needs of future NASA programs.
Thermal Deformation and RF Performance Analyses for the SWOT Large Deployable Ka-Band Reflectarray
NASA Technical Reports Server (NTRS)
Fang, H.; Sunada, E.; Chaubell, J.; Esteban-Fernandez, D.; Thomson, M.; Nicaise, F.
2010-01-01
A large deployable antenna technology for the NASA Surface Water and Ocean Topography (SWOT) Mission is currently being developed by JPL in response to NRC Earth Science Tier 2 Decadal Survey recommendations. This technology is required to enable the SWOT mission because no currently available antenna is capable of meeting SWOT's demanding Ka-band remote sensing requirements. One of the key aspects of this antenna development is to minimize the effect of on-orbit thermal distortion on the antenna's RF performance. An analysis process has been developed to support this antenna technology, consisting of: 1) on-orbit thermal analysis to obtain the temperature distribution; 2) structural deformation analysis to obtain the geometry of the antenna surface; and 3) RF performance analysis for the deformed antenna surface. The detailed analysis process and some analysis results are presented and discussed in this paper.
Mawocha, Samkeliso C; Fetters, Michael D; Legocki, Laurie J; Guetterman, Timothy C; Frederiksen, Shirley; Barsan, William G; Lewis, Roger J; Berry, Donald A; Meurer, William J
2017-06-01
Adaptive clinical trials use accumulating data from enrolled subjects to alter trial conduct in pre-specified ways based on quantitative decision rules. In this research, we sought to characterize the perspectives of key stakeholders during the development process of confirmatory-phase adaptive clinical trials within an emergency clinical trials network and to build a model to guide future development of adaptive clinical trials. We used an ethnographic, qualitative approach to evaluate key stakeholders' views about the adaptive clinical trial development process. Stakeholders participated in a series of multidisciplinary meetings during the development of five adaptive clinical trials and completed a Strengths-Weaknesses-Opportunities-Threats questionnaire. In the analysis, we elucidated overarching themes across the stakeholders' responses to develop a conceptual model. Four major overarching themes emerged during the analysis of stakeholders' responses to questioning: the perceived statistical complexity of adaptive clinical trials and the roles of collaboration, communication, and time during the development process. Frequent and open communication and collaboration were viewed by stakeholders as critical during the development process, as were the careful management of time and logistical issues related to the complexity of planning adaptive clinical trials. The Adaptive Design Development Model illustrates how statistical complexity, time, communication, and collaboration are moderating factors in the adaptive design development process. The intensity and iterative nature of this process underscores the need for funding mechanisms for the development of novel trial proposals in academic settings.
Activating clinical trials: a process improvement approach.
Martinez, Diego A; Tsalatsanis, Athanasios; Yalcin, Ali; Zayas-Castro, José L; Djulbegovic, Benjamin
2016-02-24
The administrative process associated with clinical trial activation has been criticized as costly, complex, and time-consuming. Prior research has concentrated on identifying administrative barriers and proposing various solutions to reduce activation time, and consequently associated costs. Here, we expand on previous research by incorporating social network analysis and discrete-event simulation to support process improvement decision-making. We searched for all operational data associated with the administrative process of activating industry-sponsored clinical trials at the Office of Clinical Research of the University of South Florida in Tampa, Florida. We limited the search to those trials initiated and activated between July 2011 and June 2012. We described the process using value stream mapping, studied the interactions of the various process participants using social network analysis, and modeled potential process modifications using discrete-event simulation. The administrative process comprised 5 sub-processes, 30 activities, 11 decision points, 5 loops, and 8 participants. The mean activation time was 76.6 days. Rate-limiting sub-processes were those of contract and budget development. Key participants during contract and budget development were the Office of Clinical Research, sponsors, and the principal investigator. Simulation results indicate that slight increases in the number of trials arriving at the Office of Clinical Research would increase activation time by 11%, while increasing the efficiency of contract and budget development would reduce activation time by 28%. Finally, better synchronization between contract and budget development would reduce time spent on batching documentation; however, no improvements would be attained in total activation time. The presented process improvement analytic framework not only identifies administrative barriers, but also helps to devise and evaluate potential improvement scenarios. The strength of our framework lies in its system analysis approach, which recognizes the stochastic duration of the activation process and the interdependence between process activities and entities.
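The social network analysis step can be reproduced with standard graph tooling. A minimal sketch using networkx; the participants are taken from the abstract, but the edges and document-exchange weights are invented for illustration:

```python
import networkx as nx

# Hypothetical hand-offs observed during contract/budget development;
# edge weights record counts of document exchanges between participants.
G = nx.Graph()
G.add_weighted_edges_from([
    ("Office of Clinical Research", "Sponsor", 25),
    ("Office of Clinical Research", "Principal Investigator", 18),
    ("Sponsor", "Principal Investigator", 7),
    ("Office of Clinical Research", "Legal", 9),
    ("Principal Investigator", "Department Admin", 5),
])

# Degree and betweenness centrality highlight the key participants.
degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)
for node in G:
    print(f"{node:32s} degree={degree[node]:.2f} "
          f"betweenness={betweenness[node]:.2f}")
```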
ERIC Educational Resources Information Center
Johnson-Leslie, Natalie; Gaskill, LuAnn R.
2006-01-01
While the process and practices of retail product development in developed countries have been documented, a void exists in descriptive analysis regarding retail product development in an international setting. The primary purpose of this study was to explore small business apparel retailing, and specifically the retail product development process…
USDA-ARS?s Scientific Manuscript database
Using five centimeter resolution images acquired with an unmanned aircraft system (UAS), we developed and evaluated an image processing workflow that included the integration of resolution-appropriate field sampling, feature selection, object-based image analysis, and processing approaches for UAS i...
ERIC Educational Resources Information Center
Aagard, James A.; Ansbro, Thomas M.
The Naval Enlisted Professional Development Information System (NEPDIS) was designed to function as a fully computerized information assembly and analysis system to support labor force, personnel, and training management. The NEPDIS comprises separate training development, instructional, training record and evaluation, career development, and…
NASA Technical Reports Server (NTRS)
Bonine, Lauren
2015-01-01
The presentation provides insight into the schedule risk analysis process used by the Stratospheric Aerosol and Gas Experiment III on the International Space Station Project. It focuses on the schedule risk analysis process, highlighting the methods for identification of risk inputs, the inclusion of generic risks identified outside the traditional continuous risk management process, and the development of tailored analysis products used to improve risk-informed decision making.
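Schedule risk analyses of this kind are commonly run as Monte Carlo simulations over three-point duration estimates. A minimal sketch under that assumption; the activities and durations below are hypothetical placeholders, not SAGE III values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical serial activities with (optimistic, most likely, pessimistic)
# durations in days, as typically elicited as risk inputs.
activities = [(10, 14, 25), (20, 30, 55), (5, 8, 15), (12, 15, 30)]

n = 100_000
total = sum(rng.triangular(lo, mode, hi, n) for lo, mode, hi in activities)

deterministic = sum(mode for _, mode, _ in activities)
print(f"Deterministic duration: {deterministic} days")
print(f"P50: {np.percentile(total, 50):.1f}  P80: {np.percentile(total, 80):.1f} days")
print(f"Chance of finishing by the deterministic date: {np.mean(total <= deterministic):.1%}")
```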
Comparative Analysis of the Measurement of Total Instructional Alignment
ERIC Educational Resources Information Center
Kick, Laura C.
2013-01-01
In 2007, Lisa Carter created the Total Instructional Alignment system--a process that aligns standards, curriculum, assessment, and instruction. Employed in several hundred school systems, the TIA process is a successful professional development program. The researcher developed an instrument to measure the success of the TIA process with the…
Contributing Factors to a Successful Online Course Development Process
ERIC Educational Resources Information Center
Stevens, Karl B.
2013-01-01
This qualitative case study examined the experiences of instructional designers and professors during the online course development process. The purpose of this study was to determine if their experiences had an effect on the process itself. Data analysis revealed five emergent themes: communication, commitment to quality online courses,…
Collection, processing and dissemination of data for the national solar demonstration program
NASA Technical Reports Server (NTRS)
Day, R. E.; Murphy, L. J.; Smok, J. T.
1978-01-01
A national solar data system developed for the DOE by IBM provides for automatic gathering, conversion, transfer, and analysis of demonstration site data. NASA requirements for this system include providing solar site hardware, engineering, data collection, and analysis. The specific tasks include: (1) solar energy system design/integration; (2) developing a site data acquisition subsystem; (3) developing a central data processing system; (4) operating the test facility at Marshall Space Flight Center; (5) collecting and analyzing data. The systematic analysis and evaluation of the data from the National Solar Data System is reflected in a monthly performance report and a solar energy system performance evaluation report.
Research and Analysis of Image Processing Technologies Based on DotNet Framework
NASA Astrophysics Data System (ADS)
Ya-Lin, Song; Chen-Xi, Bai
Microsoft .NET is one of the most popular program development tools. This paper gives a detailed analysis of the advantages and disadvantages of several image processing technologies available in .NET, applying the same algorithm across the programming experiments. The results show that the two most efficient methods are unsafe pointers and Direct3D, with Direct3D best suited to 3D simulation development; the other technologies are useful in some fields but are inefficient and not suited to real-time processing. The experimental results in this paper will help projects involving image processing and simulation based on .NET, and they have strong practical applicability.
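The efficiency gap the authors measure between per-pixel access and bulk memory access is easy to reproduce in any language. As a rough Python analogue of their C# comparison (the image and the inversion operation are arbitrary placeholders of this sketch), an element-by-element loop is contrasted with one vectorized pass:

```python
import time
import numpy as np

img = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)

# Per-pixel loop (analogous to GetPixel/SetPixel-style access).
t0 = time.perf_counter()
out_loop = img.copy()
for i in range(img.shape[0]):
    for j in range(img.shape[1]):
        out_loop[i, j] = 255 - img[i, j]
t_loop = time.perf_counter() - t0

# Bulk access over the whole buffer (analogous to unsafe-pointer access).
t0 = time.perf_counter()
out_vec = 255 - img
t_vec = time.perf_counter() - t0

assert np.array_equal(out_loop, out_vec)
print(f"per-pixel loop: {t_loop:.3f}s   bulk/vectorized: {t_vec:.4f}s")
```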
ERIC Educational Resources Information Center
Foley, John P., Jr.
A study was conducted to refine and coordinate occupational analysis, job performance aids, and elements of the instructional systems development process for task specific Air Force maintenance training. Techniques for task identification and analysis (TI & A) and data gathering techniques for occupational analysis were related. While TI &…
Designing Image Analysis Pipelines in Light Microscopy: A Rational Approach.
Arganda-Carreras, Ignacio; Andrey, Philippe
2017-01-01
With the progress of microscopy techniques and the rapidly growing amounts of acquired imaging data, there is an increased need for automated image processing and analysis solutions in biological studies. Each new application requires the design of a specific image analysis pipeline, by assembling a series of image processing operations. Many commercial or free bioimage analysis software packages are now available, and several textbooks and reviews have presented the mathematical and computational fundamentals of image processing and analysis. Tens, if not hundreds, of algorithms and methods have been developed and integrated into image analysis software, resulting in a combinatorial explosion of possible image processing sequences. This paper presents a general guideline methodology to rationally address the design of image processing and analysis pipelines. The originality of the proposed approach is to follow an iterative, backwards procedure from the target objectives of analysis. The proposed goal-oriented strategy should help biologists to better apprehend image analysis in the context of their research and should allow them to efficiently interact with image processing specialists.
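As a concrete instance of such an assembled pipeline, here is a minimal scikit-image sketch for one typical goal (count objects and measure their areas); the stages and parameter values are illustrative choices of this sketch, not the paper's recommendations:

```python
import numpy as np
from skimage import data, filters, measure, morphology

# Goal: count blob-like objects and measure their areas.
image = data.coins()  # bundled sample image standing in for a microscopy frame

# Pipeline: 1) denoise, 2) threshold, 3) clean up, 4) label, 5) measure.
smoothed = filters.gaussian(image, sigma=2)
binary = smoothed > filters.threshold_otsu(smoothed)
binary = morphology.remove_small_objects(binary, min_size=64)
labels = measure.label(binary)
props = measure.regionprops(labels)

print(f"{labels.max()} objects found")
print("mean area (px):", np.mean([p.area for p in props]))
```

Designing backwards from the goal, as the paper proposes, would fix the measurement step first (object counts and areas) and then choose the segmentation and denoising stages that best serve it.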
Development of a New VLBI Data Analysis Software
NASA Technical Reports Server (NTRS)
Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.
2010-01-01
We present an overview of a new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software. We formulate the main goals of the software. The software should be flexible and modular to implement models and estimation techniques that currently exist or will appear in future. On the other hand it should be reliable and possess production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.
Sensitivity analysis of the add-on price estimate for the edge-defined film-fed growth process
NASA Technical Reports Server (NTRS)
Mokashi, A. R.; Kachare, A. H.
1981-01-01
The analysis is in terms of cost parameters and production parameters. The cost parameters include equipment, space, direct labor, materials, and utilities. The production parameters include growth rate, process yield, and duty cycle. A computer program was developed specifically to do the sensitivity analysis.
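A one-at-a-time sensitivity analysis of this kind is straightforward to sketch. The toy price model below perturbs each cost and production parameter by 10% and reports the change in the add-on price estimate; the formula and every value are placeholder assumptions of this sketch, not the report's model:

```python
# Toy add-on price model: annualized cost divided by annual sellable output.
base = dict(equipment=2.0e5, space=2.0e4, labor=1.5e5, materials=1.0e5,
            utilities=3.0e4,            # $/yr cost parameters
            growth_rate=2.0,            # cm^2/min
            process_yield=0.90, duty_cycle=0.85)

def price(p):
    annual_cost = (p["equipment"] + p["space"] + p["labor"]
                   + p["materials"] + p["utilities"])
    annual_area = (p["growth_rate"] * 60 * 24 * 365
                   * p["process_yield"] * p["duty_cycle"])
    return annual_cost / annual_area  # $/cm^2

# One-at-a-time sensitivity: % change in price per +10% change in parameter.
p0 = price(base)
for name in base:
    perturbed = dict(base, **{name: base[name] * 1.10})
    print(f"{name:13s} {100 * (price(perturbed) - p0) / p0:+6.1f} %")
```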
Strategic and Market Analysis | Bioenergy | NREL
NREL's strategic and market analysis includes recent efforts in comparative techno-economic analysis, considering a wide range of conversion intermediates. NREL has developed first-of-its-kind process models and economic assessments of co-processing. This work strives to understand the economic incentives, technical risks, and key data gaps that need to be addressed.
Flat-plate solar array project. Volume 5: Process development
NASA Technical Reports Server (NTRS)
Gallagher, B.; Alexander, P.; Burger, D.
1986-01-01
The goal of the Process Development Area, as part of the Flat-Plate Solar Array (FSA) Project, was to develop and demonstrate solar cell fabrication and module assembly process technologies required to meet the cost, lifetime, production capacity, and performance goals of the FSA Project. R&D efforts expended by Government, Industry, and Universities in developing processes capable of meeting the project's goals under volume production conditions are summarized. The cost goals allocated for processing were demonstrated with small-volume quantities that were extrapolated by cost analysis to large-volume production. To provide proper focus and coverage of the process development effort, four separate technology sections are discussed: surface preparation, junction formation, metallization, and module assembly.
Usability engineering: domain analysis activities for augmented-reality systems
NASA Astrophysics Data System (ADS)
Gabbard, Joseph; Swan, J. E., II; Hix, Deborah; Lanzagorta, Marco O.; Livingston, Mark; Brown, Dennis B.; Julier, Simon J.
2002-05-01
This paper discusses our usability engineering process for the Battlefield Augmented Reality System (BARS). Usability engineering is a structured, iterative, stepwise development process. Like the related disciplines of software and systems engineering, usability engineering is a combination of management principles and techniques, formal and semi-formal evaluation techniques, and computerized tools. BARS is an outdoor augmented reality system that displays heads-up battlefield intelligence information to a dismounted warrior. The paper discusses our general usability engineering process. We originally developed the process in the context of virtual reality applications, but in this work we are adapting the procedures to an augmented reality system. The focus of this paper is our work on domain analysis, the first activity of the usability engineering process. We describe our plans for and our progress to date on our domain analysis for BARS. We give results in terms of a specific urban battlefield use case we have designed.
Process-based Cost Estimation for Ramjet/Scramjet Engines
NASA Technical Reports Server (NTRS)
Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John
2003-01-01
Process-based cost estimation plays a key role in effecting cultural change that integrates distributed science, technology and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation bridging the methodologies of high-level parametric models and detailed bottom-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As product design progresses and matures, the lower-level, more detailed cost drivers can be re-accessed and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
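The AHP step can be illustrated compactly: criterion weights are the normalized principal eigenvector of a pairwise comparison matrix. A sketch with a hypothetical three-criterion matrix; the criteria and judgments are assumptions for illustration, not the GRC/Boeing model's inputs:

```python
import numpy as np

# Hypothetical pairwise comparisons over (cost-risk, performance, schedule)
# on Saaty's 1-9 scale; A[i, j] is the preference of criterion i over j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# AHP weights = normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check (random index RI = 0.58 for n = 3).
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
print("weights:", np.round(w, 3), " consistency ratio:", round(ci / 0.58, 3))
```

A consistency ratio below about 0.1 is conventionally taken to mean the judgments are coherent enough to use.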
Efficiency analysis of wood processing industry in China during 2006-2015
NASA Astrophysics Data System (ADS)
Zhang, Kun; Yuan, Baolong; Li, Yanxuan
2018-03-01
The wood processing industry is an important industry that affects the national economy and social development. The data envelopment analysis (DEA) model is a quantitative evaluation method for studying industrial efficiency. In this paper, the wood processing industry of 8 provinces in southern China is taken as the study object; the efficiency of each province from 2006 to 2015 was measured and calculated with the DEA method, and efficiency changes, technological changes, and the Malmquist index were analyzed dynamically. The empirical results show that there is a widening gap in the efficiency of the wood processing industry across the 8 provinces, and that technological progress has lagged in promoting the wood processing industry. Based on these conclusions, and in light of domestic and foreign wood processing industry development, the government should introduce relevant policies to strengthen the technology innovation policy system and the coordinated development system of the wood processing industry.
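Province-level efficiency scores of this kind come from solving one small linear program per decision-making unit (DMU). A minimal input-oriented CCR sketch using scipy; the single input and output columns and all data values are invented for illustration, whereas the study used multi-year panel data:

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative data: one aggregate input and one output per province (row = DMU).
X = np.array([[4.0], [6.0], [9.0], [5.0], [8.0], [7.0], [6.5], [10.0]])  # inputs
Y = np.array([[2.0], [3.5], [4.0], [2.5], [5.0], [3.0], [4.2], [4.5]])   # outputs
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

def ccr_efficiency(o):
    """Input-oriented CCR envelopment model for DMU o:
    minimize theta  s.t.  X'lam <= theta * x_o,  Y'lam >= y_o,  lam >= 0."""
    c = np.zeros(1 + n); c[0] = 1.0                 # decision vars: [theta, lam]
    A_ub = np.zeros((m + s, 1 + n)); b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[o]; A_ub[:m, 1:] = X.T         # X'lam - theta*x_o <= 0
    A_ub[m:, 1:] = -Y.T; b_ub[m:] = -Y[o]           # -Y'lam <= -y_o
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

for o in range(n):
    print(f"DMU {o + 1}: efficiency = {ccr_efficiency(o):.3f}")
```

Efficiency 1.0 marks a frontier DMU; the Malmquist index then compares such scores across years to separate efficiency change from frontier (technology) shift.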
Designing a ticket to ride with the Cognitive Work Analysis Design Toolkit.
Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Jenkins, Daniel P
2015-01-01
Cognitive work analysis has been applied in the design of numerous sociotechnical systems. The process used to translate analysis outputs into design concepts, however, is not always clear. Moreover, structured processes for translating the outputs of ergonomics methods into concrete designs are lacking. This paper introduces the Cognitive Work Analysis Design Toolkit (CWA-DT), a design approach which has been developed specifically to provide a structured means of incorporating cognitive work analysis outputs in design using design principles and values derived from sociotechnical systems theory. This paper outlines the CWA-DT and describes its application in a public transport ticketing design case study. Qualitative and quantitative evaluations of the process provide promising early evidence that the toolkit fulfils the evaluation criteria identified for its success, with opportunities for improvement also highlighted. The Cognitive Work Analysis Design Toolkit has been developed to provide ergonomics practitioners with a structured approach for translating the outputs of cognitive work analysis into design solutions. This paper demonstrates an application of the toolkit and provides evaluation findings.
Application of structured analysis to a telerobotic system
NASA Technical Reports Server (NTRS)
Dashman, Eric; Mclin, David; Harrison, F. W.; Soloway, Donald; Young, Steven
1990-01-01
The analysis and evaluation of a multiple arm telerobotic research and demonstration system developed by the NASA Intelligent Systems Research Laboratory (ISRL) is described. Structured analysis techniques were used to develop a detailed requirements model of an existing telerobotic testbed. Performance models generated during this process were used to further evaluate the total system. A commercial CASE tool called Teamwork was used to carry out the structured analysis and development of the functional requirements model. A structured analysis and design process using the ISRL telerobotic system as a model is described. Evaluation of this system focused on the identification of bottlenecks in this implementation. The results demonstrate that the use of structured methods and analysis tools can give useful performance information early in a design cycle. This information can be used to ensure that the proposed system meets its design requirements before it is built.
A mathematical study of a random process proposed as an atmospheric turbulence model
NASA Technical Reports Server (NTRS)
Sidwell, K.
1977-01-01
A random process is formed as the product of a local Gaussian process and a random amplitude process, summed with an independent mean-value process. The mathematical properties of the resulting process are developed, including the first- and second-order properties and the characteristic function of general order. An approximate method for the analysis of the response of linear dynamic systems to the process is developed. The transition properties of the process are also examined.
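The stated structure, a local Gaussian process scaled by a random amplitude plus an independent mean-value process, is easy to simulate. A sketch under assumed correlation structures; the recursive filter and its constants are arbitrary choices of this sketch, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10_000

def lowpass(w, alpha):
    """Crude first-order recursive filter to induce local correlation."""
    out = np.empty_like(w)
    out[0] = w[0]
    for k in range(1, len(w)):
        out[k] = alpha * out[k - 1] + (1 - alpha) * w[k]
    return out

g = lowpass(rng.standard_normal(N), 0.95)            # local Gaussian process
a = np.abs(lowpass(rng.standard_normal(N), 0.999))   # slowly varying amplitude
m = lowpass(rng.standard_normal(N), 0.9995)          # independent mean process

x = a * g + m  # the modeled process

# The amplitude modulation makes a*g non-Gaussian (positive excess kurtosis):
y = a * g
print("excess kurtosis:", np.mean((y - y.mean()) ** 4) / y.var() ** 2 - 3)
```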
Measuring the software process and product: Lessons learned in the SEL
NASA Technical Reports Server (NTRS)
Basili, V. R.
1985-01-01
The software development process and product can and should be measured. The software measurement process at the Software Engineering Laboratory (SEL) has taught a major lesson: develop a goal-driven paradigm (also characterized as a goal/question/metric paradigm) for data collection. Project analysis under this paradigm leads to a design for evaluating and improving the methodology of software development and maintenance.
An expert system for integrated structural analysis and design optimization for aerospace structures
NASA Technical Reports Server (NTRS)
1992-01-01
The results of a research study on the development of an expert system for integrated structural analysis and design optimization are presented. An Object Representation Language (ORL) was developed first in conjunction with a rule-based system. This ORL/AI shell was then used to develop expert systems to provide assistance with a variety of structural analysis and design optimization tasks, in conjunction with procedural modules for finite element structural analysis and design optimization. The main goal of the research study was to provide expertise, judgment, and reasoning capabilities in the aerospace structural design process. This will allow engineers performing structural analysis and design, even without extensive experience in the field, to develop error-free, efficient and reliable structural designs very rapidly and cost-effectively. This would not only improve the productivity of design engineers and analysts, but also significantly reduce time to completion of structural design. An extensive literature survey in the field of structural analysis, design optimization, artificial intelligence, and database management systems and their application to the structural design process was first performed. A feasibility study was then performed, and the architecture and the conceptual design for the integrated 'intelligent' structural analysis and design optimization software was then developed. An Object Representation Language (ORL), in conjunction with a rule-based system, was then developed using C++. Such an approach would improve the expressiveness for knowledge representation (especially for structural analysis and design applications), provide the ability to build very large and practical expert systems, and provide an efficient way of storing knowledge. Functional specifications for the expert systems were then developed. The ORL/AI shell was then used to develop a variety of modules of expert systems for a variety of modeling, finite element analysis, and design optimization tasks in the integrated aerospace structural design process. These expert systems were developed to work in conjunction with procedural finite element structural analysis and design optimization modules (developed in-house at SAT, Inc.). The complete software, AutoDesign, so developed, can be used for integrated 'intelligent' structural analysis and design optimization. The software was beta-tested at a variety of companies, used by a range of engineers with different levels of background and expertise. Based on the feedback obtained from such users, conclusions were developed and are provided.
WISE: Automated support for software project management and measurement. M.S. Thesis
NASA Technical Reports Server (NTRS)
Ramakrishnan, Sudhakar
1995-01-01
One important aspect of software development and IV&V is measurement. Unless a software development effort is measured in some way, it is difficult to judge the effectiveness of current efforts and predict future performance. Collection of metrics and adherence to a process are difficult tasks in a software project. Change activity is a powerful indicator of project status. Automated systems that can handle change requests, issues, and other process documents provide an excellent platform for tracking the status of the project. A World Wide Web based architecture is developed for (a) making metrics collection an implicit part of the software process, (b) providing metric analysis dynamically, (c) supporting automated tools that can complement current practices of in-process improvement, and (d) overcoming geographical barriers. An operational system (WISE) instantiates this architecture, allowing for the improvement of the software process in a realistic environment. The tool tracks issues in the software development process, provides informal communication between users with different roles, supports to-do lists (TDL), and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis, and captures software change data. Automated tools like WISE focus on understanding and managing the software process. The goal is improvement through measurement.
The Dispositions for Culturally Responsive Pedagogy Scale
ERIC Educational Resources Information Center
Whitaker, Manya C.; Valtierra, Kristina Marie
2018-01-01
Purpose: The purpose of this study is to develop and validate the dispositions for culturally responsive pedagogy scale (DCRPS). Design/methodology/approach: Scale development consisted of a six-step process including item development, expert review, exploratory factor analysis, factor interpretation, confirmatory factor analysis and convergent…
Situation-specific theories from the middle-range transitions theory.
Im, Eun-Ok
2014-01-01
The purpose of this article was to analyze the theory development process of the situation-specific theories that were derived from the middle-range transitions theory. This analysis aims to provide directions for future development of situation-specific theories. First, transitions theory is concisely described with its history, goal, and major concepts. Then, the approach that was used to retrieve the situation-specific theories derived from transitions theory is described. Next, an analysis of 6 situation-specific theories is presented. Finally, 4 themes reflecting commonalities and variances in the theory development process are discussed with implications for future theoretical development.
[Development of Hospital Equipment Maintenance Information System].
Zhou, Zhixin
2015-11-01
Hospital equipment maintenance information systems play an important role in improving medical treatment quality and efficiency. Based on a requirements analysis of hospital equipment maintenance, the system function diagram is drawn. From an analysis of the input and output data, tables, and reports connected with the equipment maintenance process, the relationships between entities and attributes are identified, the E-R diagram is drawn, and the relational database tables are established. The software development meets the actual process requirements of maintenance and has a friendly user interface and flexible operation. The software can analyze failure causes by statistical analysis.
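A minimal sketch of how such an E-R design might map onto relational tables, with the failure-cause statistics reduced to a GROUP BY query; all table and column names here are assumptions of the sketch, not the paper's schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE equipment (
    id INTEGER PRIMARY KEY, name TEXT, department TEXT);
CREATE TABLE maintenance_record (
    id INTEGER PRIMARY KEY,
    equipment_id INTEGER REFERENCES equipment(id),
    reported DATE, repaired DATE, failure_cause TEXT, cost REAL);
""")
con.execute("INSERT INTO equipment VALUES (1, 'Infusion pump', 'ICU')")
con.execute("""INSERT INTO maintenance_record
               VALUES (1, 1, '2015-03-02', '2015-03-04', 'power supply', 120.0)""")

# Failure-cause frequency and cost summary for the statistical analysis step.
for cause, cnt, total in con.execute(
        """SELECT failure_cause, COUNT(*), SUM(cost)
           FROM maintenance_record
           GROUP BY failure_cause ORDER BY COUNT(*) DESC"""):
    print(cause, cnt, total)
```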
Hailey, P A; Doherty, P; Tapsell, P; Oliver, T; Aldridge, P K
1996-03-01
An automated system for the on-line monitoring of powder blending processes is described. The system employs near-infrared (NIR) spectroscopy using fibre-optics and a graphical user interface (GUI) developed in the LabVIEW environment. The complete supervisory control and data analysis (SCADA) software controls blender and spectrophotometer operation and performs statistical spectral data analysis in real time. A data analysis routine using standard deviation is described to demonstrate an approach to the real-time determination of blend homogeneity.
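One common way to turn successive NIR spectra into a real-time homogeneity statistic is a moving-block standard deviation that flattens out as the blend homogenizes. A sketch under that assumption; the block size, threshold, and simulated convergence model are illustrative choices of this sketch, not the paper's routine:

```python
import numpy as np

def blend_homogeneity(spectra, block=5, threshold=1e-3):
    """spectra: (n_timepoints, n_wavelengths) array of NIR absorbance.
    Returns the index at which the moving-block standard deviation,
    averaged across wavelengths, first falls below the threshold."""
    n = len(spectra)
    for i in range(block, n + 1):
        sd = spectra[i - block:i].std(axis=0).mean()
        if sd < threshold:
            return i - 1  # homogeneous: the recent spectra are stable
    return None

# Simulated run: spectra converge as the powder blend homogenizes.
rng = np.random.default_rng(2)
target = rng.random(256)
spectra = np.array([target + rng.normal(0, 0.05 * np.exp(-0.1 * t), 256)
                    for t in range(100)])
print("blend judged homogeneous at spectrum:", blend_homogeneity(spectra))
```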
Understanding healing: a conceptual analysis.
Wendler, M C
1996-10-01
The practice of the healing arts has been a part of human history since ancient times. Despite the development of related scholarly concepts in nursing such as caring, healing remains an enigma. Using conceptual analysis a clear definition of healing within a Rogerian/Newmanian framework is explicated. Case development assists in the understanding of healing as a concept, and questions arising from this definition provide focus for further scholarly work. A result of this process of concept analysis was the development of a definition of healing which is clear and which fits the theoretical underpinnings of the unitary-transformative paradigm. Healing, as a core variable of interest in the study of health, provides important parameters for study. The definition of healing which arose from the concept analysis is: Healing is an experiential, energy-requiring process in which space is created through a caring relationship in a process of expanding consciousness and results in a sense of wholeness, integration, balance and transformation and which can never be fully known.
Sentiment analysis of Arabic tweets using text mining techniques
NASA Astrophysics Data System (ADS)
Al-Horaibi, Lamia; Khan, Muhammad Badruddin
2016-07-01
Sentiment analysis has become a flourishing field of text mining and natural language processing. Sentiment analysis aims to determine whether a text is written to express positive, negative, or neutral emotions about a certain domain. Most sentiment analysis researchers focus on English texts, with very limited resources available for other complex languages, such as Arabic. In this study, the target was to develop an initial model that performs satisfactorily and measures Arabic Twitter sentiment using a machine learning approach, with Naïve Bayes and Decision Tree as the classification algorithms. The dataset used contains more than 2,000 Arabic tweets collected from Twitter. We performed several experiments to check the performance of the two classifiers using different combinations of text-processing functions. We found that the available facilities for Arabic text processing need to be built from scratch or improved to develop accurate classifiers. The small functionalities we developed in a Python environment helped improve the results and showed that sentiment analysis in the Arabic domain needs a lot of work on the lexicon side.
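The classification setup described, bag-of-words features feeding Naïve Bayes and Decision Tree classifiers, can be sketched in a few lines with scikit-learn. The six tweets and labels below are invented placeholders, and the TF-IDF weighting and absence of Arabic-specific normalization or stemming are simplifications of this sketch:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

# Tiny illustrative corpus; the study used ~2,000 labeled Arabic tweets.
tweets = ["الخدمة ممتازة", "تجربة سيئة جدا", "منتج رائع",
          "خدمة بطيئة ومزعجة", "أحببت التطبيق", "لن أكرر الشراء"]
labels = [1, 0, 1, 0, 1, 0]  # 1 = positive, 0 = negative

for clf in (MultinomialNB(), DecisionTreeClassifier(random_state=0)):
    pipe = make_pipeline(TfidfVectorizer(), clf)
    scores = cross_val_score(pipe, tweets, labels, cv=3)
    print(type(clf).__name__, "mean CV accuracy:", scores.mean())
```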
Educational Technology in Military Training Applications: A Current Assessment.
ERIC Educational Resources Information Center
Platt, William A.; Andrews, Dee H.
This chapter considers the history of instructional development (ID) in the military, with particular emphasis on the U.S. Navy. The ID process used at the Navy's Instructional Program Development Centers is presented, including the process for simulator development. An in-depth analysis of the problems encountered with educational technology…
NASA Astrophysics Data System (ADS)
Magomedova, D. K.; Efimov, M. A.; Murashkin, M. Yu.
2018-05-01
The main purpose of this work was the development of an experimental technique for detecting and analyzing pore formation in the material presented. The sample geometry, the experimental procedure, and the processing of the samples for investigation were developed.
ISO 9000 and/or Systems Engineering Capability Maturity Model?
NASA Technical Reports Server (NTRS)
Gholston, Sampson E.
2002-01-01
For businesses and organizations to remain competitive today, they must have processes and systems in place that allow them first to identify customer needs and then to develop products and processes that will meet or exceed the customer's needs and expectations. Customer needs, once identified, are normally stated as requirements. Designers can then develop products/processes that will meet these requirements. Several functions, such as quality management and systems engineering management, are used to assist product development teams in the development process. Both functions exist in all organizations and both have a similar objective, which is to ensure that developed processes will meet customer requirements. Are efforts in these organizations being duplicated? Are both functions needed by organizations? What are the similarities and differences between them? ISO 9000 is an international standard for goods and services. It sets broad requirements for the assurance of quality and for management's involvement. It requires organizations to document their processes and to follow these documented processes. ISO 9000 gives customers assurance that suppliers have control of the process for product development. Systems engineering can broadly be defined as a discipline that seeks to ensure that all requirements for a system are satisfied throughout the life of the system by preserving their interrelationship. The key activities of systems engineering include requirements analysis, functional analysis/allocation, design synthesis and verification, and system analysis and control. The systems engineering process, when followed properly, will lead to higher quality products, lower cost products, and shorter development cycles. The Systems Engineering Capability Maturity Model (SE-CMM) allows companies to measure their systems engineering capability and continuously improve those capabilities. ISO 9000 and the SE-CMM have a similar objective: to document the organization's processes and certify to potential customers the capability of a supplier to control the processes that determine the quality of the product or services being produced. The remaining sections of this report examine the differences and similarities between ISO 9000 and the SE-CMM and make recommendations for implementation.
ERIC Educational Resources Information Center
Duffy, Melissa C.; Azevedo, Roger; Sun, Ning-Zi; Griscom, Sophia E.; Stead, Victoria; Crelinsten, Linda; Wiseman, Jeffrey; Maniatis, Thomas; Lachapelle, Kevin
2015-01-01
This study examined the nature of cognitive, metacognitive, and affective processes among a medical team experiencing difficulty managing a challenging simulated medical emergency case by conducting in-depth analysis of process data. Medical residents participated in a simulation exercise designed to help trainees to develop medical expertise,…
Scarp development in the Valles Marineris
NASA Technical Reports Server (NTRS)
Patton, P. C.
1984-01-01
The scarps along the margins of the Valles Marineris display a complex assemblage of forms that have been related to a variety of mass wasting and sapping processes. These scarp segments display variations in the degree of development of spur-and-gully topography, the number and density of apparent sapping features, and the frequency of large-scale landslides, which reflect the age, geology, and processes of slope development throughout the Valles Marineris. This regional analysis should provide more information on the geologic evolution of the Valles Marineris as well as new insight into the relative importance of different processes in the development of the scarp forms. In order to evaluate the regional variation in scarp form and the influence of time and structure on scarp development, geomorphic mapping and morphometric analysis of geologically distinct regions of the Valles Marineris is being undertaken.
microRNA expression profiling in fetal single ventricle malformation identified by deep sequencing.
Yu, Zhang-Bin; Han, Shu-Ping; Bai, Yun-Fei; Zhu, Chun; Pan, Ya; Guo, Xi-Rong
2012-01-01
microRNAs (miRNAs) have emerged as key regulators in many biological processes, particularly cardiac growth and development, although the specific miRNA expression profile associated with this process remains to be elucidated. This study aimed to characterize the cellular microRNA profile involved in the development of congenital heart malformation, through the investigation of single ventricle (SV) defects. Comprehensive miRNA profiling in human fetal SV cardiac tissue was performed by deep sequencing. Differential expression of 48 miRNAs was revealed by sequencing by oligonucleotide ligation and detection (SOLiD) analysis. Of these, 38 were down-regulated and 10 were up-regulated in differentiated SV cardiac tissue, compared to control cardiac tissue. This was confirmed by real-time quantitative reverse transcription-polymerase chain reaction (qRT-PCR) analysis. Predicted target genes of the 48 differentially expressed miRNAs were analyzed by gene ontology and categorized according to cellular process, regulation of biological process and metabolic process. Pathway-Express analysis identified the WNT and mTOR signaling pathways as the most significant processes putatively affected by the differential expression of these miRNAs. The candidate genes involved in cardiac development were identified as potential targets for these differentially expressed microRNAs and the collaborative network of microRNAs and cardiac development related-mRNAs was constructed. These data provide the basis for future investigation of the mechanism of the occurrence and development of fetal SV malformations.
Heat and Mass Transfer Processes in Scrubber of Flue Gas Heat Recovery Device
NASA Astrophysics Data System (ADS)
Veidenbergs, Ivars; Blumberga, Dagnija; Vigants, Edgars; Kozuhars, Grigorijs
2010-01-01
The paper deals with the heat and mass transfer process research in a flue gas heat recovery device, where complicated cooling, evaporation and condensation processes are taking place simultaneously. The analogy between heat and mass transfer is used during the process of analysis. In order to prepare a detailed process analysis based on heat and mass process descriptive equations, as well as the correlation for wet gas parameter calculation, software in the
The Developmental Process of the Growing Motile Ciliary Tip Region.
Reynolds, Matthew J; Phetruen, Tanaporn; Fisher, Rebecca L; Chen, Ke; Pentecost, Brian T; Gomez, George; Ounjai, Puey; Sui, Haixin
2018-05-22
Eukaryotic motile cilia/flagella play vital roles in various physiological processes in mammals and some protists. Defects in cilia formation underlie multiple human disorders, known as ciliopathies. The detailed processes of cilia growth and development are still far from clear despite extensive studies. In this study, we characterized the process of cilium formation (ciliogenesis) by investigating the newly developed motile cilia of deciliated protists using complementary techniques in electron microscopy and image analysis. Our results demonstrated that the distal tip region of motile cilia exhibit progressive morphological changes as cilia develop. This developmental process is time-dependent and continues after growing cilia reach their full lengths. The structural analysis of growing ciliary tips revealed that B-tubules of axonemal microtubule doublets terminate far away from the tip end, which is led by the flagellar tip complex (FTC), demonstrating that the FTC might not directly mediate the fast turnover of intraflagellar transport (IFT).
[Mathematic analysis of risk factors influence on occupational respiratory diseases development].
Budkar', L N; Bugaeva, I V; Obukhova, T Iu; Tereshina, L G; Karpova, E A; Shmonina, O G
2010-01-01
Analysis covered 1348 case histories of workers exposed to industrial dust in the Urals region. The analysis applied mathematical processing based on survival theory and correlation analysis. The authors studied the influence of various factors (dust concentration, connective tissue dysplasia, smoking habits) on the time required for dust-induced diseases to appear. The findings are that occupational diseases develop significantly faster with higher ambient dust concentrations and with connective tissue dysplasia syndrome. Smoking habits do not alter the time course of pneumoconiosis development, but significantly accelerate the development of occupational dust bronchitis.
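Survival-theory analyses of this kind typically rest on the Kaplan-Meier product-limit estimator, which is compact enough to hand-roll. A sketch with small invented datasets standing in for the two exposure groups; the durations and event flags are illustrative only, not the study's data:

```python
import numpy as np

def kaplan_meier(durations, observed):
    """Product-limit survival estimate. durations: years of dust exposure
    until disease onset (or censoring); observed: 1 if disease developed."""
    durations, observed = np.asarray(durations), np.asarray(observed)
    times = np.sort(np.unique(durations[observed == 1]))
    surv, s = [], 1.0
    for t in times:
        at_risk = np.sum(durations >= t)
        events = np.sum((durations == t) & (observed == 1))
        s *= 1.0 - events / at_risk
        surv.append((float(t), round(s, 3)))
    return surv

# Illustrative toy values: years of exposure to onset, by dysplasia status.
with_dysplasia = kaplan_meier([8, 10, 12, 12, 15, 18], [1, 1, 1, 0, 1, 1])
without = kaplan_meier([14, 17, 20, 22, 25, 25], [1, 1, 1, 1, 0, 1])
print("S(t) with dysplasia:   ", with_dysplasia)
print("S(t) without dysplasia:", without)
```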
Philippine Wind Farm Analysis and Site Selection Analysis, 1 January 2000 - 31 December 2000
DOE Office of Scientific and Technical Information (OSTI.GOV)
Conover, K.
2001-12-01
The U.S. Department of Energy (DOE), through the National Renewable Energy Laboratory (NREL), has been working in partnership with the U.S. Agency for International Development (USAID) in an ongoing process to quantify the Philippine wind energy potential and foster wind farm development. As part of that process, NREL retained Global Energy Concepts, LLC (GEC) to review and update the policy needs as well as develop a site-screening process applicable for the Philippines. GEC worked closely with the Philippines National Power Corporation (NPC) in completing this work. This report provides the results of the policy needs and site selection analyses conducted by GEC.
A prototype for communitising technology: Development of a smart salt water desalination device
NASA Astrophysics Data System (ADS)
Fakharuddin, F. M.; Fatchurrohman, N.; Puteh, S.; Puteri, H. M. A. R.
2018-04-01
Desalination is defined as the process that removes minerals from saline water, commonly known as salt water. Seawater desalination is becoming an attractive source of drinking water in coastal states as the cost of desalination declines. The purpose of this study is to develop a small-scale desalination device and to analyze its process flow using suitable sensors. Thermal technology was used to aid the desalination process. A graphical user interface (GUI) was built to enable real-time data analysis of the desalination device, and an Arduino microcontroller was used to make the device automatic.
Formal Analysis of BPMN Models Using Event-B
NASA Astrophysics Data System (ADS)
Bryans, Jeremy W.; Wei, Wei
The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high level properties can be verified and design errors can be revealed in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de-facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform which offers a range of simulation and verification technologies.
San Diego's Capital Planning Process
ERIC Educational Resources Information Center
Lytton, Michael
2009-01-01
This article describes San Diego's capital planning process. As part of its capital planning process, the San Diego Unified School District has developed a systematic analysis of functional quality at each of its school sites. The advantage of this approach is that it seeks to develop and apply quantifiable metrics and standards for the more…
Model for Simulating a Spiral Software-Development Process
NASA Technical Reports Server (NTRS)
Mizell, Carolyn; Curley, Charles; Nayak, Umanath
2010-01-01
A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code), productivity (number of lines of code per hour), and number of defects per source line of code. The user provides the number of resources, the overall percent of effort that should be allocated to each process step, and the number of desired staff members for each step. The output of PATT includes the size of the product, a measure of effort, a measure of rework effort, the duration of the entire process, and the numbers of injected, detected, and corrected defects as well as a number of other interesting features. In the development of the present model, steps were added to the IEEE 12207 waterfall process, and this model and its implementing software were made to run repeatedly through the sequence of steps, each repetition representing an iteration in a spiral process. Because the IEEE 12207 model is founded on a waterfall paradigm, it enables direct comparison of spiral and waterfall processes. The model can be used throughout a software-development project to analyze the project as more information becomes available. For instance, data from early iterations can be used as inputs to the model, and the model can be used to estimate the time and cost of carrying the project to completion.
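The cost/quality trade the model exposes can be illustrated with a drastically simplified iteration loop. The sketch below is not the PATT model; every rate in it is a placeholder, and it only captures the intuition that more iterations buy earlier defect detection at the price of repeated test-and-rework effort:

```python
# Toy model contrasting one waterfall pass with a spiral of k iterations.
SIZE = 50_000          # total lines of code
PRODUCTIVITY = 25      # lines of code per hour
INJECT = 0.02          # defects injected per line
DETECT = 0.7           # fraction of open defects found per test phase
REWORK_HRS = 2.0       # hours to fix one detected defect

def run(iterations):
    effort, escaped = 0.0, 0.0
    per_iter = SIZE / iterations
    for _ in range(iterations):
        effort += per_iter / PRODUCTIVITY      # build this increment
        open_defects = escaped + per_iter * INJECT
        found = DETECT * open_defects          # test at the end of the loop
        effort += found * REWORK_HRS           # rework what was found
        escaped = open_defects - found         # carried into the next loop
    return effort, escaped

for k in (1, 4, 8):    # k = 1 approximates a single waterfall pass
    e, d = run(k)
    print(f"{k} iteration(s): {e:8.0f} h effort, {d:6.0f} escaped defects")
```

Even this toy reproduces the qualitative claim in the abstract: the spiral variants spend more total effort but deliver with fewer escaped defects.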
Cognitive/Information Processing Psychology and Instruction: Reviewing Recent Theory and Practice.
ERIC Educational Resources Information Center
Gallagher, John P.
1979-01-01
Discusses recent developments in instructional psychology relative to cognitive task analysis, individual difference variables, and cognitive models of interactive instructional decision making, which use constructs developed within the field of cognitive/information processing psychology. (Author/WBC)
Dynamic analysis of process reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shadle, L.J.; Lawson, L.O.; Noel, S.D.
1995-06-01
The approach and methodology of conducting a dynamic analysis is presented in this poster session in order to describe how this type of analysis can be used to evaluate the operation and control of process reactors. Dynamic analysis of the PyGas™ gasification process is used to illustrate the utility of this approach. PyGas™ is the gasifier being developed for the Gasification Product Improvement Facility (GPIF) by Jacobs-Siffine Engineering and Riley Stoker. In the first step of the analysis, process models are used to calculate the steady-state conditions and associated sensitivities for the process. For the PyGas™ gasifier, the process models are non-linear mechanistic models of the jetting fluidized-bed pyrolyzer and the fixed-bed gasifier. These process sensitivities are key input, in the form of gain parameters or transfer functions, to the dynamic engineering models.
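Steady-state sensitivities entering a dynamic model as gains and transfer functions can be exercised directly. A sketch using scipy.signal for one hypothetical gasifier input-output channel; the gain, time constant, and dead time are invented values, and the Pade approximation of the delay is a choice of this sketch:

```python
import numpy as np
from scipy import signal

# First-order-plus-dead-time channel: gain K, time constant tau, delay theta.
K, tau, theta = 1.8, 120.0, 30.0   # illustrative values, in s where relevant

# Pade(1,1) approximation of the dead time keeps the model rational:
# e^(-theta*s) ~ (1 - theta*s/2) / (1 + theta*s/2).
num = [K * (-theta / 2), K]
den = np.polymul([tau, 1.0], [theta / 2, 1.0])
sys = signal.TransferFunction(num, den)

t, y = signal.step(sys, T=np.linspace(0, 600, 601))
print("step response at t=300 s:", round(float(np.interp(300, t, y)), 3),
      " settling toward gain", K)
```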
Understanding Online Teacher Best Practices: A Thematic Analysis to Improve Learning
ERIC Educational Resources Information Center
Corry, Michael; Ianacone, Robert; Stella, Julie
2014-01-01
The purpose of this study was to examine brick-and-mortar and online teacher best practice themes using thematic analysis and a newly developed theory-based analytic process entitled Synthesized Thematic Analysis Criteria (STAC). The STAC was developed to facilitate the meaningful thematic analysis of research based best practices of K-12…
A review of risk management process in construction projects of developing countries
NASA Astrophysics Data System (ADS)
Bahamid, R. A.; Doh, S. I.
2017-11-01
In the construction industry, risk management is still a relatively unfamiliar technique. The systematic approach to risk management in construction comprises three main stages: a) risk identification; b) risk analysis and evaluation; and c) risk response. The high risk inherent in the construction business affects each of its participants, and the operational analysis and management of construction-related risks remain an enormous task for practitioners of the industry. This paper reviews the existing literature on construction project risk management in developing countries, focusing specifically on the risk management process. The literature lacks a comprehensive risk management process approach capable of capturing the impact of risk on diverse project objectives. This review aims to discover the techniques most frequently used for risk identification and analysis, to clarify the different classifications of risk sources in the existing literature of developing countries, and to identify future research directions on project risk in the construction sector of developing countries.
Main Engine Prototype Development for 2nd Generation RLV RS-83
NASA Technical Reports Server (NTRS)
Vilja, John; Fisher, Mark; Lyles, Garry M. (Technical Monitor)
2002-01-01
This presentation reports on the NASA project to develop a prototype for RS-83 engine designed for use on reusable launch vehicles (RLV). Topics covered include: program objectives, overview schedule, organizational chart, integrated systems engineering processes, requirement analysis, catastrophic engine loss, maintainability analysis tools, and prototype design analysis.
A Narrative Analysis of a Teacher Educator's Professional Learning Journey
ERIC Educational Resources Information Center
Vanassche, Eline; Kelchtermans, Geert
2016-01-01
This article reports on a narrative analysis of one teacher educator's learning journey in a two-year professional development project. Professional development is conceived of as the complex learning processes resulting from the meaningful interactions between the individual teacher educator and his/her working context. Our analysis indicates…
Carroll, Adam J; Badger, Murray R; Harvey Millar, A
2010-07-14
Standardization of analytical approaches and reporting methods via community-wide collaboration can work synergistically with web-tool development to result in rapid community-driven expansion of online data repositories suitable for data mining and meta-analysis. In metabolomics, the inter-laboratory reproducibility of gas chromatography/mass spectrometry (GC/MS) makes it an obvious target for such development. While a number of web-tools offer access to datasets and/or tools for raw data processing and statistical analysis, none of these systems is currently set up to act as a public repository by easily accepting, processing and presenting publicly submitted GC/MS metabolomics datasets for public re-analysis. Here, we present MetabolomeExpress, a new File Transfer Protocol (FTP) server and web-tool for the online storage, processing, visualisation and statistical re-analysis of publicly submitted GC/MS metabolomics datasets. Users may search a quality-controlled database of metabolite response statistics from publicly submitted datasets by a number of parameters (e.g., metabolite, species, organ/biofluid). Users may also perform meta-analysis comparisons of multiple independent experiments or re-analyse public primary datasets via user-friendly tools for t-test, principal components analysis, hierarchical cluster analysis and correlation analysis. They may interact with chromatograms, mass spectra and peak detection results via an integrated raw data viewer. Researchers who register for a free account may upload (via FTP) their own data to the server for online processing via a novel raw data processing pipeline. MetabolomeExpress https://www.metabolome-express.org provides a new opportunity for the general metabolomics community to transparently present online the raw and processed GC/MS data underlying their metabolomics publications. Transparent sharing of these data will allow researchers to assess data quality and draw their own insights from published metabolomics datasets.
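As a rough illustration of the kind of statistical re-analysis the site offers (this is not MetabolomeExpress code; the metabolite response matrix below is random placeholder data), a per-metabolite t-test followed by PCA might look like:

    import numpy as np
    from scipy import stats
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    control = rng.lognormal(0.0, 0.2, size=(6, 40))  # 6 replicates x 40 metabolites
    treated = rng.lognormal(0.3, 0.2, size=(6, 40))

    # Per-metabolite t-tests on log2-transformed responses
    t, p = stats.ttest_ind(np.log2(treated), np.log2(control), axis=0)
    print("metabolites with p < 0.05:", int((p < 0.05).sum()))

    # PCA over all samples to visualize treatment separation
    scores = PCA(n_components=2).fit_transform(np.vstack([control, treated]))
    print(scores[:3])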
NASA Technical Reports Server (NTRS)
1981-01-01
The engineering design, fabrication, assembly, operation, economic analysis, and process support research and development for an Experimental Process System Development Unit for producing semiconductor-grade silicon using the silane-to-silicon process are reported. The design activity was completed. About 95% of purchased equipment was received. The draft of the operations manual was about 50% complete and the design of the free-space system continued. A system using silicon powder transfer, melting, and shotting on a pseudocontinuous basis was demonstrated.
Janknegt, Robert; Scott, Mike; Mairs, Jill; Timoney, Mark; McElnay, James; Brenninkmeijer, Rob
2007-10-01
Drug selection should be a rational process that embraces the principles of evidence-based medicine. However, many factors may affect the choice of agent. It is against this background that the System of Objectified Judgement Analysis (SOJA) process for rational drug-selection was developed. This article describes how the information on which the SOJA process is based, was researched and processed.
NASA Astrophysics Data System (ADS)
Mehrpooya, Mehdi; Ansarinasab, Hojat; Moftakhari Sharifzadeh, Mohammad Mehdi; Rosen, Marc A.
2017-10-01
An integrated power plant with a net electrical power output of 3.71 × 10^5 kW is developed and investigated. The electrical efficiency of the process is found to be 60.1%. The process includes three main sub-systems: a molten carbonate fuel cell system, a heat recovery section, and a cryogenic carbon dioxide capture process. Conventional and advanced exergoeconomic methods are used for analyzing the process. Advanced exergoeconomic analysis is a comprehensive evaluation tool which combines an exergetic approach with economic analysis procedures. With this method, investment and exergy destruction costs of the process components are divided into endogenous/exogenous and avoidable/unavoidable parts. Results of the conventional exergoeconomic analyses demonstrate that the combustion chamber has the largest exergy destruction rate (182 MW) and cost rate (13,100 $/h). Also, the total process cost rate can be decreased by reducing the cost rate of the fuel cell and improving the efficiency of the combustion chamber and heat recovery steam generator. Based on the total avoidable endogenous cost rate, the priority for modification is the heat recovery steam generator, a compressor, and a turbine of the power plant, in rank order. A sensitivity analysis is done to investigate the exergoeconomic factor parameters through variations of the effective parameters.
On-Line GIS Analysis and Image Processing for Geoportal Kielce/poland Development
NASA Astrophysics Data System (ADS)
Hejmanowska, B.; Głowienka, E.; Florek-Paszkowski, R.
2016-06-01
GIS databases are widely available on the Internet, but mainly for visualization with limited functionality; only very simple queries are possible, i.e. attribute query, coordinate readout, line and area measurements, or pathfinding. Slightly more complex analyses (e.g. buffering or intersection) are rarely offered. This paper presents a concept for developing Geoportal functionality in the field of GIS analysis. Multi-Criteria Evaluation (MCE) is planned to be implemented in the web application. OGC services are used for data acquisition from the server and for visualization of results. Advanced GIS analysis is planned in PostGIS and Python programming. In the paper, an example of MCE analysis based on Geoportal Kielce is presented. Another field in which the Geoportal can be developed is the processing of newly available satellite images that are free of charge (Sentinel-2, Landsat 8, ASTER, WV-2). We are now witnessing a revolution in access to satellite imagery without charge. This should result in increased interest in the use of these data in various fields by a larger number of users, not necessarily specialists in remote sensing. Therefore, it seems reasonable to expand the functionality of Internet tools for data processing by non-specialists, by automating data collection and offering predefined analyses.
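A minimal sketch of the Multi-Criteria Evaluation idea, assuming a simple weighted linear combination of normalized criterion layers; the criteria, weights, and raster values are invented for illustration (the planned implementation uses PostGIS and Python):

    import numpy as np

    # Three normalized criterion layers (0..1), e.g. slope, land price, road access
    rng = np.random.default_rng(1)
    criteria = {name: rng.random((100, 100)) for name in ("slope", "price", "access")}
    weights = {"slope": 0.5, "price": 0.3, "access": 0.2}  # must sum to 1

    # Weighted linear combination yields a suitability score per raster cell
    suitability = sum(weights[k] * criteria[k] for k in criteria)
    best = np.unravel_index(np.argmax(suitability), suitability.shape)
    print("most suitable cell:", best, "score:", round(float(suitability[best]), 3))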
NASA Astrophysics Data System (ADS)
Filippov, A. V.; Tarasov, S. Yu; Podgornyh, O. A.; Shamarin, N. N.; Filippova, E. O.
2017-01-01
Automation of engineering processes requires the development of relevant mathematical support and computer software. Analysis of metal cutting kinematics and tool geometry is a necessary key task at the preproduction stage. This paper focuses on developing a procedure for determining the geometry of lathe machining with an oblique peakless round-nose tool, using vector/matrix transformations. Such an approach allows integration into modern mathematical software packages, in distinction to the traditional analytic description, and is very promising for developing automated control of the preproduction process. A kinematic criterion for applicable tool geometry has been developed from the results of this study. The effect of tool blade inclination and curvature on the geometry-dependent process parameters was evaluated.
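A small sketch of the vector/matrix style of description advocated here: a cutting-edge direction expressed in tool coordinates and carried into workpiece coordinates by rotation matrices. The angles and the transformation chain are illustrative, not the paper's actual derivation.

    import numpy as np

    def rot_x(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

    def rot_z(a):
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

    inclination = np.radians(10.0)  # blade inclination angle (illustrative)
    position = np.radians(25.0)     # position angle along the round nose (illustrative)

    # Transform the cutting-edge tangent from tool to workpiece coordinates
    edge_tangent_tool = np.array([0.0, 1.0, 0.0])
    edge_tangent_work = rot_z(position) @ rot_x(inclination) @ edge_tangent_tool
    print(edge_tangent_work)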
Correlation signatures of wet soils and snows. [algorithm development and computer programming
NASA Technical Reports Server (NTRS)
Phillips, M. R.
1972-01-01
Interpretation, analysis, and development of algorithms have provided the necessary computational programming tools for soil data processing, data handling, and analysis. The algorithms developed thus far are adequate and have proven successful for several preliminary and fundamental applications such as software interfacing capabilities, probability distributions, grey level print plotting, contour plotting, isometric data displays, joint probability distributions, boundary mapping, channel registration, and ground scene classification. A description of an Earth Resources Flight Data Processor (ERFDP), which handles and processes earth resources data under a user's control, is provided.
NASA Technical Reports Server (NTRS)
1975-01-01
A separation method to provide reasonable yields of high specificity isoenzymes for the purpose of large scale, early clinical diagnosis of diseases and organic damage such as myocardial infarction, hepatoma, muscular dystrophy, and infectious disorders is presented. Preliminary development plans are summarized. An analysis of required research and development and production resources is included. The costs of such resources and the potential profitability of a commercial space processing opportunity for electrophoretic separation of high specificity isoenzymes are reviewed.
Aviation System Analysis Capability Executive Assistant Development
NASA Technical Reports Server (NTRS)
Roberts, Eileen; Villani, James A.; Anderson, Kevin; Book, Paul
1999-01-01
In this technical document, we describe the development of the Aviation System Analysis Capability (ASAC) Executive Assistant (EA) Proof of Concept (POC) and Beta version. We describe the genesis and role of the ASAC system, discuss the objectives of the ASAC system and provide an overview of components and models in the ASAC system, and describe the design process and the results of the ASAC EA POC and Beta system development. We also describe the evaluation process and results for applicable COTS software. The document has seven chapters, a bibliography, and two appendices.
Streamlining project delivery through risk analysis.
DOT National Transportation Integrated Search
2015-08-01
Project delivery is a significant area of concern and is subject to several risks throughout the Plan Development Process (PDP). These risks are attributed to major areas of project development, such as environmental analysis, right-of-way (ROW) acqu...
Kim, Jung Woo; Sul, Sang Hun; Choi, Jae Boong
2018-06-07
In a hyper-connected society and IoT environment, markets are changing rapidly as smartphones penetrate the global market. As smartphones are applied to various digital media, development of novel smart products is required. In this paper, a Smart Product Design-Finite Element Analysis Process (SPD-FEAP) is developed to accommodate fast-changing trends and user requirements that can be visually verified. The user requirements are derived and quantitatively evaluated from Smart Quality Function Deployment (SQFD) using web data. Then the usage scenarios are created according to the priority of the functions derived from SQFD. 3D shape analysis was conducted by Finite Element Analysis (FEA) and the design was printed out through Rapid Prototyping (RP) technology to identify any possible errors. Thus, a User Customized Smart Keyboard has been developed using SPD-FEAP. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Mangalgiri, P. D.; Prabhakaran, R.
1986-01-01
An algorithm for vectorized computation of stiffness matrices of an 8-noded isoparametric hexahedron element for geometric nonlinear analysis was developed. This was used in conjunction with the earlier 2-D program GAMNAS to develop the new program NAS3D for geometric nonlinear analysis. A conventional, modified Newton-Raphson process is used for the nonlinear analysis. New schemes for the computation of stiffness and strain energy release rates are presented. The organization of the program is explained, and some results for four sample problems are given. A study of CPU times showed that savings by a factor of 11 to 13 were achieved when the stiffness computation was vectorized instead of performed with the conventional scalar approach. Finally, the data input scheme is explained.
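The flavor of the vectorized stiffness computation can be sketched in modern terms. The original work targeted vector hardware; this NumPy analogue with invented element data only illustrates batching the element integration K_e = sum_g w_g det(J_g) B_g^T D B_g over all elements at once instead of looping:

    import numpy as np

    n_elem, n_gauss, n_dof = 2000, 8, 24
    rng = np.random.default_rng(0)
    B = rng.random((n_elem, n_gauss, 6, n_dof))  # strain-displacement matrices
    D = np.eye(6)                                # material matrix (placeholder)
    w = np.full(n_gauss, 1.0)                    # Gauss weights (illustrative)
    detJ = rng.random((n_elem, n_gauss)) + 0.5   # Jacobian determinants

    # One einsum forms every element stiffness matrix simultaneously
    K = np.einsum("g,eg,egij,ik,egkl->ejl", w, detJ, B, D, B, optimize=True)
    print(K.shape)  # (2000, 24, 24): one stiffness matrix per element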
Developing a framework for transferring knowledge into action: a thematic analysis of the literature
Ward, Vicky; House, Allan; Hamer, Susan
2010-01-01
Objectives Although there is widespread agreement about the importance of transferring knowledge into action, we still lack high quality information about what works, in which settings and with whom. Whilst there are a large number of models and theories for knowledge transfer interventions, they remain untested, meaning that their applicability and relevance are largely unknown. This paper describes the development of a conceptual framework of translating knowledge into action and discusses how it can be used for developing a useful model of the knowledge transfer process. Methods A narrative review of the knowledge transfer literature identified 28 different models which explained all or part of the knowledge transfer process. The models were subjected to a thematic analysis to identify individual components and the types of processes used when transferring knowledge into action. The results were used to build a conceptual framework of the process. Results Five common components of the knowledge transfer process were identified: problem identification and communication; knowledge/research development and selection; analysis of context; knowledge transfer activities or interventions; and knowledge/research utilization. We also identified three types of knowledge transfer processes: a linear process; a cyclical process; and a dynamic multidirectional process. From these results a conceptual framework of knowledge transfer was developed. The framework illustrates the five common components of the knowledge transfer process and shows that they are connected via a complex, multidirectional set of interactions. As such the framework allows for the individual components to occur simultaneously or in any given order and to occur more than once during the knowledge transfer process. Conclusion Our framework provides a foundation for gathering evidence from case studies of knowledge transfer interventions. We propose that future empirical work be designed to test and refine the relative importance and applicability of each of the components in order to build more useful models of knowledge transfer which can serve as a practical checklist for planning or evaluating knowledge transfer activities. PMID:19541874
NASA Technical Reports Server (NTRS)
1980-01-01
The design, fabrication, and installation of an experimental process system development unit (EPSDU) were analyzed. Supporting research and development were performed to provide an information data base usable for the EPSDU and for the technological design and economic analysis of potential scale-up of the process. Iterative economic analyses were conducted of the estimated product cost for the production of semiconductor-grade silicon in a facility capable of producing 1000 MT/yr.
Analysis and Development of a Web-Enabled Planning and Scheduling Database Application
2013-09-01
establishes an entity-relationship diagram for the desired process, constructs an operable database using MySQL, and provides a web-enabled interface for the population of...
1994-03-25
metrics [DISA93b]. The Software Engineering Institute (SEI) has developed a domain analysis process (Feature-Oriented Domain Analysis - FODA) and is... and expresses the range of variability of these decisions. Feature-Oriented Domain Analysis (FODA) is a domain... documents created in this phase. From a purely profit-oriented business point of view, a company may develop its own analysis of a government or commercial
Gene network analysis: from heart development to cardiac therapy.
Ferrazzi, Fulvia; Bellazzi, Riccardo; Engel, Felix B
2015-03-01
Networks offer a flexible framework to represent and analyse the complex interactions between components of cellular systems. In particular, gene networks inferred from expression data can support the identification of novel hypotheses about regulatory processes. In this review we focus on the use of gene network analysis in the study of heart development. Understanding heart development will promote the elucidation of the aetiology of congenital heart disease and thus possibly improve diagnostics. Moreover, it will help to establish cardiac therapies. For example, understanding cardiac differentiation during development will help to guide stem cell differentiation required for cardiac tissue engineering or to enhance endogenous repair mechanisms. We introduce different methodological frameworks to infer networks from expression data such as Boolean and Bayesian networks. Then we present currently available temporal expression data in heart development and discuss the use of network-based approaches in published studies. Collectively, our literature-based analysis indicates that gene network analysis constitutes a promising opportunity to infer therapy-relevant regulatory processes in heart development. However, the use of network-based approaches has so far been limited by the small number of samples in available datasets. Thus, we propose to acquire high-resolution temporal expression data to improve the mathematical descriptions of regulatory processes obtained with gene network inference methodologies. Especially probabilistic methods that accommodate the intrinsic variability of biological systems have the potential to contribute to a deeper understanding of heart development.
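As a toy illustration of one inference framework mentioned above, a synchronous Boolean network can be simulated in a few lines. The gene names are real cardiac factors, but the regulatory logic here is invented for illustration:

    # Toy Boolean gene network: each gene's next state is a logic function of others
    rules = {
        "gata4": lambda s: s["nkx25"],                  # invented regulatory logic
        "nkx25": lambda s: s["gata4"] or s["tbx5"],
        "tbx5":  lambda s: not s["gata4"],
    }

    state = {"gata4": False, "nkx25": True, "tbx5": True}
    for step in range(5):  # synchronous updates: all genes change together
        state = {gene: bool(rule(state)) for gene, rule in rules.items()}
        print(step, state)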
Knowledge Representation Artifacts for Use in Sensemaking Support Systems
2015-03-12
and manual processing must be replaced by automated processing wherever it makes sense and is possible. Clearly, given the data and cognitive... knowledge-centric view of situation analysis and decision-making as previously discussed, has led to the development of several automated processing components... for use in sensemaking support systems [6-11]. In turn, automated processing has required the development of appropriate knowledge
NASA Technical Reports Server (NTRS)
Consoli, Robert David; Sobieszczanski-Sobieski, Jaroslaw
1990-01-01
Advanced multidisciplinary analysis and optimization methods, namely system sensitivity analysis and non-hierarchical system decomposition, are applied to reduce the cost and improve the visibility of an automated vehicle design synthesis process. This process is inherently complex due to the large number of functional disciplines and associated interdisciplinary couplings. Recent developments in system sensitivity analysis as applied to complex non-hierarchic multidisciplinary design optimization problems enable the decomposition of these complex interactions into sub-processes that can be evaluated in parallel. The application of these techniques results in significant cost, accuracy, and visibility benefits for the entire design synthesis process.
The Model of Career Anchors as a Tool in the Analysis of Instructional Developers.
ERIC Educational Resources Information Center
Miller, Carol
1981-01-01
Examines the importance of human systems as a relevant aspect of development processes and looks at the career anchor model proposed by Schein as a possible area in the analysis of the instructional developer/client relationships. Fourteen references are listed. (Author/LLS)
Mazza, Monica; Mariano, Melania; Peretti, Sara; Masedu, Francesco; Pino, Maria Chiara; Valenti, Marco
2017-05-01
Individuals with autism spectrum disorders (ASD) show significant impairments in social skills and theory of mind (ToM). The aim of this study was to evaluate ToM and social information processing abilities in 52 children with ASD compared to 55 typically developing (TD) children. A mediation analysis evaluated whether social information processing abilities can be mediated by ToM competences. In our results, children with autism showed a deficit in social skills and ToM components. The innovative results of our study applying mediation analysis demonstrate that ToM plays a key role in the development of social abilities, and the lack of ToM competences in children with autism impairs their competent social behavior.
Dai, Heng; Ye, Ming; Walker, Anthony P.; ...
2017-03-28
A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
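A schematic of the process sensitivity idea, combining model averaging over alternative process models with a variance decomposition; the two recharge models, their weights, and the system response below are toy stand-ins, not the study's groundwater models:

    import numpy as np

    rng = np.random.default_rng(3)
    N = 10_000

    # Two alternative recharge process models, each with its own random parameters
    recharge_models = [lambda n: rng.normal(100, 10, n),      # model R1
                       lambda n: rng.lognormal(4.6, 0.1, n)]  # model R2
    weights = [0.5, 0.5]                                      # model probabilities

    conductivity = rng.lognormal(1.6, 0.2, N)  # a second, parametric uncertainty
    outputs = [m(N) / conductivity for m in recharge_models]  # toy system response

    # Decompose output variance into within-model and between-model parts
    means = [o.mean() for o in outputs]
    grand_mean = np.dot(weights, means)
    within = sum(w * o.var() for w, o in zip(weights, outputs))
    between = sum(w * (mu - grand_mean) ** 2 for w, mu in zip(weights, means))

    # Fraction of output variance attributable to the choice of recharge model
    print("recharge process share:", between / (within + between))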
INTEGRATED ENVIRONMENTAL ASSESSMENT OF THE MID-ATLANTIC REGION WITH ANALYTICAL NETWORK PROCESS
A decision analysis method for integrating environmental indicators was developed. This was a combination of Principal Component Analysis (PCA) and the Analytic Network Process (ANP). Being able to take into account interdependency among variables, the method was capable of ran...
Determinants of job stress in chemical process industry: A factor analysis approach.
Menon, Balagopal G; Praveensal, C J; Madhu, G
2015-01-01
Job stress is one of the active research domains in industrial safety research. Job stress can result in accidents and health-related issues for workers in chemical process industries. Hence it is important to measure the level of job stress in workers so that it can be mitigated and safety-related problems in these industries avoided. The objective of this study is to determine the job stress factors in the chemical process industry in Kerala state, India. This study also aims to propose a comprehensive model and an instrument framework for measuring job stress levels in the chemical process industries in Kerala, India. The data were collected through a questionnaire survey conducted in chemical process industries in Kerala. The data from 1197 completed surveys were subjected to principal component and confirmatory factor analysis to develop the job stress factor structure. The factor analysis revealed 8 factors that influence job stress in process industries. It was also found that job stress in employees is most influenced by role ambiguity and least by work environment. The study developed an instrument framework for measuring job stress utilizing exploratory factor analysis and structural equation modeling.
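A minimal sketch of the exploratory factor analysis step, using random placeholder responses in place of the survey data and scikit-learn's FactorAnalysis (the study does not specify this particular toolchain):

    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(7)
    n_respondents, n_items = 1197, 24
    # Placeholder Likert-style responses (1..5) standing in for the questionnaire
    X = rng.integers(1, 6, size=(n_respondents, n_items)).astype(float)
    X -= X.mean(axis=0)  # center each item before factoring

    fa = FactorAnalysis(n_components=8, random_state=0).fit(X)
    loadings = fa.components_.T  # items x factors: inspect which items load where
    print(loadings.shape)        # (24, 8)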
Development of the nervous system occurs through a series of critical processes, each of which may be sensitive to disruption by environmental contaminants. In vitro culture of neurons can be used to model these processes and evaluate the potential of chemicals to act as develop...
Low-cost digital image processing at the University of Oklahoma
NASA Technical Reports Server (NTRS)
Harrington, J. A., Jr.
1981-01-01
Computer-assisted instruction in remote sensing at the University of Oklahoma involves two separate approaches and depends upon initial preprocessing of a LANDSAT computer compatible tape using software developed for an IBM 370/158 computer. In-house preprocessing algorithms permit students or researchers to select a subset of a LANDSAT scene for subsequent analysis using either general-purpose statistical packages or color graphic image processing software developed for Apple II microcomputers. Procedures for preprocessing the data and for image analysis using either of the two approaches to low-cost LANDSAT data processing are described.
NASA Astrophysics Data System (ADS)
Saldan, Yosyp R.; Pavlov, Sergii V.; Vovkotrub, Dina V.; Saldan, Yulia Y.; Vassilenko, Valentina B.; Mazur, Nadia I.; Nikolaichuk, Daria V.; Wójcik, Waldemar; Romaniuk, Ryszard; Suleimenov, Batyrbek; Bainazarov, Ulan
2017-08-01
The process of obtaining eye tomograms by means of optical coherence tomography is studied. Stages of idiopathic macular hole formation in the process of eye fundus diagnostics are considered, and the main stages of retinal pathology progression are determined. Fuzzy logic units for obtaining reliable conclusions regarding the diagnostic result are developed. Based on the results of theoretical and practical research, a system and technique for analyzing the state of the retinal macular region of the eye is developed; application of the system, based on a fuzzy logic device, improves the efficiency of complex retinal examination.
Meta-analysis using Dirichlet process.
Muthukumarana, Saman; Tiwari, Ram C
2016-02-01
This article develops a Bayesian approach for meta-analysis using the Dirichlet process. The key aspect of the Dirichlet process in meta-analysis is the ability to assess evidence of statistical heterogeneity, or variation in the underlying effects across studies, while relaxing the distributional assumptions. We assume that the study effects are generated from a Dirichlet process. Under a Dirichlet process model, the study effect parameters have support on a discrete space and enable borrowing of information across studies while facilitating clustering among studies. We illustrate the proposed method by applying it to a dataset on the Program for International Student Assessment covering 30 countries. Results from the data analysis, simulation studies, and the log pseudo-marginal likelihood model selection procedure indicate that the Dirichlet process model performs better than conventional alternative methods. © The Author(s) 2012.
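The clustering behavior that makes the Dirichlet process attractive for meta-analysis can be sketched with a truncated stick-breaking construction; this is a generic prior simulation, not the authors' posterior computation:

    import numpy as np

    rng = np.random.default_rng(11)

    def stick_breaking(alpha, n_atoms):
        """Truncated stick-breaking weights for a Dirichlet process."""
        betas = rng.beta(1, alpha, n_atoms)
        remaining = np.concatenate([[1.0], np.cumprod(1 - betas[:-1])])
        return betas * remaining

    # Base measure G0: candidate study effects ~ Normal(0, 1)
    atoms = rng.normal(0, 1, 50)
    weights = stick_breaking(alpha=1.0, n_atoms=50)  # alpha controls clustering

    # Draw effects for 30 studies; ties mean studies share an effect cluster
    study_effects = rng.choice(atoms, size=30, p=weights / weights.sum())
    print(len(np.unique(study_effects)), "distinct effect clusters among 30 studies")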
ERIC Educational Resources Information Center
Galliher, Renee V.; Kerpelman, Jennifer L.
2012-01-01
This analysis of the papers in the special section on the intersection of identity development and peer relationship processes calls attention to the conceptual contribution this collection of papers makes to the literature on identity development. Together these ten papers build on strong theoretical foundations in identity development, which posit…
ERIC Educational Resources Information Center
Nordtveit, Bjorn Harald
2010-01-01
Development is often understood as a linear process of change towards Western modernity, a vision that is challenged by this paper, arguing that development efforts should rather be connected to the local stakeholders' sense of their own development. Further, the paper contends that Complexity Theory is more effective than a linear theory of…
NASA Technical Reports Server (NTRS)
1980-01-01
Technical activities are reported in the design of processes, facilities, and equipment for producing silicon at a rate and price commensurate with production goals for low-cost solar cell modules. The silane-to-silicon process has the potential to provide high-purity polysilicon on a commercial scale at a price of fourteen dollars per kilogram by 1986 (1980 dollars). Commercial process, economic analysis, process support research and development, and quality control are discussed.
Research on the EDM Technology for Micro-holes at Complex Spatial Locations
NASA Astrophysics Data System (ADS)
Y Liu, J.; Guo, J. M.; Sun, D. J.; Cai, Y. H.; Ding, L. T.; Jiang, H.
2017-12-01
To meet the demands of machining micro-holes at complex spatial locations, several key technical problems are solved, including development of the micro-electrical discharge machining (micro-EDM) power supply system, design of the host structure, and the machining process technique. Through the development of a low-voltage power supply circuit, a high-voltage circuit, a micro and precision machining circuit, and a clearance detection system, a narrow-pulse, high-frequency six-axis EDM power supply system is developed to meet the demands of micro-hole discharge machining. By combining CAD structure design, CAE simulation analysis, modal testing, ODS (Operational Deflection Shapes) testing, and theoretical analysis, the host construction and key axes of the machine tool are optimized to meet the positioning demands of the micro-holes. A special deionized water filtration system is developed to ensure that the machining process is stable. The machining equipment and processing techniques developed in this paper are verified by developing the micro-hole processing flow and testing it on the real machine tool. The final test results show that the efficient micro-EDM pulse power supply system, machine tool host system, deionized water filtration system, and processing method developed in this paper meet the demands of machining micro-holes at complex spatial locations.
Meat Processor. Ohio's Competency Analysis Profile.
ERIC Educational Resources Information Center
Ohio State Univ., Columbus. Vocational Instructional Materials Lab.
Developed through a modified DACUM (Developing a Curriculum) process involving business, industry, labor, and community agency representatives in Ohio, this document is a comprehensive and verified employer competency profile for meat processing occupations. The list contains units (with and without subunits), competencies, and competency builders…
NASA Astrophysics Data System (ADS)
Telipenko, E.; Chernysheva, T.; Zakharova, A.; Dumchev, A.
2015-10-01
The article presents research results on the development of a knowledge base for an intellectual information system for enterprise bankruptcy risk assessment. The process analysis of the knowledge base development is described; the main process stages, some problems, and their solutions are given. The article introduces a connectionist model for bankruptcy risk assessment based on the analysis of industrial enterprise financial accounts. The basis for this connectionist model is a three-layer perceptron trained with the error back-propagation algorithm. The knowledge base for the intellectual information system consists of processed information and the processing operation method represented as the connectionist model. The article presents the structure of the intellectual information system, the knowledge base, and the information processing algorithm for neural network training. The paper shows mean values of 10 indexes for industrial enterprises, with whose help it is possible to carry out a financial analysis of industrial enterprises and correctly identify the current situation for well-timed managerial decisions. Results are given of neural network testing on data from both bankrupt and financially strong enterprises that were not included in the training and test sets.
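A compact sketch of the connectionist model described, a three-layer perceptron trained by error back-propagation; the 10 financial indexes and the bankruptcy labels below are random placeholders, not enterprise data:

    import numpy as np

    rng = np.random.default_rng(5)
    X = rng.normal(size=(200, 10))  # 10 financial indexes per enterprise
    y = (X[:, :3].sum(axis=1) > 0).astype(float).reshape(-1, 1)  # placeholder labels

    W1 = rng.normal(scale=0.5, size=(10, 8)); b1 = np.zeros(8)   # hidden layer
    W2 = rng.normal(scale=0.5, size=(8, 1));  b2 = np.zeros(1)   # output layer
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    lr = 0.5
    for _ in range(500):
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)          # estimated bankruptcy risk
        # Error back-propagation of the squared-error gradient
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
        W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)

    print("training accuracy:", float(((out > 0.5) == y).mean()))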
NASA Astrophysics Data System (ADS)
Hempelmann, Nils; Ehbrecht, Carsten; Alvarez-Castro, Carmen; Brockmann, Patrick; Falk, Wolfgang; Hoffmann, Jörg; Kindermann, Stephan; Koziol, Ben; Nangini, Cathy; Radanovics, Sabine; Vautard, Robert; Yiou, Pascal
2018-01-01
Analyses of extreme weather events and their impacts often require big data processing of ensembles of climate model simulations. Researchers generally proceed by downloading the data from the providers and processing the data files 'at home' with their own analysis processes. However, the growing amount of available climate model and observation data makes this procedure quite awkward. In addition, data processing knowledge is kept local, instead of being consolidated into a common resource of reusable code. These drawbacks can be mitigated by using a web processing service (WPS). A WPS hosts services such as data analysis processes that are accessible over the web, and can be installed close to the data archives. We developed a WPS named 'flyingpigeon' that communicates over an HTTP network protocol based on standards defined by the Open Geospatial Consortium (OGC), to be used by climatologists and impact modelers as a tool for analyzing large datasets remotely. Here, we present the current processes we developed in flyingpigeon relating to commonly used processes (preprocessing steps; spatial subsets at continent, country, or region level; and climate indices) as well as methods for specific climate data analysis (weather regimes, analogues of circulation, segetal flora distribution, and species distribution models). We also developed a novel, browser-based interactive data visualization for circulation analogues, illustrating the flexibility of WPS in designing custom outputs. Bringing the software to the data instead of transferring the data to the code is becoming increasingly necessary, especially with the upcoming massive climate datasets.
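For example, a client might call such a service with OWSLib, a widely used Python OGC client library; the endpoint URL, the process identifier, and the inputs below are placeholders, and the process names actually offered by flyingpigeon may differ:

    from owslib.wps import WebProcessingService, monitorExecution

    # Connect to a (hypothetical) WPS endpoint and list its processes
    wps = WebProcessingService("http://example.org/wps", verbose=False)
    print([p.identifier for p in wps.processes])

    # Run a hypothetical spatial-subset process on a remote dataset
    execution = wps.execute(
        "subset_countries",
        inputs=[("region", "FRA"),
                ("resource", "http://example.org/data/tasmax.nc")])
    monitorExecution(execution)                   # poll until the job finishes
    print(execution.processOutputs[0].reference)  # URL of the result file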
Djuris, Jelena; Medarevic, Djordje; Krstic, Marko; Djuric, Zorica; Ibric, Svetlana
2013-06-01
This study illustrates the application of experimental design and multivariate data analysis in defining design space for granulation and tableting processes. According to the quality by design concepts, critical quality attributes (CQAs) of granules and tablets, as well as critical parameters of granulation and tableting processes, were identified and evaluated. Acetaminophen was used as the model drug, and one of the study aims was to investigate the possibility of the development of immediate- or extended-release acetaminophen tablets. Granulation experiments were performed in the fluid bed processor using polyethylene oxide polymer as a binder in the direct granulation method. Tablets were compressed in the laboratory excenter tablet press. The first set of experiments was organized according to Plackett-Burman design, followed by the full factorial experimental design. Principal component analysis and partial least squares regression were applied as the multivariate analysis techniques. By using these different methods, CQAs and process parameters were identified and quantified. Furthermore, an in-line method was developed to monitor the temperature during the fluidized bed granulation process, to foresee possible defects in granules CQAs. Various control strategies that are based on the process understanding and assure desired quality attributes of the product are proposed. Copyright © 2013 Wiley Periodicals, Inc.
Lindström, Nils O; De Sena Brandine, Guilherme; Tran, Tracy; Ransick, Andrew; Suh, Gio; Guo, Jinjin; Kim, Albert D; Parvez, Riana K; Ruffins, Seth W; Rutledge, Elisabeth A; Thornton, Matthew E; Grubbs, Brendan; McMahon, Jill A; Smith, Andrew D; McMahon, Andrew P
2018-06-04
Mammalian nephrons arise from a limited nephron progenitor pool through a reiterative inductive process extending over days (mouse) or weeks (human) of kidney development. Here, we present evidence that human nephron patterning reflects a time-dependent process of recruitment of mesenchymal progenitors into an epithelial nephron precursor. Progressive recruitment predicted from high-resolution image analysis and three-dimensional reconstruction of human nephrogenesis was confirmed through direct visualization and cell fate analysis of mouse kidney organ cultures. Single-cell RNA sequencing of the human nephrogenic niche provided molecular insights into these early patterning processes and predicted developmental trajectories adopted by nephron progenitor cells in forming segment-specific domains of the human nephron. The temporal-recruitment model for nephron polarity and patterning suggested by direct analysis of human kidney development provides a framework for integrating signaling pathways driving mammalian nephrogenesis. Copyright © 2018 Elsevier Inc. All rights reserved.
Solar energy concentrator system for crystal growth and zone refining in space
NASA Technical Reports Server (NTRS)
Mcdermit, J. H.
1975-01-01
An assessment of the technological feasibility of using solar concentrators for crystal growth and zone refining in space has been performed. Previous studies of space-deployed solar concentrators were reviewed for their applicability to materials processing, and a new state-of-the-art concentrator-receiver radiation analysis was developed. The radiation analysis is in the form of a general-purpose computer program. It was concluded from this effort that the technology for fabricating, orbiting, and deploying large solar concentrators has been developed. It was also concluded that the technological feasibility of space processing of materials in the focal region of a solar concentrator depends primarily on two factors: (1) the ability of a solar concentrator to provide sufficient thermal energy for the process and (2) the ability of a solar concentrator to provide a thermal environment that is conducive to the processes of interest. The analyses indicate that solar concentrators can satisfactorily provide both of these factors.
NASA Astrophysics Data System (ADS)
Bianchetti, Raechel Anne
Remotely sensed images have become a ubiquitous part of our daily lives. From novice users aiding in search and rescue missions using tools such as TomNod, to trained analysts synthesizing disparate data to address complex problems like climate change, imagery has become central to geospatial problem solving. Expert image analysts are continually faced with rapidly developing sensor technologies and software systems. In response to these cognitively demanding environments, expert analysts develop specialized knowledge and analytic skills to address increasingly complex problems. This study identifies the knowledge, skills, and analytic goals of expert image analysts tasked with identification of land cover and land use change. The analysts participating in this research are currently working as part of a national-level analysis of land use change and are well versed in the use of TimeSync, forest science, and image analysis. The results of this study benefit current analysts by improving their awareness of the mental processes they use during image interpretation. The study can also be generalized to understand the types of knowledge and visual cues that analysts use when reasoning with imagery for purposes beyond land use change studies. Here a Cognitive Task Analysis framework is used to organize evidence from qualitative knowledge elicitation methods for characterizing the cognitive aspects of the TimeSync image analysis process. Using a combination of content analysis, diagramming, semi-structured interviews, and observation, the study highlights the perceptual and cognitive elements of expert remote sensing interpretation. Results show that image analysts perform several standard cognitive processes but flexibly employ these processes in response to various contextual cues. Expert image analysts' ability to think flexibly during analysis was directly related to their amount of image analysis experience. Additionally, results show that the basic Image Interpretation Elements continue to be important despite technological augmentation of the interpretation process. These results are used to derive a set of design guidelines for developing geovisual analytic tools and training to support image analysis.
Li, Yongxin; Kikuchi, Mani; Li, Xueyan; Gao, Qionghua; Xiong, Zijun; Ren, Yandong; Zhao, Ruoping; Mao, Bingyu; Kondo, Mariko; Irie, Naoki; Wang, Wen
2018-01-01
Sea cucumbers, a major class of echinoderms, undergo a very fast and drastic metamorphosis during their development. However, the molecular basis of this process remains largely unknown. Here we systematically examined the gene expression profiles of the Japanese common sea cucumber (Apostichopus japonicus) for the first time by RNA sequencing across 16 developmental time points from fertilized egg to juvenile stage. Based on weighted gene co-expression network analysis (WGCNA), we identified 21 modules. Among them, MEdarkmagenta was highly expressed and correlated with the early metamorphosis process from late auricularia to doliolaria larva. Furthermore, gene enrichment and differentially expressed gene analysis identified several genes in the module that may play key roles in the metamorphosis process. Our results not only provide a molecular basis for experimentally studying the development and morphological complexity of the sea cucumber, but also lay a foundation for improving its emergence rate. Copyright © 2017 Elsevier Inc. All rights reserved.
Micromechanical Characterization and Texture Analysis of Direct Cast Titanium Alloys Strips
NASA Technical Reports Server (NTRS)
2000-01-01
This research was conducted to determine a post-processing technique to optimize the mechanical and material properties of a number of titanium-based alloys and aluminides processed via the Melt Overflow Rapid Solidification Technology (MORST). This technique was developed by NASA for the development of thin-sheet titanium and titanium aluminides used in high-temperature applications. The materials investigated in this study included conventional titanium alloy strips and foils: Ti-1100, Ti-24Al-11Nb (Alpha-2), and Ti-48Al-2Ta (Gamma). The methodology used included micro-characterization, heat treatment, mechanical processing, and mechanical testing. Characterization techniques included optical and electron microscopy and x-ray texture analysis. The processing included heat treatment and mechanical deformation through cold rolling. The initial as-cast materials were evaluated for their microstructure and mechanical properties. Different heat-treatment and rolling steps were chosen to process these materials. The properties were evaluated further and a processing relationship was established in order to obtain an optimum processing condition. The results showed that the as-cast material exhibited a Widmanstatten (fine grain) microstructure that developed into a microstructure with larger grains through the processing steps. The texture intensity showed little change for all processing performed in this investigation.
Emerging and recurrent issues in drug development.
Anello, C
This paper reviews several emerging and recurrent issues relating to the drug development process. These emerging issues include changes to the FDA regulatory environment, internationalization of drug development, advances in computer technology and visualization tools, and efforts to incorporate meta-analysis methodology. Recurrent issues include: renewed interest in statistical methods for handling subgroups in the design and analysis of clinical trials; renewed interest in alternatives to the 'intention-to-treat' analysis in the presence of non-compliance in randomized clinical trials; renewed interest in methodology to address the multiplicities resulting from a variety of sources inherent in the drug development process, and renewed interest in methods to assure data integrity. These emerging and recurrent issues provide a continuing challenge to the international community of statisticians involved in drug development. Moreover, the involvement of statisticians with different perspectives continues to enrich the field and contributes to improvement in the public health.
NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.
Zhang, Bo; Dai, Ji; Zhang, Tao
2017-11-13
In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train, and local field potentials analysis, behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided by NeoAnalysis, users can easily obtain publication-quality figures without writing complex codes. NeoAnalysis is a powerful and valuable toolbox for users doing electrophysiological experiments.
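As a taste of the kind of routine analysis such a toolbox automates, here is a generic peri-stimulus time histogram in plain NumPy; the spike times are synthetic, and this is not NeoAnalysis's own API:

    import numpy as np

    rng = np.random.default_rng(2)
    n_trials, window = 40, (-0.5, 1.0)  # seconds around stimulus onset
    # Synthetic spike times per trial: uniform background plus an evoked burst
    trials = [np.sort(np.concatenate([rng.uniform(window[0], window[1], 20),
                                      rng.normal(0.1, 0.03, 15)]))
              for _ in range(n_trials)]

    bin_width = 0.02
    edges = np.arange(window[0], window[1] + bin_width, bin_width)
    counts = sum(np.histogram(t, edges)[0] for t in trials)
    rate = counts / (n_trials * bin_width)  # firing rate (spikes/s) per bin
    print("peak rate %.1f Hz at t = %.2f s" % (rate.max(), edges[rate.argmax()]))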
Heuristic Task Analysis on E-Learning Course Development: A Formative Research Study
ERIC Educational Resources Information Center
Lee, Ji-Yeon; Reigeluth, Charles M.
2009-01-01
Utilizing heuristic task analysis (HTA), a method developed for eliciting, analyzing, and representing expertise in complex cognitive tasks, a formative research study was conducted on the task of e-learning course development to further improve the HTA process. Three instructional designers from three different post-secondary institutions in the…
Operation, Modeling and Analysis of the Reverse Water Gas Shift Process
NASA Technical Reports Server (NTRS)
Whitlow, Jonathan E.
2001-01-01
The Reverse Water Gas Shift process is a candidate technology for water and oxygen production on Mars under the In-Situ Propellant Production project. This report focuses on the operation and analysis of the Reverse Water Gas Shift (RWGS) process, which has been constructed at Kennedy Space Center. A summary of results from the initial operation of the RWGS process, along with an analysis of these results, is included in this report. In addition, an evaluation of a material balance model developed from the work performed previously under the summer program is included, along with recommendations for further experimental work.
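For reference, the overall reaction is CO2 + H2 ⇌ CO + H2O. The following is a hedged sketch of the kind of single-pass material balance such a model performs, at an assumed conversion; the numbers are illustrative, not the report's results:

    # Reverse Water Gas Shift single-pass material balance (illustrative numbers)
    feed_co2 = 1.00    # mol/s CO2 entering the reactor
    feed_h2 = 1.00     # mol/s H2 (stoichiometric feed assumed)
    conversion = 0.60  # assumed single-pass CO2 conversion

    reacted = feed_co2 * conversion
    outlet = {"CO2": feed_co2 - reacted,
              "H2":  feed_h2 - reacted,
              "CO":  reacted,
              "H2O": reacted}  # water would be condensed and electrolyzed for O2
    print(outlet)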
Ott, Denise; Kralisch, Dana; Denčić, Ivana; Hessel, Volker; Laribi, Yosra; Perrichon, Philippe D; Berguerand, Charline; Kiwi-Minsker, Lioubov; Loeb, Patrick
2014-12-01
As the demand for new drugs is rising, the pharmaceutical industry faces the quest of shortening development time, and thus, reducing the time to market. Environmental aspects typically still play a minor role within the early phase of process development. Nevertheless, it is highly promising to rethink, redesign, and optimize process strategies as early as possible in active pharmaceutical ingredient (API) process development, rather than later at the stage of already established processes. The study presented herein deals with a holistic life-cycle-based process optimization and intensification of a pharmaceutical production process targeting a low-volume, high-value API. Striving for process intensification by transfer from batch to continuous processing, as well as an alternative catalytic system, different process options are evaluated with regard to their environmental impact to identify bottlenecks and improvement potentials for further process development activities. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Stocker, Elena; Becker, Karin; Hate, Siddhi; Hohl, Roland; Schiemenz, Wolfgang; Sacher, Stephan; Zimmer, Andreas; Salar-Behzadi, Sharareh
2017-01-01
This study aimed to apply quality risk management based on the International Conference on Harmonisation guideline Q9 to the early development stage of hot melt coated multiparticulate systems for oral administration. N-acetylcysteine crystals were coated with a formulation comprising tripalmitin and polysorbate 65. The critical quality attributes (CQAs) were initially prioritized using failure mode and effects analysis. The CQAs of the coated material were defined as particle size, taste-masking efficiency, and immediate release profile. The hot melt coating process was characterized via a flowchart, based on the identified potential critical process parameters (CPPs) and their impact on the CQAs. These CPPs were prioritized using a process failure mode, effects, and criticality analysis, and their critical impact on the CQAs was experimentally confirmed using a statistical design of experiments. Spray rate, atomization air pressure, and air flow rate were identified as CPPs. Coating amount and content of polysorbate 65 in the coating formulation were identified as critical material attributes. A hazard analysis and critical control points analysis was applied to define control strategies at the critical process points. A fault tree analysis evaluated causes for potential process failures. We successfully demonstrated that a standardized quality risk management approach optimizes the sustainability of product development and supports regulatory aspects. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
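To illustrate the prioritization mechanics of the failure mode, effects, and criticality analysis (the failure modes and scores below are invented examples, not the study's ratings), risk priority numbers are conventionally the product of severity, occurrence, and detection scores:

    # FMEA-style risk priority numbers: RPN = severity x occurrence x detection
    failure_modes = [
        ("spray nozzle clogging",    8, 4, 3),  # invented scores on a 1-10 scale
        ("coating agglomeration",    6, 5, 4),
        ("incomplete taste masking", 9, 3, 5),
    ]

    ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
    for name, s, o, d in ranked:
        print(f"{name:25s} RPN = {s * o * d}")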
NASA Technical Reports Server (NTRS)
Lillesand, T. M.; Meisner, D. E. (Principal Investigator)
1980-01-01
An investigation was conducted into ways to improve the involvement of state and local user personnel in the digital image analysis process by isolating those elements of the analysis process that require extensive involvement by field personnel and providing means for performing those activities apart from a computer facility. In this way, the analysis procedure can be converted from a centralized activity focused on a computer facility to a distributed activity in which users can interact with the data at the field office level or in the field itself. General-purpose image processing software was developed on the University of Minnesota computer system (Control Data Cyber models 172 and 74). The use of color hardcopy image data as a primary medium in supervised training procedures was investigated, and digital display equipment and a coordinate digitizer were procured.
Supporting Handoff in Asynchronous Collaborative Sensemaking Using Knowledge-Transfer Graphs.
Zhao, Jian; Glueck, Michael; Isenberg, Petra; Chevalier, Fanny; Khan, Azam
2018-01-01
During asynchronous collaborative analysis, handoff of partial findings is challenging because externalizations produced by analysts may not adequately communicate their investigative process. To address this challenge, we developed techniques to automatically capture and help encode tacit aspects of the investigative process based on an analyst's interactions, and streamline explicit authoring of handoff annotations. We designed our techniques to mediate awareness of analysis coverage, support explicit communication of progress and uncertainty with annotation, and implicit communication through playback of investigation histories. To evaluate our techniques, we developed an interactive visual analysis system, KTGraph, that supports an asynchronous investigative document analysis task. We conducted a two-phase user study to characterize a set of handoff strategies and to compare investigative performance with and without our techniques. The results suggest that our techniques promote the use of more effective handoff strategies, help increase an awareness of prior investigative process and insights, as well as improve final investigative outcomes.
The DACUM Job Analysis Process.
ERIC Educational Resources Information Center
Dofasco, Inc., Hamilton (Ontario).
This document explains the DACUM (Developing A Curriculum) process for analyzing task-based jobs to: identify where standard operating procedures are required; identify duplicated low value added tasks; develop performance standards; create job descriptions; and identify the elements that must be included in job-specific training programs. The…
Bifrost: a Modular Python/C++ Framework for Development of High-Throughput Data Analysis Pipelines
NASA Astrophysics Data System (ADS)
Cranmer, Miles; Barsdell, Benjamin R.; Price, Danny C.; Garsden, Hugh; Taylor, Gregory B.; Dowell, Jayce; Schinzel, Frank; Costa, Timothy; Greenhill, Lincoln J.
2017-01-01
Large radio interferometers have data rates that render long-term storage of raw correlator data infeasible, thus motivating development of real-time processing software. For high-throughput applications, processing pipelines are challenging to design and implement. Motivated by science efforts with the Long Wavelength Array, we have developed Bifrost, a novel Python/C++ framework that eases the development of high-throughput data analysis software by packaging algorithms as black box processes in a directed graph. This strategy of modularizing code allows astronomers to create parallelism without code adjustment. Bifrost uses CPU/GPU 'circular memory' data buffers that enable ready introduction of arbitrary functions into the processing path for 'streams' of data, and allow pipelines to automatically reconfigure in response to astrophysical transient detection or input of new observing settings. We have deployed and tested Bifrost at the latest Long Wavelength Array station, in Sevilleta National Wildlife Refuge, NM, where it handles throughput exceeding 10 Gbps per CPU core.
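As a rough illustration of the pipeline pattern the abstract describes (black-box blocks wired into a directed graph over buffered data streams), the sketch below chains Python generator blocks into a linear pipeline. This is not Bifrost's actual API; the block names, frame sizes, and detection threshold are invented for illustration.

```python
import numpy as np

def source(n_frames, n_chan=4):
    """Stand-in for a correlator stream: yields one frame at a time."""
    rng = np.random.default_rng(0)
    for _ in range(n_frames):
        yield rng.normal(size=n_chan)

def detect(frames, threshold=2.5):
    """Black-box block: flags frames whose mean power spikes."""
    for f in frames:
        power = float(np.mean(f ** 2))
        yield f, power > threshold

def sink(flagged):
    """Terminal block: reports candidate transients."""
    for i, (_, hit) in enumerate(flagged):
        if hit:
            print(f"frame {i}: candidate transient (power spike)")

# Wire the blocks into a (linear) directed graph and run the pipeline.
sink(detect(source(1000)))
```

In a framework like the one described, each block would additionally run in its own thread or on the GPU, exchanging data through ring ("circular memory") buffers rather than Python generators.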
ERIC Educational Resources Information Center
Marill, Thomas; And Others
The aim of the CYCLOPS Project research is the development of techniques for allowing computers to perform visual scene analysis, pre-processing of visual imagery, and perceptual learning. Work on scene analysis and learning has previously been described. The present report deals with research on pre-processing and with further work on scene…
29 CFR 1926.64 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2011 CFR
2011-07-01
... analysis methodology being used. (5) The employer shall establish a system to promptly address the team's... the decision as to the appropriate PHA methodology to use. All PHA methodologies are subject to... be developed in conjunction with the process hazard analysis in sufficient detail to support the...
29 CFR 1926.64 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2010 CFR
2010-07-01
... analysis methodology being used. (5) The employer shall establish a system to promptly address the team's... the decision as to the appropriate PHA methodology to use. All PHA methodologies are subject to... be developed in conjunction with the process hazard analysis in sufficient detail to support the...
Implementation of the Generic Safety Analysis Report - Lessons Learned
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blanchard, A.
1999-06-02
The Savannah River Site has completed the development, review and approval process for the Generic Safety Analysis Report (GSAR) and implemented this information in facility SARs and BIOs. This includes the yearly revision of the GSAR and the facility-specific SARs. The process has provided us with several lessons learned.
Trends in International Persuasion: Persuasion in the Arms Control Negotiations.
ERIC Educational Resources Information Center
Hopmann, P. Terrence; Walcott, Charles
An analysis of the bargaining process in international arms control negotiations is possible by developing a framework of interrelated hypotheses, by delineating and practicing interactions study called "Bargaining Process Analysis," and by formulating procedural steps that bridge the gap between laboratory studies and "real world" situations. In…
Severe storms and local weather research
NASA Technical Reports Server (NTRS)
1981-01-01
Developments in the use of space related techniques to understand storms and local weather are summarized. The observation of lightning, storm development, cloud development, mesoscale phenomena, and ageostrophic circulation are discussed. Data acquisition, analysis, and the development of improved sensor and computer systems capability are described. Signal processing and analysis and application of Doppler lidar data are discussed. Progress in numerous experiments is summarized.
ERIC Educational Resources Information Center
Callina, Kristina Schmid; Ryan, Diane; Murray, Elise D.; Colby, Anne; Damon, William; Matthews, Michael; Lerner, Richard M.
2017-01-01
A paucity of literature exists on the processes of character development within diverse contexts. In this article, the authors use the United States Military Academy at West Point (USMA) as a sample case for understanding character development processes within an institution of higher education. The authors present a discussion of relational…
Analysis and Characterization | Bioenergy | NREL
[Web page residue; recoverable fragments: NREL's team of bioenergy analysts; photo captions "equipment in a lab", "Biomass Characterization", and "NREL's Biochemical Process Development Unit".]
Pedagogical issues for effective teaching of biosignal processing and analysis.
Sandham, William A; Hamilton, David J
2010-01-01
Biosignal processing and analysis is generally perceived by many students to be a challenging topic to understand, and to become adept with the necessary analytical skills. This is a direct consequence of the high mathematical content involved, and the many abstract features of the topic. The MATLAB and Mathcad software packages offer an excellent algorithm development environment for teaching biosignal processing and analysis modules, and can also be used effectively in many biosignal, and indeed bioengineering, research areas. In this paper, traditional introductory and advanced biosignal processing (and analysis) syllabi are reviewed, and the use of MATLAB and Mathcad for teaching and research is illustrated with a number of examples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Bapanapalli, Satish K.; Smith, Mark T.
2008-09-01
The objective of our work is to enable the optimum design of lightweight automotive structural components using injection-molded long fiber thermoplastics (LFTs). To this end, an integrated approach that links process modeling to structural analysis with experimental microstructural characterization and validation is developed. First, process models for LFTs are developed and implemented into processing codes (e.g., ORIENT, Moldflow) to predict the microstructure of the as-formed composite (i.e., fiber length and orientation distributions). In parallel, characterization and testing methods are developed to obtain the microstructural data necessary to validate process modeling predictions. Second, the predicted LFT composite microstructure is imported into a structural finite element analysis by ABAQUS to determine the response of the as-formed composite to given boundary conditions. At this stage, constitutive models accounting for the composite microstructure are developed to predict various types of behaviors (i.e., thermoelastic, viscoelastic, elastic-plastic, damage, fatigue, and impact) of LFTs. Experimental methods are also developed to determine material parameters and to validate constitutive models. Such a process-linked structural modeling approach allows an LFT composite structure to be designed with confidence through numerical simulations. Some recent results of our collaborative research will be illustrated to show the usefulness and applications of this integrated approach.
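The fiber orientation distributions mentioned above are commonly summarized by a second-order orientation tensor. The minimal sketch below computes that tensor from a hypothetical sample of in-plane fiber angles; the angle values are invented, and this is not the ORIENT or Moldflow code itself.

```python
import numpy as np

# Second-order fiber orientation tensor a_ij = <p_i p_j>, estimated from a
# hypothetical sample of measured in-plane fiber angles (degrees from flow).
angles = np.deg2rad(np.array([5, -12, 8, 2, 30, -7, 15, -3, 10, 1]))
p = np.stack([np.cos(angles), np.sin(angles)], axis=1)   # unit vectors
a = np.einsum("ki,kj->ij", p, p) / len(angles)
print(a)   # a[0, 0] near 1 indicates strong alignment with the flow direction
```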
High-throughput process development: I. Process chromatography.
Rathore, Anurag S; Bhambure, Rahul
2014-01-01
Chromatographic separation serves as "a workhorse" for downstream process development and plays a key role in removal of product-related, host cell-related, and process-related impurities. Complex and poorly characterized raw materials and feed material, low feed concentration, product instability, and poor mechanistic understanding of the processes are some of the critical challenges that are faced during development of a chromatographic step. Traditional process development is performed as trial-and-error-based evaluation and often leads to a suboptimal process. A high-throughput process development (HTPD) platform involves an integration of miniaturization, automation, and parallelization and provides a systematic approach for time- and resource-efficient chromatography process development. Creation of such platforms requires integration of mechanistic knowledge of the process with various statistical tools for data analysis. The relevance of such a platform is high in view of the constraints with respect to time and resources that the biopharma industry faces today. This protocol describes the steps involved in performing HTPD of a process chromatography step. It describes the operation of a commercially available device (PreDictor™ plates from GE Healthcare). This device is available in 96-well format with 2 or 6 μL well size. We also discuss the challenges that one faces when performing such experiments, as well as possible solutions to alleviate them. Besides describing the operation of the device, the protocol also presents an approach for statistical analysis of the data gathered from such a platform. A case study involving use of the protocol for examining ion-exchange chromatography of granulocyte colony-stimulating factor (GCSF), a therapeutic product, is briefly discussed. This is intended to demonstrate the usefulness of this protocol in generating data that is representative of the data obtained at the traditional lab scale. The agreement in the data is indeed very significant (regression coefficient 0.93). We think that this protocol will be of significant value to those involved in performing high-throughput process development of process chromatography.
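A minimal sketch of the kind of agreement check the protocol describes: regressing lab-scale results on high-throughput plate results and reporting a goodness-of-fit statistic. All numbers below are hypothetical; the paper's reported regression coefficient of 0.93 is mentioned only as context.

```python
import numpy as np

# Hypothetical paired measurements: binding capacity (mg/mL) from the
# 96-well HT plate vs. the traditional lab-scale column.
ht = np.array([12.1, 15.4, 9.8, 18.2, 14.0, 11.3, 16.7, 13.5])
lab = np.array([12.8, 15.0, 10.5, 17.6, 14.4, 11.9, 16.1, 13.0])

# Least-squares line lab = a*ht + b and coefficient of determination R^2.
a, b = np.polyfit(ht, lab, 1)
pred = a * ht + b
r2 = 1 - np.sum((lab - pred) ** 2) / np.sum((lab - lab.mean()) ** 2)
print(f"slope={a:.2f}, intercept={b:.2f}, R^2={r2:.2f}")
```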
Framework for Multidisciplinary Analysis, Design, and Optimization with High-Fidelity Analysis Tools
NASA Technical Reports Server (NTRS)
Orr, Stanley A.; Narducci, Robert P.
2009-01-01
A plan is presented for the development of a high fidelity multidisciplinary optimization process for rotorcraft. The plan formulates individual disciplinary design problems, identifies practical high-fidelity tools and processes that can be incorporated in an automated optimization environment, and establishes statements of the multidisciplinary design problem including objectives, constraints, design variables, and cross-disciplinary dependencies. Five key disciplinary areas are selected in the development plan. These are rotor aerodynamics, rotor structures and dynamics, fuselage aerodynamics, fuselage structures, and propulsion / drive system. Flying qualities and noise are included as ancillary areas. Consistency across engineering disciplines is maintained with a central geometry engine that supports all multidisciplinary analysis. The multidisciplinary optimization process targets the preliminary design cycle where gross elements of the helicopter have been defined. These might include number of rotors and rotor configuration (tandem, coaxial, etc.). It is at this stage that sufficient configuration information is defined to perform high-fidelity analysis. At the same time there is enough design freedom to influence a design. The rotorcraft multidisciplinary optimization tool is built and substantiated throughout its development cycle in a staged approach by incorporating disciplines sequentially.
Reference Models for Structural Technology Assessment and Weight Estimation
NASA Technical Reports Server (NTRS)
Cerro, Jeff; Martinovic, Zoran; Eldred, Lloyd
2005-01-01
Previously the Exploration Concepts Branch of NASA Langley Research Center has developed techniques for automating the preliminary design level of launch vehicle airframe structural analysis for purposes of enhancing historical regression based mass estimating relationships. This past work was useful and greatly reduced design time; however, its application area was very narrow in terms of its ability to handle a large variety of structural and vehicle general arrangement alternatives. Implementation of the analysis approach presented herein also incorporates some newly developed computer programs. Loft is a program developed to create analysis meshes and simultaneously define structural element design regions. A simple component-defining ASCII file is read by Loft to begin the design process. HSLoad is a Visual Basic implementation of the HyperSizer Application Programming Interface, which automates the structural element design process. Details of these two programs and their use are explained in this paper. A feature which falls naturally out of the above analysis paradigm is the concept of "reference models". The flexibility of the FEA based JAVA processing procedures and associated process control classes coupled with the general utility of Loft and HSLoad make it possible to create generic program template files for analysis of components ranging from something as simple as a stiffened flat panel, to curved panels, fuselage and cryogenic tank components, flight control surfaces, wings, through full air and space vehicle general arrangements.
ERIC Educational Resources Information Center
Gulaliyev, Mayis G.; Ok, Nuri I.; Musayeva, Fargana Q.; Efendiyev, Rufat J.; Musayeva, Jamila Q.; Agayeva, Samira R.
2016-01-01
The aim of the article is to study the nature of liberalization as a specific economic process, which is formed and developed under the influence of the changing conditions of the globalization and integration processes in the society, as well as to identify the characteristic differences in the processes of liberalization of Turkey and Azerbaijan…
ERIC Educational Resources Information Center
Santagata, Rossella; Bray, Wendy
2016-01-01
This study examined processes at the core of teacher professional development (PD) experiences that might positively impact teacher learning and more specifically teacher change. Four processes were considered in the context of a PD program focused on student mathematical errors: analysis of students' mathematical misconceptions as a lever for…
Development of pulsed processes for the manufacture of solar cells
NASA Technical Reports Server (NTRS)
Minnucci, J. A.
1978-01-01
The results of a 1-year program to develop the processes required for low-energy ion implantation for the automated production of silicon solar cells are described. The program included: (1) demonstrating state-of-the-art ion implantation equipment and designing an automated ion implanter, (2) making efforts to improve the performance of ion-implanted solar cells to 16.5 percent AM1, (3) developing a model of the pulse annealing process used in solar cell production, and (4) preparing an economic analysis of the process costs of ion implantation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, J; Lukose, R; Bronson, J
2015-06-15
Purpose: To conduct a failure mode and effects analysis (FMEA) as per AAPM Task Group 100 on clinical processes associated with teletherapy, and to develop mitigations for processes with identified high risk. Methods: An FMEA was conducted on clinical processes relating to teletherapy treatment plan development and delivery. Nine major processes were identified for analysis. These steps included CT simulation, data transfer, image registration and segmentation, treatment planning, plan approval and preparation, and initial and subsequent treatments. Process tree mapping was utilized to identify the steps contained within each process. Failure modes (FM) were identified and evaluated with a scale of 1–10 based upon three metrics: the severity of the effect, the probability of occurrence, and the detectability of the cause. The analyzed metrics were scored as follows: severity – no harm = 1, lethal = 10; probability – not likely = 1, certainty = 10; detectability – always detected = 1, undetectable = 10. The three metrics were combined multiplicatively to determine the risk priority number (RPN), which defined the overall score for each FM and the order in which process modifications should be deployed. Results: Eighty-nine procedural steps were identified with 186 FM accompanied by 193 failure effects with 213 potential causes. Eighty-one of the FM were scored with an RPN > 10, and mitigations were developed for these. The initial treatment had the most FM requiring mitigation development (16), followed closely by treatment planning, segmentation, and plan preparation with fourteen each. The maximum RPN was 400 and involved target delineation. Conclusion: The FMEA process proved extremely useful in identifying previously unforeseen risks. New methods were developed and implemented for risk mitigation and error prevention. Similar to findings reported for adult patients, the process leading to the initial treatment has an associated high risk.
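The RPN arithmetic described above is easy to reproduce. The sketch below scores a few hypothetical failure modes and flags those above the study's RPN > 10 mitigation threshold; the names and scores are invented, though one row is constructed so its RPN equals the reported maximum of 400 (e.g., 10 × 5 × 8).

```python
# Failure modes as (name, severity, occurrence, detectability), each 1-10.
failure_modes = [
    ("target delineation error", 10, 5, 8),
    ("wrong isocenter shift at initial treatment", 9, 3, 4),
    ("image registration mismatch", 7, 2, 3),
]

# RPN = S * O * D; mitigations are developed for RPN > 10 per the study.
ranked = sorted(((s * o * d, name) for name, s, o, d in failure_modes),
                reverse=True)
for rpn, name in ranked:
    action = "mitigate" if rpn > 10 else "accept"
    print(f"RPN {rpn:3d}  {action:8s}  {name}")
```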
Agricultural Production. Ohio's Competency Analysis Profile.
ERIC Educational Resources Information Center
Ohio State Univ., Columbus. Vocational Instructional Materials Lab.
This list consists of essential competencies from the following specialized Ohio Competency Analysis Profiles: Beef and Sheep Producers; Crop Producer; Dairy Producer; Poultry Producer; and Swine Producer. Developed through a modified DACUM (Developing a Curriculum) process involving business, industry, labor, and community agency representatives…
Agriculture Products Processing. Occupational Competency Analysis Profile.
ERIC Educational Resources Information Center
Ohio State Univ., Columbus. Vocational Instructional Materials Lab.
This Occupational Competency Analysis Profile (OCAP) contains a competency list verified by expert workers and developed through a modified DACUM (Developing a Curriculum) involving business, industry, labor, and community agency representatives from Ohio. This OCAP identifies the occupational, academic, and employability skills (competencies)…
Teaching Workflow Analysis and Lean Thinking via Simulation: A Formative Evaluation
Campbell, Robert James; Gantt, Laura; Congdon, Tamara
2009-01-01
This article presents the rationale for the design and development of a video simulation used to teach lean thinking and workflow analysis to health services and health information management students enrolled in a course on the management of health information. The discussion includes a description of the design process, a brief history of the use of simulation in healthcare, and an explanation of how video simulation can be used to generate experiential learning environments. Based on the results of a survey given to 75 students as part of a formative evaluation, the video simulation was judged effective because it allowed students to visualize a real-world process (concrete experience), contemplate the scenes depicted in the video along with the concepts presented in class in a risk-free environment (reflection), develop hypotheses about why problems occurred in the workflow process (abstract conceptualization), and develop solutions to redesign a selected process (active experimentation). PMID:19412533
Stakeholder analysis: a review.
Brugha, R; Varvasovszky, Z
2000-09-01
The growing popularity of stakeholder analysis reflects an increasing recognition of how the characteristics of stakeholders--individuals, groups and organizations--influence decision-making processes. This paper reviews the origins and uses of stakeholder analysis, as described in the policy, health care management and development literature. Its roots are in the political and policy sciences, and in management theory where it has evolved into a systematic tool with clearly defined steps and applications for scanning the current and future organizational environment. Stakeholder analysis can be used to generate knowledge about the relevant actors so as to understand their behaviour, intentions, interrelations, agendas, interests, and the influence or resources they have brought--or could bring--to bear on decision-making processes. This information can then be used to develop strategies for managing these stakeholders, to facilitate the implementation of specific decisions or organizational objectives, or to understand the policy context and assess the feasibility of future policy directions. Policy development is a complex process which frequently takes place in an unstable and rapidly changing context, subject to unpredictable internal and external factors. As a cross-sectional view of an evolving picture, the utility of stakeholder analysis for predicting and managing the future is time-limited and it should be complemented by other policy analysis approaches.
Preliminary results from the High Speed Airframe Integration Research project
NASA Technical Reports Server (NTRS)
Coen, Peter G.; Sobieszczanski-Sobieski, Jaroslaw; Dollyhigh, Samuel M.
1992-01-01
A review is presented of the accomplishment of the near term objectives of developing an analysis system and optimization methods during the first year of the NASA Langley High Speed Airframe Integration Research (HiSAIR) project. The characteristics of a Mach 3 HSCT transport have been analyzed utilizing the newly developed process. In addition to showing more detailed information about the aerodynamic and structural coupling for this type of vehicle, this exercise aided in further refining the data requirements for the analysis process.
NASA Technical Reports Server (NTRS)
1975-01-01
The development plans, analysis of required R and D and production resources, the costs of such resources, and finally, the potential profitability of a commercial space processing opportunity for containerless melting and resolidification of tungsten are discussed. The aim is to obtain a form of tungsten which, when fabricated into targets for X-ray tubes, provides at least a 50 percent increase in service life.
Organizational Analysis of the United States Army Evaluation Center
2014-12-01
analysis of qualitative or quantitative data obtained from design reviews, hardware inspections, M&S, hardware and software testing, metrics review... Research Development Test & Evaluation (RDT&E) appropriation account. The Defense Acquisition Portal ACQuipedia website describes RDT&E as "one of the... research, design, development, test and evaluation, production, installation, operation, and maintenance; data collection; processing and analysis
ERIC Educational Resources Information Center
Baki, Mujgan
2015-01-01
This study aims to explore the role of lesson analysis in the development of mathematical knowledge for teaching. For this purpose, a graduate course based on lesson analysis was designed for novice mathematics teachers. Throughout the course the teachers watched videos of group-mates and discussed the issues they identified in terms of…
NASA Technical Reports Server (NTRS)
1983-01-01
The process technology for the manufacture of semiconductor-grade silicon in a large commercial plant by 1986, at a price less than $14 per kilogram of silicon based on 1975 dollars, is discussed. The engineering design, installation, checkout, and operation of an Experimental Process System Development unit are described. Quality control for scaling up the process and an economic analysis of product and production costs are discussed.
Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code
NASA Technical Reports Server (NTRS)
Hendricks, Eric S.
2016-01-01
The Object-Oriented Turbomachinery Analysis Code (OTAC) is a new meanline/streamline turbomachinery modeling tool being developed at NASA GRC. During the development process, a limitation of the code was discovered in relation to the analysis of choked flow in axial turbines. This paper describes the relevant physics for choked flow as well as the changes made to OTAC to enable analysis in this flow regime.
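For context, the relevant physics is the compressible-flow choking limit: for a given flow area and stagnation state there is a maximum mass flow that a turbine passage can pass. Below is a short sketch of the standard ideal-gas relation; the area and stagnation values are arbitrary examples, and this is not OTAC code.

```python
import numpy as np

def choked_mass_flow(A, p0, T0, gamma=1.4, R=287.05):
    """Maximum (choked) mass flow [kg/s] through area A [m^2] for an ideal
    gas at stagnation pressure p0 [Pa] and temperature T0 [K]."""
    term = (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))
    return A * p0 * np.sqrt(gamma / (R * T0)) * term

# Example: 0.01 m^2 throat at 500 kPa, 1100 K stagnation conditions.
print(f"{choked_mass_flow(0.01, 500e3, 1100.0):.2f} kg/s")
```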
In-depth analysis and characterization of a dual damascene process with respect to different CD
NASA Astrophysics Data System (ADS)
Krause, Gerd; Hofmann, Detlef; Habets, Boris; Buhl, Stefan; Gutsch, Manuela; Lopez-Gomez, Alberto; Kim, Wan-Soo; Thrun, Xaver
2018-03-01
In a 200 mm high volume environment, we studied data from a dual damascene process. Dual damascene is a combination of lithography, etch, and CMP that is used to create copper lines and contacts in one single step. During these process steps, different metal CDs are measured using different measurement methods. In this study, we analyze the key figures of the different measurements after different process steps and develop simple models to predict the electrical behavior. In addition, radial profiles of both inline measurement parameters and electrical parameters have been analyzed. A matching method was developed based on inline and electrical data. Finally, correlation analysis for radial signatures is presented that can be used to predict excursions in electrical signatures.
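A sketch of the kind of "simple model" the study mentions: a first-order fit from an inline CD measurement to an electrical parameter, plus a residual check that could flag excursions. All values below are hypothetical.

```python
import numpy as np

# Hypothetical paired data: inline metal CD (nm) after etch vs. measured
# line resistance (ohm/um); narrower lines give higher resistance.
cd = np.array([178.0, 182.5, 180.1, 175.3, 184.0, 179.2, 176.8, 181.4])
res = np.array([0.92, 0.86, 0.89, 0.97, 0.84, 0.90, 0.95, 0.87])

# Simple first-order model res ~ a*CD + b and its correlation strength.
a, b = np.polyfit(cd, res, 1)
r = np.corrcoef(cd, res)[0, 1]
print(f"res = {a:.4f}*CD + {b:.2f}, r = {r:.2f}")

# Flag wafers whose measurement deviates strongly from the model prediction.
residual = res - (a * cd + b)
print("excursion candidates:",
      np.where(np.abs(residual) > 2 * residual.std())[0])
```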
Fajardo-Ortiz, David; Duran, Luis; Moreno, Laura; Ochoa, Hector; Castaño, Victor M
2014-09-03
We explored how the knowledge translation and innovation processes are structured when they result in innovations, as in the case of liposomal doxorubicin research. In order to map the processes, a literature network analysis was made through Cytoscape, and semantic analysis was performed by GoPubMed, which is based on the controlled vocabularies MeSH (Medical Subject Headings) and GO (Gene Ontology). We found clusters related to different stages of the technological development (invention, innovation and imitation) and the knowledge translation process (preclinical, translational and clinical research), and we were able to map the historic emergence of Doxil as a paradigmatic nanodrug. This research could be a powerful methodological tool for decision-making and innovation management in drug delivery research.
Research progress in Asia on methods of processing laser-induced breakdown spectroscopy data
NASA Astrophysics Data System (ADS)
Guo, Yang-Min; Guo, Lian-Bo; Li, Jia-Ming; Liu, Hong-Di; Zhu, Zhi-Hao; Li, Xiang-You; Lu, Yong-Feng; Zeng, Xiao-Yan
2016-10-01
Laser-induced breakdown spectroscopy (LIBS) has attracted much attention in terms of both scientific research and industrial application. An important branch of LIBS research in Asia, the development of data processing methods for LIBS, is reviewed. First, the basic principle of LIBS and the characteristics of spectral data are briefly introduced. Next, two aspects of research on and problems with data processing methods are described: i) the basic principles of data preprocessing methods are elaborated in detail on the basis of the characteristics of spectral data; ii) the performance of data analysis methods in qualitative and quantitative analysis of LIBS is described. Finally, a direction for future development of data processing methods for LIBS is also proposed.
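To make the preprocessing step concrete, here is a minimal Python sketch of two common LIBS preprocessing operations, crude baseline removal and total-intensity normalization, applied to a synthetic spectrum. The window size and emission line positions are illustrative choices, not methods taken from a specific reviewed paper.

```python
import numpy as np

def preprocess(intensity, win=25):
    """Crude baseline removal (rolling minimum) followed by
    total-intensity normalization."""
    baseline = np.array([intensity[max(0, i - win):i + win].min()
                         for i in range(len(intensity))])
    corrected = intensity - baseline
    return corrected / corrected.sum()

# Synthetic spectrum: two narrow emission lines on a sloped continuum.
wl = np.linspace(200.0, 900.0, 2000)
spec = (0.02 * wl
        + 80 * np.exp(-((wl - 396.2) / 0.3) ** 2)
        + 50 * np.exp(-((wl - 670.8) / 0.3) ** 2))
norm = preprocess(spec)
print("strongest line near", round(wl[np.argmax(norm)], 1), "nm")
```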
Development and fabrication of a solar cell junction processing system
NASA Technical Reports Server (NTRS)
Banker, S.
1982-01-01
Development of a pulsed electron beam subsystem, wafer transport system, and ion implanter is discussed. Junction processing system integration and a cost analysis are reviewed. Maintenance of the electron beam processor and the experimental test unit of the non-mass-analyzed ion implanter is reviewed.
Integrated Structural Analysis and Test Program
NASA Technical Reports Server (NTRS)
Kaufman, Daniel
2005-01-01
An integrated structural-analysis and structure-testing computer program is being developed in order to: automate repetitive processes in testing and analysis; accelerate pre-test analysis; accelerate reporting of tests; facilitate planning of tests; improve execution of tests; create a vibration, acoustics, and shock test database; and integrate analysis and test data. The software package includes modules pertaining to sinusoidal and random vibration, shock and time replication, acoustics, base-driven modal survey, and mass properties and static/dynamic balance. The program is commanded by use of ActiveX controls. There is minimal need to generate command lines. Analysis or test files are selected by opening a Windows Explorer display. After selecting the desired input file, the program goes to a so-called analysis data process or test data process, depending on the type of input data. The status of the process is given by a Windows status bar, and when processing is complete, the data are reported in graphical, tabular, and matrix form.
Sensitivity analysis of the add-on price estimate for the silicon web growth process
NASA Technical Reports Server (NTRS)
Mokashi, A. R.
1981-01-01
The web growth process, a silicon-sheet technology option developed for the flat plate solar array (FSA) project, was examined. Base case data for the technical and cost parameters for the technical and commercial readiness phases of the FSA project are projected. The process add-on price is analyzed using the base case data for cost parameters such as equipment, space, direct labor, materials, and utilities, and for production parameters such as growth rate and run length, with a computer program developed specifically to perform the sensitivity analysis with improved price estimation. Silicon price, sheet thickness, and cell efficiency are also discussed.
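A toy sketch of one-at-a-time sensitivity analysis in the spirit described above; the cost model and all parameter values are invented stand-ins, not the actual FSA program or its data.

```python
def add_on_price(equip, labor, materials, utilities, growth_rate, run_length):
    """Toy add-on price model ($/m^2): annual costs spread over the web
    area produced per year (growth_rate in cm^2/min, run_length in h/run)."""
    area_per_year = growth_rate * 60 * run_length * 300 / 1e4   # m^2/yr
    annual_cost = equip + labor + materials + utilities          # $/yr
    return annual_cost / area_per_year

base = dict(equip=50e3, labor=40e3, materials=30e3, utilities=10e3,
            growth_rate=25.0, run_length=120.0)

# One-at-a-time sensitivity: +10% on each parameter, % change in price.
p0 = add_on_price(**base)
for key in base:
    perturbed = dict(base, **{key: base[key] * 1.1})
    dp = (add_on_price(**perturbed) - p0) / p0 * 100
    print(f"{key:12s} +10% -> price {dp:+.1f}%")
```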
TESS Data Processing and Quick-look Pipeline
NASA Astrophysics Data System (ADS)
Fausnaugh, Michael; Huang, Xu; Glidden, Ana; Guerrero, Natalia; TESS Science Office
2018-01-01
We describe the data analysis procedures and pipelines for the Transiting Exoplanet Survey Satellite (TESS). We briefly review the processing pipeline developed and implemented by the Science Processing Operations Center (SPOC) at NASA Ames, including pixel/full-frame image calibration, photometric analysis, pre-search data conditioning, transiting planet search, and data validation. We also describe data-quality diagnostic analyses and photometric performance assessment tests. Finally, we detail a "quick-look pipeline" (QLP) that has been developed by the MIT branch of the TESS Science Office (TSO) to provide a fast and adaptable routine to search for planet candidates in the 30 minute full-frame images.
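As an illustration of the transiting planet search step, the sketch below runs a box least squares search on a synthetic 30-minute-cadence light curve using astropy. It is a generic example, not SPOC or QLP code, and the injected period and depth are invented.

```python
import numpy as np
from astropy.timeseries import BoxLeastSquares

# Synthetic one-sector, 30-minute-cadence light curve with a transit signal.
t = np.arange(0, 27.4, 30 / 1440.0)            # time in days
rng = np.random.default_rng(1)
flux = 1.0 + 1e-4 * rng.normal(size=t.size)
flux[(t % 3.7) < 0.1] -= 0.004                 # P = 3.7 d, ~2.4 h transits

# Box least squares search, the standard transit-detection step.
bls = BoxLeastSquares(t, flux)
periods = np.linspace(1.0, 10.0, 5000)
result = bls.power(periods, 0.1)               # trial duration: 0.1 d
print(f"strongest box signal at P = {periods[np.argmax(result.power)]:.2f} d")
```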
Tribology symposium 1995. PD-Volume 72
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masudi, H.
After the keynote presentation by Professor Aaron Cohen of Texas A and M University, entitled Processes Used in Design, the program is divided into five major sessions: Research and Development -- Recent research and development of tribological components; Tribology in Manufacturing -- The impact of tribology on modern manufacturing; Design/Design Representation -- Aspects of design related to tribological systems; Tribo-Chemistry/Tribo-Physics -- Discussion of chemical and physical behavior of substances as related to tribology; and Failure Analysis -- An analysis of failure, failure detection, and failure monitoring as related to manufacturing processes. Papers have been processed separately for inclusion on the data base.
Players and processes behind the national health insurance scheme: a case study of Uganda
2013-01-01
Background Uganda is the last East African country to adopt a National Health Insurance Scheme (NHIS). To lessen the inequitable burden of healthcare spending, health financing reform has focused on the establishment of national health insurance. The objective of this research is to depict how stakeholders and their power and interests have shaped the process of agenda setting and policy formulation for Uganda’s proposed NHIS. The study provides a contextual analysis of the development of NHIS policy within the context of national policies and processes. Methods The methodology is a single case study of agenda setting and policy formulation related to the proposed NHIS in Uganda. It involves an analysis of the real-life context, the content of proposals, the process, and a retrospective stakeholder analysis in terms of policy development. Data collection comprised a literature review of published documents, technical reports, policy briefs, and memos obtained from Uganda’s Ministry of Health and other unpublished sources. Formal discussions were held with ministry staff involved in the design of the scheme and some members of the task force to obtain clarification, verify events, and gain additional information. Results The process of developing the NHIS has been an incremental one, characterised by small-scale, gradual changes and repeated adjustments through various stakeholder engagements during the three phases of development: from 1995 to 1999; 2000 to 2005; and 2006 to 2011. Despite political will in the government, progress with the NHIS has been slow, and it has yet to be implemented. Stakeholders, notably the private sector, played an important role in influencing the pace of the development process and the currently proposed design of the scheme. Conclusions This study underscores the importance of stakeholder analysis in major health reforms. Early use of stakeholder analysis combined with an ongoing review and revision of NHIS policy proposals during stakeholder discussions would be an effective strategy for avoiding potential pitfalls and obstacles in policy implementation. Given the private sector’s influence on negotiations over health insurance design in Uganda, this paper also reviews the experience of two countries with similar stakeholder dynamics. PMID:24053551
Development of the Clinical Teaching Effectiveness Questionnaire in the United States.
Wormley, Michelle E; Romney, Wendy; Greer, Anna E
2017-01-01
The purpose of this study was to develop a valid measure for assessing clinical teaching effectiveness within the field of physical therapy. The Clinical Teaching Effectiveness Questionnaire (CTEQ) was developed via a 4-stage process, including (1) initial content development, (2) content analysis with 8 clinical instructors with over 5 years of clinical teaching experience, (3) pilot testing with 205 clinical instructors from 2 universities in the Northeast of the United States, and (4) psychometric evaluation, including principal component analysis. The scale development process resulted in a 30-item questionnaire with 4 sections that relate to clinical teaching: learning experiences, learning environment, communication, and evaluation. The CTEQ provides a preliminary valid measure for assessing clinical teaching effectiveness in physical therapy practice.
Software technology testbed softpanel prototype
NASA Technical Reports Server (NTRS)
1991-01-01
The following subject areas are covered: analysis of using Ada for the development of real-time control systems for the Space Station; analysis of the functionality of the Application Generator; analysis of the User Support Environment criteria; analysis of the SSE tools and procedures which are to be used for the development of ground/flight software for the Space Station; analysis of the CBATS tutorial (an Ada tutorial package); analysis of Interleaf; analysis of the Integration, Test and Verification process of the Space Station; analysis of the DMS on-orbit flight architecture; and analysis of the simulation architecture.
Huang, You-Jun; Zhou, Qin; Huang, Jian-Qin; Zeng, Yan-Ru; Wang, Zheng-Jia; Zhang, Qi-Xiang; Zhu, Yi-Hang; Shen, Chen; Zheng, Bing-Song
2015-06-01
Hickory (Carya cathayensis Sarg.) seed has one of the highest oil contents and is rich in polyunsaturated fatty acids (PUFAs); its kernel is beneficial to human health, particularly to human brain function. A better elucidation of the lipid accumulation mechanism would help to improve hickory production and seed quality. DDRT-PCR analysis was used to examine gene expression in hickory at thirteen time points during the seed development process. A total of 67 unique genes involved in seed development were obtained, and their expression patterns were further confirmed by semi-quantitative RT-PCR and real-time RT-PCR analysis. Of them, the genes with known functions were involved in signal transduction, amino acid metabolism, nuclear metabolism, fatty acid metabolism, protein metabolism, carbon metabolism, secondary metabolism, oxidation of fatty acids, and stress response, suggesting that hickory undergoes a complex metabolic process during seed development. Furthermore, 6 genes related to fatty acid synthesis were explored, and their functions in the seed development process were further discussed. The data obtained here provide the first clues for guiding further functional studies of fatty acid synthesis in hickory. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
Stingray Failure Mode, Effects and Criticality Analysis: WEC Risk Registers
Ken Rhinefrank
2016-07-25
An analysis method to systematically identify all potential failure modes and their effects on the Stingray WEC system. This analysis is incorporated early in the development cycle so that mitigation of the identified failure modes can be achieved cost-effectively and efficiently. The FMECA can begin once there is enough detail to define the functions and failure modes of a given system and its interfaces with other systems. The FMECA occurs coincident with the design process and is an iterative process which allows for design changes to overcome deficiencies in the analysis. Risk Registers for major subsystems were completed according to the methodology described in the "Failure Mode Effects and Criticality Analysis Risk Reduction Program Plan.pdf" document below, in compliance with the DOE Risk Management Framework developed by NREL.
NASA Astrophysics Data System (ADS)
Shiklomanov, A. I.; Okladnikov, I.; Gordov, E. P.; Proussevitch, A. A.; Titov, A. G.
2016-12-01
Presented is a collaborative project carried out by a joint team of researchers from the Institute of Monitoring of Climatic and Ecological Systems, Russia, and the Earth Systems Research Center, University of New Hampshire, USA. Its main objective is the development of a hardware and software prototype of a Distributed Research Center (DRC) for monitoring and projecting regional climatic changes and their impacts on the environment over the Northern extratropical areas. In the framework of the project, new approaches to "cloud" processing and analysis of large geospatial datasets (big geospatial data) are being developed. The prototype will be deployed on technical platforms of both institutions and applied in research of climate change and its consequences. Datasets available at NCEI and IMCES include multidimensional arrays of climatic, environmental, demographic, and socio-economic characteristics. The project is aimed at solving several major research and engineering tasks: 1) structure analysis of huge heterogeneous climate and environmental geospatial datasets used in the project, their preprocessing and unification; 2) development of a new distributed storage and processing model based on a "shared nothing" paradigm; 3) development of a dedicated database of metadata describing geospatial datasets used in the project; 4) development of a dedicated geoportal and a high-end graphical frontend providing an intuitive user interface, internet-accessible online tools for analysis of geospatial data, and web services for interoperability with other geoprocessing software packages. The DRC will operate as a single access point to distributed archives of spatial data and online tools for their processing. A flexible modular computational engine running verified data processing routines will provide solid results of geospatial data analysis. The "cloud" data analysis and visualization approach will guarantee access to the DRC online tools and data from all over the world. Additionally, exporting of data processing results through WMS and WFS services will be used to provide their interoperability. Financial support of this activity by the RF Ministry of Education and Science under Agreement 14.613.21.0037 (RFMEFI61315X0037) and by the Iola Hubbard Climate Change Endowment is acknowledged.
From scenarios to domain models: processes and representations
NASA Astrophysics Data System (ADS)
Haddock, Gail; Harbison, Karan
1994-03-01
The domain specific software architectures (DSSA) community has defined a philosophy for the development of complex systems. This philosophy improves productivity and efficiency by increasing the user's role in the definition of requirements, increasing the systems engineer's role in the reuse of components, and decreasing the software engineer's role to the development of new components and component modifications only. The scenario-based engineering process (SEP), the first instantiation of the DSSA philosophy, has been adopted by the next generation controller project. It is also the chosen methodology of the trauma care information management system project, and the surrogate semi-autonomous vehicle project. SEP uses scenarios from the user to create domain models and define the system's requirements. Domain knowledge is obtained from a variety of sources including experts, documents, and videos. This knowledge is analyzed using three techniques: scenario analysis, task analysis, and object-oriented analysis. Scenario analysis results in formal representations of selected scenarios. Task analysis of the scenario representations results in descriptions of tasks necessary for object-oriented analysis and also subtasks necessary for functional system analysis. Object-oriented analysis of task descriptions produces domain models and system requirements. This paper examines the representations that support the DSSA philosophy, including reference requirements, reference architectures, and domain models. The processes used to create and use the representations are explained through use of the scenario-based engineering process. Selected examples are taken from the next generation controller project.
Diversified Health Occupations. Ohio's Competency Analysis Profile.
ERIC Educational Resources Information Center
Ohio State Univ., Columbus. Vocational Instructional Materials Lab.
This list consists of essential competencies from the following specialized Ohio Competency Analysis Profile: Dental Assistant; Medical Assistant; and Nurse Aide. Developed through a modified DACUM (Developing a Curriculum) process involving business, industry, labor, and community agency representatives in Ohio, this document is a comprehensive…
Development of Cross-Platform Software for Well Logging Data Visualization
NASA Astrophysics Data System (ADS)
Akhmadulin, R. K.; Miraev, A. I.
2017-07-01
Well logging data processing is one of the main sources of information in oil-gas field analysis and is of great importance in the process of field development and operation. Therefore, it is important to have software which accurately and clearly provides the user with processed data in the form of well logs. In this work, a software product has been developed which not only has the basic functionality for this task (loading data from .las files, well log curve display, etc.), but can also be run in different operating systems and on different devices. In the article, a subject field analysis and task formulation have been performed, and the software design stage has been considered. At the end of the work, the resulting software product interface is described.
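A minimal sketch of loading curve data from a .las file, assuming a simple LAS 2.0 layout with ~Curve and ~ASCII sections. This is an illustrative hand-rolled reader, not the authors' software; the file name and curve mnemonic in the usage comment are hypothetical.

```python
import numpy as np

def read_las_ascii(path):
    """Minimal reader for a simple LAS 2.0 file: collects curve mnemonics
    from the ~Curve section and the ~ASCII data block as a 2-D array."""
    names, rows, section = [], [], ""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            if line.startswith("~"):
                section = line[1].upper()   # V, W, C, P, A, ...
                continue
            if section == "C":              # curve info, e.g. "DEPT.M : depth"
                names.append(line.split(".")[0].strip())
            elif section == "A":            # space-separated data rows
                rows.append([float(v) for v in line.split()])
    return names, np.array(rows)

# Hypothetical usage (file name and mnemonic are placeholders):
# names, data = read_las_ascii("well_01.las")
# depth = data[:, names.index("DEPT")]
```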
The Verification-based Analysis of Reliable Multicast Protocol
NASA Technical Reports Server (NTRS)
Wu, Yunqing
1996-01-01
Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
NASA Astrophysics Data System (ADS)
Papanikolaou, Xanthos; Anastasiou, Demitris; Marinou, Aggeliki; Zacharis, Vangelis; Paradissis, Demitris
2015-04-01
The Dionysos Satellite Observatory and Higher Geodesy Laboratory of the National Technical University of Athens have developed an automated processing scheme to accommodate the daily analysis of all available continuous GNSS stations in Greece. At the moment, a total of approximately 150 regional stations are processed, divided into 4 subnetworks. GNSS data are processed routinely on a daily basis, via Bernese GNSS Software v5.0, developed by AIUB. Each network is solved twice within a period of 20 days, first using ultra-rapid products (with a latency of ~10 hours) and then using final products (with a latency of ~20 days). Observations are processed using carrier phase, modelled to double differences in the ionosphere-free linear combination. Analysis results include coordinate estimates, ionospheric corrections (TEC maps), and hourly tropospheric parameters (zenith delay). This processing scheme has proved helpful in investigating abrupt geophysical phenomena in near real time, as in the 2011 Santorini inflation episode and the 2014 Kephalonia earthquake events. All analysis results and products are made available via a dedicated webpage. Additionally, most of the GNSS data are hosted on a GSAC web platform, available to all interested parties. Data and results are made available through the laboratory's dedicated website: http://dionysos.survey.ntua.gr/.
The chromosomal analysis of teaching: the search for promoter genes.
Skeff, Kelley M
2007-01-01
The process of teaching is ubiquitous in medicine, both in the practice of medicine and the promotion of medical science. Yet, until the last 50 years, the process of medical teaching had been neglected. To improve this process, the research group at the Stanford Faculty Development Center for Medical Teachers developed an educational framework to assist teachers to analyze and improve the teaching process. Utilizing empirical data drawn from videotapes of actual clinical teaching and educational literature, we developed a seven-category systematic scheme for the analysis of medical teaching, identifying key areas and behaviors that could enable teachers to enhance their effectiveness. The organizational system of this scheme is similar to that used in natural sciences, such as genetics. Whereas geneticists originally identified chromosomes and ultimately individual and related genes, this classification system identifies major categories and specific teaching behaviors that can enhance teaching effectiveness. Over the past two decades, this organizational framework has provided the basis for a variety of faculty development programs for improving teaching effectiveness. Results of those programs have revealed several positive findings, including the usefulness of the methods for a wide variety of medical teachers in a variety of settings. This research indicates that the development of a framework for analysis has been, as in the natural sciences, an important way to improve the science of the art of teaching.
The TAME Project: Towards improvement-oriented software environments
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Rombach, H. Dieter
1988-01-01
Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture.
Characterization and multivariate analysis of physical properties of processing peaches
USDA-ARS?s Scientific Manuscript database
Characterization of physical properties of fruits represents the first vital step to ensure optimal performance of fruit processing operations and is also a prerequisite in the development of new processing equipment. In this study, physical properties of engineering significance to processing of th...
Optimization of Parameter Ranges for Composite Tape Winding Process Based on Sensitivity Analysis
NASA Astrophysics Data System (ADS)
Yu, Tao; Shi, Yaoyao; He, Xiaodong; Kang, Chao; Deng, Bo; Song, Shibo
2017-08-01
This study focuses on the parameter sensitivity of the winding process for composite prepreg tape. Methods of multi-parameter relative sensitivity analysis and single-parameter sensitivity analysis are proposed. A polynomial empirical model of interlaminar shear strength is established by the response surface experimental method. Using this model, the relative sensitivity of key process parameters including temperature, tension, pressure, and velocity is calculated, and the single-parameter sensitivity curves are obtained. According to the analysis of the sensitivity curves, the stable and unstable ranges of each parameter are identified. Finally, an optimization method for the winding process parameters is developed. The analysis results show that the optimized ranges of the process parameters for interlaminar shear strength are: temperature within [100 °C, 150 °C], tension within [275 N, 387 N], pressure within [800 N, 1500 N], and velocity within [0.2 m/s, 0.4 m/s], respectively.
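A sketch of how single-parameter sensitivities can be read off a fitted response surface: evaluate the polynomial model at a design point and compute normalized local derivatives. The quadratic coefficients and design point below are invented, not the paper's fitted model.

```python
# Hypothetical quadratic response surface for interlaminar shear strength
# tau(T, F), with T = temperature (C) and F = tension (N).
coef = {"b0": -60.0, "bT": 1.1, "bF": 0.12,
        "bTT": -0.004, "bFF": -0.00015, "bTF": -0.0002}

def tau(T, F):
    c = coef
    return (c["b0"] + c["bT"] * T + c["bF"] * F
            + c["bTT"] * T * T + c["bFF"] * F * F + c["bTF"] * T * F)

def sensitivity(T, F, h=1e-3):
    """Normalized local sensitivities (dtau/dx * x/tau) at a design point."""
    t0 = tau(T, F)
    sT = (tau(T + h, F) - t0) / h * T / t0
    sF = (tau(T, F + h) - t0) / h * F / t0
    return sT, sF

sT, sF = sensitivity(125.0, 330.0)
print(f"relative sensitivity: temperature {sT:+.2f}, tension {sF:+.2f}")
```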
Exploring the role of auditory analysis in atypical compared to typical language development.
Grube, Manon; Cooper, Freya E; Kumar, Sukhbinder; Kelly, Tom; Griffiths, Timothy D
2014-02-01
The relationship between auditory processing and language skills has been debated for decades. Previous findings have been inconsistent, both in typically developing and impaired subjects, including those with dyslexia or specific language impairment. Whether correlations between auditory and language skills are consistent across different populations has hardly been addressed at all. The present work takes an exploratory approach, testing for patterns of correlations in a range of measures of auditory processing. In a recent study, we reported findings from a large cohort of eleven-year-olds on a range of auditory measures, and the data supported a specific role for the processing of short sequences in pitch and time in typical language development. Here we tested whether a group of individuals with dyslexic traits (DT group; n = 28) from the same year group would show the same pattern of correlations between auditory and language skills as the typically developing group (TD group; n = 173). Regarding the raw scores, the DT group showed significantly poorer performance on the language but not the auditory measures, including measures of pitch, time and rhythm, and timbre (modulation). In terms of correlations, there was a tendency toward decreased correlations between short-sequence processing and language skills, contrasted by a significantly increased correlation for basic, single-sound processing, in particular in the domain of modulation. The data support the notion that the fundamental relationship between auditory and language skills might differ in atypical compared to typical language development, with the implication that merging data or drawing inferences between populations might be problematic. Further examination of the relationship between both basic sound feature analysis and music-like sound analysis and language skills in impaired populations might allow the development of appropriate training strategies. These might include types of musical training to augment language skills via their common bases in sound sequence analysis. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
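Comparing a correlation between two independent groups, as done here for the TD and DT groups, is typically tested with Fisher's r-to-z transform. A minimal sketch follows; the correlation values are hypothetical, but the group sizes match the study's n = 173 and n = 28.

```python
import numpy as np

def fisher_z_diff(r1, n1, r2, n2):
    """Test for a difference between two independent correlations via
    Fisher's r-to-z transform; returns the z statistic."""
    z1, z2 = np.arctanh(r1), np.arctanh(r2)
    se = np.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))
    return (z1 - z2) / se

# Hypothetical auditory-language correlations: TD (n=173) vs. DT (n=28).
z = fisher_z_diff(0.15, 173, 0.52, 28)
print(f"z = {z:.2f}")   # |z| > 1.96 means the correlations differ at p < .05
```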
Efficient processing of fluorescence images using directional multiscale representations.
Labate, D; Laezza, F; Negi, P; Ozcan, B; Papadakis, M
2014-01-01
Recent advances in high-resolution fluorescence microscopy have enabled the systematic study of morphological changes in large populations of cells induced by chemical and genetic perturbations, facilitating the discovery of signaling pathways underlying diseases and the development of new pharmacological treatments. In these studies, though, due to the complexity of the data, quantification and analysis of morphological features are for the vast majority handled manually, slowing significantly data processing and limiting often the information gained to a descriptive level. Thus, there is an urgent need for developing highly efficient automated analysis and processing tools for fluorescent images. In this paper, we present the application of a method based on the shearlet representation for confocal image analysis of neurons. The shearlet representation is a newly emerged method designed to combine multiscale data analysis with superior directional sensitivity, making this approach particularly effective for the representation of objects defined over a wide range of scales and with highly anisotropic features. Here, we apply the shearlet representation to problems of soma detection of neurons in culture and extraction of geometrical features of neuronal processes in brain tissue, and propose it as a new framework for large-scale fluorescent image analysis of biomedical data.
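A rough illustration of directional filtering for elongated structures such as neuronal processes. Note that this uses simple oriented anisotropic Gaussian kernels as a stand-in for shearlet atoms (single scale, varying direction), not an actual shearlet transform, and the synthetic image is invented.

```python
import numpy as np
from scipy.ndimage import convolve

def oriented_kernel(size, theta, sigma_u=4.0, sigma_v=1.0):
    """Zero-mean anisotropic Gaussian elongated along angle theta -- a
    crude stand-in for one directional atom at a fixed scale."""
    ax = np.arange(size) - size // 2
    x, y = np.meshgrid(ax, ax)
    u = x * np.cos(theta) + y * np.sin(theta)
    v = -x * np.sin(theta) + y * np.cos(theta)
    k = np.exp(-0.5 * ((u / sigma_u) ** 2 + (v / sigma_v) ** 2))
    return k - k.mean()

# Max response over orientations highlights thin, elongated structures.
rng = np.random.default_rng(0)
img = rng.random((128, 128))
img[60:62, 20:100] += 3.0        # synthetic horizontal neurite-like ridge
responses = [convolve(img, oriented_kernel(15, t))
             for t in np.linspace(0, np.pi, 8, endpoint=False)]
ridge_map = np.max(np.abs(responses), axis=0)
print("peak directional response at",
      np.unravel_index(ridge_map.argmax(), ridge_map.shape))
```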
MO-E-9A-01: Risk Based Quality Management: TG100 In Action
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huq, M; Palta, J; Dunscombe, P
2014-06-15
One of the goals of quality management in radiation therapy is to gain high confidence that patients will receive the prescribed treatment correctly. To accomplish these goals, professional societies such as the American Association of Physicists in Medicine (AAPM) have published many quality assurance (QA), quality control (QC), and quality management (QM) guidance documents. In general, the recommendations provided in these documents have emphasized performing device-specific QA at the expense of process flow and protection of the patient against catastrophic errors. Analyses of radiation therapy incidents find that they are more often caused by flaws in the overall therapy process, from initial consult through final treatment, than by isolated hardware or computer failures detectable by traditional physics QA. This challenge is shared by many intrinsically hazardous industries. Risk assessment tools and analysis techniques have been developed to define, identify, and eliminate known and/or potential failures, problems, or errors from a system, process, and/or service before they reach the customer. These include, but are not limited to, process mapping, failure modes and effects analysis (FMEA), fault tree analysis (FTA), and establishment of a quality management program that best avoids the faults and risks that have been identified in the overall process. These tools can be easily adapted to radiation therapy practices because of their simplicity and effectiveness in providing efficient ways to enhance the safety and quality of treatment processes. Task Group 100 (TG100) of the AAPM has developed a risk-based quality management program that uses these tools. This session will be devoted to a discussion of these tools and how they can be used in a given radiotherapy clinic to develop a risk-based QM program. Learning Objectives: Learn how to design a process map for a radiotherapy process. Learn how to perform an FMEA for a given process. Learn what fault tree analysis is about. Learn how to design a quality management program based upon the information obtained from process mapping, FMEA, and FTA.
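TG100-style FMEA scores each failure mode for occurrence (O), severity (S), and detectability (D) and ranks them by the risk priority number RPN = O x S x D. A minimal sketch of that ranking step follows; the failure modes and scores are invented for illustration.

```python
# Minimal FMEA ranking sketch: RPN = O x S x D on 1-10 scales,
# as used in TG100-style risk assessment. Entries are illustrative only.
failure_modes = [
    # (description, occurrence O, severity S, detectability D)
    ("Wrong patient selected at console", 2, 9, 4),
    ("Incorrect MU transferred to machine", 3, 8, 3),
    ("Couch shift entered with wrong sign", 4, 7, 5),
]

ranked = sorted(
    ((o * s * d, desc) for desc, o, s, d in failure_modes), reverse=True
)
for rpn, desc in ranked:
    print(f"RPN {rpn:4d}  {desc}")
```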
A Project Team Analysis Using Tuckman's Model of Small-Group Development.
Natvig, Deborah; Stark, Nancy L
2016-12-01
Concerns about equitable workloads for nursing faculty have been well documented, yet a standardized system for workload management does not exist. A project team was challenged to establish an academic workload management system when two dissimilar universities were consolidated. Tuckman's model of small-group development was used as the framework for the analysis of the processes and effectiveness of the workload project team. Agendas, notes, and meeting minutes were used as the primary sources of information. Analysis revealed the challenges the team encountered. The team charter proved an effective tool in guiding the team to become a highly productive group. Lessons learned from the analysis are discussed. Guiding a diverse group into a highly productive team is complex. The use of Tuckman's model of small-group development provided a systematic mechanism to review and understand group processes and tasks. [J Nurs Educ. 2016;55(12):675-681.]. Copyright 2016, SLACK Incorporated.
NASA Astrophysics Data System (ADS)
Yousif, Dilon
The purpose of this study was to improve the Quality Assurance (QA) system at the Nemak Windsor Aluminum Plant (WAP). The project used the Six Sigma method based on the Define, Measure, Analyze, Improve, and Control (DMAIC) cycle. Analysis of the in-process melt at WAP was based on chemical, thermal, and mechanical testing. The control limits for the W319 Al alloy were statistically recalculated using the composition measured under stable conditions. The "Chemistry Viewer" software was developed for statistical analysis of alloy composition. This software features the Silicon Equivalency (SiBQ) method developed by the IRC. The Melt Sampling Device (MSD) was designed and evaluated at WAP to overcome traditional sampling limitations. The Thermal Analysis "Filters" software was developed for cooling curve analysis of the 3XX Al alloy(s) using IRC techniques. The impact of low-melting-point impurities on the start of melting was evaluated using the Universal Metallurgical Simulator and Analyzer (UMSA).
The Function of Semantics in Automated Language Processing.
ERIC Educational Resources Information Center
Pacak, Milos; Pratt, Arnold W.
This paper is a survey of some of the major semantic models that have been developed for automated semantic analysis of natural language. Current approaches to semantic analysis and logical inference are based mainly on models of human cognitive processes, such as Quillian's semantic memory, Simmons' Protosynthex III, and others. All existing…
AUTOMATED LITERATURE PROCESSING HANDLING AND ANALYSIS SYSTEM--FIRST GENERATION.
ERIC Educational Resources Information Center
Redstone Scientific Information Center, Redstone Arsenal, AL.
The report presents a summary of the development and the characteristics of the first generation of the Automated Literature Processing, Handling and Analysis (ALPHA-1) system. Descriptions of the computer technology of ALPHA-1 and the use of this automated library technique are presented. Each of the subsystems and modules now in operation are…
ERIC Educational Resources Information Center
Clayton, Thomas
2004-01-01
In recent years, many scholars have become fascinated by a contemporary, multidimensional process that has come to be known as "globalization." Globalization originally described economic developments at the world level. More specifically, scholars invoked the concept in reference to the process of global economic integration and the seemingly…
Oak Ridge Computerized Hierarchical Information System (ORCHIS) status report, July 1973
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brooks, A.A.
1974-01-01
This report summarizes the concepts, software, and contents of the Oak Ridge Computerized Hierarchical Information System. This data analysis and text processing system was developed as an integrated, comprehensive information processing capability to meet the needs of an on-going multidisciplinary research and development organization. (auth)
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-13
... Research and Development Center (FFRDC) to facilitate the modernization of business processes and... Health and Human Services (DHHS), intends to sponsor a study and analysis, delivery system, simulations... modernization of business processes and supporting systems and their operations. Some of the broad task areas...
Analyzing the Impact of a Data Analysis Process to Improve Instruction Using a Collaborative Model
ERIC Educational Resources Information Center
Good, Rebecca B.
2006-01-01
The Data Collaborative Model (DCM) assembles assessment literacy, reflective practices, and professional development into a four-component process. The sub-components include assessing students, reflecting over data, professional dialogue, professional development for the teachers, interventions for students based on data results, and re-assessing…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boggs, Paul T.; Althsuler, Alan; Larzelere, Alex R.
2005-08-01
The Design-through-Analysis Realization Team (DART) is chartered with reducing the time Sandia analysts require to complete the engineering analysis process. The DART system analysis team studied the engineering analysis processes employed by analysts in Centers 9100 and 8700 at Sandia to identify opportunities for reducing overall design-through-analysis process time. The team created and implemented a rigorous analysis methodology based on a generic process flow model parameterized by information obtained from analysts. They also collected data from analysis department managers to quantify the problem type and complexity distribution throughout Sandia's analyst community. They then used this information to develop a community model, which enables a simple characterization of processes that span the analyst community. The results indicate that equal opportunity for reducing analysis process time is available both by reducing the "once-through" time required to complete a process step and by reducing the probability of backward iteration. In addition, reducing the rework fraction (i.e., improving the engineering efficiency of subsequent iterations) offers approximately 40% to 80% of the benefit of reducing the "once-through" time or iteration probability, depending upon the process step being considered. Further, the results indicate that geometry manipulation and meshing is the largest portion of an analyst's effort, especially for structural problems, and offers significant opportunity for overall time reduction. Iteration loops initiated late in the process are more costly than others because they increase "inner loop" iterations. Identifying and correcting problems as early as possible in the process offers significant opportunity for time savings.
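As a rough illustration of why these three levers interact, the sketch below models the expected time of a single process step given a once-through time, a backward-iteration probability, and a rework fraction. The geometric-iteration functional form is an assumption made for this example, not the report's community model.

```python
# Toy model of a process step with backward iteration, in the spirit of the
# DART study's parameters. The functional form is an illustrative assumption.
def expected_step_time(t_once, p_iter, rework):
    """Expected time for one step: the first pass costs t_once; each
    backward iteration (geometric, with stopping prob 1 - p_iter) costs
    rework * t_once."""
    expected_iterations = p_iter / (1.0 - p_iter)
    return t_once * (1.0 + rework * expected_iterations)

base   = expected_step_time(t_once=10.0, p_iter=0.40, rework=0.6)
faster = expected_step_time(t_once=8.0,  p_iter=0.40, rework=0.6)  # once-through cut 20%
fewer  = expected_step_time(t_once=10.0, p_iter=0.32, rework=0.6)  # iteration prob cut 20%
print(f"base {base:.2f}, faster once-through {faster:.2f}, fewer iterations {fewer:.2f}")
```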
Yin, X-X; Zhang, Y; Cao, J; Wu, J-L; Hadjiloucas, S
2016-12-01
We provide a comprehensive account of recent advances in biomedical image analysis and classification from two complementary imaging modalities: terahertz (THz) pulse imaging and dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI). The work aims to highlight underlining commonalities in both data structures so that a common multi-channel data fusion framework can be developed. Signal pre-processing in both datasets is discussed briefly taking into consideration advances in multi-resolution analysis and model based fractional order calculus system identification. Developments in statistical signal processing using principal component and independent component analysis are also considered. These algorithms have been developed independently by the THz-pulse imaging and DCE-MRI communities, and there is scope to place them in a common multi-channel framework to provide better software standardization at the pre-processing de-noising stage. A comprehensive discussion of feature selection strategies is also provided and the importance of preserving textural information is highlighted. Feature extraction and classification methods taking into consideration recent advances in support vector machine (SVM) and extreme learning machine (ELM) classifiers and their complex extensions are presented. An outlook on Clifford algebra classifiers and deep learning techniques suitable to both types of datasets is also provided. The work points toward the direction of developing a new unified multi-channel signal processing framework for biomedical image analysis that will explore synergies from both sensing modalities for inferring disease proliferation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
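A minimal sketch of the PCA/ICA pre-processing stage discussed above, applied to synthetic multi-channel signals standing in for THz or DCE-MRI time courses; it assumes scikit-learn is available, and the mixing model is fabricated for illustration.

```python
# PCA for de-noising/compression followed by ICA unmixing on synthetic
# three-channel signals (stand-ins for THz pulse or DCE-MRI voxel series).
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 1000)
sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t)), rng.laplace(size=t.size)]
mixing = rng.normal(size=(3, 3))
X = sources @ mixing.T + 0.05 * rng.normal(size=(t.size, 3))  # noisy mixtures

X_reduced = PCA(n_components=3, whiten=True).fit_transform(X)
unmixed = FastICA(n_components=3, random_state=0).fit_transform(X)
print(X_reduced.shape, unmixed.shape)  # (1000, 3) (1000, 3)
```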
Milchenko, Mikhail; Snyder, Abraham Z; LaMontagne, Pamela; Shimony, Joshua S; Benzinger, Tammie L; Fouke, Sarah Jost; Marcus, Daniel S
2016-07-01
Neuroimaging research often relies on clinically acquired magnetic resonance imaging (MRI) datasets that can originate from multiple institutions. Such datasets are characterized by high heterogeneity of modalities and variability of sequence parameters. This heterogeneity complicates the automation of image processing tasks such as spatial co-registration and physiological or functional image analysis. Given this heterogeneity, conventional processing workflows developed for research purposes are not optimal for clinical data. In this work, we describe an approach called Heterogeneous Optimization Framework (HOF) for developing image analysis pipelines that can handle the high degree of clinical data non-uniformity. HOF provides a set of guidelines for configuration, algorithm development, deployment, interpretation of results and quality control for such pipelines. At each step, we illustrate the HOF approach using the implementation of an automated pipeline for Multimodal Glioma Analysis (MGA) as an example. The MGA pipeline computes tissue diffusion characteristics of diffusion tensor imaging (DTI) acquisitions, hemodynamic characteristics using a perfusion model of susceptibility contrast (DSC) MRI, and spatial cross-modal co-registration of available anatomical, physiological and derived patient images. Developing MGA within HOF enabled the processing of neuro-oncology MR imaging studies to be fully automated. MGA has been successfully used to analyze over 160 clinical tumor studies to date within several research projects. Introduction of the MGA pipeline improved image processing throughput and, most importantly, effectively produced co-registered datasets that were suitable for advanced analysis despite high heterogeneity in acquisition protocols.
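The sketch below illustrates one idea at the heart of such heterogeneity-tolerant pipelines: configurable routing of whatever series a study contains to the appropriate handler, with unmatched series flagged for quality control rather than failing the run. The rule table and handler names are hypothetical, not the MGA implementation.

```python
# Minimal sketch of a heterogeneity-tolerant routing step: each study is a
# bag of series with inconsistent naming, and a configurable matcher routes
# whatever is present. Rules and handler names are hypothetical.
from dataclasses import dataclass

@dataclass
class Series:
    description: str   # free-text series description from the scanner
    data: object       # image volume placeholder

RULES = [  # (substring to match in description, handler name)
    ("dti", "diffusion_tensor_fit"),
    ("perfusion", "dsc_perfusion_model"),
    ("t1", "anatomical_coregistration"),
]

def route(series_list):
    plan, unmatched = [], []
    for s in series_list:
        desc = s.description.lower()
        for key, handler in RULES:
            if key in desc:
                plan.append((handler, s))
                break
        else:
            unmatched.append(s)  # flagged for quality control, not a hard error
    return plan, unmatched

plan, qc = route([Series("Ax DTI 32dir", None), Series("T1 MPRAGE", None)])
print([h for h, _ in plan], len(qc))
```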
Massively Parallel Processing for Fast and Accurate Stamping Simulations
NASA Astrophysics Data System (ADS)
Gress, Jeffrey J.; Xu, Siguang; Joshi, Ramesh; Wang, Chuan-tao; Paul, Sabu
2005-08-01
The competitive automotive market drives automotive manufacturers to speed up vehicle development cycles and reduce lead time. Fast tooling development is one of the key areas supporting fast and short vehicle development programs (VDP). In the past ten years, stamping simulation has become the most effective validation tool for predicting and resolving all potential formability and quality problems before the dies are physically made. Stamping simulation and formability analysis has become a critical business segment in GM's math-based die engineering process. As simulation has become one of the major production tools in the engineering factory, simulation speed and accuracy are two of the most important measures for stamping simulation technology. The speed and time-in-system of forming analysis become even more critical to support fast VDPs and tooling readiness. Since 1997, the General Motors Die Center has been working jointly with our software vendor to develop and implement a parallel version of simulation software for mass-production analysis applications. By 2001, this technology had matured in the form of distributed memory processing (DMP) of draw die simulations in a networked distributed-memory computing environment. In 2004, this technology was refined to massively parallel processing (MPP) and extended to line die forming analysis (draw, trim, flange, and associated spring-back) running on a dedicated computing environment. The evolution of this technology and the insight gained through the implementation of DMP/MPP technology, as well as performance benchmarks, are discussed in this publication.
Process simulations for manufacturing of thick composites
NASA Astrophysics Data System (ADS)
Kempner, Evan A.
The availability of manufacturing simulations for composites can significantly reduce the costs associated with process development. Simulations provide a tool for evaluating the effect of processing conditions on the quality of parts produced without requiring numerous experiments. This is especially significant for parts that have troublesome features such as large thickness. The development of simulations for thick-walled composites has been approached by examining the mechanics of resin flow and fiber deformation during processing, applying these analyses to develop simulations, and evaluating the simulations against experimental results. A unified analysis is developed to describe the three-dimensional resin flow and fiber preform deformation during processing regardless of the manufacturing process used. It is shown how the generic governing equations in the unified analysis can be applied to autoclave molding, compression molding, pultrusion, filament winding, and resin transfer molding. A comparison is provided with earlier models derived individually for these processes. The equations described for autoclave curing were used to produce a one-dimensional cure simulation for autoclave curing of thick composites. The simulation consists of an analysis of heat transfer and resin flow in the composite as well as in the bleeder plies used to absorb resin removed from the part. Experiments were performed in a hot press to approximate curing in an autoclave. Graphite/epoxy laminates of 3 cm and 5 cm thickness were cured while monitoring thickness and temperatures at several points inside the laminate. The simulation predicted temperatures fairly closely, but difficulties were encountered in correlating the thickness results. This simulation was also used to study the effects of prepreg aging on the processing of thick composites. An investigation was also performed on filament winding with prepreg tow. Cylinders approximately 12 mm thick were wound with pressure gages at the mandrel-composite interface. The cylinders were hoop wound with tensions ranging from 13-34 N. An analytical model was developed to calculate the change in stress due to relaxation during winding. Although compressive circumferential stresses occurred throughout each of the cylinders, their magnitude was fairly low.
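A minimal sketch of the kind of one-dimensional cure simulation described above: explicit finite differences for transient heat conduction with an exothermic nth-order cure source. The material constants, kinetics, and boundary ramp are illustrative assumptions, not the validated values from this work.

```python
# 1D transient heat conduction with an exothermic cure source (explicit FD).
# All material and kinetic parameters are assumed, for illustration only.
import numpy as np

L, nx = 0.03, 61                        # 3 cm thick laminate, 61 nodes
dx = L / (nx - 1)
k, rho, cp = 0.6, 1550.0, 900.0         # W/m-K, kg/m^3, J/kg-K (assumed)
H, A, Ea, n_ord = 3.0e5, 1.0e5, 7.0e4, 1.5  # J/kg, 1/s, J/mol, order (assumed)
R = 8.314
dt = 0.2 * dx**2 * rho * cp / k         # stable explicit time step

T = np.full(nx, 300.0)                  # temperature, K
alpha = np.zeros(nx)                    # degree of cure
for step in range(20000):
    T[0] = T[-1] = 300.0 + min(step * dt * 0.05, 150.0)  # ramped tool temperature
    rate = A * np.exp(-Ea / (R * T)) * (1.0 - alpha) ** n_ord  # nth-order kinetics
    lap = np.zeros(nx)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    T = T + dt * (k * lap / (rho * cp) + H * rate / cp)  # conduction + exotherm
    alpha = np.minimum(alpha + dt * rate, 1.0)
print(f"centre T = {T[nx//2]:.1f} K, centre cure = {alpha[nx//2]:.2f}")
```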
NASA Technical Reports Server (NTRS)
Goldman, H.; Wolf, M.
1978-01-01
Several experimental and projected Czochralski crystal growing process methods were studied and compared to available operations and cost data from recent production Cz pulling, in order to elucidate the role of the dominant cost-contributing factors. From this analysis, it becomes apparent that the specific add-on costs of the Cz process can be expected to be reduced by about a factor of three by 1982, and by about a factor of five by 1986. A format to guide the accumulation of the data needed for thorough techno-economic analysis of solar cell production processes was developed.
Data processing for a cosmic ray experiment onboard the solar probes Helios 1 and 2: Experiment 6
NASA Technical Reports Server (NTRS)
Mueller-Mellin, R.; Green, G.; Iwers, B.; Kunow, H.; Wibberenz, G.; Fuckner, J.; Hempe, H.; Witte, M.
1982-01-01
The data processing system for Helios experiment 6, which measures energetic charged particles of solar, planetary, and galactic origin in the inner solar system, is described. The aim of this experiment is to extend knowledge of the origin and propagation of cosmic rays. The different programs for data reduction, analysis, presentation, and scientific evaluation are described, as well as the hardware and software of the data processing equipment. A chronological presentation of the data processing operation is given. The procedures and methods developed for data analysis can be used, with minor modifications, for the analysis of other space research experiments.
Relativity Concept Inventory: Development, Analysis, and Results
ERIC Educational Resources Information Center
Aslanides, J. S.; Savage, C. M.
2013-01-01
We report on a concept inventory for special relativity: the development process, data analysis methods, and results from an introductory relativity class. The Relativity Concept Inventory tests understanding of relativistic concepts. An unusual feature is confidence testing for each question. This can provide additional information; for example,…
Estimation Of TMDLs And Margin Of Safety Under Conditions Of Uncertainty
In TMDL development, an adequate margin of safety (MOS) is required in the calculation process to provide the cushion needed because of uncertainties in the data and analysis. Current practices, however, rarely factor the analysis' uncertainty into TMDL development, and the MOS is largel...
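A TMDL is commonly written as TMDL = sum(WLA) + sum(LA) + MOS (wasteload and load allocations plus the margin of safety). The sketch below illustrates one uncertainty-based way to make the MOS explicit: propagate input uncertainty through the loading-capacity calculation by Monte Carlo and set the MOS from a conservative percentile. The distribution and the cfs x mg/L to lb/day factor of 5.39 follow common conventions, but all numbers are illustrative.

```python
# Illustrative Monte Carlo treatment of an explicit margin of safety:
# propagate flow uncertainty through the loading capacity and take the gap
# between the mean estimate and a conservative percentile as the MOS.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
flow = rng.lognormal(mean=np.log(50.0), sigma=0.3, size=n)  # cfs (assumed)
conc_target = 5.0                                           # mg/L criterion
capacity = flow * conc_target * 5.39                        # lb/day conversion

mean_cap = capacity.mean()
p05_cap = np.percentile(capacity, 5)   # conservative low-capacity estimate
mos = mean_cap - p05_cap
print(f"loading capacity (mean) = {mean_cap:,.0f} lb/day, explicit MOS = {mos:,.0f} lb/day")
```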
The role of the PIRT process in identifying code improvements and executing code development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, G.E.; Boyack, B.E.
1997-07-01
In September 1988, the USNRC issued a revised ECCS rule for light water reactors that allows, as an option, the use of best estimate (BE) plus uncertainty methods in safety analysis. The key feature of this licensing option relates to quantification of the uncertainty in the determination that an NPP has a "low" probability of violating the safety criteria specified in 10 CFR 50. To support the 1988 licensing revision, the USNRC and its contractors developed the CSAU evaluation methodology to demonstrate the feasibility of the BE plus uncertainty approach. The PIRT process, Step 3 in the CSAU methodology, was originally formulated to support the BE plus uncertainty licensing option as executed in the CSAU approach to safety analysis. Subsequent work has shown the PIRT process to be a much more powerful tool than conceived in its original form. Through further development and application, the PIRT process has shown itself to be a robust means to establish safety analysis computer code phenomenological requirements in their order of importance to such analyses. Used early in research directed toward these objectives, PIRT results also provide the technical basis and cost effective organization for new experimental programs needed to improve the safety analysis codes for new applications. The primary purpose of this paper is to describe the generic PIRT process, including typical and common illustrations from prior applications. The secondary objective is to provide guidance to future applications of the process to help them focus, in a graded approach, on systems, components, processes and phenomena that have been common in several prior applications.
Evolutionary Capability Delivery of Coast Guard Manpower System
2014-06-01
[Acronym glossary fragment: IID, iterative incremental development model; IT, information technology; MA, major accomplishment; MRA, manpower requirements analysis; MRD, manpower…] The CG will need to ensure that development is low risk. The CG uses Manpower Requirements Analyses (MRAs) to collect the necessary manpower data to…of users. The CG uses two business processes to manage human capital: Manpower Requirements Analysis (MRA) and Manpower Requirements…
ERIC Educational Resources Information Center
Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse
2015-01-01
The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…
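Sequential analysis of coded user behaviour typically tabulates lag-1 transitions and tests which transitions occur above chance. A small sketch under that assumption follows, using Haberman adjusted residuals; the behaviour codes are hypothetical and this is not necessarily the authors' exact statistic.

```python
# Lag-1 sequential analysis sketch: build a transition table from coded
# actions and compute Haberman adjusted residuals; large positive values
# flag behaviour sequences occurring above chance. Codes are invented.
import numpy as np

codes = ["read", "navigate", "read", "answer", "navigate", "read", "answer",
         "read", "navigate", "read"]
labels = sorted(set(codes))
idx = {c: i for i, c in enumerate(labels)}
K = len(labels)

obs = np.zeros((K, K))
for a, b in zip(codes, codes[1:]):
    obs[idx[a], idx[b]] += 1

N = obs.sum()
row, col = obs.sum(1, keepdims=True), obs.sum(0, keepdims=True)
expected = row @ col / N
adj = (obs - expected) / np.sqrt(expected * (1 - row / N) * (1 - col / N))
print(labels)
print(np.round(adj, 2))  # |z| > 1.96 suggests a non-chance transition
```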
Development and Application of a Rubric for Analysis of Novice Students' Laboratory Flow Diagrams
ERIC Educational Resources Information Center
Davidowitz, Bette; Rollnick, Marissa; Fakudze, Cynthia
2005-01-01
The purpose of this study was to develop and apply a scheme for the analysis of flow diagrams. The flow diagrams in question are a schematic representation of written instructions that require students to process the text of their practical manual. It was hoped that an analysis of the flow diagrams would provide insight into students'…
Zhu, Xiao-Jing; Dai, Jie-Qiong; Tan, Xin; Zhao, Yang; Yang, Wei-Jun
2009-03-16
Cysts of Artemia can remain in a dormant state for long periods with a very low metabolic rate, and only resume their development when favorable conditions return. Post-diapause development is a very complicated process involving a variety of metabolic and biochemical events. However, the intrinsic mechanisms that regulate this process are unclear. Herein we report the specific activation of an AMP-activated protein kinase (AMPK) in the post-diapause developmental process of Artemia. Using a phospho-AMPKalpha antibody, AMPK was shown to be phosphorylated during post-diapause development. Results of kinase assay analysis showed that this phosphorylation is essential for AMPK activation. Using whole-mount immunohistochemistry, phosphorylated AMPK was shown to be predominantly located in the ectoderm of early developed embryos in a ring shape; however, the location and shape of the activation region changed as development proceeded. Additionally, Western blotting analysis on different portions of the cyst extracts showed that phosphorylated AMPKalpha localized to the nuclei, and this localization was not affected by intracellular pH. Confocal microscopy analysis of immunofluorescently stained cyst nuclei further showed that AMPKalpha localized to the nuclei when activated. Moreover, cellular AMP, ADP, and ATP levels in developing cysts were determined by HPLC, and the results showed that the activation of Artemia AMPK may not be associated with cellular AMP:ATP ratios, suggesting other pathways for the regulation of Artemia AMPK activity. Together, we report evidence demonstrating the activation of AMPK in developing Artemia cysts and present an argument for its role in development-related gene expression and energy control in certain cells during post-diapause development of Artemia.
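Two standard indices computed from such HPLC nucleotide measurements are the AMP:ATP ratio and Atkinson's adenylate energy charge, EC = ([ATP] + 0.5[ADP]) / ([ATP] + [ADP] + [AMP]). A worked example follows; the concentrations are invented for illustration, not the paper's data.

```python
# AMP:ATP ratio and Atkinson's adenylate energy charge from HPLC-style
# nucleotide measurements. Sample values are hypothetical.
def energy_charge(atp, adp, amp):
    return (atp + 0.5 * adp) / (atp + adp + amp)

samples = {  # nmol per mg protein, invented
    "0 h post-diapause": (1.2, 0.8, 1.5),
    "24 h post-diapause": (2.6, 0.7, 0.4),
}
for name, (atp, adp, amp) in samples.items():
    print(f"{name}: AMP:ATP = {amp/atp:.2f}, "
          f"energy charge = {energy_charge(atp, adp, amp):.2f}")
```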
An Ethical Principle for Social Justice in Community Development Practice.
ERIC Educational Resources Information Center
Sabre, Ru Michael
1980-01-01
Defines community development and shows how community development as an educational process embodies an ethical principle which, when applied to the analysis of community practices, promotes justice. (JOW)
Etiaba, Enyi; Uguru, Nkoli; Ebenso, Bassey; Russo, Giuliano; Ezumah, Nkoli; Uzochukwu, Benjamin; Onwujekwe, Obinna
2015-05-06
In Nigeria, there is a high burden of oral health diseases, poor coordination of health services and human resources for delivery of oral health services. Previous attempts to develop an Oral Health Policy (OHP) to decrease the oral disease burden failed. However, a policy was eventually developed in November 2012. This paper explores the role of contextual factors, actors, and the policy process in the development of the OHP, and possible reasons why the current approved OHP succeeded. The study was undertaken across Nigeria; information was gathered through document reviews and in-depth interviews with five groups of purposively selected respondents. Analysis of the policy development process was guided by the policy triangle framework, examining the context, policy process, and actors involved in the policy development. The foremost enabling factor was the yearning among policy actors for a policy, after four failed attempts. Other factors were the presence of a democratically elected government and a framework for health sector reform instituted by the Federal Ministry of Health (FMOH). The approved OHP went through all the stages required for policy development, unlike the previous attempts. Three groups of actors played crucial roles in the process, namely academics/researchers, development partners, and policy makers. They either had decision-making powers or influenced policy through funding or the technical ability to generate credible research evidence, all sharing a common interest in developing the OHP. Although evidence was used to inform the development of the policy, the complex interactions between the context and the actors facilitated its approval. The OHP development succeeded through a complex inter-relationship of context, process, and actors, clearly illustrating that none of these factors could have, in isolation, catalyzed the policy development. Availability of evidence is necessary but not sufficient for developing policies in this area. The wider socio-political contexts in which actors develop policy can facilitate and/or constrain actors' roles and interests as well as the policy process. These must be taken into consideration at all stages of policy development in order to produce policies that will strengthen the health system, especially in low- and middle-income countries, where policy processes and influences can often be less than transparent.
Analysis of Hospital Processes with Process Mining Techniques.
Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises
2015-01-01
Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for the analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed for redefining functions in the system and proposing a proper flow of information. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application for making clinical and administrative decisions for the management of hospital activities.
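A minimal sketch of the discovery step such techniques start from: group events by case, order them by timestamp, and count the directly-follows relation. The event log rows are fabricated, and a real deployment would use a process-mining library rather than this toy.

```python
# Directly-follows counting from a (case_id, activity, timestamp) event log,
# the raw material of process-discovery algorithms. Rows are invented.
from collections import defaultdict

log = [
    (1, "admit", 1), (1, "triage", 2), (1, "treat", 3), (1, "discharge", 4),
    (2, "admit", 1), (2, "treat", 2), (2, "triage", 3), (2, "discharge", 5),
]

cases = defaultdict(list)
for case, act, ts in log:
    cases[case].append((ts, act))

dfg = defaultdict(int)
for events in cases.values():
    trace = [a for _, a in sorted(events)]     # order each case by time
    for a, b in zip(trace, trace[1:]):
        dfg[(a, b)] += 1

for (a, b), n in sorted(dfg.items(), key=lambda kv: -kv[1]):
    print(f"{a} -> {b}: {n}")
```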
Midwifery participatory curriculum development: Transformation through active partnership.
Sidebotham, Mary; Walters, Caroline; Chipperfield, Janine; Gamble, Jenny
2017-07-01
Evolving knowledge and professional practice combined with advances in pedagogy and learning technology create challenges for accredited professional programs. Internationally a sparsity of literature exists around curriculum development for professional programs responsive to regulatory and societal drivers. This paper evaluates a participatory curriculum development framework, adapted from the community development sector, to determine its applicability to promote engagement and ownership during the development of a Bachelor of Midwifery curriculum at an Australian University. The structures, processes and resulting curriculum development framework are described. A representative sample of key curriculum development team members were interviewed in relation to their participation. Qualitative analysis of transcribed interviews occurred through inductive, essentialist thematic analysis. Two main themes emerged: (1) 'it is a transformative journey' and (2) focused 'partnership in action'. Results confirmed the participatory curriculum development process provides symbiotic benefits to participants leading to individual and organisational growth and the perception of a shared curriculum. A final operational model using a participatory curriculum development process to guide the development of accredited health programs emerged. The model provides an appropriate structure to create meaningful collaboration with multiple stakeholders to produce a curriculum that is contemporary, underpinned by evidence and reflective of 'real world' practice. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Sanada, Masakazu; Tamada, Osamu; Ishikawa, Atsushi; Kawai, Akira
2005-05-01
The adhesion properties of resist are characterized with the DPAT (direct peeling with atomic force microscope (AFM) tip) method, using 193 nm resist patterns with 180 nm dot shapes developed for developing times between 12 and 120 seconds, in order to analyze why the short-develop-time process suppresses pattern collapse. The surface free energy and refractive index of the resist film were also investigated as a function of developing time from a thermodynamic point of view. A balance model of surface energies was adopted for analyzing the intrusion of developer solution into the resist-substrate interface. It can be explained quantitatively that the intrusion energy of the developer solution acts to weaken the adhesion strength of the resist pattern to the substrate. Furthermore, the intrusion energy became larger with increasing developing time. Analysis with the DPAT method indicates that pattern collapse occurs accompanied by interface and cohesion destruction. From an interface-science perspective, the short-develop-time process proved effective in suppressing pattern collapse because of the higher adhesion energy of the resist pattern to the substrate at shorter developing times.
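For reference, a standard piece of interfacial-energy bookkeeping behind this kind of balance model (a sketch, not necessarily the authors' exact formulation; subscripts r = resist, s = substrate, l = developer liquid):

```latex
% Energy change when developer liquid replaces the dry resist-substrate
% interface by two wetted interfaces (standard wetting thermodynamics):
\Delta E_{\mathrm{intrusion}}
  \;=\; \left(\gamma_{rl} + \gamma_{sl}\right) \;-\; \gamma_{rs}
```

Intrusion of developer into the resist-substrate interface is energetically favored when this quantity is negative, i.e. when the two wetted interfaces cost less than the dry one; the more negative it is, the weaker the effective pattern adhesion, consistent with the trend reported above.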
Modeling and Analysis of Power Processing Systems (MAPPS). Volume 1: Technical report
NASA Technical Reports Server (NTRS)
Lee, F. C.; Rahman, S.; Carter, R. A.; Wu, C. H.; Yu, Y.; Chang, R.
1980-01-01
Computer aided design and analysis techniques were applied to power processing equipment. Topics covered include: (1) discrete time domain analysis of switching regulators for performance analysis; (2) design optimization of power converters using augmented Lagrangian penalty function technique; (3) investigation of current-injected multiloop controlled switching regulators; and (4) application of optimization for Navy VSTOL energy power system. The generation of the mathematical models and the development and application of computer aided design techniques to solve the different mathematical models are discussed. Recommendations are made for future work that would enhance the application of the computer aided design techniques for power processing systems.
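Item (2) names the augmented Lagrangian penalty function technique. A minimal sketch of that method on a toy equality-constrained problem follows (the converter-design objective itself is not reproduced); it assumes NumPy and SciPy are available.

```python
# Augmented Lagrangian sketch: minimize a quadratic subject to x0 + x1 = 1.
# Inner unconstrained solves alternate with multiplier updates.
import numpy as np
from scipy.optimize import minimize

f = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2
h = lambda x: x[0] + x[1] - 1.0          # equality constraint h(x) = 0

lam, mu = 0.0, 10.0                      # multiplier estimate, penalty weight
x = np.zeros(2)
for _ in range(20):
    aug = lambda x: f(x) + lam * h(x) + 0.5 * mu * h(x) ** 2
    x = minimize(aug, x).x               # inner unconstrained solve
    lam += mu * h(x)                     # multiplier update
print(x, h(x))  # tends toward (0, 1), the constrained optimum
```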
NASA Technical Reports Server (NTRS)
Afjeh, Abdollah A.; Reed, John A.
2003-01-01
This research is aimed at developing a neiv and advanced simulation framework that will significantly improve the overall efficiency of aerospace systems design and development. This objective will be accomplished through an innovative integration of object-oriented and Web-based technologies ivith both new and proven simulation methodologies. The basic approach involves Ihree major areas of research: Aerospace system and component representation using a hierarchical object-oriented component model which enables the use of multimodels and enforces component interoperability. Collaborative software environment that streamlines the process of developing, sharing and integrating aerospace design and analysis models. . Development of a distributed infrastructure which enables Web-based exchange of models to simplify the collaborative design process, and to support computationally intensive aerospace design and analysis processes. Research for the first year dealt with the design of the basic architecture and supporting infrastructure, an initial implementation of that design, and a demonstration of its application to an example aircraft engine system simulation.
Development of an Aeroelastic Modeling Capability for Transient Nozzle Side Load Analysis
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Zhao, Xiang; Zhang, Sijun; Chen, Yen-Sen
2013-01-01
Lateral nozzle forces are known to cause severe structural damage to any new rocket engine in development. Currently there is no fully coupled computational tool to analyze this fluid/structure interaction process. The objective of this study was to develop a fully coupled aeroelastic modeling capability to describe the fluid/structure interaction process during transient nozzle operations. The aeroelastic model is composed of three components: a computational fluid dynamics component based on an unstructured-grid, pressure-based computational fluid dynamics formulation; a computational structural dynamics component developed in the framework of modal analysis; and a fluid-structure interface component. The developed aeroelastic model was applied to the transient nozzle startup process of the Space Shuttle Main Engine at sea level. The computed nozzle side loads and the axial nozzle wall pressure profiles from the aeroelastic nozzle are compared with the published rigid-nozzle results, and the impact of the fluid/structure interaction on nozzle side loads is interrogated and presented.
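A minimal sketch of the modal computational-structural-dynamics idea: each structural mode reduces to a damped oscillator driven by a generalized force projected from the fluid loads. Here the CFD side is replaced by a prescribed forcing, and the frequency, damping, and time step are illustrative assumptions.

```python
# Single structural mode driven by a generalized force, stepped with
# semi-implicit (symplectic) Euler. The forcing stands in for phi^T F
# from the fluid side; all numbers are illustrative.
import numpy as np

omega, zeta = 2 * np.pi * 10.0, 0.02   # 10 Hz mode, 2% damping (assumed)
dt, nsteps = 1e-4, 50000
q = qdot = 0.0
for i in range(nsteps):
    t = i * dt
    Q = np.sin(2 * np.pi * 9.5 * t)    # near-resonant stand-in forcing
    qdd = Q - 2 * zeta * omega * qdot - omega**2 * q
    qdot += dt * qdd
    q += dt * qdot
print(f"modal amplitude after {nsteps*dt:.1f} s: {q:.4e}")
```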
An Analysis of Japanese University Students' Oral Performance in English Using Processability Theory
ERIC Educational Resources Information Center
Sakai, Hideki
2008-01-01
This paper presents a brief summary of processability theory as proposed by [Pienemann, M., 1998a. "Language Processing and Second Language Development: Processability Theory." John Benjamins, Amsterdam; Pienemann, M., 1998b. "Developmental dynamics in L1 and L2 acquisition: processability theory and generative entrenchment." "Bilingualism:…
Adaptation of in-situ microscopy for crystallization processes
NASA Astrophysics Data System (ADS)
Bluma, A.; Höpfner, T.; Rudolph, G.; Lindner, P.; Beutel, S.; Hitzmann, B.; Scheper, T.
2009-08-01
In biotechnological and pharmaceutical engineering, the study of crystallization processes is gaining importance. An efficient analytical inline sensor could help to improve knowledge about these processes in order to increase efficiency and yields. The in-situ microscope (ISM) is an optical sensor developed for the monitoring of bioprocesses. A new application for this sensor is monitoring in downstream processes, e.g. the crystallization of proteins and other organic compounds. This contribution shows new aspects of using in-situ microscopy to monitor crystallization processes. Crystals of different chemical compounds were precipitated from supersaturated solutions and the crystal growth was monitored. Morphological properties and different forms of crystals could be distinguished on the basis of offline experiments. For inline monitoring of crystallization processes, a special 0.5 L stirred tank reactor was developed and equipped with the in-situ microscope. This reactor was used to carry out batch experiments for crystallizations of O-acetylsalicylic acid (ASS) and hen egg white lysozyme (HEWL). During the whole crystallization process, the in-situ microscope system acquired images directly from the crystallization broth. For the data evaluation, an image analysis algorithm was developed and implemented in the microscope analysis software.
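A minimal sketch of the kind of image-analysis step such software performs: segment bright objects in a frame and report their count and sizes. It uses scikit-image on a synthetic frame; the ISM's actual algorithm is not reproduced here.

```python
# Segment and count bright objects in a synthetic microscope frame
# (Otsu threshold + connected-component labeling with scikit-image).
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

rng = np.random.default_rng(1)
frame = rng.normal(0.1, 0.02, (256, 256))   # noisy background
for _ in range(12):                          # paint synthetic "crystals"
    r, c = rng.integers(20, 236, 2)
    frame[r - 4:r + 4, c - 4:c + 4] += 0.5

mask = frame > threshold_otsu(frame)
regions = regionprops(label(mask))
areas = [reg.area for reg in regions]
print(f"{len(regions)} objects, mean area {np.mean(areas):.1f} px")
```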
Patel, Darshan C; Lyu, Yaqi Fara; Gandarilla, Jorge; Doherty, Steve
2018-04-03
In-process sampling and analysis is an important aspect of monitoring kinetic profiles and impurity formation or rejection, both in development and during commercial manufacturing. In pharmaceutical process development, the technology of choice for a substantial portion of this analysis is high-performance liquid chromatography (HPLC). Traditionally, the sample extraction and preparation for reaction characterization have been performed manually. This can be time consuming, laborious, and impractical for long processes. Depending on the complexity of the sample preparation, there can be variability introduced by different analysts, and in some cases, the integrity of the sample can be compromised during handling. While there are commercial instruments available for on-line monitoring with HPLC, they lack capabilities in many key areas. Some do not provide integration of the sampling and analysis, while others afford limited flexibility in sample preparation. The current offerings provide a limited number of unit operations available for sample processing and no option for workflow customizability. This work describes development of a microfluidic automated program (MAP) which fully automates the sample extraction, manipulation, and on-line LC analysis. The flexible system is controlled using an intuitive Microsoft Excel based user interface. The autonomous system is capable of unattended reaction monitoring that allows flexible unit operations and workflow customization to enable complex operations and on-line sample preparation. The automated system is shown to offer advantages over manual approaches in key areas while providing consistent and reproducible in-process data. Copyright © 2017 Elsevier B.V. All rights reserved.
Process development for automated solar cell and module production. Task 4: Automated array assembly
NASA Technical Reports Server (NTRS)
1980-01-01
A process sequence which can be used in conjunction with automated equipment for the mass production of solar cell modules for terrestrial use was developed. The process sequence was then critically analyzed from a technical and economic standpoint to determine the technological readiness of certain process steps for implementation. The steps receiving analysis were: back contact metallization, automated cell array layup/interconnect, and module edge sealing. For automated layup/interconnect, both hard automation and programmable automation (using an industrial robot) were studied. The programmable automation system was then selected for actual hardware development.
NASA Technical Reports Server (NTRS)
Falke, Stefan; Husar, Rudolf
2011-01-01
The goal of this REASoN applications and technology project is to deliver and use Earth Science Enterprise (ESE) data and tools in support of air quality management. Its scope falls within the domain of air quality management and aims to develop a federated air quality information sharing network that includes data from NASA, EPA, US states, and others. Project goals were achieved through access to satellite and ground observation data, web services information technology, interoperability standards, and air quality community collaboration. In contributing to a network of NASA ESE data in support of particulate air quality management, the project developed access to distributed data, built Web infrastructure, and created tools for data processing and analysis. The key technologies used in the project include emerging web services for developing self-describing and modular data access and processing tools, and a service-oriented architecture for chaining web services together to assemble customized air quality management applications. The technology and tools required for this project were developed within DataFed.net, a shared infrastructure that supports collaborative atmospheric data sharing and processing web services. Much of the collaboration was facilitated through community interactions in the Federation of Earth Science Information Partners (ESIP) Air Quality Workgroup. The main activities during the project that successfully advanced DataFed, enabled air quality applications, and established community-oriented infrastructure were: developing access to distributed data (surface and satellite); building Web infrastructure to support data access, processing, and analysis; creating tools for data processing and analysis; and fostering air quality community collaboration and interoperability.
Launch COLA Gap Analysis for Protection of the International Space Station
NASA Astrophysics Data System (ADS)
Jenkin, Alan B.; McVey, John P.; Peterson, Glenn E.; Sorge, Marlon E.
2013-08-01
For launch missions in general, a collision avoidance (COLA) gap exists between the end of the time interval covered by standard launch COLA screening and the time that other spacecraft can clear a collision with the newly launched objects. To address this issue for the International Space Station (ISS), a COLA gap analysis process has been developed. The first part of the process, nodal separation analysis, identifies launch dates and launch window opportunities when the orbit traces of a launched object and the ISS could cross during the COLA gap. The second and newest part of the analysis process, Monte Carlo conjunction probability analysis, is performed closer to the launch dates of concern to reopen some of the launch window opportunities that would be closed by nodal separation analysis alone. Both parts of the process are described and demonstrated on sample missions.
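A minimal sketch of the Monte Carlo conjunction-probability idea: sample the relative miss vector in the encounter plane from a combined position covariance and count samples falling inside the combined hard-body radius. All numbers are illustrative, not ISS or launch-vehicle data.

```python
# Monte Carlo collision-probability sketch in the encounter plane.
# Mean miss, covariance, and hard-body radius are invented values.
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000
mean_miss = np.array([200.0, 80.0])          # metres, encounter-plane miss
cov = np.array([[250.0**2, 0.0],
                [0.0, 120.0**2]])            # combined covariance, m^2
hard_body_radius = 70.0                      # combined radius, metres

samples = rng.multivariate_normal(mean_miss, cov, size=n)
hits = np.linalg.norm(samples, axis=1) < hard_body_radius
print(f"collision probability ~ {hits.mean():.2e}")
```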
Health system decentralisation in Nepal: identifying the issues.
Collins, Charles; Omar, Mayeh; Adhikari, Damodar; Dhakal, Ramji; Emmel, Nick; Dhakal, Megha Raj; Chand, Padam; Thapa, Druba; Singh, Arjun B
2007-01-01
The purpose of this paper is to describe and discuss policy analysis in Nepal and review the wide range of choices feasible in decentralisation decision making. In this paper an iterative qualitative method was developed and used in the research, which consisted of focus group interviews, key informant interviews, document analysis, including descriptive statistics, and analysis of the policy context. Participants in the research reflected the urban/rural mix of districts and the geography of Nepal. Analysis combined transcribed interviews with findings from document searches and analysis of the policy context. Coding was pre-determined during the training workshop and further codes were generated during and after the fieldwork. The paper finds that Nepal is in the process of decentralising public services from the central level to the local level, particularly to local bodies: District Development Committees (DDCs), Village Development Committees (VDCs) and Municipalities. Key contextual factors referred to are the overall structure of decentralisation, the social context of poverty and the political instability leading to a fluid political situation characterised by political tension, armed conflict, controversies and agreements while carrying out the research. The key issues identified and discussed in the paper are the policy process leading to decentralisation, the organisational structure and tension in the proposed system, the systems of resource generation, allocation, planning and management and lastly the forms of accountability, participation, public-private relations and collaborative strategies. The paper discusses the challenges faced in conducting such a policy analysis, the broad ranging and unremitting nature of the decentralisation process, and the contextual setting of the process of change.
Contractor relationships and inter-organizational strategies in NASA's R and D acquisition process
NASA Technical Reports Server (NTRS)
Guiltinan, J.
1976-01-01
Interorganizational analysis of NASA's acquisition process for research and development systems is discussed. The importance of understanding the contractor environment, constraints, and motives in selecting an acquisition strategy is demonstrated. By articulating clear project goals, by utilizing information about the contractor and his needs at each stage in the acquisition process, and by thorough analysis of the inter-organizational relationship, improved selection of acquisition strategies and business practices is possible.
An empirical analysis of the effects of consanguineous marriages on economic development.
Bildirici, Melike; Kökdener, Meltem; Ersin, Oezgür ömer
2010-01-01
In this study, development experiences toward economic development are investigated to provide an alternative analysis of economic development, human capital, and genetic inheritance in the light of consanguineous marriages. The countries analyzed in the study are discussed in accordance with consanguineous marriage practices and classified by their per capita gross domestic product (GDP) growth. A broad range of countries are analyzed in the study. Arab countries that experienced high rates of growth in their gross national income during the twentieth century but failed to fulfill adequate development measures as reflected in the growth in national income, countries undergoing transition from tight government regulation to free market democracy, and African nations that have experienced complications in the process of development show important differences in the process of economic development. It is shown that the countries that have reached high average development within the context of per capita GDP have overcome problems integral to consanguineous marriage.
SLS Navigation Model-Based Design Approach
NASA Technical Reports Server (NTRS)
Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas
2018-01-01
The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center, where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs), which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of the overlap between the requirements development and management process and the design and analysis process by efficiently combining the control (i.e., the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and management of design requirements to the development of usable models, model requirements, and model verification and validation efforts. The models themselves are represented in C/C++ code and accompanying data files. Under the idealized process, potential ambiguity in specification is reduced because the model must be implementable, whereas a requirement is not necessarily subject to this constraint. Further, the models are shown to emulate the hardware during validation. For models developed by the Navigation Team, a common interface/standalone environment was developed. The common environment allows for easy implementation in design and analysis tools. Mechanisms such as unit test cases ensure implementation as the developer intended. The model verification and validation process provides a very high level of component design insight. The origin and implementation of the SLS variant of Model-based Design are described from the perspective of the SLS Navigation Team. The format of the models and the requirements is described. The Model-based Design approach has many benefits but is not without potential complications. Key lessons learned associated with the implementation of the Model-based Design approach and process, from infancy to verification and certification, are discussed.
Characteristics study of the gears by the CAD/CAE
NASA Astrophysics Data System (ADS)
Wang, P. Y.; Chang, S. L.; Lee, B. Y.; Nguyen, D. H.; Cao, C. W.
2017-09-01
Gears are the most important transmission components in machines. The rapid development of machines in industry requires a shorter analysis process. Traditionally, gears are analyzed by first setting up the complete mathematical model, considering the profile of the cutter and the coordinate-system relationship between the machine and the cutter. This is a complex and time-consuming process. Recently, CAD/CAE software has become well developed and useful in mechanical design. In this paper, the Autodesk Inventor® software is first used to model the spherical gears, and the models are then transferred into ANSYS Workbench for finite element analysis. The process proposed in this paper helps engineers speed up the analysis of gears in the design stage.
Containerless processing of undercooled melts
NASA Technical Reports Server (NTRS)
Perepezko, J. H.
1993-01-01
The investigation focused on the control of microstructural evolution in Mn-Al, Fe-Ni, Ni-V, and Au-Pb-Sb alloys through the high undercooling levels provided by containerless processing, and provided fundamental new information on the control of nucleation. Solidification analysis was conducted by means of thermal analysis, x-ray diffraction, and metallographic characterization on samples processed in a laboratory-scale drop tube system. The Mn-Al alloy system offers a useful model system with the capability of phase separation on an individual particle basis, thus permitting a more complete understanding of the operative kinetics and the key containerless processing variables. This system provided the opportunity of analyzing the nucleation rate as a function of processing conditions and allowed for the quantitative assessment of the relevant processing parameters. These factors are essential in the development of a containerless processing model which has a predictive capability. Similarly, Ni-V is a model system that was used to study duplex partitionless solidification, a structure possible only in high-undercooling solidification processes. Nucleation kinetics for the competing bcc and fcc phases were studied to determine how this structure can develop and the conditions under which it may occur. The Fe-Ni alloy system was studied to identify microstructural transitions with controlled variations in sample size and composition during containerless solidification. This work was extended to develop a microstructure map which delineates regimes of structural evolution and provides a unified analysis of experimental observations. The Au-Pb-Sb system was investigated to characterize the thermodynamic properties of the undercooled liquid phase and to characterize the glass transition under a variety of processing conditions. By analyzing key containerless processing parameters in a ground-based drop tube study, a carefully designed flight experiment may be planned to utilize the extended-duration microgravity conditions of orbiting spacecraft.
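For context, textbook classical nucleation theory links the steady-state nucleation rate to undercooling (a sketch of the standard relations, not the paper's fitted kinetic model):

```latex
% Classical nucleation theory: steady-state rate and critical work of
% cluster formation (standard textbook forms, given as context only).
J \;=\; J_0 \exp\!\left(-\frac{\Delta G^{*}}{k_B T}\right),
\qquad
\Delta G^{*} \;=\; \frac{16\pi\,\gamma_{sl}^{3}}{3\,\Delta G_v^{2}},
\qquad
\Delta G_v \;\approx\; \Delta S_f\,\Delta T
```

Here \(\gamma_{sl}\) is the solid-liquid interfacial energy and \(\Delta G_v\) the volumetric driving free energy, which grows roughly linearly with undercooling \(\Delta T\) (Turnbull's approximation). Deeper undercooling therefore raises the nucleation rate sharply, which is why containerless suppression of heterogeneous nucleation sites gives such leverage over microstructure.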
Analysis of THG modes for femtosecond laser pulse
NASA Astrophysics Data System (ADS)
Trofimov, Vyacheslav A.; Sidorov, Pavel S.
2017-05-01
THG is nowadays used in many practical applications, such as substance diagnostics and imaging of biological objects. With the development of new materials and technology (for example, photonic crystals), attention to the analysis of the THG process grows, so understanding THG features is a current problem. Earlier, we developed a new analytical approach based on using a problem invariant to construct an analytical solution of the THG process. It should be stressed that we did not use the basic-wave non-depletion approximation; nevertheless, the long-pulse-duration and plane-wave approximations were applied. The analytical solution demonstrates, in particular, an optical bistability property (among other regimes of frequency tripling) for the third harmonic generation process. Obviously, however, this approach does not reflect the influence of medium dispersion on the frequency tripling. Therefore, in this paper we analyze the THG efficiency of a femtosecond laser pulse taking into account the effect of second-order dispersion as well as the effects of self- and cross-modulation of the interacting waves on the frequency conversion process. The analysis is made using computer simulation on the basis of Schrödinger equations describing the process under consideration.
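For orientation, the plane-wave coupled-amplitude equations for direct chi-3 third-harmonic generation take the following standard form (up to coupling-coefficient conventions; a sketch, not the paper's exact system):

```latex
% Coupled amplitudes for the pump A_1 and third harmonic A_3,
% with gamma_j proportional to omega_j chi^(3) / (n_j c):
\frac{dA_1}{dz} = i\gamma_1\!\left[\left(|A_1|^2 + 2|A_3|^2\right)A_1
                  + A_1^{*\,2}A_3\,e^{-i\,\Delta k\,z}\right],
\qquad
\frac{dA_3}{dz} = i\gamma_3\!\left[\left(2|A_1|^2 + |A_3|^2\right)A_3
                  + \tfrac{1}{3}A_1^{3}\,e^{\,i\,\Delta k\,z}\right],
\qquad
\Delta k = 3k_1 - k_3
```

The \(|A|^2\) terms are the self- and cross-phase modulation mentioned above, and \(\Delta k\) carries the medium dispersion; for femtosecond pulses, group-velocity and second-order dispersion terms in time are added to each equation, which is the regime the paper simulates.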
Analysis of stress-strain relationships in silicon ribbon
NASA Technical Reports Server (NTRS)
Dillon, O. W., Jr.
1984-01-01
An analysis of stress-strain relationships in silicon ribbon is presented. A model representing the entire process, termed dynamic Transit Analysis, is developed. It is found that knowledge of past strain history is significant in the modeling activities.
2007-05-01
of the current project was to unpack and develop the concept of sensemaking, principally by developing and testing a cognitive model of the processes...themselves. In Year 2, new Cognitive Task Analysis data collection methods were developed and used to further test the model. Cognitive Task Analysis is a...2004) to examine the phenomenon of "sensemaking," a concept initially formulated by Weick (1995), but not developed from a cognitive perspective
ERIC Educational Resources Information Center
Losinski, Mickey Lee
2013-01-01
Structural analysis (SA) is an assessment process developed to analyze hypothesized relationships between contextual variables and subsequent behaviors. In the present study, an alternating treatments design investigated the effectiveness of environmentally-based interventions to reduce disruptive behaviors and increase on-task behaviors of…
ERIC Educational Resources Information Center
MacNeela, Pádraig; Gannon, Niall
2014-01-01
Volunteering among university students is an important expression of civic engagement, but the impact of this experience on the development of emerging adults requires further contextualization. Adopting interpretative phenomenological analysis as a qualitative research approach, we carried out semistructured interviews with 10 students of one…
Meta-Analysis: An Approach to Interview Success.
ERIC Educational Resources Information Center
McCaslin, Mark; Carlson, Nancy M.
An initial research step, developing an effective interview strategy, presents unique challenges for novice and master researchers alike. To focus qualitative research on the human ecology of the study, the strategy presented in this paper used an initial interview protocol and preanalysis process, called meta-analysis, prior to developing the formal…
NASA Technical Reports Server (NTRS)
Davis, Frank W.; Quattrochi, Dale A.; Ridd, Merrill K.; Lam, Nina S.-N.; Walsh, Stephen J.
1991-01-01
This paper discusses some basic scientific issues and research needs in the joint processing of remotely sensed and GIS data for environmental analysis. Two general topics are treated in detail: (1) scale dependence of geographic data and the analysis of multiscale remotely sensed and GIS data, and (2) data transformations and information flow during data processing. The discussion of scale dependence focuses on the theory and applications of spatial autocorrelation, geostatistics, and fractals for characterizing and modeling spatial variation. Data transformations during processing are described within the larger framework of geographical analysis, encompassing sampling, cartography, remote sensing, and GIS. Development of better user interfaces between image processing, GIS, database management, and statistical software is needed to expedite research on these and other impediments to integrated analysis of remotely sensed and GIS data.
A cost-benefit analysis for materials management information systems.
Slapak-Iacobelli, L; Wilde, A H
1993-02-01
The cost-benefit analysis provided the system planners with valuable information that served many purposes. It answered the following questions: Why was the CCF undertaking this project? What were the alternatives? How much was it going to cost? And what was the expected outcome? The process of developing the cost-benefit document kept the project team focused. It also motivated them to involve additional individuals from materials management and accounts payable in its development. A byproduct of this involvement was buy-in and commitment to the project by everyone in these areas. Consequently, the project became a team effort championed by many and not just one. We were also able to introduce two new information system processes: 1) a management review process with goals and anticipated results, and 2) a quality assurance process that ensured the CCF had a better product in the end. The cost-benefit analysis provided a planning tool that assisted in successful implementation of an integrated materials management information system.
Modeling of laser transmission contour welding process using FEA and DoE
NASA Astrophysics Data System (ADS)
Acherjee, Bappa; Kuar, Arunanshu S.; Mitra, Souren; Misra, Dipten
2012-07-01
In this research, a systematic investigation on laser transmission contour welding process is carried out using finite element analysis (FEA) and design of experiments (DoE) techniques. First of all, a three-dimensional thermal model is developed to simulate the laser transmission contour welding process with a moving heat source. The commercial finite element code ANSYS® multi-physics is used to obtain the numerical results by implementing a volumetric Gaussian heat source, and combined convection-radiation boundary conditions. Design of experiments together with regression analysis is then employed to plan the experiments and to develop mathematical models based on simulation results. Four key process parameters, namely power, welding speed, beam diameter, and carbon black content in absorbing polymer, are considered as independent variables, while maximum temperature at weld interface, weld width, and weld depths in transparent and absorbing polymers are considered as dependent variables. Sensitivity analysis is performed to determine how different values of an independent variable affect a particular dependent variable.
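To make the DoE step concrete, the sketch below fits a main-effects regression model to a handful of hypothetical simulation runs and derives a crude sensitivity ranking; the factor names follow the study, but the run data and the linear model are illustrative assumptions, not the paper's full response-surface models.

```python
import numpy as np

# Hypothetical DoE runs: (power [W], speed [mm/s], beam diameter [mm],
# carbon black [wt%]) -> weld width [mm]. Values are invented for
# illustration, not taken from the study.
X = np.array([
    [10, 20, 1.0, 0.05],
    [15, 20, 1.0, 0.10],
    [10, 30, 1.0, 0.10],
    [10, 20, 2.0, 0.10],
    [15, 30, 2.0, 0.05],
    [15, 30, 1.0, 0.05],
    [10, 30, 2.0, 0.05],
    [15, 20, 2.0, 0.10],
])
y = np.array([1.10, 1.45, 0.92, 1.18, 1.30, 1.21, 0.98, 1.52])

# Fit a main-effects regression model: width ~ b0 + b . x
A = np.column_stack([np.ones(len(X)), X])
b, *_ = np.linalg.lstsq(A, y, rcond=None)

# Crude sensitivity measure: each coefficient scaled by its factor's
# range, i.e. the predicted change in weld width across the design space.
ranges = X.max(axis=0) - X.min(axis=0)
for name, s in zip(["power", "speed", "beam dia", "carbon black"],
                   b[1:] * ranges):
    print(f"{name:13s} effect over range: {s:+.3f} mm")
```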
Gubicza, Krisztina; Nieves, Ismael U; Sagues, William J; Barta, Zsolt; Shanmugam, K T; Ingram, Lonnie O
2016-05-01
A techno-economic analysis was conducted for a simplified lignocellulosic ethanol production process developed and proven by the University of Florida at laboratory, pilot, and demonstration scales. Data obtained from all three scales of development were used with Aspen Plus to create models for an experimentally-proven base-case and 5 hypothetical scenarios. The model input parameters that differed among the hypothetical scenarios were fermentation time, enzyme loading, enzymatic conversion, solids loading, and overall process yield. The minimum ethanol selling price (MESP) varied between 50.38 and 62.72 US cents/L. The feedstock and the capital cost were the main contributors to the production cost, comprising between 23-28% and 40-49% of the MESP, respectively. A sensitivity analysis showed that overall ethanol yield had the greatest effect on the MESP. These findings suggest that future efforts to increase the economic feasibility of a cellulosic ethanol process should focus on optimization for highest ethanol yield. Copyright © 2016 Elsevier Ltd. All rights reserved.
Eco-Efficiency Analysis of biotechnological processes.
Saling, Peter
2005-07-01
Eco-Efficiency has been variously defined and analytically implemented by several workers. In most cases, Eco-Efficiency is taken to mean the ecological optimization of overall systems while not disregarding economic factors. Eco-Efficiency should increase the positive ecological performance of a commercial company in relation to economic value creation, or reduce negative effects. Several companies use Eco-Efficiency Analysis for decision-making processes, and industrial examples of best practices in developing and implementing Eco-Efficiency have been reviewed. They clearly demonstrate the environmental and business benefits of Eco-Efficiency. An instrument for the early recognition and systematic detection of economic and environmental opportunities and risks for production processes in the chemical industry has been in use since 1997, and various new features have since been developed, leading to many examples. This powerful Eco-Efficiency Analysis allows a feasibility evaluation of existing and future business activities and is applied by BASF. In many cases, decision-makers are able to choose among alternative processes for making a product.
An Overview of the Role of Systems Analysis in NASA's Hypersonics Project
NASA Technical Reports Server (NTRS)
Robinson, Jeffrey S.; Martin, John G.; Bowles, Jeffrey V.
2006-01-01
NASA's Aeronautics Research Mission Directorate recently restructured its Vehicle Systems Program, refocusing it towards understanding the fundamental physics that govern flight in all speed regimes. Now called the Fundamental Aeronautics Program, it comprises four new projects: Subsonic Fixed Wing, Subsonic Rotary Wing, Supersonics, and Hypersonics. The Aeronautics Research Mission Directorate has charged the Hypersonics Project with developing a basic understanding of all systems that travel at hypersonic speeds within the atmospheres of the Earth and other planets. This includes both powered and unpowered systems, such as re-entry vehicles and vehicles powered by rocket or airbreathing propulsion that cruise in and accelerate through the atmosphere. The primary objective of the Hypersonics Project is to develop physics-based predictive tools that enable the design, analysis, and optimization of such systems. The Hypersonics Project charges the systems analysis discipline team with providing the decision-making information it needs to properly guide research and technology development. Credible, rapid, and robust multi-disciplinary system analysis processes and design tools are required in order to generate this information. To this end, the principal challenges for the systems analysis team are the introduction of high-fidelity physics into the analysis process and integration into a design environment, quantification of design uncertainty through the use of probabilistic methods, reduction in design cycle time, and the development and implementation of robust processes and tools enabling a wide design space and associated technology assessment capability. This paper discusses the roles and responsibilities of the systems analysis discipline team within the Hypersonics Project, as well as the tools, methods, processes, and approach that the team will use to perform its project-designated functions.
The road to smoke-free legislation in Ireland.
Currie, Laura M; Clancy, Luke
2011-01-01
To describe the process through which Ireland changed its policies towards smoking in workplaces and distil lessons for others implementing or extending smoke-free laws. This analysis is informed by a review of secondary sources including a commissioned media analysis, documentary analysis and key informant interviews with policy actors who provide insight into the process of smoke-free policy development. The policy analysis techniques used include the development of a time-line for policy reform, stakeholder analysis, policy mapping techniques, impact analysis through use of secondary data and a review process. The policy analysis triangle, which highlights the importance of examining policy content, context, actors and processes, is used as an analytical framework. The importance of the political, economic, social and cultural context emerged clearly. The interaction of the context with the policy process, both in identification of the need for policy and in its formulation, demonstrated the opportunity for advocates to exert influence at all points of the process. The campaign to support the legislation had the following characteristics: a sustained, consistent, simple health message; sustained political leadership and commitment; and a strong coalition between the Health Alliance, the Office of Tobacco Control and the Department of Health and Children, with cross-party political support and trade union support. Public and media support clearly demonstrated the benefit of deliberate and consistent planning and organization of a communication strategy. The Irish smoke-free legislation was a success as a policy initiative because of timing, dedication, planning, implementation and the existence of strong leadership and a powerful, convinced, credible political champion. © 2010 The Authors, Addiction © 2010 Society for the Study of Addiction.
Development of Integrated Programs for Aerospace-vehicle design (IPAD): Reference design process
NASA Technical Reports Server (NTRS)
Meyer, D. D.
1979-01-01
The airplane design process and its interfaces with manufacturing and customer operations are documented to be used as criteria for the development of integrated programs for the analysis, design, and testing of aerospace vehicles. Topics cover: design process management, general purpose support requirements, design networks, and technical program elements. Design activity sequences are given for both supersonic and subsonic commercial transports, naval hydrofoils, and military aircraft.
Ground Vehicle Condition Based Maintenance
2010-10-04
Diagnostic process map. 32 FMEAs developed: diesel engine, transmission, alternators. Analysis: identify failure modes; derive design factors and... S&T initiatives; TARDEC P&D process map; component testing; ARL CBM research; AMSAA SDC and terrain modeling. CBM+ overview... RCM and CBM are core processes for CBM+ system development (Army Regulation 750-1, 20 Sep 2007, p. 79 - Reliability Centered Maintenance (RCM
[Challenges in geriatric rehabilitation: the development of an integrated care pathway].
Everink, Irma Helga Johanna; van Haastregt, Jolanda C M; Kempen, Gertrudis I J M; Dielis, Leen M J; Maessen, José M C; Schols, Jos M G A
2015-04-01
Coordination and continuity of care within geriatric rehabilitation are challenging. To tackle these challenges, an integrated care pathway within geriatric rehabilitation care (hospital, geriatric rehabilitation and follow-up care in the home situation) has been developed. The aim of this article is to describe the process of developing the integrated care pathway and to present and discuss the result of this process, the integrated care pathway itself. Development of the integrated care pathway was guided by the first four steps of the theoretical framework for implementation of change of Grol and Wensing: (1) development of a specific proposal for change in practice; (2) analysis of current care practice; (3) analysis of the target group and setting; and (4) development and selection of interventions/strategies for change. The organizations involved in geriatric rehabilitation argued that the integrated care pathway should focus on improving the process of care, including transfer of patients, handovers and communication between care organizations. Current practice, barriers and incentives for change were analyzed through literature research, expert consultation, interviews with the involved caregivers, and by establishing working groups of health care professionals, patients and informal caregivers. This resulted in valuable proposals for improvement of the care process, which were gathered and combined in the integrated care pathway. The integrated care pathway entails agreements on (a) the triage process in the hospital; (b) active engagement of patients and informal caregivers in the care process; (c) timely and high-quality handovers; and (d) improved communication between caregivers.
Boiling process modelling peculiarities analysis of the vacuum boiler
NASA Astrophysics Data System (ADS)
Slobodina, E. N.; Mikhailov, A. G.
2017-06-01
An analysis of the development of low- and medium-power boiler equipment was carried out, and possible development directions for boiler units aimed at improving energy efficiency were identified. Engineering studies on the application of vacuum boilers are presented. Heat-exchange processes in vacuum boilers, where boiling water is the working body, are considered. A method of heat-exchange intensification under boiling at the maximum heat-transfer coefficient is examined. As a result of the calculation studies, curves of the variation of the heat-transfer coefficient with pressure, calculated by both analytical and numerical methodologies, were obtained. It is concluded that the numerical computing method implemented through the RPI model in ANSYS CFX can be applied to describe the boiling process in the boiler vacuum volume.
Viewpoints on Medical Image Processing: From Science to Application
Deserno (né Lehmann), Thomas M.; Handels, Heinz; Maier-Hein (né Fritzsche), Klaus H.; Mersmann, Sven; Palm, Christoph; Tolxdorff, Thomas; Wagenknecht, Gudrun; Wittenberg, Thomas
2013-01-01
Medical image processing provides core innovation for medical imaging. This paper is focused on recent developments from science to applications analyzing the past fifteen years of history of the proceedings of the German annual meeting on medical image processing (BVM). Furthermore, some members of the program committee present their personal points of views: (i) multi-modality for imaging and diagnosis, (ii) analysis of diffusion-weighted imaging, (iii) model-based image analysis, (iv) registration of section images, (v) from images to information in digital endoscopy, and (vi) virtual reality and robotics. Medical imaging and medical image computing is seen as field of rapid development with clear trends to integrated applications in diagnostics, treatment planning and treatment. PMID:24078804
High temperature gradient cobalt based clad developed using microwave hybrid heating
NASA Astrophysics Data System (ADS)
Prasad, C. Durga; Joladarashi, Sharnappa; Ramesh, M. R.; Sarkar, Anunoy
2018-04-01
The development of cobalt-based cladding on a titanium substrate using the microwave cladding technique is a benchmark in the coating area. The developed cladding would serve as a corrosion-resistant coating at high temperatures. Clads of 500 µm thickness have been developed by microwave hybrid heating, using a 2.45 GHz microwave furnace at a 900 W power level. The impact of processing time on melting and adhesion of the clad is discussed. The study was also extended to static thermal analysis of simple clad parts using commercial finite element analysis (FEA) software; a comparative study of four variants of the developed clad is explored, with the analysis conducted on a square sample. A similar temperature gradient is also shown for a proposed multi-layer coating, which adds a thermal barrier coating of yttria-stabilized zirconia (YSZ) on top of the corrosion-resistant clad. The YSZ coating would protect the corrosion-resistant cladding and substrate from high temperatures.
Polyglot Programming in Applications Used for Genetic Data Analysis
Nowak, Robert M.
2014-01-01
Applications used for the analysis of genetic data process large volumes of data with complex algorithms. High performance, flexibility, and a user interface with a web browser are required by these solutions, which can be achieved by using multiple programming languages. In this study, I developed a freely available framework for building software to analyze genetic data, which uses C++, Python, JavaScript, and several libraries. This system was used to build a number of genetic data processing applications and it reduced the time and costs of development. PMID:25197633
Tribology symposium -- 1994. PD-Volume 61
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masudi, H.
This year marks the first Tribology Symposium within the Energy-Sources Technology Conference, sponsored by the ASME Petroleum Division. The program was divided into five sessions: Tribology in High Technology, a historical discussion of some watershed events in tribology; Research/Development, design, research and development in modern manufacturing; Tribology in Manufacturing, the impact of tribology on modern manufacturing; Design/Design Representation, aspects of design related to tribological systems; and Failure Analysis, an analysis of failure, failure detection, and failure monitoring as related to manufacturing processes. Eleven papers have been processed separately for inclusion in the database.
Generic trending and analysis system
NASA Technical Reports Server (NTRS)
Keehan, Lori; Reese, Jay
1994-01-01
The Generic Trending and Analysis System (GTAS) is a generic spacecraft performance monitoring tool developed by NASA Code 511 and Loral Aerosys. It is designed to facilitate quick anomaly resolution and trend analysis. Traditionally, the job of off-line analysis has been performed using hardware and software systems developed for real-time spacecraft contacts; then, the systems were supplemented with a collection of tools developed by Flight Operations Team (FOT) members. Since the number of upcoming missions is increasing, NASA can no longer afford to operate in this manner. GTAS improves control center productivity and effectiveness because it provides a generic solution across multiple missions. Thus, GTAS eliminates the need for each individual mission to develop duplicate capabilities. It also allows for more sophisticated tools to be developed because it draws resources from several projects. In addition, the GTAS software system incorporates commercial off-the-shelf (COTS) software packages and reuses components of other NASA-developed systems wherever possible. GTAS has incorporated lessons learned from previous missions by involving the users early in the development process. GTAS users took a proactive role in requirements analysis, design, development, and testing. Because of user involvement, several special tools were designed and are now being developed. GTAS users expressed considerable interest in facilitating data collection for long term trending and analysis. As a result, GTAS provides easy access to large volumes of processed telemetry data directly in the control center. The GTAS archival and retrieval capabilities are supported by the integration of optical disk technology and a COTS relational database management system.
2014-06-01
and Coastal Data Information Program (CDIP). This User's Guide includes step-by-step instructions for accessing the GLOS/GLCFS database via WaveNet...access, processing and analysis tool; part 3 – CDIP database. ERDC/CHL CHETN-xx-14. Vicksburg, MS: U.S. Army Engineer Research and Development Center
ERIC Educational Resources Information Center
Chen, Zhe; Honomichl, Ryan; Kennedy, Diane; Tan, Enda
2016-01-01
The present study examines 5- to 8-year-old children's relation reasoning in solving matrix completion tasks. This study incorporates a componential analysis, an eye-tracking method, and a microgenetic approach, which together allow an investigation of the cognitive processing strategies involved in the development and learning of children's…
Sirichai, S; de Mello, A J
2001-01-01
The separation and detection of both print and film developing agents (CD-3 and CD-4) in photographic processing solutions using chip-based capillary electrophoresis is presented. For simultaneous detection of both analytes under identical experimental conditions a buffer pH of 11.9 is used to partially ionise the analytes. Detection is made possible by indirect fluorescence, where the ions of the analytes displace the anionic fluorescing buffer ion to create negative peaks. Under optimal conditions, both analytes can be analyzed within 30 s. The limits of detection for CD-3 and CD-4 are 0.17 mM and 0.39 mM, respectively. The applicability of the method for the analysis of seasoned photographic processing developer solutions is also examined.
Methods for Maximizing the Learning Process: A Theoretical and Experimental Analysis.
ERIC Educational Resources Information Center
Atkinson, Richard C.
This research deals with optimizing the instructional process. The approach adopted was to limit consideration to simple learning tasks for which adequate mathematical models could be developed. Optimal or suitable suboptimal instructional strategies were developed for the models. The basic idea was to solve for strategies that either maximize the…
The Development and Validation of Scores on the Mathematics Information Processing Scale (MIPS).
ERIC Educational Resources Information Center
Bessant, Kenneth C.
1997-01-01
This study reports on the development and psychometric properties of a new 87-item Mathematics Information Processing Scale that explores learning strategies, metacognitive problem-solving skills, and attentional deployment. Results with 340 college students support the use of the instrument, for which factor analysis identified five theoretically…
ERIC Educational Resources Information Center
London, Manuel; Sessa, Valerie I.
2007-01-01
This article integrates the literature on group interaction process analysis and group learning, providing a framework for understanding how patterns of interaction develop. The model proposes how adaptive, generative, and transformative learning processes evolve and vary in their functionality. Environmental triggers for learning, the group's…
NASA Technical Reports Server (NTRS)
Johnson, Donald R.
1998-01-01
The goal of this research is the continued development and application of global isentropic modeling and analysis capabilities to describe hydrologic processes and energy exchange in the climate system, and discern regional climate change. This work involves a combination of modeling and analysis efforts involving 4DDA datasets and simulations from the University of Wisconsin (UW) hybrid isentropic-sigma (theta-sigma) coordinate model and the GEOS GCM.
Finite element analysis as a design tool for thermoplastic vulcanizate glazing seals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gase, K.M.; Hudacek, L.L.; Pesevski, G.T.
1998-12-31
There are three materials that are commonly used in commercial glazing seals: EPDM, silicone and thermoplastic vulcanizates (TPVs). TPVs are a high performance class of thermoplastic elastomers (TPEs), where TPEs have elastomeric properties with thermoplastic processability. TPVs have emerged as materials well suited for use in glazing seals due to ease of processing, economics and part design flexibility. The part design and development process is critical to ensure that the chosen TPV provides economics, quality and function in demanding environments. In the design and development process, there is great value in utilizing dual durometer systems to capitalize on the benefits of soft and rigid materials. Computer-aided design tools, such as Finite Element Analysis (FEA), are effective in minimizing development time and predicting system performance. Examples of TPV glazing seals will illustrate the benefits of utilizing FEA to take full advantage of the material characteristics, which results in functional performance and quality while reducing development iterations. FEA will be performed on two glazing seal profiles to confirm optimum geometry.
The Development of a Humanitarian Health Ethics Analysis Tool.
Fraser, Veronique; Hunt, Matthew R; de Laat, Sonya; Schwartz, Lisa
2015-08-01
Health care workers (HCWs) who participate in humanitarian aid work experience a range of ethical challenges in providing care and assistance to communities affected by war, disaster, or extreme poverty. Although there is increasing discussion of ethics in humanitarian health care practice and policy, there are very few resources available for humanitarian workers seeking ethical guidance in the field. To address this knowledge gap, a Humanitarian Health Ethics Analysis Tool (HHEAT) was developed and tested as an action-oriented resource to support humanitarian workers in ethical decision making. While ethical analysis tools increasingly have become prevalent in a variety of practice contexts over the past two decades, very few of these tools have undergone a process of empirical validation to assess their usefulness for practitioners. A qualitative study consisting of a series of six case-analysis sessions with 16 humanitarian HCWs was conducted to evaluate and refine the HHEAT. Participant feedback inspired the creation of a simplified and shortened version of the tool and prompted the development of an accompanying handbook. The study generated preliminary insight into the ethical deliberation processes of humanitarian health workers and highlighted different types of ethics support that humanitarian workers might find helpful in supporting the decision-making process.
Developing the national community health assistant strategy in Zambia: a policy analysis.
Zulu, Joseph Mumba; Kinsman, John; Michelo, Charles; Hurtig, Anna-Karin
2013-07-20
In 2010, the Ministry of Health in Zambia developed the National Community Health Assistant strategy, aiming to integrate community health workers (CHWs) into national health plans by creating a new group of workers, called community health assistants (CHAs). The aim of the paper is to analyse the CHA policy development process and the factors that influenced its evolution and content. A policy analysis approach was used to analyse the policy reform process. Data were gathered through review of documents, participant observation and key informant interviews with CHA strategic team members in Lusaka district, and senior officials at the district level in Kapiri Mposhi district where some CHAs have been deployed. The strategy was developed in order to address the human resources for health shortage and the challenges facing the community-based health workforce in Zambia. However, some actors within the strategic team were more influential than others in informing the policy agenda, determining the process, and shaping the content. These actors negotiated with professional/statutory bodies and health unions on the need to develop the new cadre which resulted in compromises that enabled the policy process to move forward. International agencies also indirectly influenced the course as well as the content of the strategy. Some actors classified the process as both insufficiently consultative and rushed. Due to limited consultation, it was suggested that the policy content did not adequately address key policy content issues such as management of staff attrition, general professional development, and progression matters. Analysis of the process also showed that the strategy might create a new group of workers whose mandate is unclear to the existing group of health workers. This paper highlights the complex nature of policy-making processes for integrating CHWs into the health system. It reiterates the need for recognising the fact that actors' power or position in the political hierarchy may, more than their knowledge and understanding of the issue, play a disproportionate role in shaping the process as well as content of health policy reform.
Rosati, Nicoletta
2002-04-01
Project selection and portfolio management are particularly challenging in the pharmaceutical industry due to the high-risk, high-stakes nature of the drug development process. In recent years, scholars and industry experts have agreed that traditional net-present-value evaluation of projects fails to capture the value of managerial flexibility, and have encouraged adopting a real options approach to recover the missed value. In this paper, we take a closer look at the drug development process and at the indices currently used to rank projects. We discuss the economic value of information and of real options arising in drug development, and present decision analysis as an ideal framework for the implementation of real options valuation.
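The gap between a committed net-present-value figure and a real-options valuation can be shown with a two-stage decision-tree sketch; all probabilities and cash flows below are hypothetical, and discounting is omitted for brevity.

```python
# Hypothetical two-stage decision: pay for a trial, observe the result,
# then choose to launch or abandon. All figures are invented ($M).
trial_cost = 50.0
p_success = 0.4
launch_cost = 200.0
payoff_success = 900.0   # revenues given a strong trial result
payoff_failure = 100.0   # revenues given a weak trial result

# Naive NPV: commit now to launching regardless of the trial outcome.
npv_committed = -trial_cost - launch_cost + (
    p_success * payoff_success + (1 - p_success) * payoff_failure)

# Real-options view: launch only when it pays (managerial flexibility).
value_success = max(payoff_success - launch_cost, 0.0)
value_failure = max(payoff_failure - launch_cost, 0.0)  # abandon -> 0
npv_flexible = -trial_cost + (
    p_success * value_success + (1 - p_success) * value_failure)

print(f"committed NPV:    {npv_committed:6.1f} $M")   # 170.0
print(f"with flexibility: {npv_flexible:6.1f} $M")    # 230.0
print(f"option value:     {npv_flexible - npv_committed:6.1f} $M")
```

The difference between the two figures is exactly the value of the abandonment option that a plain NPV ranking ignores.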
The Systems Engineering Process for Human Support Technology Development
NASA Technical Reports Server (NTRS)
Jones, Harry
2005-01-01
Systems engineering is designing and optimizing systems. This paper reviews the systems engineering process and indicates how it can be applied in the development of advanced human support systems. Systems engineering develops the performance requirements, subsystem specifications, and detailed designs needed to construct a desired system. Systems design is difficult, requiring both art and science and balancing human and technical considerations. The essential systems engineering activity is trading off and compromising between competing objectives such as performance and cost, schedule and risk. Systems engineering is not a complete independent process. It usually supports a system development project. This review emphasizes the NASA project management process as described in NASA Procedural Requirement (NPR) 7120.5B. The process is a top down phased approach that includes the most fundamental activities of systems engineering - requirements definition, systems analysis, and design. NPR 7120.5B also requires projects to perform the engineering analyses needed to ensure that the system will operate correctly with regard to reliability, safety, risk, cost, and human factors. We review the system development project process, the standard systems engineering design methodology, and some of the specialized systems analysis techniques. We will discuss how they could apply to advanced human support systems development. The purpose of advanced systems development is not directly to supply human space flight hardware, but rather to provide superior candidate systems that will be selected for implementation by future missions. The most direct application of systems engineering is in guiding the development of prototype and flight experiment hardware. However, anticipatory systems engineering of possible future flight systems would be useful in identifying the most promising development projects.
Data processing has major impact on the outcome of quantitative label-free LC-MS analysis.
Chawade, Aakash; Sandin, Marianne; Teleman, Johan; Malmström, Johan; Levander, Fredrik
2015-02-06
High-throughput multiplexed protein quantification using mass spectrometry is steadily increasing in popularity, with the two major techniques being data-dependent acquisition (DDA) and targeted acquisition using selected reaction monitoring (SRM). However, both techniques involve extensive data processing, which can be performed by a multitude of different software solutions. Analysis of quantitative LC-MS/MS data is mainly performed in three major steps: processing of raw data, normalization, and statistical analysis. To evaluate the impact of data processing steps, we developed two new benchmark data sets, one each for DDA and SRM, with samples consisting of a long-range dilution series of synthetic peptides spiked in a total cell protein digest. The generated data were processed by eight different software workflows and three postprocessing steps. The results show that the choice of the raw data processing software and the postprocessing steps play an important role in the final outcome. Also, the linear dynamic range of the DDA data could be extended by an order of magnitude through feature alignment and a charge state merging algorithm proposed here. Furthermore, the benchmark data sets are made publicly available for further benchmarking and software developments.
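As a concrete example of the kind of postprocessing step whose choice the benchmark probes, the sketch below applies median normalization to a synthetic quantification matrix; the data and the method are generic illustrations, not one of the paper's eight evaluated workflows.

```python
import numpy as np

# Synthetic quantification matrix: rows = peptides, columns = LC-MS runs.
rng = np.random.default_rng(0)
true = rng.lognormal(mean=10.0, sigma=1.0, size=(200, 4))
run_bias = np.array([1.0, 1.3, 0.8, 1.1])   # per-run systematic bias
observed = true * run_bias

# Median normalization: rescale each run so its median intensity
# matches the global median, a common label-free postprocessing step.
scale = np.median(observed) / np.median(observed, axis=0)
normalized = observed * scale

print("run medians before:", np.round(np.median(observed, axis=0), 1))
print("run medians after: ", np.round(np.median(normalized, axis=0), 1))
```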
NASA Astrophysics Data System (ADS)
Cannata, Massimiliano; Neumann, Jakob; Cardoso, Mirko; Rossetto, Rudy; Foglia, Laura; Borsi, Iacopo
2017-04-01
In situ time-series are an important aspect of environmental modelling, especially with the advancement of numerical simulation techniques and increased model complexity. In order to make use of the increasing data available through the requirements of the EU Water Framework Directive, the FREEWAT GIS environment incorporates the newly developed Observation Analysis Tool (OAT) for time-series analysis. The tool is used to import time-series data into QGIS from local CSV files, online sensors using the istSOS service, or MODFLOW model result files, and enables visualisation, pre-processing of data for model development, and post-processing of model results. OAT can be used as a pre-processor for calibration observations, integrating the creation of observations for calibration directly from sensor time-series. The tool consists of an expandable Python library of processing methods and an interface integrated in the QGIS FREEWAT plug-in, which includes a large number of modelling capabilities, data management tools and calibration capacity.
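The flavour of the pre-processing OAT supports can be sketched with pandas; the file and column names below are hypothetical, and the snippet illustrates the general pattern rather than the OAT API itself.

```python
import pandas as pd

# Hypothetical sensor export: a CSV with 'timestamp' and 'level_m' columns.
ts = pd.read_csv("well_level.csv", parse_dates=["timestamp"],
                 index_col="timestamp")["level_m"]

# Regularize to an hourly step, interpolate short gaps (up to 6 h),
# then resample to daily means suitable as calibration observations.
hourly = ts.resample("1h").mean().interpolate(limit=6)
daily = hourly.resample("1D").mean()

daily.to_csv("well_level_daily.csv")  # e.g. observations for MODFLOW calibration
```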
BioconductorBuntu: a Linux distribution that implements a web-based DNA microarray analysis server.
Geeleher, Paul; Morris, Dermot; Hinde, John P; Golden, Aaron
2009-06-01
BioconductorBuntu is a custom distribution of Ubuntu Linux that automatically installs a server-side microarray processing environment, providing a user-friendly web-based GUI to many of the tools developed by the Bioconductor Project, accessible locally or across a network. System installation is via booting off a CD image or by using a Debian package provided to upgrade an existing Ubuntu installation. In its current version, several microarray analysis pipelines are supported including oligonucleotide, dual- or single-dye experiments, including post-processing with Gene Set Enrichment Analysis. BioconductorBuntu is designed to be extensible, by server-side integration of further relevant Bioconductor modules as required, facilitated by its straightforward underlying Python-based infrastructure. BioconductorBuntu offers an ideal environment for the development of processing procedures to facilitate the analysis of next-generation sequencing datasets. BioconductorBuntu is available for download under a creative commons license along with additional documentation and a tutorial from (http://bioinf.nuigalway.ie).
Managing Analysis Models in the Design Process
NASA Technical Reports Server (NTRS)
Briggs, Clark
2006-01-01
Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.
Efficient sensitivity analysis and optimization of a helicopter rotor
NASA Technical Reports Server (NTRS)
Lim, Joon W.; Chopra, Inderjit
1989-01-01
Aeroelastic optimization of a system essentially consists of the determination of the optimum values of design variables which minimize the objective function and satisfy certain aeroelastic and geometric constraints. The process of aeroelastic optimization analysis is illustrated. To carry out aeroelastic optimization effectively, one needs a reliable analysis procedure to determine the steady response and stability of a rotor system in forward flight. The rotor dynamic analysis used in the present study, developed in-house at the University of Maryland, is based on finite elements in space and time. The analysis consists of two major phases: vehicle trim and rotor steady response (coupled trim analysis), and aeroelastic stability of the blade. For a reduction of helicopter vibration, the optimization process requires the sensitivity derivatives of the objective function and the aeroelastic stability constraints. For this, the derivatives of steady response, hub loads and blade stability roots are calculated using a direct analytical approach. An automated optimization procedure is developed by coupling the rotor dynamic analysis, design sensitivity analysis and the constrained optimization code CONMIN.
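The optimization pattern described, a vibration objective minimized subject to stability constraints with analytic derivatives, can be sketched with SciPy's SLSQP standing in for CONMIN; the two analysis functions below are toy surrogates, not the finite-element rotor analysis.

```python
import numpy as np
from scipy.optimize import minimize

def hub_vibration(x):       # objective: a stand-in vibratory hub load metric
    return (x[0] - 0.3) ** 2 + 2.0 * (x[1] - 0.7) ** 2 + 0.1

def stability_margin(x):    # constraint: blade damping must stay positive
    return 0.05 + 0.2 * x[0] - 0.1 * x[1]

x0 = np.array([0.5, 0.5])   # design variables, e.g. nondimensional stiffnesses
res = minimize(
    hub_vibration, x0, method="SLSQP",
    # Analytic gradient, mirroring the direct analytical sensitivities.
    jac=lambda x: np.array([2 * (x[0] - 0.3), 4.0 * (x[1] - 0.7)]),
    constraints=[{"type": "ineq", "fun": stability_margin}],
    bounds=[(0.1, 1.0), (0.1, 1.0)],
)
print(res.x, res.fun)
```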
Systems Engineering in NASA's R&TD Programs
NASA Technical Reports Server (NTRS)
Jones, Harry
2005-01-01
Systems engineering is largely the analysis and planning that support the design, development, and operation of systems. The most common application of systems engineering is in guiding systems development projects that use a phased process of requirements, specifications, design, and development. This paper investigates how systems engineering techniques should be applied in research and technology development programs for advanced space systems. These programs should include anticipatory engineering of future space flight systems and a project portfolio selection process, as well as systems engineering for multiple development projects.
Functional Fault Model Development Process to Support Design Analysis and Operational Assessment
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.; Maul, William A.; Hemminger, Joseph A.
2016-01-01
A functional fault model (FFM) is an abstract representation of the failure space of a given system. As such, it simulates the propagation of failure effects along paths between the origin of the system failure modes and points within the system capable of observing the failure effects. As a result, FFMs may be used to diagnose the presence of failures in the modeled system. FFMs necessarily contain a significant amount of information about the design, operations, and failure modes and effects. One of the important benefits of FFMs is that they may be qualitative, rather than quantitative and, as a result, may be implemented early in the design process when there is more potential to positively impact the system design. FFMs may therefore be developed and matured throughout the monitored system's design process and may subsequently be used to provide real-time diagnostic assessments that support system operations. This paper provides an overview of a generalized NASA process that is being used to develop and apply FFMs. FFM technology has been evolving for more than 25 years. The FFM development process presented in this paper was refined during NASA's Ares I, Space Launch System, and Ground Systems Development and Operations programs (i.e., from about 2007 to the present). Process refinement took place as new modeling, analysis, and verification tools were created to enhance FFM capabilities. In this paper, standard elements of a model development process (i.e., knowledge acquisition, conceptual design, implementation & verification, and application) are described within the context of FFMs. Further, newer tools and analytical capabilities that may benefit the broader systems engineering process are identified and briefly described. The discussion is intended as a high-level guide for future FFM modelers.
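The core mechanism, propagating failure effects along directed paths until they reach observation points, reduces to graph traversal; a minimal sketch with an invented propagation graph follows.

```python
from collections import deque

# Toy functional fault model: edges are failure-effect propagation paths.
edges = {
    "valve_stuck":     ["low_flow"],
    "low_flow":        ["pump_cavitation", "flow_sensor_F1"],
    "pump_cavitation": ["vibration_sensor_V2"],
    "seal_leak":       ["low_pressure"],
    "low_pressure":    ["pressure_sensor_P3"],
}
monitors = {"flow_sensor_F1", "vibration_sensor_V2", "pressure_sensor_P3"}

def observable_effects(failure_mode):
    """Breadth-first propagation from a failure mode to every monitor
    capable of observing its effects."""
    seen, queue, hits = {failure_mode}, deque([failure_mode]), set()
    while queue:
        node = queue.popleft()
        if node in monitors:
            hits.add(node)
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return hits

print(observable_effects("valve_stuck"))
# -> {'flow_sensor_F1', 'vibration_sensor_V2'}
```

Inverting this map, from observed monitors back to the set of consistent failure modes, is what turns the same model into a diagnostic tool.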
Prescott, Jeffrey William
2013-02-01
The importance of medical imaging for clinical decision making has been steadily increasing over the last four decades. Recently, there has also been an emphasis on medical imaging for preclinical decision making, i.e., for use in pharmaceutical and medical device development. There is also a drive towards quantification of imaging findings by using quantitative imaging biomarkers, which can improve sensitivity, specificity, accuracy and reproducibility of imaged characteristics used for diagnostic and therapeutic decisions. An important component of the discovery, characterization, validation and application of quantitative imaging biomarkers is the extraction of information and meaning from images through image processing and subsequent analysis. However, many advanced image processing and analysis methods are not applied directly to questions of clinical interest, i.e., for diagnostic and therapeutic decision making, which is a consideration that should be closely linked to the development of such algorithms. This article is meant to address these concerns. First, quantitative imaging biomarkers are introduced by providing definitions and concepts. Then, potential applications of advanced image processing and analysis to areas of quantitative imaging biomarker research are described; specifically, research into osteoarthritis (OA), Alzheimer's disease (AD) and cancer is presented. Then, challenges in quantitative imaging biomarker research are discussed. Finally, a conceptual framework for integrating clinical and preclinical considerations into the development of quantitative imaging biomarkers and their computer-assisted methods of extraction is presented.
Demner-Fushman, D; Elhadad, N
2016-11-10
This paper reviews work over the past two years in Natural Language Processing (NLP) applied to clinical and consumer-generated texts. We included any application or methodological publication that leverages text to facilitate healthcare and address the health-related needs of consumers and populations. Many important developments in clinical text processing, both foundational and task-oriented, were addressed in community-wide evaluations and discussed in corresponding special issues that are referenced in this review. These focused issues and in-depth reviews of several other active research areas, such as pharmacovigilance and summarization, allowed us to discuss in greater depth disease modeling and predictive analytics using clinical texts, and text analysis in social media for healthcare quality assessment, trends towards online interventions based on rapid analysis of health-related posts, and consumer health question answering, among other issues. Our analysis shows that although clinical NLP continues to advance towards practical applications and more NLP methods are used in large-scale live health information applications, more needs to be done to make NLP use in clinical applications a routine widespread reality. Progress in clinical NLP is mirrored by developments in social media text analysis: the research is moving from capturing trends to addressing individual health-related posts, thus showing potential to become a tool for precision medicine and a valuable addition to the standard healthcare quality evaluation tools.
DOT National Transportation Integrated Search
2007-09-01
Traditional state procurement processes are not well-suited to the procurement of Intelligent Transportation Systems (ITS). The objective of this study was to analyze Kentucky's existing procurement processes, identify strengths and weaknesses of e...
Experimental analysis of armouring process
NASA Astrophysics Data System (ADS)
Lamberti, Alberto; Paris, Ennio
Preliminary results from an experimental investigation on armouring processes are presented. In particular, the process of development and formation of the armour layer under different steady flow conditions has been analyzed in terms of grain size variations and the sediment transport rate associated with each size fraction.
Conducting On-orbit Gene Expression Analysis on ISS: WetLab-2
NASA Technical Reports Server (NTRS)
Parra, Macarena; Almeida, Eduardo; Boone, Travis; Jung, Jimmy; Lera, Matthew P.; Ricco, Antonio; Souza, Kenneth; Wu, Diana; Richey, C. Scott
2013-01-01
WetLab-2 will enable expanded genomic research on orbit by developing tools that support in situ sample collection, processing, and analysis on ISS. This capability will reduce the time-to-results for investigators and define new pathways for discovery on the ISS National Lab. The primary objective is to develop a research platform on ISS that will facilitate real-time quantitative gene expression analysis of biological samples collected on orbit. WetLab-2 will be capable of processing multiple sample types ranging from microbial cultures to animal tissues dissected on orbit. WetLab-2 will significantly expand the analytical capabilities onboard ISS and enhance science return from ISS.
Cost analysis of advanced turbine blade manufacturing processes
NASA Technical Reports Server (NTRS)
Barth, C. F.; Blake, D. E.; Stelson, T. S.
1977-01-01
A rigorous analysis was conducted to estimate relative manufacturing costs for high technology gas turbine blades prepared by three candidate materials process systems. The manufacturing costs for the same turbine blade configuration of directionally solidified eutectic alloy, an oxide dispersion strengthened superalloy, and a fiber reinforced superalloy were compared on a relative basis to the costs of the same blade currently in production utilizing the directional solidification process. An analytical process cost model was developed to quantitatively perform the cost comparisons. The impact of individual process yield factors on costs was also assessed as well as effects of process parameters, raw materials, labor rates and consumable items.
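The influence of per-step yield factors on relative cost can be illustrated with a toy serial-process model; the step costs and yields below are invented, not values from the study.

```python
# Toy relative cost model: each step adds cost to every part reaching it,
# and a step yield scraps a fraction of parts along with their value added.
steps = [                      # (name, cost added per part, yield)
    ("casting",    1.00, 0.85),
    ("machining",  0.60, 0.95),
    ("coating",    0.40, 0.90),
    ("inspection", 0.10, 0.98),
]

def cost_per_good_blade(steps):
    reach, expected_cost = 1.0, 0.0
    for _, step_cost, y in steps:
        expected_cost += step_cost * reach  # spent on parts entering the step
        reach *= y                          # fraction surviving the step
    return expected_cost / reach            # amortized over good blades

print(f"relative cost per good blade: {cost_per_good_blade(steps):.3f}")

# Yield sensitivity: improve one step's yield by 5 points at a time.
for i, (name, c, y) in enumerate(steps):
    trial = list(steps)
    trial[i] = (name, c, min(1.0, y + 0.05))
    print(f"{name:10s} +5pt yield -> {cost_per_good_blade(trial):.3f}")
```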
NASA Technical Reports Server (NTRS)
Shortle, John F.; Allocco, Michael
2005-01-01
This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and / or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.
Image analysis of multiple moving wood pieces in real time
NASA Astrophysics Data System (ADS)
Wang, Weixing
2006-02-01
This paper presents algorithms for image processing and image analysis of wood piece materials. The algorithms were designed for automatic detection of wood pieces on a moving conveyor belt or a truck. When the wood objects are moving, the hard task is to trace their contours in an optimal way. To make the algorithms work efficiently in the plant, a flexible online system was designed and developed, which mainly consists of image acquisition, image processing, object delineation and analysis. A number of newly developed algorithms can delineate wood objects with high accuracy and high speed, and in the wood piece analysis part, each wood piece can be characterized by a number of visual parameters which can also be used for constructing experimental models directly in the system.
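A minimal sketch of the delineation-and-measurement stage, with OpenCV standing in for the newly developed algorithms; the file name and the debris cutoff are assumptions.

```python
import cv2

# Hypothetical grayscale frame grabbed from the conveyor-belt camera.
frame = cv2.imread("belt_frame.png", cv2.IMREAD_GRAYSCALE)

# Separate wood pieces from the darker belt, then trace outer contours.
_, mask = cv2.threshold(frame, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

# Characterize each piece by a few visual parameters.
for c in contours:
    area = cv2.contourArea(c)
    if area < 500:                       # drop small debris (assumed cutoff)
        continue
    (cx, cy), (w, h), angle = cv2.minAreaRect(c)
    print(f"piece at ({cx:.0f},{cy:.0f}): area={area:.0f} px, "
          f"bbox={w:.0f}x{h:.0f} px, angle={angle:.1f} deg")
```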
Techniques of EMG signal analysis: detection, processing, classification and applications
Hussain, M.S.; Mohd-Yasin, F.
2006-01-01
Electromyography (EMG) signals can be used for clinical/biomedical applications, Evolvable Hardware Chip (EHW) development, and modern human-computer interaction. EMG signals acquired from muscles require advanced methods for detection, decomposition, processing, and classification. The purpose of this paper is to illustrate the various methodologies and algorithms for EMG signal analysis to provide efficient and effective ways of understanding the signal and its nature. We further point out some of the hardware implementations using EMG, focusing on applications related to prosthetic hand control, grasp recognition, and human-computer interaction. A comparison study is also given to show the performance of various EMG signal analysis methods. This paper provides researchers a good understanding of EMG signals and their analysis procedures. This knowledge will help them develop more powerful, flexible, and efficient applications. PMID:16799694
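A conventional surface-EMG detection pipeline of the kind surveyed, bandpass filtering followed by an RMS envelope and threshold detection, can be sketched as follows; the signal is synthetic and the parameters are common textbook choices, not values from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0                                         # sampling rate [Hz]
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(1)
emg = 0.05 * rng.standard_normal(t.size)            # baseline noise
emg[2000:3000] += 0.4 * rng.standard_normal(1000)   # synthetic muscle burst

# 20-450 Hz bandpass, the usual surface-EMG band.
b, a = butter(4, [20 / (fs / 2), 450 / (fs / 2)], btype="band")
filtered = filtfilt(b, a, emg)

# Moving RMS envelope (100 ms window), then threshold-based detection.
win = int(0.1 * fs)
rms = np.sqrt(np.convolve(filtered ** 2, np.ones(win) / win, mode="same"))
active = rms > 3 * np.median(rms)
print(f"muscle detected as active in {active.mean() * 100:.1f}% of samples")
```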
NASA Technical Reports Server (NTRS)
Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.
1982-01-01
Documentation of the preliminary software developed as a framework for a generalized integrated robotic system simulation is presented. The program structure is composed of three major functions controlled by a program executive. The three major functions are: system definition, analysis tools, and post processing. The system definition function handles user input of system parameters and definition of the manipulator configuration. The analysis tools function handles the computational requirements of the program. The post processing function allows for more detailed study of the results of analysis tool function executions. Also documented is the manipulator joint model software to be used as the basis of the manipulator simulation which will be part of the analysis tools capability.
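The described structure, a program executive controlling three major functions over shared state, might be skeletonized as below; every class body is a placeholder for illustration, not the documented software.

```python
# Skeleton of the executive/three-function structure described above.
class SystemDefinition:
    def run(self, state):
        state["manipulator"] = {"joints": 6}   # user-supplied configuration
        return state

class AnalysisTools:
    def run(self, state):
        n = state["manipulator"]["joints"]
        state["results"] = {"joint_torques": [0.0] * n}  # placeholder analysis
        return state

class PostProcessing:
    def run(self, state):
        print("torques:", state["results"]["joint_torques"])
        return state

class Executive:
    """Controls the three major functions in sequence."""
    def __init__(self):
        self.functions = [SystemDefinition(), AnalysisTools(), PostProcessing()]

    def run(self):
        state = {}
        for fn in self.functions:
            state = fn.run(state)

Executive().run()
```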
Expert system for web based collaborative CAE
NASA Astrophysics Data System (ADS)
Hou, Liang; Lin, Zusheng
2006-11-01
An expert system for web-based collaborative CAE was developed based on knowledge engineering, a relational database and commercial FEA (finite element analysis) software. The architecture of the system is illustrated. In this system, the experts' experiences, theories, typical examples and other related knowledge to be used in the pre-processing stage of FEA were categorized into analysis-process knowledge and object knowledge. An integrated knowledge model based on object-oriented and rule-based methods is then described, followed by the integrated reasoning process based on CBR (case-based reasoning) and rule-based reasoning. Finally, the analysis process of this expert system in a web-based CAE application is illustrated with the analysis of a machine tool's column, demonstrating the validity of the system.
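The hybrid retrieval step, case-based reasoning checked by rules, can be illustrated with a toy example; the cases, feature weights and the single rule are all invented for illustration.

```python
# Toy case base of FEA pre-processing knowledge.
cases = [
    {"part": "column",  "material": "cast iron", "load": "static",
     "advice": "coarse hex mesh, fixed base, apply gravity load"},
    {"part": "spindle", "material": "steel",     "load": "dynamic",
     "advice": "fine mesh near bearings, run modal analysis first"},
]
weights = {"part": 0.5, "material": 0.2, "load": 0.3}  # assumed feature weights

def similarity(query, case):
    return sum(w for f, w in weights.items() if query[f] == case[f])

def retrieve(query):
    best = max(cases, key=lambda c: similarity(query, c))
    # Rule-based reasoning layered on top of case retrieval.
    if query["load"] == "dynamic" and "modal" not in best["advice"]:
        return best["advice"] + "; rule: add a modal analysis step"
    return best["advice"]

print(retrieve({"part": "column", "material": "steel", "load": "static"}))
```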
A Multidisciplinary Approach to Mixer-Ejector Analysis and Design
NASA Technical Reports Server (NTRS)
Hendricks, Eric, S.; Seidel, Jonathan, A.
2012-01-01
The design of an engine for a civil supersonic aircraft presents a difficult multidisciplinary problem to propulsion system engineers. There are numerous competing requirements for the engine, such as being efficient during cruise yet quiet enough at takeoff to meet airport noise regulations. The use of mixer-ejector nozzles presents one possible solution to this challenge. However, designing a mixer-ejector which will successfully address both of these concerns is a difficult proposition. Presented in this paper is an integrated multidisciplinary approach to the analysis and design of these systems. A process that uses several low-fidelity tools to evaluate both the performance and acoustics of mixer-ejector nozzles is described. This process is further expanded to include system-level modeling of engines and aircraft to determine the effects on mission performance and noise near airports. The overall process is developed in the OpenMDAO framework currently being developed by NASA. From the developed process, sample results are given for a notional mixer-ejector design, thereby demonstrating the capabilities of the method.
Lessons learned from trend analysis of Shuttle Payload Processing problem reports
NASA Technical Reports Server (NTRS)
Heuser, Robert E.; Pepper, Richard E., Jr.; Smith, Anthony M.
1989-01-01
In the wake of the Challenger accident, NASA has placed an increasing emphasis on trend analysis techniques. These analyses provide meaningful insights into system and hardware status, and also develop additional lessons learned from historical data to aid in the design and operation of future space systems. This paper presents selected results from such a trend analysis study that was conducted on the problem report data files for the Shuttle Payload Processing activities. Specifically, the results shown are for the payload canister system which interfaces with and transfers payloads from their processing facilities to the orbiter.
A software tool to analyze clinical workflows from direct observations.
Schweitzer, Marco; Lasierra, Nelia; Hoerbst, Alexander
2015-01-01
Observational data of clinical processes need to be managed in a convenient way, so that process information is reliable, valid and viable for further analysis. However, existing tools for allocating observations fail in systematic data collection of specific workflow recordings. We present a software tool which was developed to facilitate the analysis of clinical process observations. The tool was successfully used in the project OntoHealth, to build, store and analyze observations of diabetes routine consultations.
Automated Simulation For Analysis And Design
NASA Technical Reports Server (NTRS)
Cantwell, E.; Shenk, Tim; Robinson, Peter; Upadhye, R.
1992-01-01
Design Assistant Workstation (DAWN) software being developed to facilitate simulation of qualitative and quantitative aspects of behavior of life-support system in spacecraft, chemical-processing plant, heating and cooling system of large building, or any of variety of systems including interacting process streams and processes. Used to analyze alternative design scenarios or specific designs of such systems. Expert system will automate part of design analysis: reason independently by simulating design scenarios and return to designer with overall evaluations and recommendations.
A Study of Novice Systems Analysis Problem Solving Behaviors Using Protocol Analysis
1992-09-01
conducted. Each subject was given the same task to perform. The task involved a case study (Appendix B) of a utility company’s customer order processing system...behavior (Ramesh, 1989). The task was to design a customer order processing system that utilized a centralized telephone answering service center...of the utility company’s customer order processing system that was developed based on information obtained by a large systems consulting firm during
Wind Tunnel Simulations of the Mock Urban Setting Test - Experimental Procedures and Data Analysis
2004-07-01
depends on the subjective choice of points to include in the constant stress region. This is demonstrated by the marked difference in the slope for the two...designed explicitly for the analysis of time series and signal processing, particularly for atmospheric dispersion experiments. The scripts developed...below. Processing scripts are available for all these analyses in the /scripts directory. All files of figures and processed data resulting from these
NASA Technical Reports Server (NTRS)
Johnson, Donald R.
2001-01-01
This research was directed to the development and application of global isentropic modeling and analysis capabilities to describe hydrologic processes and energy exchange in the climate system, and discern regional climate change. An additional objective was to investigate the accuracy and theoretical limits of global climate predictability which are imposed by the inherent limitations of simulating trace constituent transport and the hydrologic processes of condensation, precipitation and cloud life cycles.
Neural classifier in the estimation process of maturity of selected varieties of apples
NASA Astrophysics Data System (ADS)
Boniecki, P.; Piekarska-Boniecka, H.; Koszela, K.; Zaborowicz, M.; Przybył, K.; Wojcieszak, D.; Zbytek, Z.; Ludwiczak, A.; Przybylak, A.; Lewicki, A.
2015-07-01
This paper seeks to present methods of neural image analysis aimed at estimating the maturity state of selected varieties of apples which are popular in Poland. An identification of the degree of maturity of selected varieties of apples has been conducted on the basis of information encoded in graphical form, presented in the digital photos. The above process involves the application of the BBCH scale, used to determine the maturity of apples. The aforementioned scale is widely used in the EU and has been developed for many species of monocotyledonous and dicotyledonous plants. It is also worth noting that the given scale enables detailed determination of the development stage of a given plant. The purpose of this work is to identify the maturity level of selected varieties of apples, supported by the use of image analysis methods and classification techniques represented by artificial neural networks. The analysis of representative graphical features based on image analysis methods enabled the assessment of the maturity of apples. For practical use, the "JabVis 1.1" neural IT system was created in accordance with software engineering requirements, to support the decision-making processes occurring in the production and processing of apples.
Real-time feedback control of twin-screw wet granulation based on image analysis.
Madarász, Lajos; Nagy, Zsombor Kristóf; Hoffer, István; Szabó, Barnabás; Csontos, István; Pataki, Hajnalka; Démuth, Balázs; Szabó, Bence; Csorba, Kristóf; Marosi, György
2018-06-04
The present paper reports the first dynamic image analysis-based feedback control of continuous twin-screw wet granulation process. Granulation of the blend of lactose and starch was selected as a model process. The size and size distribution of the obtained particles were successfully monitored by a process camera coupled with an image analysis software developed by the authors. The validation of the developed system showed that the particle size analysis tool can determine the size of the granules with an error of less than 5 µm. The next step was to implement real-time feedback control of the process by controlling the liquid feeding rate of the pump through a PC, based on the real-time determined particle size results. After the establishment of the feedback control, the system could correct different real-life disturbances, creating a Process Analytically Controlled Technology (PACT), which guarantees the real-time monitoring and controlling of the quality of the granules. In the event of changes or bad tendencies in the particle size, the system can automatically compensate the effect of disturbances, ensuring proper product quality. This kind of quality assurance approach is especially important in the case of continuous pharmaceutical technologies. Copyright © 2018 Elsevier B.V. All rights reserved.
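The control loop described can be illustrated schematically. The sketch below assumes a camera-derived median granule size is available each cycle; the gain, setpoint, and pump interface are hypothetical, not the authors' PACT implementation.

```python
# Sketch of a proportional feedback step for twin-screw wet granulation:
# granule size measured by image analysis corrects the liquid feed rate.
def control_step(measured_d50_um, setpoint_um=400.0,
                 feed_rate_ml_min=10.0, kp=0.01):
    """Proportional correction of liquid feed rate from granule size."""
    error = setpoint_um - measured_d50_um   # positive -> granules too small
    # More granulation liquid tends to grow granules, so raise the feed
    # rate on positive error and lower it on negative error.
    new_rate = feed_rate_ml_min + kp * error
    return max(0.0, new_rate)               # the pump cannot run backwards
```

A real controller would also filter the size signal and bound the rate of change, but the proportional core is enough to show how image-derived particle size closes the loop on the pump.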
ERIC Educational Resources Information Center
Cathcart, Stephen Michael
2016-01-01
This mixed method study examines HRD professionals' decision-making processes when making an organizational purchase of training. The study uses a case approach with a degrees of freedom analysis. The analysis examines how HRD professionals in manufacturing select outside vendors' human resource development programs for training,…
The development of participatory health research among incarcerated women in a Canadian prison
Murphy, K.; Hanson, D.; Hemingway, C.; Ramsden, V.; Buxton, J.; Granger-Brown, A.; Condello, L-L.; Buchanan, M.; Espinoza-Magana, N.; Edworthy, G.; Hislop, T. G.
2009-01-01
This paper describes the development of a unique prison participatory research project, in which incarcerated women formed a research team, the research activities and the lessons learned. The participatory action research project was conducted in the main short sentence minimum/medium security women's prison located in a Western Canadian province. An ethnographic multi-method approach was used for data collection and analysis. Quantitative data was collected by surveys and analysed using descriptive statistics. Qualitative data was collected from orientation package entries, audio recordings, and written archives of research team discussions, forums and debriefings, and presentations. These data and ethnographic observations were transcribed and analysed using iterative and interpretative qualitative methods and NVivo 7 software. Up to 15 women worked each day as prison research team members; a total of 190 women participated at some time in the project between November 2005 and August 2007. Incarcerated women peer researchers developed the research processes including opportunities for them to develop leadership and technical skills. Through these processes, including data collection and analysis, nine health goals emerged. Lessons learned from the research processes were confirmed by the common themes that emerged from thematic analysis of the research activity data. Incarceration provides a unique opportunity for engagement of women as expert partners alongside academic researchers and primary care workers in participatory research processes to improve their health. PMID:25759141
Automated processing pipeline for neonatal diffusion MRI in the developing Human Connectome Project.
Bastiani, Matteo; Andersson, Jesper L R; Cordero-Grande, Lucilio; Murgasova, Maria; Hutter, Jana; Price, Anthony N; Makropoulos, Antonios; Fitzgibbon, Sean P; Hughes, Emer; Rueckert, Daniel; Victor, Suresh; Rutherford, Mary; Edwards, A David; Smith, Stephen M; Tournier, Jacques-Donald; Hajnal, Joseph V; Jbabdi, Saad; Sotiropoulos, Stamatios N
2018-05-28
The developing Human Connectome Project is set to create and make available to the scientific community a 4-dimensional map of functional and structural cerebral connectivity from 20 to 44 weeks post-menstrual age, to allow exploration of the genetic and environmental influences on brain development, and the relation between connectivity and neurocognitive function. A large set of multi-modal MRI data from fetuses and newborn infants is currently being acquired, along with genetic, clinical and developmental information. In this overview, we describe the neonatal diffusion MRI (dMRI) image processing pipeline and the structural connectivity aspect of the project. Neonatal dMRI data poses specific challenges, and standard analysis techniques used for adult data are not directly applicable. We have developed a processing pipeline that deals directly with neonatal-specific issues, such as severe motion and motion-related artefacts, small brain sizes, high brain water content and reduced anisotropy. This pipeline allows automated analysis of in-vivo dMRI data, probes tissue microstructure, reconstructs a number of major white matter tracts, and includes an automated quality control framework that identifies processing issues or inconsistencies. We here describe the pipeline and present an exemplar analysis of data from 140 infants imaged at 38-44 weeks post-menstrual age. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
Global Sensitivity Analysis for Process Identification under Model Uncertainty
NASA Astrophysics Data System (ADS)
Ye, M.; Dai, H.; Walker, A. P.; Shi, L.; Yang, J.
2015-12-01
The environmental system consists of various physical, chemical, and biological processes, and environmental models are always built to simulate these processes and their interactions. For model building, improvement, and validation, it is necessary to identify important processes so that limited resources can be used to better characterize them. While global sensitivity analysis has been widely used to identify important processes, the identification is usually based on a deterministic process conceptualization that uses a single model to represent a process. However, environmental systems are complex, and it often happens that a single process may be simulated by multiple alternative models. Ignoring model uncertainty in process identification may lead to biased identification, in that processes identified as important may not be so in the real world. This study addresses this problem by developing a new method of global sensitivity analysis for process identification. The new method is based on the concept of Sobol sensitivity analysis and model averaging. Similar to the Sobol sensitivity analysis used to identify important parameters, our new method evaluates the variance change when a process is fixed at its different conceptualizations. The variance considers both parametric and model uncertainty using the method of model averaging. The method is demonstrated using a synthetic study of groundwater modeling that considers a recharge process and a parameterization process. Each process has two alternative models. Important processes of groundwater flow and transport are evaluated using our new method. The method is mathematically general, and can be applied to a wide range of environmental problems.
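A toy version of the variance bookkeeping helps fix ideas. In the sketch below the model forms, prior model weights, and parameter ranges are invented; the process sensitivity index is computed as the share of total output variance attributable to the choice of process conceptualization (law of total variance), which is the spirit, not the letter, of the authors' method.

```python
# Toy process-sensitivity calculation under model uncertainty:
# a recharge process has two alternative models, each with uncertain
# parameters; fixing the conceptualization removes part of the variance.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

def output(recharge):            # toy system response to recharge
    return 2.0 * recharge + rng.normal(0.0, 0.1, recharge.shape)

# Two alternative recharge models with parametric uncertainty (invented)
models = {
    "linear": lambda: 0.30 * rng.uniform(0.5, 1.5, N),
    "power":  lambda: 0.25 * rng.uniform(0.5, 1.5, N) ** 1.2,
}
weights = {"linear": 0.5, "power": 0.5}   # prior model probabilities

# Total variance: sample jointly over models and parameters
samples = np.concatenate([output(models[m]()) for m in models])
V_total = samples.var()

# Expected within-model variance; the remainder is due to model choice
V_within = sum(weights[m] * output(models[m]()).var() for m in models)
S_process = 1.0 - V_within / V_total
print(f"process sensitivity index ~ {S_process:.2f}")
```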
Accelerating Commercialization of Algal Biofuels Through Partnerships (Brochure)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2011-10-01
This brochure describes National Renewable Energy Laboratory's (NREL's) algal biofuels research capabilities and partnership opportunities. NREL is accelerating algal biofuels commercialization through: (1) Advances in applied biology; (2) Algal strain development; (3) Development of fuel conversion pathways; (4) Techno-economic analysis; and (5) Development of high-throughput lipid analysis methodologies. NREL scientists and engineers are addressing challenges across the algal biofuels value chain, including algal biology, cultivation, harvesting and extraction, and fuel conversion. Through partnerships, NREL can share knowledge and capabilities in the following areas: (1) Algal Biology - A fundamental understanding of algal biology is key to developing cost-effective algal biofuels processes. NREL scientists are experts in the isolation and characterization of microalgal species. They are identifying genes and pathways involved in biofuel production. In addition, they have developed a high-throughput, non-destructive technique for assessing lipid production in microalgae. (2) Cultivation - NREL researchers study algal growth capabilities and perform compositional analysis of algal biomass. Laboratory-scale photobioreactors and 1-m2 open raceway ponds in an on-site greenhouse allow for year-round cultivation of algae under a variety of conditions. A bioenergy-focused algal strain collection is being established at NREL, and our laboratory houses a cryopreservation system for long-term maintenance of algal cultures and preservation of intellectual property. (3) Harvesting and Extraction - NREL is investigating cost-effective harvesting and extraction methods suitable for a variety of species and conditions. Areas of expertise include cell wall analysis and deconstruction and identification and utilization of co-products. (4) Fuel Conversion - NREL's excellent capabilities and facilities for biochemical and thermochemical conversion of biomass to biofuels are being applied to algal biofuels processes. Analysts are also testing algal fuel properties to measure energy content and ensure compatibility with existing fueling infrastructure. (5) Cross-Cutting Analysis - NREL scientists and engineers are conducting rigorous techno-economic analyses of algal biofuels processes. In addition, they are performing a full life cycle assessment of the entire algae-to-biofuels process.
Proposed Land Conveyance for Construction of Three Facilities at March Air Force Base, California
1988-09-01
identified would result from future development on the 845-acre parcel after it has been conveyed. Therefore, detailed development review and...Impact Analysis Process (EIAP) of the Air Force. This detailed development review is within the purview of the state and local government with...establishes the process under which subsequent detailed environmental review would be conducted. CEQA and its implementing regulations are administered by
ERIC Educational Resources Information Center
Logan, Robert S.
The authoring process and authoring aids which facilitate development of instructional materials have recently emerged as an area of concern in the field of instructional systems development (ISD). This process includes information gathering, its conversion to learning packages, its revision, and its formal publication. The purpose of this…
2010-01-01
Comparative Effectiveness Research, or other efforts to determine best practices and to develop guidelines based on meta-analysis and evidence-based medicine. An...authoritative reviews or other evidence-based medicine sources, but they have been made unambiguous and computable – a process which sounds...best practice recommendation created through an evidence-based medicine (EBM) development process. The lifecycle envisions four stages of refinement
NASA Technical Reports Server (NTRS)
1980-01-01
The design and development of an advanced Czochralski crystal grower are described. Several equipment specifications studied for the exhaust gas analysis system are discussed. Process control requirements were defined and design work began on melt temperature, melt level, and continuous diameter control. Sensor development included assembly and testing of a bench prototype of a diameter scanner system.
A conceptual framework for managing clinical processes.
Buffone, G J; Moreau, D
1997-01-01
Reengineering of the health care delivery system is underway, as is the transformation of the processes and methods used for recording information describing patient care (i.e., the development of a computer-based record). This report describes the use of object-oriented analysis and design to develop and implement clinical process reengineering as well as the organization of clinical data. In addition, the facility of the proposed framework for implementing workflow computing is discussed.
Exploring Social Meaning in Online Bilingual Text through Social Network Analysis
2015-09-01
GATE development began in 1995. As techniques for natural language processing (NLP) are investigated by the research community and...become part of the NLP repertoire, developers incorporate them with wrappers, which allow the output from GATE processes to be recognized as input by...
Tuomisto, Martti T; Parkkinen, Lauri
2012-01-01
Verbal behavior, as in the use of terms, is an important part of scientific activity in general and behavior analysis in particular. Many glossaries and dictionaries of behavior analysis have been published in English, but few in any other language. Here we review the area of behavior analytic terminology, its translations, and development in languages other than English. As an example, we use our own mother tongue, Finnish, which provides a suitable example of the process of translation and development of behavior analytic terminology, because it differs from Indo-European languages and entails specific advantages and challenges in the translation process. We have published three editions of a general dictionary of behavior analysis including 801 terms relevant to the experimental analysis of behavior and applied behavior analysis and one edition of a dictionary of applied and clinical behavior analysis containing 280 terms. Because this work has been important to us, we hope this review will encourage similar work by behavior analysts in other countries whose native language is not English. Behavior analysis as an advanced science deserves widespread international dissemination and proper translations are essential to that goal. PMID:22693363
ERIC Educational Resources Information Center
Kolleck, Nina
2016-01-01
This paper examines the implementation of Education for Sustainable Development (ESD) in Germany and explores the possibilities of Social Network Analysis (SNA) for uncovering influential actors in educational policy innovation processes. From the theoretical perspective, an actor's influence is inferred from its relative position within…
The School Counselor Leadership Survey: Instrument Development and Exploratory Factor Analysis
ERIC Educational Resources Information Center
Young, Anita; Bryan, Julia
2015-01-01
This study examined the factor structure of the School Counselor Leadership Survey (SCLS). Survey development was a threefold process that resulted in a 39-item survey administered to 801 school counselors and school counselor supervisors. The exploratory factor analysis indicated a five-factor structure that revealed five key dimensions of school counselor…
Enhancements to the Image Analysis Tool for Core Punch Experiments and Simulations (vs. 2014)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hogden, John Edward; Unal, Cetin
A previous paper (Hogden & Unal, 2012, Image Analysis Tool for Core Punch Experiments and Simulations) described an image processing computer program developed at Los Alamos National Laboratory. This program has proven useful, so development has continued. In this paper we describe enhancements to the program as of 2014.
Analysis of rocket engine injection combustion processes
NASA Technical Reports Server (NTRS)
Salmon, J. W.; Saltzman, D. H.
1977-01-01
Mixing methodology improvements for the JANNAF DER and CICM injection/combustion analysis computer programs were accomplished. The ZOM plane prediction model was improved for installation into the new standardized DER computer program. An intra-element mixing model development approach was recommended for gas/liquid coaxial injection elements for possible future incorporation into the CICM computer program.
Preservice Teachers' Identity Development during the Teaching Internship
ERIC Educational Resources Information Center
Nghia, Tran Le Huu; Tai, Huynh Ngoc
2017-01-01
This article reports the analysis of two preservice teachers' narratives to highlight the process of teacher identity development during their teaching internship. The analysis showed that their teacher identities had been shaped before they entered the teacher education program where it continued to be shaped by educational experts. In that way,…
Teaching With a Purpose: The Perry Scheme and the Teaching of Writing.
ERIC Educational Resources Information Center
Burnham, Christopher C.
While close analysis of individual composing processes has been the major accomplishment of recent writing research, this research has not yet sufficiently considered how students develop as learners. The work of William Perry, a developmental psychologist, can contribute to an understanding of that development. Based on an analysis of interviews…
NASA Astrophysics Data System (ADS)
Rahman, Nur Aira Abd; Yussup, Nolida; Salim, Nazaratul Ashifa Bt. Abdullah; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh@Shaari, Syirrazie Bin Che; Azman, Azraf B.; Ismail, Nadiah Binti
2015-04-01
Neutron Activation Analysis (NAA) had been established in Nuclear Malaysia since the 1980s. Most of the procedures established were done manually, including sample registration. The samples were recorded manually in a logbook and given an ID number. Then all samples, standards, SRM and blank were recorded on the irradiation vial and several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time consuming and inefficient. Sample registration software is developed as part of the IAEA/CRP project on `Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)'. The objective of the project is to create PC-based data entry software for the sample preparation stage. This is an effective method to replace redundant manual data entries that need to be completed by laboratory personnel. The software automatically generates a sample code for each sample in one batch, creates printable registration forms for administrative purposes, and stores selected parameters that are passed to the sample analysis program. The software is developed by using National Instruments Labview 8.6.
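A minimal sketch of the batch registration step might look as follows; the sample-code format and CSV layout are hypothetical, not the Labview implementation.

```python
# Illustrative batch sample-code generation and printable registry output.
import csv
from datetime import date

def register_batch(batch_no, n_samples, out_csv="registration.csv"):
    """Generate sequential sample codes and a printable registry file."""
    codes = [f"NAA-{date.today():%Y%m%d}-B{batch_no:02d}-S{i:03d}"
             for i in range(1, n_samples + 1)]
    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["sample_code", "batch", "status"])
        for c in codes:
            writer.writerow([c, batch_no, "registered"])
    return codes
```

The point of automating this step is exactly what the abstract states: one batch entry replaces per-sample manual transcription onto vials and forms.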
Space system operations and support cost analysis using Markov chains
NASA Technical Reports Server (NTRS)
Unal, Resit; Dean, Edwin B.; Moore, Arlene A.; Fairbairn, Robert E.
1990-01-01
This paper evaluates the use of Markov chain process in probabilistic life cycle cost analysis and suggests further uses of the process as a design aid tool. A methodology is developed for estimating operations and support cost and expected life for reusable space transportation systems. Application of the methodology is demonstrated for the case of a hypothetical space transportation vehicle. A sensitivity analysis is carried out to explore the effects of uncertainty in key model inputs.
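The core arithmetic of such a model is the fundamental matrix of an absorbing Markov chain. The sketch below uses invented transition probabilities and per-state costs to show how expected life and expected operations and support cost fall out of it; none of the numbers are from the paper.

```python
# Absorbing-chain O&S cost sketch: states are vehicle conditions per
# flight cycle, "retired" is the (implicit) absorbing state, and each
# transient state carries a per-visit cost. All values illustrative.
import numpy as np

# Transient states: 0 = nominal, 1 = minor repair, 2 = major repair.
# Row probabilities not summing to 1 flow to the absorbing "retired" state.
Q = np.array([[0.90, 0.07, 0.02],
              [0.80, 0.10, 0.05],
              [0.60, 0.10, 0.10]])
cost = np.array([1.0, 3.0, 10.0])     # cost per state visit ($M, invented)

N = np.linalg.inv(np.eye(3) - Q)      # fundamental matrix: expected visits
expected_cycles = N.sum(axis=1)[0]    # expected life, starting nominal
expected_cost = (N @ cost)[0]         # expected total O&S cost
print(f"expected life: {expected_cycles:.1f} cycles, "
      f"expected O&S cost: ${expected_cost:.1f}M")
```

Sensitivity analysis of the kind the paper describes amounts to perturbing entries of Q or the cost vector and recomputing these two quantities.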
Current status of the real-time processing of complex radar signatures
NASA Astrophysics Data System (ADS)
Clay, E.
The real-time processing technique developed by ONERA to characterize radar signatures at the Brahms station is described. This technique is used for the real-time analysis of the RCS of airframes and rotating parts, the one-dimensional tomography of aircraft, and the RCS of electromagnetic decoys. Using this technique, it is also possible to optimize the experimental parameters, i.e., the analysis band, the microwave-network gain, and the electromagnetic window of the analysis.
Progress in Operational Analysis of Launch Vehicles in Nonstationary Flight
NASA Technical Reports Server (NTRS)
James, George; Kaouk, Mo; Cao, Timothy
2013-01-01
This paper presents recent results in an ongoing effort to understand and develop techniques to process launch vehicle data, which is extremely challenging for modal parameter identification. The primary source of difficulty is due to the nonstationary nature of the situation. The system is changing, the environment is not steady, and there is an active control system operating. Hence, the primary tool for producing clean operational results (significant data lengths and data averaging) is not available to the user. This work reported herein uses a correlation-based two step operational modal analysis approach to process the relevant data sets for understanding and development of processes. A significant drawback for such processing of short time histories is a series of beating phenomena due to the inability to average out random modal excitations. A recursive correlation process coupled to a new convergence metric (designed to mitigate the beating phenomena) is the object of this study. It has been found in limited studies that this process creates clean modal frequency estimates but numerically alters the damping.
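The first step of a correlation-based approach can be illustrated simply. The sketch below estimates an unbiased auto-correlation from a response record, which under broadband excitation resembles a free-decay record suitable for modal fitting; the paper's recursive averaging and convergence metric are not reproduced here.

```python
# Unbiased auto-correlation estimate of a response record, the usual
# starting point for correlation-driven operational modal analysis.
import numpy as np

def autocorrelation(y, max_lag):
    """Unbiased auto-correlation of a (short) response time history."""
    y = np.asarray(y, dtype=float) - np.mean(y)   # remove the mean
    n = len(y)
    # Divide each lag product sum by its own sample count (unbiased form)
    return np.array([np.dot(y[:n - k], y[k:]) / (n - k)
                     for k in range(max_lag)])
```

With long records, averaging many such estimates suppresses the random-excitation beating the abstract describes; the short launch records are precisely why that averaging is unavailable.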
Qualitative and quantitative interpretation of SEM image using digital image processing.
Saladra, Dawid; Kopernik, Magdalena
2016-10-01
The aim of this study is the improvement of qualitative and quantitative analysis of scanning electron microscope micrographs by development of a computer program which enables automatic crack analysis of scanning electron microscopy (SEM) micrographs. Micromechanical tests of pneumatic ventricular assist devices result in a large number of micrographs. Therefore, the analysis must be automatic. Tests for athrombogenic titanium nitride/gold coatings deposited on polymeric substrates (Bionate II) are performed. These tests include microshear, microtension and fatigue analysis. Anisotropic surface defects observed in the SEM micrographs require support for qualitative and quantitative interpretation. Improvement of qualitative analysis of scanning electron microscope images was achieved by a set of computational tools that includes binarization, simplified expanding, expanding, simple image statistic thresholding, Laplacian 1 and Laplacian 2 filters, Otsu thresholding, and reverse binarization. Several modifications of the known image processing techniques and combinations of the selected image processing techniques were applied. The introduced quantitative analysis of digital scanning electron microscope images enables computation of stereological parameters such as area, crack angle, crack length, and total crack length per unit area. This study also compares the functionality of the developed computer program of digital image processing with existing applications. The described pre- and postprocessing may be helpful in scanning electron microscopy and transmission electron microscopy surface investigations. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
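One of the stereological outputs described, total crack length per unit area, can be approximated in a few lines. The sketch below uses scikit-image, assumes cracks appear dark against the coating, and takes the pixel calibration as a given input; it illustrates the measurement, not the authors' program.

```python
# Coarse crack-length-per-area estimate from a grayscale SEM image.
from skimage.filters import threshold_otsu
from skimage.morphology import skeletonize

def crack_length_per_area(image, um_per_px):
    """Total crack length per unit area from a grayscale SEM image."""
    dark_cracks = image < threshold_otsu(image)   # cracks assumed dark
    skeleton = skeletonize(dark_cracks)           # 1-px-wide crack traces
    length_um = skeleton.sum() * um_per_px        # coarse length estimate
    area_um2 = image.size * um_per_px ** 2
    return length_um / area_um2
```

Skeletonizing first means each crack contributes its centerline length rather than its filled area; counting skeleton pixels slightly underestimates diagonal runs, which a production tool would correct for.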
NASA Technical Reports Server (NTRS)
O'Connor, Brian; Hernandez, Deborah; Hornsby, Linda; Brown, Maria; Horton-Mullins, Kathryn
2017-01-01
Outline: Background of ISS (International Space Station) Material Science Research Rack; NASA SCA (Sample Cartridge Assembly) Design; GEDS (Gravitational Effects in Distortion in Sintering) Experiment Ampoule Design; Development Testing Summary; Thermal Modeling and Analysis. Summary: GEDS design development challenging (GEDS Ampoule design developed through MUGS (Microgravity) testing; Short duration transient sample processing; Unable to measure sample temperatures); MUGS Development testing used to gather data (Actual LGF (Low Gradient Furnace)-like furnace response; Provided sample for sintering evaluation); Transient thermal model integral to successful GEDS experiment (Development testing provided furnace response; PI (Principal Investigator) evaluation of sintering anchored model evaluation of processing durations; Thermal transient model used to determine flight SCA sample processing profiles).
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis
2013-10-01
Approaches for the analysis and specification of business vocabularies and rules are very relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common practice of Information Systems Development, business modeling activities are still of a mostly empirical nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
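A first, much-simplified step of such extraction can be sketched as harvesting named elements from the BPMN 2.0 XML as vocabulary candidates. Mapping task names to verb-concept candidates and data objects to noun-concept candidates, as below, is an illustrative simplification, not the authors' method.

```python
# Harvest candidate SBVR vocabulary terms from a BPMN 2.0 XML file.
import xml.etree.ElementTree as ET

def candidate_terms(bpmn_file):
    """Collect task and data-object names from a BPMN 2.0 model."""
    root = ET.parse(bpmn_file).getroot()
    verbs, nouns = [], []
    for el in root.iter():
        name = el.get("name")
        if not name:
            continue
        # Tags carry a namespace prefix like {http://...}task
        if el.tag.endswith("}task"):          # e.g. "Approve invoice"
            verbs.append(name)
        elif el.tag.endswith("}dataObject"):  # e.g. "Invoice"
            nouns.append(name)
    return {"verb_candidates": verbs, "noun_candidates": nouns}
```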
NASA Technical Reports Server (NTRS)
Goldman, H.; Wolf, M.
1979-01-01
Analyses of slicing processes and junction formation processes are presented. A simple method for evaluating the relative economic merits of competing process options with respect to the cost of energy produced by the system is described. An energy consumption analysis was developed and applied to determine the energy consumption in the solar module fabrication process sequence, from the mining of the SiO2 to shipping. The analysis shows that current technology practice involves inordinate energy use in the purification step and large wastage of the invested energy through losses, particularly poor conversion in slicing, as well as inadequate yields throughout. The cell process energy expenditures already show a downward trend based on increased throughput rates. The large improvement, however, depends on the introduction of a more efficient purification process and of acceptable ribbon growing techniques.
ERIC Educational Resources Information Center
Vanfretti, Luigi; Farrokhabadi, Mostafa
2015-01-01
This article presents the implementation of the constructive alignment theory (CAT) in a power system analysis course through a consensus-based course design process. The consensus-based design process involves both the instructor and graduate-level students and it aims to develop the CAT framework in a holistic manner with the goal of including…
ERIC Educational Resources Information Center
Pelin, Nicolae; Mironov, Vladimir
2008-01-01
In this article, the problems of developing functioning algorithms for a system for automated analysis of the rhythm of the educational process in a higher educational institution are considered. Using experiment planning techniques for conducting scientific research, methodologies adapted by the authors in their dissertation works at the…
Some aspects of mathematical and chemical modeling of complex chemical processes
NASA Technical Reports Server (NTRS)
Nemes, I.; Botar, L.; Danoczy, E.; Vidoczy, T.; Gal, D.
1983-01-01
Some theoretical questions involved in the mathematical modeling of the kinetics of complex chemical process are discussed. The analysis is carried out for the homogeneous oxidation of ethylbenzene in the liquid phase. Particular attention is given to the determination of the general characteristics of chemical systems from an analysis of mathematical models developed on the basis of linear algebra.
ERIC Educational Resources Information Center
Torres, Dalia
2012-01-01
The purpose of this deconstructive case study was to conduct a Foucauldian power/knowledge analysis constructed from the perceptions of three teachers at an intermediate school in South Texas regarding the role of the teacher evaluation process and its influence on instructional practices. Using Foucault's (1977a) work on power/knowledge, of…
ERIC Educational Resources Information Center
Mazza, Monica; Mariano, Melania; Peretti, Sara; Masedu, Francesco; Pino, Maria Chiara; Valenti, Marco
2017-01-01
Individuals with autism spectrum disorders (ASD) show significant impairments in social skills and theory of mind (ToM). The aim of this study was to evaluate ToM and social information processing abilities in 52 children with ASD compared to 55 typically developing (TD) children. A mediation analysis evaluated whether social information…
Task analysis exemplified: the process of resolving unfinished business.
Greenberg, L S; Foerster, F S
1996-06-01
The steps of a task-analytic research program designed to identify the in-session performances involved in resolving lingering bad feelings toward a significant other are described. A rational-empirical methodology of repeatedly cycling between rational conjecture and empirical observations is demonstrated as a method of developing an intervention manual and the components of client processes of resolution. A refined model of the change process developed by these procedures is validated by comparing 11 successful and 11 unsuccessful performances. Four performance components (intense expression of feeling, expression of need, shift in representation of other, and self-validation or understanding of the other) were found to discriminate between resolution and nonresolution performances. These components were measured on 4 process measures: the Structural Analysis of Social Behavior, the Experiencing Scale, the Client's Emotional Arousal Scale, and a need scale.
Modelling technological process of ion-exchange filtration of fluids in porous media
NASA Astrophysics Data System (ADS)
Ravshanov, N.; Saidov, U. M.
2018-05-01
The solution of a practical problem related to the filtration of liquid and ionic solutions and their purification from gel particles and heavy ionic compounds is considered in the paper. This technological process is realized during the preparation and cleaning of chemical solutions, drinking water, pharmaceuticals, liquid fuels, products for public use, etc. A mathematical model is developed for the analysis and research of the technological process, for the determination of its main parameters and the operating modes of filter units, and for support in managerial decision-making. Using the developed model, a series of computational experiments is carried out. The results of numerical calculations are illustrated in the form of graphs. Based on the analysis of the numerical experiments, conclusions are formulated that serve as the basis for making appropriate managerial decisions.
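A minimal numerical sketch of a filtration model of this general kind is shown below: one-dimensional advection of suspended particles through the bed with first-order capture onto the porous matrix, solved by an explicit upwind scheme. All coefficients are illustrative, not the paper's parameters.

```python
# 1-D filtration sketch: dc/dt + v*dc/dx = -lam*v*c (first-order capture).
import numpy as np

def simulate_filter(L=1.0, nx=200, v=1e-3, lam=5.0, c_in=1.0, t_end=600.0):
    """Explicit upwind solution; lam is the filter coefficient (1/m)."""
    dx = L / nx
    dt = 0.9 * dx / v                 # CFL-stable time step
    c = np.zeros(nx)                  # suspended concentration along bed
    t = 0.0
    while t < t_end:
        c[1:] -= v * dt / dx * (c[1:] - c[:-1]) + lam * v * dt * c[1:]
        c[0] = c_in                   # inlet boundary condition
        t += dt
    return c                          # concentration profile at t_end

profile = simulate_filter()
print(f"outlet concentration: {profile[-1]:.4f}")
```

At steady state this reduces to the classical exponential depth-filtration profile c(x) = c_in * exp(-lam * x), a useful check on the numerics.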
An Intelligent Automation Platform for Rapid Bioprocess Design.
Wu, Tianyi; Zhou, Yuhong
2014-08-01
Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.
Discrete event simulation tool for analysis of qualitative models of continuous processing systems
NASA Technical Reports Server (NTRS)
Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)
1990-01-01
An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
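The event-queue execution that such a tool rests on can be reduced to a few lines. The sketch below is a generic discrete-event core with a time-ordered heap, where each handler may schedule follow-on events (e.g., a mode transition after a time delay) until the queue empties; it is not the patented modeling tool itself.

```python
# Minimal discrete-event simulation core (illustrative).
import heapq

class Simulator:
    def __init__(self):
        self._queue, self._seq, self.now = [], 0, 0.0

    def schedule(self, delay, handler, *args):
        self._seq += 1   # tie-breaker keeps heap comparisons well-defined
        heapq.heappush(self._queue,
                       (self.now + delay, self._seq, handler, args))

    def run(self):
        # Pop events in time order until the event queue is emptied
        while self._queue:
            self.now, _, handler, args = heapq.heappop(self._queue)
            handler(self, *args)

# Usage: a continuously draining tank modeled as discrete level steps
def drain(sim, level):
    if level > 0:
        print(f"t={sim.now:4.1f}  level={level}")
        sim.schedule(1.0, drain, level - 1)

sim = Simulator()
sim.schedule(0.0, drain, 3)
sim.run()
```

Defining continuous behavior "discretely with respect to invocation statements, effect statements, and time delays," as the abstract puts it, corresponds to handlers that re-schedule themselves at the appropriate delays.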
Micromechanics Analysis Code Post-Processing (MACPOST) User Guide. 1.0
NASA Technical Reports Server (NTRS)
Goldberg, Robert K.; Comiskey, Michele D.; Bednarcyk, Brett A.
1999-01-01
As advanced composite materials have gained wider usage, the need for analytical models and computer codes to predict the thermomechanical deformation response of these materials has increased significantly. Recently, a micromechanics technique called the generalized method of cells (GMC) has been developed, which has the capability to fulfill this goal. To provide a framework for GMC, the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) has been developed. As MAC/GMC has been updated, significant improvements have been made to the post-processing capabilities of the code. Through the MACPOST program, which operates directly within the MSC/PATRAN graphical pre- and post-processing package, a direct link between the analysis capabilities of MAC/GMC and the post-processing capabilities of MSC/PATRAN has been established. MACPOST has simplified the production, printing, and exportation of results for unit cells analyzed by MAC/GMC. MACPOST allows different micro-level quantities to be plotted quickly and easily in contour plots. In addition, meaningful data for X-Y plots can be examined. MACPOST thus serves as an important analysis and visualization tool for the macro- and micro-level data generated by MAC/GMC. This report serves as the user's manual for the MACPOST program.
Link Analysis in the Mission Planning Lab
NASA Technical Reports Server (NTRS)
McCarthy, Jessica A.; Cervantes, Benjamin W.; Daugherty, Sarah C.; Arroyo, Felipe; Mago, Divyang
2011-01-01
The legacy communications link analysis software currently used at Wallops Flight Facility involves processes that are different for command destruct, radar, and telemetry. There is a clear advantage to developing an easy-to-use tool that combines all the processes in one application. Link Analysis in the Mission Planning Lab (MPL) uses custom software and algorithms integrated with Analytical Graphics Inc. Satellite Toolkit (AGI STK). The MPL link analysis tool uses pre/post-mission data to conduct a dynamic link analysis between ground assets and the launch vehicle. Just as the legacy methods do, the MPL link analysis tool calculates signal strength and signal-to-noise according to the accepted processes for command destruct, radar, and telemetry assets. Graphs and other custom data are generated rapidly in formats for reports and presentations. STK is used for analysis as well as to depict plume angles and antenna gain patterns in 3D. The MPL has developed two interfaces with the STK software (see figure). The first interface is an HTML utility, which was developed in Visual Basic to enhance analysis for plume modeling and to offer a more user friendly, flexible tool. A graphical user interface (GUI) written in MATLAB (see figure upper right-hand corner) is also used to quickly depict link budget information for multiple ground assets. This new method yields a dramatic decrease in the time it takes to provide launch managers with the required link budgets to make critical pre-mission decisions. The software code used for these two custom utilities is a product of NASA's MPL.
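The kind of arithmetic behind such a link budget can be sketched compactly. The example below computes free-space path loss and a carrier-to-noise estimate; all gains, temperatures, and losses are invented placeholders rather than Wallops asset parameters.

```python
# Link-budget sketch: free-space path loss and received C/N (illustrative).
import math

def fspl_db(range_km, freq_mhz):
    """Free-space path loss in dB (standard km/MHz form)."""
    return 20 * math.log10(range_km) + 20 * math.log10(freq_mhz) + 32.45

def cn_db(eirp_dbw, gr_dbi, range_km, freq_mhz,
          sys_noise_temp_k=500.0, bandwidth_hz=1e6, misc_loss_db=2.0):
    """Carrier-to-noise ratio at a ground asset."""
    k_dbw = -228.6                     # Boltzmann constant, dBW/(K*Hz)
    carrier = eirp_dbw + gr_dbi - fspl_db(range_km, freq_mhz) - misc_loss_db
    noise = (k_dbw + 10 * math.log10(sys_noise_temp_k)
                   + 10 * math.log10(bandwidth_hz))
    return carrier - noise

# Example: S-band telemetry at 500 km slant range (all numbers invented)
print(f"C/N = {cn_db(10.0, 30.0, 500.0, 2250.0):.1f} dB")
```

A dynamic analysis like the MPL tool's amounts to evaluating this calculation along the trajectory, with range and antenna gains updated at each time step.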
ERIC Educational Resources Information Center
Sankey, Kim S.; Machin, M. Anthony
2014-01-01
With a focus on the self-initiated efforts of employees, this study examined a model of core proactive motivation processes for participation in non-mandatory professional development (PD) within a proactive motivation framework using the Self-Determination Theory perspective. A multi-group SEM analysis conducted across 439 academic and general…
Developing the Scale of Teacher Self-Efficacy in Teaching Process
ERIC Educational Resources Information Center
Korkmaz, Fahrettin; Unsal, Serkan
2016-01-01
The purpose of this study is to develop a reliable and valid measurement tool which will reveal teachers' self-competence in the education process. Participants of the study are 300 teachers working at state primary schools in the province of Gaziantep. Results of the exploratory factor analysis administered to the scale in order to determine its…
The Effects of Poverty on Children's Socioemotional Development: An Ecological Systems Analysis.
ERIC Educational Resources Information Center
Eamon, Mary Keegan
2001-01-01
Bronfenbrenner's process-person-context-time model is used to examine theories that explain adverse effects of economic deprivation on children's socioemotional development. Processes of not only the family, but also those of the peer group and school, and in other levels of the ecological environment may also explain the relation between economic…
ERIC Educational Resources Information Center
Samigullina, Galina Savelevna; Gilmanchina, Syriya Irekovna; Gaisin, Ilgisar Timergalievich; Gilmanshin, Iskander Rafailevich; Rafailevna, Akchurina Ilsia
2015-01-01
The purpose of this article is to analyse the professional and creative development of natural geographic course teachers of the Republic of Tatarstan in the process of professional retraining. The method used is a retrospective analysis of the professional retraining of natural geographic course teachers within the higher professional…
USDA-ARS?s Scientific Manuscript database
A predictive mathematical model was developed to simulate heat transfer in a tomato undergoing double sided infrared (IR) heating in a dry-peeling process. The aims of this study were to validate the developed model using experimental data and to investigate different engineering parameters that mos...
ERIC Educational Resources Information Center
Bartram, Sharon; Gibson, Brenda
Designed as a practical tool for trainers, this manual contains 22 instruments and documents for gathering and processing information about training and development issues within an organization. Part one of the two-part manual examines the process of identifying and analyzing training needs. It reviews the different types of information the…
Development of Activity-based Cost Functions for Cellulase, Invertase, and Other Enzymes
NASA Astrophysics Data System (ADS)
Stowers, Chris C.; Ferguson, Elizabeth M.; Tanner, Robert D.
As enzyme chemistry plays an increasingly important role in the chemical industry, cost analysis of these enzymes becomes a necessity. In this paper, we examine the aspects that affect the cost of enzymes based upon enzyme activity. The basis for this study stems from a previously developed objective function that quantifies the tradeoffs in enzyme purification via the foam fractionation process (Cherry et al., Braz J Chem Eng 17:233-238, 2000). A generalized cost function is developed from our results that could be used to aid in both industrial and lab scale chemical processing. The generalized cost function shows several nonobvious results that could lead to significant savings. Additionally, the parameters involved in the operation and scaling up of enzyme processing could be optimized to minimize costs. We show that there are typically three regimes in the enzyme cost analysis function: the low activity prelinear region, the moderate activity linear region, and high activity power-law region. The overall form of the cost analysis function appears to robustly fit the power law form.
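The three-regime shape can be written as a piecewise function. In the sketch below the breakpoints and coefficients are invented purely to illustrate the functional form, with the pieces chosen to meet approximately at the regime boundaries.

```python
# Illustrative three-regime enzyme cost function: prelinear, linear,
# power-law. Breakpoints and coefficients are invented placeholders.
import numpy as np

def enzyme_cost(activity, a1=120.0, a2=200.0):
    """Piecewise cost vs. activity with three regimes."""
    activity = np.asarray(activity, dtype=float)
    c_pre = 5.0 + 0.002 * activity ** 2       # low activity: prelinear
    c_lin = 0.5 * activity - 26.2             # moderate activity: linear
    c_pow = 0.0154 * activity ** 1.6          # high activity: power-law
    return np.where(activity < a1, c_pre,
                    np.where(activity < a2, c_lin, c_pow))

print(enzyme_cost([50.0, 150.0, 400.0]))
```

Fitting such a form to measured activity-cost pairs, regime by regime, is the kind of step that would let the operating point of a foam fractionation purification be optimized against cost.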
The development of Canadian nursing: professionalization and proletarianization.
Coburn, D
1988-01-01
In this article, the development of nursing in Canada is described in terms of three major time periods: the emergence of lay nursing, including organization and registration, 1870-1930; the move to the hospital, 1930-1950; and unionization and the routinization of health care, 1950 to the present. This development is viewed in the light of the orienting concepts of professionalization, proletarianization, and medical dominance (and gender analysis). This historical trajectory of nursing shows an increasing occupational autonomy but continuing struggles over control of the labor process. Nursing is now using theory, organizational changes in health care, and credentialism to help make nursing "separate from but equal to" medicine and to gain control over the day-to-day work of the nurse. Nursing can thus be viewed as undergoing processes of both professionalization and proletarianization. As nursing seeks to control the labor process, its occupational conflicts are joined to the class struggle of white-collar workers in general. Analysis of nursing indicates the problems involved in sorting out the meaning of concepts that are relevant to occupational or class analysis but which focus on the same empirical phenomenon.
NASA Technical Reports Server (NTRS)
Miller, R. E., Jr.; Hansen, S. D.; Redhed, D. D.; Southall, J. W.; Kawaguchi, A. S.
1974-01-01
Evaluation of the cost-effectiveness of integrated analysis/design systems, with particular attention to the Integrated Program for Aerospace-Vehicle Design (IPAD) project. An analysis of all the ingredients of IPAD indicates the feasibility of a significant cost and flowtime reduction in the product design process involved. It is also concluded that an IPAD-supported design process will provide a framework for configuration control, whereby the engineering costs for design, analysis and testing can be controlled during the air vehicle development cycle.
The Flight Optimization System Weights Estimation Method
NASA Technical Reports Server (NTRS)
Wells, Douglas P.; Horvath, Bryce L.; McCullers, Linwood A.
2017-01-01
FLOPS has been the primary aircraft synthesis software used by the Aeronautics Systems Analysis Branch at NASA Langley Research Center. It was created for rapid conceptual aircraft design and advanced technology impact assessments. FLOPS is a single computer program that includes weights estimation, aerodynamics estimation, engine cycle analysis, propulsion data scaling and interpolation, detailed mission performance analysis, takeoff and landing performance analysis, noise footprint estimation, and cost analysis. It is well known as a baseline and common denominator for aircraft design studies. FLOPS is capable of calibrating a model to known aircraft data, making it useful for new aircraft and modifications to existing aircraft. The weight estimation method in FLOPS is known to be of high fidelity for conventional tube with wing aircraft and a substantial amount of effort went into its development. This report serves as a comprehensive documentation of the FLOPS weight estimation method. The development process is presented with the weight estimation process.
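FLOPS' actual weight equations are what the report itself documents. Purely to illustrate the general pattern of regression-based component weight estimation with calibration to a known aircraft, the sketch below uses an invented power-law form and invented coefficients.

```python
# Generic pattern of a calibrated component-weight regression
# (invented form and coefficients; not FLOPS' equations).
def wing_weight_lb(span_ft, area_ft2, tow_lb,
                   a=0.85, b=0.5, c=0.6, calibration=1.0):
    """Illustrative power-law component weight estimate."""
    return calibration * a * (span_ft ** b) * (area_ft2 ** c) * (tow_lb / 1e5)

# Calibration: scale the estimate to reproduce a known actual weight,
# then reuse the factor when perturbing the design (all numbers invented)
actual = 18_500.0
estimate = wing_weight_lb(span_ft=113.0, area_ft2=1341.0, tow_lb=395_000)
factor = actual / estimate
print(f"calibration factor = {factor:.3f}")
```

This calibrate-then-perturb pattern is what makes such methods useful both for new designs and for modifications to existing aircraft.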
Satellite image analysis using neural networks
NASA Technical Reports Server (NTRS)
Sheldon, Roger A.
1990-01-01
The tremendous backlog of unanalyzed satellite data necessitates the development of improved methods for data cataloging and analysis. Ford Aerospace has developed an image analysis system, SIANN (Satellite Image Analysis using Neural Networks) that integrates the technologies necessary to satisfy NASA's science data analysis requirements for the next generation of satellites. SIANN will enable scientists to train a neural network to recognize image data containing scenes of interest and then rapidly search data archives for all such images. The approach combines conventional image processing technology with recent advances in neural networks to provide improved classification capabilities. SIANN allows users to proceed through a four step process of image classification: filtering and enhancement, creation of neural network training data via application of feature extraction algorithms, configuring and training a neural network model, and classification of images by application of the trained neural network. A prototype experimentation testbed was completed and applied to climatological data.
Developing focused wellness programs: using concept analysis to increase business value.
Byczek, Lance; Kalina, Christine M; Levin, Pamela F
2003-09-01
Concept analysis is a useful tool in providing clarity to an abstract idea as well as an objective basis for developing wellness program products, goals, and outcomes. To plan for and develop successful wellness programs, it is critical for occupational health nurses to clearly understand a program concept as applied to a particular community or population. Occupational health nurses can use the outcome measures resulting from the concept analysis process to help demonstrate the business value of their wellness programs. This concept analysis demonstrates a predominance of the performance related attributes of fitness in the scientific literature.
NASA Astrophysics Data System (ADS)
Tsuchiya, Yuichiro; Kodera, Yoshie; Tanaka, Rie; Sanada, Shigeru
2007-03-01
Early detection and treatment of lung cancer is one of the most effective means to reduce cancer mortality; chest X-ray radiography has been widely used as a screening examination or health checkup. A new examination method and the development of a computer analysis system allow respiratory kinetics to be obtained with a flat panel detector (FPD), an extension of chest X-ray radiography. Through such changes, functional evaluation of respiratory kinetics in the chest has become available. Its introduction into clinical practice is expected in the future. In this study, we developed a computer analysis algorithm for the purpose of detecting lung nodules and evaluating quantitative kinetics. Breathing chest radiographs obtained by a modified FPD were converted into four static feature images by sequential temporal subtraction processing, morphologic enhancement processing, kinetic visualization processing, and lung region detection processing, after a breath synchronization process utilizing diaphragmatic analysis of the vector movement. An artificial neural network used to analyze the density patterns detected the true nodules by analyzing these static images, and drew their kinetic tracks. In an evaluation of algorithm performance and clinical effectiveness with 7 normal patients and simulated nodules, both showed sufficient detection capability and kinetic imaging function without statistically significant difference. Our technique can quantitatively evaluate the kinetic range of nodules, and is effective in detecting a nodule on a breathing chest radiograph. Moreover, the application of this technique is expected to extend computer-aided diagnosis systems and facilitate the development of an automatic planning system for radiation therapy.
A risk-based approach to management of leachables utilizing statistical analysis of extractables.
Stults, Cheryl L M; Mikl, Jaromir; Whelehan, Oliver; Morrical, Bradley; Duffield, William; Nagao, Lee M
2015-04-01
To incorporate quality by design concepts into the management of leachables, an emphasis is often put on understanding the extractable profile for the materials of construction for manufacturing disposables, container-closure, or delivery systems. Component manufacturing processes may also impact the extractable profile. An approach was developed to (1) identify critical components that may be sources of leachables, (2) enable an understanding of manufacturing process factors that affect extractable profiles, (3) determine if quantitative models can be developed that predict the effect of those key factors, and (4) evaluate the practical impact of the key factors on the product. A risk evaluation for an inhalation product identified injection molding as a key process. Designed experiments were performed to evaluate the impact of molding process parameters on the extractable profile from an ABS inhaler component. Statistical analysis of the resulting GC chromatographic profiles identified processing factors that were correlated with peak levels in the extractable profiles. The combination of statistically significant molding process parameters was different for different types of extractable compounds. ANOVA models were used to obtain optimal process settings and predict extractable levels for a selected number of compounds. The proposed paradigm may be applied to evaluate the impact of material composition and processing parameters on extractable profiles and utilized to manage product leachables early in the development process and throughout the product lifecycle.
Data near processing support for climate data analysis
NASA Astrophysics Data System (ADS)
Kindermann, Stephan; Ehbrecht, Carsten; Hempelmann, Nils
2016-04-01
Climate data repositories grow exponentially in size. Scalable data-near processing capabilities are required to meet future data analysis requirements and to replace current "download and process at home" workflows. On the one hand, these processing capabilities should be accessible via standardized interfaces (e.g. OGC WPS); on the other hand, a large variety of processing tools, toolboxes and deployment alternatives have to be supported and maintained at the data/processing center. We present a community approach for a modular and flexible system supporting the development, deployment and maintenance of OGC WPS based web processing services. This approach is organized in an open-source GitHub project (called "birdhouse") supporting individual processing services ("birds", e.g. climate index calculations, model data ensemble calculations) that rely on common infrastructural components (e.g. installation and deployment recipes, management of analysis code dependencies). To support easy deployment at data centers as well as home institutes (e.g. for testing and development), the system manages the often very complex package dependency chains of climate data analysis packages and supports Docker-based packaging and installation. We present a concrete deployment scenario at the German Climate Computing Center (DKRZ). DKRZ hosts a multi-petabyte climate archive integrated into the European ENES and worldwide ESGF data infrastructures, and also operates an HPC center supporting (model) data production and data analysis. The deployment scenario also includes OpenStack-based data cloud services to support data import and data distribution for birdhouse-based WPS services. Current challenges for inter-institutional deployments of web processing services supporting the European and international climate modeling and climate impact communities are highlighted, and aspects of future WPS-based cross-community usage scenarios supporting data reuse and data provenance are discussed.
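For readers unfamiliar with OGC WPS, the standardized interface mentioned above is plain HTTP. A hedged sketch of querying such a service with Python's requests library; the endpoint URL and process identifier are placeholders, not actual birdhouse services:

```python
# Hedged example of calling an OGC WPS 1.0.0 endpoint over plain HTTP; the URL
# and process identifier below are placeholders, not real deployed services.
import requests

wps_url = "https://example.org/wps"  # placeholder endpoint

# Ask the service what processes it offers (standard WPS GetCapabilities).
caps = requests.get(wps_url, params={
    "service": "WPS",
    "request": "GetCapabilities",
    "version": "1.0.0",
})
print(caps.status_code)

# Describe one process, e.g. a hypothetical climate-index calculation.
desc = requests.get(wps_url, params={
    "service": "WPS",
    "request": "DescribeProcess",
    "version": "1.0.0",
    "identifier": "climate_index",   # hypothetical process name
})
print(desc.status_code)
```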
A rule-based system for real-time analysis of control systems
NASA Astrophysics Data System (ADS)
Larson, Richard R.; Millard, D. Edward
1992-10-01
An approach to automate the real-time analysis of flight-critical health monitoring and system status is being developed and evaluated at the NASA Dryden Flight Research Facility. A software package was developed in-house and installed as part of the extended aircraft interrogation and display system. The design features a knowledge base structured as rules that formulate the interpretation and decision logic for real-time data. The technique has been applied to ground verification and validation testing and to flight-test monitoring, where quick, real-time, safety-of-flight decisions can be critical; in many cases it eliminates the need for post-processing and manual analysis of flight system data. The processing of real-time data for analysis is described, along with the output format, which features a message stack display. The development, construction, and testing of the rule-driven knowledge base, along with an application from the X-31A flight test program, are presented.
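A toy sketch of the rule-driven knowledge-base idea described above; the rules, channel names, and limits are invented for illustration and are not from the Dryden system:

```python
# Toy rule-driven knowledge base for real-time health monitoring, in the
# spirit of the system described above; all rules and limits are invented.
rules = [
    ("HYD PRESS LOW",  lambda d: d["hyd_press"] < 2800.0),
    ("EGT OVERTEMP",   lambda d: d["egt"] > 950.0),
    ("AOA LIMIT",      lambda d: abs(d["alpha"]) > 25.0),
]

message_stack = []  # newest messages shown first, as in a message stack display

def evaluate(frame):
    """Run every rule against one frame of real-time data."""
    for name, predicate in rules:
        if predicate(frame):
            message_stack.insert(0, name)

evaluate({"hyd_press": 2650.0, "egt": 930.0, "alpha": 4.2})
print(message_stack)   # ['HYD PRESS LOW']
```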
Corvari, Vincent; Narhi, Linda O; Spitznagel, Thomas M; Afonina, Nataliya; Cao, Shawn; Cash, Patricia; Cecchini, Irene; DeFelippis, Michael R; Garidel, Patrick; Herre, Andrea; Koulov, Atanas V; Lubiniecki, Tony; Mahler, Hanns-Christian; Mangiagalli, Paolo; Nesta, Douglas; Perez-Ramirez, Bernardo; Polozova, Alla; Rossi, Mara; Schmidt, Roland; Simler, Robert; Singh, Satish; Weiskopf, Andrew; Wuchner, Klaus
2015-11-01
Measurement and characterization of subvisible particles (including proteinaceous and non-proteinaceous particulate matter) is an important aspect of the pharmaceutical development process for biotherapeutics. Health authorities have increased expectations for subvisible particle data beyond criteria specified in the pharmacopeia and covering a wider size range. In addition, subvisible particle data is being requested for samples exposed to various stress conditions and to support process/product changes. Consequently, subvisible particle analysis has expanded beyond routine testing of finished dosage forms using traditional compendial methods. Over the past decade, advances have been made in the detection and understanding of subvisible particle formation. This article presents industry case studies to illustrate the implementation of strategies for subvisible particle analysis as a characterization tool to assess the nature of the particulate matter and applications in drug product development, stability studies and post-marketing changes. Copyright © 2015 The International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.
Session on High Speed Civil Transport Design Capability Using MDO and High Performance Computing
NASA Technical Reports Server (NTRS)
Rehder, Joe
2000-01-01
Since the inception of CAS in 1992, NASA Langley has been conducting research into applying multidisciplinary optimization (MDO) and high performance computing toward reducing aircraft design cycle time. The focus of this research has been the development of a series of computational frameworks and associated applications that increased in capability, complexity, and performance over time. The culmination of this effort is an automated high-fidelity analysis capability for a high speed civil transport (HSCT) vehicle installed on a network of heterogeneous computers with a computational framework built using Common Object Request Broker Architecture (CORBA) and Java. The main focus of the research in the early years was the development of the Framework for Interdisciplinary Design Optimization (FIDO) and associated HSCT applications. While the FIDO effort was eventually halted, work continued on HSCT applications of ever increasing complexity. The current application, HSCT4.0, employs high fidelity CFD and FEM analysis codes. For each analysis cycle, the vehicle geometry and computational grids are updated using new values for design variables. Processes for aeroelastic trim, loads convergence, displacement transfer, stress and buckling, and performance have been developed. In all, a total of 70 processes are integrated in the analysis framework. Many of the key processes include automatic differentiation capabilities to provide sensitivity information that can be used in optimization. A software engineering process was developed to manage this large project. Defining the interactions among 70 processes turned out to be an enormous, but essential, task. A formal requirements document was prepared that defined data flow among processes and subprocesses. A design document was then developed that translated the requirements into actual software design. A validation program was defined and implemented to ensure that codes integrated into the framework produced the same results as their standalone counterparts. Finally, a Commercial Off the Shelf (COTS) configuration management system was used to organize the software development. A computational environment, CJOpt, based on CORBA and the Java programming language, has been developed as a framework for multidisciplinary analysis and optimization. The environment exploits the parallelism inherent in the application and distributes the constituent disciplines on machines best suited to their needs. In CJOpt, a discipline code is "wrapped" as an object. An interface to the object identifies the functionality (services) provided by the discipline, defined in Interface Definition Language (IDL) and implemented using Java. The results of using the HSCT4.0 capability are described, and a summary of lessons learned is presented. The use of some of the processes, codes, and techniques by industry is highlighted, and the application of the methodology developed in this research to other aircraft is described. Finally, we show how the experience gained is being applied to entirely new vehicles, such as the Reusable Space Transportation System.
NASA Technical Reports Server (NTRS)
Erickson, W. K.; Hofman, L. B.; Donovan, W. E.
1984-01-01
Difficulties in the digital image analysis of remotely sensed imagery can arise from the extensive calculations required. In the past, an expensive large or medium mainframe computer system was needed to perform these calculations. For image-processing applications, smaller minicomputer-based systems are now used by many organizations, but the costs of such systems are still in the range of $100K to $300K. Recently, as a result of new developments, the use of low-cost microcomputers for image processing and display systems appears to have become feasible. These developments are related to the advent of the 16-bit microprocessor and the concept of the microcomputer workstation. Earlier 8-bit microcomputer-based image processing systems are briefly examined, and a computer workstation architecture is discussed. Attention is given to a microcomputer workstation developed by Stanford University, and to the design and implementation of a workstation network.
OXSA: An open-source magnetic resonance spectroscopy analysis toolbox in MATLAB.
Purvis, Lucian A B; Clarke, William T; Biasiolli, Luca; Valkovič, Ladislav; Robson, Matthew D; Rodgers, Christopher T
2017-01-01
In vivo magnetic resonance spectroscopy provides insight into metabolism in the human body. New acquisition protocols are often proposed to improve the quality or efficiency of data collection. Processing pipelines must also be developed to use these data optimally. Current fitting software is either targeted at general spectroscopy fitting, or for specific protocols. We therefore introduce the MATLAB-based OXford Spectroscopy Analysis (OXSA) toolbox to allow researchers to rapidly develop their own customised processing pipelines. The toolbox aims to simplify development by: being easy to install and use; seamlessly importing Siemens Digital Imaging and Communications in Medicine (DICOM) standard data; allowing visualisation of spectroscopy data; offering a robust fitting routine; flexibly specifying prior knowledge when fitting; and allowing batch processing of spectra. This article demonstrates how each of these criteria have been fulfilled, and gives technical details about the implementation in MATLAB. The code is freely available to download from https://github.com/oxsatoolbox/oxsa.
Processing of Lunar Soil Simulant for Space Exploration Applications
NASA Technical Reports Server (NTRS)
Sen, Subhayu; Ray, Chandra S.; Reddy, Ramana
2005-01-01
NASA's long-term vision for space exploration includes developing human habitats and conducting scientific investigations on planetary bodies, especially the Moon and Mars. To reduce the required up-mass, processing and utilization of planetary in-situ resources is recognized as an important element of this vision. Within this scope and context, we have undertaken a general effort aimed primarily at extracting and refining metals and developing glass, glass-ceramic, or traditional ceramic type materials from lunar soil simulants. In this paper we will present preliminary results of our efforts on carbothermal reduction of oxides for elemental extraction and on zone refining for obtaining high-purity metals. In addition, we will demonstrate the possibility of developing glasses from lunar soil simulant for fixing nuclear waste from potential nuclear power generators on planetary bodies. Compositional analysis, X-ray diffraction patterns and differential thermal analysis of processed samples will be presented.
Diagnostics of Carbon Nanotube Formation in a Laser Produced Plume
NASA Technical Reports Server (NTRS)
DeBoer, Gary
2000-01-01
This research has involved the analysis and interpretation of spectroscopic data taken over a two-year period from 1998 to 1999 at the Johnson Space Center. The data were taken in an attempt to perform diagnostic studies of the formation of carbon nanotubes in a laser produced plume. Carbon nanotubes hold great promise for the development of new materials with exciting properties. Current production processes are not sufficient to meet research and development needs. A better understanding of the chemical processes involved in carbon nanotube formation will suggest better production processes that would be more able to meet the demands of research and development. Our work has focused on analysis of the emission spectra and laser-induced fluorescence spectra of the carbon dimer, C2, and the laser-induced fluorescence spectra of the nickel atom, which is a necessary reagent in the formation of carbon nanotubes.
NASA Astrophysics Data System (ADS)
Utanto, Yuli; Widhanarto, Ghanis Putra; Maretta, Yoris Adi
2017-03-01
This study aims to develop a web-based portfolio model. The model developed in this study revealed its effectiveness in experiments conducted with research respondents in the Department of Curriculum and Educational Technology, FIP Unnes. In particular, the further research objectives to be achieved through this development research are: (1) describing the process of implementing a portfolio in a web-based model; and (2) assessing the effectiveness of the web-based portfolio model for the final task, especially in Web-Based Learning courses. This type of research is development research; Borg and Gall (2008: 589) state that "educational research and development (R & D) is a process used to develop and validate educational products". The series of research and development steps carried out began with exploration and conceptual studies, followed by testing and evaluation, and then implementation. For the data analysis, the techniques used were simple descriptive analysis and analysis of learning completeness, followed by prerequisite tests for normality and homogeneity before performing a t-test. Based on the data analysis, it was concluded that: (1) a web-based portfolio model can be applied to the learning process in higher education; and (2) the web-based portfolio model is effective: among the respondents of the large-group (field) trial, the number who reached learning mastery (a score of 60 and above) was 24 (92.3%). The implication of this development research is that subsequent researchers should be able to use the development model from this study as a guideline to be developed for other subjects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mata, Pedro; Fuente, Rafael de la; Iglesias, Javier
Iberdrola (Spanish utility) and Iberdrola Ingenieria (its engineering branch) have been developing during the last two years the 110% Extended Power Up-rate Project (EPU 110%) for Cofrentes BWR-6. Iberdrola has an in-house design and licensing reload methodology that has been approved by the Spanish Nuclear Regulatory Authority. This methodology has already been used to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 to 14. The methodology has also been applied to develop a significant number of safety analyses for the Cofrentes Extended Power Up-rate, including: Reactor Heat Balance, Core and Fuel Performance, Thermal Hydraulic Stability, ECCS LOCA Evaluation, Transient Analysis, Anticipated Transient Without Scram (ATWS) and Station Blackout (SBO). Since the scope of the licensing process of the Cofrentes Extended Power Up-rate exceeds the range of analyses included in the Cofrentes generic reload licensing process, it has been necessary to extend the applicability of the Cofrentes licensing methodology to the analysis of new transients. This is the case for the TLFW (total loss of feedwater) transient. This paper shows the benefits of having an in-house design and licensing methodology and describes the process of extending the applicability of the methodology to the analysis of new transients. The analysis of total loss of feedwater with the Cofrentes RETRAN model is included as an example of this process. (authors)
Systems Engineering Provides Successful High Temperature Steam Electrolysis Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Charles V. Park; Emmanuel Ohene Opare, Jr.
2011-06-01
This paper describes two systems engineering studies completed at the Idaho National Laboratory (INL) to support development of the High Temperature Steam Electrolysis (HTSE) process. HTSE produces hydrogen from water using nuclear power and was selected by the Department of Energy (DOE) for integration with the Next Generation Nuclear Plant (NGNP). The first study was a reliability, availability and maintainability (RAM) analysis to identify critical areas for technology development based on available information regarding expected component performance. An HTSE process baseline flowsheet at commercial scale was used as a basis. The NGNP project also established a process and capability to perform future RAM analyses. The analysis identified which components had the greatest impact on HTSE process availability and indicated that the HTSE process could achieve over 90% availability. The second study developed a series of life-cycle cost estimates for the various scale-ups required to demonstrate the HTSE process. Both studies were useful in identifying near- and long-term efforts necessary for successful HTSE process deployment. The size of demonstrations to support scale-up was refined, which is essential to estimating near- and long-term cost and schedule. The life-cycle funding profile, with high-level allocations, was identified as the program transitions from experiment-scale R&D to engineering-scale demonstration.
Li, Liang; Wang, Yiying; Xu, Jiting; Flora, Joseph R V; Hoque, Shamia; Berge, Nicole D
2018-08-01
Hydrothermal carbonization (HTC) is a wet, low temperature thermal conversion process that continues to gain attention for the generation of hydrochar. The importance of specific process conditions and feedstock properties on hydrochar characteristics is not well understood. To evaluate this, linear and non-linear models were developed to describe hydrochar characteristics based on data collected from HTC-related literature. A Sobol analysis was subsequently conducted to identify parameters that most influence hydrochar characteristics. Results from this analysis indicate that for each investigated hydrochar property, the model fit and predictive capability associated with the random forest models is superior to both the linear and regression tree models. Based on results from the Sobol analysis, the feedstock properties and process conditions most influential on hydrochar yield, carbon content, and energy content were identified. In addition, a variational process parameter sensitivity analysis was conducted to determine how feedstock property importance changes with process conditions. Copyright © 2018 Elsevier Ltd. All rights reserved.
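A sketch of the modeling-plus-sensitivity workflow the abstract describes, using scikit-learn for the random forest and SALib for the Sobol analysis; the variable names, bounds, and synthetic data are assumptions, not values from the study:

```python
# Fit a random forest to synthetic HTC-style data, then run a Sobol
# sensitivity analysis on the fitted model with SALib.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from SALib.sample import saltelli
from SALib.analyze import sobol

rng = np.random.default_rng(2)
# Stand-ins for process conditions -> hydrochar yield (invented relationship).
X_train = rng.random((200, 3)) * [120.0, 24.0, 0.5] + [180.0, 1.0, 0.05]
y_train = 80 - 0.1 * X_train[:, 0] + 0.5 * X_train[:, 1] + rng.normal(0, 1, 200)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

problem = {
    "num_vars": 3,
    "names": ["temperature_C", "time_h", "solids_fraction"],   # assumed inputs
    "bounds": [[180.0, 300.0], [1.0, 25.0], [0.05, 0.55]],
}
X_sobol = saltelli.sample(problem, 1024)        # Saltelli sampling scheme
Si = sobol.analyze(problem, model.predict(X_sobol))
print(dict(zip(problem["names"], Si["S1"])))    # first-order Sobol indices
```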
DDS-Suite - A Dynamic Data Acquisition, Processing, and Analysis System for Wind Tunnel Testing
NASA Technical Reports Server (NTRS)
Burnside, Jathan J.
2012-01-01
Wind tunnels have optimized their steady-state data systems for acquisition and analysis and have even implemented large dynamic-data acquisition systems; however, development of near real-time processing and analysis tools for dynamic data has lagged. DDS-Suite is a set of tools used to acquire, process, and analyze large amounts of dynamic data. Each phase of the testing process (acquisition, processing, and analysis) is handled by a separate component, so that bottlenecks in one phase do not affect the others, leading to a robust system. DDS-Suite is capable of acquiring 672 channels of dynamic data at a rate of 275 MB/s. More than 300 channels of the system use 24-bit analog-to-digital cards and are capable of producing data with less than 0.01 of phase difference at 1 kHz. System architecture, design philosophy, and examples of use during NASA Constellation and Fundamental Aerodynamic tests are discussed.
Efficient Process Migration for Parallel Processing on Non-Dedicated Networks of Workstations
NASA Technical Reports Server (NTRS)
Chanchio, Kasidit; Sun, Xian-He
1996-01-01
This paper presents the design and preliminary implementation of MpPVM, a software system that supports process migration for PVM application programs in a non-dedicated heterogeneous computing environment. The new concepts of migration points, migration point analysis, and necessary data analysis are introduced. In MpPVM, process migrations occur only at previously inserted migration points. Migration point analysis determines appropriate locations to insert migration points, whereas necessary data analysis provides a minimum set of variables to be transferred at each migration point. A new methodology to perform reliable point-to-point data communications in a migration environment is also discussed. Finally, a preliminary implementation of MpPVM and its experimental results are presented, showing the correctness and promising performance of our process migration mechanism in a scalable non-dedicated heterogeneous computing environment. While MpPVM is developed on top of PVM, the process migration methodology introduced in this study is general and can be applied to any distributed software environment.
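MpPVM itself operates on PVM programs, but the migration point concept can be illustrated in a few lines: at a pre-inserted point, only the minimal live-variable set identified by necessary data analysis is captured and shipped. A purely conceptual Python sketch:

```python
# Conceptual sketch of a migration point: at a pre-inserted point, only the
# "necessary data" (the minimal live-variable set) is captured so the process
# could resume elsewhere. Entirely illustrative, not MpPVM code.
import pickle

def long_computation(state=None):
    # Restore the minimal variable set if resuming after a migration.
    i, total = (0, 0.0) if state is None else (state["i"], state["total"])
    while i < 1_000_000:
        total += i * 0.5
        i += 1
        if i % 250_000 == 0:                 # a pre-inserted migration point
            checkpoint = pickle.dumps({"i": i, "total": total})
            # ...in a real system, ship `checkpoint` to the destination host...
            i, total = pickle.loads(checkpoint).values()  # here: resume locally
    return total

print(long_computation())
```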
NASA Astrophysics Data System (ADS)
Dunn, Michael
2008-10-01
For over 30 years, the Oak Ridge National Laboratory (ORNL) has performed research and development to provide more accurate nuclear cross-section data in the resonance region. The ORNL Nuclear Data (ND) Program consists of four complementary areas of research: (1) cross-section measurements at the Oak Ridge Electron Linear Accelerator; (2) resonance analysis methods development with the SAMMY R-matrix analysis software; (3) cross-section evaluation development; and (4) cross-section processing methods development with the AMPX software system. The ND Program is tightly coupled with nuclear fuel cycle analyses and radiation transport methods development efforts at ORNL. Thus, nuclear data work is performed in concert with nuclear science and technology needs and requirements. Recent advances in each component of the ORNL ND Program have led to improvements in resonance region measurements, R-matrix analyses, cross-section evaluations, and processing capabilities that directly support radiation transport research and development. Of particular importance are the improvements in cross-section covariance data evaluation and processing capabilities. The benefit of these advances to nuclear science and technology research and development will be discussed during the symposium on Nuclear Physics Research Connections to Nuclear Energy.
NASA Astrophysics Data System (ADS)
Yu, Long; Druckenbrod, Markus; Greve, Martin; Wang, Ke-qi; Abdel-Maksoud, Moustafa
2015-10-01
A fully automated optimization process is presented for the design of ducted propellers under open-water conditions, including 3D geometry modeling, meshing, optimization algorithms and CFD analysis techniques. The developed process allows the direct integration of a RANSE solver in the design stage. A practical ducted propeller design case study is carried out for validation. Numerical simulations and open-water tests were performed and proved that the optimized ducted propeller improves hydrodynamic performance as predicted.
Development of a data acquisition system using a RISC/UNIX workstation
NASA Astrophysics Data System (ADS)
Takeuchi, Y.; Tanimori, T.; Yasu, Y.
1993-05-01
We have developed a compact data acquisition system on RISC/UNIX workstations. A Sun SPARCstation IPC was used, in which an extension bus ("SBus") was linked to a VMEbus. The transfer rate achieved was better than 7 Mbyte/s between the VMEbus and the Sun. A device driver for CAMAC was developed in order to realize interrupt handling in UNIX. In addition, list processing has been incorporated in order to keep the high priority of the data handling process in UNIX. The successful development of both the device driver and list processing has made it possible to realize good real-time behaviour on the RISC/UNIX system. Based on this architecture, a portable and versatile data taking system has been developed, which consists of a graphical user interface, I/O handler, user analysis process, process manager and a CAMAC device driver.
Environmental impact of mushroom compost production.
Leiva, Francisco; Saenz-Díez, Juan-Carlos; Martínez, Eduardo; Jiménez, Emilio; Blanco, Julio
2016-09-01
This research analyses the environmental impact of the creation of Agaricus bisporus compost packages. The composting process is the intermediate stage of the mushroom production process, subsequent to the mycelium cultivation stage and prior to the fruiting bodies cultivation stage. A full life cycle assessment model of the Agaricus bisporus composting process has been developed through the identification and analysis of the inputs-outputs and energy consumption of the activities involved in the production process. The study has been developed based on data collected from a plant during a 1 year campaign, thereby obtaining accurate information used to analyse the environmental impact of the process. A global analysis of the main stages of the process shows that the process that has the greatest impact in most categories is the compost batch preparation process. This is due to an increased consumption of energy resources by the machinery that mixes the raw materials to create the batch. At the composting process inside the tunnel stage, the activity that has the greatest impact in almost all categories studied is the initial stage of composting. This is due to higher energy consumption during the process compared to the other stages. © 2015 Society of Chemical Industry.
Wang, Lei; Sun, Xiaoliang; Weiszmann, Jakob; Weckwerth, Wolfram
2017-01-01
Grapevine is a fruit crop with worldwide economic importance. The grape berry undergoes complex biochemical changes from fruit set until ripening. This ripening process and production processes define the wine quality. Thus, a thorough understanding of berry ripening is crucial for the prediction of wine quality. For a systemic analysis of grape berry development we applied mass spectrometry based platforms to analyse the metabolome and proteome of Early Campbell at 12 stages covering major developmental phases. Primary metabolites involved in central carbon metabolism, such as sugars, organic acids and amino acids together with various bioactive secondary metabolites like flavonols, flavan-3-ols and anthocyanins were annotated and quantified. At the same time, the proteomic analysis revealed the protein dynamics of the developing grape berries. Multivariate statistical analysis of the integrated metabolomic and proteomic dataset revealed the growth trajectory and corresponding metabolites and proteins contributing most to the specific developmental process. K-means clustering analysis revealed 12 highly specific clusters of co-regulated metabolites and proteins. Granger causality network analysis allowed for the identification of time-shift correlations between metabolite-metabolite, protein- protein and protein-metabolite pairs which is especially interesting for the understanding of developmental processes. The integration of metabolite and protein dynamics with their corresponding biochemical pathways revealed an energy-linked metabolism before veraison with high abundances of amino acids and accumulation of organic acids, followed by protein and secondary metabolite synthesis. Anthocyanins were strongly accumulated after veraison whereas other flavonoids were in higher abundance at early developmental stages and decreased during the grape berry developmental processes. A comparison of the anthocyanin profile of Early Campbell to other cultivars revealed similarities to Concord grape and indicates the strong effect of genetic background on metabolic partitioning in primary and secondary metabolism.
Rocketdyne PSAM: In-house enhancement/application
NASA Technical Reports Server (NTRS)
Newell, J. F.; Rajagopal, K. R.; Ohara, K.
1991-01-01
The development of the Probabilistic Design Analysis (PDA) process for rocket engines was initiated. This will give engineers a quantitative assessment of calculated reliability during the design process. The PDA will help choose better designs, make them more robust, and help decide on critical tests to demonstrate key reliability issues and thereby improve confidence in engine capabilities. Rocketdyne's involvement with the Composite Loads Spectra (CLS) and Probabilistic Structural Analysis Methodology (PSAM) contracts started this effort, and these are key elements in the ongoing developments. Internal development efforts and hardware applications complement and extend the CLS and PSAM efforts. The completion of the CLS option work and the follow-on PSAM developments will also be integral parts of this methodology. A brief summary of these efforts is presented.
Automated Student Aid Processing: The Challenge and Opportunity.
ERIC Educational Resources Information Center
St. John, Edward P.
1985-01-01
To utilize automated technology for student aid processing, it is necessary to work with multi-institutional offices (student aid, admissions, registration, and business) and to develop automated interfaces with external processing systems at state and federal agencies and perhaps at need-analysis organizations and lenders. (MLW)
Striepe, Meg I; Tolman, Deborah L
2003-12-01
Little attention has been given to how femininity and masculinity ideologies affect sexual-identity development. Differentiating violations of conventional femininity and masculinity ideologies as part of an overt process of sexual-identity development in sexual-minority adolescents suggested the possibility of a parallel process among heterosexual adolescents. Based on feminist theory and an analysis of heterosexual adolescents' narratives about relationships, the importance of negotiating femininity and masculinity ideologies as part of sexual-identity development for all adolescents is described.
USEPA EXAMPLE EXIT LEVEL ANALYSIS RESULTS
Developed by NERL/ERD for the Office of Solid Waste, the enclosed product provides an example uncertainty analysis (UA) and initial process-based sensitivity analysis (SA) of hazardous waste "exit" concentrations for 7 chemicals and metals using the 3MRA Version 1.0 Modeling Syst...
FASEA: A FPGA Acquisition System and Software Event Analysis for liquid scintillation counting
NASA Astrophysics Data System (ADS)
Steele, T.; Mo, L.; Bignell, L.; Smith, M.; Alexiev, D.
2009-10-01
The FASEA (FPGA-based Acquisition and Software Event Analysis) system has been developed to replace the MAC3 for coincidence pulse processing. The system uses a National Instruments Virtex-5 FPGA card (PXI-7842R) for data acquisition and purpose-developed data analysis software. Initial comparisons to the MAC3 unit, based on measurements of 89Sr and 3H, confirm that the system is able to accurately emulate the behaviour of the MAC3 unit.
NASA Technical Reports Server (NTRS)
Dickinson, William B.
1995-01-01
An Earth Sciences Data and Information System (ESDIS) Project Management Plan (PMP) is prepared, along with an ESDIS Project Systems Engineering Management Plan (SEMP) consistent with the PMP. ESDIS and related EOS program requirements development, management, and analysis processes are evaluated, and opportunities to improve the effectiveness of these processes and program/project responsiveness to requirements are identified. Overall ESDIS cost estimation processes are evaluated, and recommendations to improve cost estimating and modeling techniques are developed. ESDIS schedules and scheduling tools are evaluated. Risk assessment, risk mitigation strategies and approaches, and the use of risk information in management decision-making are addressed.
NASA Technical Reports Server (NTRS)
Head, J. W. (Editor)
1978-01-01
Developments reported at a meeting of principal investigators for NASA's planetary geology program are summarized. Topics covered include: constraints on solar system formation; asteroids, comets, and satellites; constraints on planetary interiors; volatiles and regoliths; instrument development techniques; planetary cartography; geological and geochemical constraints on planetary evolution; fluvial processes and channel formation; volcanic processes; eolian processes; radar studies of planetary surfaces; cratering as a process, landform, and dating method; and the Tharsis region of Mars. Activities at a planetary geology field conference on eolian processes are reported, and techniques recommended for the presentation and analysis of crater size-frequency data are included.
PLANiTS : structuring and supporting the intelligent transportation systems planning process
DOT National Transportation Integrated Search
1997-01-01
PLANiTS (Planning and Analysis Integration for Intelligent Transportation Systems) is a process-based computer system that supports a series of mutually interdependent steps progressing toward developing and programming transportation improvement pro...
Tang, Qi-Yi; Zhang, Chuan-Xi
2013-04-01
A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.
Image processing and analysis of Saturn's rings
NASA Technical Reports Server (NTRS)
Yagi, G. M.; Jepsen, P. L.; Garneau, G. W.; Mosher, J. A.; Doyle, L. R.; Lorre, J. J.; Avis, C. C.; Korsmo, E. P.
1981-01-01
Processing of Voyager image data of Saturn's rings at JPL's Image Processing Laboratory is described. A software system to navigate the flight images, facilitate feature tracking, and project the rings has been developed. This system has been used to measure ring radii and the velocities of the spoke features in the B-Ring. A projected ring movie to study the development of these spoke features has been generated. Finally, processing to facilitate comparison of the photometric properties of Saturn's rings at various phase angles is described.
NASA Technical Reports Server (NTRS)
1981-01-01
The engineering design, fabrication, assembly, operation, economic analysis, and process support R&D for an Experimental Process System Development Unit (EPSDU) are reported. About 95% of the purchased equipment has been received and will be reshipped to the West Coast location. The data collection system is completed. In the area of melting/consolidation, melting and shotting on a pseudocontinuous basis was demonstrated in the system using silicon powder transfer. It is proposed to continue the very promising fluid-bed work.
CFD Process Pre- and Post-processing Automation in Support of Space Propulsion
NASA Technical Reports Server (NTRS)
Dorney, Suzanne M.
2003-01-01
The use of Computational Fluid Dynamics (CFD) has become standard practice in the design and analysis of the major components used for space propulsion. In an attempt to standardize and improve the CFD process, a series of automated tools have been developed. Through the use of these automated tools, the application of CFD to the design cycle has been improved and streamlined. This paper presents a series of applications in which deficiencies were identified in the CFD process and corrected through the development of automated tools.
Compact full-motion video hyperspectral cameras: development, image processing, and applications
NASA Astrophysics Data System (ADS)
Kanaev, A. V.
2015-10-01
Emergence of spectral pixel-level color filters has enabled development of hyperspectral full-motion video (FMV) sensors operating at visible (EO) and infrared (IR) wavelengths. This new class of hyperspectral cameras opens broad possibilities for military and industrial use. Such cameras can classify materials and detect and track spectral signatures continuously in real time while simultaneously providing the operator the benefit of enhanced-discrimination color video. Supporting these capabilities requires significant computational processing of the collected spectral data. In general, two processing streams are envisioned for mosaic array cameras. The first is spectral computation, which provides essential spectral content analysis, e.g. detection or classification. The second is presentation of the video to an operator, offering the best display of the content for the task at hand, e.g. spatial resolution enhancement or color coding of the spectral analysis. These processing streams can be executed in parallel, or they can use each other's results. Spectral analysis algorithms have been developed extensively; however, demosaicking of more than three equally sampled spectral bands has been explored only scarcely. We present a unique approach to demosaicking based on multi-band super-resolution and show the trade-off between spatial resolution and spectral content. Using imagery collected with the developed 9-band SWIR camera, we demonstrate several concepts of operation, including detection and tracking. We also compare the demosaicking results to the results of multi-frame super-resolution as well as to combined multi-frame and multi-band processing.
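As a baseline for the demosaicking problem described above, each band of a 3x3 (9-band) mosaic filter array can be recovered by per-band interpolation; the multi-band super-resolution approach trades against this. A minimal sketch with synthetic data:

```python
# Naive demosaicking sketch for a 3x3 (9-band) mosaic filter array: each band
# is sampled on every third pixel, then interpolated back to full resolution.
# Illustrative baseline only, not the paper's super-resolution method.
import numpy as np
from scipy.ndimage import zoom

rng = np.random.default_rng(3)
mosaic = rng.random((120, 120))          # raw sensor frame, 3x3 repeating CFA

bands = []
for r in range(3):
    for c in range(3):
        sparse = mosaic[r::3, c::3]      # samples belonging to band (r, c)
        bands.append(zoom(sparse, 3, order=1))   # bilinear upsampling
cube = np.stack(bands)                   # (9, 120, 120) hyperspectral cube
print(cube.shape)
```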
Energy-absorption capability of composite tubes and beams. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Farley, Gary L.; Jones, Robert M.
1989-01-01
The objective of this study was to develop a method for predicting the energy-absorption capability of composite subfloor beam structures. Before such an analysis capability can be developed, an in-depth understanding of the crushing process of composite materials must be achieved. Many variables affect the crushing process of composite structures, such as the constituent materials' mechanical properties, specimen geometry, and crushing speed. A comprehensive experimental evaluation of tube specimens was conducted to develop insight into how composite structural elements crush and what the controlling mechanisms are. The four characteristic crushing modes (transverse shearing, brittle fracturing, lamina bending, and local buckling) were identified, and the mechanisms that control the crushing process were defined. An in-depth understanding was developed of how material properties affect energy-absorption capability. For example, an increase in fiber and matrix stiffness and failure strain can, depending upon the configuration of the tube, increase energy-absorption capability. An analysis to predict the energy-absorption capability of composite tube specimens was developed and verified. Good agreement between experiment and prediction was obtained.
The Challenges of Credible Thermal Protection System Reliability Quantification
NASA Technical Reports Server (NTRS)
Green, Lawrence L.
2013-01-01
The paper discusses several of the challenges associated with developing a credible reliability estimate for a human-rated crew capsule thermal protection system. The process of developing such a credible estimate is subject to the quantification, modeling and propagation of numerous uncertainties within a probabilistic analysis. The development of specific investment recommendations, to improve the reliability prediction, among various potential testing and programmatic options is then accomplished through Bayesian analysis.
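As a toy illustration of the kind of Bayesian updating such a reliability assessment might employ (not the paper's actual model), a beta-binomial update of a reliability estimate from hypothetical test results:

```python
# Toy beta-binomial sketch: a Beta prior on TPS reliability updated with
# hypothetical test outcomes. All prior and test numbers are invented.
from scipy.stats import beta

a, b = 9.0, 1.0                      # prior: mean reliability ~0.9
successes, failures = 20, 1          # hypothetical test campaign results
post = beta(a + successes, b + failures)
print(f"posterior mean reliability = {post.mean():.3f}")
print(f"5th-percentile lower bound = {post.ppf(0.05):.3f}")
```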
The development of a science process assessment for fourth-grade students
NASA Astrophysics Data System (ADS)
Smith, Kathleen A.; Welliver, Paul W.
In this study, a multiple-choice test entitled the Science Process Assessment was developed to measure the science process skills of students in grade four. Based on the Recommended Science Competency Continuum for Grades K to 6 for Pennsylvania Schools, this instrument measured the skills of (1) observing, (2) classifying, (3) inferring, (4) predicting, (5) measuring, (6) communicating, (7) using space/time relations, (8) defining operationally, (9) formulating hypotheses, (10) experimenting, (11) recognizing variables, (12) interpreting data, and (13) formulating models. To prepare the instrument, classroom teachers and science educators were invited to participate in two science education workshops designed to develop an item bank of test questions applicable to measuring process skill learning. Participants formed writing teams and generated 65 test items representing the 13 process skills. After a comprehensive group critique of each item, 61 items were identified for inclusion into the Science Process Assessment item bank. To establish content validity, the item bank was submitted to a select panel of science educators for the purpose of judging item acceptability. This analysis yielded 55 acceptable test items and produced the Science Process Assessment, Pilot 1. Pilot 1 was administered to 184 fourth-grade students. Students were given a copy of the test booklet; teachers read each test aloud to the students. Upon completion of this first administration, data from the item analysis yielded a reliability coefficient of 0.73. Subsequently, 40 test items were identified for the Science Process Assessment, Pilot 2. Using the test-retest method, the Science Process Assessment, Pilot 2 (Test 1 and Test 2) was administered to 113 fourth-grade students. Reliability coefficients of 0.80 and 0.82, respectively, were ascertained. The correlation between Test 1 and Test 2 was 0.77. The results of this study indicate that (1) the Science Process Assessment, Pilot 2, is a valid and reliable instrument applicable to measuring the science process skills of students in grade four, (2) using educational workshops as a means of developing item banks of test questions is viable and productive in the test development process, and (3) involving classroom teachers and science educators in the test development process is educationally efficient and effective.
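The test-retest figures reported above reduce to a correlation between two administrations of the instrument. A brief sketch of that computation on synthetic scores (the real study data are not reproduced here):

```python
# Test-retest reliability as a Pearson correlation between two administrations
# of the same instrument; scores below are synthetic stand-ins.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(4)
test1 = rng.integers(10, 40, 113)           # 113 fourth-grade students
test2 = test1 + rng.integers(-4, 5, 113)    # retest with some noise

r, p = pearsonr(test1, test2)
print(f"test-retest correlation r = {r:.2f}")   # the study reported r = 0.77
```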
1-G Human Factors for Optimal Processing and Operability of Ground Systems Up to CxP GOP PDR
NASA Technical Reports Server (NTRS)
Stambolian, Damon B.; Henderson, Gena; Miller, Darcy; Prevost, Gary; Tran, Donald; Barth, Tim
2011-01-01
This slide presentation reviews the development and use of a process and tool for developing human factors requirements and improving the design for ground operations. A Human Factors Engineering Analysis (HFEA) Tool was developed to create a dedicated subset of requirements, drawn from the FAA requirements, for each subsystem. As an example, the use of the human interface with an actuator motor is considered.
Standardization of pitch-range settings in voice acoustic analysis.
Vogel, Adam P; Maruff, Paul; Snyder, Peter J; Mundt, James C
2009-05-01
Voice acoustic analysis is typically a labor-intensive, time-consuming process that requires the application of idiosyncratic parameters tailored to individual aspects of the speech signal. Such processes limit the efficiency and utility of voice analysis in clinical practice as well as in applied research and development. In the present study, we analyzed 1,120 voice files, using standard techniques (case-by-case hand analysis), taking roughly 10 work weeks of personnel time to complete. The results were compared with the analytic output of several automated analysis scripts that made use of preset pitch-range parameters. After pitch windows were selected to appropriately account for sex differences, the automated analysis scripts reduced processing time of the 1,120 speech samples to less than 2.5 h and produced results comparable to those obtained with hand analysis. However, caution should be exercised when applying the suggested preset values to pathological voice populations.
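A minimal sketch of the scripted approach described above: an autocorrelation F0 estimate restricted to preset, sex-specific pitch windows. The window values are common defaults assumed for illustration, not the parameters used in the study:

```python
# Autocorrelation pitch estimate with preset, sex-specific pitch windows,
# mimicking a scripted batch analysis; window presets are assumptions.
import numpy as np

PITCH_WINDOWS_HZ = {"male": (75, 300), "female": (100, 500)}  # assumed presets

def estimate_f0(signal, fs, sex):
    """F0 estimate from the autocorrelation peak inside the preset window."""
    lo, hi = PITCH_WINDOWS_HZ[sex]
    ac = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    lag_min, lag_max = int(fs / hi), int(fs / lo)
    best_lag = lag_min + np.argmax(ac[lag_min:lag_max])
    return fs / best_lag

fs = 8000
t = np.arange(fs // 2) / fs                       # 0.5 s of signal
voice = np.sin(2 * np.pi * 200.0 * t)             # synthetic 200 Hz "voice"
print(estimate_f0(voice, fs, "male"))             # 200.0
```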
Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.
A method for identifying EMI critical circuits during development of a large C3
NASA Astrophysics Data System (ADS)
Barr, Douglas H.
The circuit analysis methods and process that Boeing Aerospace used on a large, ground-based military command, control, and communications (C3) system are described. The analysis was designed to help identify electromagnetic interference (EMI) critical circuits. The methodology used the MIL-E-6051 equipment criticality categories as the basis for defining critical circuits, relational database technology to help sort through and account for all of the approximately 5000 system signal cables, and Macintosh Plus personal computers to predict critical circuits based on safety margin analysis. The EMI circuit analysis process systematically examined all system circuits to identify which ones were likely to be EMI critical. The process used two separate, sequential safety margin analyses to identify critical circuits: a conservative safety margin analysis and a detailed safety margin analysis. These analyses used field-to-wire and wire-to-wire coupling models with both worst-case and detailed circuit parameters (physical and electrical) to predict circuit safety margins. This process identified the predicted critical circuits, which could then be verified by test.
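A worked sketch of the safety-margin screening logic described above: a circuit is flagged EMI-critical when the induced level comes within the required margin of its susceptibility threshold. All thresholds, levels, and the margin value are invented for illustration:

```python
# Toy safety-margin screen: flag a circuit as EMI-critical when the margin
# between susceptibility threshold and worst-case induced level falls below
# the required value. MIL-E-6051 margins are category-dependent; 6 dB here
# is an assumed example, as are all circuit values.
import math

REQUIRED_MARGIN_DB = 6.0

def safety_margin_db(susceptibility_v, induced_v):
    return 20.0 * math.log10(susceptibility_v / induced_v)

circuits = {
    "ordnance_fire":  (1.0, 0.70),   # (threshold V, worst-case induced V)
    "telemetry_data": (2.0, 0.05),
}
for name, (threshold, induced) in circuits.items():
    margin = safety_margin_db(threshold, induced)
    flag = "CRITICAL" if margin < REQUIRED_MARGIN_DB else "ok"
    print(f"{name}: margin {margin:5.1f} dB -> {flag}")
```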
Teaching concept analysis to graduate nursing students.
Schiller, Catharine J
2018-04-01
To provide guidance to educators who use the Wilson (1963) concept analysis method, as modified by Walker and Avant (2011), in their graduate nursing curriculum. While graduate nursing curricula often include a concept analysis assignment, there is a paucity of literature to assist educators in guiding students through this challenging process. This article details one way for educators to assist graduate nursing students in learning how to undertake each step of the Wilson (1963) concept analysis method, as modified by Walker and Avant (2011). Using examples, the article walks the reader through the Walker and Avant (2011) concept analysis process and addresses issues commonly encountered by educators during this process. Having clear information about the steps involved in developing a concept analysis will make it easier for educators to incorporate it into their graduate nursing curriculum and to effectively guide students on their journey through the process. © 2018 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Mizell, Carolyn Barrett; Malone, Linda
2007-01-01
The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates have a substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as by human behavior, because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that use discrete event simulation to analyze the effects of process changes, while the organizational environment and its effects on the workforce can be analyzed with system dynamics, which uses continuous simulation. Each has unique strengths, and the benefits of both types can be exploited by combining a system dynamics model with a discrete event process model. This paper demonstrates how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and, ultimately, on cost and schedule.
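A minimal sketch of the hybrid idea: a continuous (system dynamics) state such as morale modulates productivity, while task completions are emitted as discrete events. All rates and constants are invented:

```python
# Hybrid continuous/discrete sketch: a system-dynamics-style morale state
# modulates productivity; each completed task is logged as a discrete event.
dt, horizon = 0.1, 100.0                  # time step and horizon, in days
backlog = 500.0                           # work items remaining
morale = 1.0                              # continuous SD state in [0, 1+]
completed_events = []                     # discrete events: task finish times

t, done = 0.0, 0.0
while t < horizon and backlog > 0:
    productivity = 4.0 * morale           # tasks/day, modulated by morale
    done += productivity * dt
    while done >= 1.0:                    # emit one discrete event per task
        completed_events.append(round(t, 1))
        backlog -= 1
        done -= 1.0
    # System-dynamics flow: morale erodes under backlog pressure, else recovers.
    morale += dt * (0.02 * (1.0 - morale) - 0.0001 * backlog * morale)
    t += dt

print(len(completed_events), "tasks done by day", round(t, 1))
```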
Evaluation of rail test frequencies using risk analysis
DOT National Transportation Integrated Search
2009-03-03
Several industries now use risk analysis to develop : inspection programs to ensure acceptable mechanical integrity : and reliability. These industries include nuclear and electric : power generation, oil refining, gas processing, onshore and : offsh...
Effective approach to spectroscopy and spectral analysis techniques using Matlab
NASA Astrophysics Data System (ADS)
Li, Xiang; Lv, Yong
2017-08-01
With the development of electronic information, computers and networks, modern education technology has entered a new era, which has had a great impact on the teaching process. Spectroscopy and Spectral Analysis is an elective course for Optoelectronic Information Science and Engineering. The teaching objective of the course is to master the basic concepts and principles of spectroscopy and the basic technical means of spectral analysis and testing, and then to let students learn how the principles and technology of spectroscopy are used to study the structure and state of materials, along with the development of the technology. MATLAB (matrix laboratory) is a multi-paradigm numerical computing environment and fourth-generation programming language developed by MathWorks; it allows matrix manipulations and plotting of functions and data. Based on teaching practice, this paper summarizes the application of MATLAB to the teaching of spectroscopy, an approach suitable for most current multimedia-assisted teaching in schools.
A Thematic Analysis of Theoretical Models for Translational Science in Nursing: Mapping the Field
Mitchell, Sandra A.; Fisher, Cheryl A.; Hastings, Clare E.; Silverman, Leanne B.; Wallen, Gwenyth R.
2010-01-01
Background: The quantity and diversity of conceptual models in translational science may complicate rather than advance the use of theory. Purpose: This paper offers a comparative thematic analysis of the models available to inform knowledge development, transfer, and utilization. Method: Literature searches identified 47 models for knowledge translation. Four thematic areas emerged: (1) evidence-based practice and knowledge transformation processes; (2) strategic change to promote adoption of new knowledge; (3) knowledge exchange and synthesis for application and inquiry; (4) designing and interpreting dissemination research. Discussion: This analysis distinguishes the contributions made by leaders and researchers at each phase in the process of discovery, development, and service delivery. It also informs the selection of models to guide activities in knowledge translation. Conclusions: A flexible theoretical stance is essential to simultaneously develop new knowledge and accelerate the translation of that knowledge into practice behaviors and programs of care that support optimal patient outcomes. PMID:21074646
Human Factors Operability Timeline Analysis to Improve the Processing Flow of the Orion Spacecraft
NASA Technical Reports Server (NTRS)
Stambolian, Damon B.; Schlierf, Roland; Miller, Darcy; Posada, Juan; Haddock, Mike; Haddad, Mike; Tran, Donald; Henderon, Gena; Barth, Tim
2011-01-01
This slide presentation reviews the use of human factors and timeline analysis to achieve a more efficient and effective processing flow. The solution involved developing a written timeline of events that included each activity within each functional flow block. Each activity had computer animation videos and pictures of the people and hardware involved. The Human Factors Engineering Analysis Tool (HFEAT) was improved by modifying it to include the timeline of events. The HFEAT was used to define the human factors requirements, and design solutions were developed for these requirements. An example of a functional flow block diagram is shown, and a view from one of the animations (i.e., the short stack pallet) is shown and explained.
NASA Technical Reports Server (NTRS)
Diorio, Kimberly A.; Voska, Ned (Technical Monitor)
2002-01-01
This viewgraph presentation provides information on Human Factors Process Failure Modes and Effects Analysis (HF PFMEA). HF PFMEA includes the following 10 steps: Describe mission; Define system; Identify human-machine; List human actions; Identify potential errors; Identify factors that affect error; Determine likelihood of error; Determine potential effects of errors; Evaluate risk; Generate solutions (manage error). The presentation also describes how this analysis was applied to a liquid oxygen pump acceptance test.
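The presentation does not give its scoring scheme; as a sketch of how steps 7 through 9 (likelihood, effects, risk) are commonly quantified in FMEA-style analyses, the snippet below scores hypothetical human errors on 1-5 scales and flags those whose risk product crosses a threshold. The error list, scales, and threshold are all assumptions.

```python
# (action, likelihood 1-5, severity 1-5) -- hypothetical entries
human_errors = [
    ("misread pump pressure gauge", 3, 4),
    ("skip valve lineup verification", 2, 5),
    ("transpose test-sequence steps", 4, 2),
]

for action, likelihood, severity in human_errors:
    risk = likelihood * severity          # generic multiplicative risk score
    decision = "MANAGE ERROR" if risk >= 10 else "accept"
    print(f"{action:34s} risk={risk:2d} -> {decision}")
```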
Transcriptome analysis during ripening of table grape berry cv. Thompson Seedless
Balic, Iván; Vizoso, Paula; Nilo-Poyanco, Ricardo; Sanhueza, Dayan; Olmedo, Patricio; Sepúlveda, Pablo; Arriagada, Cesar; Defilippi, Bruno G.; Meneses, Claudio; Campos-Vargas, Reinaldo
2018-01-01
Ripening is one of the key processes associated with the development of the major organoleptic characteristics of fruit. This process has been extensively characterized in climacteric fruit, in contrast with non-climacteric fruit such as grape, where the process is less well understood. With the aim of studying changes in gene expression during ripening of non-climacteric fruit, an Illumina-based RNA-Seq transcriptome analysis was performed at four developmental stages, between veraison and harvest, on table grape berries cv. Thompson Seedless. Functional analysis showed a transcriptional increase in genes related to the degradation of chlorophyll and lipids, macromolecule recycling, and nucleosome organization, accompanied by a decrease in genes related to chloroplast integrity and amino acid synthesis pathways. It was possible to identify several processes described during leaf senescence, particularly close to harvest. Before this point, the results suggest high transcriptional activity associated with the regulation of gene expression, cytoskeletal organization, and cell wall metabolism, which can be related to the growth of berries and the firmness loss characteristic of this stage of development. This high metabolic activity could be associated with an increase in the transcription of genes related to glycolysis and respiration, which is unexpected for non-climacteric fruit ripening. PMID:29320527
Quantitative CMMI Assessment for Offshoring through the Analysis of Project Management Repositories
NASA Astrophysics Data System (ADS)
Sunetnanta, Thanwadee; Nobprapai, Ni-On; Gotel, Olly
The nature of distributed teams and the existence of multiple sites in offshore software development projects pose a challenging setting for software process improvement. Often, the improvement and appraisal of software processes is achieved through a turnkey solution in which best practices are imposed or transferred from a company's headquarters to its offshore units. In this setting, successful project health checks and quality monitoring of software processes require strong project management skills and well-built onshore-offshore coordination, and often need regular onsite visits by software process improvement consultants from the headquarters team. This paper focuses on software process improvement as guided by the Capability Maturity Model Integration (CMMI) and proposes a model to evaluate the status of such improvement efforts in the context of distributed multi-site projects without some of this overhead. The paper discusses the application of quantitative CMMI assessment through the collection and analysis of project data gathered directly from project repositories, to facilitate CMMI implementation and reduce its cost for offshore-outsourced software development projects. We exemplify this approach through the analysis of project management data and discuss future directions of this work in progress.
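As a toy illustration of deriving a quantitative process indicator directly from repository records rather than from an on-site appraisal, the sketch below computes per-phase defect containment from a hypothetical defect log. The record layout and the choice of metric are assumptions, not the paper's assessment model.

```python
# Each record: (phase where the defect was injected, phase where it was found)
defect_log = [
    ("design", "design"), ("design", "test"), ("code", "test"),
    ("code", "code"), ("requirements", "test"), ("code", "test"),
]

for phase in ("requirements", "design", "code", "test"):
    found_phases = [found for inj, found in defect_log if inj == phase]
    if found_phases:
        contained = sum(1 for found in found_phases if found == phase)
        print(f"{phase:13s} injected={len(found_phases)} "
              f"containment={contained / len(found_phases):.0%}")
```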
QUAGOL: a guide for qualitative data analysis.
Dierckx de Casterlé, Bernadette; Gastmans, Chris; Bryon, Els; Denier, Yvonne
2012-03-01
Data analysis is a complex and contested part of the qualitative research process that has received limited theoretical attention. Researchers are often in need of useful instructions or guidelines on how to analyze the mass of qualitative data, but face a lack of clear guidance for using particular analytic methods. The aim of this paper is to propose and discuss the Qualitative Analysis Guide of Leuven (QUAGOL), a guide developed to truly capture the rich insights of qualitative interview data. The article describes six major problems researchers often struggle with during the process of qualitative data analysis. The QUAGOL is then proposed as a guide to facilitate that process. Challenges encountered and lessons learned from the authors' own extensive experience with qualitative data analysis within the grounded theory approach, as well as from the experiences of other researchers described in the literature, are discussed and recommendations presented. Strengths and pitfalls of the proposed method are discussed in detail. The QUAGOL offers a comprehensive method to guide the process of qualitative data analysis. The process consists of two parts, each comprising five stages. The method is systematic but not rigid. It is characterized by iterative processes of digging deeper, constantly moving between the various stages; as such, it aims to stimulate the researcher's intuition and creativity as much as possible. The QUAGOL is a theory- and practice-based guide that supports and facilitates the analysis of qualitative interview data. Although the method can facilitate the process of analysis, it cannot guarantee quality automatically. The skills of the researcher and the quality of the research team remain the most crucial components of a successful analysis. Additionally, the importance of constantly moving between the various stages throughout the research process cannot be overstated. Copyright © 2011 Elsevier Ltd. All rights reserved.
Thread concept for automatic task parallelization in image analysis
NASA Astrophysics Data System (ADS)
Lueckenhaus, Maximilian; Eckstein, Wolfgang
1998-09-01
Parallel processing of image analysis tasks is an essential method to speed up image processing and helps to exploit the full capacity of distributed systems. However, writing parallel code is a difficult and time-consuming process and often leads to an architecture-dependent program that has to be re-implemented when the hardware changes. It is therefore highly desirable to perform the parallelization automatically. To this end, we have developed a special kind of thread concept for image analysis tasks. Threads derived from one subtask may share objects and run in the same context while following different threads of execution and working on different data in parallel. In this paper we describe the basics of our thread concept and show how it can be used as the basis of automatic task parallelization to speed up image processing. We further illustrate the design and implementation of an agent-based system that uses image analysis threads for generating and processing parallel programs, taking the available hardware into account. Tests with our system prototype show that the thread concept, combined with the agent paradigm, is suitable for speeding up image processing through automatic parallelization of image analysis tasks.
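This is not the authors' agent-based system, but the core idea, threads spawned from one analysis task sharing the image object while processing disjoint data in parallel, can be sketched with a Python thread pool. The band size and the stand-in analysis operator are arbitrary.

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

image = np.random.rand(1024, 1024)   # shared object, never copied

def analyze_band(rows):
    r0, r1 = rows
    band = image[r0:r1]              # disjoint slice per thread
    return band.mean()               # stand-in for a real analysis operator

bands = [(r, r + 256) for r in range(0, 1024, 256)]
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(analyze_band, bands))

print("per-band results:", [f"{m:.3f}" for m in results])
```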
New technologies for solar energy silicon - Cost analysis of BCL process
NASA Technical Reports Server (NTRS)
Yaws, C. L.; Li, K.-Y.; Fang, C. S.; Lutwack, R.; Hsu, G.; Leven, H.
1980-01-01
New technologies for producing polysilicon are being developed to provide lower-cost material for the solar cells that convert sunlight into electricity. This article presents results for the BCL Process, which produces solar-cell silicon by reduction of silicon tetrachloride with zinc vapor. Cost, sensitivity, and profitability analysis results are presented, based on a preliminary process design of a plant producing 1000 metric tons/year of silicon by the BCL Process. Profitability analysis indicates a sales price of $12.1-19.4 per kg of silicon (1980 dollars) at a 0-25 percent DCF rate of return on investment after taxes. These results indicate good potential for meeting the goal of providing lower-cost material for silicon solar cells.
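As an illustration of the discounted-cash-flow arithmetic behind such a price range, the sketch below bisects for the silicon price at which plant NPV is zero for a chosen rate of return. All figures are hypothetical, not the article's cost data.

```python
investment = 60e6      # plant capital cost, dollars (assumed)
annual_cost = 8e6      # operating cost per year (assumed)
output_kg = 1_000_000  # 1000 metric tons/year
life_years = 10
rate = 0.15            # target DCF rate of return after taxes

def npv(price):
    cash = price * output_kg - annual_cost
    return -investment + sum(cash / (1 + rate) ** y
                             for y in range(1, life_years + 1))

lo, hi = 0.0, 100.0    # bracket the break-even price in $/kg
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if npv(mid) < 0 else (lo, mid)
print(f"break-even price at {rate:.0%} DCF return: ${lo:.2f}/kg")
```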
A qualitative approach to systemic diagnosis of the SSME
NASA Technical Reports Server (NTRS)
Bickmore, Timothy W.; Maul, William A.
1993-01-01
A generic software architecture has been developed for posttest diagnostics of rocket engines and is currently being applied to the posttest analysis of the SSME. This investigation deals with the Systems Section module of the architecture, which is presently under development. Overviews of the manual SSME systems analysis process and of the overall SSME diagnostic system architecture are presented.
ERIC Educational Resources Information Center
Chang, Liang-Te; And Others
A study was conducted to develop electronic technical competencies through duty and task analysis, using a revised DACUM (Developing a Curriculum) method, a questionnaire survey, and a fuzzy synthesis operation. The revised DACUM process relied on inviting electronics trade professionals to analyze electronic technology for entry-level…
The College Football Student-Athlete's Academic Experience: Network Analysis and Model Development
ERIC Educational Resources Information Center
Young, Kyle McLendon
2010-01-01
A grounded theory research study employing network analysis as a means of facilitating the latter stages of the coding process was conducted at a selective university that competes at the highest level of college football. The purpose of the study was to develop a better understanding of how interactive dynamics and controlling mechanisms, such as…
ERIC Educational Resources Information Center
Martin, James L.
This paper reports on the author's attempts to construct a theoretical framework of adult education participation using a theory development process and the corresponding multivariate statistical techniques. Two problems are identified: the lack of a theoretical framework in studying problems, and the limiting of statistical analysis to univariate…
The Role of the Company in Generating Skills. The Learning Effects of Work Organization. Denmark.
ERIC Educational Resources Information Center
Kristensen, Peer Hull; Petersen, James Hopner
The impact of developments in work organizations on the skilling process in Denmark was studied through a macro analysis of available statistical information about the development of workplace training in Denmark and case studies of three Danish firms. The macro analysis focused on the following: Denmark's vocational training system; the Danish…
ERIC Educational Resources Information Center
Meerbaum-Salant, Orni; Hazzan, Orit
2009-01-01
This paper focuses on challenges in mentoring software development projects in the high school and analyzes difficulties encountered by Computer Science teachers in the mentoring process according to Shulman's Teacher Knowledge Base Model. The main difficulties that emerged from the data analysis belong to the following knowledge sources of…
Development Context Driven Change Awareness and Analysis Framework
NASA Technical Reports Server (NTRS)
Sarma, Anita; Branchaud, Josh; Dwyer, Matthew B.; Person, Suzette; Rungta, Neha; Wang, Yurong; Elbaum, Sebastian
2014-01-01
Recent work on workspace monitoring allows conflict prediction early in the development process; however, these approaches mostly use syntactic differencing techniques to compare different program versions. In contrast, traditional change-impact analysis techniques analyze related versions of the program only after the code has been checked into the master repository. We propose a novel approach, DeCAF (Development Context Analysis Framework), that leverages the development context to scope a change impact analysis technique. The goal is to characterize the impact of each developer on other developers in the team. Various client applications, such as task prioritization, early conflict detection, and advice on testing, can benefit from such a characterization. The DeCAF framework leverages information from the development context to bound the iDiSE change impact analysis technique to analyze only the parts of the code base that are of interest. Bounding the analysis enables DeCAF to efficiently compute the impact of changes using a combination of program dependence and symbolic execution based approaches.
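DeCAF's bounding of the iDiSE analysis is far more precise than plain reachability, but the scoping intuition can be sketched as reachability over a module dependence graph: only modules that transitively depend on a developer's changes need analysis. The module names and edges below are hypothetical.

```python
from collections import deque

depends_on_me = {   # reverse dependence edges: module -> its dependents
    "parser": ["compiler", "linter"],
    "compiler": ["ide"],
    "linter": ["ide"],
    "ide": [],
}

def impact_set(changed_modules):
    """Transitive closure of dependents, via breadth-first search."""
    seen, queue = set(changed_modules), deque(changed_modules)
    while queue:
        for dependent in depends_on_me.get(queue.popleft(), []):
            if dependent not in seen:
                seen.add(dependent)
                queue.append(dependent)
    return seen

print(impact_set({"parser"}))  # {'parser', 'compiler', 'linter', 'ide'}
```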
A "Rainmaker" Process for Developing Internet-Based Retail Businesses
ERIC Educational Resources Information Center
Abrahams, Alan S.; Singh, Tirna
2011-01-01
Various systems development life cycles and business development models have been popularized by information systems researchers and practitioners over a number of decades. In the case of systems development life cycles, these have been targeted at software development projects within an organization, typically involving analysis, design,…
Updating National Topographic Data Base Using Change Detection Methods
NASA Astrophysics Data System (ADS)
Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.
2016-06-01
The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time, and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCAs), the updating cycle takes a few years. Today, reality is dynamic and changes occur every day; users therefore expect the existing database to portray the current reality. Global mapping projects based on community volunteers, such as OSM, update their databases every day through crowdsourcing. To fulfil users' requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process included comparing images from different periods. The success rates in identifying objects were low, and results were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, developments in mapping technologies, advances in image processing and computer vision algorithms, and the advent of digital aerial cameras with an NIR band and Very High Resolution satellites have allowed the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, multispectral (MS) classification, MS segmentation, object analysis, and shape-forming algorithms. This article reviews the results of a novel change detection methodology as a first step toward updating the NTDB at the Survey of Israel.
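As a sketch of one ingredient of such a pipeline, the snippet below flags candidate change areas by differencing two epochs of a Digital Surface Model with NumPy. The raster data and the noise threshold are invented for illustration.

```python
import numpy as np

dsm_old = np.random.rand(500, 500) * 30   # surface heights in metres
dsm_new = dsm_old.copy()
dsm_new[100:140, 200:260] += 9.0          # simulate a new building

height_change = dsm_new - dsm_old
candidates = np.abs(height_change) > 2.5  # suppress sub-threshold noise

print(f"{candidates.sum()} changed pixels "
      f"({candidates.mean():.2%} of the scene) flagged for review")
```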
Nancy Diaz; Dean Apostol
1992-01-01
This publication presents a Landscape Design and Analysis Process, along with some simple methods and tools for describing landscapes and their function. The information is qualitative in nature and highlights basic concepts, but does not address landscape ecology in great depth. Readers are encouraged to consult the list of selected references in Chapter 2 if they...
NASA Technical Reports Server (NTRS)
Hsieh, Shang-Hsien
1993-01-01
The principal objective of this research is to develop, test, and implement coarse-grained, parallel-processing strategies for nonlinear dynamic simulations of practical structural problems. There are contributions to four main areas: finite element modeling and analysis of rotational dynamics, numerical algorithms for parallel nonlinear solutions, automatic partitioning techniques to effect load-balancing among processors, and an integrated parallel analysis system.
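The report's own partitioning techniques are not reproduced here; as a sketch of the load-balancing goal, the snippet below distributes finite elements with uneven cost estimates across processors using the classic longest-processing-time greedy rule. Element costs and processor count are invented.

```python
import heapq

element_costs = [9, 7, 6, 5, 5, 4, 3, 2, 2, 1]  # per-element work estimates
n_procs = 3

heap = [(0.0, p, []) for p in range(n_procs)]   # (load, processor, elements)
heapq.heapify(heap)
# Assign the costliest elements first, always to the least-loaded processor.
for eid, cost in sorted(enumerate(element_costs), key=lambda x: -x[1]):
    load, p, elems = heapq.heappop(heap)
    heapq.heappush(heap, (load + cost, p, elems + [eid]))

for load, p, elems in sorted(heap, key=lambda x: x[1]):
    print(f"processor {p}: load={load:.0f} elements={elems}")
```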
Mathematical Sciences Division 1992 Programs
1992-10-01
statistical theory that underlies modern signal analysis. There is a strong emphasis on stochastic processes and time series, particularly those which...include optimal resource planning and real-time scheduling of stochastic shop-floor processes. Scheduling systems will be developed that can adapt to...make forecasts for the length-of-service time series. Protocol analysis of these sessions will be used to identify relevant contextual features and to…
ERIC Educational Resources Information Center
Shintani, Natsuko
2015-01-01
This article reports a meta-analysis of 42 experiments in 33 published studies involving processing instruction (PI) and production-based instruction (PB) used in the PI studies. The comparative effectiveness of PI and PB showed that although PI was more effective than PB for developing receptive knowledge, PB was just as effective as PI for…
ERIC Educational Resources Information Center
Raven, Rob P. J. M.; Heiskanen, Eva; Lovio, Raimo; Hodson, Mike; Brohmann, Bettina
2008-01-01
This article examines how local experiments and negotiation processes contribute to social and field-level learning. The analysis is framed within the niche development literature, which offers a framework for analyzing the relation between projects in local contexts and the transfer of local experiences into generally applicable rules. The…
Integration of sustainability into process simulation of a dairy process
USDA-ARS?s Scientific Manuscript database
Life cycle analysis, a method used to quantify the energy and environmental flows of a process or product, is increasingly utilized by food processors to develop strategies to lessen the carbon footprint of their operations. In the case of the milk supply chain, the method requir...
A Practical Decision-Analysis Process for Forest Ecosystem Management
H. Michael Rauscher; F. Thomas Lloyd; David L. Loftis; Mark J. Twery
2000-01-01
Many authors have pointed out the need to firm up the 'fuzzy' ecosystem management paradigm and develop operationally practical processes to allow forest managers to accommodate more effectively the continuing rapid change in societal perspectives and goals. There are three spatial scales where clear, precise, practical ecosystem management processes are...
Communicative Interaction Processes Involving Non-Vocal Physically Handicapped Children.
ERIC Educational Resources Information Center
Harris, Deberah
1982-01-01
Communication prostheses are a critical part of the nonvocal child's communication process, but they are only one component. This article focuses on the steps involved in communicative interaction processes, the potential barriers to the development of effective interaction, and the analysis of nonvocal communicative interactions. A discussion of the…
An Individual Differences Analysis of Double-Aspect Stimulus Perception.
ERIC Educational Resources Information Center
Forsyth, G. Alfred; Huber, R. John
Any theory of information processing must address both what is processed and how that processing takes place. Most studies investigating variables which alter physical dimension utilization have ignored the large individual differences in selective attention or cue utilization. A paradigm was developed using an individual focus on information…
Developing a Methodology for Designing Systems of Instruction.
ERIC Educational Resources Information Center
Carpenter, Polly
This report presents a description of a process for instructional system design, identification of the steps in the design process, and determination of their sequence and interrelationships. As currently envisioned, several interrelated steps must be taken, five of which provide the inputs to the final design process. There are analysis of…
Scampicchio, Matteo; Mimmo, Tanja; Capici, Calogero; Huck, Christian; Innocente, Nadia; Drusch, Stephan; Cesco, Stefano
2012-11-14
Stable isotope values were used to develop a new analytical approach enabling the simultaneous identification of milk samples either processed with different heating regimens or originating from different geographical areas. The samples consisted of raw, pasteurized (HTST), and ultrapasteurized (UHT) milk of different Italian origins. The approach consisted of the analysis of the δ¹³C and δ¹⁵N isotope ratios of the milk samples and their fractions (fat, casein, and whey). The main finding of this work is that, as heat processing affects the composition of the milk fractions, changes in δ¹³C and δ¹⁵N were also observed. These changes were used as markers to develop pattern recognition maps based on principal component analysis and supervised classification models such as linear discriminant analysis (LDA), multivariate regression (MLR), principal component regression (PCR), and partial least squares (PLS). The results give proof of concept that isotope ratio mass spectrometry can discriminate simultaneously between milk samples according to their geographical origin and type of processing.
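As a sketch of the chemometric step, the snippet below builds a PCA pattern-recognition map and fits an LDA classifier with scikit-learn on synthetic six-feature isotope data (standing in for δ¹³C and δ¹⁵N of fat, casein, and whey). The values and group structure are stand-ins, not the study's measurements.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Three sample groups (e.g., raw/HTST/UHT), 20 samples x 6 isotope features.
X = np.vstack([rng.normal(center, 0.3, size=(20, 6))
               for center in (-27.0, -26.2, -25.4)])
y = np.repeat([0, 1, 2], 20)

scores = PCA(n_components=2).fit_transform(X)   # 2-D pattern-recognition map
lda = LinearDiscriminantAnalysis().fit(X, y)    # supervised classification

print("PC1 span:", scores[:, 0].min().round(2), "to", scores[:, 0].max().round(2))
print("LDA training accuracy:", lda.score(X, y))
```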
Ning, Tongbo; Cui, Hao; Sun, Feng; Zou, Jidian
2017-09-05
Glioblastoma represents one of the most aggressive malignant brain tumors, with high morbidity and mortality. Demethylation drugs have been developed for its treatment, but little efficacy has been observed. The purpose of this study was to screen therapeutic targets of demethylation drugs or bioactive molecules for glioblastoma through systematic bioinformatics analysis. We first downloaded genome-wide expression profiles from the Gene Expression Omnibus (GEO) and conducted the primary analysis in R, mainly comprising preprocessing of raw microarray data, mapping between probe IDs and gene symbols, and identification of differentially expressed genes (DEGs). Second, functional enrichment analysis was conducted via the Database for Annotation, Visualization and Integrated Discovery (DAVID) to explore biological processes involved in the development of glioblastoma. Third, we constructed a protein-protein interaction (PPI) network of the genes of interest and conducted a cross-analysis of multiple datasets to obtain potential therapeutic targets for glioblastoma. Finally, we further confirmed the therapeutic targets through real-time RT-PCR. As a result, biological processes related to cancer development, amino acid metabolism, immune response, and so on were found to be significantly enriched in genes differentially expressed in glioblastoma and regulated by 5'aza-dC. In addition, network and cross-analysis identified ACAT2, UFC1, and CYB5R1 as novel therapeutic targets of demethylation drugs, which were also confirmed by real-time RT-PCR. In conclusion, our study identified several biological processes and genes involved in the development of glioblastoma and regulated by 5'aza-dC, which should be helpful for the treatment of glioblastoma. Copyright © 2017 Elsevier B.V. All rights reserved.
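The study's preprocessing was done in R; as a language-neutral sketch of the DEG-identification step, the snippet below applies a per-gene t-test and fold-change filter to a synthetic log2 expression matrix with SciPy. The matrix, sample counts, and cutoffs are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
genes = 1000
normal = rng.normal(8.0, 1.0, size=(genes, 10))  # 10 control samples
tumor = rng.normal(8.0, 1.0, size=(genes, 10))   # 10 glioblastoma samples
tumor[:50] += 2.0                                # plant 50 "true" DEGs

t_stat, p_val = stats.ttest_ind(tumor, normal, axis=1)
log2_fc = tumor.mean(axis=1) - normal.mean(axis=1)  # data already log2-scaled
degs = np.where((p_val < 0.01) & (np.abs(log2_fc) > 1.0))[0]

print(f"{degs.size} genes pass p<0.01 and |log2FC|>1")
```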