Sample records for detailed process analysis

  1. GIS-assisted spatial analysis for urban regulatory detailed planning: designer's dimension in the Chinese code system

    NASA Astrophysics Data System (ADS)

    Yu, Yang; Zeng, Zheng

    2009-10-01

    By examining the causes behind the high amendment ratio in the implementation of urban regulatory detailed plans in China despite their law-ensured status, the study aims to reconcile the conflict between the legal authority of regulatory detailed planning and the insufficient scientific support in its decision-making and compilation by introducing GIS-based spatial analysis and 3D modeling into the process, thus presenting a more scientific and flexible approach to regulatory detailed planning in China. The study first points out that the current compilation process for urban regulatory detailed plans in China employs a mainly empirical approach, which renders the plans constantly subject to amendments. It then discusses the need for and current utilization of GIS in the Chinese system and proposes the framework of a GIS-assisted 3D spatial analysis process from the designer's perspective, which can be regarded as an alternation between descriptive codes and physical design in the compilation of regulatory detailed planning. With a case study of the processes and results from applying the framework, the paper concludes that the proposed framework can be an effective instrument that brings more rationality, flexibility, and thus more efficiency to the compilation and decision-making process of urban regulatory detailed plans in China.

  2. An Improved Spectral Analysis Method for Fatigue Damage Assessment of Details in Liquid Cargo Tanks

    NASA Astrophysics Data System (ADS)

    Zhao, Peng-yuan; Huang, Xiao-ping

    2018-03-01

    Errors arise when the fatigue damage of details in liquid cargo tanks is calculated with the traditional spectral analysis method, which assumes a linear system, because the relationship between the dynamic stress and the ship acceleration is nonlinear. An improved spectral analysis method for assessing the fatigue damage of liquid cargo tank details is proposed in this paper. Based on the assumptions that the wave process can be simulated by summing sinusoidal waves of different frequencies and that the stress process can be simulated by summing the stress processes induced by these sinusoidal waves, the stress power spectral density (PSD) is calculated by expanding the stress processes induced by the sinusoidal waves into Fourier series and adding the amplitudes of the harmonic components with the same frequency. This method takes the nonlinear relationship into consideration, and the fatigue damage is then calculated from the stress PSD. Taking an independent tank in an LNG carrier as an example, the improved spectral analysis method is shown to be much more accurate than the traditional one by comparing the calculated damage with results from the time-domain method. The proposed spectral analysis method is therefore more accurate for calculating the fatigue damage of details in ship liquid cargo tanks.
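
    As a worked illustration of the final step described above (computing fatigue damage from the stress PSD), the following Python sketch applies the standard narrow-band approximation with Miner's rule and an S-N curve N = C / S^k. The PSD shape, S-N constants, and exposure time are invented placeholders, not values from the paper.

      import numpy as np
      from scipy.special import gamma

      def narrowband_damage(freq, psd, C, k, T):
          """Narrow-band fatigue damage from a one-sided stress PSD.

          freq : frequencies [Hz]; psd : stress PSD [MPa^2/Hz]
          C, k : S-N curve parameters (N = C / S^k); T : exposure time [s]
          """
          m0 = np.trapz(psd, freq)            # zeroth spectral moment (stress variance)
          m2 = np.trapz(psd * freq**2, freq)  # second spectral moment
          nu0 = np.sqrt(m2 / m0)              # mean zero up-crossing rate [Hz]
          # Rayleigh-distributed stress ranges give a closed form for E[S^k]
          return nu0 * T / C * (2.0 * np.sqrt(2.0 * m0))**k * gamma(1.0 + k / 2.0)

      # Illustrative numbers only
      f = np.linspace(0.01, 2.0, 500)
      S = 40.0 / (1.0 + ((f - 0.1) / 0.05)**2)   # made-up single-peak stress PSD
      print(narrowband_damage(f, S, C=1.0e12, k=3.0, T=3600.0))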

  3. Abhijit Dutta | NREL

    Science.gov Websites

    Techno-economic analysis; process model development for existing and conceptual processes; detailed heat integration; economic analysis of integrated processes; integration of process simulation learnings into control... Conceptual Process Design and Techno-Economic Assessment of Ex Situ Catalytic Fast Pyrolysis of Biomass: A...

  4. Thermochemical Conversion Techno-Economic Analysis | Bioenergy | NREL

    Science.gov Websites

    NREL's Thermochemical Conversion Analysis team focuses on conceptual process design and techno-economic analysis. The detailed process models and TEA developed under this project provide insights into the potential economic...

  5. Proposed Land Conveyance for Construction of Three Facilities at March Air Force Base, California

    DTIC Science & Technology

    1988-09-01

    ...identified would result from future development on the 845-acre parcel after it has been conveyed. Therefore, detailed development review and... Impact Analysis Process (EIAP) of the Air Force. This detailed development review is within the purview of the state and local government with... establishes the process under which subsequent detailed environmental review would be conducted. CEQA and its implementing regulations are administered by...

  6. Logistics Process Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2008-03-31

    LPAT is the integrated system combining the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, with the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and served as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.

  7. A method for identifying EMI critical circuits during development of a large C3 system

    NASA Astrophysics Data System (ADS)

    Barr, Douglas H.

    The circuit analysis methods and process that Boeing Aerospace used on a large, ground-based military command, control, and communications (C3) system are described. This analysis was designed to help identify electromagnetic interference (EMI) critical circuits. The methodology used the MIL-E-6051 equipment criticality categories as the basis for defining critical circuits, relational database technology to help sort through and account for all of the approximately 5000 system signal cables, and Macintosh Plus personal computers to predict critical circuits based on safety margin analysis. The EMI circuit analysis process systematically examined all system circuits to identify which ones were likely to be EMI critical. The process used two separate, sequential safety margin analyses to identify critical circuits: a conservative safety margin analysis and a detailed safety margin analysis. These analyses applied field-to-wire and wire-to-wire coupling models with both worst-case and detailed circuit parameters (physical and electrical) to predict circuit safety margins. This process identified the predicted critical circuits that could then be verified by test.
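
    As a rough sketch of the safety-margin screening described above, the snippet below flags circuits whose margin (susceptibility threshold minus predicted coupled level, in dB) falls short of a required value. The 6 dB requirement is a representative MIL-E-6051-style figure, and the circuit data are invented for illustration.

      # Hypothetical screen: flag EMI-critical circuits by safety margin.
      # Margin [dB] = susceptibility threshold - predicted coupled level.
      REQUIRED_MARGIN_DB = 6.0   # assumed requirement, in the spirit of MIL-E-6051

      circuits = [  # (name, threshold_dBuV, worst_case_coupled_dBuV)
          ("fire_control_discrete", 60.0, 58.0),
          ("telemetry_clock",       80.0, 55.0),
          ("ordnance_enable",       50.0, 49.0),
      ]

      critical = [(name, thr - lvl) for name, thr, lvl in circuits
                  if thr - lvl < REQUIRED_MARGIN_DB]

      for name, margin in critical:
          print(f"{name}: margin {margin:+.1f} dB -> detailed analysis required")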

  8. Proposal on How To Conduct a Biopharmaceutical Process Failure Mode and Effect Analysis (FMEA) as a Risk Assessment Tool.

    PubMed

    Zimmermann, Hartmut F; Hentschel, Norbert

    2011-01-01

    With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in the mechanical and electrical industries. However, adapting the FMEA methodology to biopharmaceutical processes brings about some difficulties, because those processes differ from processes in the mechanical and electrical industries. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. The applications of such a biopharmaceutical process FMEA are widespread: it can be useful whenever a biopharmaceutical manufacturing process is developed or scaled up, or when it is transferred to a different manufacturing site, and it may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked for importance, and important variables for process development, characterization, or validation can be identified. In this way, the different details of the manufacturing process can be ranked according to their potential risks, helping pharmaceutical companies identify the aspects with the highest potential risks and react accordingly to improve the safety of medicines.
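
    The arithmetic at the heart of any FMEA is compact enough to sketch: each failure mode receives 1-to-10 ratings for occurrence, severity, and detectability, and the risk priority number RPN = occurrence x severity x detectability ranks the modes. The modes and ratings below are invented placeholders, not entries from the authors' rating table.

      from dataclasses import dataclass

      @dataclass
      class FailureMode:
          step: str
          mode: str
          occurrence: int     # 1 (rare) .. 10 (frequent)
          severity: int       # 1 (negligible) .. 10 (catastrophic)
          detectability: int  # 1 (always detected) .. 10 (undetectable)

          @property
          def rpn(self) -> int:
              return self.occurrence * self.severity * self.detectability

      modes = [  # invented examples for a generic bioprocess
          FailureMode("cell culture", "contamination", 3, 9, 4),
          FailureMode("chromatography", "wrong buffer conductivity", 4, 6, 2),
          FailureMode("filtration", "membrane integrity failure", 2, 8, 3),
      ]

      for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
          print(f"RPN {m.rpn:4d}  {m.step}: {m.mode}")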

  9. Clinical process cost analysis.

    PubMed

    Marrin, C A; Johnson, L C; Beggs, V L; Batalden, P B

    1997-09-01

    New systems of reimbursement are exerting enormous pressure on clinicians and hospitals to reduce costs. Using cheaper supplies or reducing the length of stay may be a satisfactory short-term solution, but the best strategy for long-term success is radical reduction of costs by reengineering the processes of care. However, few clinicians or institutions know the actual costs of medical care; nor do they understand, in detail, the activities involved in the delivery of care. Finally, there is no accepted method for linking the two. Clinical process cost analysis begins with the construction of a detailed flow diagram incorporating each activity in the process of care. The cost of each activity is then calculated, and the two are linked. This technique was applied to Diagnosis Related Group 75 to analyze the real costs of the operative treatment of lung cancer at one institution. Total costs varied between $6,400 and $7,700. The major driver of costs was personnel time, which accounted for 55% of the total. Forty percent of the total cost was incurred in the operating room. The cost of care decreased progressively during hospitalization. Clinical process cost analysis provides detailed information about the costs and processes of care. The insights thus obtained may be used to reduce costs by reengineering the process.
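
    The linkage described above (a flow diagram of care activities, each carrying its own cost) amounts to activity-based costing, which is easy to sketch. The activities, durations, rates, and supply costs below are invented, not the paper's DRG 75 figures.

      # Activity-based costing for a care process: cost = sum over activities of
      # (personnel time x hourly rate) + supplies. All numbers are invented.
      activities = [  # (name, hours, hourly_rate_usd, supplies_usd)
          ("pre-op assessment",  1.5,  80.0,   40.0),
          ("operating room",     4.0, 350.0,  900.0),
          ("ICU day 1",         24.0,  45.0,  200.0),
          ("ward recovery",     72.0,  25.0,  150.0),
      ]

      total = 0.0
      for name, hours, rate, supplies in activities:
          cost = hours * rate + supplies
          total += cost
          print(f"{name:20s} ${cost:9.2f}")
      print(f"{'TOTAL':20s} ${total:9.2f}")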

  10. Control of Technology Transfer at JPL

    NASA Technical Reports Server (NTRS)

    Oliver, Ronald

    2006-01-01

    Controlled Technology: 1) Design: preliminary or critical design data, schematics, technical flow charts, SNV code/diagnostics, logic flow diagrams, wirelist, ICDs, detailed specifications or requirements. 2) Development: constraints, computations, configurations, technical analyses, acceptance criteria, anomaly resolution, detailed test plans, detailed technical proposals. 3) Production: process or how-to: assemble, operate, repair, maintain, modify. 4) Manufacturing: technical instructions, specific parts, specific materials, specific qualities, specific processes, specific flow. 5) Operations: how to operate, contingency or standard operating plans, Ops handbooks. 6) Repair: repair instructions, troubleshooting schemes, detailed schematics. 7) Test: specific procedures, data, analysis, detailed test and retest plans, detailed anomaly resolutions, detailed failure causes and corrective actions, troubleshooting, trended test data, flight readiness data. 8) Maintenance: maintenance schedules and plans, methods for regular upkeep, overhaul instructions. 9) Modification: modification instructions, upgrade kit parts, including software

  11. 29 CFR 1910.119 - Process safety management of highly hazardous chemicals.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...

  12. 29 CFR 1910.119 - Process safety management of highly hazardous chemicals.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...

  13. 76 FR 2853 - Approval and Promulgation of Air Quality Implementation Plans; Delaware; Infrastructure State...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-18

    ... technical analysis submitted for parallel-processing by DNREC on December 9, 2010, to address significant... technical analysis submitted by DNREC for parallel-processing on December 9, 2010, to satisfy the... consists of a technical analysis that provides detailed support for Delaware's position that it has...

  14. Extended principal component analysis - a useful tool to understand processes governing water quality at catchment scales

    NASA Astrophysics Data System (ADS)

    Selle, B.; Schwientek, M.

    2012-04-01

    Water quality of ground and surface waters in catchments is typically driven by many complex and interacting processes. While small-scale processes are often studied in great detail, their relevance and interplay at catchment scales often remain poorly understood. For many catchments, extensive monitoring data on water quality have been collected for different purposes. These heterogeneous data sets contain valuable information on catchment-scale processes but are rarely analysed using integrated methods. Principal component analysis (PCA) has previously been applied to such data sets; however, a detailed analysis of the scores, which are an important result of a PCA, is often missing. Mathematically, PCA expresses measured water quality variables, e.g. nitrate concentrations, as linear combinations of independent, not directly observable key processes. These computed key processes are represented by principal components, and their scores are interpretable as process intensities that vary in space and time. Subsequently, scores can be correlated with other key variables and catchment characteristics, such as water travel times and land use, that were not considered in the PCA. This detailed analysis of scores represents an extension of the commonly applied PCA that could considerably improve the understanding of processes governing water quality at catchment scales. In this study, we investigated the 170 km² Ammer catchment in SW Germany, which is characterised by an above-average proportion of agricultural (71%) and urban (17%) areas. The Ammer River is mainly fed by karstic springs. For the PCA, we separately analysed concentrations from (a) surface waters of the Ammer River and its tributaries, (b) spring waters from the main aquifers and (c) deep groundwater from production wells, and extended this analysis with a detailed analysis of the scores. We analysed measured concentrations of major ions and selected organic micropollutants; additionally, redox-sensitive variables and environmental tracers indicating groundwater age were analysed for the deep groundwater from production wells. For the deep groundwater, we found that microbial turnover was more strongly influenced by the local availability of energy sources than by the travel times of groundwater to the wells. Groundwater quality primarily reflected the input of pollutants determined by land use, e.g. agrochemicals. We concluded that, for water quality in the Ammer catchment, conservative mixing of waters of different origins is more important than reactive transport processes along the flow path.
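
    The extension described above (correlating principal component scores with catchment characteristics not used in the PCA) can be sketched in a few lines. The data matrix and the external variable (water travel time) are random placeholders standing in for real monitoring data.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler
      from scipy.stats import pearsonr

      rng = np.random.default_rng(0)
      X = rng.normal(size=(120, 8))           # placeholder: 120 samples x 8 solutes
      travel_time = rng.uniform(1, 50, 120)   # placeholder catchment characteristic

      # Standard PCA on standardized concentrations
      scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(X))

      # Extension: correlate each component's scores with an external variable
      for i in range(scores.shape[1]):
          r, p = pearsonr(scores[:, i], travel_time)
          print(f"PC{i+1} vs travel time: r = {r:+.2f} (p = {p:.3f})")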

  15. Oversize/overweight permitting practices review : phase II.

    DOT National Transportation Integrated Search

    2013-02-01

    This study explores a more detailed analysis of the permitting process in the Mid-Atlantic Region and delves into operational practice, and the theory and history of the practice among states. The states' practices examined in greater detail include C...

  16. A Homegrown Design for Data Warehousing: A District Customizes Its Own Process for Generating Detailed Information about Students in Real Time

    ERIC Educational Resources Information Center

    Thompson, Terry J.; Gould, Karen J.

    2005-01-01

    In recent years the Metropolitan School District of Wayne Township in Indianapolis has been awash in data. In attempts to improve levels of student achievement, the authors collected all manner of statistical details about students and schools and attempted to perform data analysis as part of the school improvement process. The authors were never…

  17. The Direct Loan Demonstration Program: An Analysis of the Legislative Process Involving Federal Student Loan Policy.

    ERIC Educational Resources Information Center

    McCormick, Joe Lew

    This study examined major stakeholders' perceptions of their involvement and role in the legislative process surrounding the introduction, deliberation, and ultimate passage of the Direct Loan Demonstration Program (DLDP), a federal pilot student loan program. Data analysis was based on a detailed description of the legislative process surrounding…

  18. Cost Analysis Sources and Documents Data Base Reference Manual (Update)

    DTIC Science & Technology

    1989-06-01

    M: Reference Manual PRICE H: Training Course Workbook 11. Use in Cost Analysis. Important source of cost estimates for electronic and mechanical... Nature of Data. Contains many microeconomic time series by month or quarter. 5. Level of Detail. Very detailed. 6. Normalization Processes Required... Reference Manual. Moorestown, N.J.: GE Corporation, September 1986. 64. PRICE Training Course Workbook. Moorestown, N.J.: GE Corporation, February 1986

  19. A Unified Mathematical Approach to Image Analysis.

    DTIC Science & Technology

    1987-08-31

    describes four instances of the paradigm in detail. Directions for ongoing and future research are also indicated. Keywords: Image processing; Algorithms; Segmentation; Boundary detection; Tomography; Global image analysis.

  20. 29 CFR 1926.64 - Process safety management of highly hazardous chemicals.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... analysis methodology being used. (5) The employer shall establish a system to promptly address the team's... the decision as to the appropriate PHA methodology to use. All PHA methodologies are subject to... be developed in conjunction with the process hazard analysis in sufficient detail to support the...

  1. 29 CFR 1926.64 - Process safety management of highly hazardous chemicals.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... analysis methodology being used. (5) The employer shall establish a system to promptly address the team's... the decision as to the appropriate PHA methodology to use. All PHA methodologies are subject to... be developed in conjunction with the process hazard analysis in sufficient detail to support the...

  2. Process mining techniques: an application to time management

    NASA Astrophysics Data System (ADS)

    Khowaja, Ali Raza

    2018-04-01

    In any working environment, people must ensure that all of their work is completed within the given time and to the required quality. To realize the full potential of process mining, one needs to understand these processes in detail. Personal information and communication have long been prominent topics on the internet; in everyday life, information and communication tools capture daily schedules, location and environmental data, and, more generally, social media applications make such data available not only for data analysis based on event logs but also for process analysis that combines environmental and location analysis. Process mining can exploit these real-life processes with the help of event logs that are already available in such datasets, whether as user-censored or user-labeled data. These processes can then be used to redesign a user's flow and to understand the processes in more detail. One way to increase the quality of the processes we go through in our daily lives is to look closely at each process and, after analysing it, make changes to get better results. Accordingly, we applied process mining techniques to a dataset from Korea combining seven different subjects. The paper comments on the efficiency of the processes in the event logs as they relate to time management.
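
    Process mining starts from an event log of (case, activity, timestamp) records, and the most basic discovery step builds a directly-follows graph counting how often one activity immediately follows another within a case. A minimal sketch with an invented daily-schedule log (the Korean dataset used in the paper is not reproduced here):

      from collections import Counter, defaultdict

      # Invented event log: (case_id, activity, timestamp as sortable string)
      log = [
          ("day1", "wake", "07:00"), ("day1", "commute", "08:00"),
          ("day1", "work", "09:00"), ("day1", "gym", "18:00"),
          ("day2", "wake", "07:30"), ("day2", "work", "09:00"),
          ("day2", "commute", "08:15"), ("day2", "sleep", "23:00"),
      ]

      # Group events by case and order them by timestamp
      traces = defaultdict(list)
      for case, act, ts in sorted(log, key=lambda e: (e[0], e[2])):
          traces[case].append(act)

      # Directly-follows counts: how often activity a is immediately followed by b
      dfg = Counter()
      for acts in traces.values():
          dfg.update(zip(acts, acts[1:]))

      for (a, b), n in dfg.most_common():
          print(f"{a} -> {b}: {n}")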

  3. The application of digital techniques to the analysis of metallurgical experiments

    NASA Technical Reports Server (NTRS)

    Rathz, T. J.

    1977-01-01

    The application of a specific digital computer system (known as the Image Data Processing System) to the analysis of three NASA-sponsored metallurgical experiments is discussed in some detail. The basic hardware and software components of the Image Data Processing System are presented. Many figures are presented in the discussion of each experimental analysis in an attempt to show the accuracy and speed that the Image Data Processing System affords in analyzing photographic images dealing with metallurgy, and in particular with material processing.

  4. The Vehicle Integrated Performance Analysis Experience: Reconnecting With Technical Integration

    NASA Technical Reports Server (NTRS)

    McGhee, D. S.

    2006-01-01

    Very early in the Space Launch Initiative program, a small team of engineers at MSFC proposed a process for performing system-level assessments of a launch vehicle. Aimed primarily at providing insight and making NASA a smart buyer, the Vehicle Integrated Performance Analysis (VIPA) team was created. The difference between the VIPA effort and previous integration attempts is that VIPA is a process using experienced people from various disciplines and focusing them on a technically integrated assessment. The foundations of VIPA's process are described. The VIPA team also recognized the need to target early detailed analysis toward identifying significant systems issues. This process is driven by the T-model for technical integration. VIPA's approach to performing system-level technical integration is discussed in detail. The VIPA process significantly enhances the development and monitoring of realizable project requirements. VIPA's assessment validates the concept's stated performance, identifies significant issues either with the concept or the requirements, and then reintegrates these issues to determine impacts. This process is discussed along with a description of how it may be integrated into a program's insight and review process. The VIPA process has gained favor with both engineering and project organizations for being responsive and insightful.

  5. Heat and Mass Transfer Processes in Scrubber of Flue Gas Heat Recovery Device

    NASA Astrophysics Data System (ADS)

    Veidenbergs, Ivars; Blumberga, Dagnija; Vigants, Edgars; Kozuhars, Grigorijs

    2010-01-01

    The paper deals with research on the heat and mass transfer processes in a flue gas heat recovery device, where complicated cooling, evaporation and condensation processes take place simultaneously. The analogy between heat and mass transfer is used in the analysis. To support a detailed process analysis based on the equations describing the heat and mass processes, as well as correlations for calculating wet gas parameters, software was developed in the Microsoft Office Excel environment.

  6. Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    NASA Technical Reports Server (NTRS)

    Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

    1974-01-01

    A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.

  7. 32 CFR 989.8 - Analysis of alternatives.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 6 2011-07-01 2011-07-01 false Analysis of alternatives. 989.8 Section 989.8... ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.8 Analysis of alternatives. (a) The Air Force must analyze... of reasonable alternatives, it may limit alternatives selected for detailed environmental analysis to...

  8. Revisiting photon-statistics effects on multiphoton ionization

    NASA Astrophysics Data System (ADS)

    Mouloudakis, G.; Lambropoulos, P.

    2018-05-01

    We present a detailed analysis of the effects of photon statistics on multiphoton ionization. Through a detailed study of the role of intermediate states, we evaluate the conditions under which the premise of nonresonant processes is valid. The limitations of its validity are manifested in the dependence of the process on the stochastic properties of the radiation and found to be quite sensitive to the intensity. The results are quantified through detailed calculations for coherent, chaotic, and squeezed vacuum radiation. Their significance in the context of recent developments in radiation sources such as the short-wavelength free-electron laser and squeezed vacuum radiation is also discussed.

  9. Research on Interventions for Adolescents with Learning Disabilities: A Meta-Analysis of Outcomes Related to Higher-Order Processing.

    ERIC Educational Resources Information Center

    Swanson, H. Lee

    2001-01-01

    Details meta-analysis of 58 intervention studies related to higher-order processing (i.e., problem solving) for adolescents with learning disabilities. Discusses factors that increased effect sizes: (1) measures of metacognition and text understanding; (2) instruction including advanced organizers, new skills, and extended practice; and (3)…

  10. Aircraft Structural Mass Property Prediction Using Conceptual-Level Structural Analysis

    NASA Technical Reports Server (NTRS)

    Sexstone, Matthew G.

    1998-01-01

    This paper describes a methodology that extends the use of the Equivalent LAminated Plate Solution (ELAPS) structural analysis code from conceptual-level aircraft structural analysis to conceptual-level aircraft mass property analysis. Mass property analysis in aircraft structures has historically depended upon parametric weight equations at the conceptual design level and Finite Element Analysis (FEA) at the detailed design level. ELAPS allows for the modeling of detailed geometry, metallic and composite materials, and non-structural mass coupled with analytical structural sizing to produce high-fidelity mass property analyses representing fully configured vehicles early in the design process. This capability is especially valuable for unusual configuration and advanced concept development where existing parametric weight equations are inapplicable and FEA is too time consuming for conceptual design. This paper contrasts the use of ELAPS relative to empirical weight equations and FEA. ELAPS modeling techniques are described and the ELAPS-based mass property analysis process is detailed. Examples of mass property stochastic calculations produced during a recent systems study are provided. This study involved the analysis of three remotely piloted aircraft required to carry scientific payloads to very high altitudes at subsonic speeds. Due to the extreme nature of this high-altitude flight regime, few existing vehicle designs are available for use in performance and weight prediction. ELAPS was employed within a concurrent engineering analysis process that simultaneously produces aerodynamic, structural, and static aeroelastic results for input to aircraft performance analyses. The ELAPS models produced for each concept were also used to provide stochastic analyses of wing structural mass properties. The results of this effort indicate that ELAPS is an efficient means to conduct multidisciplinary trade studies at the conceptual design level.

  11. Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures

    NASA Technical Reports Server (NTRS)

    Biegel, Bryan A. (Technical Monitor); Jost, G.; Jin, H.; Labarta, J.; Gimenez, J.; Caubet, J.

    2003-01-01

    Parallel programming paradigms include process level parallelism, thread level parallelization, and multilevel parallelism. This viewgraph presentation describes a detailed performance analysis of these paradigms for Shared Memory Architecture (SMA). This analysis uses the Paraver Performance Analysis System. The presentation includes diagrams of a flow of useful computations.

  12. Imaging Girls: Visual Methodologies and Messages for Girls' Education

    ERIC Educational Resources Information Center

    Magno, Cathryn; Kirk, Jackie

    2008-01-01

    This article describes the use of visual methodologies to examine images of girls used by development agencies to portray and promote their work in girls' education, and provides a detailed discussion of three report cover images. It details the processes of methodology and tool development for the visual analysis and presents initial 'readings'…

  13. SU-E-T-635: Process Mapping of Eye Plaque Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huynh, J; Kim, Y

    Purpose: To apply a risk-based assessment and analysis technique (AAPM TG 100) to eye plaque brachytherapy treatment of ocular melanoma. Methods: The roles and responsibilities of the personnel involved in eye plaque brachytherapy are defined for the retinal specialist, radiation oncologist, nurse, and medical physicist. The entire procedure was examined carefully: first, major processes were identified, and then the details of each major process were followed. Results: Seventy-one potential modes were identified in total. The eight major processes (with the corresponding number of modes) are patient consultation (2 modes), pretreatment tumor localization (11), treatment planning (13), seed ordering and calibration (10), eye plaque assembly (10), implantation (11), removal (11), and deconstruction (3). Half of the total modes (36) relate to the physicist, even though the physicist is not involved in steps such as the actual suturing and removal of the plaque. Conclusion: Failure modes can arise not only from physicist-related procedures such as treatment planning and source activity calibration, but also from more clinical procedures performed by other medical staff. Improving the accuracy of communication for non-physicist-related clinical procedures could be one approach to preventing human errors, while a more rigorous physics double-check would reduce errors in physicist-related procedures. Eventually, based on this detailed process map, failure mode and effect analysis (FMEA) will identify the top tiers of modes by ranking all possible modes with a risk priority number (RPN). For the high-risk modes, fault tree analysis (FTA) will provide possible preventive action plans.

  14. Time-Resolved and Spatio-Temporal Analysis of Complex Cognitive Processes and their Role in Disorders like Developmental Dyscalculia

    PubMed Central

    Mórocz, István Akos; Janoos, Firdaus; van Gelderen, Peter; Manor, David; Karni, Avi; Breznitz, Zvia; von Aster, Michael; Kushnir, Tammar; Shalev, Ruth

    2012-01-01

    The aim of this article is to report on the importance and challenges of a time-resolved and spatio-temporal analysis of fMRI data from complex cognitive processes and associated disorders, using a study on developmental dyscalculia (DD). Participants underwent fMRI while judging the incorrectness of multiplication results, and the data were analyzed using a sequence of methods, each of which progressively provided a more detailed picture of the spatio-temporal aspects of this disorder. Healthy subjects and subjects with DD performed alike behaviorally, though they exhibited parietal disparities in traditional voxel-based group analyses. Further and more detailed differences, however, surfaced with a time-resolved examination of the neural responses during the experiment. While performing inter-group comparisons, a third group of subjects with dyslexia (DL) but with no arithmetic difficulties was included to test the specificity of the analysis and strengthen the statistical base, with fifty-eight subjects overall. Surprisingly, the analysis showed a functional dissimilarity during an initial reading phase for the group of dyslexic but otherwise normal subjects, with respect to controls, even though only numerical digits and no alphabetic characters were presented. Thus our results suggest that time-resolved multi-variate analysis of complex experimental paradigms has the ability to yield powerful new clinical insights about abnormal brain function. Similarly, a detailed compilation of aberrations in the functional cascade may have much greater potential to delineate the core processing problems in mental disorders. PMID:22368322

  15. Quantitative method of medication system interface evaluation.

    PubMed

    Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F

    2007-01-01

    The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of the estimated failure rates provided quantitative data for the fault analysis. The authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures, so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called the Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
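
    The quantitative core of the method, propagating users' estimated step-failure rates through a fault tree, can be sketched with standard gate algebra: an OR gate fails if any input fails, an AND gate only if all inputs fail (independence assumed). The tree structure and rates below are invented placeholders, not the medication-system estimates.

      def p_or(*ps):   # OR gate: 1 - product of survival probabilities
          out = 1.0
          for p in ps:
              out *= (1.0 - p)
          return 1.0 - out

      def p_and(*ps):  # AND gate: all inputs must fail (independence assumed)
          out = 1.0
          for p in ps:
              out *= p
          return out

      # Invented leaf estimates (mean failure rates elicited from users)
      p_lookup_fail   = 0.02   # patient lookup step fails
      p_entry_fail    = 0.05   # order entry step fails
      p_workaround_ok = 0.90   # a workaround rescues a failed step

      # A step truly fails only if it fails AND the workaround also fails,
      # mirroring the observation that workarounds make overall failure rare
      p_step1 = p_and(p_lookup_fail, 1.0 - p_workaround_ok)
      p_step2 = p_and(p_entry_fail,  1.0 - p_workaround_ok)

      print(f"overall task failure: {p_or(p_step1, p_step2):.4f}")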

  16. Development of Low-cost, High Energy-per-unit-area Solar Cell Modules

    NASA Technical Reports Server (NTRS)

    Jones, G. T.; Chitre, S.; Rhee, S. S.

    1978-01-01

    The development of two hexagonal solar cell process sequences, a laser-scribing technique for scribing hexagonal and modified hexagonal solar cells, a large-throughput diffusion process, and two surface macrostructure processes suitable for large-scale production is reported. Experimental analysis was made of automated spin-on anti-reflective coating equipment and high-pressure wafer cleaning equipment. Six hexagonal solar cell modules were fabricated. Also covered is a detailed theoretical analysis of optimum silicon utilization by modified hexagonal solar cells.

  17. The Process of Student Cognition in Constructing Mathematical Conjecture

    ERIC Educational Resources Information Center

    Astawa, I. Wayan Puja; Budayasa, I. Ketut; Juniati, Dwi

    2018-01-01

    This research aims to describe the process of student cognition in constructing mathematical conjecture. Many researchers have studied this process but without giving a detailed explanation of how students understand the information to construct a mathematical conjecture. The researchers focus their analysis on how to construct and prove the…

  18. Six Sigma Approach to Improve Stripping Quality of Automotive Electronics Component – a case study

    NASA Astrophysics Data System (ADS)

    Razali, Noraini Mohd; Murni Mohamad Kadri, Siti; Con Ee, Toh

    2018-03-01

    Lack of problem-solving techniques and poor cooperation between support groups are two obstacles frequently faced on actual production lines. Inadequate detailed analysis and inappropriate problem-solving techniques can cause recurring issues that affect organizational performance. This study utilizes the well-structured six sigma DMAIC cycle in combination with other problem-solving tools to solve a product quality problem in the manufacturing of an automotive electronics component. The study concentrates on the stripping process, a critical process step with the highest rejection rate, which contributes to scrap and rework performance. A detailed analysis is conducted in the analyze phase to identify the actual root cause of the problem. Several improvement activities are then implemented, and the results show that the rejection rate due to stripping defects decreased tremendously while the process capability index improved from 0.75 to 1.67. These results show that the six sigma approach used to tackle the quality problem is substantially effective.
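
    The capability gain reported above (Cpk rising from 0.75 to 1.67) follows from the standard index Cpk = min(USL - mu, mu - LSL) / (3 sigma). A quick sketch with invented specification limits and sample data:

      import numpy as np

      def cpk(samples, lsl, usl):
          """Process capability index from measured samples."""
          mu, sigma = np.mean(samples), np.std(samples, ddof=1)
          return min(usl - mu, mu - lsl) / (3.0 * sigma)

      rng = np.random.default_rng(1)
      LSL, USL = 9.0, 11.0                      # invented spec limits
      before = rng.normal(10.2, 0.40, 200)      # wide, off-center process
      after  = rng.normal(10.0, 0.18, 200)      # improved, centered process

      print(f"Cpk before: {cpk(before, LSL, USL):.2f}")
      print(f"Cpk after:  {cpk(after,  LSL, USL):.2f}")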

  19. China’s Energy Security and the South China Sea

    DTIC Science & Technology

    2002-05-01

    ...discussion focuses on a review of Chinese economic and energy policies. Subsequent analysis details Chinese behavior within the broader context of... supported by the deductive reasoning process through the collection and analysis of relevant research in the field. For an appropriate assessment of the...

  1. SU-F-P-01: Changing Your Oncology Information System: A Detailed Process and Lessons Learned

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abing, C

    Purpose: Radiation oncology departments are faced with many options for pairing their treatment machines with record-and-verify systems. Recently, there has been a push toward a single-vendor solution. To achieve this, a department must go through an intense and rigorous transition process. Our department has recently completed this process, and we now offer a detailed description of the process along with lessons learned. Methods: Our cancer center transitioned from a multi-vendor department to a single-vendor department over the 2015 calendar year. Our staff was partitioned into superuser groups, an interface team, a migration team, and a go-live team. Six months after successful implementation, a detailed survey was sent to the radiation oncology department to determine areas for improvement as well as successes in the process. Results: The transition between record-and-verify systems was considered a complete success. The results of the survey did point out some areas of inefficiency for our staff, both in interactions with each other and with the vendors. Conclusion: Though this process is intricate and lengthy, it can be made easier with careful planning and detailed designation of project responsibilities. Our survey results and retrospective analysis of the transition are valuable to those wishing to make this change.

  2. Tracking the Spatiotemporal Neural Dynamics of Real-world Object Size and Animacy in the Human Brain.

    PubMed

    Khaligh-Razavi, Seyed-Mahdi; Cichy, Radoslaw Martin; Pantazis, Dimitrios; Oliva, Aude

    2018-06-07

    Animacy and real-world size are properties that describe any object and thus bring basic order into our perception of the visual world. Here, we investigated how the human brain processes real-world size and animacy. For this, we applied representational similarity to fMRI and MEG data to yield a view of brain activity with high spatial and temporal resolutions, respectively. Analysis of fMRI data revealed that a distributed and partly overlapping set of cortical regions extending from occipital to ventral and medial temporal cortex represented animacy and real-world size. Within this set, parahippocampal cortex stood out as the region representing animacy and size stronger than most other regions. Further analysis of the detailed representational format revealed differences among regions involved in processing animacy. Analysis of MEG data revealed overlapping temporal dynamics of animacy and real-world size processing starting at around 150 msec and provided the first neuromagnetic signature of real-world object size processing. Finally, to investigate the neural dynamics of size and animacy processing simultaneously in space and time, we combined MEG and fMRI with a novel extension of MEG-fMRI fusion by representational similarity. This analysis revealed partly overlapping and distributed spatiotemporal dynamics, with parahippocampal cortex singled out as a region that represented size and animacy persistently when other regions did not. Furthermore, the analysis highlighted the role of early visual cortex in representing real-world size. A control analysis revealed that the neural dynamics of processing animacy and size were distinct from the neural dynamics of processing low-level visual features. Together, our results provide a detailed spatiotemporal view of animacy and size processing in the human brain.
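
    The fusion technique mentioned above rests on representational dissimilarity matrices (RDMs): pairwise condition dissimilarities are computed per fMRI region and per MEG time point, and the two RDMs are then correlated over time. A schematic sketch with random placeholder data in place of real recordings:

      import numpy as np
      from scipy.stats import spearmanr
      from scipy.spatial.distance import pdist

      rng = np.random.default_rng(0)
      n_cond = 12
      fmri_patterns = rng.normal(size=(n_cond, 500))     # conditions x voxels (placeholder)
      meg_patterns = rng.normal(size=(200, n_cond, 64))  # time x conditions x sensors

      # RDM = condensed vector of pairwise correlation distances between conditions
      fmri_rdm = pdist(fmri_patterns, metric="correlation")

      # MEG-fMRI fusion: similarity of representational geometries over time
      fusion = np.array([
          spearmanr(pdist(meg_patterns[t], metric="correlation"), fmri_rdm)[0]
          for t in range(meg_patterns.shape[0])
      ])
      print("peak fusion at time index", int(fusion.argmax()))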

  3. Development and testing of a fast conceptual river water quality model.

    PubMed

    Keupers, Ingrid; Willems, Patrick

    2017-04-15

    Modern, model-based river quality management relies strongly on river water quality models to simulate the temporal and spatial evolution of pollutant concentrations in the water body. Such models are typically constructed by extending detailed hydrodynamic models with a component describing the advection-diffusion and water quality transformation processes in a detailed, physically based way. This approach is too computationally demanding, especially when simulating the long time periods needed for statistical analysis of the results, or when model sensitivity analysis, calibration and validation require a large number of model runs. To overcome this problem, a structure identification method to set up a conceptual river water quality model has been developed. Instead of calculating the water quality concentrations at each water level and discharge node, the river branch is divided into conceptual reservoirs based on user information such as locations of interest and boundary inputs. These reservoirs are modelled as plug flow reactors (PFR) and continuously stirred tank reactors (CSTR) to describe advection and diffusion processes. The same water quality transformation processes as in the detailed models are considered, but with adjusted residence times based on the hydrodynamic simulation results and calibrated to the detailed water quality simulation results. The developed approach allows for a much faster calculation time (by a factor of 10^5) without significant loss of accuracy, making it feasible to perform time-demanding scenario runs. Copyright © 2017 Elsevier Ltd. All rights reserved.
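
    The conceptual building blocks described above are standard reactor idealizations: a CSTR mixes instantaneously, so a solute with first-order decay obeys dC/dt = (C_in - C)/tau - k*C, while a PFR acts as a pure delay. A minimal explicit-Euler sketch of a single CSTR reach follows; the residence time, decay rate, and inflow signal are invented placeholders (in the paper these are calibrated against the detailed model).

      import numpy as np

      def cstr_reach(c_in, tau, k, dt):
          """First-order-decay solute through one CSTR: dC/dt = (c_in - C)/tau - k*C."""
          c = np.empty_like(c_in)
          c[0] = c_in[0]
          for i in range(1, len(c_in)):
              c[i] = c[i-1] + dt * ((c_in[i-1] - c[i-1]) / tau - k * c[i-1])
          return c

      dt = 0.1                                          # time step [h]
      t = np.arange(0, 48, dt)
      inflow = 1.0 + 0.5 * np.sin(2 * np.pi * t / 24)   # invented diurnal inflow [mg/L]
      out = cstr_reach(inflow, tau=6.0, k=0.05, dt=dt)  # residence 6 h, decay 0.05/h
      print(f"inflow mean {inflow.mean():.2f}, outflow mean {out.mean():.2f} mg/L")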

  4. A Measurable Model of the Creative Process in the Context of a Learning Process

    ERIC Educational Resources Information Center

    Ma, Min; Van Oystaeyen, Fred

    2016-01-01

    The authors' aim was to arrive at a measurable model of the creative process by putting creativity in the context of a learning process. The authors aimed to provide a rather detailed description of how creative thinking fits in a general description of the learning process without trying to go into an analysis of a biological description of the…

  5. Pahoa geothermal industrial park. Engineering and economic analysis for direct applications of geothermal energy in an industrial park at Pahoa, Hawaii

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreau, J.W.

    1980-12-01

    This engineering and economic study evaluated the potential for developing a geothermal industrial park in the Puna District near Pahoa on the Island of Hawaii. Direct heat industrial applications were analyzed from a marketing, engineering, economic, environmental, and sociological standpoint to determine the most viable industries for the park. An extensive literature search produced 31 existing processes currently using geothermal heat. An additional list was compiled indicating industrial processes that require heat that could be provided by geothermal energy. From this information, 17 possible processes were selected for consideration. Careful scrutiny and analysis of these 17 processes revealed three that justified detailed economic workups. The three processes chosen for detailed analysis were: an ethanol plant using bagasse and wood as feedstock; a cattle feed mill using sugar cane leaf trash as feedstock; and a papaya processing facility providing both fresh and processed fruit. In addition, a research facility to assess and develop other processes was treated as a concept. Consideration was given to the impediments to development, the engineering process requirements and the governmental support for each process. The study describes the geothermal well site chosen, the pipeline to transmit the hydrothermal fluid, and the infrastructure required for the industrial park. A conceptual development plan for the ethanol plant, the feed mill and the papaya processing facility was prepared. The study concluded that a direct heat industrial park in Pahoa, Hawaii, involves considerable risks.

  6. GSBPP CAPSTONE REVIEW

    DTIC Science & Technology

    2016-12-01

    ...analysis from multiple sources, including the GSBPP exit survey, archived GSBPP capstones, faculty advisement data, faculty interviews, and a new GSBPP student survey, in order to detail the capstone’s process, content, and value to multiple stakeholders. The project team also employs the Plan-Do...

  7. Idealized simulation of the Colorado hailstorm case: comparison of bulk and detailed microphysics

    NASA Astrophysics Data System (ADS)

    Geresdi, I.

    One of the purposes of the Fourth Cloud Modeling Workshop was to compare different microphysical treatments. In this paper, the results of a widely used bulk treatment and five versions of a detailed microphysical model are presented. A sensitivity analysis was made to investigate the effects of the bulk parametrization, the ice initiation technique, the CCN concentration and the collision efficiency of rimed ice crystal-drop collisions. The results show that: (i) the mixing ratios of different species of hydrometeors calculated by the bulk model and one of the detailed models show some similarity; however, the processes of hail/graupel formation are different in the bulk and detailed models. (ii) Using different ice initiation schemes in the detailed models, different processes became important in hail and graupel formation. (iii) In the case of higher CCN concentration, the mixing ratios of liquid water, hail and graupel were more sensitive to the value of the collision efficiency of rimed ice crystal-drop collisions. (iv) The Bergeron-Findeisen process does not work in the updraft core of a convective cloud: the vapor content was always above water saturation; moreover, the supersaturation gradually increased after the appearance of precipitation ice particles.

  8. A novel method for detecting missing holes on the motor carling

    NASA Astrophysics Data System (ADS)

    Xu, Hongsheng; Tan, Hao; Li, Guirong

    2018-03-01

    After a thorough analysis of how an image processing system can detect missing holes on the motor carling, we designed the complete system around the actual production conditions of the motor carling. We then describe the system's hardware and software in detail, introducing their general functions and, from that analysis, the individual hardware and software modules and the theory behind their design. The measures for confirming the image processing region, edge detection, and randomized Hough transform circle detection are explained in detail. Finally, results from testing the system in the laboratory and in the factory are presented.
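
    The circle-detection step maps naturally onto OpenCV's Hough transform for circles. The sketch below shows a plausible missing-hole check; the image path, thresholds, radii, and expected hole count are assumptions, and cv2.HoughCircles implements a Hough-gradient variant rather than the randomized Hough transform named in the paper.

      import cv2

      EXPECTED_HOLES = 8                      # assumed reference count for the carling

      img = cv2.imread("carling.png", cv2.IMREAD_GRAYSCALE)  # hypothetical image path
      img = cv2.medianBlur(img, 5)            # suppress noise before edge detection

      circles = cv2.HoughCircles(
          img, cv2.HOUGH_GRADIENT, dp=1, minDist=30,
          param1=120,   # Canny high threshold
          param2=25,    # accumulator threshold: lower = more (possibly false) circles
          minRadius=8, maxRadius=20,
      )

      found = 0 if circles is None else circles.shape[1]
      print(f"holes found: {found}/{EXPECTED_HOLES}",
            "-> MISSING HOLES" if found < EXPECTED_HOLES else "-> OK")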

  9. Hyperspectral data analysis procedures with reduced sensitivity to noise

    NASA Technical Reports Server (NTRS)

    Landgrebe, David A.

    1993-01-01

    Multispectral sensor systems have become steadily improved over the years in their ability to deliver increased spectral detail. With the advent of hyperspectral sensors, including imaging spectrometers, this technology is in the process of taking a large leap forward, thus providing the possibility of enabling delivery of much more detailed information. However, this direction of development has drawn even more attention to the matter of noise and other deleterious effects in the data, because reducing the fundamental limitations of spectral detail on information collection raises the limitations presented by noise to even greater importance. Much current effort in remote sensing research is thus being devoted to adjusting the data to mitigate the effects of noise and other deleterious effects. A parallel approach to the problem is to look for analysis approaches and procedures which have reduced sensitivity to such effects. We discuss some of the fundamental principles which define analysis algorithm characteristics providing such reduced sensitivity. One such analysis procedure including an example analysis of a data set is described, illustrating this effect.

  10. Nozzle Numerical Analysis Of The Scimitar Engine

    NASA Astrophysics Data System (ADS)

    Battista, F.; Marini, M.; Cutrone, L.

    2011-05-01

    This work describes part of the activities on the LAPCAT-II A2 vehicle, in which, starting from the available conceptual vehicle design and the related pre-cooled turbo-ramjet engine called SCIMITAR, the well-thought-out assumptions made for the performance figures of different components during the iteration process within LAPCAT-I will be assessed in more detail. This paper presents a numerical analysis aimed at the design optimization of the nozzle contour of the LAPCAT A2 SCIMITAR engine designed by Reaction Engines Ltd. (REL) (see Figure 1). In particular, the nozzle shape optimization process is presented for cruise conditions. All the computations have been carried out using the CIRA C3NS code in non-equilibrium conditions. The effect of considering detailed or reduced chemical kinetic schemes has been analyzed, with a particular focus on the production of pollutants. An analysis of engine performance parameters, such as thrust and combustion efficiency, has been carried out.

  11. [Comparative analysis on industrial standardization degree of Chinese and Korean ginseng].

    PubMed

    Chu, Qiao; Xi, Xing-Jun; Wang, He-Yan; Si, Ding-Hua; Tang, Fei; Lan, Tao

    2017-05-01

    Panax ginseng is a medicinal plant well known all over the world for its high nutritional and medicinal value. China and South Korea are the world's major countries for ginseng cultivation, production and export. China's ginseng production accounts for more than half of the world's total, yet its output value is less than Korea's; the standardization of the ginseng industry plays an important role in this gap. This paper analyzes the Chinese and Korean national ginseng standards and their standardization processes in detail, and compares the categories, contents, index selection, age requirements, and implementation and promotion status of the two countries' standards. Disadvantages in the standardization of China's ginseng industry are identified, and advice is given on revising and implementing China's ginseng industry standards, with the aim of enhancing the competitiveness of China's ginseng industry. Copyright© by the Chinese Pharmaceutical Association.

  12. A modeling analysis program for the JPL table mountain Io sodium cloud data

    NASA Technical Reports Server (NTRS)

    Smyth, W. H.; Goldberg, B. A.

    1984-01-01

    A detailed review of 110 of the 263 Region B/C images of the 1981 data set is undertaken and a preliminary assessment of 39 images of the 1976-79 data set is presented. The basic spatial characteristics of these images are discussed. Modeling analysis of these images after further data processing will provide useful information about Io and the planetary magnetosphere. Plans for data processing and modeling analysis are outlined. Results of very preliminary modeling activities are presented.

  13. Information Presentation in Decision and Risk Analysis: Answered, Partly Answered, and Unanswered Questions.

    PubMed

    Keller, L Robin; Wang, Yitong

    2017-06-01

    For the last 30 years, researchers in risk analysis, decision analysis, and economics have consistently shown that decision makers employ different processes for evaluating and combining anticipated and actual losses, gains, delays, and surprises. Although rational models generally prescribe a consistent response, people's heuristic processes will sometimes lead them to be inconsistent in the way they respond to information presented in theoretically equivalent ways. We point out several promising future research directions by listing and detailing a series of answered, partly answered, and unanswered questions. © 2016 Society for Risk Analysis.

  14. The Analysis of Nine Process-Concepts in Elementary Science. Technical Report No. 428.

    ERIC Educational Resources Information Center

    Klausmeier, Herbert J.; And Others

    Theory and research background regarding the teaching of concepts are presented. Procedures are given in detail on how a concept can be analyzed in order to aid in teaching and preparing instructional materials. Nine processes of science drawn from a published elementary science curriculum ("Science: A Process Approach") are treated as concepts…

  15. Microfluidic electrochemical device and process for chemical imaging and electrochemical analysis at the electrode-liquid interface in-situ

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Xiao-Ying; Liu, Bingwen; Yang, Li

    2016-03-01

    A microfluidic electrochemical device and process are detailed that provide chemical imaging and electrochemical analysis under vacuum at the surface of the electrode-sample or electrode-liquid interface in-situ. The electrochemical device allows investigation of various surface layers including diffuse layers at selected depths populated with, e.g., adsorbed molecules in which chemical transformation in electrolyte solutions occurs.

  16. Who uses toll roads? an analysis of central Texas Turnpike users.

    DOT National Transportation Integrated Search

    2008-12-01

    The report characterizes in the greatest detail both the passenger and commuter users, as well as non-users, of the Central Texas Turnpike System recently opened in November 2006 in Austin, TX. The process of analysis includes a review of literat...

  17. The Reactome pathway Knowledgebase

    PubMed Central

    Fabregat, Antonio; Sidiropoulos, Konstantinos; Garapati, Phani; Gillespie, Marc; Hausmann, Kerstin; Haw, Robin; Jassal, Bijay; Jupe, Steven; Korninger, Florian; McKay, Sheldon; Matthews, Lisa; May, Bruce; Milacic, Marija; Rothfels, Karen; Shamovsky, Veronica; Webber, Marissa; Weiser, Joel; Williams, Mark; Wu, Guanming; Stein, Lincoln; Hermjakob, Henning; D'Eustachio, Peter

    2016-01-01

    The Reactome Knowledgebase (www.reactome.org) provides molecular details of signal transduction, transport, DNA replication, metabolism and other cellular processes as an ordered network of molecular transformations, an extended version of a classic metabolic map, in a single consistent data model. Reactome functions both as an archive of biological processes and as a tool for discovering unexpected functional relationships in data such as gene expression pattern surveys or somatic mutation catalogues from tumour cells. Over the last two years we redeveloped major components of the Reactome web interface to improve usability, responsiveness and data visualization. A new pathway diagram viewer provides a faster, clearer interface and smooth zooming from the entire reaction network to the details of individual reactions. Tool performance for analysis of user datasets has been substantially improved, now generating detailed results for genome-wide expression datasets within seconds. The analysis module can now be accessed through a RESTful interface, facilitating its inclusion in third-party applications. A new overview module allows the visualization of analysis results on a genome-wide Reactome pathway hierarchy using a single screen page. The search interface now provides auto-completion as well as a faceted search to narrow result lists efficiently. PMID:26656494
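
    For readers who want to try the RESTful analysis interface mentioned above, the sketch below posts a small gene list to the Reactome Analysis Service. The endpoint path, request format and response field names are assumptions based on the service's published documentation; consult reactome.org for the current API.

    ```python
    # Hedged sketch of calling Reactome's RESTful analysis service with a gene
    # list. Endpoint path and response fields are assumed from the public docs.
    import requests

    genes = "TP53\nBRCA1\nEGFR"  # one identifier per line, plain text
    resp = requests.post(
        "https://reactome.org/AnalysisService/identifiers/",  # assumed endpoint
        data=genes,
        headers={"Content-Type": "text/plain"},
        timeout=30,
    )
    resp.raise_for_status()
    result = resp.json()
    # Print the top pathways by entity p-value (field names assumed from docs).
    for pw in result.get("pathways", [])[:5]:
        print(pw["name"], pw["entities"]["pValue"])
    ```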

  18. The CRDS method application for study of the gas-phase processes in the hot CVD diamond thin film.

    NASA Astrophysics Data System (ADS)

    Buzaianumakarov, Vladimir; Hidalgo, Arturo; Morell, Gerardo; Weiner, Brad; Buzaianu, Madalina

    2006-03-01

    For a detailed analysis of problems related to hot-filament CVD growth of carbon-containing nanomaterials, the different intermediate species forming during growth must be detected, and the dependence of their concentrations on the experimental parameters (concentrations of the stable chemical compounds CH4 and H2S, and the distance from the filament system to the substrate surface) must be investigated. In the present study, the HS and CS radicals were detected using the Cavity Ring-Down Spectroscopy (CRDS) method during hot-filament CVD diamond thin film growth from a CH4 (0.4%) + H2 mixture doped with H2S (400 ppm). The absolute absorption density spectra of the HS and CS radicals were obtained as a function of the experimental parameters. This study proves that the HS and CS radicals are intermediates formed during the hot-filament CVD process. A kinetics approach was developed for detailed analysis of the experimental data; the kinetics scheme includes homogeneous and heterogeneous processes as well as transport of the chemical species in the CVD chamber.

  19. System Evaluation and Life-Cycle Cost Analysis of a Commercial-Scale High-Temperature Electrolysis Hydrogen Production Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwin A. Harvego; James E. O'Brien; Michael G. McKellar

    2012-11-01

    Results of a system evaluation and lifecycle cost analysis are presented for a commercial-scale high-temperature electrolysis (HTE) central hydrogen production plant. The plant design relies on grid electricity to power the electrolysis process and system components, and industrial natural gas to provide process heat. The HYSYS process analysis software was used to evaluate the reference central plant design capable of producing 50,000 kg/day of hydrogen. The HYSYS software performs mass and energy balances across all components to allow optimization of the design using a detailed process flow sheet and realistic operating conditions specified by the analyst. The lifecycle cost analysis was performed using the H2A analysis methodology developed by the Department of Energy (DOE) Hydrogen Program. This methodology utilizes Microsoft Excel spreadsheet analysis tools that require detailed plant performance information (obtained from HYSYS), along with financial and cost information to calculate lifecycle costs. The results of the lifecycle analyses indicate that for a 10% internal rate of return, a large central commercial-scale hydrogen production plant can produce 50,000 kg/day of hydrogen at an average cost of $2.68/kg. When the cost of carbon sequestration is taken into account, the average cost of hydrogen production increases by $0.40/kg to $3.08/kg.
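
    The H2A spreadsheets used in the study are not reproduced here; the sketch below only illustrates the levelized-cost idea underlying them, annualizing capital with a capital recovery factor at the quoted 10% rate. All inputs are hypothetical and are not calibrated to the study's $2.68/kg result.

    ```python
    # Minimal levelized-cost sketch (illustrative only; the study used DOE's
    # H2A spreadsheet methodology, which is far more detailed).
    def capital_recovery_factor(rate, years):
        """Annualize an up-front capital cost over the plant life."""
        return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

    capital = 400e6          # total capital investment, $ (hypothetical)
    annual_opex = 60e6       # O&M + electricity + natural gas, $/yr (hypothetical)
    kg_per_day = 50_000      # plant capacity from the study
    capacity_factor = 0.90   # assumed availability

    crf = capital_recovery_factor(rate=0.10, years=20)   # 10% return, 20-yr life
    annual_kg = kg_per_day * 365 * capacity_factor
    levelized = (capital * crf + annual_opex) / annual_kg
    print(f"Levelized hydrogen cost: ${levelized:.2f}/kg")
    ```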

  20. Using a 3D CAD plant model to simplify process hazard reviews

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tolpa, G.

    A Hazard and Operability (HAZOP) review is a formal predictive procedure used to identify potential hazard and operability problems associated with certain processes and facilities. The HAZOP procedure takes place several times during the life cycle of a facility. Replacing plastic models, layout drawings and detail drawings with a 3D CAD electronic model provides access to process safety information and a level of plant topology detail that approaches the visualization capability of the imagination. This paper describes the process used to add a 3D CAD model to flowsheets and proven computer programs for the conduct of hazard and operability reviews. Using flowsheets and study nodes as a road map for the review, the need for layout and other detail drawings is all but eliminated. Using the 3D CAD model again for a post-P&ID HAZOP supports conformance to layout and safety requirements, provides superior visualization of the plant configuration, and preserves the owner's equity in the design. The responses from the review teams are overwhelmingly in favor of this type of review over one that uses only drawings. Over the long term, the plant model serves more than just process hazards analysis: ongoing use of the model can satisfy the required access to process safety information, OSHA documentation and other legal requirements. Extensive instructions in this paper address the logic of the process hazards analysis and the preparation required to assist anyone who wishes to add the use of a 3D model to their review.

  1. Nonlinear Real-Time Optical Signal Processing.

    DTIC Science & Technology

    1981-06-30

    Real-time homomorphic and logarithmic filtering by halftone nonlinear processing, exploiting large bandwidth and space-bandwidth products, has been achieved. A detailed analysis of degradation due to the finite gamma...

  2. Testing for detailed balance in a financial market

    NASA Astrophysics Data System (ADS)

    Fiebig, H. R.; Musgrove, D. P.

    2015-06-01

    We test a historical price-time series in a financial market (the NASDAQ 100 index) for a statistical property known as detailed balance. The presence of detailed balance would imply that the market can be modeled by a stochastic process based on a Markov chain, thus leading to equilibrium. In economic terms, a positive outcome of the test would support the efficient market hypothesis, a cornerstone of neo-classical economic theory. In contrast to its usage in prevalent economic theory, the term equilibrium here is tied to the returns rather than to the price-time series. The test is based on an action functional S constructed from the elements of the detailed balance condition and the historical data set, and then analyzing S by means of simulated annealing. Checks are performed to verify the validity of the analysis method. We discuss the outcome of this analysis.
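
    The paper's action-functional test is specific to its data set; as a simpler illustration of what detailed balance means, the sketch below estimates a transition matrix from discretized returns, computes its stationary distribution, and measures the asymmetry of probability flows, which vanishes exactly when detailed balance holds. The synthetic returns are a stand-in for the NASDAQ-100 series.

    ```python
    # Illustrative detailed-balance check on discretized returns (a
    # simplification of the paper's simulated-annealing approach).
    import numpy as np

    rng = np.random.default_rng(0)
    returns = rng.standard_normal(100_000)          # stand-in for index returns
    states = np.digitize(returns, np.quantile(returns, [0.25, 0.5, 0.75]))

    n = 4
    counts = np.zeros((n, n))
    for a, b in zip(states[:-1], states[1:]):       # empirical transition counts
        counts[a, b] += 1
    P = counts / counts.sum(axis=1, keepdims=True)  # row-stochastic matrix

    # Stationary distribution = left eigenvector of P with eigenvalue 1.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    pi /= pi.sum()

    # Detailed balance requires pi_i P_ij == pi_j P_ji for all i, j.
    flow = pi[:, None] * P
    asymmetry = np.abs(flow - flow.T).max()
    print(f"max |pi_i P_ij - pi_j P_ji| = {asymmetry:.2e}")
    ```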

  3. High Dynamic Range Digital Imaging of Spacecraft

    NASA Technical Reports Server (NTRS)

    Karr, Brian A.; Chalmers, Alan; Debattista, Kurt

    2014-01-01

    The ability to capture engineering imagery with a wide dynamic range during rocket launches is critical for post-launch processing and analysis [USC03, NNC86]. Rocket launches often present an extreme range of lightness, particularly during night launches. Night launches present a two-fold problem: capturing detail of the vehicle and scene that is masked by darkness, while also capturing detail in the engine plume.

  4. Enhancement tuning and control for high dynamic range images in multi-scale locally adaptive contrast enhancement algorithms

    NASA Astrophysics Data System (ADS)

    Cvetkovic, Sascha D.; Schirris, Johan; de With, Peter H. N.

    2009-01-01

    For real-time imaging in surveillance applications, visibility of details is of primary importance to ensure customer confidence. If we display High-Dynamic-Range (HDR) scenes, whose contrast spans four or more orders of magnitude, on a conventional monitor without additional processing, the results are unacceptable. Compression of the dynamic range is therefore a compulsory part of any high-end video processing chain, because standard monitors are inherently Low-Dynamic-Range (LDR) devices with at most two orders of magnitude of display dynamic range. In real-time camera processing, many complex scenes are improved with local contrast enhancements, bringing details to the best possible visibility. In this paper, we show how a multi-scale high-frequency enhancement scheme, in which gain is a non-linear function of the detail energy, can be used for dynamic range compression of HDR real-time video camera signals. We also show the connection of our enhancement scheme to the processing performed by the Human Visual System (HVS). Our algorithm simultaneously controls perceived sharpness, ringing ("halo") artifacts (contrast) and noise, resulting in a good balance between visibility of details and non-disturbance of artifacts. The overall quality enhancement, suitable for both HDR and LDR scenes, is based on a careful selection of the filter types for the multi-band decomposition and a detailed analysis of the signal per frequency band.
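
    The full multi-band scheme with halo and noise control is not detailed in the abstract; the sketch below is a minimal single-scale version of the core idea, a detail gain that is a non-linear function of local detail energy. The filter choice and parameter values are assumptions.

    ```python
    # Minimal single-scale version of energy-dependent detail gain (the paper
    # uses a multi-band decomposition with additional halo and noise control).
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def enhance(img, sigma=3.0, strength=2.0, knee=0.05):
        """Boost high-frequency detail with a gain that falls off at high
        energy, limiting overshoot (ringing) around strong edges."""
        base = gaussian_filter(img, sigma)             # low-pass band
        detail = img - base                            # high-pass band
        energy = gaussian_filter(detail**2, sigma)     # local detail energy
        gain = 1.0 + strength / (1.0 + energy / knee)  # non-linear gain
        return np.clip(base + gain * detail, 0.0, 1.0)

    img = np.random.default_rng(1).random((128, 128))  # stand-in video frame
    out = enhance(img)
    ```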

  5. Evidence from mixed hydrate nucleation for a funnel model of crystallization.

    PubMed

    Hall, Kyle Wm; Carpendale, Sheelagh; Kusalik, Peter G

    2016-10-25

    The molecular-level details of crystallization remain unclear for many systems. Previous work has speculated on the phenomenological similarities between molecular crystallization and protein folding. Here we demonstrate that molecular crystallization can involve funnel-shaped potential energy landscapes through a detailed analysis of mixed gas hydrate nucleation, a prototypical multicomponent crystallization process. Through this, we contribute both: (i) a powerful conceptual framework for exploring and rationalizing molecular crystallization, and (ii) an explanation of phenomenological similarities between protein folding and crystallization. Such funnel-shaped potential energy landscapes may be typical of broad classes of molecular ordering processes, and can provide a new perspective for both studying and understanding these processes.

  7. Planning for Success: Integrating Analysis with Decision Making.

    ERIC Educational Resources Information Center

    Goho, James; Webb, Ken

    2003-01-01

    Describes a successful strategic planning process at a large community college, which linked the analytic inputs of research with the authority and intuition of leaders. Reports key factors attributed to the process' success, including a collegial and organized structure, detailed project management plans, and confidence in the environmental scan.…

  8. Plane-wave decomposition by spherical-convolution microphone array

    NASA Astrophysics Data System (ADS)

    Rafaely, Boaz; Park, Munhum

    2004-05-01

    Reverberant sound fields are widely studied, as they have a significant influence on the acoustic performance of enclosures in a variety of applications. For example, the intelligibility of speech in lecture rooms, the quality of music in auditoria, the noise level in offices, and the production of 3D sound in living rooms are all affected by the enclosed sound field. These sound fields are typically studied through frequency response measurements or statistical measures such as reverberation time, which do not provide detailed spatial information. The aim of the work presented in this seminar is the detailed analysis of reverberant sound fields. A measurement and analysis system based on acoustic theory and signal processing, designed around a spherical microphone array, is presented. Detailed analysis is achieved by decomposition of the sound field into waves, using spherical Fourier transform and spherical convolution. The presentation will include theoretical review, simulation studies, and initial experimental results.
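
    As a minimal illustration of the decomposition step, the sketch below computes low-order spherical Fourier coefficients of a pressure field sampled on a sphere by quadrature; the full method adds radial equalization and spherical convolution, which are omitted here. The sampling grid and test field are hypothetical.

    ```python
    # Minimal discrete spherical Fourier transform of pressure samples on a
    # sphere (the front end of plane-wave decomposition).
    import numpy as np
    from scipy.special import sph_harm

    # Hypothetical equal-angle grid with sin(polar)-weighted quadrature.
    n_az, n_pol = 16, 8
    az = np.linspace(0, 2 * np.pi, n_az, endpoint=False)
    pol = (np.arange(n_pol) + 0.5) * np.pi / n_pol
    AZ, POL = np.meshgrid(az, pol)
    w = np.sin(POL) * (2 * np.pi / n_az) * (np.pi / n_pol)

    # Stand-in pressure field: Y_1^0 itself, so p_{1,0} should dominate.
    p = sph_harm(0, 1, AZ, POL).real

    def sft_coeff(m, n):
        """p_nm = integral over the sphere of p * conj(Y_n^m), by quadrature."""
        return np.sum(w * p * np.conj(sph_harm(m, n, AZ, POL)))

    for n in range(3):
        for m in range(-n, n + 1):
            c = sft_coeff(m, n)
            if abs(c) > 1e-3:
                print(f"p_{n}{m:+d} = {c:.3f}")
    ```

    For a plane wave, these coefficients factor into a known radial term and a direction-dependent term, which is what allows the arrival directions to be recovered.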

  9. Producing Hydrogen With Sunlight

    NASA Technical Reports Server (NTRS)

    Biddle, J. R.; Peterson, D. B.; Fujita, T.

    1987-01-01

    Costs high but reduced by further research. Producing hydrogen fuel on large scale from water by solar energy practical if plant costs reduced, according to study. Sunlight attractive energy source because it is free and because photon energy converts directly to chemical energy when it breaks water molecules into diatomic hydrogen and oxygen. Conversion process low in efficiency and photochemical reactor must be spread over large area, requiring large investment in plant. Economic analysis pertains to generic photochemical processes. Does not delve into details of photochemical reactor design because detailed reactor designs do not exist at this early stage of development.

  10. Optical Traps to Study Properties of Molecular Motors

    PubMed Central

    Spudich, James A.; Rice, Sarah E.; Rock, Ronald S.; Purcell, Thomas J.; Warrick, Hans M.

    2016-01-01

    In vitro motility assays enabled the analysis of coupling between ATP hydrolysis and movement of myosin along actin filaments or kinesin along microtubules. Single-molecule assays using laser trapping have been used to obtain more detailed information about kinesins, myosins, and processive DNA enzymes. The combination of in vitro motility assays with laser-trap measurements has revealed detailed dynamic structural changes associated with the ATPase cycle. This article describes the use of optical traps to study processive and nonprocessive molecular motor proteins, focusing on the design of the instrument and the assays to characterize motility. PMID:22046048

  11. Internal Drivers of External Flexibility: A Detailed Analysis

    DTIC Science & Technology

    2007-08-14

    For example, order processing within a supplier's firm is a competence; meeting customer demand by providing a consistent delivery schedule is a capability... Focus interviews covered the following logistics areas: (a) order processing, (b) inventory, (c) transportation, (d) warehousing and materials handling... In logistics, superior service depends upon order processing (Byrne and Markham 1991) and the quality of contact personnel (Innis and LaLonde 1994

  12. U.S. data processing for the IRAS project. [by Jet Propulsion Laboratory Scientific Data Analysis System

    NASA Technical Reports Server (NTRS)

    Duxbury, J. H.

    1983-01-01

    The JPL's Scientific Data Analysis System (SDAS), which will process IRAS data and produce a catalogue of perhaps a million infrared sources in the sky, as well as other information for astronomical records, is described. The purposes of SDAS are discussed, and the major SDAS processors are shown in block diagram. The catalogue processing is addressed, mentioning the basic processing steps which will be applied to raw detector data. Signal reconstruction and conversion to astrophysical units, source detection, source confirmation, data management, and survey data products are considered in detail.

  13. Sustainable Design Approach: A case study of BIM use

    NASA Astrophysics Data System (ADS)

    Abdelhameed, Wael

    2017-11-01

    Achieving sustainable design in areas such as energy-efficient design depends largely on the accuracy of the analysis performed once the design is complete with all its components and material details. Different analysis approaches and methods exist to predict relevant values and metrics such as U-value, energy use and energy savings. Although certain differences in the accuracy of these approaches and methods have been recorded, this paper does not focus on that matter, because all error sources act simultaneously and determining the reason for discrepancies between approaches is difficult. Instead, the paper introduces an approach through which BIM, building information modelling, can be utilised during the initial phases of the design process, by analysing the values and metrics of sustainable design before going into the design details of a building. Managing all project drawings in a single file, BIM is well known as a digital platform that offers a multidisciplinary detailed design, the AEC model (Barison and Santos, 2010; Welle et al., 2011). The paper first presents BIM use in the early phases of the design process to achieve certain required areas of sustainable design, and then introduces BIM use in specific areas such as site selection, wind velocity and building orientation, in order to reach the most sustainable solution possible. In the initial design phases, material details and building components are not yet fully specified or selected; the designer usually focuses on zoning, topology, circulation, and other design requirements. The proposed approach employs BIM strategies and analysis during these initial phases in order to obtain analysis results for each solution or alternative design. Stakeholders and designers can then make more effective decisions with full clarity about the consequences of each alternative, and the architect can proceed with the alternative showing the best sustainability performance. In later design stages, using sustainable types of materials such as insulation, cladding, etc., and applying sustainable building components such as doors, windows, etc., would add further improvements in reaching better values and metrics. The paper describes the methodology of this design approach through BIM strategies adopted in design creation; case studies of architectural designs are used to highlight the details and benefits of the proposed approach.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mann, M.K.

    Technoeconomic analyses have been conducted on two processes to produce hydrogen from biomass: indirectly-heated gasification of biomass followed by steam reforming of the syngas, and biomass pyrolysis followed by steam reforming of the pyrolysis oil. The analysis of the gasification-based process was highly detailed, including a process flowsheet, material and energy balances calculated with a process simulation program, equipment cost estimation, and the determination of the necessary selling price of hydrogen. The pyrolysis-based process analysis was of a less detailed nature, as all necessary experimental data have not been obtained; this analysis is a follow-up to the preliminary economic analysis presented at the 1994 Hydrogen Program Review. A coproduct option in which pyrolysis oil is used to produce hydrogen and a commercial adhesive was also studied for economic viability. Based on feedstock availability estimates, three plant sizes were studied: 907 T/day, 272 T/day, and 27 T/day. The necessary selling price of hydrogen produced by steam reforming syngas from the Battelle Columbus Laboratories indirectly heated biomass gasifier falls within current market values for the large and medium size plants within a wide range of feedstock costs. Results show that the small scale plant does not produce hydrogen at economically competitive prices, indicating that if gasification is used as the upstream process to produce hydrogen, local refueling stations similar to current gasoline stations would probably not be feasible.

  15. Marketing Higher Education to Adults.

    ERIC Educational Resources Information Center

    Kelly, Diana K.

    With fewer recent high school graduates available to attend college, colleges need to increase their efforts to attract adults. If colleges want to attract more adult students, they must develop a comprehensive marketing plan. The marketing process entails a thorough marketing study that includes a detailed institutional analysis, an analysis of…

  16. Evaluation of risk and benefit in thermal effusivity sensor for monitoring lubrication process in pharmaceutical product manufacturing.

    PubMed

    Uchiyama, Jumpei; Kato, Yoshiteru; Uemoto, Yoshifumi

    2014-08-01

    In the process design of tablet manufacturing, understanding and controlling the lubrication process is important from various viewpoints. A detailed analysis of thermal effusivity data from the lubrication process was conducted in this study, and the risks and benefits of the thermal effusivity sensor were evaluated through detailed investigation. Monitoring of thermal effusivity was found to detect mainly the physical change in bulk density, which is altered by dispersal of the lubricant and by coating of the powder particles with the lubricant. Because monitoring thermal effusivity largely amounts to monitoring bulk density, thermal effusivity can correlate strongly with tablet hardness. Moreover, because the thermal effusivity sensor detects not only the conventional change in bulk density but also fractional changes in thermal conductivity and heat capacity, a two-phase progress of the lubrication process could be revealed. However, the individual contributions of density, thermal conductivity and heat capacity to thermal effusivity risk fluctuating with formulation. After careful consideration of which factors are liable to change with formulation, the thermal effusivity sensor can be a useful process analytical technology tool for monitoring, for estimating tablet hardness, and for investigating the detailed mechanism of the lubrication process.
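
    The abstract traces thermal effusivity changes to bulk density, thermal conductivity and heat capacity; the standard definition of effusivity (a textbook relation, not a result of the paper) makes that coupling explicit:

    ```latex
    e = \sqrt{k \, \rho \, c_p}
    ```

    where k is the thermal conductivity, \rho the bulk density and c_p the specific heat capacity. A shift in any one factor, for instance densification as the lubricant disperses, moves e even if the other two stay constant.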

  17. Advanced Information Processing System (AIPS)-based fault tolerant avionics architecture for launch vehicles

    NASA Technical Reports Server (NTRS)

    Lala, Jaynarayan H.; Harper, Richard E.; Jaskowiak, Kenneth R.; Rosch, Gene; Alger, Linda S.; Schor, Andrei L.

    1990-01-01

    An avionics architecture for the advanced launch system (ALS) that uses validated hardware and software building blocks developed under the advanced information processing system program is presented. The AIPS for ALS architecture defined is preliminary, and reliability requirements can be met by the AIPS hardware and software building blocks that are built using the state-of-the-art technology available in the 1992-93 time frame. The level of detail in the architecture definition reflects the level of detail available in the ALS requirements. As the avionics requirements are refined, the architecture can also be refined and defined in greater detail with the help of analysis and simulation tools. A useful methodology is demonstrated for investigating the impact of the avionics suite to the recurring cost of the ALS. It is shown that allowing the vehicle to launch with selected detected failures can potentially reduce the recurring launch costs. A comparative analysis shows that validated fault-tolerant avionics built out of Class B parts can result in lower life-cycle-cost in comparison to simplex avionics built out of Class S parts or other redundant architectures.

  18. WEST-3 wind turbine simulator development

    NASA Technical Reports Server (NTRS)

    Hoffman, J. A.; Sridhar, S.

    1985-01-01

    The software developed for WEST-3, a new, all-digital, fully programmable wind turbine simulator, is described. The process of wind turbine simulation on WEST-3 is described in detail. The major steps are the processing of the mathematical models, the preparation of the constant data, and the use of system-software-generated executable code for running on WEST-3. The mechanics of reformulation, normalization, and scaling of the mathematical models are discussed in detail, in particular the significance of reformulation, which leads to accurate simulations. Descriptions are given of the preprocessor computer programs used to prepare the constant data needed in the simulation. These programs, in addition to scaling and normalizing all the constants, relieve the user from having to generate a large number of constants used in the simulation. Brief descriptions are also given of the components of the WEST-3 system software: Translator, Assembler, Linker, and Loader. Also included are details of the aeroelastic rotor analysis, which is the center of a wind turbine simulation model; an analysis of the gimbal subsystem; and listings of the variables, constants, and equations used in the simulation.

  19. Environmental analysis using integrated GIS and remotely sensed data - Some research needs and priorities

    NASA Technical Reports Server (NTRS)

    Davis, Frank W.; Quattrochi, Dale A.; Ridd, Merrill K.; Lam, Nina S.-N.; Walsh, Stephen J.

    1991-01-01

    This paper discusses some basic scientific issues and research needs in the joint processing of remotely sensed and GIS data for environmental analysis. Two general topics are treated in detail: (1) scale dependence of geographic data and the analysis of multiscale remotely sensed and GIS data, and (2) data transformations and information flow during data processing. The discussion of scale dependence focuses on the theory and applications of spatial autocorrelation, geostatistics, and fractals for characterizing and modeling spatial variation. Data transformations during processing are described within the larger framework of geographical analysis, encompassing sampling, cartography, remote sensing, and GIS. Development of better user interfaces between image processing, GIS, database management, and statistical software is needed to expedite research on these and other impediments to integrated analysis of remotely sensed and GIS data.
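
    Spatial autocorrelation, one of the tools the paper highlights, is most often quantified with Moran's I; the sketch below computes it for a gridded variable with rook-adjacency weights. The grid data are synthetic.

    ```python
    # Minimal Moran's I sketch for a gridded variable with rook adjacency.
    import numpy as np

    def morans_i(grid):
        """Moran's I = (n / W) * sum_ij w_ij z_i z_j / sum_i z_i^2, with
        w_ij = 1 for rook neighbours and z the mean-centred values."""
        z = grid - grid.mean()
        num, W = 0.0, 0
        for dr, dc in [(0, 1), (1, 0), (0, -1), (-1, 0)]:
            shifted = np.roll(z, (dr, dc), axis=(0, 1))
            # Mask the wrap-around edges introduced by np.roll.
            mask = np.ones_like(z, dtype=bool)
            if dr == 1:  mask[0, :] = False
            if dr == -1: mask[-1, :] = False
            if dc == 1:  mask[:, 0] = False
            if dc == -1: mask[:, -1] = False
            num += (z * shifted)[mask].sum()
            W += mask.sum()
        return (z.size / W) * num / (z**2).sum()

    smooth = np.add.outer(np.arange(20), np.arange(20)).astype(float)
    noise = np.random.default_rng(2).random((20, 20))
    print(f"Moran's I, smooth gradient: {morans_i(smooth):+.3f}")  # near +1
    print(f"Moran's I, white noise:     {morans_i(noise):+.3f}")   # near 0
    ```

    The contrast between the two cases is the point: spatially structured variables score high, which is what makes the statistic useful for characterizing scale dependence in merged remote sensing and GIS layers.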

  20. 22 CFR 124.13 - Procurement by United States persons in foreign countries (offshore procurement).

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... (build-to-print means producing an end-item (i.e., system, subsystem or component) from technical... of any information which discloses design methodology, engineering analysis, detailed process...

  1. 22 CFR 124.13 - Procurement by United States persons in foreign countries (offshore procurement).

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... (build-to-print means producing an end-item (i.e., system, subsystem or component) from technical... of any information which discloses design methodology, engineering analysis, detailed process...

  2. 22 CFR 124.13 - Procurement by United States persons in foreign countries (offshore procurement).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... (build-to-print means producing an end-item (i.e., system, subsystem or component) from technical... of any information which discloses design methodology, engineering analysis, detailed process...

  3. 22 CFR 124.13 - Procurement by United States persons in foreign countries (offshore procurement).

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... (build-to-print means producing an end-item (i.e., system, subsystem or component) from technical... of any information which discloses design methodology, engineering analysis, detailed process...

  4. 22 CFR 124.13 - Procurement by United States persons in foreign countries (offshore procurement).

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... (build-to-print means producing an end-item (i.e., system, subsystem or component) from technical... of any information which discloses design methodology, engineering analysis, detailed process...

  5. Morphology of the external genitalia of the adult male and female mice as an endpoint of sex differentiation

    PubMed Central

    Weiss, Dana A.; Rodriguez, Esequiel; Cunha, Tristan; Menshenina, Julia; Barcellos, Dale; Chan, Lok Yun; Risbridger, Gail; Baskin, Laurence; Cunha, Gerald

    2013-01-01

    Adult external genitalia (ExG) are the endpoints of normal sex differentiation. Detailed morphometric analysis and comparison of adult mouse ExG has revealed 10 homologous features distinguishing the penis and clitoris that define masculine vs. feminine sex differentiation. These features have enabled the construction of a simple metric to evaluate various intersex conditions in mutant or hormonally manipulated mice. This review focuses on the morphology of the adult mouse penis and clitoris through detailed analysis of histologic sections, scanning electron microscopy, and three-dimensional reconstruction. We also present previous results from evaluation of “non-traditional” mammals, such as the spotted hyena and wallaby to demonstrate the complex process of sex differentiation that involves not only androgen-dependent processes, but also estrogen-dependent and hormone-independent mechanisms. PMID:21893161

  6. Parvocellular Pathway Impairment in Autism Spectrum Disorder: Evidence from Visual Evoked Potentials

    ERIC Educational Resources Information Center

    Fujita, Takako; Yamasaki, Takao; Kamio, Yoko; Hirose, Shinichi; Tobimatsu, Shozo

    2011-01-01

    In humans, visual information is processed via parallel channels: the parvocellular (P) pathway analyzes color and form information, whereas the magnocellular (M) stream plays an important role in motion analysis. Individuals with autism spectrum disorder (ASD) often show superior performance in processing fine detail, but impaired performance in…

  7. Application of the NUREG/CR-6850 EPRI/NRC Fire PRA Methodology to a DOE Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tom Elicson; Bentley Harwood; Richard Yorg

    2011-03-01

    The application of the NUREG/CR-6850 EPRI/NRC fire PRA methodology to a DOE facility presented several challenges. This paper documents the process and discusses several insights gained during development of the fire PRA. A brief review of the tasks performed is provided, with particular focus on the following: • Tasks 5 and 14: Fire-induced risk model and fire risk quantification. A key lesson learned was to begin model development and quantification as early as possible in the project, using screening values and simplified modeling if necessary. • Tasks 3 and 9: Fire PRA cable selection and detailed circuit failure analysis. In retrospect, it would have been beneficial to perform the model development and quantification in two phases, with detailed circuit analysis applied during phase 2. This would have allowed development of a robust model and quantification earlier in the project and would have provided insights into where to focus the detailed circuit analysis efforts. • Tasks 8 and 11: Scoping fire modeling and detailed fire modeling. More focus should be placed on detailed fire modeling and less on scoping fire modeling; this was the approach taken for the fire PRA. • Task 14: Fire risk quantification. Typically, multiple safe shutdown (SSD) components fail during a given fire scenario, so dependent failure analysis is critical to obtaining a meaningful fire risk quantification. Dependent failure analysis for the fire PRA presented several challenges, which are discussed in the full paper.

  8. Classification of processes involved in sharing individual participant data from clinical trials.

    PubMed

    Ohmann, Christian; Canham, Steve; Banzi, Rita; Kuchinke, Wolfgang; Battaglia, Serena

    2018-01-01

    Background: In recent years, a cultural change in the handling of data from research has resulted in the strong promotion of a culture of openness and increased sharing of data. In the area of clinical trials, sharing of individual participant data involves a complex set of processes and the interaction of many actors and actions. Individual services/tools to support data sharing are available, but what is missing is a detailed, structured and comprehensive list of processes/subprocesses involved and tools/services needed. Methods: Principles and recommendations from a published data sharing consensus document are analysed in detail by a small expert group. Processes/subprocesses involved in data sharing are identified and linked to actors and possible services/tools. Definitions are adapted from the business process model and notation (BPMN) and applied in the analysis. Results: A detailed and comprehensive list of individual processes/subprocesses involved in data sharing, structured according to 9 main processes, is provided. Possible tools/services to support these processes/subprocesses are identified and grouped according to major type of support. Conclusions: The list of individual processes/subprocesses and tools/services identified is a first step towards development of a generic framework or architecture for sharing of data from clinical trials. Such a framework is strongly needed to give an overview of how various actors, research processes and services could form an interoperable system for data sharing.

  10. Silicon production process evaluations

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Chemical engineering analyses involving the preliminary process design of a plant (1,000 metric tons/year capacity) to produce silicon via the technology under consideration were accomplished. Major activities in the chemical engineering analyses included base-case conditions, reaction chemistry, process flowsheet, material balance, energy balance, property data, equipment design, major equipment list, and production labor, carried forward for economic analysis. The process design package provided detailed data for the raw materials, utilities, major process equipment and production labor requirements necessary for polysilicon production in each process.

  11. State of the art in pathology business process analysis, modeling, design and optimization.

    PubMed

    Schrader, Thomas; Blobel, Bernd; García-Rojo, Marcial; Daniel, Christel; Słodkowska, Janina

    2012-01-01

    For analyzing current workflows and processes, for improving them, for quality management and quality assurance, for integrating hardware and software components, and also for education, training and communication between experts from different domains, modeling the business processes of a pathology department is indispensable. The authors highlight three main processes in pathology: general diagnostics, cytology diagnostics, and autopsy. In this chapter, those processes are formally modeled and described in detail. Finally, specialized processes such as immunohistochemistry and frozen section are also considered.

  12. Stochastic flow shop scheduling of overlapping jobs on tandem machines in application to optimizing the US Army's deliberate nuclear, biological, and chemical decontamination process, (final report). Master's thesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Novikov, V.

    1991-05-01

    The U.S. Army's detailed equipment decontamination process is a stochastic flow shop with N independent, non-identical jobs (vehicles) whose processing times overlap. This flow shop consists of up to six non-identical machines (stations); with the exception of one station, the processing times of the jobs are random variables. Based on an analysis of the processing times, the jobs for the 56 Army heavy division companies were scheduled according to the best shortest expected processing time - longest expected processing time (SEPT-LEPT) sequence. To assist in this scheduling, the Gap Comparison Heuristic was developed to select the best SEPT-LEPT schedule. This schedule was then used in balancing the detailed equipment decon line in order to find the best possible site configuration subject to several constraints. The detailed troop decon line, in which all jobs are independent and identically distributed, was then balanced. Lastly, an NBC decon optimization computer program was developed using the scheduling and line balancing results; this program serves as a prototype module for the ANBACIS automated NBC decision support system. Keywords: Decontamination, Stochastic flow shop, Scheduling, Stochastic scheduling, Minimization of the makespan, SEPT-LEPT sequences, Flow shop line balancing, ANBACIS.
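
    The Gap Comparison Heuristic itself is not reproduced here, but the underlying permutation flow-shop mechanics are standard: completion times follow the recurrence C(i,j) = max(C(i-1,j), C(i,j-1)) + p(i,j), which also supports Monte Carlo evaluation of a candidate sequence under random processing times, as sketched below. The distributions and job data are hypothetical.

    ```python
    # Standard permutation flow-shop completion-time recurrence, used here to
    # Monte-Carlo-estimate the expected makespan of a fixed job sequence.
    # (Processing-time distributions are hypothetical, not the Army data.)
    import numpy as np

    def makespan(p):
        """p[i, j] = processing time of job i on machine j, jobs in order."""
        n_jobs, n_machines = p.shape
        C = np.zeros((n_jobs, n_machines))
        for i in range(n_jobs):
            for j in range(n_machines):
                prev_job = C[i - 1, j] if i > 0 else 0.0
                prev_machine = C[i, j - 1] if j > 0 else 0.0
                C[i, j] = max(prev_job, prev_machine) + p[i, j]
        return C[-1, -1]

    rng = np.random.default_rng(3)
    mean_times = np.array([[20, 15, 30], [10, 25, 20],
                           [30, 10, 15], [15, 20, 25]], dtype=float)
    samples = [makespan(rng.gamma(shape=4.0, scale=mean_times / 4.0))
               for _ in range(2000)]
    print(f"expected makespan ~ {np.mean(samples):.1f} min")
    ```

    Ranking candidate SEPT-LEPT sequences by this estimated expected makespan is the kind of comparison the thesis's heuristic automates.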

  13. RootGraph: a graphic optimization tool for automated image analysis of plant roots

    PubMed Central

    Cai, Jinhai; Zeng, Zhanghui; Connor, Jason N.; Huang, Chun Yuan; Melino, Vanessa; Kumar, Pankaj; Miklavcic, Stanley J.

    2015-01-01

    This paper outlines a numerical scheme for accurate, detailed, and high-throughput image analysis of plant roots. In contrast to existing root image analysis tools that focus on root system-average traits, a novel, fully automated and robust approach for the detailed characterization of root traits, based on a graph optimization process is presented. The scheme, firstly, distinguishes primary roots from lateral roots and, secondly, quantifies a broad spectrum of root traits for each identified primary and lateral root. Thirdly, it associates lateral roots and their properties with the specific primary root from which the laterals emerge. The performance of this approach was evaluated through comparisons with other automated and semi-automated software solutions as well as against results based on manual measurements. The comparisons and subsequent application of the algorithm to an array of experimental data demonstrate that this method outperforms existing methods in terms of accuracy, robustness, and the ability to process root images under high-throughput conditions. PMID:26224880

  14. Process-based Cost Estimation for Ramjet/Scramjet Engines

    NASA Technical Reports Server (NTRS)

    Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John

    2003-01-01

    Process-based cost estimation plays a key role in effecting cultural change that integrates distributed science, technology and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation bridging the methodologies of high-level parametric models and detailed bottom-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As the product design progresses and matures, the lower-level, more detailed cost drivers can be re-accessed and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
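
    The analytic hierarchy process named above reduces, at its core, to extracting a priority vector from a pairwise comparison matrix; the sketch below shows that computation together with Saaty's standard consistency check. The comparison values are hypothetical, not taken from the NASA GRC/Boeing CP model.

    ```python
    # Minimal AHP sketch: derive criterion weights from a pairwise comparison
    # matrix via its principal eigenvector, with Saaty's consistency ratio.
    import numpy as np

    A = np.array([[1.0, 3.0, 5.0],   # cost vs. performance vs. schedule
                  [1/3, 1.0, 3.0],   # (hypothetical judgments)
                  [1/5, 1/3, 1.0]])

    w, v = np.linalg.eig(A)
    k = np.argmax(w.real)
    weights = v[:, k].real
    weights /= weights.sum()          # normalized priority vector

    lam_max = w.real[k]
    n = A.shape[0]
    CI = (lam_max - n) / (n - 1)      # consistency index
    CR = CI / 0.58                    # Saaty's random index for n = 3
    print("weights:", np.round(weights, 3), f" consistency ratio: {CR:.3f}")
    ```

    A consistency ratio below about 0.1 is conventionally taken to mean the pairwise judgments are coherent enough to use.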

  15. SemanticSCo: A platform to support the semantic composition of services for gene expression analysis.

    PubMed

    Guardia, Gabriela D A; Ferreira Pires, Luís; da Silva, Eduardo G; de Farias, Cléver R G

    2017-02-01

    Gene expression studies often require the combined use of a number of analysis tools. However, manual integration of analysis tools can be cumbersome and error prone. To support a higher level of automation in the integration process, efforts have been made in the biomedical domain towards the development of semantic web services and supporting composition environments. Yet, most environments consider only the execution of simple service behaviours and requires users to focus on technical details of the composition process. We propose a novel approach to the semantic composition of gene expression analysis services that addresses the shortcomings of the existing solutions. Our approach includes an architecture designed to support the service composition process for gene expression analysis, and a flexible strategy for the (semi) automatic composition of semantic web services. Finally, we implement a supporting platform called SemanticSCo to realize the proposed composition approach and demonstrate its functionality by successfully reproducing a microarray study documented in the literature. The SemanticSCo platform provides support for the composition of RESTful web services semantically annotated using SAWSDL. Our platform also supports the definition of constraints/conditions regarding the order in which service operations should be invoked, thus enabling the definition of complex service behaviours. Our proposed solution for semantic web service composition takes into account the requirements of different stakeholders and addresses all phases of the service composition process. It also provides support for the definition of analysis workflows at a high-level of abstraction, thus enabling users to focus on biological research issues rather than on the technical details of the composition process. The SemanticSCo source code is available at https://github.com/usplssb/SemanticSCo. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. The role of real-time in biomedical science: a meta-analysis on computational complexity, delay and speedup.

    PubMed

    Faust, Oliver; Yu, Wenwei; Rajendra Acharya, U

    2015-03-01

    The concept of real-time is very important, as it deals with the realizability of computer based health care systems. In this paper we review biomedical real-time systems with a meta-analysis on computational complexity (CC), delay (Δ) and speedup (Sp). During the review we found that, in the majority of papers, the term real-time is part of the thesis indicating that a proposed system or algorithm is practical. However, these papers were not considered for detailed scrutiny. Our detailed analysis focused on papers which support their claim of achieving real-time, with a discussion on CC or Sp. These papers were analyzed in terms of processing system used, application area (AA), CC, Δ, Sp, implementation/algorithm (I/A) and competition. The results show that the ideas of parallel processing and algorithm delay were only recently introduced and journal papers focus more on Algorithm (A) development than on implementation (I). Most authors compete on big O notation (O) and processing time (PT). Based on these results, we adopt the position that the concept of real-time will continue to play an important role in biomedical systems design. We predict that parallel processing considerations, such as Sp and algorithm scaling, will become more important. Copyright © 2015 Elsevier Ltd. All rights reserved.
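
    The abstract compares papers on speedup (Sp) without restating its definition; for reference, the conventional definitions (standard textbook material, not taken from the reviewed papers) are:

    ```latex
    S_p = \frac{T_1}{T_p}, \qquad S_p \le \frac{1}{(1-f) + f/p} \quad \text{(Amdahl's law)}
    ```

    where T_1 and T_p are the execution times on one and on p processors, and f is the parallelizable fraction of the workload. Amdahl's law explains why reported speedups saturate as parallel resources grow.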

  17. An image-processing method to detect sub-optical features based on understanding noise in intensity measurements.

    PubMed

    Bhatia, Tripta

    2018-07-01

    Accurate quantitative analysis of image data requires that we distinguish, to the extent possible, between fluorescence intensity (true signal) and the noise inherent to its measurement. We image multilamellar membrane tubes and beads that grow from defects in the fluid lamellar phase of the lipid 1,2-dioleoyl-sn-glycero-3-phosphocholine dissolved in water and water-glycerol mixtures, using a fluorescence confocal polarizing microscope. We quantify image noise and determine the noise statistics. Understanding the nature of image noise also helps in optimizing image processing to detect sub-optical features that would otherwise remain hidden. We use an image-processing technique, "optimum smoothening", to improve the signal-to-noise ratio (SNR) of features of interest without smearing their structural details. A high SNR provides the positional accuracy needed to resolve features of interest whose width lies below the optical resolution. Using optimum smoothening, the smallest and largest core diameters detected are of width [Formula: see text] and [Formula: see text] nm, respectively, as discussed in this paper. The image-processing and analysis techniques and the noise modeling discussed here can be used for detailed morphological analysis of features down to sub-optical length scales obtained by any kind of fluorescence intensity imaging in raster mode.
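
    The parameters of "optimum smoothening" are not given in the abstract; the sketch below is a generic stand-in that demonstrates the trade-off the paper exploits: smoothing raises the measured SNR of a narrow feature until the kernel begins to smear it. The profile and noise level are synthetic.

    ```python
    # Generic stand-in for SNR-driven smoothing (not the paper's algorithm).
    # A noisy line profile is smoothed at several widths; SNR rises until the
    # kernel starts to smear the narrow feature.
    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    rng = np.random.default_rng(4)
    x = np.arange(512)
    signal = np.exp(-0.5 * ((x - 256) / 3.0) ** 2)   # narrow fluorescent feature
    noisy = signal + rng.normal(0, 0.3, x.size)      # shot/readout noise stand-in

    for sigma in (0.5, 1.0, 2.0, 4.0, 8.0):
        sm = gaussian_filter1d(noisy, sigma)
        noise = np.std(sm[:128])                     # background region
        peak = sm.max()
        print(f"sigma={sigma:>4}: peak={peak:.2f}, SNR={peak / noise:.1f}")
    ```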

  18. Local Assessment: Using Genre Analysis to Validate Directed Self-Placement

    ERIC Educational Resources Information Center

    Gere, Anne Ruggles; Aull, Laura; Escudero, Moises Damian Perales; Lancaster, Zak; Lei, Elizabeth Vander

    2013-01-01

    Grounded in the principle that writing assessment should be locally developed and controlled, this article describes a study that contextualizes and validates the decisions that students make in the modified Directed Self-Placement (DSP) process used at the University of Michigan. The authors present results of a detailed text analysis of…

  19. Identifying Natural Sources of Resistance: A Case Study Analysis of Curriculum Implementation.

    ERIC Educational Resources Information Center

    Swanson-Owens, Deborah

    Frequently curriculum implementation procedures consist of little more than teachers receiving descriptions of subject matter, definitions of new or technical terminology, and/or outlines detailing the surface steps of an instructional process. This case study analysis of how two high school teachers adapted some new curriculum features into their…

  20. Air pollution from aircraft. [jet exhaust - aircraft fuels/combustion efficiency

    NASA Technical Reports Server (NTRS)

    Heywood, J. B.; Chigier, N. A.

    1975-01-01

    A model which predicts nitric oxide and carbon monoxide emissions from a swirl can modular combustor is discussed. A detailed analysis of the turbulent fuel-air mixing process in the swirl can module wake region is reviewed. Hot wire anemometry was employed, and gas sampling analysis of fuel combustion emissions were performed.

  1. Laboratory cost control and financial management software.

    PubMed

    Mayer, M

    1998-02-09

    Economic constraints within the health care system call for tighter cost control in clinical laboratories. Detailed cost information forms the basis for cost control and financial management. Based on this cost information, proper decisions regarding priorities, procedure choices, personnel policies and investments can be made. This presentation outlines some principles of cost analysis, describes common limitations of cost analysis, and exemplifies the use of software to achieve optimized cost control. One commercially available cost analysis software package, LabCost, is described in some detail. In addition to providing cost information, LabCost also serves as a general management tool for resource handling, accounting, inventory management and billing. The application of LabCost in the selection of a new high-throughput analyzer for a large clinical chemistry service is taken as an example of decisions that can be assisted by cost evaluation. It is concluded that laboratory management that wisely utilizes cost analysis to support the decision-making process will have a clear advantage over laboratories that fail to employ cost considerations to guide their actions.

  2. An innovative and shared methodology for event reconstruction using images in forensic science.

    PubMed

    Milliet, Quentin; Jendly, Manon; Delémont, Olivier

    2015-09-01

    This study presents an innovative methodology for forensic science image analysis for event reconstruction. The methodology is based on experiences from real cases. It provides real added value to technical guidelines such as standard operating procedures (SOPs) and enriches the community of practices at stake in this field. This bottom-up solution outlines the many facets of analysis and the complexity of the decision-making process. Additionally, the methodology provides a backbone for articulating more detailed and technical procedures and SOPs. It emerged from a grounded theory approach; data from individual and collective interviews with eight Swiss and nine European forensic image analysis experts were collected and interpreted in a continuous, circular and reflexive manner. Throughout the process of conducting interviews and panel discussions, similarities and discrepancies were discussed in detail to provide a comprehensive picture of practices and points of view and to ultimately formalise shared know-how. Our contribution sheds light on the complexity of the choices, actions and interactions along the path of data collection and analysis, enhancing both the researchers' and participants' reflexivity. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  3. Get on Board the Cost Effective Way: A Tech Prep Replication Process.

    ERIC Educational Resources Information Center

    Moore, Wayne A.; Szul, Linda F.; Rivosecchi, Karen

    1997-01-01

    The Northwestern Pennsylvania Tech Prep Consortium model for replicating tech prep programs includes these steps: fact finding, local industry analysis, curriculum development, detailed description, marketing strategies, implementation, and program evaluation. (SK)

  4. An information adaptive system study report and development plan

    NASA Technical Reports Server (NTRS)

    Ataras, W. S.; Eng, K.; Morone, J. J.; Beaudet, P. R.; Chin, R.

    1980-01-01

    The purpose of the information adaptive system (IAS) study was to determine how some selected Earth resource applications may be processed onboard a spacecraft and to provide a detailed preliminary IAS design for these applications. Detailed investigations of a number of applications were conducted with regard to IAS and three were selected for further analysis. Areas of future research and development include algorithmic specifications, system design specifications, and IAS recommended time lines.

  5. Integrated Glass Coating Manufacturing Line

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brophy, Brenor

    2015-09-30

    This project aims to enable US module manufacturers to coat glass with Enki’s state-of-the-art tunable functionalized AR coatings at the lowest possible cost and highest possible performance, by encapsulating Enki’s coating process in an integrated tool that facilitates effective process improvement through metrology and data analysis for greater quality and performance, while reducing footprint, operating and capital costs. The Phase 1 objective was a fully designed manufacturing line, including fully specified equipment ready for issue of purchase requisitions; a detailed economic justification based on market prices at the end of Phase 1 and projected manufacturing costs; and a detailed deployment plan for the equipment.

  6. Quality-assurance plan for the analysis of fluvial sediment by the U.S. Geological Survey Kentucky Water Science Center Sediment Laboratory

    USGS Publications Warehouse

    Shreve, Elizabeth A.; Downs, Aimee C.

    2005-01-01

    This report describes laboratory procedures used by the U.S. Geological Survey Kentucky Water Science Center Sediment Laboratory for the processing and analysis of fluvial-sediment samples for concentration of sand and finer material. The report details the processing of a sediment sample through the laboratory, from receiving the sample, through the analytical process, to compiling the results of the requested analysis. Procedures for preserving sample integrity, calibrating and maintaining laboratory and field instruments and equipment, analyzing samples, internal quality assurance and quality control, and validating the sediment-analysis results are also described. The report includes a list of references cited and a glossary of sediment and quality-assurance terms.

  7. Cryogenic mirror analysis

    NASA Technical Reports Server (NTRS)

    Nagy, S.

    1988-01-01

    Because of the extraordinary distances covered by modern telescopes, their optical surfaces must be manufactured to extreme standards of perfection, within a few thousandths of a centimeter. The detection of imperfections of less than 1/20 of a wavelength of light, for application in building the mirror for the Space Infrared Telescope Facility, was undertaken. Because the mirror must be kept very cold while in space, another factor comes into effect: cryogenics. The process to test a specific mirror under cryogenic conditions is described, including the follow-up analysis accomplished by computer. To better illustrate the process and analysis, a Pyrex Hex-Core mirror is followed from laser interferometry in the lab to computer analysis via a program called FRINGE. This analysis via FRINGE is detailed.

  8. Rapid Disaster Damage Estimation

    NASA Astrophysics Data System (ADS)

    Vu, T. T.

    2012-07-01

    Experience from recent disaster events showed that detailed information derived from high-resolution satellite images could meet the requirements of damage analysts and disaster management practitioners. The richer information contained in such high-resolution images, however, increases the complexity of image analysis. As a result, few image analysis solutions can be used practically under time pressure in the context of post-disaster and emergency response. To fill this gap in the employment of remote sensing for disaster response, this research develops a rapid high-resolution satellite mapping solution built upon a dual-scale contextual framework to support damage estimation after a catastrophe. The target objects are buildings (or building blocks) and their condition. On the coarse processing level, statistical region merging is deployed to group pixels into a number of coarse clusters. Based on a majority rule over vegetation, water and shadow indices, it is possible to eliminate the irrelevant clusters; the remaining clusters likely contain building structures. On the fine processing level, smaller objects are formed within each remaining cluster using morphological analysis. Numerous indicators, including spectral, textural and shape indices, are computed for use in a rule-based object classification. The computation time of raster-based analysis depends strongly on the image size, in other words on the number of processed pixels. Breaking the work into two processing levels reduces the number of pixels processed and the redundancy of processing irrelevant information. In addition, it allows a data- and task-based parallel implementation. The performance is demonstrated with QuickBird images of a disaster-affected area of Phang Nga, Thailand, captured after the 2004 Indian Ocean tsunami. The developed solution will be implemented on different platforms, as well as a web processing service, for operational use.
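
    A rough sketch of the dual-scale idea in Python (the block segmentation, thresholds, and brightness rule are hypothetical stand-ins for the statistical region merging and the spectral/textural/shape rules described above):

    import numpy as np

    # Synthetic 2-band scene: near-infrared and red reflectance on a 100x100 grid.
    rng = np.random.default_rng(0)
    nir = rng.uniform(0.1, 0.6, (100, 100))
    red = rng.uniform(0.1, 0.6, (100, 100))

    # Coarse level: a precomputed segmentation; 10x10 blocks stand in for
    # the clusters that statistical region merging would produce.
    rows = np.arange(100)[:, None] // 10
    cols = np.arange(100)[None, :] // 10
    clusters = rows * 10 + cols

    # Majority rule on a vegetation index (NDVI); 0.3 is a hypothetical threshold.
    ndvi = (nir - red) / (nir + red)
    keep = np.zeros(ndvi.shape, dtype=bool)
    for c in np.unique(clusters):
        mask = clusters == c
        if np.mean(ndvi[mask] > 0.3) < 0.5:   # cluster not dominated by vegetation
            keep |= mask

    # Fine level: simple rule-based flagging of bright, building-like pixels
    # inside the remaining clusters (stand-in for morphology plus index rules).
    brightness = (nir + red) / 2
    candidates = keep & (brightness > 0.45)
    print(f"{candidates.mean():.1%} of pixels flagged as building candidates")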

  9. Research and Analysis of Image Processing Technologies Based on DotNet Framework

    NASA Astrophysics Data System (ADS)

    Ya-Lin, Song; Chen-Xi, Bai

    Microsoft .NET is one of the most popular program development platforms. This paper gives a detailed analysis of the advantages and disadvantages of several image processing techniques available in .NET, evaluated by running the same algorithm in programming experiments with each technique. The results show that the two most efficient methods are unsafe pointer access and Direct3D, with Direct3D also suited to 3D simulation development; the other techniques remain useful in some fields but are too inefficient for real-time processing. The experimental results in this paper should help projects involving image processing and simulation based on .NET, and they have strong practical applicability.

  10. Asymmetric simple exclusion process with position-dependent hopping rates: Phase diagram from boundary-layer analysis.

    PubMed

    Mukherji, Sutapa

    2018-03-01

    In this paper, we study a one-dimensional totally asymmetric simple exclusion process with position-dependent hopping rates. Under open boundary conditions, this system exhibits boundary-induced phase transitions in the steady state. Similarly to totally asymmetric simple exclusion processes with uniform hopping, the phase diagram consists of low-density, high-density, and maximal-current phases. In various phases, the shape of the average particle density profile across the lattice including its boundary-layer parts changes significantly. Using the tools of boundary-layer analysis, we obtain explicit solutions for the density profile in different phases. A detailed analysis of these solutions under different boundary conditions helps us obtain the equations for various phase boundaries. Next, we show how the shape of the entire density profile including the location of the boundary layers can be predicted from the fixed points of the differential equation describing the boundary layers. We discuss this in detail through several examples of density profiles in various phases. The maximal-current phase appears to be an especially interesting phase where the boundary layer flows to a bifurcation point on the fixed-point diagram.
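
    For orientation, a minimal Monte Carlo sketch of such a process in Python, with hypothetical entry/exit rates and an arbitrary smooth hopping-rate profile (the paper itself works with the boundary-layer differential equations rather than simulation):

    import numpy as np

    rng = np.random.default_rng(1)
    L, alpha, beta = 100, 0.3, 0.7      # lattice size and entry/exit rates (hypothetical)
    p = 0.5 * (1 + 0.5 * np.sin(np.pi * np.arange(L) / L))   # position-dependent rates
    tau = np.zeros(L, dtype=int)        # site occupations
    density = np.zeros(L)

    steps = 200000
    for t in range(steps):
        i = rng.integers(-1, L)         # -1: entry move; L-1: exit move; else bond (i, i+1)
        if i == -1:
            if tau[0] == 0 and rng.random() < alpha:
                tau[0] = 1
        elif i == L - 1:
            if tau[-1] == 1 and rng.random() < beta:
                tau[-1] = 0
        elif tau[i] == 1 and tau[i + 1] == 0 and rng.random() < p[i]:
            tau[i], tau[i + 1] = 0, 1
        if t >= steps // 2:             # accumulate the profile after relaxation
            density = density + tau
    density /= steps - steps // 2
    print(density[:5], density[-5:])    # bulk values and boundary-layer ends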

  12. Preliminary results from the High Speed Airframe Integration Research project

    NASA Technical Reports Server (NTRS)

    Coen, Peter G.; Sobieszczanski-Sobieski, Jaroslaw; Dollyhigh, Samuel M.

    1992-01-01

    Progress toward the near-term objectives of the NASA Langley High Speed Airframe Integration Research (HiSAIR) project, the development of an analysis system and optimization methods, is reviewed for the project's first year. The characteristics of a Mach 3 HSCT have been analyzed utilizing the newly developed process. In addition to revealing more detailed information about the aerodynamic and structural coupling for this type of vehicle, this exercise aided in further refining the data requirements for the analysis process.

  13. Vulnerability-attention analysis for space-related activities

    NASA Technical Reports Server (NTRS)

    Ford, Donnie; Hays, Dan; Lee, Sung Yong; Wolfsberger, John

    1988-01-01

    Techniques for representing and analyzing trouble spots in structures and processes are discussed. Identification of vulnerable areas usually depends more on particular and often detailed knowledge than on algorithmic or mathematical procedures. In some cases, machine inference can facilitate the identification. The analysis scheme proposed first establishes the geometry of the process, then marks areas that are conditionally vulnerable. This provides a basis for advice on the kinds of human attention or machine sensing and control that can make the risks tolerable.

  14. Elixir - how to handle 2 trillion pixels

    NASA Astrophysics Data System (ADS)

    Magnier, Eugene A.; Cuillandre, Jean-Charles

    2002-12-01

    The Elixir system at CFHT provides automatic data quality assurance and calibration for the wide-field mosaic imager camera CFH12K. Elixir consists of a variety of tools, including: a real-time analysis suite which runs at the telescope to provide quick feedback to the observers; a detailed analysis of the calibration data; and an automated pipeline for processing data to be distributed to observers. To date, 2.4 × 10^12 night-time sky pixels from CFH12K have been processed by the Elixir system.

  15. Looking for Professor Right: Mentee Selection of Mentors in a Formal Mentoring Program

    ERIC Educational Resources Information Center

    Bell, Amani; Treleaven, Lesley

    2011-01-01

    Finding a suitable mentor is crucial to the success of mentoring relationships. In the mentoring literature, however, there is conflicting evidence about the best ways to support the pairing process in organisational mentoring programs. This paper presents a detailed analysis of the pairing process in an academic mentoring program that has…

  16. Capital Budgeting Guidelines: How to Decide Whether to Fund a New Dorm or an Upgraded Computer Lab.

    ERIC Educational Resources Information Center

    Swiger, John; Klaus, Allen

    1996-01-01

    A process for college and university decision making and budgeting for capital outlays that focuses on evaluating the qualitative and quantitative benefits of each proposed project is described and illustrated. The process provides a means to solicit suggestions from those involved and provide detailed information for cost-benefit analysis. (MSE)

  17. Interaction Structures between a Child and Two Therapists in the Psychodynamic Treatment of a Child with Asperger's Disorder

    ERIC Educational Resources Information Center

    Goodman, Geoff; Athey-Lloyd, Laura

    2011-01-01

    Leading the charge to link intervention research with clinical practice is the development of process research, which involves a detailed analysis of specific therapeutic processes over the course of treatment. The delineation of interaction structures--repetitive patterns of interactions between patient and therapist over the course of…

  18. Exploiting Distance Technology to Foster Experimental Design as a Neglected Learning Objective in Labwork in Chemistry

    ERIC Educational Resources Information Center

    d'Ham, Cedric; de Vries, Erica; Girault, Isabelle; Marzin, Patricia

    2004-01-01

    This paper deals with the design process of a remote laboratory for labwork in chemistry. In particular, it focuses on the mutual dependency of theoretical conjectures about learning in the experimental sciences and technological opportunities in creating learning environments. The design process involves a detailed analysis of the expert task and…

  19. Towards an Understanding of the Business Process Analyst: An Analysis of Competencies

    ERIC Educational Resources Information Center

    Sonteya, Thembela; Seymour, Lisa

    2012-01-01

    The increase in adoption of business process management (BPM) and service oriented architecture (SOA) has created a high demand for qualified professionals with a plethora of skills. However, despite the growing amount of literature available on the topics of BPM and SOA, little research has been conducted around developing a detailed list of…

  20. Comprehensive NMR analysis of compositional changes of black garlic during thermal processing.

    PubMed

    Liang, Tingfu; Wei, Feifei; Lu, Yi; Kodani, Yoshinori; Nakada, Mitsuhiko; Miyakawa, Takuya; Tanokura, Masaru

    2015-01-21

    Black garlic is a processed food product obtained by subjecting whole raw garlic to thermal processing that causes chemical reactions, such as the Maillard reaction, which change the composition of the garlic. In this paper, we report a nuclear magnetic resonance (NMR)-based comprehensive analysis of raw garlic and black garlic extracts to determine the compositional changes resulting from thermal processing. ¹H NMR spectra with a detailed signal assignment showed that 38 components were altered by thermal processing of raw garlic. For example, the contents of 11 L-amino acids increased during the first step of thermal processing over 5 days and then decreased. Multivariate data analysis revealed changes in the contents of fructose, glucose, acetic acid, formic acid, pyroglutamic acid, cycloalliin, and 5-(hydroxymethyl)furfural (5-HMF). Our results provide comprehensive information on changes in NMR-detectable components during thermal processing of whole garlic.

  1. Attitude Determination Error Analysis System (ADEAS) mathematical specifications document

    NASA Technical Reports Server (NTRS)

    Nicholson, Mark; Markley, F.; Seidewitz, E.

    1988-01-01

    The mathematical specifications of Release 4.0 of the Attitude Determination Error Analysis System (ADEAS), which provides a general-purpose linear error analysis capability for various spacecraft attitude geometries and determination processes, are presented. The analytical basis of the system is presented, and detailed equations are provided for both three-axis-stabilized and spin-stabilized attitude sensor models.

  2. Technical/commercial feasibility study of the production of fuel-grade ethanol from corn: 100-million-gallon-per-year production facility in Myrtle Grove, Louisiana

    NASA Astrophysics Data System (ADS)

    1982-05-01

    The technical and economic feasibility of producing motor fuel alcohol from corn in a 100 million gallon per year plant to be constructed in Myrtle Grove, Louisiana is evaluated. The evaluation includes a detailed process design using proven technology, a capital cost estimate for the plant, a detailed analysis of the annual operating cost, a market study, a socioeconomic, environmental, health and safety analysis, and a complete financial analysis. Several other considerations for production of ethanol were evaluated including: cogeneration and fuel to be used in firing the boilers; single by-products vs. multiple by-products; and use of boiler flue gas for by-product drying.

  3. Analysis of Particle Image Velocimetry (PIV) Data for Application to Subsonic Jet Noise Studies

    NASA Technical Reports Server (NTRS)

    Blackshire, James L.

    1997-01-01

    Global velocimetry measurements were taken using Particle Image Velocimetry (PIV) in the subsonic flow exiting a 1-inch circular nozzle in an attempt to better understand the turbulence characteristics of its shear layer region. This report presents the results of the PIV analysis and data reduction portions of the test and details the processing that was done. Custom data analysis and data validation algorithms were developed and applied to a data ensemble consisting of over 750 PIV 70 mm photographs taken in the Mach 0.85 flow facility. Results are presented detailing spatial characteristics of the flow, including ensemble mean and standard deviation, turbulence intensities and Reynolds stress levels, and two-point spatial correlations.
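
    The ensemble statistics listed above follow from the standard definitions; a compact Python sketch on synthetic velocity fields (the report's custom validation algorithms are not reproduced here):

    import numpy as np

    rng = np.random.default_rng(2)
    # Ensemble of 750 synthetic 2-component velocity fields on a 32x32 grid (m/s).
    u = 280.0 + 15.0 * rng.standard_normal((750, 32, 32))
    v = 8.0 * rng.standard_normal((750, 32, 32))

    u_mean, v_mean = u.mean(axis=0), v.mean(axis=0)     # ensemble mean
    u_fl, v_fl = u - u_mean, v - v_mean                 # fluctuating parts

    turb_intensity = u_fl.std(axis=0) / u_mean          # streamwise turbulence intensity
    reynolds_stress = (u_fl * v_fl).mean(axis=0)        # <u'v'>, the kinematic shear stress

    # Two-point spatial correlation of u' along x, referenced to the grid centre.
    ref = u_fl[:, 16, 16]
    corr = [np.mean(ref * u_fl[:, 16, j]) / (ref.std() * u_fl[:, 16, j].std())
            for j in range(32)]
    print(f"peak turbulence intensity: {turb_intensity.max():.3f}")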

  4. Fatigue Life Variability in Large Aluminum Forgings with Residual Stress

    DTIC Science & Technology

    2011-07-01

    … A detailed finite element analysis of the forge/quench/coldwork/machine process was performed in order to predict the bulk residual stresses in a fictitious aluminum bulkhead. Work continues to develop the capability for computational simulation of the forge, quench, cold work and machining processes.

  5. Modular digital holographic fringe data processing system

    NASA Technical Reports Server (NTRS)

    Downward, J. G.; Vavra, P. C.; Schebor, F. S.; Vest, C. M.

    1985-01-01

    A software architecture suitable for reducing holographic fringe data into useful engineering data is developed and tested. The results, along with a detailed description of the proposed architecture for a Modular Digital Fringe Analysis System, are presented.

  6. An Update on the NASA Planetary Science Division Research and Analysis Program

    NASA Astrophysics Data System (ADS)

    Richey, Christina; Bernstein, Max; Rall, Jonathan

    2015-01-01

    Introduction: NASA's Planetary Science Division (PSD) solicits its Research and Analysis (R&A) programs each year in Research Opportunities in Space and Earth Sciences (ROSES). Beginning with the 2014 ROSES solicitation, PSD will be changing the structure of the program elements under which the majority of planetary science R&A is done. Major changes include the creation of five core research program elements aligned with PSD's strategic science questions, the introduction of several new R&A opportunities, new submission requirements, and a new timeline for proposal submission. ROSES and NSPIRES: ROSES contains the research announcements for all of SMD. Submission of ROSES proposals is done electronically via NSPIRES: http://nspires.nasaprs.com. We will present further details on the proposal submission process to help guide younger scientists. Statistical trends, including the average award size within the PSD programs, selection rates, and lessons learned, will be presented. Information on new programs will also be presented, if available. Review Process and Volunteering: The SARA website (http://sara.nasa.gov) contains information on all ROSES solicitations. There is an email address (SARA@nasa.gov) for inquiries and an area for volunteer reviewers to sign up. The peer review process is based on Scientific/Technical Merit, Relevance, and Level of Effort, and will be detailed within this presentation. ROSES 2014 submission changes: All PSD programs will use a two-step proposal submission process. A Step-1 proposal is required and must be submitted electronically by the Step-1 due date. The Step-1 proposal should include a description of the science goals and objectives to be addressed, a brief description of the methodology to be used to address them, and the relevance of the proposed research to the program element being proposed to. Additional Information: Additional details will be provided on the Cassini Data Analysis Program, the Exoplanets Research Program, and the Discovery Data Analysis Program, for which Dr. Richey is the Lead Program Officer.

  7. Laser synthesized super-hydrophobic conducting carbon with broccoli-type morphology as a counter-electrode for dye sensitized solar cells

    NASA Astrophysics Data System (ADS)

    Gokhale, Rohan; Agarkar, Shruti; Debgupta, Joyashish; Shinde, Deodatta; Lefez, Benoit; Banerjee, Abhik; Jog, Jyoti; More, Mahendra; Hannoyer, Beatrice; Ogale, Satishchandra

    2012-10-01

    A laser photochemical process is introduced to realize superhydrophobic conducting carbon coatings with broccoli-type hierarchical morphology for use as a metal-free counter electrode in a dye sensitized solar cell. The process involves pulsed excimer laser irradiation of a thin layer of the liquid haloaromatic organic solvent o-dichlorobenzene (DCB). The coating reflects a carbon nanoparticle self-assembled, process-controlled morphology that yields a solar-to-electric power conversion efficiency of 5.1%, as opposed to 6.2% obtained with the conventional Pt-based electrode.

  8. Cognitive Task Analysis of Network Analysts and Managers for Network Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erbacher, Robert; Frincke, Deborah A.; Wong, Pak C.

    The goal of the project was to create a set of next-generation cyber situational awareness capabilities, with applications to other domains in the long term. The aim is to improve the decision-making process so that decision makers can choose better actions. To this end, we put extensive effort into obtaining feedback from network analysts and managers and understanding what their needs truly were; consequently, this is the focus of this portion of the research. This paper discusses the methodology we followed to acquire this feedback from the analysts, namely a cognitive task analysis. Additionally, this paper provides the details we acquired from the analysts, essentially covering their processes, goals, concerns, and the data and metadata they analyze. A final result we describe is the generation of a task-flow diagram.

  9. High resolution imaging of latent fingerprints by localized corrosion on brass surfaces.

    PubMed

    Goddard, Alex J; Hillman, A Robert; Bond, John W

    2010-01-01

    The Atomic Force Microscope (AFM) is capable of imaging fingerprint ridges on polished brass substrates at an unprecedented level of detail. While exposure to elevated humidity at ambient or slightly raised temperatures does not change the image appreciably, subsequent brief heating in a flame results in complete loss of the sweat deposit and the appearance of pits and trenches. Localized elemental analysis (using EDAX, coupled with SEM imaging) shows the presence of the constituents of salt in the initial deposits. Together with water and atmospheric oxygen, and with thermal enhancement, these are capable of driving a surface corrosion process. This process is sufficiently localized that it has the potential to generate a durable negative topographical image of the fingerprint. AFM examination of surface regions between ridges revealed small deposits (probably microscopic "spatter" of sweat components or transferred particulates) that may ultimately limit the level of ridge detail analysis.

  10. Pantex Falling Man - Independent Review Panel Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertolini, Louis; Brannon, Nathan; Olson, Jared

    2014-11-01

    Consolidated Nuclear Security (CNS) Pantex took the initiative to organize a review panel of subject matter experts to independently assess the adequacy of the Pantex Tripping Man Analysis methodology. The purpose of this report is to capture the details of the assessment, including the scope, approach, results, and detailed appendices. Along with the assessment of the analysis methodology, the panel evaluated the adequacy with which the methodology was applied, as well as congruence with Department of Energy (DOE) standards 3009 and 3016. The approach included the review of relevant documentation, interactive discussion with Pantex staff, and an iterative process of evaluating critical lines of inquiry.

  11. A viscous flow analysis for the tip vortex generation process

    NASA Technical Reports Server (NTRS)

    Shamroth, S. J.; Briley, W. R.

    1979-01-01

    A three dimensional, forward-marching, viscous flow analysis is applied to the tip vortex generation problem. The equations include a streamwise momentum equation, a streamwise vorticity equation, a continuity equation, and a secondary flow stream function equation. The numerical method used combines a consistently split linearized scheme for parabolic equations with a scalar iterative ADI scheme for elliptic equations. The analysis is used to identify the source of the tip vortex generation process, as well as to obtain detailed flow results for a rectangular planform wing immersed in a high Reynolds number free stream at 6 degree incidence.
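
    In the standard vorticity/stream-function form that this description suggests (shown for context; the report's exact equation set is not reproduced here), the secondary velocities derive from a stream function \psi that satisfies a Poisson equation driven by the streamwise vorticity \Omega_x:

    \[
    v = \frac{\partial \psi}{\partial z}, \qquad
    w = -\frac{\partial \psi}{\partial y}, \qquad
    \Omega_x = \frac{\partial w}{\partial y} - \frac{\partial v}{\partial z}
    \;\;\Longrightarrow\;\;
    \nabla^2 \psi = -\Omega_x,
    \]

    which is the elliptic part handled by the scalar iterative ADI scheme, while the streamwise momentum and vorticity equations are marched in the flow direction.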

  12. 2013 Workplace and Equal Opportunity Survey of Active Duty Members: Administration, Datasets, and Codebook

    DTIC Science & Technology

    2016-05-01

    … and Kroeger (2002) provide details on sampling and weighting. Following the summary of the survey methodology is a description of the survey analysis. At any given time, the current address used corresponded to the address number with the highest priority (see the description of priority for the ADDRESS file). The types of address updates provided by the postal service are detailed, each with a description of the processing steps.

  13. Advanced membrane devices. Interim report for October 1996--September 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laciak, D.V.; Langsam, M.; Lewnard, J.J.

    1997-12-31

    Under this Cooperative Agreement, Air Products and Chemicals, Inc. has continued to investigate and develop improved membrane technology for removal of carbon dioxide from natural gas. The task schedule for this reporting period included a detailed assessment of the market opportunity (Chapter 2), continued development and evaluation of membranes and membrane polymers (Chapter 3) and a detailed economic analysis comparing the potential of Air Products membranes to that of established acid gas removal processes (Chapter 4).

  14. Computer-Supported Collaborative Inquiry on Buoyancy: A Discourse Analysis Supporting the "Pieces" Position on Conceptual Change

    ERIC Educational Resources Information Center

    Turcotte, Sandrine

    2012-01-01

    This article describes in detail a conversation analysis of conceptual change in a computer-supported collaborative learning environment. Conceptual change is an essential learning process in science education that has yet to be fully understood. While many models and theories have been developed over the last three decades, empirical data to…

  15. The Main Portal of the Cathedral of Monreale: First Geometric Analysis and Interpretive Assessment of Architectural Features

    NASA Astrophysics Data System (ADS)

    Lo Brutto, M.; Dardanelli, G.; Ebolese, D.; Milazzo, G.; Pipitone, C.; Sciortino, R.

    2017-05-01

    Nowadays, 3D documentation of architectural assets is becoming a demanding task for the valorisation of Cultural Heritage, especially after a restoration project. The 3D documentation can be used for detailed analysis of specific elements, for monitoring the state of conservation and for valorisation actions. The paper describes the results of the 3D close-range photogrammetry survey of the main portal of the Cathedral of Monreale (Palermo, Italy). The Cathedral is one of the most important monumental complexes in Sicily and, for its high historical and artistic importance, has been inscribed on UNESCO's World Heritage List since 2015. The main portal of the Cathedral has recently been restored. The restoration work has given the opportunity to evidence small details of the sculptural decorations and to carry out new interpretative analyses of the bas-reliefs. The main purpose of the work is to obtain a detailed 3D model and a high-resolution orthophoto of the entire portal and of some architectural details. The study was used to evaluate the most appropriate technical solutions for the 3D survey and to define the most suitable parameters for image acquisition and data processing.

  16. FTIR MONITORING OF THE VENTILATION AIR OF CRITICAL BUILDINGS

    EPA Science Inventory

    Fourier transform infrared (FTIR) spectroscopy has been used for detailed analysis of environmental and industrial process samples for many years. FTIR spectrometers have the capability of measuring multiple compounds simultaneously, thus providing an advantage over most other me...

  17. The need for conducting forensic analysis of decommissioned bridges.

    DOT National Transportation Integrated Search

    2014-01-01

    A limiting factor in current bridge management programs is a lack of detailed knowledge of bridge deterioration : mechanisms and processes. The current state of the art is to predict future condition using statistical forecasting : models based upon ...

  18. Users guide to E859 phoswich analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Costales, J.B.

    1992-11-30

    In this memo the authors describe the analysis path used to transform the phoswich data from raw data banks into cross sections suitable for publication. The primary purpose of this memo is not to document each analysis step in great detail but rather to point the reader to the fortran code used and to point out the essential features of the analysis path. A flow chart which summarizes the various steps performed to massage the data from beginning to end is given. In general, each step corresponds to a fortran program which was written to perform that particular task. The automation of the data analysis has been kept purposefully minimal in order to ensure the highest quality of the final product. However, tools have been developed which ease the non-automated steps. There are two major parallel routes for the data analysis: data reduction, and acceptance determination using detailed GEANT Monte Carlo simulations. In this memo, the authors first describe the data reduction up to the point where PHAD banks (Pass 1-like banks) are created. They then describe the steps taken in the GEANT Monte Carlo route. Note that a detailed memo describing the methodology of the acceptance corrections has already been written; therefore the discussion of the acceptance determination will be kept to a minimum and the reader is referred to the other memo for further details. Finally, they describe the cross section formation process and how final spectra are extracted.

  19. Exergy analysis of an industrial-scale ultrafiltrated (UF) cheese production plant: a detailed survey

    NASA Astrophysics Data System (ADS)

    Nasiri, Farshid; Aghbashlo, Mortaza; Rafiee, Shahin

    2017-02-01

    In this study, a detailed exergy analysis of an industrial-scale ultrafiltrated (UF) cheese production plant was conducted based on actual operational data in order to provide more comprehensive insights into the performance of the whole plant and its main subcomponents. The plant included four main subsystems, i.e., steam generator (I), above-zero refrigeration system (II), Bactocatch-assisted pasteurization line (III), and UF cheese production line (IV). In addition, this analysis was aimed at quantifying the exergy destroyed in processing a known quantity of the UF cheese using the mass allocation method. The specific exergy destruction of the UF cheese production was determined at 2330.42 kJ/kg. The contributions of the subsystems I, II, III, and IV to the specific exergy destruction of the UF cheese production were computed as 1337.67, 386.18, 283.05, and 323.51 kJ/kg, respectively. Additionally, it was observed through the analysis that the steam generation system had the largest contribution to the thermodynamic inefficiency of the UF cheese production, accounting for 57.40 % of the specific exergy destruction. Generally, the outcomes of this survey further manifested the benefits of applying exergy analysis for design, analysis, and optimization of industrial-scale dairy processing plants to achieve the most cost-effective and environmentally-benign production strategies.
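
    As a quick consistency check in Python, the reported subsystem contributions reproduce both the total specific exergy destruction and the quoted 57.40 % share of the steam generator:

    # Specific exergy destruction contributions (kJ/kg), as reported in the study.
    contrib = {"steam generator": 1337.67,
               "above-zero refrigeration": 386.18,
               "pasteurization line": 283.05,
               "UF cheese line": 323.51}
    total = sum(contrib.values())
    print(f"total: {total:.2f} kJ/kg")   # 2330.41, matching 2330.42 up to rounding
    print(f"steam generator share: {contrib['steam generator'] / total:.2%}")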

  20. Carl Rogers' Responses in the 17th Session with Miss Mun: Comments from a Process-Experiential and Psychoanalytic Perspective.

    ERIC Educational Resources Information Center

    Gundrum, Monica; Lietaer, Germain; Van Hees-Matthijssen, Christiane

    1999-01-01

    Reproduces the transcript of one of Carl Rogers' filmed therapeutic sessions with Miss Mun, followed by an empirical and clinical-qualitative analysis. Five task-oriented processes are examined in detail: the evocative impact of reflections of feeling; empathic affirmation as a marker of intense vulnerability; focusing reflections; working with…

  1. Uncovering Procedural Knowledge in Craft, Design, and Technology Education: A Case of Hands-On Activities in Electronics

    ERIC Educational Resources Information Center

    Pirttimaa, Matti; Husu, Jukka; Metsärinne, Mika

    2017-01-01

    Different knowledge types have their own specific features and tasks in the learning process. Procedural knowledge is used in craft and technology education when students solve problems individually and share their working knowledge with others. This study presents a detailed analysis of one student's learning process in technology education and…

  2. SIGMA Release v1.2 - Capabilities, Enhancements and Fixes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahadevan, Vijay; Grindeanu, Iulian R.; Ray, Navamita

    In this report, we present details on the SIGMA toolkit: its component structure, capabilities, feature additions in FY15, release cycles, and continuous integration process. These software processes, along with updated documentation, are imperative for successful integration and use in several applications, including the SHARP coupled analysis toolkit for reactor core systems funded under the NEAMS DOE-NE program.

  3. Understanding product cost vs. performance through an in-depth system Monte Carlo analysis

    NASA Astrophysics Data System (ADS)

    Sanson, Mark C.

    2017-08-01

    The manner in which an optical system is toleranced and compensated greatly affects the cost to build it. With a detailed understanding of different tolerance and compensation methods, the end user can decide on the balance of cost and performance. A detailed, phased-approach Monte Carlo analysis can be used to demonstrate the tradeoffs between cost and performance. In complex high-performance optical systems, performance is fine-tuned by making adjustments after the systems are initially built. This process enables the best overall system performance, without the need to fabricate components to stringent tolerance levels that can often lie outside a fabricator's manufacturing capabilities. A good simulation of as-built performance can interrogate different steps of the fabrication and build process. Such a simulation may aid in evaluating whether the measured parameters are within the acceptable range of system performance at that stage of the build process. Finding errors before an optical system progresses further into the build process saves both time and money. Having the appropriate tolerances and compensation strategy tied to a specific performance level will optimize the overall product cost.
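
    A schematic Python sketch of the phased Monte Carlo idea, using a toy merit function and a hypothetical compensator (real tolerances and the performance model would come from the optical design):

    import numpy as np

    rng = np.random.default_rng(3)
    n = 10_000
    # Hypothetical toleranced parameters: element decenter (mm) and tilt (mrad).
    decenter = rng.normal(0.0, 0.02, n)
    tilt = rng.normal(0.0, 0.1, n)

    def wavefront_error(dec, tlt, comp=0.0):
        """Toy merit function: RMS wavefront error grows with misalignment;
        an adjustment 'comp' cancels part of the decenter term."""
        return np.sqrt((8.0 * (dec - comp)) ** 2 + (0.5 * tlt) ** 2)

    uncompensated = wavefront_error(decenter, tilt)
    compensated = wavefront_error(decenter, tilt, comp=0.9 * decenter)  # per-system adjustment

    for name, w in [("no compensation", uncompensated), ("compensated", compensated)]:
        print(f"{name}: yield at 0.07 waves = {(w < 0.07).mean():.1%}")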

  4. Use of paired simple and complex models to reduce predictive bias and quantify uncertainty

    NASA Astrophysics Data System (ADS)

    Doherty, John; Christensen, Steen

    2011-12-01

    Modern environmental management and decision-making is based on the use of increasingly complex numerical models. Such models have the advantage of allowing representation of complex processes and heterogeneous system property distributions inasmuch as these are understood at any particular study site. The latter are often represented stochastically, this reflecting knowledge of the character of system heterogeneity at the same time as it reflects a lack of knowledge of its spatial details. Unfortunately, however, complex models are often difficult to calibrate because of their long run times and sometimes questionable numerical stability. Analysis of predictive uncertainty is also a difficult undertaking when using models such as these. Such analysis must reflect a lack of knowledge of spatial hydraulic property details. At the same time, it must be subject to constraints on the spatial variability of these details born of the necessity for model outputs to replicate observations of historical system behavior. In contrast, the rapid run times and general numerical reliability of simple models often promote good calibration and ready implementation of sophisticated methods of calibration-constrained uncertainty analysis. Unfortunately, however, many system and process details on which uncertainty may depend are, by design, omitted from simple models. This can lead to underestimation of the uncertainty associated with many predictions of management interest. The present paper proposes a methodology that attempts to overcome the problems associated with complex models on the one hand and simple models on the other, while allowing access to the benefits each of them offers. It provides a theoretical analysis of the simplification process from a subspace point of view, yielding insights into the costs of model simplification and into how some of these costs may be reduced. It then describes a methodology for paired model usage through which the predictive bias of a simplified model can be detected and corrected, and post-calibration predictive uncertainty can be quantified. The methodology is demonstrated using a synthetic example based on groundwater modeling environments commonly encountered in northern Europe and North America.

  5. Basic as well as detailed neurosonograms can be performed by offline analysis of three-dimensional fetal brain volumes.

    PubMed

    Bornstein, E; Monteagudo, A; Santos, R; Strock, I; Tsymbal, T; Lenchner, E; Timor-Tritsch, I E

    2010-07-01

    To evaluate the feasibility and the processing time of offline analysis of three-dimensional (3D) brain volumes to perform a basic, as well as a detailed, targeted, fetal neurosonogram. 3D fetal brain volumes were obtained in 103 consecutive healthy fetuses that underwent routine anatomical survey at 20-23 postmenstrual weeks. Transabdominal gray-scale and power Doppler volumes of the fetal brain were acquired by one of three experienced sonographers (an average of seven volumes per fetus). Acquisition was first attempted in the sagittal and coronal planes. When the fetal position did not enable easy and rapid access to these planes, axial acquisition at the level of the biparietal diameter was performed. Offline analysis of each volume was performed by two of the authors in a blinded manner. A systematic technique of 'volume manipulation' was used to identify a list of 25 brain dimensions/structures comprising a complete basic evaluation, intracranial biometry and a detailed targeted fetal neurosonogram. The feasibility and reproducibility of obtaining diagnostic-quality images of the different structures was evaluated, and processing times were recorded, by the two examiners. Diagnostic-quality visualization was feasible in all of the 25 structures, with an excellent visualization rate (85-100%) reported in 18 structures, a good visualization rate (69-97%) reported in five structures and a low visualization rate (38-54%) reported in two structures, by the two examiners. An average of 4.3 and 5.4 volumes were used to complete the examination by the two examiners, with a mean processing time of 7.2 and 8.8 minutes, respectively. The overall agreement rate for diagnostic visualization of the different brain structures between the two examiners was 89.9%, with a kappa coefficient of 0.5 (P < 0.001). In experienced hands, offline analysis of 3D brain volumes is a reproducible modality that can identify all structures necessary to complete both a basic and a detailed second-trimester fetal neurosonogram. Copyright 2010 ISUOG. Published by John Wiley & Sons, Ltd.

  6. Improving Analytical Characterization of Glycoconjugate Vaccines through Combined High-Resolution MS and NMR: Application to Neisseria meningitidis Serogroup B Oligosaccharide-Peptide Glycoconjugates.

    PubMed

    Yu, Huifeng; An, Yanming; Battistel, Marcos D; Cipollo, John F; Freedberg, Darón I

    2018-04-17

    Conjugate vaccines are highly heterogeneous in terms of glycosylation sites and linked oligosaccharide length. Therefore, the characterization of conjugate vaccines' glycosylation state is challenging. However, improved product characterization can lead to enhancements in product control and product quality. Here, we present a synergistic combination of high-resolution mass spectrometry (MS) and nuclear magnetic resonance spectroscopy (NMR) for the analysis of glycoconjugates. We use the power of this strategy to characterize model polysaccharide conjugates and to demonstrate a detailed level of glycoproteomic analysis. These are first steps on model compounds that will help untangle the details of complex product characterization in conjugate vaccines. Ultimately, this strategy can be applied to enhance the characterization of polysaccharide conjugate vaccines. In this study, we lay the groundwork for the analysis of conjugate vaccines. To begin this effort, oligosaccharide-peptide conjugates were synthesized by periodate oxidation of an oligosaccharide of a defined length, α,2-8 sialic acid trimer, followed by a reductive amination, and linking the trimer to an immunogenic peptide from tetanus toxoid. Combined mass spectrometry and nuclear magnetic resonance were used to monitor each reaction and conjugation products. Complete NMR peak assignment and detailed MS information on oxidized oligosialic acid and conjugates are reported. These studies provide a deeper understanding of the conjugation chemistry process and products, which can lead to a better controlled production process.

  7. Image encryption based on a delayed fractional-order chaotic logistic system

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Huang, Xia; Li, Ning; Song, Xiao-Na

    2012-05-01

    A new image encryption scheme is proposed based on a delayed fractional-order chaotic logistic system. In the process of generating a key stream, the time-varying delay and fractional derivative are embedded in the proposed scheme to improve the security. Such a scheme is described in detail with security analyses including correlation analysis, information entropy analysis, run statistic analysis, mean-variance gray value analysis, and key sensitivity analysis. Experimental results show that the newly proposed image encryption scheme possesses high security.
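
    For orientation, a minimal key-stream sketch in Python using the classical integer-order logistic map; the scheme above additionally embeds a time-varying delay and a fractional derivative, which are not reproduced here:

    import numpy as np

    def logistic_keystream(n, x0=0.7, r=3.99, burn=1000):
        """Generate n key bytes from the logistic map x <- r*x*(1-x)."""
        x = x0
        out = np.empty(n, dtype=np.uint8)
        for _ in range(burn):               # discard the transient
            x = r * x * (1 - x)
        for i in range(n):
            x = r * x * (1 - x)
            out[i] = int(x * 256) & 0xFF
        return out

    image = np.random.default_rng(4).integers(0, 256, (64, 64), dtype=np.uint8)
    ks = logistic_keystream(image.size).reshape(image.shape)
    cipher = image ^ ks                         # encrypt by XOR with the key stream
    assert np.array_equal(cipher ^ ks, image)   # XOR with the same stream decrypts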

  8. Detailed Modeling and Irreversible Transfer Process Analysis of a Multi-Element Thermoelectric Generator System

    NASA Astrophysics Data System (ADS)

    Xiao, Heng; Gou, Xiaolong; Yang, Suwen

    2011-05-01

    Thermoelectric (TE) power generation technology, due to its several advantages, is becoming a noteworthy research direction. Many researchers conduct their performance analysis and optimization of TE devices and related applications based on the generalized thermoelectric energy balance equations. These generalized TE equations involve the internal irreversibility of Joule heating inside the thermoelectric device and heat leakage through the thermoelectric couple leg. However, it is assumed that the thermoelectric generator (TEG) is thermally isolated from the surroundings except for the heat flows at the cold and hot junctions. Since the thermoelectric generator is a multi-element device in practice, being composed of many fundamental TE couple legs, the effect of heat transfer between the TE couple leg and the ambient environment is not negligible. In this paper, based on basic theories of thermoelectric power generation and thermal science, detailed modeling of a thermoelectric generator taking account of the phenomenon of energy loss from the TE couple leg is reported. The revised generalized thermoelectric energy balance equations considering the effect of heat transfer between the TE couple leg and the ambient environment have been derived. Furthermore, characteristics of a multi-element thermoelectric generator with irreversibility have been investigated on the basis of the new derived TE equations. In the present investigation, second-law-based thermodynamic analysis (exergy analysis) has been applied to the irreversible heat transfer process in particular. It is found that the existence of the irreversible heat convection process causes a large loss of heat exergy in the TEG system, and using thermoelectric generators for low-grade waste heat recovery has promising potential. The results of irreversibility analysis, especially irreversible effects on generator system performance, based on the system model established in detail have guiding significance for the development and application of thermoelectric generators, particularly for the design and optimization of TE modules.
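
    For context, the generalized thermoelectric energy balance equations referred to above take the standard form (the paper's revision adds convective exchange between the couple leg and the ambient):

    \[
    Q_h = \alpha I T_h + K\,(T_h - T_c) - \tfrac{1}{2} I^2 R, \qquad
    Q_c = \alpha I T_c + K\,(T_h - T_c) + \tfrac{1}{2} I^2 R,
    \]

    where \alpha is the Seebeck coefficient, K and R are the thermal conductance and electrical resistance of the couple leg, and the electrical output is P = Q_h - Q_c = \alpha I (T_h - T_c) - I^2 R.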

  9. Analysis and modeling of wafer-level process variability in 28 nm FD-SOI using split C-V measurements

    NASA Astrophysics Data System (ADS)

    Pradeep, Krishna; Poiroux, Thierry; Scheer, Patrick; Juge, André; Gouget, Gilles; Ghibaudo, Gérard

    2018-07-01

    This work details the analysis of wafer-level global process variability in 28 nm FD-SOI using split C-V measurements. The proposed approach initially evaluates the native on-wafer process variability using efficient extraction methods on split C-V measurements. The on-wafer threshold voltage (VT) variability is first studied and modeled using a simple analytical model. Then, a statistical model based on the Leti-UTSOI compact model is proposed to describe the total C-V variability in different bias conditions. This statistical model is finally used to study the contribution of each process parameter to the total C-V variability.

  10. Approximation of the Newton Step by a Defect Correction Process

    NASA Technical Reports Server (NTRS)

    Arian, E.; Batterman, A.; Sachs, E. W.

    1999-01-01

    In this paper, an optimal control problem governed by a partial differential equation is considered. The Newton step for this system can be computed by solving a coupled system of equations. To do this efficiently with an iterative defect correction process, a modifying operator is introduced into the system. This operator is motivated by local mode analysis. The operator can be used also for preconditioning in Generalized Minimum Residual (GMRES). We give a detailed convergence analysis for the defect correction process and show the derivation of the modifying operator. Numerical tests are done on the small disturbance shape optimization problem in two dimensions for the defect correction process and for GMRES.
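
    A generic Python sketch of the defect correction iteration (the modifying operator M below is a simple diagonal approximation, not the operator derived from local mode analysis in the paper):

    import numpy as np

    def defect_correction(A, M, b, x0, tol=1e-10, max_iter=200):
        """Iterate x <- x + M^{-1}(b - A x): each step solves the easier,
        modified system for the current defect (residual)."""
        x = x0.copy()
        for k in range(max_iter):
            defect = b - A @ x
            if np.linalg.norm(defect) < tol:
                return x, k
            x += np.linalg.solve(M, defect)
        return x, max_iter

    rng = np.random.default_rng(5)
    A = np.diag(np.arange(1.0, 11.0)) + 0.05 * rng.standard_normal((10, 10))
    M = np.diag(np.diag(A))                 # modifying operator: keep only the diagonal
    b = rng.standard_normal(10)
    x, iters = defect_correction(A, M, b, np.zeros(10))
    print(iters, np.linalg.norm(A @ x - b))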

  11. Conducting a narrative analysis.

    PubMed

    Emden, C

    1998-07-01

    This paper describes the process of narrative analysis as undertaken within a nursing study on scholars and scholarship. It follows an earlier paper, 'Theoretical perspectives on narrative inquiry', that described the influencing ideas of Bruner (1987) and Roof (1994) upon the same study. Analysis procedures are described here in sufficient detail for other researchers wishing to implement a similar approach to do so. The process as described has two main components: (A) strategies of 'core story creation' and 'emplotment'; and (B) issues and dilemmas of narrative analysis, especially relating to rigour. The ideas of Polkinghorne (1988), Mishler (1986), and Labov (in Mishler 1986a) are introduced in so far as they impinge upon the analysis process. These relate especially to the development of key terms, and to the analysis strategies of core story creation and emplotment. Outcomes of the study in question are termed 'Signposting the lived-world of scholarship'.

  12. An Aquatic Acoustic Metrics Interface Utility for Underwater Sound Monitoring and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Huiying; Halvorsen, Michele B.; Deng, Zhiqun

    Fish and marine mammals suffer a range of potential effects from intense sound sources generated by anthropogenic underwater processes such as pile driving, shipping, sonar, and underwater blasting. Several underwater sound recording devices (USRs) have been built to monitor the acoustic pressure waves generated by these anthropogenic underwater activities, so relevant processing software becomes indispensable for analyzing the audio files they record. However, existing software packages did not meet performance and flexibility requirements. In this paper, we provide a detailed description of a new software package, named Aquatic Acoustic Metrics Interface (AAMI), a Graphical User Interface (GUI) designed for underwater sound monitoring and analysis. In addition to general functions, such as loading and editing audio files recorded by USRs, the software can compute a series of acoustic metrics in physical units, monitor the sound's influence on fish hearing according to audiograms from different species of fish and marine mammals, and batch process sound files. Detailed applications of the software are discussed, along with several test case scenarios to illustrate its functionality.
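
    The kind of acoustic metrics such a tool reports can be sketched in Python from a pressure time series; the waveform below is synthetic and the formulas are the standard underwater-acoustics definitions (re 1 μPa), not AAMI's implementation:

    import numpy as np

    fs = 48_000                              # sample rate (Hz)
    t = np.arange(0, 0.1, 1 / fs)
    # Synthetic pile-strike-like pressure waveform (Pa): a decaying tone burst.
    p = 2000.0 * np.exp(-40 * t) * np.sin(2 * np.pi * 800 * t)

    p_ref = 1e-6                             # 1 micropascal reference pressure
    peak_spl = 20 * np.log10(np.max(np.abs(p)) / p_ref)
    rms_spl = 20 * np.log10(np.sqrt(np.mean(p ** 2)) / p_ref)
    sel = 10 * np.log10(np.sum(p ** 2) / fs / p_ref ** 2)   # sound exposure level

    print(f"peak SPL {peak_spl:.1f} dB, rms SPL {rms_spl:.1f} dB, SEL {sel:.1f} dB re 1 uPa")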

  13. A fuzzy cost-benefit function to select economical products for processing in a closed-loop supply chain

    NASA Astrophysics Data System (ADS)

    Pochampally, Kishore K.; Gupta, Surendra M.; Cullinane, Thomas P.

    2004-02-01

    The cost-benefit analysis of data associated with the re-processing of used products often involves uncertainty in cash-flow modeling. The data are not objective because of uncertainties in the supply, quality and disassembly times of used products. Hence, decision-makers must rely on "fuzzy" data for analysis. The same parties that are involved in the forward supply chain often carry out the collection and re-processing of used products. It is therefore important that the cost-benefit analysis takes the data of both new products and used products into account. In this paper, a fuzzy cost-benefit function is proposed and used to perform a multi-criteria economic analysis to select the most economical products to process in a closed-loop supply chain. Application of the function is detailed through an illustrative example.
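
    An illustrative Python sketch of fuzzy cost-benefit scoring with triangular fuzzy numbers; the membership shapes, defuzzification rule, and figures are placeholders, not the authors' function:

    # Triangular fuzzy numbers as (low, mode, high); all figures are hypothetical.
    def tfn_sub(a, b):
        """Fuzzy benefit minus fuzzy cost (interval-arithmetic rule)."""
        return (a[0] - b[2], a[1] - b[1], a[2] - b[0])

    def defuzzify(tfn):
        """Centroid of a triangular fuzzy number."""
        return sum(tfn) / 3.0

    products = {
        "product A": {"benefit": (60, 80, 95), "cost": (40, 50, 70)},
        "product B": {"benefit": (50, 70, 85), "cost": (20, 30, 45)},
    }
    scores = {name: defuzzify(tfn_sub(d["benefit"], d["cost"]))
              for name, d in products.items()}
    print(scores, "-> select", max(scores, key=scores.get))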

  14. Assessment of Southern California environment from ERTS-1

    NASA Technical Reports Server (NTRS)

    Bowden, L. W.; Viellenave, J. H.

    1973-01-01

    ERTS-1 imagery is a useful source of data for evaluation of earth resources in Southern California. The improving quality of ERTS-1 imagery and our increasing ability to enhance the imagery have resulted in studies of a variety of phenomena in several Southern California environments. These investigations have produced several significant results of varying detail, including the detection and identification of macro-scale tectonic and vegetational patterns, as well as detailed analysis of urban and agricultural processes. The sequential nature of ERTS-1 imagery has allowed these studies to monitor significant changes in the environment. In addition, some preliminary work has begun toward assessing the impact of expanding recreation, agriculture and urbanization into the fragile desert environment. Refinement of enhancement and mapping techniques and more intensive analysis of ERTS-1 imagery should lead to a greater capability to extract detailed information for more precise evaluations and more accurate monitoring of earth resources in Southern California.

  15. Computer-aided system design

    NASA Technical Reports Server (NTRS)

    Walker, Carrie K.

    1991-01-01

    A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.

  16. High-throughput sequencing: a failure mode analysis.

    PubMed

    Yang, George S; Stott, Jeffery M; Smailus, Duane; Barber, Sarah A; Balasundaram, Miruna; Marra, Marco A; Holt, Robert A

    2005-01-04

    Basic manufacturing principles are becoming increasingly important in high-throughput sequencing facilities where there is a constant drive to increase quality, increase efficiency, and decrease operating costs. While high-throughput centres report failure rates typically on the order of 10%, the causes of sporadic sequencing failures are seldom analyzed in detail and have not, in the past, been formally reported. Here we report the results of a failure mode analysis of our production sequencing facility based on detailed evaluation of 9,216 ESTs generated from two cDNA libraries. Two categories of failures are described; process-related failures (failures due to equipment or sample handling) and template-related failures (failures that are revealed by close inspection of electropherograms and are likely due to properties of the template DNA sequence itself). Preventative action based on a detailed understanding of failure modes is likely to improve the performance of other production sequencing pipelines.

  17. A radiologic study of an ancient Egyptian mummy with a prosthetic toe.

    PubMed

    Brier, Bob; Vinh, Phuong; Schuster, Michael; Mayforth, Howard; Johnson Chapin, Emily

    2015-06-01

    A radiologic examination (both CT and traditional X-ray) of two mummies curated at the Albany Institute of History and Art revealed the identity of the mummified remains as well as details of the person's lifestyle parameters (markers of occupational stress). These mummies, brought to the Institute over 100 years ago, were unstudied until 1989. This preliminary study led to the misappropriation of the remains and the subsequent switching of the remains within their coffins. Recent and more detailed analyses led to the correct identification of sex, a re-association of the remains to their interment coffins, and a detailed analysis of occupational markers. A prosthetic toe was identified in one of the mummies, which led to a functional exploration of prosthetics in the past, including their use as part of funerary processing in ancient Egypt. Finally, details of the embalming process place the wrapped mummy within the time frame identified on the coffin of the mummy identified as Ankhefenmut, as well as confirming his social status. © 2015 Wiley Periodicals, Inc.

  18. Development of a software tool to support chemical and biological terrorism intelligence analysis

    NASA Astrophysics Data System (ADS)

    Hunt, Allen R.; Foreman, William

    1997-01-01

    AKELA has developed a software tool which uses a systems analytic approach to model the critical processes which support the acquisition of biological and chemical weapons by terrorist organizations. This tool has four major components. The first is a procedural expert system which describes the weapon acquisition process. It shows the relationship between the stages a group goes through to acquire and use a weapon, and the activities in each stage required to be successful. It applies to both state-sponsored and small-group acquisition. An important part of this expert system is an analysis of the acquisition process which is embodied in a list of observables of weapon acquisition activity. These observables are cues for intelligence collection. The second component is a detailed glossary of technical terms which helps analysts with a non-technical background understand the potential relevance of collected information. The third component is a linking capability which shows where technical terms apply to the parts of the acquisition process. The final component is a simple, intuitive user interface which shows a picture of the entire process at a glance and lets the user move quickly to get more detailed information. This paper explains each of these four model components.

  19. Sampling procedure for lake or stream surface water chemistry

    Treesearch

    Robert Musselman

    2012-01-01

    Surface waters collected in the field for chemical analyses are easily contaminated. This research note presents a step-by-step detailed description of how to avoid sample contamination when field collecting, processing, and transporting surface water samples for laboratory analysis.

  20. Image processing, analysis, and management tools for gusset plate connections in steel truss bridges.

    DOT National Transportation Integrated Search

    2016-10-01

    This report details the research undertaken and software tools that were developed that enable digital : images of gusset plates to be converted into orthophotos, establish physical dimensions, collect : geometric information from them, and conduct s...

  1. Mechanical Design of a Performance Test Rig for the Turbine Air-Flow Task (TAFT)

    NASA Technical Reports Server (NTRS)

    Xenofos, George; Forbes, John; Farrow, John; Williams, Robert; Tyler, Tom; Sargent, Scott; Moharos, Jozsef

    2003-01-01

    To support development of the Boeing-Rocketdyne RS84 rocket engine, a full-flow reaction turbine geometry was integrated into the NASA-MSFC turbine air-flow test facility. A mechanical design was generated which minimized the amount of new hardware while incorporating all test and instrumentation requirements. This paper provides details of the mechanical design for this Turbine Air-Flow Task (TAFT) test rig. The mechanical design process utilized for this task included the following basic stages: conceptual design, preliminary design, detailed design, baseline of design (including configuration control and drawing revision), fabrication, and assembly. During the design process, many lessons were learned that should benefit future test rig design projects. Of primary importance are well-defined requirements early in the design process, a thorough detailed design package, and effective communication with both the customer and the fabrication contractors. The test rig provided steady and unsteady pressure data necessary to validate the computational fluid dynamics (CFD) code. The rig also helped characterize the turbine blade loading conditions. Test and CFD analysis results are to be presented in another JANNAF paper.

  2. Range Process Simulation Tool

    NASA Technical Reports Server (NTRS)

    Phillips, Dave; Haas, William; Barth, Tim; Benjamin, Perakath; Graul, Michael; Bagatourova, Olga

    2005-01-01

    Range Process Simulation Tool (RPST) is a computer program that assists managers in rapidly predicting and quantitatively assessing the operational effects of proposed technological additions to, and/or upgrades of, complex facilities and engineering systems such as the Eastern Test Range. Originally designed for application to space transportation systems, RPST is also suitable for assessing effects of proposed changes in industrial facilities and large organizations. RPST follows a model-based approach that includes finite-capacity schedule analysis and discrete-event process simulation. A component-based, scalable, open architecture makes RPST easily and rapidly tailorable for diverse applications. Specific RPST functions include: (1) definition of analysis objectives and performance metrics; (2) selection of process templates from a process-template library; (3) configuration of process models for detailed simulation and schedule analysis; (4) design of operations-analysis experiments; (5) schedule and simulation-based process analysis; and (6) optimization of performance by use of genetic algorithms and simulated annealing. The main benefits afforded by RPST are provision of information that can be used to reduce costs of operation and maintenance, and the capability for affordable, accurate, and reliable prediction and exploration of the consequences of many alternative proposed decisions.
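
    The discrete-event core that a tool like RPST builds on can be illustrated compactly. The sketch below, in Python, is a generic finite-capacity, event-queue simulation; the job counts, rates, and two-server resource are illustrative assumptions, not RPST's actual models:

        import heapq
        import random

        def simulate(n_jobs=100, capacity=2, mean_interarrival=1.0,
                     mean_service=1.5, seed=42):
            """Minimal discrete-event sketch: jobs arrive at random, queue for a
            finite-capacity resource, and the mean waiting time is measured."""
            rng = random.Random(seed)
            t, arrivals = 0.0, []
            for _ in range(n_jobs):
                t += rng.expovariate(1.0 / mean_interarrival)
                arrivals.append(t)
            free_at = [0.0] * capacity             # when each server next frees up
            heapq.heapify(free_at)
            total_wait = 0.0
            for arrival in arrivals:
                server_free = heapq.heappop(free_at)
                start = max(arrival, server_free)  # wait only if all servers busy
                total_wait += start - arrival
                heapq.heappush(free_at, start + rng.expovariate(1.0 / mean_service))
            return total_wait / n_jobs

        print(f"mean wait: {simulate():.3f} time units")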

  3. Analysis of the packet formation process in packet-switched networks

    NASA Astrophysics Data System (ADS)

    Meditch, J. S.

    Two new queueing system models for the packet formation process in packet-switched telecommunication networks are developed, and their applications in process stability, performance analysis, and optimization studies are illustrated. The first, an M/M/1 queueing system characterization of the process, is a highly aggregated model which is useful for preliminary studies. The second, a marked extension of an earlier M/G/1 model, permits one to investigate stability, performance characteristics, and design of the packet formation process in terms of the details of processor architecture and of hardware and software implementation, with processor structure and as many parameters as desired treated as variables. The two new models, together with the earlier M/G/1 characterization, span the spectrum of modeling complexity for the packet formation process from basic to advanced.
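
    The abstract does not reproduce the models themselves, but the standard M/M/1 results such an aggregated analysis rests on are textbook material: with Poisson packet arrivals at rate λ and exponential formation service at rate μ, stability requires ρ < 1, and

        \[ \rho = \frac{\lambda}{\mu}, \qquad \bar{N} = \frac{\rho}{1-\rho}, \qquad \bar{T} = \frac{1}{\mu - \lambda}. \]

    For the M/G/1 characterization, the Pollaczek-Khinchine formula gives the mean wait in terms of the second moment of the service (packet formation) time X:

        \[ \bar{W} = \frac{\lambda\,\mathrm{E}[X^2]}{2(1-\rho)}. \]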

  4. Research on Demand Analysis of the Users of the Senior English Diagnostic System

    ERIC Educational Resources Information Center

    Guo, Chen; Zhang, Hui; Yao, Qian; Wu, Min

    2013-01-01

    As the significance of learning English is becoming increasingly apparent, more and more English online practice systems are used by English learners. However, a thorough process of research and detailed analysis of user demand have not been fully implemented before the design of these systems. As a result, these systems may suffer the defects of low…

  5. A Novel Method for the In-Depth Multimodal Analysis of Student Learning Trajectories in Intelligent Tutoring Systems

    ERIC Educational Resources Information Center

    Liu, Ran; Stamper, John; Davenport, Jodi

    2018-01-01

    Temporal analyses are critical to understanding learning processes, yet understudied in education research. Data from different sources are often collected at different grain sizes, which are difficult to integrate. Making sense of data at many levels of analysis, including the most detailed levels, is highly time-consuming. In this paper, we…

  6. The Automated Array Assembly Task of the Low-cost Silicon Solar Array Project, Phase 2

    NASA Technical Reports Server (NTRS)

    Coleman, M. G.; Grenon, L.; Pastirik, E. M.; Pryor, R. A.; Sparks, T. G.

    1978-01-01

    An advanced process sequence for manufacturing high efficiency solar cells and modules in a cost-effective manner is discussed. Emphasis is on process simplicity and minimizing consumed materials. The process sequence incorporates texture etching, plasma processes for damage removal and patterning, ion implantation, low pressure silicon nitride deposition, and plated metal. A reliable module design is presented. Specific process step developments are given. A detailed cost analysis was performed to indicate future areas of fruitful cost reduction effort. Recommendations for advanced investigations are included.

  7. Interdisciplinary study of atmospheric processes and constituents of the mid-Atlantic coastal region [air pollution control studies in Virginia]

    NASA Technical Reports Server (NTRS)

    Kindle, E. C.; Bandy, E. C.; Copeland, G.; Blais, R.; Levy, G.; Sonenshine, D.

    1975-01-01

    Past research projects for the year 1974-1975 are listed along with future research programs in the area of air pollution control, remote sensor analysis of smoke plumes, the biosphere component, and field experiments. A detailed budget analysis is presented. Attachments are included on the following topics: mapping forest vegetation with ERTS-1 MSS data and automatic data processing techniques, and use of LARS system for the quantitative determination of smoke plume lateral diffusion coefficients from ERTS images of Virginia.

  8. Digital image processing for information extraction.

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.

    1973-01-01

    The modern digital computer has made practical image processing techniques for handling nonlinear operations in both the geometrical and the intensity domains, various types of nonuniform noise cleanup, and the numerical analysis of pictures. An initial requirement is that a number of anomalies caused by the camera (e.g., geometric distortion, MTF roll-off, vignetting, and nonuniform intensity response) must be taken into account or removed to avoid their interference with the information extraction process. Examples illustrating these operations are discussed along with computer techniques used to emphasize details, perform analyses, classify materials by multivariate analysis, detect temporal differences, and aid in human interpretation of photos.
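
    Of the camera anomalies listed, vignetting and nonuniform intensity response are classically removed by dark subtraction and flat-field division before any information extraction. A minimal numpy sketch of that step (array contents are invented for illustration):

        import numpy as np

        def flat_field_correct(raw, dark, flat):
            """Remove dark offset and divide out the normalized flat field,
            cancelling vignetting and nonuniform sensor response."""
            gain = (flat - dark) / np.mean(flat - dark)
            return (raw - dark) / np.clip(gain, 1e-6, None)

        # toy frame: a uniform scene seen through a vignetting camera
        y, x = np.mgrid[0:64, 0:64]
        vignette = 1.0 - 0.5 * ((x - 32.0) ** 2 + (y - 32.0) ** 2) / 32.0 ** 2
        raw = 100.0 * vignette + 5.0                    # scene x vignette + dark
        flat = 80.0 * vignette + 5.0                    # flat-field exposure
        corrected = flat_field_correct(raw, np.full((64, 64), 5.0), flat)
        print(round(corrected.std() / corrected.mean(), 6))  # ~0: flattened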

  9. TESS Data Processing and Quick-look Pipeline

    NASA Astrophysics Data System (ADS)

    Fausnaugh, Michael; Huang, Xu; Glidden, Ana; Guerrero, Natalia; TESS Science Office

    2018-01-01

    We describe the data analysis procedures and pipelines for the Transiting Exoplanet Survey Satellite (TESS). We briefly review the processing pipeline developed and implemented by the Science Processing Operations Center (SPOC) at NASA Ames, including pixel/full-frame image calibration, photometric analysis, pre-search data conditioning, transiting planet search, and data validation. We also describe data-quality diagnostic analyses and photometric performance assessment tests. Finally, we detail a "quick-look pipeline" (QLP) that has been developed by the MIT branch of the TESS Science Office (TSO) to provide a fast and adaptable routine to search for planet candidates in the 30 minute full-frame images.
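
    As a concrete example of one early step in such pipelines, full-frame-image light curves are commonly detrended with a running median before any transit search, so that slow instrumental drifts divide out while short box-shaped dips survive. A simplified Python sketch (not SPOC or QLP code; the cadence, window, and injected dip are invented):

        import numpy as np
        from scipy.signal import medfilt

        def detrend(flux, window=25):
            """Divide out a running median (window in cadences, odd) to remove
            slow trends while leaving short transit dips intact."""
            return flux / medfilt(flux, kernel_size=window)

        t = np.linspace(0.0, 27.0, 1296)        # ~27-day sector at 30-min cadence
        flux = 1.0 + 0.02 * np.sin(t / 5.0)     # slow drift
        flux[600:620] -= 0.01                   # injected 1% transit-like dip
        clean = detrend(flux)
        print(int(np.argmin(clean)), round(float(clean.min()), 4))  # dip near 600-620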

  10. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ravindra, M.K.; Banon, H.

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.

  11. Crystallographic data processing for free-electron laser sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Thomas A., E-mail: taw@physics.org; Barty, Anton; Stellato, Francesco

    2013-07-01

    A processing pipeline for diffraction data acquired using the ‘serial crystallography’ methodology with a free-electron laser source is described with reference to the crystallographic analysis suite CrystFEL and the pre-processing program Cheetah. A detailed analysis of the nature and impact of indexing ambiguities is presented. Simulations of the Monte Carlo integration scheme, which accounts for the partially recorded nature of the diffraction intensities, are presented and show that the integration of partial reflections could be made to converge more quickly if the bandwidth of the X-rays were to be increased by a small amount or if a slight convergence angle were introduced into the incident beam.
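
    The Monte Carlo integration scheme referred to is, at heart, redundancy averaging: each reflection is observed partially in many randomly oriented snapshots, and the plain mean over observations converges toward the full intensity as redundancy grows. A schematic Python version of the merging step (the data layout is illustrative, not CrystFEL's internal format):

        from collections import defaultdict

        def monte_carlo_merge(observations):
            """observations: iterable of (hkl, intensity) pairs pooled from many
            snapshots; returns the mean intensity per Miller index."""
            sums = defaultdict(float)
            counts = defaultdict(int)
            for hkl, intensity in observations:
                sums[hkl] += intensity
                counts[hkl] += 1
            return {hkl: sums[hkl] / counts[hkl] for hkl in sums}

        merged = monte_carlo_merge([((1, 0, 0), 120.0),
                                    ((1, 0, 0), 80.0),
                                    ((0, 2, 1), 40.0)])
        print(merged[(1, 0, 0)])   # 100.0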

  12. An analytical approach to customer requirement information processing

    NASA Astrophysics Data System (ADS)

    Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong

    2013-11-01

    'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and Kano model, have been applied to many fields by many enterprises in the past several decades, the limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.

  13. PRODUCTION OF HEAVY WATER SAVANNAH RIVER AND DANA PLANTS. Technical Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bebbington, W.P.; Thayer, V.R. eds.; Proctor, J.F. comp.

    1959-07-01

    A summary is presented of the basic technical information that pertains to processes that are used at the Dana and Savannah River Plants for the production of heavy water. The manual is intended primarily for plant operating and technical personnel and was prepared to supplement and provide technical support for detailed operating procedures. Introductory sections contain some background information on the history, uses, available processes, and analytical procedures for heavy water. They also include a general comparison of the design and performance of the two plants and an analysis of their differences. The technology of the heavy water separation processes used, namely hydrogen sulfide exchange, distillation of water, and electrolysis, is discussed in detail. The manufacture and storage of hydrogen sulfide gas and the process water treatment facilities are also discussed. (auth)

  14. Teaching concept analysis to graduate nursing students.

    PubMed

    Schiller, Catharine J

    2018-04-01

    To provide guidance to educators who use the Wilson (1963) concept analysis method, as modified by Walker and Avant (2011), in their graduate nursing curriculum. While graduate nursing curricula often include a concept analysis assignment, there is a paucity of literature to assist educators in guiding students through this challenging process. This article details one way for educators to assist graduate nursing students in learning how to undertake each step of the Wilson (1963) concept analysis method, as modified by Walker and Avant (2011). Using examples, this article walks the reader through the Walker and Avant (2011) concept analysis process and addresses those issues commonly encountered by educators during this process. This article presented one way of walking students through a Walker and Avant (2011) concept analysis. Having clear information about the steps involved in developing a concept analysis will make it easier for educators to incorporate it into their graduate nursing curriculum and to effectively guide students on their journey through this process. © 2018 Wiley Periodicals, Inc.

  15. European Software Engineering Process Group Conference (2nd Annual), EUROPEAN SEPG󈨥. Delegate Material, Tutorials

    DTIC Science & Technology

    1997-06-17

    There is good and bad news with CMMs. Bad news: process improvement takes time. Good news: the first benefit is better schedule management with PSP... e.g. similar supp... EURO not sudden death; toolset for assessment and EURO => business benefits (detailed analysis); EURO could collapse (low risk)... benefits from SPI live on even after year 2000. Priority benefits and actions: improved management and application development processes; strengthened change...

  16. Processing and analysis of cardiac optical mapping data obtained with potentiometric dyes

    PubMed Central

    Laughner, Jacob I.; Ng, Fu Siong; Sulkin, Matthew S.; Arthur, R. Martin

    2012-01-01

    Optical mapping has become an increasingly important tool to study cardiac electrophysiology in the past 20 years. Multiple methods are used to process and analyze cardiac optical mapping data, and no consensus currently exists regarding the optimum methods. The specific methods chosen to process optical mapping data are important because inappropriate data processing can affect the content of the data and thus alter the conclusions of the studies. Details of the different steps in processing optical imaging data, including image segmentation, spatial filtering, temporal filtering, and baseline drift removal, are provided in this review. We also provide descriptions of the common analyses performed on data obtained from cardiac optical imaging, including activation mapping, action potential duration mapping, repolarization mapping, conduction velocity measurements, and optical action potential upstroke analysis. Optical mapping is often used to study complex arrhythmias, and we also discuss dominant frequency analysis and phase mapping techniques used for the analysis of cardiac fibrillation. PMID:22821993
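
    The preprocessing steps the review covers (spatial filtering, temporal filtering, baseline drift removal) chain together naturally; below is a minimal scipy sketch for a fluorescence movie, with all filter parameters invented for illustration rather than taken from the review's recommendations:

        import numpy as np
        from scipy.ndimage import gaussian_filter
        from scipy.signal import butter, filtfilt, detrend

        def preprocess(stack, fs=1000.0, spatial_sigma=1.5, lowpass_hz=100.0):
            """stack: (frames, rows, cols) optical-mapping movie.
            Gaussian spatial smoothing, temporal low-pass, per-pixel detrend."""
            smoothed = gaussian_filter(stack, sigma=(0.0, spatial_sigma, spatial_sigma))
            b, a = butter(4, lowpass_hz / (fs / 2.0), btype="low")
            filtered = filtfilt(b, a, smoothed, axis=0)
            return detrend(filtered, axis=0)     # remove linear baseline drift

        movie = np.random.default_rng(0).random((512, 64, 64))  # stand-in data
        print(preprocess(movie).shape)                          # (512, 64, 64)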

  17. AFRL Solid Propellant Laboratory Explosive Siting and Renovation Lessons Learned

    DTIC Science & Technology

    2010-07-01

    Area 1-30A explosive facility and provide consultation/support during the review process for each of the site plans. • Applied Engineering Services...provided consultation/support during the siting review process. • Applied Engineering Services (AES) Inc. performed a detailed structural, blast, thermal... Applied Engineering Services (AES) Inc. structural, blast, thermal and fragment hazard analysis to determine the appropriate siting values based on

  18. Is Outdoor Education Environmental Education?

    ERIC Educational Resources Information Center

    Parkin, Danny

    1998-01-01

    Explores the relationship between outdoor education and environmental education by examining the broad nature of outdoor education and discussing whether outdoor education and environmental education are overlapping philosophies or separate methods of instruction. Includes analysis of a survey of outdoor educators and details a process for…

  19. The Accounting Classroom--People, Activities, Content

    ERIC Educational Resources Information Center

    Mayne, F. Blair

    1969-01-01

    Discusses the classroom components of students, learning activities, and program content in relation to the increased demand for more extensive and detailed analysis of financial information, the increased use of automated data processing equipment, and resulting job shifts and the upgrading of skills. (CH)

  20. Inferring Group Processes from Computer-Mediated Affective Text Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schryver, Jack C; Begoli, Edmon; Jose, Ajith

    2011-02-01

    Political communications in the form of unstructured text convey rich connotative meaning that can reveal underlying group social processes. Previous research has focused on sentiment analysis at the document level, but we extend this analysis to sub-document levels through a detailed analysis of affective relationships between entities extracted from a document. Instead of pure sentiment analysis, which is just positive or negative, we explore nuances of affective meaning in 22 affect categories. Our affect propagation algorithm automatically calculates and displays extracted affective relationships among entities in graphical form in our prototype (TEAMSTER), starting with seed lists of affect terms. Several useful metrics are defined to infer underlying group processes by aggregating affective relationships discovered in a text. Our approach has been validated with annotated documents from the MPQA corpus, achieving a performance gain of 74% over comparable random guessers.
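
    The paper's affect propagation algorithm is only named in the abstract; as a generic illustration of the idea, affect scores from seed terms can be spread over a weighted entity co-occurrence graph until scores stabilize. The sketch below is a plain neighborhood-averaging scheme with invented entities and damping, not the TEAMSTER implementation:

        def propagate_affect(edges, seeds, rounds=10, damping=0.5):
            """edges: {(a, b): weight} undirected co-occurrence links;
            seeds: {node: affect score in [-1, 1]} from seed term lists."""
            nbrs = {}
            for (a, b), w in edges.items():
                nbrs.setdefault(a, []).append((b, w))
                nbrs.setdefault(b, []).append((a, w))
            scores = dict(seeds)
            for _ in range(rounds):
                new_scores = {}
                for node, links in nbrs.items():
                    total = sum(w for _, w in links)
                    pulled = sum(scores.get(n, 0.0) * w for n, w in links) / total
                    new_scores[node] = seeds.get(node, 0.0) + damping * pulled
                scores = new_scores
            return scores

        links = {("senator", "bill"): 2.0, ("bill", "veto"): 1.0}
        print(propagate_affect(links, {"veto": -0.8}))  # negativity reaches "senator"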

  1. A time-driven, activity-based costing methodology for determining the costs of red blood cell transfusion in patients with beta thalassaemia major.

    PubMed

    Burns, K E; Haysom, H E; Higgins, A M; Waters, N; Tahiri, R; Rushford, K; Dunstan, T; Saxby, K; Kaplan, Z; Chunilal, S; McQuilten, Z K; Wood, E M

    2018-04-10

    To describe the methodology to estimate the total cost of administration of a single unit of red blood cells (RBC) in adults with beta thalassaemia major in an Australian specialist haemoglobinopathy centre. Beta thalassaemia major is a genetic disorder of haemoglobin associated with multiple end-organ complications and typically requiring lifelong RBC transfusion therapy. New therapeutic agents are becoming available based on advances in understanding of the disorder and its consequences. Assessment of the true total cost of transfusion, incorporating both product and activity costs, is required in order to evaluate the benefits and costs of these new therapies. We describe the bottom-up, time-driven, activity-based costing methodology used to develop process maps to provide a step-by-step outline of the entire transfusion pathway. Detailed flowcharts for each process are described. Direct observations and timing of the process maps document all activities, resources, staff, equipment and consumables in detail. The analysis will include costs associated with performing these processes, including resources and consumables. Sensitivity analyses will be performed to determine the impact of different staffing levels, timings and probabilities associated with performing different tasks. Thirty-one process maps have been developed, with over 600 individual activities requiring multiple timings. These will be used for future detailed cost analyses. Detailed process maps using bottom-up, time-driven, activity-based costing for determining the cost of RBC transfusion in thalassaemia major have been developed. These could be adapted for wider use to understand and compare the costs and complexities of transfusion in other settings. © 2018 British Blood Transfusion Society.
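
    The arithmetic behind time-driven activity-based costing is straightforward once the process maps and timings exist: each activity's cost is its observed duration multiplied by the capacity cost rate of the resource performing it, summed along the transfusion pathway. A hedged Python sketch with invented activities, times, and rates (not the study's data):

        # time-driven ABC: cost = observed minutes x capacity cost rate ($/min)
        ACTIVITIES = [
            # (activity, minutes, resource) -- illustrative values only
            ("crossmatch sample", 12.0, "lab_scientist"),
            ("issue RBC unit", 6.0, "lab_scientist"),
            ("bedside checks and spike", 10.0, "nurse"),
            ("monitor transfusion", 20.0, "nurse"),
        ]
        COST_RATE = {"lab_scientist": 1.10, "nurse": 0.95}   # assumed $/minute

        def activity_cost(activities, rates):
            return sum(minutes * rates[res] for _, minutes, res in activities)

        total = activity_cost(ACTIVITIES, COST_RATE)
        print(f"activity cost per RBC unit: ${total:.2f}")   # excludes product cost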

  2. Safety Guided Design Based on Stamp/STPA for Manned Vehicle in Concept Design Phase

    NASA Astrophysics Data System (ADS)

    Ujiie, Ryo; Katahira, Masafumi; Miyamoto, Yuko; Umeda, Hiroki; Leveson, Nancy; Hoshino, Nobuyuki

    2013-09-01

    In manned vehicles, such as the Soyuz and the Space Shuttle, the crew and computer system cooperate to succeed in returning to the earth. While computers increase the functionality of the system, they also increase the complexity of the interaction between the controllers (human and computer) and the target dynamics. In some cases, the complexity can produce a serious accident. To prevent such losses, traditional hazard analysis such as FTA has been applied to system development; however, it can only be used after a detailed system has been created because it focuses on detailed component failures. As a result, it is more difficult to eliminate hazard causes early in the process, when doing so is most feasible. STAMP/STPA is a new hazard analysis that can be applied from the early development phase, with the analysis being refined as more detailed decisions are made. In essence, the analysis and design decisions are intertwined and go hand-in-hand. We have applied STAMP/STPA to a concept design of a new JAXA manned vehicle and tried safety guided design of the vehicle. As a result of this trial, it has been shown that STAMP/STPA can be accepted easily by system engineers and the design has been made more sophisticated from a safety viewpoint. The result also shows that the consequences of human errors on system safety can be analysed in the early development phase and the system designed to prevent them. Finally, the paper discusses an effective way to harmonize this safety guided design approach with the system engineering process based on the result of this experience in this project.

  3. The Interplay of Proton, Electron, and Metabolite Supply for Photosynthetic H2 Production in Chlamydomonas reinhardtii*

    PubMed Central

    Doebbe, Anja; Keck, Matthias; La Russa, Marco; Mussgnug, Jan H.; Hankamer, Ben; Tekçe, Ercan; Niehaus, Karsten; Kruse, Olaf

    2010-01-01

    To obtain a detailed picture of sulfur deprivation-induced H2 production in microalgae, metabolome analyses were performed during key time points of the anaerobic H2 production process of Chlamydomonas reinhardtii. Analyses were performed using gas chromatography coupled to mass spectrometry (GC/MS), two-dimensional gas chromatography combined with time-of-flight mass spectrometry (GCxGC-TOFMS), lipid and starch analysis, and enzymatic determination of fermentative products. The studies were designed to provide a detailed metabolite profile of the solar Bio-H2 production process. This work reports on the differential analysis of metabolic profiles of the high H2-producing strain Stm6Glc4 and the wild-type cc406 (WT) before and during the H2 production phase. Using GCxGC-TOFMS analysis the number of detected peaks increased from 128 peaks, previously detected by GC/MS techniques, to ∼1168. More detailed analysis of the anaerobic H2 production phase revealed remarkable differences between wild-type and mutant cells in a number of metabolic pathways. Under these physiological conditions the WT produced up to 2.6 times more fatty acids, 2.2 times more neutral lipids, and up to 4 times more fermentation products compared with Stm6Glc4. Based on these results, specific metabolic pathways involving the synthesis of fatty acids, neutral lipids, and fermentation products during anaerobiosis in C. reinhardtii have been identified as potential targets for metabolic engineering to further enhance substrate supply for the hydrogenase(s) in the chloroplast. PMID:20581114

  4. Knowledge representation in metabolic pathway databases.

    PubMed

    Stobbe, Miranda D; Jansen, Gerbert A; Moerland, Perry D; van Kampen, Antoine H C

    2014-05-01

    The accurate representation of all aspects of a metabolic network in a structured format, such that it can be used for a wide variety of computational analyses, is a challenge faced by a growing number of researchers. Analysis of five major metabolic pathway databases reveals that each database has made widely different choices to address this challenge, including how to deal with knowledge that is uncertain or missing. In concise overviews, we show how concepts such as compartments, enzymatic complexes and the direction of reactions are represented in each database. Importantly, concepts which a database does not represent are also described. Which aspects of the metabolic network need to be available in a structured format, and in what level of detail, differs per application. For example, for in silico phenotype prediction, a detailed representation of gene-protein-reaction relations and the compartmentalization of the network is essential. Our analysis also shows that current databases are still limited in capturing all details of the biology of the metabolic network, further illustrated with a detailed analysis of three metabolic processes. Finally, we conclude that the conceptual differences between the databases, which make knowledge exchange and integration a challenge, have not been resolved, so far, by the exchange formats in which knowledge representation is standardized.

  5. Quantitative analysis of detailed lignin monomer composition by pyrolysis-gas chromatography combined with preliminary acetylation of the samples.

    PubMed

    Sonoda, T; Ona, T; Yokoi, H; Ishida, Y; Ohtani, H; Tsuge, S

    2001-11-15

    Detailed quantitative analysis of lignin monomer composition, comprising p-coumaryl, coniferyl, and sinapyl alcohol and p-coumaraldehyde, coniferaldehyde, and sinapaldehyde in plants, has not been studied comprehensively, mainly because of artifact formation during the lignin isolation procedure, partial loss of the lignin components inherent in the chemical degradative methods, and difficulty in interpreting the complex spectra generally observed for the lignin components. Here we propose a new method to quantify lignin monomer composition in detail by pyrolysis-gas chromatography (Py-GC) using acetylated lignin samples. The lignin acetylation procedure contributes to preventing secondary formation of cinnamaldehydes from the corresponding alcohol forms during pyrolysis, which is otherwise unavoidable to some extent in the conventional Py-GC process. On the basis of the characteristic peaks on the pyrograms of the acetylated sample, lignin monomer compositions in various dehydrogenative polymers (DHP) as lignin model compounds were determined, taking even minor components such as cinnamaldehydes into consideration. The observed compositions by Py-GC were in good agreement with the supplied lignin monomer contents on DHP synthesis. The new Py-GC method combined with sample preacetylation allowed accurate quantitative analysis of detailed lignin monomer composition using microgram quantities of extractive-free plant samples.

  6. Research progress in Asia on methods of processing laser-induced breakdown spectroscopy data

    NASA Astrophysics Data System (ADS)

    Guo, Yang-Min; Guo, Lian-Bo; Li, Jia-Ming; Liu, Hong-Di; Zhu, Zhi-Hao; Li, Xiang-You; Lu, Yong-Feng; Zeng, Xiao-Yan

    2016-10-01

    Laser-induced breakdown spectroscopy (LIBS) has attracted much attention in terms of both scientific research and industrial application. An important branch of LIBS research in Asia, the development of data processing methods for LIBS, is reviewed. First, the basic principle of LIBS and the characteristics of spectral data are briefly introduced. Next, two aspects of research on and problems with data processing methods are described: i) the basic principles of data preprocessing methods are elaborated in detail on the basis of the characteristics of spectral data; ii) the performance of data analysis methods in qualitative and quantitative analysis of LIBS is described. Finally, a direction for future development of data processing methods for LIBS is also proposed.

  7. An Objective Method of Measuring Psychological States Associated With Changes in Neural Function: Content Analysis of Verbal Behavior.

    ERIC Educational Resources Information Center

    Gottschalk, Louis A.

    This paper examines the use of content analysis of speech in the objective recording and measurement of changes in emotional and cognitive function of humans in whom natural or experimental changes in neural status have occurred. A brief description of the data gathering process, details of numerous physiological effects, an anxiety scale, and a…

  8. User's guide to SILVAH: stand analysis, prescription, and management simulator program for hardwood stands of the Alleghenies.

    Treesearch

    David A. Marquis; Richard L. Ernst

    1992-01-01

    Describes the purpose and function of the SILVAH computer program in general terms; provides detailed instructions on use of the program; and provides information on program organization, data formats, and the basis of processing algorithms.

  9. Exploratory studies of the cruise performance of upper surface blown configurations: Program analysis and conclusions

    NASA Technical Reports Server (NTRS)

    Braden, J. A.; Hancock, J. P.; Hackett, J. E.; Lyman, V.

    1979-01-01

    The experimental data, encompassing surface pressure measurements and wake surveys at static and wind-on conditions, are analyzed. Cruise performance trends reflecting nacelle geometric variations and nozzle operating conditions are presented. Details of the modeling process are included.

  10. Developing a model for the adequate description of electronic communication in hospitals.

    PubMed

    Saboor, Samrend; Ammenwerth, Elske

    2011-01-01

    Adequate information and communication technology (ICT) systems can help to improve communication in hospitals. Changes to the ICT infrastructure of hospitals must be planned carefully. In order to support comprehensive planning, we presented a classification of 81 common errors of electronic communication at the MIE 2008 congress. Our objective now was to develop a data model that defines specific requirements for an adequate description of electronic communication processes. We first applied the method of explicating qualitative content analysis to the error categorization in order to determine the essential process details. After this, we applied the method of subsuming qualitative content analysis to the results of the first step. The result is a data model for the adequate description of electronic communication, comprising 61 entities and 91 relationships. The data model comprises and organizes all details that are necessary for the detection of the respective errors. It can either be used to extend the capabilities of existing modeling methods or serve as a basis for the development of a new approach.

  11. Environmental research program. 1995 Annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, N.J.

    1996-06-01

    The objective of the Environmental Research Program is to enhance the understanding of, and mitigate the effects of pollutants on health, ecological systems, global and regional climate, and air quality. The program is multidisciplinary and includes fundamental research and development in efficient and environmentally benign combustion, pollutant abatement and destruction, and novel methods of detection and analysis of criteria and noncriteria pollutants. This diverse group conducts investigations in combustion, atmospheric and marine processes, flue-gas chemistry, and ecological systems. Combustion chemistry research emphasizes modeling at microscopic and macroscopic scales. At the microscopic scale, functional sensitivity analysis is used to explore the nature of the potential-to-dynamics relationships for reacting systems. Rate coefficients are estimated using quantum dynamics and path integral approaches. At the macroscopic level, combustion processes are modelled using chemical mechanisms at the appropriate level of detail dictated by the requirements of predicting particular aspects of combustion behavior. Parallel computing has facilitated the efforts to use detailed chemistry in models of turbulent reacting flow to predict minor species concentrations.

  12. Risk analysis within environmental impact assessment of proposed construction activity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeleňáková, Martina; Zvijáková, Lenka

    Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of environmental impacts of selected constructions – flood protection structures – using risk analysis methods. The application of the methodology designed for the process of environmental impact assessment will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process. Through the use of risk analysis methods in the environmental impact assessment process, the set objective has been achieved. Highlights: This paper is informed by an effort to develop research with the aim of: • Improving existing qualitative and quantitative methods for assessing the impacts • A better understanding of relations between probabilities and consequences • Methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.

  13. Materials requirements for optical processing and computing devices

    NASA Technical Reports Server (NTRS)

    Tanguay, A. R., Jr.

    1985-01-01

    Devices for optical processing and computing systems are discussed, with emphasis on the materials requirements imposed by functional constraints. Generalized optical processing and computing systems are described in order to identify principal categories of requisite components for complete system implementation. Three principal device categories are selected for analysis in some detail: spatial light modulators, volume holographic optical elements, and bistable optical devices. The implications for optical processing and computing systems of the materials requirements identified for these device categories are described, and directions for future research are proposed.

  14. Citygml and the Streets of New York - a Proposal for Detailed Street Space Modelling

    NASA Astrophysics Data System (ADS)

    Beil, C.; Kolbe, T. H.

    2017-10-01

    Three-dimensional semantic city models are increasingly used for the analysis of large urban areas. Until now the focus has mostly been on buildings. Nonetheless many applications could also benefit from detailed models of public street space for further analysis. However, there are only few guidelines for representing roads within city models. Therefore, related standards dealing with street modelling are examined and discussed. Nearly all street representations are based on linear abstractions. However, there are many use cases that require or would benefit from the detailed geometrical and semantic representation of street space. A variety of potential applications for detailed street space models are presented. Subsequently, based on related standards as well as on user requirements, a concept for a CityGML-compliant representation of street space in multiple levels of detail is developed. In the course of this process, the CityGML Transportation model of the currently valid OGC standard CityGML2.0 is examined to discover possibilities for further developments. Moreover, a number of improvements are presented. Finally, based on open data sources, the proposed concept is implemented within a semantic 3D city model of New York City generating a detailed 3D street space model for the entire city. As a result, 11 thematic classes, such as roadbeds, sidewalks or traffic islands are generated and enriched with a large number of thematic attributes.

  15. Abstraction of information in repository performance assessments. Examples from the SKI project Site-94

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dverstorp, B.; Andersson, J.

    1995-12-01

    Performance assessment of a nuclear waste repository implies an analysis of a complex system with many interacting processes. Even if some of these processes may be known in great detail, problems arise when combining all information, and means of abstracting information from complex detailed models into models that couple different processes are needed. Clearly, one of the major objectives of performance assessment, to calculate doses or other performance indicators, implies an enormous abstraction of information compared to all the information that is used as input. Other problems are that the knowledge of different parts or processes is strongly variable, and adjustments and interpretations are needed when combining models from different disciplines. In addition, people as well as computers, even today, always have a limited capacity to process information, and choices have to be made. However, because abstraction of information is clearly unavoidable in performance assessment, the validity of the choices made always needs to be scrutinized, and judgements made need to be updated in an iterative process.

  16. Progress on the development of automated data analysis algorithms and software for ultrasonic inspection of composites

    NASA Astrophysics Data System (ADS)

    Aldrin, John C.; Coughlin, Chris; Forsyth, David S.; Welter, John T.

    2014-02-01

    Progress is presented on the development and implementation of automated data analysis (ADA) software to address the burden in interpreting ultrasonic inspection data for large composite structures. The automated data analysis algorithm is presented in detail, which follows standard procedures for analyzing signals for time-of-flight indications and backwall amplitude dropout. ADA processing results are presented for test specimens that include inserted materials and discontinuities produced under poor manufacturing conditions.
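
    The two criteria named, time-of-flight indications and backwall amplitude dropout, map onto simple per-A-scan gating logic: flag any echo arriving inside an interior gate ahead of the backwall, and flag a backwall echo that falls below an amplitude floor. A schematic numpy version (the sampling rate, gate placement, and thresholds are invented, not the paper's settings):

        import numpy as np

        def flag_ascan(ascan, fs, backwall_t, gate_end=0.8, echo_thr=0.3, bw_floor=0.5):
            """Return (time-of-flight indication, backwall dropout) flags."""
            env = np.abs(ascan)                        # crude amplitude envelope
            bw = int(backwall_t * fs)
            interior = env[int(0.05 * bw):int(gate_end * bw)]  # gate before backwall
            backwall_amp = env[bw - 5:bw + 5].max()
            return interior.max() > echo_thr, backwall_amp < bw_floor

        fs, backwall_t = 100e6, 4e-6                   # 100 MHz sampling, 4 us echo
        rng = np.random.default_rng(0)
        ascan = 0.05 * rng.standard_normal(int(6e-6 * fs))
        ascan[int(backwall_t * fs)] = 1.0              # healthy backwall echo
        print(flag_ascan(ascan, fs, backwall_t))       # (False, False)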

  17. Compiled visualization with IPI method for analysing of liquid liquid mixing process

    NASA Astrophysics Data System (ADS)

    Jasikova, Darina; Kotek, Michal; Kysela, Bohus; Sulc, Radek; Kopecky, Vaclav

    2018-06-01

    The article deals with research on the mixing process using visualization techniques and the IPI method. Characteristics of the size distribution and the evolution of the disintegration of two liquid-liquid phases were studied. A methodology has been proposed for visualization and image analysis of data acquired during the initial phase of the mixing process. The IPI method was used for subsequent detailed study of the disintegrated droplets. The article describes the advantages of using the appropriate method, presents the limits of each method, and compares them.

  18. Air Vehicle Integration and Technology Research (AVIATR). Delivery Order 0023: Predictive Capability for Hypersonic Structural Response and Life Prediction: Phase 2 - Detailed Design of Hypersonic Cruise Vehicle Hot-Structure

    DTIC Science & Technology

    2012-05-01

    [Figure 5.0.1: Phase II Analysis Process] For the panel study, the panel selection process followed a review of the outer skin environment investigated during the HTV-3X program, which was suitable as... Subsequently, Panel 1B was down-selected from the screening process as it was observed to be subjected to stronger thermal field contributions due to fuel...

  19. Using the DOE Knowledge Base for Special Event Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armstrong, H.M.; Harris, J.M.; Young, C.J.

    1998-10-20

    The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA), and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, there are two basic types of data which must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well-suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically-derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanos), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations. Relevant historic events can be identified either by spatial proximity searches or through waveform correlation processing. The locations and waveforms of these events can then be made available for side-by-side comparison and processing. If synthetic modeling is thought to be warranted, a wide variety of relevant contextual information (e.g. crustal thickness and layering, seismic velocities, attenuation factors) can be retrieved and sent to the appropriate applications. Once formed, the synthetics can then be brought in for side-by-side comparison and further processing. Based on our study, we make two general recommendations. First, proper inter-process communication between sensor data analysis software and contextual data analysis software should be developed. Second, some of the Knowledge Base data sets should be prioritized or winnowed to streamline comparison with observed quantities.

  20. Fossil insect evidence for the end of the Western Settlement in Norse Greenland

    NASA Astrophysics Data System (ADS)

    Panagiotakopulu, Eva; Skidmore, Peter; Buckland, Paul

    2007-04-01

    The fate of Norse farming settlements in southwest Greenland has often been seen as one of the great mysteries of North Atlantic colonization and expansion. Preservation of organic remains in the permafrost of the area of the Western Settlement, inland from the modern capital Nuuk, allowed very detailed study of the phases of occupation. Samples were taken from house floors and middens during the process of archaeological excavations, and from these samples insect remains were abstracted and identified in the laboratory. In this study, we present a new paleoecological approach principally examining the fossil fly faunas from house floors. The results of our study provide contrasting detailed pictures of the demise of two neighboring farms, Gården under Sandet and Nipaatsoq, one where abandonment appears as part of a normal process of site selection and desertion, and the other where the end was more traumatic. The level of detail, which was obtained by analysis of the dipterous (true fly) remains, exceeds all previous work and provides insights otherwise unobtainable.

  1. Tackling The Dragon: Investigating Lensed Galaxy Structure

    NASA Astrophysics Data System (ADS)

    Fortenberry, Alexander; Livermore, Rachael

    2018-01-01

    Galaxies have been seen to undergo a rapid decrease in star formation beginning at a redshift of around 1-2 and continuing to the present day. To understand the processes underpinning this change, we need to observe the inner structure of galaxies and understand where and how their stellar mass builds up. However, at high redshifts our observable resolution is limited, which hinders the accuracy of the data. The lack of resolution at high redshift can be counteracted with the use of gravitational lensing. The magnification provided by a gravitational lens between us and the galaxies in question enables us to see extreme detail within the galaxies. To begin fine-tuning this process, we used Hubble data of Abell 370, a galaxy cluster, which lenses a galaxy known as “The Dragon” at z=0.725. With the increased detail provided by the gravitational lens, we provide a detailed analysis of the galaxy’s spatially resolved star formation rate, stellar age, and masses.

  2. Fossil insect evidence for the end of the Western Settlement in Norse Greenland.

    PubMed

    Panagiotakopulu, Eva; Skidmore, Peter; Buckland, Paul

    2007-04-01

    The fate of Norse farming settlements in southwest Greenland has often been seen as one of the great mysteries of North Atlantic colonization and expansion. Preservation of organic remains in the permafrost of the area of the Western Settlement, inland from the modern capital Nuuk, allowed very detailed study of the phases of occupation. Samples were taken from house floors and middens during the process of archaeological excavations, and from these samples insect remains were abstracted and identified in the laboratory. In this study, we present a new paleoecological approach principally examining the fossil fly faunas from house floors. The results of our study provide contrasting detailed pictures of the demise of two neighboring farms, Gården under Sandet and Nipaatsoq, one where abandonment appears as part of a normal process of site selection and desertion, and the other where the end was more traumatic. The level of detail, which was obtained by analysis of the dipterous (true fly) remains, exceeds all previous work and provides insights otherwise unobtainable.

  3. OPERATIONS RESEARCH IN THE DESIGN OF MANAGEMENT INFORMATION SYSTEMS

    DTIC Science & Technology

    management information systems is concerned with the identification and detailed specification of the information and data processing...of advanced data processing techniques in management information systems today, the close coordination of operations research and data systems activities has become a practical necessity for the modern business firm.... information systems in which mathematical models are employed as the basis for analysis and systems design. Operations research provides a

  4. Data-Driven Design: Learning from Student Experiences and Behaviors

    NASA Astrophysics Data System (ADS)

    Horodyskyj, L.; Mead, C.; Buxner, S.; Semken, S. C.; Anbar, A. D.

    2015-12-01

    Good instructors know that lessons and courses change over time. Limitations in time and data often prevent instructors from making the changes that would most benefit their students. For example, in traditional in-person classrooms an instructor may only have access to the final product of a student's thought processes (such as a term paper, homework assignment, or exam). The thought processes that lead to a given answer are opaque to the instructor, making future modifications to course content an exercise in trial-and-error and instinct. Modern online intelligent tutoring systems can provide insight into a student's behavior, providing transparency to a previously opaque process and providing the instructor with better information for course modification. Habitable Worlds is an introductory-level online-only astrobiology lab course that has been offered at Arizona State University since Fall 2011. The course is built and offered through an intelligent tutoring system, Smart Sparrow's Adaptive eLearning Platform, which provides in-depth analytics that allow the instructor to investigate detailed student behavior, from time spent on a question, to number of attempts, to patterns of answers. We will detail the process we employ of informed modification of course content, including time and trial comparisons between semesters, analysis of submitted answers, analysis of alternative learning pathways taken, and A/B testing.
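
    The between-semester comparisons and A/B tests described here typically come down to comparing success or completion proportions between two student groups; below is a generic two-proportion z-test sketch with invented counts (a standard statistical test, not code from the course platform):

        from math import sqrt
        from statistics import NormalDist

        def two_proportion_z(success_a, n_a, success_b, n_b):
            """z-test for the difference between two success proportions."""
            p_a, p_b = success_a / n_a, success_b / n_b
            p = (success_a + success_b) / (n_a + n_b)    # pooled proportion
            se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
            z = (p_a - p_b) / se
            p_value = 2.0 * (1.0 - NormalDist().cdf(abs(z)))
            return z, p_value

        # e.g. first-try success on a lesson, variant A vs variant B
        z, p = two_proportion_z(132, 200, 158, 210)
        print(round(z, 2), round(p, 4))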

  5. Prediction and Estimation of Scaffold Strength with different pore size

    NASA Astrophysics Data System (ADS)

    Muthu, P.; Mishra, Shubhanvit; Sri Sai Shilpa, R.; Veerendranath, B.; Latha, S.

    2018-04-01

    This paper emphasizes the significance of predicting and estimating the mechanical strength of 3D functional scaffolds before the manufacturing process. Prior evaluation of the mechanical strength and structural properties of the scaffold will reduce fabrication cost and ease the design process. Detailed analysis and investigation of various mechanical properties, including shear stress equivalence, have helped to estimate the effect of porosity and pore size on the functionality of the scaffold. The influence of variation in porosity was examined by a computational approach via finite element analysis (FEA) using the ANSYS application software. The results indicate the suitability of this evaluation method for regulating and optimizing the intricate engineering design process.

  6. Integrating complex business processes for knowledge-driven clinical decision support systems.

    PubMed

    Kamaleswaran, Rishikesan; McGregor, Carolyn

    2012-01-01

    This paper presents in detail the component of the Complex Business Process for Stream Processing framework that is responsible for integrating complex business processes to enable knowledge-driven Clinical Decision Support System (CDSS) recommendations. CDSSs aid the clinician in supporting the care of patients by providing accurate data analysis and evidence-based recommendations. However, the incorporation of a dynamic knowledge-management system that supports the definition and enactment of complex business processes and real-time data streams has not been researched. In this paper we discuss the process web service as an innovative method of providing contextual information to a real-time data stream processing CDSS.

  7. Stingray Failure Mode, Effects and Criticality Analysis: WEC Risk Registers

    DOE Data Explorer

    Ken Rhinefrank

    2016-07-25

    Analysis method to systematically identify all potential failure modes and their effects on the Stingray WEC system. This analysis is incorporated early in the development cycle such that mitigation of the identified failure modes can be achieved cost-effectively and efficiently. The FMECA can begin once there is enough detail to describe the functions and failure modes of a given system and its interfaces with other systems. The FMECA occurs coincidently with the design process and is an iterative process which allows for design changes to overcome deficiencies in the analysis. Risk registers for major subsystems were completed according to the methodology described in the "Failure Mode Effects and Criticality Analysis Risk Reduction Program Plan.pdf" document below, in compliance with the DOE Risk Management Framework developed by NREL.
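
    A common way to rank the entries in such risk registers is the risk priority number, RPN = severity x occurrence x detectability, each scored on a 1-10 scale, with the highest RPNs mitigated first. A small ranking sketch with invented failure modes and scores (not values from the Stingray registers):

        # FMECA ranking sketch: RPN = severity x occurrence x detection difficulty
        failure_modes = [
            # (subsystem, failure mode, severity, occurrence, detection)
            ("PTO", "hydraulic seal leak", 7, 5, 4),
            ("mooring", "line chafe and part", 9, 3, 6),
            ("hull", "weld fatigue crack", 8, 2, 7),
        ]

        for sub, mode, s, o, d in sorted(failure_modes,
                                         key=lambda fm: fm[2] * fm[3] * fm[4],
                                         reverse=True):
            print(f"RPN {s * o * d:4d}  {sub:8s} {mode}")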

  8. Knowledge-based requirements analysis for automating software development

    NASA Technical Reports Server (NTRS)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.

  9. FHWA travel analysis framework : development of VMT forecasting models for use by the Federal Highway Administration

    DOT National Transportation Integrated Search

    2014-05-12

    This document details the process that the Volpe National Transportation Systems Center (Volpe) used to develop travel forecasting models for the Federal Highway Administration (FHWA). The purpose of these models is to allow FHWA to forecast future c...

  10. Energy-Efficient Design for Florida Educational Facilities.

    ERIC Educational Resources Information Center

    Florida Solar Energy Center, Cape Canaveral.

    This manual provides a detailed simulation analysis of a variety of energy conservation measures (ECMs) with the intent of giving educational facility design teams in Florida a basis for decision making. The manual's three sections cover energy efficiency design considerations that appear throughout the following design processes: schematic…

  11. Misunderstanding and Repair in Tactile Auslan

    ERIC Educational Resources Information Center

    Willoughby, Louisa; Manns, Howard; Iwasaki, Shimako; Bartlett, Meredith

    2014-01-01

    This article discusses ways in which misunderstandings arise in Tactile Australian Sign Language (Tactile Auslan) and how they are resolved. Of particular interest are the similarities to and differences from the same processes in visually signed and spoken conversation. This article draws on detailed conversation analysis (CA) and demonstrates…

  12. Autonomous Mars ascent and orbit rendezvous for earth return missions

    NASA Technical Reports Server (NTRS)

    Edwards, H. C.; Balmanno, W. F.; Cruz, Manuel I.; Ilgen, Marc R.

    1991-01-01

    The details of the assessment of autonomous Mars ascent and orbit rendezvous for earth return missions are presented. Analyses addressing navigation system assessments, trajectory planning, targeting approaches, flight control guidance strategies, and performance sensitivities are included. Tradeoffs in the analysis and design process are discussed.

  13. Comparative study of resist stabilization techniques for metal etch processing

    NASA Astrophysics Data System (ADS)

    Becker, Gerry; Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Livesay, William R.

    1999-06-01

    This study investigates resist stabilization techniques as they are applied to a metal etch application. The techniques compared are conventional deep-UV/thermal stabilization, or UV bake, and electron beam stabilization. The electron beam tool used in this study, an ElectronCure system from AlliedSignal Inc., Electron Vision Group, utilizes a flood electron source and a non-thermal process. These stabilization techniques are compared with respect to a metal etch process. In this study, two types of resist are considered for stabilization and etch: a g/i-line resist, Shipley SPR-3012, and an advanced i-line, Shipley SPR 955-Cm. For each of these resists, the effects of stabilization on resist features are evaluated by post-stabilization SEM analysis. Etch selectivity in all cases is evaluated by using a timed metal etch and measuring resist remaining relative to total metal thickness etched. Etch selectivity is presented as a function of stabilization condition. Analyses of the effects of the type of stabilization on this method of selectivity measurement are also presented. SEM analysis was also performed on the features after a complete etch process, and is detailed as a function of stabilization condition. Post-etch cleaning is also an important factor impacted by pre-etch resist stabilization. Results of post-etch cleaning are presented for both stabilization methods. SEM inspection is also detailed for the metal features after resist removal processing.

  14. Comprehensive two-dimensional gas chromatography for the analysis of Fischer-Tropsch oil products.

    PubMed

    van der Westhuizen, Rina; Crous, Renier; de Villiers, André; Sandra, Pat

    2010-12-24

    The Fischer-Tropsch (FT) process involves a series of catalysed reactions of carbon monoxide and hydrogen, originating from coal, natural gas or biomass, leading to a variety of synthetic chemicals and fuels. The benefits of comprehensive two-dimensional gas chromatography (GC×GC) compared to one-dimensional GC (1D-GC) for the detailed investigation of the oil products of low and high temperature FT processes are presented. GC×GC provides more accurate quantitative data to construct Anderson-Schultz-Flory (ASF) selectivity models that correlate the FT product distribution with reaction variables. On the other hand, the high peak capacity and sensitivity of GC×GC allow the detailed study of components present at trace level. Analyses of the aromatic and oxygenated fractions of a high temperature FT (HT-FT) process are presented. GC×GC data have been used to optimise or tune the HT-FT process by using a lab-scale micro-FT-reactor. Copyright © 2010 Elsevier B.V. All rights reserved.
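
    The Anderson-Schulz-Flory model mentioned here has a simple closed form: the weight fraction of chains with n carbons is W_n = n(1-a)^2 a^(n-1), where a is the chain-growth probability. A short Python sketch with an illustrative a, showing the linear diagnostic that quantitative GCxGC data would feed:

    ```python
    import numpy as np

    # Minimal sketch of the Anderson-Schulz-Flory (ASF) distribution used to
    # model FT selectivity: W_n = n * (1 - alpha)**2 * alpha**(n - 1), where
    # alpha is the chain-growth probability. alpha = 0.85 is illustrative.
    alpha = 0.85
    n = np.arange(1, 51)                      # carbon numbers C1..C50
    W = n * (1 - alpha) ** 2 * alpha ** (n - 1)

    # log(W_n / n) versus n is linear with slope log(alpha); fitting that
    # slope recovers the chain-growth probability from measured fractions.
    slope = np.polyfit(n, np.log(W / n), 1)[0]
    print(f"recovered alpha = {np.exp(slope):.3f}")   # ~0.850
    ```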

  15. Detail in architecture: Between arts & crafts

    NASA Astrophysics Data System (ADS)

    Dulencin, Juraj

    2016-06-01

    Architectural detail represents an important part of architecture. Not only can it be used as an identifier of a specific building, but at the same time it enhances the experience of the realized project. Within it lie the signs of a great architect and clues to understanding his or her way of thinking. It is therefore the central topic of a seminar offered to architecture students at the Brno University of Technology. During the course of the semester-long class the students acquaint themselves with atypical architectural details of domestic and international architects by learning to read them, understand them and subsequently draw them by creating architectural blueprints. In other words, by general analysis of a detail the students learn the theoretical thinking of its architect who, depending on the nature of the design, had to incorporate a variety of techniques and crafts. Students apply this analytical part to their own architectural detail design. The methodology of the seminar consists of experiential learning by project management and is complemented by a series of lectures discussing a diversity of details as well as the materials and technologies required to implement them. The architectural detail design is also part of the students' bachelor's theses; therefore, the realistic nature of their blueprints can be verified in the production process of its physical counterpart. Based on their own documentation the students choose the most suitable manufacturing process, whether it is supplied by a specific technology or a craftsman. Students actively participate in the production and correct their design proposals in real scale with the actual material. A student, as a future architect, stands somewhere between a client and an artisan, materializes his or her idea and adjusts the manufacturing process so that the final detail achieves aesthetic consistency and is in harmony with its initial concept. One of the very important aspects of the design is its economic cost, the actual price of real implementation. The detail determines not only the physical expression; it becomes the characteristic feature from which the rest of the building is derived. This course motivates students to surpass mere technical calculations learned from books towards sophistication and refinement, pragmatism and experimentation, and encourages a shift from feasibility to perfection.

  16. Wash water recovery system

    NASA Technical Reports Server (NTRS)

    Deckman, G.; Rousseau, J. (Editor)

    1973-01-01

    The Wash Water Recovery System (WWRS) is intended for use in processing shower bath water onboard a spacecraft. The WWRS utilizes flash evaporation, vapor compression, and pyrolytic reaction to process the wash water to allow recovery of potable water. Wash water flashing and foaming characteristics are evaluated, physical properties of concentrated wash water are determined, and a long-term feasibility study on the system is performed. In addition, a computer analysis of the system and a detailed design of a 10 lb/hr vortex-type water vapor compressor were completed. The computer analysis also sized the remaining system components on the basis of the new vortex compressor design.

  17. A combined Bodian-Nissl stain for improved network analysis in neuronal cell culture.

    PubMed

    Hightower, M; Gross, G W

    1985-11-01

    Bodian and Nissl procedures were combined to stain dissociated mouse spinal cord cells cultured on coverslips. The Bodian technique stains fine neuronal processes in great detail as well as an intracellular fibrillar network concentrated around the nucleus and in proximal neurites. The Nissl stain clearly delimits neuronal cytoplasm in somata and in large dendrites. A combination of these techniques allows the simultaneous depiction of neuronal perikarya and all afferent and efferent processes. Costaining with little background staining by either procedure suggests high specificity for neurons. This procedure could be exploited for routine network analysis of cultured neurons.

  18. Simulating observations with HARMONI: the integral field spectrograph for the European Extremely Large Telescope

    NASA Astrophysics Data System (ADS)

    Zieleniewski, Simon; Thatte, Niranjan; Kendrew, Sarah; Houghton, Ryan; Tecza, Matthias; Clarke, Fraser; Fusco, Thierry; Swinbank, Mark

    2014-07-01

    With the next generation of extremely large telescopes commencing construction, there is an urgent need for detailed quantitative predictions of the scientific observations that these new telescopes will enable. Most of these new telescopes will have adaptive optics fully integrated with the telescope itself, allowing unprecedented spatial resolution combined with enormous sensitivity. However, the adaptive optics point spread function will be strongly wavelength dependent, requiring detailed simulations that accurately model these variations. We have developed a simulation pipeline for the HARMONI integral field spectrograph, a first light instrument for the European Extremely Large Telescope. The simulator takes high-resolution input data-cubes of astrophysical objects and processes them with accurate atmospheric, telescope and instrumental effects, to produce mock observed cubes for chosen observing parameters. The output cubes represent the result of a perfect data reduction process, enabling a detailed analysis and comparison between input and output, showcasing HARMONI's capabilities. The simulations utilise a detailed knowledge of the telescope's wavelength dependent adaptive optics point spread function. We discuss the simulation pipeline and present an early example of the pipeline functionality for simulating observations of high redshift galaxies.
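
    The core simulator step described, pushing an input cube through a wavelength-dependent PSF, can be sketched compactly. The sketch below assumes a toy Gaussian PSF whose width varies with wavelength; the real HARMONI AO PSF is far more structured, and the function names and scaling law are hypothetical:

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    # Toy stand-in for a wavelength-dependent AO PSF: a Gaussian whose width
    # shrinks slowly with wavelength. All parameters here are illustrative.
    def apply_psf(cube, wavelengths, seeing_ref=4.0, wl_ref=1.0e-6):
        """cube: (n_wl, ny, nx) array; wavelengths in metres."""
        out = np.empty_like(cube)
        for i, wl in enumerate(wavelengths):
            sigma = seeing_ref * (wl / wl_ref) ** (-0.2)   # toy scaling only
            out[i] = gaussian_filter(cube[i], sigma=sigma)  # blur this slice
        return out

    cube = np.random.rand(5, 64, 64)             # synthetic input data-cube
    wls = np.linspace(0.8e-6, 2.4e-6, 5)
    print(apply_psf(cube, wls).shape)            # mock observed cube
    ```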

  19. Laser Doppler velocimeter system simulation for sensing aircraft wake vortices. Part 2: Processing and analysis of LDV data (for runs 1023 and 2023)

    NASA Technical Reports Server (NTRS)

    Meng, J. C. S.; Thomson, J. A. L.

    1975-01-01

    A data analysis program constructed to assess LDV system performance, to validate the simulation model, and to test various vortex location algorithms is presented. Real or simulated Doppler spectra versus range and elevation are used, and the spatial distributions of various spectral moments or other spectral characteristics are calculated and displayed. Each of the real or simulated scans can be processed by one of three different procedures: simple frequency or wavenumber filtering, matched filtering, and deconvolution filtering. The final output is displayed as contour plots in an x-y coordinate system, as well as in the form of vortex tracks deduced from the maxima of the processed data. A detailed analysis of run number 1023 and run number 2023 is presented to demonstrate the data analysis procedure. Vortex tracks and system range resolutions are compared with theoretical predictions.
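
    The spatial maps described are built from spectral moments of each Doppler spectrum. A minimal Python sketch of the moment computation for one range/elevation cell; the function name and the synthetic spectrum are illustrative, not the LDV program's actual routines:

    ```python
    import numpy as np

    # Reduce a Doppler spectrum S(v) to its zeroth moment (power), first
    # moment (mean velocity) and second moment (spectral width).
    def spectral_moments(v, S):
        m0 = np.trapz(S, v)                       # total power
        m1 = np.trapz(v * S, v) / m0              # mean Doppler velocity
        m2 = np.trapz((v - m1) ** 2 * S, v) / m0  # variance -> width
        return m0, m1, np.sqrt(m2)

    v = np.linspace(-10, 10, 256)                  # velocity bins (m/s)
    S = np.exp(-0.5 * ((v - 2.0) / 1.5) ** 2)      # synthetic spectrum
    print(spectral_moments(v, S))                  # ~ (power, 2.0, 1.5)
    ```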

  20. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ravindra, M.K.; Banon, H.

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.

  1. Analysis of Compton continuum measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gold, R.; Olson, I. K.

    1970-01-01

    Five computer programs: COMPSCAT, FEND, GABCO, DOSE, and COMPLOT, have been developed and used for the analysis and subsequent reduction of measured energy distributions of Compton recoil electrons to continuous gamma spectra. In addition to detailed descriptions of these computer programs, the relationship amongst these codes is stressed. The manner in which these programs function is illustrated by tracing a sample measurement through a complete cycle of the data-reduction process.

  2. Structural Design of Ares V Interstage Composite Structure

    NASA Technical Reports Server (NTRS)

    Sleigh, David W.; Sreekantamurthy, Thammaiah; Kosareo, Daniel N.; Martin, Robert A.; Johnson, Theodore F.

    2011-01-01

    Preliminary and detailed design studies were performed to mature composite structural design concepts for the Ares V Interstage structure as a part of NASA's Advanced Composite Technologies Project. Aluminum honeycomb sandwich and hat-stiffened composite panel structural concepts were considered. The structural design and analysis studies were performed using HyperSizer design sizing software and MSC Nastran finite element analysis software. System-level design trade studies were carried out to predict weight and margins of safety for composite honeycomb-core sandwich and composite hat-stiffened skin design concepts. Details of both preliminary and detailed design studies are presented in the paper. For the range of loads and geometry considered in this work, the hat-stiffened designs were found to be approximately 11-16 percent lighter than the sandwich designs. A down-select process was used to choose the most favorable structural concept based on a set of figures of merit, and the honeycomb sandwich design was selected as the best concept based on advantages in manufacturing cost.

  3. Meteorology, Macrophysics, Microphysics, Microwaves, and Mesoscale Modeling of Mediterranean Mountain Storms: The M8 Laboratory

    NASA Technical Reports Server (NTRS)

    Starr, David O. (Technical Monitor); Smith, Eric A.

    2002-01-01

    Comprehensive understanding of the microphysical nature of Mediterranean storms can be accomplished by a combination of in situ meteorological data analysis and radar-passive microwave data analysis, effectively integrated with numerical modeling studies at various scales, from the synoptic scale down through the mesoscale, the cloud macrophysical scale, and ultimately the cloud microphysical scale. The microphysical properties of severe storms, and the controls on them, are intrinsically related to the meteorological processes under which the storms have evolved, processes which eventually select and control the dominant microphysical properties themselves. This involves intense convective development, stratiform decay, orographic lifting, and sloped frontal lifting processes, as well as the associated vertical motions and thermodynamical instabilities governing physical processes that affect details of the size distributions and fall rates of the various types of hydrometeors found within the storm environment. For hazardous Mediterranean storms, highlighted in this study by three mountain storms that produced damaging floods in northern Italy between 1992 and 2000, developing a comprehensive microphysical interpretation requires an understanding of the multiple phases of storm evolution and the heterogeneous nature of precipitation fields within a storm domain. This involves convective development, stratiform transition and decay, orographic lifting, and sloped frontal lifting processes, together with the vertical motions and thermodynamical instabilities governing physical processes that determine details of the liquid/ice water contents, size distributions, and fall rates of the various modes of hydrometeors found within hazardous storm environments.

  4. Ionospheric Irregularities: Source, Structure, Plasma Processes and Effects on Sensor Systems

    DTIC Science & Technology

    1989-10-31

    The correlation analysis employed the modified Fedor algorithm [Fedor, 1967]. Fougere [1981] added a set of subroutines which find, for each channel, the... elements of the algorithm for the spectral analysis... With measurements every 240 m along the orbital track [Heelis et al., 1981], this paper presents the most detailed analysis of the above data set to date.

  5. Varied Human Tolerance to the Combined Conditions of Low Contrast and Diminished Luminance: A Quasi-Meta Analysis

    DTIC Science & Technology

    2017-08-30

    ...as being three-fold: 1) a measurement of the integrity of both the central and peripheral visual processing centers; 2) an indicator of detail... a visual assessment task integral to the Army's Class 1 Flight Physical (Ginsburg, 1981 and 1984; Bachman & Behar, 1986)... Meta-analysis has been defined as the statistical analysis of a collection of analytical results for the purpose of integrating the findings.

  6. SisPorto 4.0 - computer analysis following the 2015 FIGO Guidelines for intrapartum fetal monitoring.

    PubMed

    Ayres-de-Campos, Diogo; Rei, Mariana; Nunes, Inês; Sousa, Paulo; Bernardes, João

    2017-01-01

    SisPorto 4.0 is the most recent version of a program for the computer analysis of cardiotocographic (CTG) signals and ST events, which has been adapted to the 2015 International Federation of Gynaecology and Obstetrics (FIGO) guidelines for intrapartum foetal monitoring. This paper provides a detailed description of the analysis performed by the system, including the signal-processing algorithms involved in identification of basic CTG features and the resulting real-time alerts.

  7. Realtime Decision Making on EO-1 Using Onboard Science Analysis

    NASA Technical Reports Server (NTRS)

    Sherwood, Robert; Chien, Steve; Davies, Ashley; Mandl, Dan; Frye, Stu

    2004-01-01

    In recent autonomy experiments conducted on Earth Observing 1 (EO-1), the Autonomous Sciencecraft Experiment (ASE) flight software has been used to classify key features in hyperspectral images captured by EO-1. Furthermore, this analysis is performed by the software onboard EO-1 and then used to modify the operational plan without interaction from the ground. This paper will outline the overall operations concept and provide some details and examples of the onboard science processing, science analysis, and replanning.

  8. Binary partition tree analysis based on region evolution and its application to tree simplification.

    PubMed

    Lu, Huihai; Woods, John C; Ghanbari, Mohammed

    2007-04-01

    Pyramid image representations via tree structures are recognized methods for region-based image analysis. Binary partition trees can be applied, which document the merging process with small details found at the bottom levels and larger ones close to the root. Hindsight of the merging process is stored within the tree structure and provides the change histories of an image property from the leaf to the root node. In this work, the change histories are modelled by evolvement functions and their second-order statistics are analyzed by using a knee function. Knee values show the reluctance of each merge. We have systematically formulated these findings to provide a novel framework for binary partition tree analysis, where tree simplification is demonstrated. Based on an evolvement function, for each upward path in a tree, the tree node associated with the first reluctant merge is considered as a pruning candidate. The result is a simplified version providing a reduced solution space and still complying with the definition of a binary tree. The experiments show that image details are preserved whilst the number of nodes is dramatically reduced. An image filtering tool also results which preserves object boundaries and has applications for segmentation.
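
    One way to realize the pruning rule described, flagging the first reluctant merge on each upward path, is to locate the knee of the evolvement function as the point farthest from the chord joining its endpoints. A Python sketch under that assumption; the paper's exact knee function may differ, and the data below are hypothetical:

    ```python
    import numpy as np

    # Given an evolvement function f(level), a region property tracked from
    # leaf to root, find the level where the curve bends hardest: the point
    # farthest from the straight line joining the endpoints.
    def knee_index(f):
        f = np.asarray(f, dtype=float)
        x = np.arange(f.size)
        dx, dy = f.size - 1, f[-1] - f[0]
        # perpendicular distance of each point from the endpoint chord
        dist = np.abs(dy * x - dx * (f - f[0])) / np.hypot(dx, dy)
        return int(np.argmax(dist))

    # Property stays flat over easy merges, then jumps at a reluctant merge.
    path = [1.0, 1.1, 1.2, 1.2, 1.3, 4.0, 7.5, 12.0]
    print("prune near level:", knee_index(path))  # -> 4, just before the jump
    ```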

  9. Signal design study for shuttle/TDRSS Ku-band uplink

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The adequacy of the signal design approach chosen for the TDRSS/orbiter uplink was evaluated. Critical functions and/or components associated with the baseline design were identified, and design alternatives were developed for those areas considered high risk. A detailed set of RF and signal processing performance specifications for the orbiter hardware associated with the TDRSS/orbiter Ku-band uplink was analyzed. The performance of a detailed design of the PN despreader, the PSK carrier synchronization loop, and the symbol synchronizer is identified. The performance of the downlink signal was studied by means of computer simulation to obtain a realistic determination of bit error rate degradations. The three-channel PM downlink signal was detailed by means of analysis and computer simulation.

  10. Toward Theory-Based Instruction in Scientific Problem Solving.

    ERIC Educational Resources Information Center

    Heller, Joan I.; And Others

    Several empirical and theoretical analyses related to scientific problem-solving are reviewed, including: detailed studies of individuals at different levels of expertise, and computer models simulating some aspects of human information processing during problem solving. Analysis of these studies has revealed many facets about the nature of the…

  11. Negotiation: How Four Youth Organizations Create Learning Environments.

    ERIC Educational Resources Information Center

    Deschenes, Sarah; McDonald, Morva

    This paper details the efforts of four organizations that have been able to negotiate their environments effectively, in the hopes that the analysis provides insights into how organizations are able to establish valuable learning environments for youth in nonschool hours. The negotiation, the process of dealing with various layers of environments…

  12. Evaluating Organizational Change at a Multinational Transportation Corporation: Method and Reflections

    ERIC Educational Resources Information Center

    Plakhotnik, Maria S.

    2016-01-01

    The purpose of this perspective on practice is to share my experience conducting an organizational change evaluation using qualitative methodology at a multinational transportation company Global Logistics. I provide a detailed description of the three phase approach to data analysis and my reflections on the process.

  13. 76 FR 45300 - Notice of Issuance of Materials License SUA-1597 and Record of Decision for Uranerz Energy...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-28

    ... considered but eliminated from detailed analysis include conventional uranium mining and milling, conventional mining and heap leach processing, alternative site location, alternate lixiviants, and alternate...'s Agencywide Document Access and Management System (ADAMS), which provides text and image files of...

  14. Data assimilation and model evaluation experiment datasets

    NASA Technical Reports Server (NTRS)

    Lai, Chung-Cheng A.; Qian, Wen; Glenn, Scott M.

    1994-01-01

    The Institute for Naval Oceanography, in cooperation with Naval Research Laboratories and universities, executed the Data Assimilation and Model Evaluation Experiment (DAMEE) for the Gulf Stream region during fiscal years 1991-1993. Enormous effort has gone into the preparation of several high-quality and consistent datasets for model initialization and verification. This paper describes the preparation process, the temporal and spatial scopes, the contents, the structure, etc., of these datasets. The goal of DAMEE and the need for data in the four phases of the experiment are briefly stated. The preparation of DAMEE datasets consisted of a series of processes: (1) collection of observational data; (2) analysis and interpretation; (3) interpolation using the Optimum Thermal Interpolation System package; (4) quality control and re-analysis; and (5) data archiving and software documentation. The data products from these processes included a time series of 3D fields of temperature and salinity, 2D fields of surface dynamic height and mixed-layer depth, analysis of the Gulf Stream and rings system, and bathythermograph profiles. To date, these are the most detailed and high-quality data for mesoscale ocean modeling, data assimilation, and forecasting research. Feedback from ocean modeling groups who tested this data was incorporated into its refinement. Suggested DAMEE data usages include (1) ocean modeling and data assimilation studies, (2) diagnosis and theoretical studies, and (3) comparisons with locally detailed observations.

  15. Analysis of Variance in Statistical Image Processing

    NASA Astrophysics Data System (ADS)

    Kurz, Ludwik; Hafed Benteftifa, M.

    1997-04-01

    A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.
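
    A minimal illustration of the ANOVA-for-detection idea, assuming the simplest possible setup: each image row is treated as a group, and a large between-group F statistic signals a horizontal line. The data are synthetic and the setup is illustrative, not one of the book's algorithms:

    ```python
    import numpy as np
    from scipy.stats import f_oneway

    # Treat groups of pixels as ANOVA samples; a bright horizontal line makes
    # the between-row variance large relative to within-row (noise) variance.
    rng = np.random.default_rng(0)
    img = rng.normal(100.0, 5.0, size=(8, 64))   # noisy background
    img[3] += 20.0                               # hypothetical line on row 3

    F, p = f_oneway(*img)                        # one group per image row
    print(f"F = {F:.1f}, p = {p:.2e}")           # large F -> feature present
    ```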

  16. Investigation of Carbon Fiber Architecture in Braided Composites Using X-Ray CT Inspection

    NASA Technical Reports Server (NTRS)

    Rhoads, Daniel J.; Miller, Sandi G.; Roberts, Gary D.; Rauser, Richard W.; Golovaty, Dmitry; Wilber, J. Patrick; Espanol, Malena I.

    2017-01-01

    During the fabrication of braided carbon fiber composite materials, process variations occur which affect the fiber architecture. Quantitative measurements of local and global fiber architecture variations are needed to determine the potential effect of process variations on mechanical properties of the cured composite. Although non-destructive inspection via X-ray CT imaging is a promising approach, difficulties in quantitative analysis of the data arise due to the similar densities of the material constituents. In an effort to gain more quantitative information about features related to fiber architecture, methods have been explored to improve the details that can be captured by X-ray CT imaging. Metal-coated fibers and thin veils are used as inserts to extract detailed information about fiber orientations and inter-ply behavior from X-ray CT images.

  17. Ultrastructure Processing and Environmental Stability of Advanced Structural and Electronic Materials

    DTIC Science & Technology

    1992-08-31

    NC r") Form 1473, JUN 86 Previous editions are obsolete SECURITY CLASSIFICATION OF THIS PACE I 18. Subject Terms (Continued) I analysis, aging , band...detail the several steps involved in the processing of sol-gel derived optical silicas: I 1) mixing, 2) casting, 3) gelation, 4) aging , 5) drying, 6...ultrastructurcs, such as for doping applications and laser-enhanced densification. The possible disadvantages discussed in this Chapter are inherent

  18. Automation of the aircraft design process

    NASA Technical Reports Server (NTRS)

    Heldenfels, R. R.

    1974-01-01

    The increasing use of the computer to automate the aerospace product development and engineering process is examined with emphasis on structural analysis and design. Examples of systems of computer programs in aerospace and other industries are reviewed and related to the characteristics of aircraft design in its conceptual, preliminary, and detailed phases. Problems with current procedures are identified, and potential improvements from optimum utilization of integrated disciplinary computer programs by a man/computer team are indicated.

  19. Global Positioning System data collection, processing, and analysis conducted by the U.S. Geological Survey Earthquake Hazards Program

    USGS Publications Warehouse

    Murray, Jessica R.; Svarc, Jerry L.

    2017-01-01

    The U.S. Geological Survey Earthquake Science Center collects and processes Global Positioning System (GPS) data throughout the western United States to measure crustal deformation related to earthquakes and tectonic processes as part of a long‐term program of research and monitoring. Here, we outline data collection procedures and present the GPS dataset built through repeated temporary deployments since 1992. This dataset consists of observations at ∼1950 locations. In addition, this article details our data processing and analysis procedures, which consist of the following. We process the raw data collected through temporary deployments, in addition to data from continuously operating western U.S. GPS stations operated by multiple agencies, using the GIPSY software package to obtain position time series. Subsequently, we align the positions to a common reference frame, determine the optimal parameters for a temporally correlated noise model, and apply this noise model when carrying out time‐series analysis to derive deformation measures, including constant interseismic velocities, coseismic offsets, and transient postseismic motion.
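
    A toy version of the time-series model this article describes, constant interseismic velocity plus a coseismic offset plus logarithmic postseismic decay, fit by ordinary least squares. The decay time and all names are assumptions; the actual processing uses GIPSY and a temporally correlated noise model, which this white-noise sketch omits:

    ```python
    import numpy as np

    # Synthetic position time series: intercept + velocity*t + coseismic step
    # + postseismic log decay, plus white noise (a simplification).
    t = np.linspace(0.0, 10.0, 400)                 # years
    t_eq, tau = 4.0, 0.5                            # assumed quake time, decay
    step = (t >= t_eq).astype(float)
    log_term = np.where(t >= t_eq, np.log1p((t - t_eq) / tau), 0.0)

    truth = 1.0 + 5.0 * t + 20.0 * step + 8.0 * log_term      # mm
    obs = truth + np.random.normal(0.0, 1.0, t.size)

    # Least-squares recovery of the deformation measures named in the text.
    A = np.column_stack([np.ones_like(t), t, step, log_term])
    coef, *_ = np.linalg.lstsq(A, obs, rcond=None)
    print("intercept, velocity, offset, postseismic:", np.round(coef, 2))
    ```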

  20. Study on the Preliminary Design of ARGO-M Operation System

    NASA Astrophysics Data System (ADS)

    Seo, Yoon-Kyung; Lim, Hyung-Chul; Rew, Dong-Young; Jo, Jung Hyun; Park, Jong-Uk; Park, Eun-Seo; Park, Jang-Hyun

    2010-12-01

    Korea Astronomy and Space Science Institute has been developing a mobile satellite laser ranging system named the accurate ranging system for geodetic observation-mobile (ARGO-M). The preliminary design of the ARGO-M operation system (AOS), one of the ARGO-M subsystems, was completed in 2009. The preliminary design results are applied in the following development phase, where detailed design is performed with analysis of the pre-defined requirements and of the derived specifications. This paper addresses the preliminary design of the whole AOS. The design results for the operation and control part, which is a key part of the operation system, are described in detail. Analysis results for the interface between the operation-supporting hardware and the control computer, which are necessary for defining the requirements of the operation-supporting hardware, are summarized. The results of this study are expected to be used in the critical design phase to finalize the design process.

  1. Simulation of 7050 Wrought Aluminum Alloy Wheel Die Forging and its Defects Analysis based on DEFORM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang Shiquan; Yi Youping; Zhang Yuxun

    2010-06-15

    Defects such as folding, intercrystalline cracking and flow-line outcrop are very likely to occur in the forging of aluminum alloy. Moreover, it is difficult to achieve the optimal set of process parameters just by trial and error within an industrial environment. In producing a 7050 wrought aluminum alloy wheel, a rigid-plastic finite element method (FEM) analysis has been performed to optimize the die forging process. Processing parameters were analyzed, focusing on the effects of punch speed, friction factor and temperature. Meanwhile, the mechanism as well as the evolution of the defects of the wrought wheel was studied in detail. From an analysis of the results, isothermal die forging was proposed for producing a 7050 aluminum alloy wheel with good mechanical properties. Finally, a verification experiment was carried out on a hydropress.

  2. A new image encryption algorithm based on the fractional-order hyperchaotic Lorenz system

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Huang, Xia; Li, Yu-Xia; Song, Xiao-Na

    2013-01-01

    We propose a new image encryption algorithm on the basis of the fractional-order hyperchaotic Lorenz system. While in the process of generating a key stream, the system parameters and the derivative order are embedded in the proposed algorithm to enhance the security. Such an algorithm is detailed in terms of security analyses, including correlation analysis, information entropy analysis, run statistic analysis, mean-variance gray value analysis, and key sensitivity analysis. The experimental results demonstrate that the proposed image encryption scheme has the advantages of large key space and high security for practical image encryption.
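
    A heavily simplified sketch of the keystream idea: since integrating a fractional-order system is beyond a short example, the ordinary integer-order Lorenz system stands in for the paper's fractional-order hyperchaotic one. The state is iterated, quantized to bytes, and XORed with the image bytes; all constants and the quantization rule are illustrative assumptions:

    ```python
    import numpy as np

    # Toy keystream from the integer-order Lorenz system (Euler steps);
    # the paper's actual system is fractional-order and hyperchaotic.
    def lorenz_keystream(n, x=0.1, y=0.0, z=0.0, dt=0.005):
        ks = np.empty(n, dtype=np.uint8)
        for i in range(n):
            dx = 10.0 * (y - x)
            dy = x * (28.0 - z) - y
            dz = x * y - (8.0 / 3.0) * z
            x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
            ks[i] = int(abs(x) * 1e6) % 256       # quantize state to a byte
        return ks

    img = np.random.randint(0, 256, 32, dtype=np.uint8)   # stand-in image
    ks = lorenz_keystream(img.size)
    cipher = img ^ ks                                      # encrypt
    assert np.array_equal(cipher ^ ks, img)                # XOR decrypts
    ```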

  3. Amateur Image Pipeline Processing using Python plus PyRAF

    NASA Astrophysics Data System (ADS)

    Green, Wayne

    2012-05-01

    A template pipeline spanning observation planning to publishing is offered as a basis for establishing a long-term observing program. The data reduction pipeline encapsulates all policy and procedures, providing an accountable framework for data analysis and a teaching framework for IRAF. This paper introduces the technical details of a complete pipeline processing environment using Python, PyRAF and a few other languages. The pipeline encapsulates all processing decisions within an auditable framework. The framework quickly handles the heavy lifting of image processing. It also serves as an excellent teaching environment for astronomical data management and IRAF reduction decisions.
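
    A skeletal version of such a pipeline in plain Python (not the author's PyRAF code): each reduction step is a function, and the driver writes a JSON audit trail so every processing decision stays accountable. Step names and the log path are hypothetical:

    ```python
    import json, time

    # Placeholder reduction steps; real steps would call IRAF/PyRAF tasks.
    def bias_subtract(frame):      return {**frame, "bias": "done"}
    def flat_field(frame):         return {**frame, "flat": "done"}
    def extract_photometry(frame): return {**frame, "phot": "done"}

    PIPELINE = [bias_subtract, flat_field, extract_photometry]

    def run(frame, log_path="pipeline_log.json"):
        history = []
        for step in PIPELINE:
            frame = step(frame)
            history.append({"step": step.__name__, "time": time.time()})
        with open(log_path, "w") as fh:
            json.dump(history, fh, indent=2)   # audit trail of decisions
        return frame

    print(run({"file": "ngc1234.fits"}))
    ```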

  4. CRT image recording evaluation

    NASA Technical Reports Server (NTRS)

    1971-01-01

    Performance capabilities and limitations of a fiber optic coupled line scan CRT image recording system were investigated. The test program evaluated the following components: (1) P31 phosphor CRT with EMA faceplate; (2) P31 phosphor CRT with clear clad faceplate; (3) Type 7743 semi-gloss dry process positive print paper; (4) Type 777 flat finish dry process positive print paper; (5) Type 7842 dry process positive film; and (6) Type 1971 semi-gloss wet process positive print paper. Detailed test procedures used in each test are provided along with a description of each test, the test data, and an analysis of the results.

  5. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation, volume 2, part 1. Appendix A: Software documentation

    NASA Technical Reports Server (NTRS)

    Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.

    1982-01-01

    Documentation of the preliminary software developed as a framework for a generalized integrated robotic system simulation is presented. The program structure is composed of three major functions controlled by a program executive. The three major functions are: system definition, analysis tools, and post processing. The system definition function handles user input of system parameters and definition of the manipulator configuration. The analysis tools function handles the computational requirements of the program. The post processing function allows for more detailed study of the results of analysis tool function executions. Also documented is the manipulator joint model software to be used as the basis of the manipulator simulation which will be part of the analysis tools capability.

  6. Design and cost analysis of rapid aquifer restoration systems using flow simulation and quadratic programming.

    USGS Publications Warehouse

    Lefkoff, L.J.; Gorelick, S.M.

    1986-01-01

    Detailed two-dimensional flow simulation of a complex ground-water system is combined with quadratic and linear programming to evaluate design alternatives for rapid aquifer restoration. Results show how treatment and pumping costs depend dynamically on the type of treatment process, the capacity of pumping and injection wells, and the number of wells. The design for an inexpensive treatment process minimizes pumping costs, while an expensive process results in the minimization of treatment costs. Substantial reductions in pumping costs occur with increases in injection capacity or in the number of wells. Treatment costs are reduced by expansions in pumping capacity or injection capacity. The analysis identifies maximum pumping and injection capacities. -from Authors
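
    The management-model side of this design problem can be sketched without the flow simulator: choose pumping rates that meet a required extraction target at minimum quadratic pumping cost plus linear treatment cost. All coefficients below are illustrative assumptions, and the groundwater response is omitted entirely:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    c_pump, c_treat = 0.02, 1.5        # illustrative unit costs
    target = 100.0                      # required total extraction rate

    # Quadratic pumping cost plus linear treatment cost over 3 wells.
    cost = lambda q: c_pump * np.sum(q ** 2) + c_treat * np.sum(q)
    cons = [{"type": "eq", "fun": lambda q: np.sum(q) - target}]
    bounds = [(0.0, 60.0)] * 3          # per-well capacity limits

    res = minimize(cost, x0=np.full(3, target / 3),
                   bounds=bounds, constraints=cons)
    print("optimal rates:", np.round(res.x, 1), "cost:", round(res.fun, 1))
    ```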

  7. Updated MDRIZTAB Parameters for ACS/WFC

    NASA Astrophysics Data System (ADS)

    Hoffman, S. L.; Avila, R. J.

    2017-03-01

    The Mikulski Archive for Space Telescopes (MAST) pipeline performs geometric distortion corrections, associated image combinations, and cosmic ray rejections with AstroDrizzle. The MDRIZTAB reference table contains a list of relevant parameters that controls this program. This document details our photometric analysis of Advanced Camera for Surveys Wide Field Channel (ACS/WFC) data processed by AstroDrizzle. Based on this analysis, we update the MDRIZTAB table to improve the quality of the drizzled products delivered by MAST.

  8. An analysis of a typology of family health nursing practice.

    PubMed

    Macduff, Colin

    2006-01-01

    In this article, Colin Macduff analyses the construction and testing of a typology of family health nursing practice. Following a summary of relevant methods and findings from two linked empirical research studies, more detailed analysis of the conceptual foundations, nature and purpose of the typology is presented. This process serves to exemplify and address some of the issues highlighted in the associated article that reviews the use of typologies within nursing.

  9. Source Code Analysis Laboratory (SCALe) for Energy Delivery Systems

    DTIC Science & Technology

    2010-12-01

    the software for reevaluation. Once the reevaluation process is completed, CERT provides the client a report detailing the software's conformance... True positives (TP) versus flagged nonconformities (FNC) by software system: Mozilla Firefox version 2.0, TP/FNC 6/12 (50%); Linux kernel version 2.6.15, TP/FNC 10/126 (8%); Wine... inappropriately tuned for analysis of the Linux kernel, which has anomalous results. Customizing SCALe to work with energy system software will help

  10. Thermomechanical Stresses Analysis of a Single Event Burnout Process

    NASA Astrophysics Data System (ADS)

    Tais, Carlos E.; Romero, Eduardo; Demarco, Gustavo L.

    2009-06-01

    This work analyzes the thermal and mechanical effects arising in a power Diffusion Metal Oxide Semiconductor (DMOS) device during a Single Event Burnout (SEB) process. For studying these effects we propose a more detailed simulation structure than that previously used by other authors, solving the mathematical models by means of the Finite Element Method. We use a cylindrical heat generation region, with 5 W, 10 W, 50 W and 100 W, for emulating the thermal phenomena occurring during SEB processes, avoiding the complexity of the mathematical treatment of the ion-semiconductor interaction.

  11. Design optimization of a prescribed vibration system using conjoint value analysis

    NASA Astrophysics Data System (ADS)

    Malinga, Bongani; Buckner, Gregory D.

    2016-12-01

    This article details a novel design optimization strategy for a prescribed vibration system (PVS) used to mechanically filter solids from fluids in oil and gas drilling operations. A dynamic model of the PVS is developed, and the effects of disturbance torques are detailed. This model is used to predict the effects of design parameters on system performance and efficiency, as quantified by system attributes. Conjoint value analysis, a statistical technique commonly used in marketing science, is utilized to incorporate designer preferences. This approach effectively quantifies and optimizes preference-based trade-offs in the design process. The effects of designer preferences on system performance and efficiency are simulated. This novel optimization strategy yields improvements in all system attributes across all simulated vibration profiles, and is applicable to other industrial electromechanical systems.

  12. Dynamical phenomena in fast sliding nanotube models

    NASA Astrophysics Data System (ADS)

    Zhang, X. H.; Santoro, G. E.; Tartaglino, U.; Tosatti, E.

    2013-03-01

    The experimentally known fact that coaxial carbon nanotubes can be forced to slide one inside the other stimulated in the past much detailed modelling of the dynamical sliding process. Molecular dynamics simulations of sliding coaxial nanotubes showed the existence of strong frictional peaks when, at large speed, one tube excites the other with a 'washboard' frequency that happens to resonate with some intrinsic vibration frequency. At some of these special speeds we discover a striking example of dynamical symmetry breaking taking place at the nanoscale. Even when both nanotubes are perfectly left-right symmetric and nonchiral, precisely in correspondence with the large peaks of sliding friction occurring at a series of critical sliding velocities, a nonzero angular momentum spontaneously appears. A detailed analysis shows that this internal angular momentum is of phonon origin, in particular arising from preferential excitation of a right polarized (or, with equal probability, of a left polarized) outer-tube 'pseudorotation' mode, thus spontaneously breaking their exact twofold right-left degeneracy. We present and discuss a detailed analysis of nonlinear continuum equations governing this phenomenon, showing the close similarity of this phenomenon with the well-known rotational instability of a forced string, which takes place under sufficiently strong periodic forcing of the string. We also point out new elements appearing in the present problem which are 'nano', in particular the involvement of Umklapp processes and the role of sliding nanofriction.

  13. Application of structured analysis to a telerobotic system

    NASA Technical Reports Server (NTRS)

    Dashman, Eric; Mclin, David; Harrison, F. W.; Soloway, Donald; Young, Steven

    1990-01-01

    The analysis and evaluation of a multiple arm telerobotic research and demonstration system developed by the NASA Intelligent Systems Research Laboratory (ISRL) is described. Structured analysis techniques were used to develop a detailed requirements model of an existing telerobotic testbed. Performance models generated during this process were used to further evaluate the total system. A commercial CASE tool called Teamwork was used to carry out the structured analysis and development of the functional requirements model. A structured analysis and design process using the ISRL telerobotic system as a model is described. Evaluation of this system focused on the identification of bottlenecks in this implementation. The results demonstrate that the use of structured methods and analysis tools can give useful performance information early in a design cycle. This information can be used to ensure that the proposed system meets its design requirements before it is built.

  14. GEMAS: Spatial pattern analysis of Ni by using digital image processing techniques on European agricultural soil data

    NASA Astrophysics Data System (ADS)

    Jordan, Gyozo; Petrik, Attila; De Vivo, Benedetto; Albanese, Stefano; Demetriades, Alecos; Sadeghi, Martiya

    2017-04-01

    Several studies have investigated the spatial distribution of chemical elements in topsoil (0-20 cm) within the framework of the EuroGeoSurveys Geochemistry Expert Group's 'Geochemical Mapping of Agricultural and Grazing Land Soil' project. Most of these studies used geostatistical analyses and interpolated concentration maps, together with Exploratory Data Analysis and Compositional Data Analysis, to identify anomalous patterns. The objective of our investigation is to demonstrate the use of digital image processing techniques for reproducible spatial pattern recognition and quantitative spatial feature characterisation. A single element (Ni) concentration in agricultural topsoil is used to perform the detailed spatial analysis, and to relate these features to possible underlying processes. In this study, simple univariate statistical methods were implemented first, and Tukey's inner-fence criterion was used to delineate statistical outliers. Linear and triangular irregular network (TIN) interpolation was used on the outlier-free Ni data points, which were resampled to a 10*10 km grid. Successive moving average smoothing was applied to generalise the TIN model and to suppress small-scale features while at the same time enhancing significant large-scale features of the nickel concentration spatial distribution pattern in European topsoil. The TIN map smoothed with a moving average filter revealed the spatial trends and patterns without losing much detail, and it was used as the input into digital image processing, such as local maxima and minima determination, digital cross sections, gradient magnitude and gradient direction calculation, second derivative profile curvature calculation, edge detection, local variability assessment, lineament density and directional variogram analyses. The detailed image processing analysis revealed several NE-SW, E-W and NW-SE oriented elongated features, which coincide with different spatial parameter classes and align with local maxima and minima. The NE-SW oriented linear pattern is the dominant feature to the south of the last glaciation limit. Some of these linear features are parallel to the suture zone of the Iapetus Ocean, while others follow the Alpine and Carpathian Chains. The highest variability zones of Ni concentration in topsoil are located in the Alps and in the Balkans, where mafic and ultramafic rocks outcrop. The predominant NE-SW oriented pattern is also captured by the strong anisotropy in the semi-variograms in this direction. A single major E-W oriented north-facing feature runs along the southern border of the last glaciation zone. This zone also coincides with a series of local maxima in Ni concentration along the glaciofluvial deposits. The NW-SE elongated spatial features are less dominant and are located in the Pyrenees and Scandinavia. This study demonstrates the efficiency of systematic image processing analysis in identifying and characterising spatial geochemical patterns that often remain uncovered by the usual visual map interpretation techniques.
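
    The first two processing steps named here are easy to make concrete. A Python sketch of Tukey's inner-fence outlier screen followed by successive moving-average smoothing, on synthetic data; grid interpolation and the later image-processing stages are omitted:

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    # Tukey's inner fences: keep values within [Q1 - 1.5*IQR, Q3 + 1.5*IQR].
    def tukey_inner_fences(x):
        q1, q3 = np.percentile(x, [25, 75])
        iqr = q3 - q1
        lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
        return x[(x >= lo) & (x <= hi)]

    # Successive moving-average passes to generalise a gridded surface.
    def moving_average(grid, size=3, passes=2):
        for _ in range(passes):
            grid = uniform_filter(grid, size=size)
        return grid

    ni = np.random.lognormal(mean=3.0, sigma=0.6, size=1000)  # synthetic mg/kg
    clean = tukey_inner_fences(ni)
    print(f"kept {clean.size}/{ni.size} samples inside the inner fences")
    print(moving_average(np.random.rand(20, 20)).shape)
    ```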

  15. Shortcomings of low-cost imaging systems for viewing computed radiographs.

    PubMed

    Ricke, J; Hänninen, E L; Zielinski, C; Amthauer, H; Stroszczynski, C; Liebig, T; Wolf, M; Hosten, N

    2000-01-01

    To assess potential advantages of a new PC-based viewing tool featuring image post-processing for viewing computed radiographs on low-cost hardware (PC) with a common display card and color monitor, and to evaluate the effect of using color versus monochrome monitors. Computed radiographs of a statistical phantom were viewed on a PC, with and without post-processing (spatial frequency and contrast processing), employing a monochrome or a color monitor. Findings were compared with viewing on a radiological workstation and evaluated with ROC analysis. Image post-processing improved the perception of low-contrast details significantly, irrespective of the monitor used. No significant difference in perception was observed between monochrome and color monitors. The review at the radiological workstation was superior to the review done using the PC with image processing. Lower-quality hardware (graphics card and monitor) used in low-cost PCs negatively affects perception of low-contrast details in computed radiographs. In this situation, it is highly recommended to use spatial frequency and contrast processing. No significant quality gain was observed for the high-end monochrome monitor compared to the color display. However, the color monitor was affected more strongly by high ambient illumination.

  16. Signal-processing analysis of the MC2823 radar fuze: an addendum concerning clutter effects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jelinek, D.A.

    1978-07-01

    A detailed analysis of the signal processing of the MC2823 radar fuze was published by Thompson in 1976 which enabled the computation of dud probability versus signal-to-noise ratio where the noise was receiver noise. An addendum to Thompson's work was published by Williams in 1978 that modified the weighting function used by Thompson. The analysis presented herein extends the work of Thompson to include the effects of clutter (the non-signal portion of the echo from a terrain) using the new weighting function. This extension enables computation of dud probability versus signal-to-total-noise ratio where total noise is the sum of the receiver-noise power and the clutter power.

  17. 78 FR 66929 - Intent To Conduct a Detailed Economic Impact Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-07

    ... EXPORT-IMPORT BANK Intent To Conduct a Detailed Economic Impact Analysis AGENCY: Policy and... Federal Register notice informing the public of its intent to conduct a detailed economic impact analysis... subject to a detailed economic impact analysis. DATES: The Federal Register notice published on August 5...

  18. Training Guide for Severe Weather Forecasters

    DTIC Science & Technology

    1979-11-01

    ...that worked very well for the example forecast is used to show the importance of parameter intensities and the actual thought processes that go into the... simplify the explanation of the complete level analysis. This entire process will be repeated for the 700 mb and 500 mb levels. Details in Figures 1 through... parameters of moderate to strong intensity must occur in the same place at the same time. A description of what constitutes a weak, moderate, or strong...

  19. Costing child protective services staff turnover.

    PubMed

    Graef, M I; Hill, E L

    2000-01-01

    This article details the process used in one state to determine the financial costs to the child welfare agency accrued over the course of one year that were directly attributable to CPS staff turnover. The formulas and process for calculating specific cost elements due to separation, replacement and training are provided. The practical considerations inherent in this type of analysis are highlighted, as well as the use of this type of data to inform agency human resource strategies.
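
    The cost model reduces to simple arithmetic over the three element categories the article names. A sketch with purely illustrative dollar figures; the state's actual cost elements and values are in the article, not here:

    ```python
    # Annual turnover cost = departures * (separation + replacement + training
    # cost per worker). All dollar figures below are hypothetical.
    separations = 45            # CPS workers leaving during the year

    separation_cost  = 1_200    # exit processing and payouts, per departure
    replacement_cost = 3_500    # advertising, screening, interviews, per hire
    training_cost    = 14_000   # classroom plus on-the-job training, per hire

    annual_turnover_cost = separations * (
        separation_cost + replacement_cost + training_cost
    )
    print(f"estimated annual turnover cost: ${annual_turnover_cost:,}")
    ```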

  20. Research status of wave energy conversion (WEC) device of raft structure

    NASA Astrophysics Data System (ADS)

    Dong, Jianguo; Gao, Jingwei; Tao, Liang; Zheng, Peng

    2017-10-01

    This paper briefly describes the concept of wave energy generation and six typical conversion devices. For the raft structure, a detailed analysis is provided, from its development process to typical devices. Taking the design process and working principle of Pelamis as an example, the general principle of the raft structure is briefly described. After that, a variety of raft structure models are introduced. Finally, the advantages and disadvantages, and the development trend, of the raft structure are pointed out.

  1. A Study in Difference: Structures and Cultures in Registered Training Organisations. Support Document 3

    ERIC Educational Resources Information Center

    Clayton, Berwyn; Fisher, Thea; Harris, Roger; Bateman, Andrea; Brown, Mike

    2008-01-01

    This document supports the report "A Study in Difference: Structures and Cultures in Registered Training Organisations." The first section outlines the methodology used to undertake the research and covers the design of the research, sample details, the data collection process and the strategy for data analysis and reporting. The…

  2. SeaSat-A Satellite Scatterometer (SASS) Validation and Experiment Plan

    NASA Technical Reports Server (NTRS)

    Schroeder, L. C. (Editor)

    1978-01-01

    This plan was generated by the SeaSat-A satellite scatterometer experiment team to define the pre- and post-launch activities necessary to conduct sensor validation and geophysical evaluation. Details include an instrument and experiment description, performance requirements, success criteria, constraints, mission requirements, data processing requirements, and data analysis responsibilities.

  3. Photo-Elicitation: Reflexivity on Method, Analysis, and Graphic Portraits

    ERIC Educational Resources Information Center

    Richard, Veronica M.; Lahman, Maria K. E.

    2015-01-01

    In this methodological discussion, the authors detail and reflect on the processes of using photo-elicitation interviewing as a way to align with positive qualitative methodologies, to gain access to participant beliefs and values, and to highlight participant voices through their choices of words and visuals. A review of the literature and an…

  4. Gesture and the Mathematics of Motion.

    ERIC Educational Resources Information Center

    Noble, Tracy

    This paper investigates one high school student's use of gestures in an interview context in which he worked on the problem of understanding graphical representations of motion. The goal of the investigation was to contribute a detailed analysis of the process of learning as it occurred over a short time period in order to contribute to the…

  5. The EGRET data products

    NASA Technical Reports Server (NTRS)

    Mattox, J. R.; Bertsch, D. L.; Fichtel, C. E.; Hartman, R. C.; Hunter, S. D.; Kanbach, G.; Kniffen, D. A.; Kwok, P. W.; Lin, Y. C.; Mayer-Hasselwander, H. A.

    1992-01-01

    We describe the Energetic Gamma Ray Experiment Telescope (EGRET) data products which we anticipate will suffice for virtually all guest and archival investigations. The production process, content, availability, format, and the associated software of each product is described. Supplied here is sufficient detail for each researcher to do analysis which is not supported by extant software.

  6. Optical method for measuring the surface area of a threaded fastener

    Treesearch

    Douglas Rammer; Samuel Zelinka

    2010-01-01

    This article highlights major aspects of a new optical technique to determine the surface area of a threaded fastener; the theoretical framework has been reported elsewhere. Specifically, this article describes general surface area expressions used in the analysis, details of image acquisition system, and major image processing steps contained within the measurement...

  7. 76 FR 53500 - Notice of the Nuclear Regulatory Commission Issuance of Materials License SUA-1598 and Record of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-26

    ... (ADAMS), which provides text and image files of the NRC's public documents in the NRC Library at http... considered, but eliminated from detailed analysis, include conventional uranium mining and milling, conventional mining and heap leach processing, alternate lixiviants, and alternative wastewater disposal...

  8. Analysis of data from spacecraft (stratospheric warmings)

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The details of the stratospheric warming processes as to time, area, and intensity were established, and the warmings were correlated with other terrestrial and solar phenomena occurring at satellite platform altitudes or observable from satellite platforms. Links were sought between the perturbed upper atmosphere (mesosphere and thermosphere) and the stratosphere that might explain stratospheric warmings.

  9. An Analysis of Educational Policy: Implications for Minority Community Concerns.

    ERIC Educational Resources Information Center

    Harris, J. John, III; Ogle, Terry

    The paper presents a detailed overview of educational policymaking and discusses the need for minority groups to be involved in policy formation. The first section describes the distinguishing characteristics of the main elements of administrative functions and the policymaking process. The second section examines the following three models of…

  10. A Human Factor Analysis to Mitigate Fall Risk Factors in an Aerospace Environment

    NASA Technical Reports Server (NTRS)

    Ware, Joylene H.

    2010-01-01

    This slide presentation reviews a study done to quantify the risk of falls from three locations (i.e., the Shuttle Landing Facility, Launch Complex Payloads, and the Vehicle Assembly Building) at the Kennedy Space Center. The Analytical Hierarchy Process (AHP) is reviewed and the mathematical model developed is detailed.

  11. A broken Krebs cycle in macrophages.

    PubMed

    O'Neill, Luke A J

    2015-03-17

    Macrophages undergo metabolic rewiring during polarization but details of this process are unclear. In this issue of Immunity, Jha et al. (2015) report a systems approach for unbiased analysis of cellular metabolism that reveals key metabolites and metabolic pathways required for distinct macrophage polarization states. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Team Design Communication Patterns in e-Learning Design and Development

    ERIC Educational Resources Information Center

    Rapanta, Chrysi; Maina, Marcelo; Lotz, Nicole; Bacchelli, Alberto

    2013-01-01

    Prescriptive stage models have been found insufficient to describe the dynamic aspects of designing, especially in interdisciplinary e-learning design teams. There is a growing need for a systematic empirical analysis of team design processes that offer deeper and more detailed insights into instructional design (ID) than general models can offer.…

  13. Student Teachers' Patterns of Reflection in the Context of Teaching Practice

    ERIC Educational Resources Information Center

    Toom, Auli; Husu, Jukka; Patrikainen, Sanna

    2015-01-01

    This study clarifies the basic structure of student teachers' reflective thinking. It presents a constructivist account of teacher knowledge through a detailed analysis of various patterns of reflection in student teacher portfolios. We aim to gain a greater understanding of the process and outcomes of portfolio writing in the context of teaching…

  14. Measurement precision and noise analysis of CCD cameras

    NASA Astrophysics Data System (ADS)

    Wu, ZhenSen; Li, Zhiyang; Zhang, Ping

    1993-09-01

    The limiting precision of a CCD camera with 10-bit analogue-to-digital conversion is estimated in this paper. The effect of noise on measurement precision and the noise characteristics are analyzed in detail. Means of reducing the noise are also discussed, and a diagram of the noise properties is given.
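
    As a rough, hedged illustration of the noise-budget arithmetic such an analysis rests on, the sketch below compares the quantization noise of a 10-bit converter with photon shot noise and read noise; the full-well and read-noise figures are assumed for illustration and are not taken from the paper.

      # Sketch of a CCD noise budget (all device parameters are assumptions).
      import numpy as np

      n_bits = 10
      full_well = 20_000            # electrons (assumed full-well capacity)
      read_noise = 10.0             # electrons rms (assumed)

      signal = full_well * 0.5                          # half-full pixel
      shot = np.sqrt(signal)                            # photon shot noise, electrons
      quant = (full_well / 2 ** n_bits) / np.sqrt(12)   # ADC quantization noise, electrons
      total = np.sqrt(shot ** 2 + read_noise ** 2 + quant ** 2)
      print(f"SNR = {signal / total:.1f}  ({20 * np.log10(signal / total):.1f} dB)")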

  15. Digital signal processing of the phonocardiogram: review of the most recent advancements.

    PubMed

    Durand, L G; Pibarot, P

    1995-01-01

    The objective of the present paper is to provide a detailed review of the most recent developments in instrumentation and signal processing of digital phonocardiography and heart auscultation. After a short introduction, the paper presents a brief history of heart auscultation and phonocardiography, which is followed by a summary of the basic theories and controversies regarding the genesis of the heart sounds. The application of spectral analysis and the potential of new time-frequency representations and cardiac acoustic mapping to resolve the controversies and better understand the genesis and transmission of heart sounds and murmurs within the heart-thorax acoustic system are reviewed. The most recent developments in the application of linear predictive coding, spectral analysis, time-frequency representation techniques, and pattern recognition for the detection and follow-up of native and prosthetic valve degeneration and dysfunction are also presented in detail. New areas of research and clinical applications and areas of potential future developments are then highlighted. The final section is a discussion about a multidegree of freedom theory on the origin of the heart sounds and murmurs, which is completed by the authors' conclusion.
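
    As a pointer to the spectral methods the review covers, the following minimal Python sketch computes the spectrogram of a synthetic phonocardiogram; all heart-sound timings and frequencies are illustrative assumptions, not values from the review.

      # Minimal time-frequency sketch of a synthetic phonocardiogram (PCG).
      import numpy as np
      from scipy.signal import spectrogram

      fs = 4000                                    # 4 kHz audio-band sampling (assumed)
      t = np.arange(0, 3, 1 / fs)
      pcg = np.zeros_like(t)
      for s1 in np.arange(0.2, 3, 0.9):            # S1 sounds, ~50 Hz bursts (illustrative)
          pcg += np.exp(-((t - s1) / 0.03) ** 2) * np.sin(2 * np.pi * 50 * t)
      for s2 in np.arange(0.55, 3, 0.9):           # S2 sounds, ~70 Hz bursts (illustrative)
          pcg += 0.7 * np.exp(-((t - s2) / 0.02) ** 2) * np.sin(2 * np.pi * 70 * t)
      pcg += 0.02 * np.random.randn(t.size)

      # Each column of Sxx is a short-time spectrum; S1/S2 appear as low-frequency
      # bursts, the starting point for segmentation and pattern recognition.
      f, tau, Sxx = spectrogram(pcg, fs=fs, nperseg=256, noverlap=192)
      print(Sxx.shape, f[Sxx.mean(axis=1).argmax()], "Hz dominant")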

  16. Aircraft Loss of Control: Problem Analysis for the Development and Validation of Technology Solutions

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Newman, Richard L.; Crider, Dennis A.; Klyde, David H.; Foster, John V.; Groff, Loren

    2016-01-01

    Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes. LOC can result from a wide spectrum of precursors (or hazards), often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and the validation process must provide a means of assessing system effectiveness and coverage of these hazards. This paper provides a detailed description of a methodology for analyzing LOC as a dynamics and control problem for the purpose of developing effective technology solutions. The paper includes a definition of LOC based on several recent publications, a detailed description of a refined LOC accident analysis process that is illustrated via selected example cases, and a description of planned follow-on activities for identifying future potential LOC risks and the development of LOC test scenarios. Some preliminary considerations for LOC of Unmanned Aircraft Systems (UAS) and for their safe integration into the National Airspace System (NAS) are also discussed.

  17. Carbothermic synthesis of 820 μm uranium nitride kernels: Literature review, thermodynamics, analysis, and related experiments

    NASA Astrophysics Data System (ADS)

    Lindemer, T. B.; Voit, S. L.; Silva, C. M.; Besmann, T. M.; Hunt, R. D.

    2014-05-01

    The US Department of Energy is developing a new nuclear fuel that would be less susceptible to ruptures during a loss-of-coolant accident. The fuel would consist of tristructural isotropic coated particles with uranium nitride (UN) kernels with diameters near 825 μm. This effort explores factors involved in the conversion of uranium oxide-carbon microspheres into UN kernels. An analysis of previous studies with sufficient experimental details is provided. Thermodynamic calculations were made to predict pressures of carbon monoxide and other relevant gases for several reactions that can be involved in the conversion of uranium oxides and carbides into UN. Uranium oxide-carbon microspheres were heated in a microbalance with an attached mass spectrometer to determine details of calcining and carbothermic conversion in argon, nitrogen, and vacuum. A model was derived from experiments on the vacuum conversion to uranium oxide-carbide kernels. UN-containing kernels were fabricated using this vacuum conversion as part of the overall process. Carbonitride kernels of ∼89% of theoretical density were produced along with several observations concerning the different stages of the process.

  18. The Developmental Process of the Growing Motile Ciliary Tip Region.

    PubMed

    Reynolds, Matthew J; Phetruen, Tanaporn; Fisher, Rebecca L; Chen, Ke; Pentecost, Brian T; Gomez, George; Ounjai, Puey; Sui, Haixin

    2018-05-22

    Eukaryotic motile cilia/flagella play vital roles in various physiological processes in mammals and some protists. Defects in cilia formation underlie multiple human disorders, known as ciliopathies. The detailed processes of cilia growth and development are still far from clear despite extensive studies. In this study, we characterized the process of cilium formation (ciliogenesis) by investigating the newly developed motile cilia of deciliated protists using complementary techniques in electron microscopy and image analysis. Our results demonstrated that the distal tip region of motile cilia exhibits progressive morphological changes as cilia develop. This developmental process is time-dependent and continues after growing cilia reach their full lengths. The structural analysis of growing ciliary tips revealed that B-tubules of axonemal microtubule doublets terminate far from the tip end, which is led by the flagellar tip complex (FTC), demonstrating that the FTC might not directly mediate the fast turnover of intraflagellar transport (IFT).

  19. Making the Hubble Space Telescope servicing mission safe

    NASA Technical Reports Server (NTRS)

    Bahr, N. J.; Depalo, S. V.

    1992-01-01

    The implementation of the HST system safety program is detailed. Numerous safety analyses are conducted through various phases of design, test, and fabrication, and results are presented to NASA management for discussion during dedicated safety reviews. Attention is given to the system safety assessment and risk analysis methodologies used, i.e., hazard analysis, fault tree analysis, and failure modes and effects analysis, and to how they are coupled with engineering and test analysis for a 'synergistic picture' of the system. Some preliminary safety analysis results, showing the relationship between hazard identification, control or abatement, and finally control verification, are presented as examples of this safety process.

  20. Safeguards Approaches for Black Box Processes or Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diaz-Marcano, Helly; Gitau, Ernest TN; Hockert, John

    2013-09-25

    The objective of this study is to determine whether a safeguards approach can be developed for “black box” processes or facilities. These are facilities where a State or operator may limit IAEA access to specific processes or portions of a facility; in other cases, the IAEA may be prohibited access to the entire facility. The determination of whether a black box process or facility is safeguardable is dependent upon the details of the process type, design, and layout; the specific limitations on inspector access; and the restrictions placed upon the design information that can be provided to the IAEA. This analysis identified the necessary conditions for safeguardability of black box processes and facilities.

  1. Pan-sharpening algorithm to remove thin cloud via mask dodging and nonsampled shift-invariant shearlet transform

    NASA Astrophysics Data System (ADS)

    Shi, Cheng; Liu, Fang; Li, Ling-Ling; Hao, Hong-Xia

    2014-01-01

    The goal of pan-sharpening is to get an image with higher spatial resolution and better spectral information. However, the resolution of the pan-sharpened image is seriously affected by thin clouds. For a single image, filtering algorithms are widely used to remove clouds. These methods can remove clouds effectively, but the detail loss in the cloud-removed image is also serious. To solve this problem, a pan-sharpening algorithm to remove thin cloud via mask dodging and nonsampled shift-invariant shearlet transform (NSST) is proposed. For the low-resolution multispectral (LR MS) and high-resolution panchromatic images with thin clouds, a mask dodging method is used to remove the clouds. For the cloud-removed LR MS image, an adaptive principal component analysis transform is proposed to balance the spectral information and spatial resolution in the pan-sharpened image. Since the cloud removal process causes detail loss, a weight matrix is designed to enhance the details of the cloud regions in the pan-sharpening process while noncloud regions remain unchanged; the details of the image are obtained by NSST. Experimental results, evaluated visually and with quantitative metrics, demonstrate that the proposed method keeps better spectral information and spatial resolution, especially for images with thin clouds.
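
    A hedged sketch of the detail-injection idea follows; it substitutes an undecimated wavelet transform (PyWavelets) for the paper's NSST, and uses toy images and a toy cloud mask, so it illustrates the structure of the method rather than reproducing it.

      # Detail-injection pan-sharpening sketch; a stationary wavelet transform
      # stands in for the paper's NSST, and all images/masks are synthetic.
      import numpy as np
      import pywt

      rng = np.random.default_rng(4)
      pan = rng.random((128, 128))                    # high-res panchromatic (toy)
      ms = pan * 0.8 + 0.1 * rng.random((128, 128))   # one upsampled MS band (toy)
      cloud = np.zeros_like(pan)
      cloud[32:64, 32:64] = 1.0                       # toy thin-cloud mask

      # Extract PAN details with an undecimated (shift-invariant) wavelet.
      (approx, (h, v, d)) = pywt.swt2(pan, "db2", level=1)[0]
      details = h + v + d

      # Weight matrix: boost injected details inside cloud regions, as the
      # abstract describes; noncloud regions keep unit weight.
      W = 1.0 + 0.5 * cloud
      sharpened = ms + W * details
      print(sharpened.shape)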

  2. Rotorblades for large wind turbines

    NASA Astrophysics Data System (ADS)

    Wackerle, P. M.; Hahn, M.

    1981-09-01

    Details of the design work and manufacturing process for prototype production of 25 m long composite rotor blades for wind energy generators are presented. The blades are of the 'integrated spar design' type and consist of a glass fiber skin and a PVC core. A computer program (and its action tree) is used for the analysis of the multi-connected hybrid cross-section in order to achieve optimal design specifications. Four tools are needed for the production of the two blade types: two molds, and milling, cutting, and drilling jigs. The manufacturing processes for the molds, jigs, and blades are discussed in detail. The final acceptance of the blade is based on a static test, in which the flexibility of the blade is checked by the magnitude of load and deflection, and a dynamic test evaluating the natural frequencies in bending and torsion.

  3. Cost model relationships between textile manufacturing processes and design details for transport fuselage elements

    NASA Technical Reports Server (NTRS)

    Metschan, Stephen L.; Wilden, Kurtis S.; Sharpless, Garrett C.; Andelman, Rich M.

    1993-01-01

    Textile manufacturing processes offer potential cost and weight advantages over traditional composite materials and processes for transport fuselage elements. In the current study, design cost modeling relationships between textile processes and element design details were developed. Such relationships are expected to help future aircraft designers to make timely decisions on the effect of design details and overall configurations on textile fabrication costs. The fundamental advantage of a design cost model is to ensure that the element design is cost effective for the intended process. Trade studies on the effects of processing parameters also help to optimize the manufacturing steps for a particular structural element. Two methods of analyzing design detail/process cost relationships developed for the design cost model were pursued in the current study. The first makes use of existing databases and alternative cost modeling methods (e.g. detailed estimating). The second compares design cost model predictions with data collected during the fabrication of seven-foot circumferential frames for ATCAS crown test panels. The process used in this case involves 2D dry braiding and resin transfer molding of curved 'J' cross-section frame members having design details characteristic of the baseline ATCAS crown design.

  4. Metabolomics-Based Elucidation of Active Metabolic Pathways in Erythrocytes and HSC-Derived Reticulocytes.

    PubMed

    Srivastava, Anubhav; Evans, Krystal J; Sexton, Anna E; Schofield, Louis; Creek, Darren J

    2017-04-07

    A detailed analysis of the metabolic state of human-stem-cell-derived erythrocytes allowed us to characterize the existence of active metabolic pathways in younger reticulocytes and compare them to mature erythrocytes. Using high-resolution LC-MS-based untargeted metabolomics, we found that reticulocytes had a comparatively much richer repertoire of metabolites, which spanned a range of metabolite classes. An untargeted metabolomics analysis using stable-isotope-labeled glucose showed that only glycolysis and the pentose phosphate pathway actively contributed to the biosynthesis of metabolites in erythrocytes, and these pathways were upregulated in reticulocytes. Most metabolite species found to be enriched in reticulocytes were residual pools of metabolites produced by earlier erythropoietic processes, and their systematic depletion in mature erythrocytes aligns with the simplification process, which is also seen at the cellular and the structural level. Our work shows that high-resolution LC-MS-based untargeted metabolomics provides a global coverage of the biochemical species that are present in erythrocytes. However, the incorporation of stable isotope labeling provides a more accurate description of the active metabolic processes that occur in each developmental stage. To our knowledge, this is the first detailed characterization of the active metabolic pathways of the erythroid lineage, and it provides a rich database for understanding the physiology of the maturation of reticulocytes into mature erythrocytes.

  5. Research on the use of space resources

    NASA Technical Reports Server (NTRS)

    Carroll, W. F. (Editor)

    1983-01-01

    The second year of a multiyear research program on the processing and use of extraterrestrial resources is covered. The research tasks included: (1) silicate processing, (2) magma electrolysis, (3) vapor phase reduction, and (4) metals separation. Concomitant studies included: (1) energy systems, (2) transportation systems, (3) utilization analysis, and (4) resource exploration missions. Emphasis in fiscal year 1982 was placed on the magma electrolysis and vapor phase reduction processes (both analytical and experimental) for separation of oxygen and metals from lunar regolith. The early experimental work on magma electrolysis resulted in gram quantities of iron (mixed metals) and the identification of significant anode, cathode, and container problems. In the vapor phase reduction tasks a detailed analysis of various process concepts led to the selection of two specific processes designated as "Vapor Separation" and "Selective Ionization." Experimental work was deferred to fiscal year 1983. In the Silicate Processing task a thermophysical model of the casting process was developed and used to study the effect of variations in material properties on the cooling behavior of lunar basalt.

  6. A review of biostratigraphic studies in the olistostrome deposits of Karangsambung Formation

    NASA Astrophysics Data System (ADS)

    Hendrizan, Marfasran

    2018-02-01

    Planktonic foraminifera are widely used in marine sediment biostratigraphy, yet the foraminiferal biostratigraphy of the Karangsambung Formation has rarely been investigated. This review of foraminiferal biostratigraphy is intended as early groundwork for research on the ages of the Tertiary rock formations in Karangsambung. The research area was formed by an olistostrome process: a sedimentary slide deposit characterized by bodies of harder rock mixed and dispersed in a matrix. Biostratigraphic studies of the Karangsambung Formation based on foraminifera and nannoplankton are still qualitative analyses using marker fossils, and the age of this formation remains debatable. Two explanations of the disputed ages are possible: first, that the Karangsambung Formation records normal sedimentation in some places while other areas, such as Kali Welaran and Clebok village, are products of the olistostrome; and second, that the Karangsambung Formation is entirely an olistostrome deposit. Micropaleontological sampling and analysis of the matrix clays of the olistostrome were, however, neglected, so biostratigraphic results from those matrix clays were treated as recording a normal sedimentation process and yielded an age of middle Eocene to Oligocene. We suppose that previous authors collected samples from the matrix of the Karangsambung Formation along several river sections, which would lead to misinterpretation of the age of the formation. The middle to late Eocene ages probably date older sediment that was reworked by sliding and accumulated in the Karangsambung Formation; the formation itself dates to the Oligocene, based on the finding of several calcareous nannofossils. Detailed micropaleontological analysis of the olistostrome deposits in the Karangsambung Formation should therefore be re-evaluated to establish an accurate dating. Re-evaluation should start from detailed sedimentological mapping of Karangsambung Formation transects described by previous authors, especially the Kali Welaran, Jatibungkus, and Clebok sections, followed by systematic sampling of both the normal sedimentation products and the matrix clays of the olistostrome. Finally, quantitative micropaleontological analysis can be applied to identify the age of the Karangsambung Formation.

  7. Research in interactive scene analysis

    NASA Technical Reports Server (NTRS)

    Tenenbaum, J. M.; Barrow, H. G.; Weyl, S. A.

    1976-01-01

    Cooperative (man-machine) scene analysis techniques were developed whereby humans can provide a computer with guidance when completely automated processing is infeasible. An interactive approach promises significant near-term payoffs in analyzing various types of high volume satellite imagery, as well as vehicle-based imagery used in robot planetary exploration. This report summarizes the work accomplished over the duration of the project and describes in detail three major accomplishments: (1) the interactive design of texture classifiers; (2) a new approach for integrating the segmentation and interpretation phases of scene analysis; and (3) the application of interactive scene analysis techniques to cartography.

  8. Uncertainty Analysis of NASA Glenn's 8- by 6-Foot Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Stephens, Julia E.; Hubbard, Erin P.; Walter, Joel A.; McElroy, Tyler

    2016-01-01

    An analysis was performed to determine the measurement uncertainty of the Mach number in the 8- by 6-foot Supersonic Wind Tunnel at the NASA Glenn Research Center. This paper details the analysis process used, including methods for handling limited data and complicated data correlations. Due to the complexity of the equations used, a Monte Carlo method was utilized for this uncertainty analysis. A summary of the findings is presented, covering what the uncertainties are, how they impact various research tests in the facility, and methods of reducing the uncertainties in the future.
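
    A minimal Monte Carlo sketch of this kind of propagation is shown below, assuming the standard isentropic pitot relation between pressure ratio and Mach number; the input means and standard deviations are illustrative, not facility values.

      # Monte Carlo propagation of pressure uncertainty into Mach number.
      import numpy as np

      rng = np.random.default_rng(0)
      gamma = 1.4          # ratio of specific heats for air
      N = 100_000          # number of Monte Carlo draws

      # Assumed measurement models: total and static pressure in kPa (hypothetical).
      p0 = rng.normal(150.0, 0.5, N)   # total pressure samples
      p = rng.normal(100.0, 0.4, N)    # static pressure samples

      # Isentropic relation: M = sqrt((2/(gamma-1)) * ((p0/p)**((gamma-1)/gamma) - 1))
      M = np.sqrt(2.0 / (gamma - 1.0) * ((p0 / p) ** ((gamma - 1.0) / gamma) - 1.0))

      print(f"Mach mean = {M.mean():.4f}, std (1-sigma) = {M.std(ddof=1):.4f}")
      print(f"95% interval: [{np.percentile(M, 2.5):.4f}, {np.percentile(M, 97.5):.4f}]")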

  9. Inlet Development for a Rocket Based Combined Cycle, Single Stage to Orbit Vehicle Using Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    DeBonis, J. R.; Trefny, C. J.; Steffen, C. J., Jr.

    1999-01-01

    Design and analysis of the inlet for a rocket based combined cycle engine is discussed. Computational fluid dynamics was used in both the design and subsequent analysis. Reynolds averaged Navier-Stokes simulations were performed using both perfect gas and real gas assumptions. An inlet design that operates over the required Mach number range from 0 to 12 was produced. Performance data for cycle analysis were post-processed using a stream thrust averaging technique. A detailed performance database for cycle analysis is presented. The effect of vehicle forebody compression on air capture is also examined.
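
    For reference, stream thrust averaging in one common form (a sketch; the paper's exact variant is not given in the abstract) replaces a nonuniform profile with a uniform state that conserves the integrated mass, momentum, and total-enthalpy fluxes:

      % Integrated fluxes over the cross-section A:
      \dot m = \int_A \rho u \, dA, \qquad
      F = \int_A \left( \rho u^2 + p \right) dA, \qquad
      \dot H = \int_A \rho u \, h_0 \, dA
      % Uniform equivalent state (\bar\rho, \bar u, \bar p, \bar h_0) on the same area:
      \bar\rho\,\bar u\,A = \dot m, \qquad
      \left( \bar\rho\,\bar u^{2} + \bar p \right) A = F, \qquad
      \bar h_0 = \dot H / \dot m

    Together with an equation of state, these conditions fix the one-dimensional equivalent state used by the cycle analysis.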

  10. Galileo and Ulysses missions safety analysis and launch readiness status

    NASA Technical Reports Server (NTRS)

    Cork, M. Joseph; Turi, James A.

    1989-01-01

    The Galileo spacecraft, which will release probes to explore the Jupiter system, was launched in October 1989 as the payload on STS-34, and the Ulysses spacecraft, which will fly by Jupiter en route to a polar orbit of the sun, is presently entering system-test activity in preparation for an October 1990 launch. This paper reviews the Galileo and Ulysses mission objectives and design approaches and presents details of the missions' safety analysis. The processes used to develop the safety analysis are described and the results of safety tests are presented.

  11. Performance Analysis of Multilevel Parallel Applications on Shared Memory Architectures

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Caubet, Jordi; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    In this paper we describe how to apply powerful performance analysis techniques to understand the behavior of multilevel parallel applications. We use the Paraver/OMPItrace performance analysis system for our study. This system consists of two major components: the OMPItrace dynamic instrumentation mechanism, which allows the tracing of processes and threads, and the Paraver graphical user interface for inspection and analysis of the generated traces. We describe how to use the system to conduct a detailed comparative study of a benchmark code implemented in five different programming paradigms applicable for shared memory…

  12. Assessment of documentation requirements under DOE 5481.1, Safety Analysis and Review System (SARS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Browne, E.T.

    1981-03-01

    This report assesses the requirements of DOE Order 5481.1, Safety Analysis and Review System for DOE Operations (SARS) in regard to maintaining SARS documentation. Under SARS, all pertinent details of the entire safety analysis and review process for each DOE operation are to be traceable from the initial identification of a hazard. This report is intended to provide assistance in identifying the points in the SARS cycle at which documentation is required, what type of documentation is most appropriate, and where it ultimately should be maintained.

  13. Probabilistic Structural Analysis of the SRB Aft Skirt External Fitting Modification

    NASA Technical Reports Server (NTRS)

    Townsend, John S.; Peck, J.; Ayala, S.

    1999-01-01

    NASA has funded several major programs (the PSAM Project is an example) to develop Probabilistic Structural Analysis Methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element design tool, known as NESSUS, is used to determine the reliability of the Space Shuttle Solid Rocket Booster (SRB) aft skirt critical weld. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process.

  14. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  15. Detailed Characterization of Human Induced Pluripotent Stem Cells Manufactured for Therapeutic Applications.

    PubMed

    Baghbaderani, Behnam Ahmadian; Syama, Adhikarla; Sivapatham, Renuka; Pei, Ying; Mukherjee, Odity; Fellner, Thomas; Zeng, Xianmin; Rao, Mahendra S

    2016-08-01

    We have recently described manufacturing of human induced pluripotent stem cell (iPSC) master cell banks (MCB) generated by a clinically compliant process using cord blood as a starting material (Baghbaderani et al. in Stem Cell Reports, 5(4), 647-659, 2015). In this manuscript, we describe the detailed characterization of the two iPSC clones generated using this process, including whole genome sequencing (WGS), microarray, and comparative genomic hybridization (aCGH)/single nucleotide polymorphism (SNP) analysis. We compare their profiles with a proposed calibration material and with a reporter subclone and lines made by a similar process from different donors. We believe that iPSCs are likely to be used to make multiple clinical products. We further believe that the lines used as input material will be used at different sites and, given their immortal status, will be used for many years or even decades. Therefore, it will be important to develop assays to monitor the state of the cells and their drift in culture. We suggest that a detailed characterization of the initial status of the cells, a comparison with some calibration material, and the development of reporter subclones will help determine which set of tests will be most useful in monitoring the cells and establishing criteria for discarding a line.

  16. Computer program to perform cost and weight analysis of transport aircraft. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A digital computer program for evaluating the weight and costs of advanced transport designs was developed. The resultant program, intended for use at the preliminary design level, incorporates both batch mode and interactive graphics run capability. The basis of the weight and cost estimation method developed is a unique way of predicting the physical design of each detail part of a vehicle structure at a time when only configuration concept drawings are available. In addition, the technique relies on methods to predict the precise manufacturing processes and the associated material required to produce each detail part. Weight data are generated in four areas of the program. Overall vehicle system weights are derived on a statistical basis as part of the vehicle sizing process. Theoretical weights, actual weights, and the weight of the raw material to be purchased are derived as part of the structural synthesis and part definition processes based on the computed part geometry.

  17. Process Design and Economics for the Conversion of Lignocellulosic Biomass to Hydrocarbon Fuels. Thermochemical Research Pathways with In Situ and Ex Situ Upgrading of Fast Pyrolysis Vapors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dutta, Abhijit; Sahir, Asad; Tan, Eric

    This report was developed as part of the U.S. Department of Energy’s Bioenergy Technologies Office’s efforts to enable the development of technologies for the production of infrastructure-compatible, cost-competitive liquid hydrocarbon fuels from biomass. Specifically, this report details two conceptual designs based on projected product yields and quality improvements via catalyst development and process integration. It is expected that these research improvements will be made within the 2022 timeframe. The two conversion pathways detailed are (1) in situ and (2) ex situ upgrading of vapors produced from the fast pyrolysis of biomass. While the base case conceptual designs and underlying assumptions outline performance metrics for feasibility, it should be noted that these are only two of many other possibilities in this area of research. Other promising process design options emerging from the research will be considered for future techno-economic analysis.

  18. Texture as a basis for acoustic classification of substrate in the nearshore region

    NASA Astrophysics Data System (ADS)

    Dennison, A.; Wattrus, N. J.

    2016-12-01

    Segmentation and classification of substrate type at two locations in Lake Superior are predicted using multivariate statistical processing of textural measures derived from shallow-water, high-resolution multibeam bathymetric data. During a multibeam sonar survey, both bathymetric and backscatter data are collected. It is well documented that the statistical character of a sonar backscatter mosaic depends on substrate type. While classifying the bottom type on the basis of backscatter alone can accurately predict and map bottom type, it cannot resolve and capture fine textural details, an important factor in many habitat mapping studies. Statistical processing of the bathymetry can capture pertinent, texturally rich details about the bottom type. Further multivariate statistical processing can then isolate characteristic features and provide the basis for an accurate classification scheme. Preliminary results from an analysis of bathymetric data and ground-truth samples collected from the Amnicon River, Superior, Wisconsin, and the Lester River, Duluth, Minnesota, demonstrate the ability to process the data and develop a novel classification scheme of the bottom type in two geomorphologically distinct areas.
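
    As a hedged illustration of texture-based classification, the sketch below computes gray-level co-occurrence (GLCM) statistics over windows of a synthetic bathymetry grid and clusters them; GLCM features stand in for whatever textural measures the study actually used (requires scikit-image >= 0.19 and scikit-learn).

      # Window-wise GLCM texture features plus unsupervised clustering.
      import numpy as np
      from skimage.feature import graycomatrix, graycoprops
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(3)
      depth = rng.normal(0, 1, (256, 256))            # synthetic bathymetry grid
      depth[:, 128:] += np.cumsum(rng.normal(0, 0.2, (256, 128)), axis=0)  # rougher half

      # Quantize to 32 gray levels, then extract GLCM statistics per window.
      edges = np.quantile(depth, np.linspace(0, 1, 33)[1:-1])
      q = np.digitize(depth, edges).astype(np.uint8)  # values 0..31
      win, feats = 32, []
      for i in range(0, 256 - win, win):
          for j in range(0, 256 - win, win):
              glcm = graycomatrix(q[i:i + win, j:j + win], distances=[1],
                                  angles=[0, np.pi / 2], levels=32,
                                  symmetric=True, normed=True)
              feats.append([graycoprops(glcm, p).mean()
                            for p in ("contrast", "homogeneity", "energy")])

      # Group windows into substrate classes (two classes assumed here).
      labels = KMeans(n_clusters=2, n_init=10).fit_predict(np.array(feats))
      print(labels)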

  19. Biosurfactant production by Aureobasidium pullulans in stirred tank bioreactor: New approach to understand the influence of important variables in the process.

    PubMed

    Brumano, Larissa Pereira; Antunes, Felipe Antonio Fernandes; Souto, Sara Galeno; Dos Santos, Júlio Cesar; Venus, Joachim; Schneider, Roland; da Silva, Silvio Silvério

    2017-11-01

    Surfactants are amphiphilic molecules with broad industrial applications, currently produced mainly by chemical routes derived from the oil industry. Biotechnological processes, which aim to develop new sustainable process configurations using favorable microorganisms, still require more detailed investigation. Thus, we present a novel approach for biosurfactant production using the promising yeast Aureobasidium pullulans LB 83 in a stirred tank reactor. A central composite face-centered design was carried out to evaluate the effect of the aeration rate (0.1-1.1 min⁻¹) and sucrose concentration (20-80 g·L⁻¹) on the maximum biosurfactant tensoactivity and productivity. Statistical analysis showed that the use of both variables at high levels enhanced tensoactivity, giving 8.05 cm in the oil spread test and a productivity of 0.0838 cm·h⁻¹. This unprecedented investigation of the relevance of aeration rate and sucrose concentration to biosurfactant production by A. pullulans in a stirred tank reactor demonstrates the importance of establishing adequate bioreactor conditions for process scale-up. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. SERS as a tool for in vitro toxicology.

    PubMed

    Fisher, Kate M; McLeish, Jennifer A; Jamieson, Lauren E; Jiang, Jing; Hopgood, James R; McLaughlin, Stephen; Donaldson, Ken; Campbell, Colin J

    2016-06-23

    Measuring markers of stress such as pH and redox potential are important when studying toxicology in in vitro models because they are markers of oxidative stress, apoptosis and viability. While surface enhanced Raman spectroscopy is ideally suited to the measurement of redox potential and pH in live cells, the time-intensive nature and perceived difficulty in signal analysis and interpretation can be a barrier to its broad uptake by the biological community. In this paper we detail the development of signal processing and analysis algorithms that allow SERS spectra to be automatically processed so that the output of the processing is a pH or redox potential value. By automating signal processing we were able to carry out a comparative evaluation of the toxicology of silver and zinc oxide nanoparticles and correlate our findings with qPCR analysis. The combination of these two analytical techniques sheds light on the differences in toxicology between these two materials from the perspective of oxidative stress.

  1. Digital signal processing for velocity measurements in dynamical material's behaviour studies.

    PubMed

    Devlaminck, Julien; Luc, Jérôme; Chanal, Pierre-Yves

    2014-03-01

    In this work, we describe different configurations of optical fiber interferometers (Michelson and Mach-Zehnder types) used to measure velocities in studies of dynamic material behaviour. We detail the processing algorithms developed and optimized to improve the performance of these interferometers, especially in terms of time and frequency resolution. Three methods of analysis of interferometric signals were studied. For Michelson interferometers, time-frequency analysis of signals by Short-Time Fourier Transform (STFT) is compared to time-frequency analysis by Continuous Wavelet Transform (CWT). The results show that the CWT is more suitable than the STFT for signals with low signal-to-noise ratio and for regions of low velocity and high acceleration. For Mach-Zehnder interferometers, the measurement is carried out by analyzing the phase shift between three interferometric signals (triature processing). These three methods of digital signal processing were evaluated, their measurement uncertainties estimated, and their restrictions or operational limitations specified from experimental results obtained on a pulsed power machine.
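
    A minimal sketch of the ridge-tracking idea behind the Michelson processing is given below: the beat frequency of a synthetic chirped signal is extracted frame by frame from an STFT and scaled to velocity. The fringe constant K and all signal parameters are assumptions for illustration; the window-length trade-off visible here is exactly what motivates the CWT variant.

      # STFT ridge tracking on a synthetic interferometer beat signal.
      import numpy as np
      from scipy.signal import stft

      fs = 1.0e8                                 # 100 MS/s digitizer (assumed)
      t = np.arange(0, 20e-6, 1 / fs)
      f_inst = 2e6 + 4e11 * t                    # synthetic accelerating target: chirp
      x = np.cos(2 * np.pi * np.cumsum(f_inst) / fs) + 0.2 * np.random.randn(t.size)

      # Window length trades time resolution against frequency resolution.
      f, tau, Z = stft(x, fs=fs, nperseg=512)
      ridge = f[np.abs(Z).argmax(axis=0)]        # dominant beat frequency per frame

      K = 1.0e-3                                 # hypothetical fringe constant, m/s per Hz
      velocity = K * ridge
      print(velocity[:5])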

  2. Perturbations from strings don't look like strings!

    NASA Technical Reports Server (NTRS)

    Albrecht, Andreas; Stebbins, Albert

    1991-01-01

    A systematic analysis is challenging popular ideas about perturbation from cosmic strings. One way in which the picture has changed is reviewed. It is concluded that, while the scaling properties of cosmic strings figure significantly in the analysis, care must be taken when thinking in terms of single time snapshots. The process of seeding density perturbations is not fundamentally localized in time, and this fact can wash out many of the details which appear in a single snapshot.

  3. Modern Methods of Rail Welding

    NASA Astrophysics Data System (ADS)

    Kozyrev, Nikolay A.; Kozyreva, Olga A.; Usoltsev, Aleksander A.; Kryukov, Roman E.; Shevchenko, Roman A.

    2017-10-01

    Existing methods of rail welding that make it possible to produce continuous welded rail track are reviewed in this article. Analysis of the existing welding methods allows the issue of continuous rail track to be considered in detail. Metallurgical and welding technologies of rail welding, as well as process technologies reducing the aftereffects of temperature exposure, are important factors determining the quality and reliability of the continuous rail track. Analysis of the existing methods of rail welding makes it possible to identify the line of research for solving this problem.

  4. Techno-economic assessment of hybrid extraction and distillation processes for furfural production from lignocellulosic biomass.

    PubMed

    Nhien, Le Cao; Long, Nguyen Van Duc; Kim, Sangyong; Lee, Moonyong

    2017-01-01

    Lignocellulosic biomass is one of the most promising alternatives for replacing mineral resources to overcome global warming, which has become the most important environmental issue in recent years. Furfural was listed by the National Renewable Energy Laboratory as one of the top 30 potential chemicals arising from biomass. However, the current production of furfural is energy intensive and uses inefficient technology. Thus, a hybrid purification process that combines extraction and distillation to produce furfural from lignocellulosic biomass was considered and investigated in detail to improve the process efficiency. This effective hybrid process depends on the extracting solvent, which was selected based on a comprehensive procedure that ranged from solvent screening to complete process design. Various solvents were first evaluated in terms of their extraction ability. Then, the most promising solvents were selected to study the separation feasibility. Eventually, processes that used the three best solvents (toluene, benzene, and butyl chloride) were designed and optimized in detail using Aspen Plus. Sustainability analysis was performed to evaluate these processes in terms of their energy requirements, total annual costs (TAC), and carbon dioxide (CO2) emissions. The results showed that butyl chloride was the most suitable solvent for the hybrid furfural process because it could save 44.7% of the TAC while reducing the CO2 emissions by 45.5% compared to the toluene process. In comparison with the traditional purification process using distillation, this suggested hybrid extraction/distillation process can save up to 19.2% of the TAC and reduce 58.3% of total annual CO2 emissions. Furthermore, a sensitivity analysis of the feed composition and its effect on the performance of the proposed hybrid system was conducted. Butyl chloride was found to be the most suitable solvent for the hybrid extraction/distillation process of furfural production. The proposed hybrid sequence was more favorable than the traditional distillation process when the methanol fraction of the feed stream was <3%, and more benefit could be obtained when that fraction decreased.

  5. Identification of the isomers using principal component analysis (PCA) method

    NASA Astrophysics Data System (ADS)

    Kepceoǧlu, Abdullah; Gündoǧdu, Yasemin; Ledingham, Kenneth William David; Kilic, Hamdi Sukur

    2016-03-01

    In this work, we have carried out a detailed statistical analysis of experimental mass spectra from xylene isomers. Principal Component Analysis (PCA) was used to identify isomers which cannot be distinguished using conventional statistical methods for the interpretation of their mass spectra. Experiments were carried out using a linear TOF-MS coupled to a femtosecond laser system as the energy source for the ionisation processes. The collected data were analysed and interpreted using PCA as a multivariate analysis of these spectra. This demonstrates the strength of the method for distinguishing isomers which cannot be identified using conventional mass analysis based on dissociative ionisation of these molecules. PCA results depending on the laser pulse energy and the background pressure in the spectrometer are presented in this work.
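
    A minimal sketch of PCA applied to a matrix of mass spectra follows; the spectra are synthetic stand-ins with a hypothetical diagnostic fragment window, since the real spectra would come from the instrument (requires numpy and scikit-learn).

      # PCA on a (spectra x m/z bins) matrix to separate two isomer classes.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(1)
      n_spectra, n_mz = 60, 500               # 60 spectra, 500 m/z bins (illustrative)

      # Two classes with slightly different fragment-intensity patterns.
      base = rng.random(n_mz)
      shift = np.zeros(n_mz)
      shift[100:110] = 0.3                    # hypothetical diagnostic fragments
      X = np.vstack([base + rng.normal(0, 0.05, (30, n_mz)),
                     base + shift + rng.normal(0, 0.05, (30, n_mz))])

      # Standardize each m/z channel, then project onto the leading components.
      scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
      print(scores[:5])   # spectra of the two isomers separate along PC1/PC2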

  6. Thermal Deformation and RF Performance Analyses for the SWOT Large Deployable Ka-Band Reflectarray

    NASA Technical Reports Server (NTRS)

    Fang, H.; Sunada, E.; Chaubell, J.; Esteban-Fernandez, D.; Thomson, M.; Nicaise, F.

    2010-01-01

    A large deployable antenna technology for the NASA Surface Water and Ocean Topography (SWOT) Mission is currently being developed by JPL in response to NRC Earth Science Tier 2 Decadal Survey recommendations. This technology is required to enable the SWOT mission because no currently available antenna is capable of meeting SWOT's demanding Ka-band remote sensing requirements. One of the key aspects of this antenna development is to minimize the effect of on-orbit thermal distortion on the antenna RF performance. An analysis process has been developed to support this antenna technology, comprising: 1) on-orbit thermal analysis to obtain the temperature distribution; 2) structural deformation analysis to obtain the geometry of the antenna surface; and 3) RF performance analysis for the deformed antenna surface. The detailed analysis process and some analysis results are presented and discussed in this paper.

  7. The Flight Optimization System Weights Estimation Method

    NASA Technical Reports Server (NTRS)

    Wells, Douglas P.; Horvath, Bryce L.; McCullers, Linwood A.

    2017-01-01

    FLOPS has been the primary aircraft synthesis software used by the Aeronautics Systems Analysis Branch at NASA Langley Research Center. It was created for rapid conceptual aircraft design and advanced technology impact assessments. FLOPS is a single computer program that includes weights estimation, aerodynamics estimation, engine cycle analysis, propulsion data scaling and interpolation, detailed mission performance analysis, takeoff and landing performance analysis, noise footprint estimation, and cost analysis. It is well known as a baseline and common denominator for aircraft design studies. FLOPS is capable of calibrating a model to known aircraft data, making it useful for new aircraft and modifications to existing aircraft. The weight estimation method in FLOPS is known to be of high fidelity for conventional tube-with-wing aircraft, and a substantial amount of effort went into its development. This report serves as comprehensive documentation of the FLOPS weight estimation method; the development history is presented along with the weight estimation process itself.

  8. Fixed Eigenvector Analysis of Thermographic NDE Data

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Winfree, William P.

    2011-01-01

    Principal Component Analysis (PCA) has been shown effective for reducing thermographic NDE data. This paper will discuss an alternative method of analysis that has been developed where a predetermined set of eigenvectors is used to process the thermal data from both reinforced carbon-carbon (RCC) and graphite-epoxy honeycomb materials. These eigenvectors can be generated either from an analytic model of the thermal response of the material system under examination, or from a large set of experimental data. This paper provides the details of the analytic model, an overview of the PCA process, as well as a quantitative signal-to-noise comparison of the results of performing both conventional PCA and fixed eigenvector analysis on thermographic data from two specimens, one reinforced carbon-carbon with flat bottom holes and the second a sandwich construction with graphite-epoxy face sheets and aluminum honeycomb core.
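
    The core difference from conventional PCA can be sketched in a few lines: the eigenvector basis is computed once (here from a synthetic training record, though the paper derives it from an analytic thermal model or a large experimental set) and new inspection data are only projected onto it.

      # Fixed-eigenvector reduction of a thermal image sequence (synthetic data).
      import numpy as np

      rng = np.random.default_rng(2)
      n_frames, n_pix = 200, 64 * 64

      # Training data: simulated cooling curves (one column per pixel).
      t = np.linspace(0.01, 2.0, n_frames)[:, None]
      train = t ** -0.5 + rng.normal(0, 0.01, (n_frames, n_pix))

      # Fixed basis: left singular vectors of the standardized training record.
      train_std = (train - train.mean(0)) / train.std(0)
      U, _, _ = np.linalg.svd(train_std, full_matrices=False)
      basis = U[:, :3]                      # keep the first three eigenvectors

      # New inspection data are reduced with the SAME basis (no new SVD):
      new_seq = t ** -0.5 + rng.normal(0, 0.01, (n_frames, n_pix))
      new_std = (new_seq - new_seq.mean(0)) / new_seq.std(0)
      component_images = basis.T @ new_std  # 3 x n_pix maps; reshape 64x64 to view
      print(component_images.shape)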

  9. An advanced software suite for the processing and analysis of silicon luminescence images

    NASA Astrophysics Data System (ADS)

    Payne, D. N. R.; Vargas, C.; Hameiri, Z.; Wenham, S. R.; Bagnall, D. M.

    2017-06-01

    Luminescence imaging is a versatile characterisation technique used for a broad range of research and industrial applications, particularly for the field of photovoltaics where photoluminescence and electroluminescence imaging is routinely carried out for materials analysis and quality control. Luminescence imaging can reveal a wealth of material information, as detailed in extensive literature, yet these techniques are often only used qualitatively instead of being utilised to their full potential. Part of the reason for this is the time and effort required for image processing and analysis in order to convert image data to more meaningful results. In this work, a custom built, Matlab based software suite is presented which aims to dramatically simplify luminescence image processing and analysis. The suite includes four individual programs which can be used in isolation or in conjunction to achieve a broad array of functionality, including but not limited to, point spread function determination and deconvolution, automated sample extraction, image alignment and comparison, minority carrier lifetime calibration and iron impurity concentration mapping.
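
    As one concrete example of the kind of step the suite automates, the sketch below applies Richardson-Lucy deconvolution from scikit-image (version >= 0.19 assumed) to a synthetically blurred luminescence-like image; the suite's own PSF-determination and deconvolution algorithms may differ.

      # PSF deconvolution sketch on a synthetic luminescence image.
      import numpy as np
      from scipy.signal import fftconvolve
      from skimage.restoration import richardson_lucy

      rng = np.random.default_rng(5)
      truth = np.zeros((128, 128))
      truth[40:90, 30:100] = 1.0                   # ideal bright sample region (toy)
      x = np.arange(-7, 8)
      psf = np.exp(-(x[:, None] ** 2 + x[None, :] ** 2) / (2 * 2.0 ** 2))
      psf /= psf.sum()                             # assumed Gaussian PSF

      blurred = fftconvolve(truth, psf, mode="same") + 0.01 * rng.random((128, 128))
      restored = richardson_lucy(blurred, psf, num_iter=30)
      print(restored.shape)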

  10. QUAGOL: a guide for qualitative data analysis.

    PubMed

    Dierckx de Casterlé, Bernadette; Gastmans, Chris; Bryon, Els; Denier, Yvonne

    2012-03-01

    Data analysis is a complex and contested part of the qualitative research process, which has received limited theoretical attention. Researchers are often in need of useful instructions or guidelines on how to analyze the mass of qualitative data, but face a lack of clear guidance for using particular analytic methods. The aim of this paper is to propose and discuss the Qualitative Analysis Guide of Leuven (QUAGOL), a guide that was developed in order to truly capture the rich insights of qualitative interview data. The article describes six major problems researchers often struggle with during the process of qualitative data analysis. Consequently, the QUAGOL is proposed as a guide to facilitate the process of analysis. Challenges that emerged and lessons learned from our own extensive experience with qualitative data analysis within the Grounded Theory Approach, as well as from the experiences of other researchers (as described in the literature), were discussed, and recommendations were presented. Strengths and pitfalls of the proposed method were discussed in detail. The Qualitative Analysis Guide of Leuven (QUAGOL) offers a comprehensive method to guide the process of qualitative data analysis. The process consists of two parts, each consisting of five stages. The method is systematic but not rigid. It is characterized by iterative processes of digging deeper, constantly moving between the various stages of the process. As such, it aims to stimulate the researcher's intuition and creativity as optimally as possible. The QUAGOL guide is a theory- and practice-based guide that supports and facilitates the process of analysis of qualitative interview data. Although the method can facilitate the process of analysis, it cannot guarantee automatic quality. The skills of the researcher and the quality of the research team remain the most crucial components of a successful process of analysis. Additionally, the importance of constantly moving between the various stages throughout the research process cannot be overstated. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. Universality and diversity of folding mechanics for three-helix bundle proteins.

    PubMed

    Yang, Jae Shick; Wallin, Stefan; Shakhnovich, Eugene I

    2008-01-22

    In this study we evaluate, at full atomic detail, the folding processes of two small helical proteins, the B domain of protein A and the Villin headpiece. Folding kinetics are studied by performing a large number of ab initio Monte Carlo folding simulations using a single transferable all-atom potential. Using these trajectories, we examine the relaxation behavior, secondary structure formation, and transition-state ensembles (TSEs) of the two proteins and compare our results with experimental data and previous computational studies. To obtain a detailed structural information on the folding dynamics viewed as an ensemble process, we perform a clustering analysis procedure based on graph theory. Moreover, rigorous p(fold) analysis is used to obtain representative samples of the TSEs and a good quantitative agreement between experimental and simulated Phi values is obtained for protein A. Phi values for Villin also are obtained and left as predictions to be tested by future experiments. Our analysis shows that the two-helix hairpin is a common partially stable structural motif that gets formed before entering the TSE in the studied proteins. These results together with our earlier study of Engrailed Homeodomain and recent experimental studies provide a comprehensive, atomic-level picture of folding mechanics of three-helix bundle proteins.

  12. Microspectroscopic Analysis of Anthropogenic- and Biogenic-Influenced Aerosol Particles during the SOAS Field Campaign

    NASA Astrophysics Data System (ADS)

    Ault, A. P.; Bondy, A. L.; Nhliziyo, M. V.; Bertman, S. B.; Pratt, K.; Shepson, P. B.

    2013-12-01

    During the summer, the southeastern United States experiences a cooling haze due to the interaction of anthropogenic and biogenic aerosol sources. An objective of the summer 2013 Southern Oxidant and Aerosol Study (SOAS) was to improve our understanding of how trace gases and aerosols contribute to this relative cooling through light scattering and absorption. Improving understanding of biogenic-anthropogenic interactions through secondary organic aerosol (SOA) formation on primary aerosol cores requires detailed physicochemical characterization of the particles after uptake and processing. Our measurements focus on single-particle analysis of aerosols in the accumulation mode (300-1000 nm) collected using a micro-orifice uniform deposit impactor (MOUDI) at the Centreville, Alabama SEARCH site. Particles were characterized using an array of microscopic and spectroscopic techniques, including: scanning electron microscopy (SEM), transmission electron microscopy (TEM), energy dispersive X-ray analysis (EDX), and Raman microspectroscopy. These analyses provide detailed information on particle size, morphology, elemental composition, and functional groups. This information is combined with mapping capabilities to explore individual-particle spatial patterns and how they impact structural characteristics. The improved understanding will be used to explore how sources and processing (such as SOA coating of soot) change particle structure (i.e., core-shell) and how the altered optical properties impact air quality/climate effects on a regional scale.

  13. The influence of polarization on millimeter wave propagation through rain. [radio signals

    NASA Technical Reports Server (NTRS)

    Bostian, C. W.; Stutzman, W. L.; Wiley, P. H.; Marshall, R. E.

    1973-01-01

    The measurement and analysis of the depolarization and attenuation that occur when millimeter wave radio signals propagate through rain are described. Progress was made in three major areas: the processing of recorded 1972 data, acquisition and processing of a large amount of 1973 data, and the development of a new theoretical model to predict rain cross polarization and attenuation. Each of these topics is described in detail along with radio frequency system design for cross polarization measurements.

  14. Comparing digital data processing techniques for surface mine and reclamation monitoring

    NASA Technical Reports Server (NTRS)

    Witt, R. G.; Bly, B. G.; Campbell, W. J.; Bloemer, H. H. L.; Brumfield, J. O.

    1982-01-01

    The results of three techniques used for processing Landsat digital data are compared for their utility in delineating areas of surface mining and subsequent reclamation. An unsupervised clustering algorithm (ISOCLS), a maximum-likelihood classifier (CLASFY), and a hybrid approach utilizing canonical analysis (ISOCLS/KLTRANS/ISOCLS) were compared by means of a detailed accuracy assessment with aerial photography at NASA's Goddard Space Flight Center. Results show that the hybrid approach was superior to the traditional techniques in distinguishing strip mined and reclaimed areas.

  15. Coup d’Oeil: Military Geography and the Operational Level of War

    DTIC Science & Technology

    1991-05-16

    COUP D'OEIL. "Every day I feel more and more in need of an atlas, as geography in the minutest details is essential to a true military education." I… categorizing terrain have provided the essential prerequisites for the development of the IPB process. The process allows for an in-depth technical analysis of… theater… which define the lines of operation… essential to the commander's plan… defined by a competent authority. CENTER OF GRAVITY: Center of…

  16. Ageing management of french NPP civil work structures

    NASA Astrophysics Data System (ADS)

    Gallitre, E.; Dauffer, D.

    2011-04-01

    This paper presents EDF practice in concrete structure ageing management, from the analysis of mechanisms to the formal procedure that allows the French company to extend 900 MWe NPP lifetime to 40 years; it also introduces the action plan for a 60-year lifetime extension. This practice is based on a methodology which identifies every ageing mechanism: both plant feedback and the state of the art are screened, and conclusions are drawn up in an "ageing analysis data sheet". That leads at first to a collection of 57 data sheets which give the mechanism identification, the components concerned, and an analysis grid designed to assess the safety risk. This analysis screens the reference documents describing the mechanism, the design lifetime hypotheses, the associated regulation or codification, the feedback experience, the accessibility, the maintenance actions, the repair possibility, and so on. The analysis has to lead to a conclusion about the risk, taking monitoring and maintenance into account. If the data sheet conclusion is not clear enough, a more detailed report is launched: a formal detailed report which summarizes all theoretical knowledge and monitoring data, with the objective of proposing a solution for ageing management; this solution can include more inspections, specific research and development, or additional maintenance. After a first stage on the 900 MWe units, only two generic ageing management detailed reports have been needed for the civil engineering part: one on the reactor building containment, and one on other structures, which focuses on concrete swelling reactions. The second stage consists of applying this generic analysis (ageing mechanisms and detailed reports) to every plant where a complete ageing report is required (one report for all equipment and structures of the plant, but specific to each reactor). This ageing management is a continuous process, because the set of 57 generic data sheets is updated every year and the detailed generic reports every five years. Beyond this 40-year lifetime extension, EDF is preparing a 60-year lifetime action plan which includes R&D actions, specific industrial studies, and monitoring improvements.

  17. Mathematical Analysis and Optimization of Infiltration Processes

    NASA Technical Reports Server (NTRS)

    Chang, H.-C.; Gottlieb, D.; Marion, M.; Sheldon, B. W.

    1997-01-01

    A variety of infiltration techniques can be used to fabricate solid materials, particularly composites. In general these processes can be described with at least one time dependent partial differential equation describing the evolution of the solid phase, coupled to one or more partial differential equations describing mass transport through a porous structure. This paper presents a detailed mathematical analysis of a relatively simple set of equations which is used to describe chemical vapor infiltration. The results demonstrate that the process is controlled by only two parameters, alpha and beta. The optimization problem associated with minimizing the infiltration time is also considered. Allowing alpha and beta to vary with time leads to significant reductions in the infiltration time, compared with the conventional case where alpha and beta are treated as constants.
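
    As a toy illustration of this structure only (not the paper's equations), the sketch below couples a quasi-steady reactant diffusion equation through a porous slab to a slow deposition law that closes the porosity. Here alpha plays the role of a reaction-to-diffusion (Thiele-like) modulus and beta a deposition-to-diffusion time-scale ratio; both are assumed nondimensional placeholders.

    ```python
    import numpy as np

    def infiltrate(alpha=2.0, beta=0.02, nx=101, dt=0.1, steps=400):
        """Toy nondimensional infiltration model (illustrative only).

        Quasi-steady reactant diffusion, c'' = alpha**2 * c / eps, with
        c(0) = 1 and zero flux at x = 1, coupled to slow solid deposition
        that reduces the porosity eps at rate beta * c.
        """
        h = 1.0 / (nx - 1)
        eps = np.ones(nx)                       # porosity, initially open
        c = np.ones(nx)
        for _ in range(steps):
            # Jacobi relaxation of the quasi-steady diffusion equation
            for _ in range(200):
                c[1:-1] = (c[2:] + c[:-2]) / (2.0 + h**2 * alpha**2 / eps[1:-1])
                c[0], c[-1] = 1.0, c[-2]        # fixed inlet, zero-flux far end
            eps = np.clip(eps - dt * beta * c, 1e-3, 1.0)
        return eps

    eps = infiltrate()
    print(f"porosity: surface {eps[0]:.3f}, interior {eps[-1]:.3f}")
    ```

    The toy model reproduces the qualitative behavior that motivates the optimization: reactant is consumed near the surface, so the surface pores seal before the interior densifies.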

  18. Fully-coupled analysis of jet mixing problems. Part 1. Shock-capturing model, SCIPVIS

    NASA Technical Reports Server (NTRS)

    Dash, S. M.; Wolf, D. E.

    1984-01-01

    A computational model, SCIPVIS, is described which predicts the multiple cell shock structure in imperfectly expanded, turbulent, axisymmetric jets. The model spatially integrates the parabolized Navier-Stokes jet mixing equations using a shock-capturing approach in supersonic flow regions and a pressure-split approximation in subsonic flow regions. The regions are coupled using a viscous-characteristic procedure. Turbulence processes are represented via the solution of compressibility-corrected two-equation turbulence models. The formation of Mach discs in the jet and the interactive analysis of the wake-like mixing process occurring behind Mach discs are handled in a rigorous manner. Calculations are presented exhibiting the fundamental interactive processes occurring in supersonic jets, and the model is assessed via comparisons with detailed laboratory data for a variety of under- and overexpanded jets.

  19. High efficiency processing for reduced amplitude zones detection in the HRECG signal

    NASA Astrophysics Data System (ADS)

    Dugarte, N.; Álvarez, A.; Balacco, J.; Mercado, G.; Gonzalez, A.; Dugarte, E.; Olivares, A.

    2016-04-01

    This article presents part of a broader research effort, proposed over the medium to long term, with the intention of establishing a new philosophy of surface electrocardiogram analysis. The research aims to find indicators of cardiovascular disease at an early stage that may go unnoticed with conventional electrocardiography. This paper reports the development of processing software which combines some existing techniques and incorporates novel methods for the detection of reduced amplitude zones (RAZ) in the high resolution electrocardiographic signal (HRECG). The algorithm consists of three stages: efficient processing for QRS detection, an averaging filter using correlation techniques, and a RAZ detection step. Preliminary results show the efficiency of the system and point to the incorporation of new signal analysis techniques involving the 12 leads.
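
    The first stage, QRS detection, anchors the averaging that follows. A minimal Pan-Tompkins-style sketch of that stage is given below; it is a generic SciPy illustration under assumed parameters (band edges, integration window, threshold), not the authors' implementation.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def detect_qrs(ecg, fs):
        """Pan-Tompkins-style QRS detector sketch (illustrative only).

        ecg: 1-D signal array; fs: sampling rate in Hz.
        Returns sample indices of detected R peaks.
        """
        # 1. band-pass 5-15 Hz to emphasize the QRS complex
        b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, ecg)
        # 2. differentiate, square, and integrate over a ~150 ms window
        win = int(0.15 * fs)
        feature = np.convolve(np.diff(filtered) ** 2, np.ones(win) / win, mode="same")
        # 3. fixed threshold with a 200 ms refractory period (assumed values)
        threshold = 0.3 * feature.max()
        peaks, last = [], -int(0.2 * fs)
        for i in range(1, len(feature) - 1):
            if (feature[i] > threshold and feature[i] >= feature[i - 1]
                    and feature[i] > feature[i + 1] and i - last > 0.2 * fs):
                peaks.append(i)
                last = i
        return np.array(peaks)
    ```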

  20. Soil Components in Heterogeneous Impact Glass in Martian Meteorite EETA79001

    NASA Technical Reports Server (NTRS)

    Schrader, C. M.; Cohen, B. A.; Donovan, J. J.; Vicenzi, E. P.

    2010-01-01

    Martian soil composition can illuminate past and ongoing near-surface processes such as impact gardening [2] and hydrothermal and volcanic activity [3,4]. Though the Mars Exploration Rovers (MER) have analyzed the major-element composition of Martian soils, no soil samples have been returned to Earth for detailed chemical analysis. Rao et al. [1] suggested that Martian meteorite EETA79001 contains melted Martian soil in its impact glass (Lithology C) based on sulfur enrichment of Lithology C relative to the meteorite's basaltic lithologies (A and B) [1,2]. If true, it may be possible to extract detailed soil chemical analyses using this meteoritic sample. We conducted high-resolution (0.3 μm/pixel) element mapping of Lithology C in thin section EETA79001,18 by energy dispersive spectrometry (EDS). We use these data for principal component analysis (PCA).
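
    PCA on co-registered element maps treats each pixel as a vector of elemental intensities and finds the combinations that explain the most variance, for example separating basaltic components from a sulfur-enriched soil component. A generic NumPy sketch of this step (not the authors' pipeline) follows.

    ```python
    import numpy as np

    def pca_element_maps(maps):
        """PCA of co-registered element maps (generic sketch).

        maps: array of shape (n_elements, height, width), one EDS count
        map per element. Returns principal-component score images and
        the component variances, strongest first.
        """
        n, h, w = maps.shape
        X = maps.reshape(n, -1).T                 # pixels x elements
        X = X - X.mean(axis=0)
        # eigendecomposition of the element-element covariance matrix
        cov = np.cov(X, rowvar=False)
        vals, vecs = np.linalg.eigh(cov)
        order = np.argsort(vals)[::-1]
        scores = X @ vecs[:, order]
        return scores.T.reshape(n, h, w), vals[order]
    ```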

  1. Optimization process planning using hybrid genetic algorithm and intelligent search for job shop machining.

    PubMed

    Salehi, Mojtaba; Bahreininejad, Ardeshir

    2011-08-01

    Optimization of process planning is considered the key technology for computer-aided process planning, which is a rather complex and difficult procedure. A good process plan of a part is built up from two elements: (1) the optimized sequence of the operations of the part; and (2) the optimized selection of the machine, cutting tool and Tool Access Direction (TAD) for each operation. In the present work, process planning is divided into preliminary planning and secondary/detailed planning. In the preliminary stage, the feasible sequences are generated based on the analysis of order and clustering constraints as a compulsive constraint aggregation in operation sequencing, using an intelligent searching strategy. Then, in the detailed planning stage, a genetic algorithm which prunes the initial feasible sequences yields the optimized operation sequence and the optimized selection of the machine, cutting tool and TAD for each operation, based on optimization constraints as an additive constraint aggregation. The main contribution of this work is the simultaneous optimization of the operation sequence and of the machine, cutting tool and TAD selection for each operation, using intelligent search and a genetic algorithm.
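
    As a rough illustration of the detailed-planning stage, the sketch below evolves operation sequences under precedence constraints with a toy permutation GA. It is generic Python, not the authors' algorithm: the repair step assumes the precedence pairs form a DAG, and the cost function is supplied by the caller (e.g., accumulated machine/tool/TAD change penalties).

    ```python
    import random

    def repair(seq, prec):
        """Stable-reorder seq so every (a, b) pair in prec has a before b.

        Assumes the precedence pairs form a DAG (no cyclic constraints).
        """
        out, remaining = [], list(seq)
        while remaining:
            for op in remaining:
                if all(a in out for (a, b) in prec if b == op):
                    out.append(op)
                    remaining.remove(op)
                    break
        return out

    def ga_sequence(ops, prec, cost, pop=40, gens=200):
        """Toy permutation GA for operation sequencing (illustrative only)."""
        popn = [repair(random.sample(ops, len(ops)), prec) for _ in range(pop)]
        for _ in range(gens):
            popn.sort(key=cost)
            parents = popn[: pop // 2]            # elitist selection
            children = []
            while len(parents) + len(children) < pop:
                p1, p2 = random.sample(parents, 2)
                cut = random.randrange(1, len(ops))
                head = p1[:cut]                   # one-point order crossover
                children.append(repair(head + [o for o in p2 if o not in head], prec))
            popn = parents + children
        return min(popn, key=cost)
    ```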

  2. Optimization process planning using hybrid genetic algorithm and intelligent search for job shop machining

    PubMed Central

    Salehi, Mojtaba

    2010-01-01

    Optimization of process planning is considered the key technology for computer-aided process planning, which is a rather complex and difficult procedure. A good process plan of a part is built up from two elements: (1) the optimized sequence of the operations of the part; and (2) the optimized selection of the machine, cutting tool and Tool Access Direction (TAD) for each operation. In the present work, process planning is divided into preliminary planning and secondary/detailed planning. In the preliminary stage, the feasible sequences are generated based on the analysis of order and clustering constraints as a compulsive constraint aggregation in operation sequencing, using an intelligent searching strategy. Then, in the detailed planning stage, a genetic algorithm which prunes the initial feasible sequences yields the optimized operation sequence and the optimized selection of the machine, cutting tool and TAD for each operation, based on optimization constraints as an additive constraint aggregation. The main contribution of this work is the simultaneous optimization of the operation sequence and of the machine, cutting tool and TAD selection for each operation, using intelligent search and a genetic algorithm. PMID:21845020

  3. Practitioner Expectations and Experiences with the Certificate IV in Training and Assessment (TAA40104): Support Document

    ERIC Educational Resources Information Center

    Clayton, Berwyn; Meyers, Dave; Bateman, Andrea; Bluer, Robert

    2010-01-01

    This document supports the report "Practitioner Expectations and Experiences with the Certificate IV in Training and Assessment (TAA40104)". The first section outlines the methodology used to undertake the research and covers the design of the research, sample details, data collection processes and the strategy for data analysis and…

  4. A Process Evaluation of Project Developmental Continuity. Interim Report VI: Executive Summary. Recommendations for Continuing the Impact Study.

    ERIC Educational Resources Information Center

    Granville, Arthur C.; Love, John M.

    This brief report summarizes the analysis and conclusions presented in detail in Interim Report VI regarding the feasibility of conducting a longitudinal study of Project Developmental Continuity (PDC). This project is a Head Start demonstration program aimed at providing educational and developmental continuity between children's Head Start and…

  5. Stories and Gossip in English: The Macro-Structure of Casual Talk.

    ERIC Educational Resources Information Center

    Slade, Diana

    1997-01-01

    A discussion of two text-types commonly occurring in casual conversation, stories and gossip, (1) details four kinds of stories told in casual talk, (2) demonstrates that gossip is a culturally-determined process with a distinctive structure, and (3) considers implications for teaching English-as-a- Second-Language. Analysis is based on over three…

  6. Overview of the Enhanced Natural Gestures Instructional Approach and Illustration of Its Use with Three Students with Angelman Syndrome

    ERIC Educational Resources Information Center

    Calculator, Stephen; Diaz-Caneja Sela, Patricia

    2015-01-01

    Background: This investigation details procedures used to teach enhanced natural gestures (ENGs) and illustrates its use with three students with Angelman syndrome (AS). Materials and Methods: Themes were extracted, using a process of content analysis, to organize individuals' feedback pertaining to previous versions of the instructional…

  7. Early results from the ultra heavy cosmic ray experiment

    NASA Technical Reports Server (NTRS)

    Osullivan, D.; Thompson, A.; Bosch, J.; Keegan, R.; Wenzel, K.-P.; Jansen, F.; Domingo, C.

    1995-01-01

    Data extraction and analysis of the LDEF Ultra Heavy Cosmic Ray Experiment are continuing. Almost twice the pre-LDEF world sample has been investigated, and some details of the charge spectrum in the region from Z of approximately 70 up to and including the actinides are presented. The early results indicate r-process enhancement over solar system source abundances.

  8. Managers in the Making: Careers, Development and Control in Corporate Britain and Japan.

    ERIC Educational Resources Information Center

    Storey, John; Edwards, Paul; Sisson, Keith

    This book presents an analysis of the processes by which managers are made in Britain and Japan. It provides a detailed comparative study of the careers, training, developmental experience, and job demands of managers in eight companies in four sectors: engineering, banking, retail, and communications. Data are from the following sources:…

  9. Citizenship by Design: Art and Identity in the Early Republic

    ERIC Educational Resources Information Center

    Sienkewicz, Julia A.

    2009-01-01

    This dissertation offers a study of the ways in which some artists active in the United States between 1790 and 1850 theorized that their work could participate in the process of creating and shaping the nation's citizens. Through the detailed analysis of four artists--Benjamin Henry Latrobe (1764-1820), Charles Willson Peale (1741-1827), Thomas…

  10. Defining and Applying Limits for Test and Flight Through the Project Lifecycle GSFC Standard. [Scope: Non-Cryogenic Systems Tested in Vacuum

    NASA Technical Reports Server (NTRS)

    Mosier, Carol

    2015-01-01

    The presentation will be given at the Annual Thermal Fluids Analysis Workshop (TFAWS 2015, NCTS 21070-15) hosted by the Goddard Space Flight Center (GSFC) Thermal Engineering Branch (Code 545). The PowerPoint presentation details the process of defining limits throughout the lifecycle of a flight project.

  11. Upward Mobility Programs in the Service Sector for Disadvantaged and Dislocated Workers. Volume II: Technical Appendices.

    ERIC Educational Resources Information Center

    Tao, Fumiyo; And Others

    This volume contains technical and supporting materials that supplement Volume I, which describes upward mobility programs for disadvantaged and dislocated workers in the service sector. Appendix A is a detailed description of the project methodology, including data collection methods and information on data compilation, processing, and analysis.…

  12. Reading: Tests and Assessment Techniques. Second Edition. United Kingdom Reading Association Teaching of Reading Monograph Series.

    ERIC Educational Resources Information Center

    Pumfrey, Peter D.

    The second edition of this British publication provides details of recent developments in the assessment of reading attainments and the analysis of reading processes. The book begins with a description of various types of reading tests and assessment techniques with consideration given to the purposes for which normative, criterion-referenced, and…

  13. An Analysis of Conceptual Flow Patterns and Structures in the Physics Classroom

    ERIC Educational Resources Information Center

    Eshach, Haim

    2010-01-01

    The aim of the current research is to characterize the conceptual flow processes occurring in whole-class dialogic discussions with a high level of interanimation; in the present case, of a high-school class learning about image creation on plane mirrors. Using detailed chains of interaction and conceptual flow discourse maps--both developed for…

  14. Analysis of pultrusion processing for long fiber reinforced thermoplastic composite system

    NASA Technical Reports Server (NTRS)

    Tso, W.; Hou, T. H.; Tiwari, S. N.

    1993-01-01

    Pultrusion is a composite processing technology commonly recognized as a simple and cost-effective means of manufacturing fiber-reinforced, resin matrix composite parts with regular geometries. Previously, because the majority of pultruded composite parts were made with thermosetting resin matrices, analysis of the process emphasized the conservation of energy from various sources, such as heat conduction and the curing kinetics of the resin system. Analysis of the flow aspect of the process was almost absent in the literature on thermosetting processing. With the increasing use of thermoplastic materials, it is desirable to obtain the detailed velocity and pressure profiles inside the pultrusion die. Using a modified Darcy's law for flow through porous media, closed-form analytical solutions for the velocity and pressure distributions inside the pultrusion die are obtained for the first time. This enables estimation of the magnitude of viscous dissipation and its effects on the pultruded parts. Pulling forces required in pultrusion processing are also analyzed. The analytical model derived in this study can be used to advance knowledge and control of the pultrusion process for fiber-reinforced thermoplastic composite parts.
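
    For orientation, Darcy's law relates the pressure gradient to the superficial velocity through the permeability, dp/dx = (mu / K) * u. The sketch below integrates this plain 1-D relation along a die; it uses made-up property values and is not the paper's modified Darcy formulation.

    ```python
    import numpy as np

    def darcy_pressure_drop(u=0.005, mu=100.0, K=1e-8, L=0.5, n=50):
        """Pressure needed to drive superficial velocity u through a die.

        Plain 1-D Darcy's law, dp/dx = (mu / K) * u, integrated over die
        length L. All values are illustrative SI numbers (m/s, Pa*s, m^2, m).
        """
        x = np.linspace(0.0, L, n)
        dp = (mu / K) * u * x      # cumulative pressure drop from the inlet
        return x, dp

    x, dp = darcy_pressure_drop()
    print(f"total pressure drop across the die: {dp[-1]:.3e} Pa")
    ```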

  15. The integration methods of fuzzy fault mode and effect analysis and fault tree analysis for risk analysis of yogurt production

    NASA Astrophysics Data System (ADS)

    Aprilia, Ayu Rizky; Santoso, Imam; Ekasari, Dhita Murita

    2017-05-01

    Yogurt is a milk-based product with beneficial effects for health. The process for the production of yogurt is very susceptible to failure because it involves bacteria and fermentation. For an industry, these risks may cause harm and have a negative impact. For a product to be successful and profitable, the risks that may occur during the production process must be analyzed. Risk analysis can identify risks in detail, prevent them, and determine their handling, so that they can be minimized. This study therefore analyzes the risks of the production process with a case study in CV.XYZ. The methods used in this research are fuzzy Failure Mode and Effect Analysis (fuzzy FMEA) and Fault Tree Analysis (FTA). The results showed that there are six risks arising from equipment, raw material, and process variables. These include the critical risk of a lack of an aseptic process, specifically damage to the yogurt starter due to contamination by fungus or other bacteria, and a lack of equipment sanitation. The quantitative FTA showed that the highest probability, 3.902%, is that of a lack of an aseptic process. The recommendations for improvement include establishing Standard Operating Procedures (SOPs) covering the process, workers, and environment; controlling the yogurt starter; and improving production planning and equipment sanitation using hot water immersion.
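
    In an FTA, the top-event probability is assembled from basic-event probabilities through AND/OR gates. The sketch below shows the standard gate formulas under an independence assumption; the two basic-event probabilities are hypothetical placeholders, not figures from the CV.XYZ study.

    ```python
    # Minimal fault-tree evaluation sketch (illustrative only).

    def or_gate(*p):
        """P(at least one input event occurs), assuming independence."""
        q = 1.0
        for pi in p:
            q *= (1.0 - pi)
        return 1.0 - q

    def and_gate(*p):
        """P(all input events occur), assuming independence."""
        q = 1.0
        for pi in p:
            q *= pi
        return q

    # top event: non-aseptic process, from two hypothetical basic events
    p_contaminated_starter = 0.025
    p_poor_sanitation = 0.015
    print(f"P(non-aseptic process) = {or_gate(p_contaminated_starter, p_poor_sanitation):.3%}")
    ```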

  16. Orbiter data reduction complex data processing requirements for the OFT mission evaluation team (level C)

    NASA Technical Reports Server (NTRS)

    1979-01-01

    This document addresses requirements for post-test data reduction in support of the Orbital Flight Tests (OFT) mission evaluation team, specifically those which are planned to be implemented in the ODRC (Orbiter Data Reduction Complex). Only those requirements which have been previously baselined by the Data Systems and Analysis Directorate configuration control board are included. This document serves as the control document between Institutional Data Systems Division and the Integration Division for OFT mission evaluation data processing requirements, and shall be the basis for detailed design of ODRC data processing systems.

  17. HiCUP: pipeline for mapping and processing Hi-C data.

    PubMed

    Wingett, Steven; Ewels, Philip; Furlan-Magaril, Mayra; Nagano, Takashi; Schoenfelder, Stefan; Fraser, Peter; Andrews, Simon

    2015-01-01

    HiCUP is a pipeline for processing sequence data generated by Hi-C and Capture Hi-C (CHi-C) experiments, which are techniques used to investigate three-dimensional genomic organisation. The pipeline maps data to a specified reference genome and removes artefacts that would otherwise hinder subsequent analysis. HiCUP also produces an easy-to-interpret yet detailed quality control (QC) report that assists in refining experimental protocols for future studies. The software is freely available and has already been used for processing Hi-C and CHi-C data in several recently published peer-reviewed studies.

  18. Coupled Loads Analysis of the Modified NASA Barge Pegasus and Space Launch System Hardware

    NASA Technical Reports Server (NTRS)

    Knight, J. Brent

    2015-01-01

    A Coupled Loads Analysis (CLA) has been performed for barge transport of Space Launch System hardware on the recently modified NASA barge Pegasus. The barge re-design was facilitated with detailed finite element analyses by the Army Corps of Engineers Marine Design Center. The Finite Element Model (FEM) utilized in the design was also used in the subject CLA. The Pegasus FEM and CLA results are presented, as well as a comparison of the analysis process to that of a payload being transported to space via the Space Shuttle. Discussion of the dynamic forcing functions is included as well. The process of performing a dynamic CLA of NASA hardware during marine transport is thought to be a first and can likely support minimization of undue conservatism.

  19. Validating data analysis of broadband laser ranging

    NASA Astrophysics Data System (ADS)

    Rhodes, M.; Catenacci, J.; Howard, M.; La Lone, B.; Kostinski, N.; Perry, D.; Bennett, C.; Patterson, J.

    2018-03-01

    Broadband laser ranging combines spectral interferometry and a dispersive Fourier transform to achieve high-repetition-rate measurements of the position of a moving surface. Telecommunications fiber is a convenient tool for generating the large linear dispersions required for a dispersive Fourier transform, but standard fiber also has higher-order dispersion that distorts the Fourier transform. Imperfections in the dispersive Fourier transform significantly complicate the ranging signal and must be dealt with to make high-precision measurements. We describe in detail an analysis process for interpreting ranging data when standard telecommunications fiber is used to perform an imperfect dispersive Fourier transform. This analysis process is experimentally validated over a 27-cm scan of static positions, showing an accuracy of 50 μm and a root-mean-square precision of 4.7 μm.
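
    In a dispersive Fourier transform, each spectral slice acquires a group delay tau(w) = (beta2*w + 0.5*beta3*w^2)*L; the beta3 term is the higher-order dispersion that distorts the time-to-frequency mapping. The sketch below quantifies that distortion and remaps a recorded trace onto a corrected frequency axis. The fiber parameters are generic telecom-like values and the procedure is an illustration, not the authors' analysis chain.

    ```python
    import numpy as np

    beta2 = -2.2e-26     # s^2/m, group-velocity dispersion (assumed value)
    beta3 = 1.3e-40      # s^3/m, third-order dispersion (the distortion)
    L = 10e3             # m of fiber (assumed)

    w = np.linspace(-3e12, 3e12, 4001) * 2 * np.pi   # angular-frequency offset
    tau = (beta2 * w + 0.5 * beta3 * w**2) * L       # arrival time of each slice
    spectrum = 1.0 + np.cos(w * 5e-12)               # toy spectral interferogram

    # The digitizer samples uniformly in time; assign every time sample its
    # true optical frequency by inverting the known mapping tau(w).
    order = np.argsort(tau)
    t_samples = np.linspace(tau.min(), tau.max(), w.size)
    w_corrected = np.interp(t_samples, tau[order], w[order])
    trace = np.interp(t_samples, tau[order], spectrum[order])
    # (w_corrected, trace) is now the fringe pattern on a distortion-free axis.

    print(f"peak beta3 timing distortion: {np.abs(tau - beta2 * L * w).max() * 1e12:.1f} ps")
    ```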

  20. Analyzing women's roles through graphic representation of narratives.

    PubMed

    Hall, Joanne M

    2003-08-01

    A 1992 triangulated international nursing study of women's health was reported. The researchers used the perspectives of feminism and symbolic interactionism, specifically role theory. A narrative analysis was done to clarify the concept of role integration. The narrative analysis was reported in 1992, but graphic/visual techniques used in the team dialogue process of narrative analysis were not reported due to space limitations. These techniques have not been reported elsewhere and thus remain innovative. Specific steps in the method are outlined here in detail as an audit trail. The process would be useful to other qualitative researchers as an exemplar of one novel way that verbal data can be abstracted visually/graphically. Suggestions are included for aspects of narrative, in addition to roles, that could be depicted graphically in qualitative research.

  1. Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis

    PubMed Central

    Steele, Joe; Bastola, Dhundy

    2014-01-01

    Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require an analysis by comparison for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools, which stem from alignment-free methods based on statistical analysis from word frequencies. We provide several clear examples to demonstrate applications and the interpretations over several different areas of alignment-free analysis such as base–base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic metric. Additionally, we provide detailed discussion and an example of analysis by Lempel–Ziv techniques from data compression. PMID:23904502
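
    As a concrete example of word-frequency comparison, the basic D2 statistic is simply the inner product of the k-mer (word) count vectors of two sequences; the centered and normalized variants used in practice add corrections on top of this. A minimal Python sketch:

    ```python
    from collections import Counter

    def kmer_counts(seq, k):
        """Count all overlapping k-mers (words) in a DNA sequence."""
        return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

    def d2(seq_a, seq_b, k=4):
        """Basic D2 statistic: inner product of k-mer count vectors.

        Larger values indicate more shared word content between the
        two sequences; no alignment is performed.
        """
        ca, cb = kmer_counts(seq_a, k), kmer_counts(seq_b, k)
        return sum(ca[w] * cb[w] for w in ca.keys() & cb.keys())

    print(d2("ACGTACGTTGCA", "ACGTTGCAACGT", k=3))
    ```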

  2. Practical considerations for obtaining high quality quantitative computed tomography data of the skeletal system.

    PubMed

    Troy, Karen L; Edwards, W Brent

    2018-05-01

    Quantitative CT (QCT) analysis involves the calculation of specific parameters such as bone volume and density from CT image data, and can be a powerful tool for understanding bone quality and quantity. However, without careful attention to detail during all steps of the acquisition and analysis process, data can be of poor- to unusable-quality. Good quality QCT for research requires meticulous attention to detail and standardization of all aspects of data collection and analysis to a degree that is uncommon in a clinical setting. Here, we review the literature to summarize practical and technical considerations for obtaining high quality QCT data, and provide examples of how each recommendation affects calculated variables. We also provide an overview of the QCT analysis technique to illustrate additional opportunities to improve data reproducibility and reliability. Key recommendations include: standardizing the scanner and data acquisition settings, minimizing image artifacts, selecting an appropriate reconstruction algorithm, and maximizing repeatability and objectivity during QCT analysis. The goal of the recommendations is to reduce potential sources of error throughout the analysis, from scan acquisition to the interpretation of results.

  3. Operational CryoSat Product Quality Assessment

    NASA Astrophysics Data System (ADS)

    Mannan, Rubinder; Webb, Erica; Hall, Amanda; Bouzinac, Catherine

    2013-12-01

    The performance and quality of the CryoSat data products are routinely assessed by the Instrument Data quality Evaluation and Analysis Service (IDEAS). This information is then conveyed to the scientific and user community in order to allow them to utilise CryoSat data with confidence. This paper presents details of the Quality Control (QC) activities performed for CryoSat products under the IDEAS contract. Details of the different QC procedures and tools deployed by IDEAS to assess the quality of operational data are presented. The latest updates to the Instrument Processing Facility (IPF) for the Fast Delivery Marine (FDM) products and the future update to Baseline-C are discussed.

  4. Information transfer satellite concept study. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    Bergin, P.; Kincade, C.; Kurpiewski, D.; Leinhaupel, F.; Millican, F.; Onstad, R.

    1971-01-01

    A wide range of information transfer demands were identified and analyzed. They were then combined into an appropriate set of requirements for satellite communication services. In this process the demands were ranked and combined into single and multipurpose satellite systems. A detailed analysis was performed on each satellite system to determine: total system cost, including both ground and space segments; sensitivities of the systems to various system tradeoffs; and forcing functions which control the system variations. A listing of candidate missions for detailed study is presented, along with a description of the conceptual system design and an identification of the technology developments required to bring these systems to fruition.

  5. Analysis of Existing Guidelines for the Systematic Planning Process of Clinical Registries.

    PubMed

    Löpprich, Martin; Knaup, Petra

    2016-01-01

    Clinical registries are a powerful method to observe clinical practice and natural disease history. In contrast to clinical trials, where guidelines and standardized methods exist and are mandatory, only a few initiatives have published methodological guidelines for clinical registries. The objective of this paper was to review these guidelines and systematically assess their completeness, usability and feasibility according to a SWOT analysis. The results show that each guideline has its own strengths and weaknesses. While one supports the systematic planning process, another discusses clinical registries in great detail. However, feasibility was mostly limited, and the special requirements of clinical registries (a flexible, expandable and adaptable technological structure) were not addressed consistently.

  6. Analysis and modeling of leakage current sensor under pulsating direct current

    NASA Astrophysics Data System (ADS)

    Li, Kui; Dai, Yihua; Wang, Yao; Niu, Feng; Chen, Zhao; Huang, Shaopo

    2017-05-01

    In this paper, the transformation characteristics of a current sensor under pulsating DC leakage current are investigated. A mathematical model of the current sensor is proposed to accurately describe the secondary-side current and excitation current. The transformation process of the current sensor is illustrated in detail and the transformation error is analyzed from multiple aspects. A simulation model is built and a sensor prototype is designed for comparative evaluation, and both simulation and experimental results are presented to verify the correctness of the theoretical analysis.

  7. Improving Image Drizzling in the HST Archive: Advanced Camera for Surveys

    NASA Astrophysics Data System (ADS)

    Hoffmann, Samantha L.; Avila, Roberto J.

    2017-06-01

    The Mikulski Archive for Space Telescopes (MAST) pipeline performs geometric distortion corrections, associated image combinations, and cosmic ray rejections with AstroDrizzle on Hubble Space Telescope (HST) data. The MDRIZTAB reference table contains a list of relevant parameters that controls this program. This document details our photometric analysis of Advanced Camera for Surveys Wide Field Channel (ACS/WFC) data processed by AstroDrizzle. Based on this analysis, we update the MDRIZTAB table to improve the quality of the drizzled products delivered by MAST.

  8. Analysis of microgravity space experiments Space Shuttle programmatic safety requirements

    NASA Technical Reports Server (NTRS)

    Terlep, Judith A.

    1996-01-01

    This report documents the results of an analysis of Space Shuttle programmatic safety requirements for microgravity space experiments and recommends the creation of a Safety Compliance Data Package (SCDP) template for both flight and ground processes. These templates detail the programmatic requirements necessary to produce a complete SCDP. The templates were developed from various NASA centers' requirements documents, previously written guidelines on safety data packages, and personal experience. The templates are included at the end of this report.

  9. Draft Environmental Impact Statement for Divert Activities and Exercises, Guam and Commonwealth of the Northern Mariana Islands

    DTIC Science & Technology

    2012-06-01

    information indicate that at least one designated use (e.g., recreation, support of aquatic life and coral reef conservation, fishing and the consumption... This EIS provides an analysis of environmental effects associated with the proposed action. The following summarizes the formal NEPA process... alternatives and the No Action Alternative have been summarized in Table ES-2. A detailed analysis of effects is provided in Chapter 4.

  10. An analysis of the sliding pressure start-up of SCWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, F.; Yang, J.; Li, H.

    In this paper, a preliminary sliding pressure start-up system and scheme for the supercritical water-cooled reactor developed by CGNPC (CGN-SCWR) are proposed. Thermal-hydraulic behavior during the start-up procedures was analyzed in detail using the advanced reactor subchannel analysis software ATHAS. The maximum cladding temperature (MCT) and core power of the fuel assembly during the whole start-up process were investigated comparatively. The results show that the recommended start-up scheme meets the design requirements from a thermal-hydraulic perspective. (authors)

  11. A detailed analysis of codon usage patterns and influencing factors in Zika virus.

    PubMed

    Singh, Niraj K; Tyagi, Anuj

    2017-07-01

    Recent outbreaks of Zika virus (ZIKV) in Africa, Latin America, Europe, and Southeast Asia have resulted in serious health concerns. To understand more about evolution and transmission of ZIKV, detailed codon usage analysis was performed for all available strains. A high effective number of codons (ENC) value indicated the presence of low codon usage bias in ZIKV. The effect of mutational pressure on codon usage bias was confirmed by significant correlations between nucleotide compositions at third codon positions and ENCs. Correlation analysis between Gravy values, Aroma values and nucleotide compositions at third codon positions also indicated some influence of natural selection. However, the low codon adaptation index (CAI) value of ZIKV with reference to human and mosquito indicated poor adaptation of ZIKV codon usage towards its hosts, signifying that natural selection has a weaker influence than mutational pressure. Additionally, relative dinucleotide frequencies, geographical distribution, and evolutionary processes also influenced the codon usage pattern to some extent.
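
    Codon usage statistics such as ENC and CAI all start from per-codon counts within synonymous families. The sketch below computes the related RSCU measure (observed codon count divided by the count expected under uniform synonym use) for a few families of the standard genetic code; it is a generic illustration with a truncated codon table, not the analysis pipeline used in the study.

    ```python
    from collections import Counter

    # Synonymous codon families for a few amino acids (standard code);
    # a full table would cover all amino acids with multiple codons.
    SYNONYMS = {
        "Leu": ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"],
        "Lys": ["AAA", "AAG"],
        "Gly": ["GGT", "GGC", "GGA", "GGG"],
    }

    def rscu(cds):
        """Relative synonymous codon usage of an in-frame coding sequence.

        RSCU = observed codon count / count expected if all synonyms
        for that amino acid were used equally often.
        """
        codons = Counter(cds[i:i + 3] for i in range(0, len(cds) - 2, 3))
        out = {}
        for aa, fam in SYNONYMS.items():
            total = sum(codons[c] for c in fam)
            if total:
                for c in fam:
                    out[c] = codons[c] * len(fam) / total
        return out
    ```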

  12. Cnn Based Retinal Image Upscaling Using Zero Component Analysis

    NASA Astrophysics Data System (ADS)

    Nasonov, A.; Chesnakov, K.; Krylov, A.

    2017-05-01

    The aim of this paper is to obtain high-quality image upscaling for the noisy images that are typical in medical image processing. A new training scenario for a convolutional neural network based image upscaling method is proposed. Its main idea is a novel dataset preparation method for deep learning. The dataset contains pairs of noisy low-resolution images and corresponding noiseless high-resolution images. To achieve better results at edges and textured areas, Zero Component Analysis is applied to these images. The upscaling results are compared with other state-of-the-art methods such as DCCI, SI-3 and SRCNN on noisy medical ophthalmological images. Objective evaluation of the results confirms the high quality of the proposed method. Visual analysis shows that fine details and structures such as blood vessels are preserved, the noise level is reduced, and no artifacts or non-existent details are added. These properties are essential in establishing a retinal diagnosis, so the proposed algorithm is recommended for use in real medical applications.
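
    Zero Component Analysis here refers to ZCA whitening, which decorrelates pixel features and equalizes their variances while keeping the result as close as possible to the original data, which is why whitened patches still look like (sharpened) images. A minimal NumPy sketch, generic rather than the paper's preprocessing code:

    ```python
    import numpy as np

    def zca_whiten(X, eps=1e-5):
        """ZCA whitening of a data matrix X (n_samples, n_features).

        Whitens with W = U diag(1/sqrt(S + eps)) U^T, where U, S come
        from the covariance eigendecomposition; eps regularizes small
        eigenvalues so noise is not amplified without bound.
        """
        Xc = X - X.mean(axis=0)
        cov = Xc.T @ Xc / (X.shape[0] - 1)
        U, S, _ = np.linalg.svd(cov)           # cov is symmetric: U == V
        W = U @ np.diag(1.0 / np.sqrt(S + eps)) @ U.T
        return Xc @ W
    ```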

  13. Global Analysis of Perovskite Photophysics Reveals Importance of Geminate Pathways

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manger, Lydia H.; Rowley, Matthew B.; Fu, Yongping

    Hybrid organic-inorganic perovskites demonstrate desirable photophysical behaviors and promising applications from efficient photovoltaics to lasing, but the fundamental nature of excited state species is still under debate. We also collected time-resolved photoluminescence of single-crystal nanoplates of methylammonium lead iodide perovskite (MAPbI3), with excitation over a range of fluences and repetition rates, to provide a more complete photophysical picture. A fundamentally different way of simulating the photophysics is developed that relies on unnormalized decays, global analysis over a large array of conditions, and inclusion of steady-state behavior; these details are critical to capturing observed behaviors. These additional constraints require inclusion of spatially-correlated pairs, along with free carriers and traps, demonstrating the importance of our comprehensive analysis. Modeling geminate and non-geminate pathways shows geminate processes are dominant at high carrier densities and early times. This combination of data and simulation provides a detailed picture of perovskite photophysics across multiple excitation regimes that was not previously available.

  14. Global Analysis of Perovskite Photophysics Reveals Importance of Geminate Pathways

    DOE PAGES

    Manger, Lydia H.; Rowley, Matthew B.; Fu, Yongping; ...

    2016-12-20

    Hybrid organic-inorganic perovskites demonstrate desirable photophysical behaviors and promising applications from efficient photovoltaics to lasing, but the fundamental nature of excited state species is still under debate. We also collected time-resolved photoluminescence of single-crystal nanoplates of methylammonium lead iodide perovskite (MAPbI3), with excitation over a range of fluences and repetition rates, to provide a more complete photophysical picture. A fundamentally different way of simulating the photophysics is developed that relies on unnormalized decays, global analysis over a large array of conditions, and inclusion of steady-state behavior; these details are critical to capturing observed behaviors. These additional constraints require inclusion of spatially-correlated pairs, along with free carriers and traps, demonstrating the importance of our comprehensive analysis. Modeling geminate and non-geminate pathways shows geminate processes are dominant at high carrier densities and early times. This combination of data and simulation provides a detailed picture of perovskite photophysics across multiple excitation regimes that was not previously available.

  15. Process-based, morphodynamic hindcast of decadal deposition patterns in San Pablo Bay, California, 1856-1887

    USGS Publications Warehouse

    van der Wegen, M.; Jaffe, B.E.; Roelvink, J.A.

    2011-01-01

    This study investigates the possibility of hindcasting-observed decadal-scale morphologic change in San Pablo Bay, a subembayment of the San Francisco Estuary, California, USA, by means of a 3-D numerical model (Delft3D). The hindcast period, 1856-1887, is characterized by upstream hydraulic mining that resulted in a high sediment input to the estuary. The model includes wind waves, salt water and fresh water interactions, and graded sediment transport, among others. Simplified initial conditions and hydrodynamic forcing were necessary because detailed historic descriptions were lacking. Model results show significant skill. The river discharge and sediment concentration have a strong positive influence on deposition volumes. Waves decrease deposition rates and have, together with tidal movement, the greatest effect on sediment distribution within San Pablo Bay. The applied process-based (or reductionist) modeling approach is valuable once reasonable values for model parameters and hydrodynamic forcing are obtained. Sensitivity analysis reveals the dominant forcing of the system and suggests that the model planform plays a dominant role in the morphodynamic development. A detailed physical explanation of the model outcomes is difficult because of the high nonlinearity of the processes. Process formulation refinement, a more detailed description of the forcing, or further model parameter variations may lead to an enhanced model performance, albeit to a limited extent. The approach potentially provides a sound basis for prediction of future developments. Parallel use of highly schematized box models and a process-based approach as described in the present work is probably the most valuable method to assess decadal morphodynamic development.

  16. Forty years of temporal analysis of products

    DOE PAGES

    Morgan, K.; Maguire, N.; Fushimi, R.; ...

    2017-05-16

    Detailed understanding of mechanisms and reaction kinetics are required in order to develop and optimize catalysts and catalytic processes. While steady state investigations are known to give a global view of the catalytic system, transient studies are invaluable since they can provide more detailed insight into elementary steps. For almost thirty years temporal analysis of products (TAP) has been successfully utilized for transient studies of gas phase heterogeneous catalysis, and there have been a number of advances in instrumentation and numerical modeling methods in that time. In the current work, the range of available TAP apparatus will be discussed, while detailed explanations of the types of TAP experiment, the information that can be determined from these experiments and the analysis methods are also included. TAP is a complex methodology and is often viewed as a niche specialty. Here, part of the intention of this work is to highlight the significant contributions TAP can make to catalytic research, while also discussing the issues which will make TAP more relevant and approachable to a wider segment of the catalytic research community. With this in mind, an outlook is also disclosed for the technique in terms of what is needed to revitalize the field and make it more applicable to the recent advances in catalyst characterization (e.g. operando modes).

  17. Forty years of temporal analysis of products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morgan, K.; Maguire, N.; Fushimi, R.

    Detailed understanding of mechanisms and reaction kinetics are required in order to develop and optimize catalysts and catalytic processes. While steady state investigations are known to give a global view of the catalytic system, transient studies are invaluable since they can provide more detailed insight into elementary steps. For almost thirty years temporal analysis of products (TAP) has been successfully utilized for transient studies of gas phase heterogeneous catalysis, and there have been a number of advances in instrumentation and numerical modeling methods in that time. In the current work, the range of available TAP apparatus will be discussed, while detailed explanations of the types of TAP experiment, the information that can be determined from these experiments and the analysis methods are also included. TAP is a complex methodology and is often viewed as a niche specialty. Here, part of the intention of this work is to highlight the significant contributions TAP can make to catalytic research, while also discussing the issues which will make TAP more relevant and approachable to a wider segment of the catalytic research community. With this in mind, an outlook is also disclosed for the technique in terms of what is needed to revitalize the field and make it more applicable to the recent advances in catalyst characterization (e.g. operando modes).

  18. A new automated assessment method for contrast-detail images by applying support vector machine and its robustness to nonlinear image processing.

    PubMed

    Takei, Takaaki; Ikeda, Mitsuru; Imai, Kuniharu; Yamauchi-Kawaura, Chiyo; Kato, Katsuhiko; Isoda, Haruo

    2013-09-01

    The automated contrast-detail (C-D) analysis methods developed so far cannot be expected to work well on images processed with nonlinear methods, such as noise reduction. We have therefore devised a new automated C-D analysis method applying a support vector machine (SVM) and tested its robustness to nonlinear image processing. We acquired CDRAD (a commercially available C-D test object) images at a tube voltage of 120 kV and milliampere-second products (mAs) of 0.5-5.0. A partial differential equation based diffusion technique was used as the noise reduction method. Three radiologists and three university students participated in the observer performance study. The training data for our SVM method were the classification data scored by one radiologist for the CDRAD images acquired at 1.6 and 3.2 mAs and their noise-reduced versions. We also compared the performance of our SVM method with the CDRAD Analyser algorithm. The mean C-D diagrams (plots of the mean smallest visible hole diameter versus hole depth) obtained with our SVM method agreed well with those averaged across the six human observers for both original and noise-reduced CDRAD images, whereas the mean C-D diagrams from the CDRAD Analyser algorithm disagreed with those from the human observers for both original and noise-reduced CDRAD images. In conclusion, our proposed SVM method for C-D analysis will work well for images processed with the nonlinear noise reduction method as well as for original radiographic images.
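
    The classifier at the core of the method maps per-cell image features to a visible/not-visible decision learned from one observer's scores. A schematic scikit-learn sketch is below; the synthetic features, labels, and kernel settings are placeholder assumptions, not the study's actual inputs.

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Hypothetical problem shape: one feature vector per CDRAD cell
    # (e.g., local contrast and noise statistics) with a binary label
    # "hole visible" / "not visible" scored by the reference observer.
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(500, 8))                # placeholder features
    y_train = (X_train[:, 0] + 0.5 * X_train[:, 1] > 0).astype(int)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    clf.fit(X_train, y_train)
    visible = clf.predict(rng.normal(size=(10, 8)))    # per-cell visibility calls
    print(visible)
    ```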

  19. The Numerical Propulsion System Simulation: A Multidisciplinary Design System for Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    1999-01-01

    Advances in computational technology and in physics-based modeling are making large scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of the design process and to provide the designer with critical information about the components early in the design process. This paper describes the development of the Numerical Propulsion System Simulation (NPSS), a multidisciplinary system of analysis tools that is focused on extending the simulation capability from components to the full system. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  20. DFT-derived reactive potentials for the simulation of activated processes: the case of CdTe and CdTe:S.

    PubMed

    Hu, Xiao Liang; Ciaglia, Riccardo; Pietrucci, Fabio; Gallet, Grégoire A; Andreoni, Wanda

    2014-06-19

    We introduce a new ab initio derived reactive potential for the simulation of CdTe within density functional theory (DFT) and apply it to calculate both static and dynamical properties of a number of systems (bulk solid, defective structures, liquid, surfaces) at finite temperature. In particular, we also consider cases with low sulfur concentration (CdTe:S). The analysis of DFT and classical molecular dynamics (MD) simulations performed with the same protocol leads to stringent performance tests and to a detailed comparison of the two schemes. Metadynamics techniques are used to empower both Car-Parrinello and classical molecular dynamics for the simulation of activated processes. For the latter, we consider surface reconstruction and sulfur diffusion in the bulk. The same procedures are applied using previously proposed force fields for CdTe and CdTeS materials, thus allowing for a detailed comparison of the various schemes.

  1. The General Mission Analysis Tool (GMAT) System Test Plan

    NASA Technical Reports Server (NTRS)

    Conway, Darrel J.; Hughes, Steven P.

    2007-01-01

    This document serves as the System Test Approach for the GMAT Project. Preparation for system testing consists of three major stages: 1) The Test Approach sets the scope of system testing, the overall strategy to be adopted, the activities to be completed, the general resources required, and the methods and processes to be used to test the release. 2) Test Planning details the activities, dependencies and effort required to conduct the System Test. 3) Test Cases documents the tests to be applied, the data to be processed, the automated testing coverage and the expected results. This document covers the first two of these items and establishes the framework used for GMAT test case development. The test cases themselves exist as separate components, and are managed outside of and concurrently with this System Test Plan.

  2. The McDonald Observatory lunar laser ranging project

    NASA Technical Reports Server (NTRS)

    Silverberg, E. C.

    1978-01-01

    A summary of the activities of the McDonald lunar laser ranging station at Fort Davis for the FY 77-78 fiscal year is presented. The lunar laser experiment uses the observatory's 2.7 m reflecting telescope on a thrice-per-day, 21-day-per-lunation schedule. Data are recorded on magnetic tape and sent to the University of Texas at Austin, where they are processed. After processing, the data are distributed to interested analysis centers and later to the National Space Science Data Center, where they are available for routine distribution. Detailed reports on the McDonald operations are published after every fourth lunation, or approximately once every 115 days. These reports contain a day-by-day documentation of the ranging activity, detailed discussions of the equipment development efforts, and other information as needed to document and archive this important data type.

  3. Spatiotemporal shoreline dynamics of Namibian coastal lagoons derived by a dense remote sensing time series approach

    NASA Astrophysics Data System (ADS)

    Behling, Robert; Milewski, Robert; Chabrillat, Sabine

    2018-06-01

    This paper proposes the remote sensing time series approach WLMO (Water-Land MOnitor) to monitor spatiotemporal shoreline changes. The approach uses a hierarchical classification system based on temporal MNDWI-trajectories, with the goal of accommodating typical uncertainties in remote sensing shoreline extraction techniques such as the presence of clouds and geometric mismatches between images. Applied to a dense Landsat time series between 1984 and 2014 for the two Namibian coastal lagoons at Walvis Bay and Sandwich Harbour, WLMO was able to identify detailed accretion and erosion progressions at the sand spits forming these lagoons. For both lagoons a northward expansion of the sand spits of up to 1000 m was identified, which corresponds well with the prevailing northward-directed ocean current and wind processes that are responsible for the material transport along the shore. At Walvis Bay we could also show that over the 30 years of analysis the sand spit's width has decreased by more than half, from 750 m in 1984 to 360 m in 2014. This ongoing cross-shore erosion process poses a severe risk of future sand spit breaching, which would expose parts of the lagoon and the city to the open ocean. One of the major advantages of WLMO is the opportunity to analyze detailed spatiotemporal shoreline changes. Thus, it could be shown that the observed long-term accretion and erosion processes underwent great variations over time and cannot a priori be assumed to be linear processes. Such detailed spatiotemporal process patterns are a prerequisite for improving the understanding of the processes forming the Namibian shorelines. Moreover, the approach also has the potential to be used in other coastal areas, because the focus on MNDWI-trajectories allows the transfer to many multispectral satellite sensors (e.g. Sentinel-2, ASTER) available worldwide.
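
    The MNDWI trajectories underlying the classification are simple per-pixel index time series: MNDWI = (green - SWIR) / (green + SWIR), with values above roughly zero indicating water. A generic NumPy sketch of the per-scene computation follows; the band choice and threshold are common conventions, not necessarily WLMO's exact settings.

    ```python
    import numpy as np

    def mndwi(green, swir):
        """Modified Normalized Difference Water Index per pixel.

        green, swir: reflectance arrays, e.g. Landsat TM bands 2 and 5.
        A per-pixel time series of this index is an "MNDWI trajectory".
        """
        green = green.astype(float)
        swir = swir.astype(float)
        with np.errstate(divide="ignore", invalid="ignore"):
            idx = (green - swir) / (green + swir)
        return np.where(np.isfinite(idx), idx, 0.0)

    def water_mask(green, swir, threshold=0.0):
        """Boolean water/land mask from one scene (assumed threshold)."""
        return mndwi(green, swir) > threshold
    ```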

  4. Extending methods: using Bourdieu's field analysis to further investigate taste

    NASA Astrophysics Data System (ADS)

    Schindel Dimick, Alexandra

    2015-06-01

    In this commentary on Per Anderhag, Per-Olof Wickman and Karim Hamza's article Signs of taste for science, I consider how their study is situated within the concern for the role of science education in the social and cultural production of inequality. Their article provides a finely detailed methodology for analyzing the constitution of taste within science education classrooms. Nevertheless, because the authors' socially situated methodology draws upon Bourdieu's theories, it seems equally important to extend these methods to consider how and why students make particular distinctions within a relational context—a key aspect of Bourdieu's theory of cultural production. By situating the constitution of taste within Bourdieu's field analysis, researchers can explore the ways in which students' tastes and social positionings are established and transformed through time, space, place, and their ability to navigate the field. I describe the process of field analysis in relation to the authors' paper and suggest that combining the authors' methods with a field analysis can provide a strong methodological and analytical framework in which theory and methods combine to create a detailed understanding of students' interest in relation to their context.

  5. Strengthening of competence planning truss through instructional media development details

    NASA Astrophysics Data System (ADS)

    Handayani, Sri; Nurcahyono, M. Hadi

    2017-03-01

    Competency-Based Learning is a model of learning in which planning, implementation, and assessment refer to the mastery of competencies. Learning in lectures is conducted within this framework to comprehensively realize student competency. Competence-oriented learning activities in the classroom must encourage students to learn more actively: to search for and explore information themselves, alone or with friends, in learning activities in pairs or in groups, and to learn using a variety of learning resources, including printed materials, electronic media, and the environment. Analysis of learning in the wooden structures course revealed weaknesses in the understanding of truss details. Hence media are needed that can provide a clear picture of the structure of wooden trusses and their connection details. Development of the instructional media consisted of three phases of activity, namely planning, production and assessment. Learning media planning should be tailored to the needs and conditions necessary to reinforce mastery of competencies, through a table of material needs. The production of the learning media was done using hardware and software supporting the creation of the media. Assessment of the media product included feasibility studies by subject matter experts and media experts, while testing was done according to students' perception of the product. The analysis of the materials for the instructional aspects yielded 100% (very good), the media analysis for the design aspects was rated very good with a percentage of 88.93%, and the analysis of student perceptions was rated very good with a percentage of 84.84%. The truss details learning media are feasible and can be used in the implementation of learning wooden structures to strengthen competence in truss planning.

  6. Cost Analysis In A Multi-Mission Operations Environment

    NASA Technical Reports Server (NTRS)

    Newhouse, M.; Felton, L.; Bornas, N.; Botts, D.; Roth, K.; Ijames, G.; Montgomery, P.

    2014-01-01

    Spacecraft control centers have evolved from dedicated, single-mission or single missiontype support to multi-mission, service-oriented support for operating a variety of mission types. At the same time, available money for projects is shrinking and competition for new missions is increasing. These factors drive the need for an accurate and flexible model to support estimating service costs for new or extended missions; the cost model in turn drives the need for an accurate and efficient approach to service cost analysis. The National Aeronautics and Space Administration (NASA) Huntsville Operations Support Center (HOSC) at Marshall Space Flight Center (MSFC) provides operations services to a variety of customers around the world. HOSC customers range from launch vehicle test flights; to International Space Station (ISS) payloads; to small, short duration missions; and has included long duration flagship missions. The HOSC recently completed a detailed analysis of service costs as part of the development of a complete service cost model. The cost analysis process required the team to address a number of issues. One of the primary issues involves the difficulty of reverse engineering individual mission costs in a highly efficient multimission environment, along with a related issue of the value of detailed metrics or data to the cost model versus the cost of obtaining accurate data. Another concern is the difficulty of balancing costs between missions of different types and size and extrapolating costs to different mission types. The cost analysis also had to address issues relating to providing shared, cloud-like services in a government environment, and then assigning an uncertainty or risk factor to cost estimates that are based on current technology, but will be executed using future technology. Finally the cost analysis needed to consider how to validate the resulting cost models taking into account the non-homogeneous nature of the available cost data and the decreasing flight rate. This paper presents the issues encountered during the HOSC cost analysis process, and the associated lessons learned. These lessons can be used when planning for a new multi-mission operations center or in the transformation from a dedicated control center to multi-center operations, as an aid in defining processes that support future cost analysis and estimation. The lessons can also be used by mature serviceoriented, multi-mission control centers to streamline or refine their cost analysis process.

  7. Cost Analysis in a Multi-Mission Operations Environment

    NASA Technical Reports Server (NTRS)

    Felton, Larry; Newhouse, Marilyn; Bornas, Nick; Botts, Dennis; Ijames, Gayleen; Montgomery, Patty; Roth, Karl

    2014-01-01

    Spacecraft control centers have evolved from dedicated, single-mission or single mission-type support to multi-mission, service-oriented support for operating a variety of mission types. At the same time, available money for projects is shrinking and competition for new missions is increasing. These factors drive the need for an accurate and flexible model to support estimating service costs for new or extended missions; the cost model in turn drives the need for an accurate and efficient approach to service cost analysis. The National Aeronautics and Space Administration (NASA) Huntsville Operations Support Center (HOSC) at Marshall Space Flight Center (MSFC) provides operations services to a variety of customers around the world. HOSC customers range from launch vehicle test flights; to International Space Station (ISS) payloads; to small, short duration missions; and has included long duration flagship missions. The HOSC recently completed a detailed analysis of service costs as part of the development of a complete service cost model. The cost analysis process required the team to address a number of issues. One of the primary issues involves the difficulty of reverse engineering individual mission costs in a highly efficient multi-mission environment, along with a related issue of the value of detailed metrics or data to the cost model versus the cost of obtaining accurate data. Another concern is the difficulty of balancing costs between missions of different types and size and extrapolating costs to different mission types. The cost analysis also had to address issues relating to providing shared, cloud-like services in a government environment, and then assigning an uncertainty or risk factor to cost estimates that are based on current technology, but will be executed using future technology. Finally the cost analysis needed to consider how to validate the resulting cost models taking into account the non-homogeneous nature of the available cost data and the decreasing flight rate. This paper presents the issues encountered during the HOSC cost analysis process, and the associated lessons learned. These lessons can be used when planning for a new multi-mission operations center or in the transformation from a dedicated control center to multi-center operations, as an aid in defining processes that support future cost analysis and estimation. The lessons can also be used by mature service-oriented, multi-mission control centers to streamline or refine their cost analysis process.

  8. Application of enhanced modern structured analysis techniques to Space Station Freedom electric power system requirements

    NASA Technical Reports Server (NTRS)

    Biernacki, John; Juhasz, John; Sadler, Gerald

    1991-01-01

    A team of Space Station Freedom (SSF) system engineers is engaged in an extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model, using an enhanced modern structured analysis methodology (EMSA). Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.

  9. Automatic Analysis of Critical Incident Reports: Requirements and Use Cases.

    PubMed

    Denecke, Kerstin

    2016-01-01

    Increasingly, critical incident reports are used as a means to increase patient safety and quality of care. The full potential of these sources of experiential knowledge often remains unconsidered, since retrieval and analysis are difficult and time-consuming and the reporting systems often do not provide support for these tasks. The objective of this paper is to identify potential use cases for automatic methods that analyse critical incident reports. In more detail, we will describe how faceted search could offer intuitive retrieval of critical incident reports and how text mining could support the analysis of relations among events. To realise an automated analysis, natural language processing needs to be applied. Therefore, we analyse the language of critical incident reports and derive requirements for automatic processing methods. We learned that there is huge potential for automatic analysis of incident reports, but there are still challenges to be solved.

  10. Human Research Program Unique Processes, Criteria, and Guidelines (UPCG). Revision C, July 28, 2011

    NASA Technical Reports Server (NTRS)

    Chin, Duane

    2011-01-01

    This document defines the processes, criteria, and guidelines exclusive to managing the Human Research Program (HRP). The intent of this document is to provide instruction to the reader in the form of processes, criteria, and guidelines. Of the three instructional categories, processes contain the most detail because of the need for a systematic series of actions directed to some end. In contrast, criteria have less detail than processes, the idea being to create a rule or principle structure for evaluating or testing something. Guidelines are a higher-level indication of a course of action, typically with the least amount of detail. The lack of detail in guidelines allows the reader flexibility when performing an action or actions.

  11. Program budgeting and marginal analysis: a case study in chronic airflow limitation.

    PubMed

    Crockett, A; Cranston, J; Moss, J; Scown, P; Mooney, G; Alpers, J

    1999-01-01

    Program budgeting and marginal analysis is a method of priority-setting in health care. This article describes how this method was applied to the management of a disease-specific group, chronic airflow limitation. A sub-program flow chart clarified the major cost drivers. After assessment of the technical efficiency of the sub-programs and careful and detailed analysis, incremental and decremental wish lists of activities were established. Program budgeting and marginal analysis provides a framework for rational resource allocation. The nurturing of a vigorous program management group, with members representing all participants in the process (including patients/consumers), is the key to a successful outcome.

  12. Design of Computer-aided Instruction for Radiology Interpretation: The Role of Cognitive Task Analysis

    PubMed Central

    Pusic, Martin V.; LeBlanc, Vicki; Patel, Vimla L.

    2001-01-01

    Traditional task analysis for instructional design has emphasized the importance of precisely defining behavioral educational objectives and working back to select objective-appropriate instructional strategies. However, this approach may miss effective strategies. Cognitive task analysis, on the other hand, breaks a process down into its component knowledge representations. Selection of instructional strategies based on all such representations in a domain is likely to lead to optimal instructional design. In this demonstration, using the interpretation of cervical spine x-rays as an educational example, we show how a detailed cognitive task analysis can guide the development of computer-aided instruction.

  13. Enabling Rapid and Robust Structural Analysis During Conceptual Design

    NASA Technical Reports Server (NTRS)

    Eldred, Lloyd B.; Padula, Sharon L.; Li, Wu

    2015-01-01

    This paper describes a multi-year effort to add a structural analysis subprocess to a supersonic aircraft conceptual design process. The desired capabilities include parametric geometry, automatic finite element mesh generation, static and aeroelastic analysis, and structural sizing. The paper discusses implementation details of the new subprocess, captures lessons learned, and suggests future improvements. The subprocess quickly compares concepts and robustly handles large changes in wing or fuselage geometry. The subprocess can rank concepts with regard to their structural feasibility and can identify promising regions of the design space. The automated structural analysis subprocess is deemed robust and rapid enough to be included in multidisciplinary conceptual design and optimization studies.

  14. Description and Results of the Air Force Research and Development Program for the Improvement of Maintenance Efficiency.

    ERIC Educational Resources Information Center

    Foley, John P., Jr.

    An overview of the Air Force's Research and Development Program for the Improvement of Maintenance Efficiency is provided. First described are the steps found in any detailed task analysis, a process which results in the complete specification of each task involved in an overall maintenance effort. The factors influencing maintenance effectiveness…

  15. Finite Element Analysis of Eutectic Structures

    DTIC Science & Technology

    2014-03-12

    Reported are the details of processing conditions, microstructure development, and temperature-dependent thermoelectric properties. The material system... Sootsman et al., Microstructure and Thermoelectric Properties of Mechanically Robust PbTe-Si Eutectic Composites, Chem. Mater. 22 (2010) 869. 7. J... (Professor), CASE WESTERN RESERVE UNIVERSITY. Thermoelectric Properties of WSi2-SixGe1-x Composites: thermoelectric properties of the W/Si/Ge alloy...

  16. International Space Station Alpha (ISSA) Integrated Traffic Model

    NASA Technical Reports Server (NTRS)

    Gates, R. E.

    1995-01-01

    The paper discusses the development process of the International Space Station Alpha (ISSA) Integrated Traffic Model, which is a subsystem analysis tool utilized in the ISSA design analysis cycles. Fast-track prototyping via spreadsheets of the detailed relationships between daily crew and station consumables, propellant needs, maintenance requirements, and crew rotation provides adequate benchmarks to assess cargo vehicle design and performance characteristics.

  17. Transcription and Analysis of Qualitative Data in a Study of Women Who Sexually Offended against Children

    ERIC Educational Resources Information Center

    McNulty, Elizabeth Anne

    2012-01-01

    Research on sexual violence is often conducted within the qualitative paradigm. However, many writers have described the lack of specific detail provided with regard to decisions and processes involved in transcribing and analyzing this type of data. In this article, I will provide a description and discussion of the organization, categorization,…

  18. Changing the Peer Review or Changing the Peers--Recent Development in Assessment of Large Research Collaborations

    ERIC Educational Resources Information Center

    Hansson, Finn; Monsted, Mette

    2012-01-01

    Peer review of research programmes is changing. The problem is discussed through detailed study of a selection process to a call for collaborations in the energy sector for the European Institute of Innovation and Technology. The authors were involved in the application for a Knowledge Innovation Community. Through the analysis of the case the…

  19. Calcium inputs and transport in a base-poor forest ecosystem as interpreted by Sr isotopes

    Treesearch

    Scott W. Bailey; James W. Hornbeck; Charles T. Driscoll; Henri E. Gaudette

    1996-01-01

    Depletion of Ca in forests and its effects on forest health are poorly quantified. Depletion has been difficult to document due to limitations in determining rates at which Ca becomes available for ecosystem processes through weathering, and difficulty in determining changes in ecosystem storage. We coupled a detailed analysis of Sr isotopic composition with a mass...

  20. Job Profiling Guide. Results of 1994 Job Profiling. Part of the Ohio Vocational Competency Assessment (OVCA) Package.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Vocational Instructional Materials Lab.

    This guide explains the process of job profiling and details the results of a 1994 profiling of 34 occupations. Discussed in section 1 are the following: purpose and components of the Ohio Vocational Competency Assessment (OVCA) package; purpose, contents, and use of the Ohio Competency Analysis Profiles and Work Keys components of the OVCA…

  1. Simulation of Electric Propulsion Thrusters

    DTIC Science & Technology

    2011-01-01

    ...and operational lifetime. The second area of modelling activity concerns the plumes produced by electric thrusters. Detailed information on the plumes... to reproduce the in-orbit space environment using ground-based laboratory facilities. Device modelling also plays an important role in plume... of the numerical analysis of other aspects of thruster design, such as thermal and structural processes, is omitted here. There are two fundamental...

  2. Algorithms and programming tools for image processing on the MPP:3

    NASA Technical Reports Server (NTRS)

    Reeves, Anthony P.

    1987-01-01

    This is the third and final report on the work done for NASA Grant 5-403 on Algorithms and Programming Tools for Image Processing on the MPP:3. All the work done for this grant is summarized in the introduction. Work done since August 1986 is reported in detail. Research for this grant falls under the following headings: (1) fundamental algorithms for the MPP; (2) programming utilities for the MPP; (3) the Parallel Pascal Development System; and (4) performance analysis. In this report, the results of two efforts are reported: region growing, and performance analysis of important characteristic algorithms. In each case, timing results from MPP implementations are included. A paper is included in which parallel algorithms for region growing on the MPP are discussed. These algorithms permit different sized regions to be merged in parallel. Details on the implementation and performance of several important MPP algorithms are given. These include a number of standard permutations, the FFT, convolution, arbitrary data mappings, image warping, and pyramid operations, all of which have been implemented on the MPP. The permutation and image warping functions have been included in the standard development system library.
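    Region growing is the report's central algorithm. As a point of reference for the parallel MPP variants, a minimal serial sketch (a hypothetical illustration in Python, not code from the report) might look like this:

    ```python
    import numpy as np
    from collections import deque

    def region_grow(img, seed, tol=10):
        """Serial region growing: collect 4-connected pixels whose intensity
        differs from the seed pixel's by at most `tol`."""
        h, w = img.shape
        seen = np.zeros((h, w), dtype=bool)
        region = []
        q = deque([seed])
        seen[seed] = True
        seed_val = int(img[seed])
        while q:
            y, x = q.popleft()
            if abs(int(img[y, x]) - seed_val) > tol:
                continue                      # pixel rejected; do not expand
            region.append((y, x))
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and not seen[ny, nx]:
                    seen[ny, nx] = True
                    q.append((ny, nx))
        return region

    img = (np.random.rand(64, 64) * 255).astype(np.uint8)
    print(len(region_grow(img, (32, 32), tol=40)))
    ```

    The parallel MPP algorithms differ mainly in that differently sized regions are merged concurrently rather than grown one pixel at a time.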

  3. Rotating permanent magnet excitation for blood flow measurement.

    PubMed

    Nair, Sarath S; Vinodkumar, V; Sreedevi, V; Nagesh, D S

    2015-11-01

    A compact, portable and improved blood flow measurement system for an extracorporeal circuit having a rotating permanent magnetic excitation scheme is described in this paper. The system consists of a set of permanent magnets rotating near blood or any conductive fluid to create a high-intensity alternating magnetic field in it, inducing a sinusoidally varying voltage across the column of fluid. The induced voltage signal is acquired, conditioned and processed to determine the flow rate. Performance analysis shows that a sensitivity of more than 250 mV/lpm can be obtained, which is more than five times higher than that of conventional flow measurement systems. The choice of a rotating permanent magnet instead of an electromagnetic core generates an alternating magnetic field of smooth sinusoidal nature, which in turn reduces switching and interference noise. This greatly reduces the complex electronic circuitry required for processing the signal and enables the flow-measuring device to be much less costly, portable and lightweight. The signal remains steady even with changes in environmental conditions and has an accuracy of greater than 95%. This paper also describes the construction details of the prototype, the factors affecting sensitivity and a detailed performance analysis at various operating conditions.
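    A minimal sketch of the flow-rate recovery step described above, assuming the roughly 250 mV/lpm sensitivity quoted in the abstract; the sampling rate, excitation frequency, and noise level are hypothetical:

    ```python
    import numpy as np

    SENSITIVITY_V_PER_LPM = 0.250   # V per lpm, taken from the abstract
    fs, f_exc = 5000.0, 25.0        # sample rate and excitation frequency (assumed)

    t = np.arange(0, 1.0, 1 / fs)
    true_flow = 2.0                 # lpm, synthetic test value
    signal = SENSITIVITY_V_PER_LPM * true_flow * np.sin(2 * np.pi * f_exc * t)
    signal += 0.01 * np.random.randn(t.size)          # measurement noise

    # Lock-in style amplitude estimate at the excitation frequency:
    # for s = A sin(wt), 2*mean(s*sin(wt)) over whole cycles recovers A.
    i_comp = 2 * np.mean(signal * np.sin(2 * np.pi * f_exc * t))
    q_comp = 2 * np.mean(signal * np.cos(2 * np.pi * f_exc * t))
    amplitude = np.hypot(i_comp, q_comp)              # volts

    print(f"estimated flow: {amplitude / SENSITIVITY_V_PER_LPM:.2f} lpm")
    ```

    Demodulating at the known rotation frequency is what lets a smooth sinusoidal excitation get by with far simpler conditioning electronics than a switched electromagnet.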

  4. Detailed Modeling and Analysis of the CPFM Dataset

    NASA Technical Reports Server (NTRS)

    Swartz, William H.; Lloyd, Steven A.; DeMajistre, Robert

    2004-01-01

    A quantitative understanding of photolysis rate coefficients (or "j-values") is essential to determining the photochemical reaction rates that define ozone loss and other crucial processes in the atmosphere. j-Values can be calculated with radiative transfer models, derived from actinic flux observations, or inferred from trace gas measurements. The principal objective of this study is to cross-validate j-values from the Composition and Photodissociative Flux Measurement (CPFM) instrument during the Photochemistry of Ozone Loss in the Arctic Region in Summer (POLARIS) and SAGE III Ozone Loss and Validation Experiment (SOLVE) field campaigns with model calculations and other measurements and to use this detailed analysis to improve our ability to determine j-values. Another objective is to analyze the spectral flux from the CPFM (not just the j-values) and, using a multi-wavelength/multi-species spectral fitting technique, determine atmospheric composition.
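    For reference, a photolysis rate coefficient is conventionally defined by the wavelength integral below (standard atmospheric-chemistry background, not taken from the record):

    ```latex
    j = \int_{\lambda} \sigma(\lambda)\,\phi(\lambda)\,F(\lambda)\,\mathrm{d}\lambda
    ```

    where sigma(lambda) is the absorption cross section, phi(lambda) the photodissociation quantum yield, and F(lambda) the actinic flux; deriving j-values from CPFM flux observations amounts to evaluating this integral with the measured F(lambda).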

  5. A photometric study of the Orion OB 1 association. 2: Photometric analysis

    NASA Technical Reports Server (NTRS)

    Warren, W. H., Jr.; Hesser, J. E.

    1976-01-01

    The procedures adopted for analysis of photometric data in terms of color excesses, intrinsic color indexes, absolute visual magnitudes, and rotational-velocity effects are discussed in detail for Orion association B-, intermediate (I)-, and AF-type stars. The effects of the nebular environment and a comparison of various calibrations of Balmer-line and four-color indexes are considered for the determination of individual absolute magnitudes for B-type stars. When absolute magnitudes of stars in the region of the Orion Nebula are determined from the beta index, emission mechanisms appear to spuriously brighten them. A detailed comparison of absolute magnitudes derived from Balmer-line indexes and MK spectral-type calibrations is presented. The data are also examined with regard to the effects of polarization and infrared excesses. The results suggest a complex combination of intracluster and circumstellar origins for these processes.

  6. Extended performance electric propulsion power processor design study. Volume 2: Technical summary

    NASA Technical Reports Server (NTRS)

    Biess, J. J.; Inouye, L. Y.; Schoenfeld, A. D.

    1977-01-01

    Electric propulsion power processor technology has progressed during the past decade to the point that it is considered ready for application. Several power processor design concepts were evaluated and compared. Emphasis was placed on a 30 cm ion thruster power processor with a beam supply power rating of 2.2 kW to 10 kW for the main propulsion power stage. Extensions in power processor performance were defined and designed in sufficient detail to determine efficiency, component weight, part count, reliability and thermal control. A detailed design was performed on a microprocessor as the thyristor power processor controller. A reliability analysis was performed to evaluate the effect of the control electronics redesign. Preliminary electrical design, mechanical design and thermal analysis were performed on a 6 kW power transformer for the beam supply. Bi-Mod mechanical, structural and thermal control configurations were evaluated for the power processor, and preliminary estimates of mechanical weight were determined.

  7. Direct optical detection of protein-ligand interactions.

    PubMed

    Gesellchen, Frank; Zimmermann, Bastian; Herberg, Friedrich W

    2005-01-01

    Direct optical detection provides an excellent means to investigate interactions of molecules in biological systems. The dynamic equilibria inherent to these systems can be described in greater detail by recording the kinetics of a biomolecular interaction. Optical biosensors allow direct detection of interaction patterns without the need for labeling. An overview covering several commercially available biosensors is given, with a focus on instruments based on surface plasmon resonance (SPR) and reflectometric interference spectroscopy (RIfS). Potential assay formats and experimental design, appropriate controls, and calibration procedures, especially when handling low molecular weight substances, are discussed. The individual steps of an interaction analysis, combined with practical tips for evaluation, data processing, and interpretation of kinetic data, are described in detail. In a practical example, a step-by-step procedure for the analysis of a low molecular weight compound interacting with a serum protein, determined on a commercial SPR sensor, is presented.
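    The kinetic evaluation mentioned here typically rests on the standard 1:1 Langmuir interaction model (general SPR background, not specific to this chapter):

    ```latex
    \frac{\mathrm{d}R}{\mathrm{d}t} = k_a\,C\,\bigl(R_{\max}-R\bigr) - k_d\,R,
    \qquad K_D = \frac{k_d}{k_a}
    ```

    where R is the sensor response, R_max the saturation response, C the analyte concentration, and k_a and k_d the association and dissociation rate constants obtained by fitting the recorded sensorgrams.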

  8. Army-NASA aircrew/aircraft integration program: Phase 4 A(3)I Man-Machine Integration Design and Analysis System (MIDAS) software detailed design document

    NASA Technical Reports Server (NTRS)

    Banda, Carolyn; Bushnell, David; Chen, Scott; Chiu, Alex; Constantine, Betsy; Murray, Jerry; Neukom, Christian; Prevost, Michael; Shankar, Renuka; Staveland, Lowell

    1991-01-01

    The Man-Machine Integration Design and Analysis System (MIDAS) is an integrated suite of software components that constitutes a prototype workstation to aid designers in applying human factors principles to the design of complex human-machine systems. MIDAS is intended to be used at the very early stages of conceptual design to provide an environment wherein designers can use computational representations of the crew station and operator, instead of hardware simulators and man-in-the-loop studies, to discover problems and ask 'what if' questions regarding the projected mission, equipment, and environment. This document is the Software Product Specification for MIDAS. Introductory descriptions of the processing requirements, hardware/software environment, structure, I/O, and control are given in the main body of the document for the overall MIDAS system, with detailed discussion of the individual modules included in Annexes A-J.

  9. Interactive mechanism of working environments and construction behaviors with cognitive work analysis: an elevator installation case study.

    PubMed

    Wang, Yanqing; Chong, Heap-Yih; Liao, Pin-Chao; Ren, Hantao

    2017-09-25

    Unsafe behavior is a leading factor in accidents, and the working environment significantly affects behaviors. However, few studies have focused on detailed mechanisms for addressing unsafe behaviors resulting from environmental constraints. This study aims to delineate these mechanisms using cognitive work analysis (CWA) for an elevator installation case study. Elevator installation was selected for study because it involves operations at heights: falls from heights remain a major cause of construction worker mortality. This study adopts a mixed research approach based on three research methodology stages. This research deconstructs the details of the working environment, the workers' decision-making processes, the strategies chosen given environmental conditions and the conceptual model for workers' behaviors, which jointly depict environment-behavior mechanisms at length. By applying CWA to the construction industry, environmental constraints can easily be identified, and targeted engineering suggestions can be generated.

  10. Self-organizing maps: a versatile tool for the automatic analysis of untargeted imaging datasets.

    PubMed

    Franceschi, Pietro; Wehrens, Ron

    2014-04-01

    MS-based imaging approaches allow for location-specific identification of chemical components in biological samples, opening up possibilities of much more detailed understanding of biological processes and mechanisms. Data analysis, however, is challenging, mainly because of the sheer size of such datasets. This article presents a novel approach based on self-organizing maps, extending previous work in order to be able to handle the large number of variables present in high-resolution mass spectra. The key idea is to generate prototype images, representing spatial distributions of ions, rather than prototypical mass spectra. This allows for a two-stage approach, first generating typical spatial distributions and associated m/z bins, and later analyzing the interesting bins in more detail using accurate masses. The possibilities and advantages of the new approach are illustrated on an in-house dataset of apple slices. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
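    A compact sketch of the key idea, training prototype images rather than prototype spectra, with a hand-rolled 1D self-organizing map on a synthetic stand-in for the imaging data (all names and sizes hypothetical):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy stand-in for an MS-imaging dataset: n_bins m/z bins, each with a
    # 20x20 spatial intensity image (flattened); illustrative only.
    n_bins, h, w = 500, 20, 20
    images = rng.random((n_bins, h * w))

    # A 1D SOM whose units are prototype *images* (spatial distributions).
    n_units = 16
    protos = rng.random((n_units, h * w))
    grid = np.arange(n_units)

    for epoch in range(20):
        lr = 0.5 * (1 - epoch / 20)                  # decaying learning rate
        sigma = max(4.0 * (1 - epoch / 20), 0.5)     # shrinking neighborhood
        for x in images[rng.permutation(n_bins)]:
            bmu = np.argmin(((protos - x) ** 2).sum(axis=1))    # best-matching unit
            nb = np.exp(-((grid - bmu) ** 2) / (2 * sigma**2))  # neighborhood weights
            protos += lr * nb[:, None] * (x - protos)           # pull toward sample

    # Each prototype row is now a typical spatial distribution; m/z bins whose
    # images map to the same unit can then be examined together at accurate mass.
    assignments = np.array([np.argmin(((protos - x) ** 2).sum(axis=1)) for x in images])
    ```

    The two-stage structure mirrors the paper's approach: cheap clustering of spatial patterns first, detailed mass-level analysis of the interesting bins second.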

  11. Advanced composite elevator for Boeing 727 aircraft, volume 2

    NASA Technical Reports Server (NTRS)

    Chovil, D. V.; Grant, W. D.; Jamison, E. S.; Syder, H.; Desper, O. E.; Harvey, S. T.; Mccarty, J. E.

    1980-01-01

    Preliminary design activity consisted of developing and analyzing alternate design concepts and selecting the optimum elevator configuration. This included trade studies in which durability, inspectability, producibility, repairability, and customer acceptance were evaluated. Preliminary development efforts consisted of evaluating and selecting material, identifying ancillary structural development test requirements, and defining full scale ground and flight test requirements necessary to obtain Federal Aviation Administration (FAA) certification. After selection of the optimum elevator configuration, detail design was begun and included basic configuration design improvements resulting from manufacturing verification hardware, the ancillary test program, weight analysis, and structural analysis. Detail and assembly tools were designed and fabricated to support a full-scope production program, rather than a limited run. The producibility development programs were used to verify tooling approaches, fabrication processes, and inspection methods for the production mode. Quality parts were readily fabricated and assembled with a minimum rejection rate, using prior inspection methods.

  12. Three-Dimensional Integrated Survey for Building Investigations.

    PubMed

    Costantino, Domenica; Angelini, Maria Giuseppa

    2015-11-01

    The study presents the results of a survey aimed at representing a building collapse and at assessing the feasibility of 3D modeling as a support for structural analysis. An integrated survey using topographic, photogrammetric, and terrestrial laser techniques was carried out to obtain a three-dimensional (3D) model of the building, plans and prospects, and the particulars of the collapsed area. The authors acquired, by photogrammetric survey, information about the regular parts of the structure, while using laser scanner data they reconstructed a set of the more interesting architectural details and areas with higher surface curvature. Specifically, the texturing process provided a detailed 3D structure of the areas under investigation. The analysis of the acquired data proved very useful both in identifying the causes of the disaster and in helping the reconstruction of the collapsed corner, showing the contribution that integrated surveys can make to preserving architectural and historic heritage. © 2015 American Academy of Forensic Sciences.

  13. Glycan Remodeling with Processing Inhibitors and Lectin-Resistant Eukaryotic Cells.

    PubMed

    Chang, Veronica T; Spooner, Robert A; Crispin, Max; Davis, Simon J

    2015-01-01

    Some of the most important and interesting molecules in metazoan biology are glycoproteins. The importance of the carbohydrate component of these structures is often revealed by the disease phenotypes that manifest when the biosynthesis of particular glycoforms is disrupted. On the other hand, the presence of large amounts of carbohydrate can often hinder the structural and functional analysis of glycoproteins. There are often good reasons, therefore, for wanting to engineer and predefine the N-glycans present on glycoproteins, e.g., in order to characterize the functions of the glycans or facilitate their subsequent removal. Here, we describe in detail two distinct ways in which to usefully interfere with oligosaccharide processing, one involving the use of specific processing inhibitors, and the other the selection of cell lines mutated at gene loci that control oligosaccharide processing, using cytotoxic lectins. Both approaches have the capacity for controlled, radical alteration of oligosaccharide processing in eukaryotic cells used for heterologous protein expression, and have great utility in the structural analysis of glycoproteins.

  14. MTI science, data products, and ground-data processing overview

    NASA Astrophysics Data System (ADS)

    Szymanski, John J.; Atkins, William H.; Balick, Lee K.; Borel, Christoph C.; Clodius, William B.; Christensen, R. Wynn; Davis, Anthony B.; Echohawk, J. C.; Galbraith, Amy E.; Hirsch, Karen L.; Krone, James B.; Little, Cynthia K.; McLachlan, Peter M.; Morrison, Aaron; Pollock, Kimberly A.; Pope, Paul A.; Novak, Curtis; Ramsey, Keri A.; Riddle, Emily E.; Rohde, Charles A.; Roussel-Dupre, Diane C.; Smith, Barham W.; Smith, Kathy; Starkovich, Kim; Theiler, James P.; Weber, Paul G.

    2001-08-01

    The mission of the Multispectral Thermal Imager (MTI) satellite is to demonstrate the efficacy of highly accurate multispectral imaging for passive characterization of urban and industrial areas, as well as sites of environmental interest. The satellite makes top-of-atmosphere radiance measurements that are subsequently processed into estimates of surface properties such as vegetation health, temperatures, material composition and others. The MTI satellite also provides simultaneous data for atmospheric characterization at high spatial resolution. To utilize these data the MTI science program has several coordinated components, including modeling, comprehensive ground-truth measurements, image acquisition planning, data processing and data interpretation and analysis. Algorithms have been developed to retrieve a multitude of physical quantities and these algorithms are integrated in a processing pipeline architecture that emphasizes automation, flexibility and programmability. In addition, the MTI science team has produced detailed site, system and atmospheric models to aid in system design and data analysis. This paper provides an overview of the MTI research objectives, data products and ground data processing.

  15. Numerical analysis of the heating phase and densification mechanism in polymers selective laser melting process

    NASA Astrophysics Data System (ADS)

    Mokrane, Aoulaiche; Boutaous, M'hamed; Xin, Shihe

    2018-05-01

    The aim of this work is to address the modeling of the SLS process at the scale of the part in a PA12 polymer powder bed. The powder bed is considered as a continuous medium with homogenized properties, while the multiple physical phenomena occurring during the process are elucidated and the influence of process parameters on the quality of the final product is studied. A thermal model, based on an enthalpy approach, is presented with details on the multiphysical couplings that determine the thermal history (laser absorption, melting, coalescence, densification, volume shrinkage) and on the numerical implementation using the FV method. The simulations were carried out in 3D with an in-house developed FORTRAN code. After validation of the model against results from the literature, a parametric analysis is proposed. Some original results, such as the densification process and the thermal history with the evolution of the material from the granular solid state to a homogeneous melted state, are discussed with regard to the involved physical phenomena.
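    In an enthalpy approach of the kind described, the energy balance is typically written in terms of a volumetric enthalpy that absorbs the latent heat of melting; a standard formulation (sketched here; the paper's couplings are richer) reads:

    ```latex
    \frac{\partial H}{\partial t} = \nabla \cdot \bigl(k\,\nabla T\bigr) + Q_{\text{laser}},
    \qquad
    H(T) = \int_{T_{\text{ref}}}^{T} \rho\, c_p(\theta)\,\mathrm{d}\theta + \rho\, L_f\, f_l(T)
    ```

    where f_l in [0, 1] is the local melt fraction, L_f the latent heat of fusion, and Q_laser the absorbed laser power density; tracking H rather than T lets the finite-volume scheme step through the phase change without resolving the melt front explicitly.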

  16. A Large-Scale Design Integration Approach Developed in Conjunction with the Ares Launch Vehicle Program

    NASA Technical Reports Server (NTRS)

    Redmon, John W.; Shirley, Michael C.; Kinard, Paul S.

    2012-01-01

    This paper presents a method for performing large-scale design integration, taking a classical 2D drawing envelope and interface approach and applying it to modern three dimensional computer aided design (3D CAD) systems. Today, the paradigm often used when performing design integration with 3D models involves a digital mockup of an overall vehicle, in the form of a massive, fully detailed, CAD assembly; therefore, adding unnecessary burden and overhead to design and product data management processes. While fully detailed data may yield a broad depth of design detail, pertinent integration features are often obscured under the excessive amounts of information, making them difficult to discern. In contrast, the envelope and interface method results in a reduction in both the amount and complexity of information necessary for design integration while yielding significant savings in time and effort when applied to today's complex design integration projects. This approach, combining classical and modern methods, proved advantageous during the complex design integration activities of the Ares I vehicle. Downstream processes, benefiting from this approach by reducing development and design cycle time, include: Creation of analysis models for the Aerodynamic discipline; Vehicle to ground interface development; Documentation development for the vehicle assembly.

  17. Probabilistic Structural Analysis of the Solid Rocket Booster Aft Skirt External Fitting Modification

    NASA Technical Reports Server (NTRS)

    Townsend, John S.; Peck, Jeff; Ayala, Samuel

    2000-01-01

    NASA has funded several major programs (the Probabilistic Structural Analysis Methods Project is an example) to develop probabilistic structural analysis methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element software code, known as Numerical Evaluation of Stochastic Structures Under Stress, is used to determine the reliability of a critical weld of the Space Shuttle solid rocket booster aft skirt. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process. Also, analysis findings are compared with measured Space Shuttle flight data.

  18. PanDA for COMPASS at JINR

    NASA Astrophysics Data System (ADS)

    Petrosyan, A. Sh.

    2016-09-01

    PanDA (Production and Distributed Analysis System) is a workload management system, widely used for data processing at experiments on the Large Hadron Collider and elsewhere. COMPASS is a high-energy physics experiment at the Super Proton Synchrotron. Data processing for COMPASS runs locally at CERN, on lxbatch, with the data itself stored in CASTOR. In 2014 the idea arose to start running COMPASS production through PanDA. Such a transformation of the experiment's data processing will allow the COMPASS community to use not only CERN resources, but also Grid resources worldwide. During the spring and summer of 2015, installation, validation and migration work was performed at JINR. Details and results of this process are presented in this paper.

  19. Pea Border Cell Maturation and Release Involve Complex Cell Wall Structural Dynamics

    PubMed Central

    2017-01-01

    The adhesion of plant cells is vital for support and protection of the plant body and is maintained by a variety of molecular associations between cell wall components. In some specialized cases, though, plant cells are programmed to detach, and root cap-derived border cells are examples of this. Border cells (in some species known as border-like cells) provide an expendable barrier between roots and the environment. Their maturation and release is an important but poorly characterized cell separation event. To gain a deeper insight into the complex cellular dynamics underlying this process, we undertook a systematic, detailed analysis of pea (Pisum sativum) root tip cell walls. Our study included immunocarbohydrate microarray profiling, monosaccharide composition determination, Fourier-transform infrared microspectroscopy, quantitative reverse transcription-PCR of cell wall biosynthetic genes, analysis of hydrolytic activities, transmission electron microscopy, and immunolocalization of cell wall components. Using this integrated glycobiology approach, we identified multiple novel modes of cell wall structural and compositional rearrangement during root cap growth and the release of border cells. Our findings provide a new level of detail about border cell maturation and enable us to develop a model of the separation process. We propose that loss of adhesion by the dissolution of homogalacturonan in the middle lamellae is augmented by an active biophysical process of cell curvature driven by the polarized distribution of xyloglucan and extensin epitopes. PMID:28400496

  20. Adaptation of a Control Center Development Environment for Industrial Process Control

    NASA Technical Reports Server (NTRS)

    Killough, Ronnie L.; Malik, James M.

    1994-01-01

    In the control center, raw telemetry data is received for storage, display, and analysis. This raw data must be combined and manipulated in various ways by mathematical computations to facilitate analysis, provide diversified fault detection mechanisms, and enhance display readability. A development tool called the Graphical Computation Builder (GCB) has been implemented which provides flight controllers with the capability to implement computations for use in the control center. The GCB provides a language that contains both general programming constructs and language elements specifically tailored for the control center environment. The GCB concept allows staff who are not skilled in computer programming to author and maintain computer programs. The GCB user is isolated from the details of external subsystem interfaces and has access to high-level functions such as matrix operators, trigonometric functions, and unit conversion macros. The GCB provides a high level of feedback during computation development that improves upon the often cryptic errors produced by computer language compilers. An equivalent need can be identified in the industrial data acquisition and process control domain: that of an integrated graphical development tool tailored to the application to hide the operating system, computer language, and data acquisition interface details. The GCB features a modular design which makes it suitable for technology transfer without significant rework. Control center-specific language elements can be replaced by elements specific to industrial process control.

  1. DEFINING THE RELEVANT OUTCOME MEASURES IN MEDICAL DEVICE ASSESSMENTS: AN ANALYSIS OF THE DEFINITION PROCESS IN HEALTH TECHNOLOGY ASSESSMENT.

    PubMed

    Jacobs, Esther; Antoine, Sunya-Lee; Prediger, Barbara; Neugebauer, Edmund; Eikermann, Michaela

    2017-01-01

    Defining relevant outcome measures for clinical trials on medical devices (MD) is complex, as there is a large variety of potentially relevant outcomes. The chosen outcomes vary widely across clinical trials making the assessment in evidence syntheses very challenging. The objective is to provide an overview on the current common procedures of health technology assessment (HTA) institutions in defining outcome measures in MD trials. In 2012-14, the Web pages of 126 institutions involved in HTA were searched for methodological manuals written in English or German that describe methods for the predefinition process of outcome measures. Additionally, the institutions were contacted by email. Relevant information was extracted. All process steps were performed independently by two reviewers. Twenty-four manuals and ten responses from the email request were included in the analysis. Overall, 88.5 percent of the institutions describe the type of outcomes that should be considered in detail and 84.6 percent agree that the main focus should be on patient relevant outcomes. Specifically related to MD, information could be obtained in 26 percent of the included manuals and email responses. Eleven percent of the institutions report a particular consideration of MD related outcomes. This detailed analysis on common procedures of HTA institutions in the context of defining relevant outcome measures for the assessment of MD shows that standardized procedures for MD from the perspective of HTA institutions are not widespread. This leads to the question if a homogenous approach should be implemented in the field of HTA on MD.

  2. Intelligent Work Process Engineering System

    NASA Technical Reports Server (NTRS)

    Williams, Kent E.

    2003-01-01

    Optimizing performance on work activities and processes requires metrics of performance for management to monitor and analyze in order to support further improvements in efficiency, effectiveness, safety, reliability and cost. Information systems are therefore required to assist management in making timely, informed decisions regarding these work processes and activities. Currently, information systems regarding Space Shuttle maintenance and servicing do not exist to support such timely decisions. The work presented details a system which incorporates various automated and intelligent processes and analysis tools to capture, organize and analyze work process related data, to make the necessary decisions to meet KSC organizational goals. The advantages and disadvantages of design alternatives for the development of such a system will be discussed, including technologies which would need to be designed, prototyped and evaluated.

  3. Cognitive task analysis of network analysts and managers for network situational awareness

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.; Frincke, Deborah A.; Wong, Pak Chung; Moody, Sarah; Fink, Glenn

    2010-01-01

    The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The situational-awareness capabilities being developed focus on novel visualization techniques as well as data analysis techniques designed to improve the comprehensibility of the visualizations. The objective is to improve the decision-making process to enable decision makers to choose better actions. To this end, we put extensive effort into ensuring we had feedback from network analysts and managers and understanding what their needs truly are. This paper discusses the cognitive task analysis methodology we followed to acquire feedback from the analysts. This paper also provides the details we acquired from the analysts on their processes, goals, concerns, etc. A final result we describe is the generation of a task-flow diagram.

  4. Flat plate vs. concentrator solar photovoltaic cells - A manufacturing cost analysis

    NASA Technical Reports Server (NTRS)

    Granon, L. A.; Coleman, M. G.

    1980-01-01

    The choice of which photovoltaic system (flat plate or concentrator) to use for utilizing solar cells to generate electricity depends mainly on the cost. A detailed, comparative manufacturing cost analysis of the two types of systems is presented. Several common assumptions, i.e., cell thickness, interest rate, power rate, factory production life, polysilicon cost, and direct labor rate are utilized in this analysis. Process sequences, cost variables, and sensitivity analyses have been studied, and results of the latter show that the most important parameters which determine manufacturing costs are concentration ratio, manufacturing volume, and cell efficiency. The total cost per watt of the flat plate solar cell is $1.45, and that of the concentrator solar cell is $1.85, the higher cost being due to the increased process complexity and material costs.
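    The reported sensitivity drivers can be illustrated with a toy cost-per-watt relation; the functional form, the 15% cell efficiency, and the CR = 25 concentration ratio are simplifying assumptions for illustration, with only the $1.45/W and $1.85/W totals taken from the abstract:

    ```python
    def cost_per_watt(areal_cost_usd_m2, efficiency, concentration=1.0,
                      insolation_w_m2=1000.0):
        """Manufacturing $/W_peak: areal cost divided by peak electrical output."""
        return areal_cost_usd_m2 / (efficiency * insolation_w_m2 * concentration)

    # Hypothetical areal costs chosen to reproduce the reported totals at a
    # 15% cell efficiency (flat plate) and CR = 25 (concentrator).
    print(cost_per_watt(217.5, 0.15))           # -> 1.45 $/W (flat plate)
    print(cost_per_watt(6937.5, 0.15, 25.0))    # -> 1.85 $/W (concentrator)

    # The inverse dependence makes the sensitivity result intuitive:
    # doubling efficiency halves $/W in both configurations.
    print(cost_per_watt(217.5, 0.30))           # -> 0.725 $/W
    ```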

  5. Towards a typology of business process management professionals: identifying patterns of competences through latent semantic analysis

    NASA Astrophysics Data System (ADS)

    Müller, Oliver; Schmiedel, Theresa; Gorbacheva, Elena; vom Brocke, Jan

    2016-01-01

    While researchers have analysed the organisational competences that are required for successful Business Process Management (BPM) initiatives, individual BPM competences have not yet been studied in detail. In this study, latent semantic analysis is used to examine a collection of 1507 BPM-related job advertisements in order to develop a typology of BPM professionals. This empirical analysis reveals distinct ideal types and profiles of BPM professionals on several levels of abstraction. A closer look at these ideal types and profiles confirms that BPM is a boundary-spanning field that requires interdisciplinary sets of competence that range from technical competences to business and systems competences. Based on the study's findings, it is posited that individual and organisational alignment with the identified ideal types and profiles is likely to result in high employability and organisational BPM success.
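    A minimal sketch of the LSA pipeline on job-advertisement text, using scikit-learn; the four toy advertisements stand in for the study's 1507 and are hypothetical:

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.cluster import KMeans

    ads = [
        "business process modelling BPMN stakeholder workshops",
        "process mining event logs Python data analysis",
        "ERP integration SQL middleware process automation",
        "change management communication process governance",
    ]

    # Term-document matrix weighted by TF-IDF.
    tfidf = TfidfVectorizer(stop_words="english")
    X = tfidf.fit_transform(ads)

    # Latent semantic analysis = truncated SVD of the TF-IDF matrix.
    lsa = TruncatedSVD(n_components=2, random_state=0)
    Z = lsa.fit_transform(X)

    # Clustering in the latent space surfaces candidate "ideal types".
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Z)
    print(labels)
    ```

    Grouping advertisements by their latent-space coordinates, rather than raw term overlap, is what lets the method recover competence profiles that span different vocabularies.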

  6. Atomistic details of protein dynamics and the role of hydration water

    DOE PAGES

    Khodadadi, Sheila; Sokolov, Alexei P.

    2016-05-04

    The importance of protein dynamics for their biological activity is nowwell recognized. Different experimental and computational techniques have been employed to study protein dynamics, hierarchy of different processes and the coupling between protein and hydration water dynamics. But, understanding the atomistic details of protein dynamics and the role of hydration water remains rather limited. Based on overview of neutron scattering, molecular dynamic simulations, NMR and dielectric spectroscopy results we present a general picture of protein dynamics covering time scales from faster than ps to microseconds and the influence of hydration water on different relaxation processes. Internal protein dynamics spread overmore » a wide time range fromfaster than picosecond to longer than microseconds. We suggest that the structural relaxation in hydrated proteins appears on the microsecond time scale, while faster processes present mostly motion of side groups and some domains. Hydration water plays a crucial role in protein dynamics on all time scales. It controls the coupled protein-hydration water relaxation on 10 100 ps time scale. Our process defines the friction for slower protein dynamics. Analysis suggests that changes in amount of hydration water affect not only general friction, but also influence significantly the protein's energy landscape.« less

  7. Atomistic details of protein dynamics and the role of hydration water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khodadadi, Sheila; Sokolov, Alexei P.

    The importance of protein dynamics for their biological activity is nowwell recognized. Different experimental and computational techniques have been employed to study protein dynamics, hierarchy of different processes and the coupling between protein and hydration water dynamics. But, understanding the atomistic details of protein dynamics and the role of hydration water remains rather limited. Based on overview of neutron scattering, molecular dynamic simulations, NMR and dielectric spectroscopy results we present a general picture of protein dynamics covering time scales from faster than ps to microseconds and the influence of hydration water on different relaxation processes. Internal protein dynamics spread overmore » a wide time range fromfaster than picosecond to longer than microseconds. We suggest that the structural relaxation in hydrated proteins appears on the microsecond time scale, while faster processes present mostly motion of side groups and some domains. Hydration water plays a crucial role in protein dynamics on all time scales. It controls the coupled protein-hydration water relaxation on 10 100 ps time scale. Our process defines the friction for slower protein dynamics. Analysis suggests that changes in amount of hydration water affect not only general friction, but also influence significantly the protein's energy landscape.« less

  8. An engineering and economic evaluation of quick germ-quick fiber process for dry-grind ethanol facilities: analysis.

    PubMed

    Rodríguez, Luis F; Li, Changying; Khanna, Madhu; Spaulding, Aslihan D; Lin, Tao; Eckhoff, Steven R

    2010-07-01

    An engineering economic model, which is mass balanced and compositionally driven, was developed to compare the conventional corn dry-grind process and the pre-fractionation process called quick germ-quick fiber (QQ). In this model, documented in a companion article, the distillers dried grains with solubles (DDGS) price was linked with its protein and fiber content as well as with the long-term average relationship with the corn price. The detailed economic analysis showed that a QQ plant retrofitted from a conventional dry-grind ethanol plant reduces the manufacturing cost of ethanol by 13.5 cents per gallon and has a net present value nearly $4 million greater than that of the conventional dry-grind plant over 15 years at an interest rate of 4%. Ethanol and feedstock price sensitivity analysis showed that the QQ plant gains more profit when the ethanol price increases than the conventional dry-grind ethanol plant does. An optimistic analysis of the QQ process suggests that the greater value of the modified DDGS would provide greater resistance to fluctuations in corn price for QQ facilities. This model can be used to provide decision support for ethanol producers. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
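    A stylized version of the underlying present-value arithmetic; only the 13.5 cents/gallon saving, the 4% rate, and the 15-year horizon come from the abstract, while the 40 million gal/yr plant size is hypothetical and retrofit capital costs are ignored (which is why this gross figure exceeds the reported $4 million net advantage):

    ```python
    def npv(cashflows, rate):
        """Net present value: sum of CF_t / (1 + r)^t, t starting at 1."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

    annual_saving = 0.135 * 40_000_000           # $/yr from the lower ethanol cost
    gross_advantage = npv([annual_saving] * 15, 0.04)
    print(f"gross PV of QQ savings: ${gross_advantage:,.0f}")
    # Subtracting the (unreported here) retrofit capital cost would give the
    # net NPV advantage the study quotes.
    ```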

  9. Out of the frying pan? Streamlining the ethics review process of multisite qualitative research projects.

    PubMed

    Iedema, Rick A M; Allen, Suellen; Britton, Kate; Hor, Suyin

    2013-05-01

    This paper describes the ethics approval processes for two multicentre, nationwide, qualitative health service research projects. The paper explains that the advent of the National Ethics Application Form has brought many improvements, but that attendant processes put in place at local health network and Human Research Ethics Committee levels may have become significantly more complicated, particularly for innovative qualitative research projects. The paper raises several questions based on its analysis of ethics application processes currently in place. WHAT IS KNOWN ABOUT THE TOPIC? The complexity of multicentre research ethics applications for research in health services has been addressed by the introduction of the National Ethics Application Form. Uptake of the form across the country's human research ethics committees has been uneven. WHAT DOES THIS PAPER ADD? This paper adds detailed insight into the ethics application process as it is currently enacted across the country. The paper details this process with reference to difficulties faced by multisite and qualitative studies in negotiating access to research sites, ethics committees' relative unfamiliarity with qualitative research, and apparent tensions between harmonisation and local sites' autonomy in approving research. WHAT ARE THE IMPLICATIONS FOR PRACTITIONERS? Practitioners aiming to engage in research need to be aware that ethics approval takes place in an uneven procedural landscape, made up of variable levels of ethics approval harmonisation and intricate governance or site-specific assessment processes.

  10. Evaluation of stabilization techniques for ion implant processing

    NASA Astrophysics Data System (ADS)

    Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Narcy, Mark E.; Livesay, William R.

    1999-06-01

    With the integration of high current ion implant processing into volume CMOS manufacturing, the need for photoresist stabilization to achieve a stable ion implant process is critical. This study compares electron beam stabilization, a non-thermal process, with more traditional thermal stabilization techniques such as hot plate baking and vacuum oven processing. The electron beam processing is carried out in a flood exposure system with no active heating of the wafer. These stabilization techniques are applied to typical ion implant processes that might be found in a CMOS production process flow. The stabilization processes are applied to a 1.1-micrometer-thick PFI-38A i-line photoresist film prior to ion implant processing. Post-stabilization CD variation is detailed with respect to wall slope and feature integrity. SEM photographs detail the effects of the stabilization technique on photoresist features. The thermal stability of the photoresist is shown for different levels of stabilization and post-stabilization thermal cycling. Thermal flow stability of the photoresist is detailed via SEM photographs. A significant improvement in thermal stability is achieved with the electron beam process, such that photoresist features are stable to temperatures in excess of 200 degrees C. Ion implant processing parameters are evaluated and compared for the different stabilization methods. Ion implant system end-station chamber pressure is detailed as a function of ion implant process and stabilization condition. The ion implant process conditions are detailed for varying factors such as ion current, energy, and total dose. A reduction in the ion implant system's end-station chamber pressure is achieved with the electron beam stabilization process over the other techniques considered. This reduction in end-station chamber pressure is shown to provide a reduction in total process time for a given ion implant dose. Improvements in the ion implant process are detailed across several combinations of current and energy.

  11. Automatic Fringe Detection for Oil Film Interferometry Measurement of Skin Friction

    NASA Technical Reports Server (NTRS)

    Naughton, Jonathan W.; Decker, Robert K.; Jafari, Farhad

    2001-01-01

    This report summarizes two years of work on investigating algorithms for automatically detecting fringe patterns in images acquired using oil-drop interferometry for the determination of skin friction. Several different analysis methods were tested, and a combination of a windowed Fourier transform followed by a correlation was found to be most effective. The implementation of this method is discussed and details of the process are described. The results indicate that this method shows promise for automating the fringe detection process, but further testing is required.
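    A minimal sketch of the windowed-Fourier-plus-correlation idea on a synthetic 1D fringe profile (parameters hypothetical; the actual method operates on 2D interferogram images):

    ```python
    import numpy as np

    # Synthetic interferogram row: fringes of spatial frequency f0 cycles/pixel.
    n, f0 = 512, 0.05
    x = np.arange(n)
    row = 0.5 + 0.5 * np.cos(2 * np.pi * f0 * x) + 0.05 * np.random.randn(n)

    # Windowed Fourier transform: a Hann window suppresses edge leakage.
    win = np.hanning(n)
    spec = np.abs(np.fft.rfft((row - row.mean()) * win))
    freqs = np.fft.rfftfreq(n, d=1.0)

    f_est = freqs[np.argmax(spec)]          # dominant fringe frequency
    template = np.cos(2 * np.pi * f_est * x)

    # Correlation against a synthetic template localizes the fringe pattern.
    corr = np.correlate(row - row.mean(), template, mode="same")

    print(f"fringe spacing ~ {1.0 / f_est:.1f} pixels")
    ```

    The fringe spacing recovered this way feeds directly into the oil-film relation between fringe growth and local skin friction.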

  12. System software for the finite element machine

    NASA Technical Reports Server (NTRS)

    Crockett, T. W.; Knott, J. D.

    1985-01-01

    The Finite Element Machine is an experimental parallel computer developed at Langley Research Center to investigate the application of concurrent processing to structural engineering analysis. This report describes system-level software which has been developed to facilitate use of the machine by applications researchers. The overall software design is outlined, and several important parallel processing issues are discussed in detail, including processor management, communication, synchronization, and input/output. Based on experience using the system, the hardware architecture and software design are critiqued, and areas for further work are suggested.

  13. Monitoring the spring-summer surface energy budget transition in the Gobi Desert using AVHRR GAC data. [Global Area Coverage

    NASA Technical Reports Server (NTRS)

    Smith, Eric A.; Reiter, Elmar R.

    1986-01-01

    A research program has been started in which operationally available weather satellite radiance data are used to reconstruct various properties of the diurnal surface energy budget over sites for which detailed estimates of the complete radiation, heat, and moisture exchange processes are available. In this paper, a preliminary analysis of the 1985 Gobi Desert summer period results is presented. The findings demonstrate various important relationships concerning the feasibility of retrieving the amplitudes of the diurnal surface energy budget processes for daytime and nighttime conditions.

  14. Spacelab data analysis and interactive control study

    NASA Technical Reports Server (NTRS)

    Tarbell, T. D.; Drake, J. F.

    1980-01-01

    The study consisted of two main tasks, a series of interviews of Spacelab users and a survey of data processing and display equipment. Findings from the user interviews on questions of interactive control, downlink data formats, and Spacelab computer software development are presented. Equipment for quick look processing and display of scientific data in the Spacelab Payload Operations Control Center (POCC) was surveyed. Results of this survey effort are discussed in detail, along with recommendations for NASA development of several specific display systems which meet common requirements of many Spacelab experiments.

  15. The (Mathematical) Modeling Process in Biosciences.

    PubMed

    Torres, Nestor V; Santos, Guido

    2015-01-01

    In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model, and optimization and system management derived from the analysis of the mathematical model. Throughout this work, the main features and shortcomings of the process are analyzed, and a set of rules that could help in the task of modeling any biological system is presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology.

  16. Pathways towards instability in financial networks

    NASA Astrophysics Data System (ADS)

    Bardoscia, Marco; Battiston, Stefano; Caccioli, Fabio; Caldarelli, Guido

    2017-02-01

    Following the financial crisis of 2007-2008, a deep analogy between the origins of instability in financial systems and complex ecosystems has been pointed out: in both cases, topological features of network structures influence how easily distress can spread within the system. However, in financial network models, the details of how financial institutions interact typically play a decisive role, and a general understanding of precisely how network topology creates instability remains lacking. Here we show how processes that are widely believed to stabilize the financial system, that is, market integration and diversification, can actually drive it towards instability, as they contribute to create cyclical structures which tend to amplify financial distress, thereby undermining systemic stability and making large crises more likely. This result holds irrespective of the details of how institutions interact, showing that policy-relevant analysis of the factors affecting financial stability can be carried out while abstracting away from such details.
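    A stylized illustration of the cycle mechanism (not the paper's exact dynamics): in a linearized distress-propagation model, shocks amplify when the spectral radius of the interbank exposure/leverage matrix exceeds one, and rewiring the same link weights into a cycle lifts the spectral radius off zero:

    ```python
    import numpy as np

    def spectral_radius(A):
        """Largest eigenvalue magnitude of a matrix."""
        return max(abs(np.linalg.eigvals(A)))

    # Three banks, chain of exposures (acyclic): 1 -> 2 -> 3.
    chain = np.array([[0.0, 0.6, 0.0],
                      [0.0, 0.0, 0.6],
                      [0.0, 0.0, 0.0]])

    # Same link weights, rewired so the exposures close a loop: 1 -> 2 -> 3 -> 1.
    cycle = np.array([[0.0, 0.6, 0.0],
                      [0.0, 0.0, 0.6],
                      [0.6, 0.0, 0.0]])

    print(spectral_radius(chain))   # 0.0 -- distress dies out along the chain
    print(spectral_radius(cycle))   # 0.6 -- distress reverberates around the loop
    ```

    Market integration and diversification add links, and every new link that closes a cycle pushes the system in the same direction, which is the intuition behind the paper's topology-only result.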

  17. Pathways towards instability in financial networks

    PubMed Central

    Bardoscia, Marco; Battiston, Stefano; Caccioli, Fabio; Caldarelli, Guido

    2017-01-01

    Following the financial crisis of 2007–2008, a deep analogy between the origins of instability in financial systems and complex ecosystems has been pointed out: in both cases, topological features of network structures influence how easily distress can spread within the system. However, in financial network models, the details of how financial institutions interact typically play a decisive role, and a general understanding of precisely how network topology creates instability remains lacking. Here we show how processes that are widely believed to stabilize the financial system, that is, market integration and diversification, can actually drive it towards instability, as they contribute to create cyclical structures which tend to amplify financial distress, thereby undermining systemic stability and making large crises more likely. This result holds irrespective of the details of how institutions interact, showing that policy-relevant analysis of the factors affecting financial stability can be carried out while abstracting away from such details. PMID:28221338

  18. Pathways towards instability in financial networks.

    PubMed

    Bardoscia, Marco; Battiston, Stefano; Caccioli, Fabio; Caldarelli, Guido

    2017-02-21

    Following the financial crisis of 2007-2008, a deep analogy between the origins of instability in financial systems and complex ecosystems has been pointed out: in both cases, topological features of network structures influence how easily distress can spread within the system. However, in financial network models, the details of how financial institutions interact typically play a decisive role, and a general understanding of precisely how network topology creates instability remains lacking. Here we show how processes that are widely believed to stabilize the financial system, that is, market integration and diversification, can actually drive it towards instability, as they contribute to create cyclical structures which tend to amplify financial distress, thereby undermining systemic stability and making large crises more likely. This result holds irrespective of the details of how institutions interact, showing that policy-relevant analysis of the factors affecting financial stability can be carried out while abstracting away from such details.

  19. A detail enhancement and dynamic range adjustment algorithm for high dynamic range images

    NASA Astrophysics Data System (ADS)

    Xu, Bo; Wang, Huachuang; Liang, Mingtao; Yu, Cong; Hu, Jinlong; Cheng, Hua

    2014-08-01

    Although high dynamic range (HDR) images contain large amounts of information, they have weak texture and low contrast, and they are difficult to reproduce on low dynamic range display media. To extract more information when these images are displayed on PCs, specific transforms are needed, such as compressing the dynamic range, enhancing regions of little original contrast, and highlighting texture details while preserving regions of large contrast. To this end, a multi-scale guided filter enhancement algorithm, derived from the single-scale guided filter and based on the analysis of a non-physical model, is proposed in this paper. The algorithm first decomposes the original HDR image into a base image and detail images of different scales, and then adaptively selects a transform function that acts on the enhanced detail images and the original image. Comparison of results on HDR and low dynamic range (LDR) images of different scene features shows that this algorithm, while maintaining the hierarchy and texture details of the images, not only improves contrast and enhances detail but also adjusts the dynamic range well. It is thus well suited for human observation or machine analysis.
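
    The core decomposition idea can be sketched as follows. A Gaussian filter stands in for the guided filter used in the paper, and the scales, gains, and compression factor are assumed values chosen only for illustration.

```python
# Illustrative sketch of multi-scale base/detail decomposition for dynamic
# range compression. A Gaussian filter stands in for the guided filter.
import numpy as np
from scipy.ndimage import gaussian_filter

def enhance(hdr, sigmas=(2.0, 8.0), gains=(2.0, 1.5), compress=0.6):
    img = np.log1p(hdr.astype(np.float64))    # work in the log domain
    base, details = img, []
    for s in sigmas:                          # peel off detail layers
        smooth = gaussian_filter(base, s)
        details.append(base - smooth)
        base = smooth
    base *= compress                          # compress the base layer
    for d, g in zip(details, gains):          # boost the detail layers
        base += g * d
    out = np.expm1(base)
    return out / out.max()                    # normalize for display

hdr = np.random.rand(64, 64) * 1e4            # stand-in HDR radiance map
ldr = enhance(hdr)
print(ldr.min(), ldr.max())
```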

  20. MATLAB for laser speckle contrast analysis (LASCA): a practice-based approach

    NASA Astrophysics Data System (ADS)

    Postnikov, Eugene B.; Tsoy, Maria O.; Postnov, Dmitry E.

    2018-04-01

    Laser Speckle Contrast Analysis (LASCA) is one of the most powerful modern methods for revealing blood dynamics. The experimental design and theory for this method are well established, and the computational recipe is often regarded as trivial. However, the achieved performance and spatial resolution may differ considerably between implementations. We present a mini-review of known approaches to spatial laser speckle contrast data processing and their realization in MATLAB code, providing an explicit correspondence to the mathematical representation and a discussion of available implementations. We also present an algorithm based on the 2D Haar wavelet transform, likewise supplied with program code. This new method makes it possible to introduce horizontal, vertical, and diagonal speckle contrasts; it may be used for processing highly anisotropic images of vascular trees. We provide a comparative analysis of the accuracy of vascular pattern detection and of processing times, with special attention to the details of the MATLAB procedures used.
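
    The spatial contrast computation at the heart of LASCA is the ratio of the local standard deviation to the local mean of the speckle image. A minimal Python rendering is given below (the paper's code is in MATLAB); the window size and the synthetic test frame are assumptions.

```python
# Spatial speckle contrast K = sigma / mean over a sliding window -- the
# standard LASCA definition. Window size and test frame are illustrative.
import numpy as np
from scipy.ndimage import uniform_filter

def spatial_contrast(frame, win=7):
    frame = frame.astype(np.float64)
    mean = uniform_filter(frame, win)
    mean_sq = uniform_filter(frame**2, win)
    var = np.maximum(mean_sq - mean**2, 0.0)   # guard against round-off
    return np.sqrt(var) / (mean + 1e-12)

speckle = np.random.gamma(shape=1.0, scale=100.0, size=(128, 128))
K = spatial_contrast(speckle)
print(K.mean())   # ~1 for fully developed static speckle
```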

  1. Characterisation of two-stage ignition in diesel engine-relevant thermochemical conditions using direct numerical simulation

    DOE PAGES

    Krisman, Alex; Hawkes, Evatt R.; Talei, Mohsen; ...

    2016-08-30

    With the goal of providing a more detailed fundamental understanding of ignition processes in diesel engines, this study reports analysis of a direct numerical simulation (DNS) database. In the DNS, a pseudo turbulent mixing layer of dimethyl ether (DME) at 400 K and air at 900 K is simulated at a pressure of 40 atmospheres. At these conditions, DME exhibits a two-stage ignition and resides within the negative temperature coefficient (NTC) regime of ignition delay times, similar to diesel fuel. The analysis reveals a complex ignition process with several novel features. Autoignition occurs as a distributed, two-stage event. The high-temperature stage of ignition establishes edge flames that have a hybrid premixed/autoignition flame structure similar to that previously observed for lifted laminar flames at similar thermochemical conditions. In conclusion, a combustion mode analysis based on key radical species illustrates the multi-stage and multi-mode nature of the ignition process and highlights the substantial modelling challenge presented by diesel combustion.

  2. An updated comprehensive techno-economic analysis of algae biodiesel.

    PubMed

    Nagarajan, Sanjay; Chou, Siaw Kiang; Cao, Shenyan; Wu, Chen; Zhou, Zhi

    2013-10-01

    Algae biodiesel is a promising but expensive alternative fuel to petro-diesel. To overcome cost barriers, detailed cost analyses are needed. A decade-old cost analysis by the U.S. National Renewable Energy Laboratory indicated that the costs of algae biodiesel were in the range of $0.53-0.85/L (2012 USD values). However, the costs of land and transesterification were only roughly estimated. In this study, an updated comprehensive techno-economic analysis was conducted with optimized processes and improved cost estimations. The latest process improvements, quotes from vendors, government databases, and other relevant data sources were used to calculate the updated algal biodiesel costs, and the final costs of biodiesel are in the range of $0.42-0.97/L. Additional improvements for cost-effective algae cultivation and biodiesel production around the globe were also recommended. Overall, the calculated costs seem promising, suggesting that a single-step biodiesel production process is close to commercial reality. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Practical, transparent prospective risk analysis for the clinical laboratory.

    PubMed

    Janssens, Pim Mw

    2014-11-01

    Prospective risk analysis (PRA) is an essential element in quality assurance for clinical laboratories. Practical approaches to conducting PRA in laboratories, however, are scarce. On the basis of the classical Failure Mode and Effect Analysis method, an approach to PRA was developed for application to key laboratory processes. First, the separate, major steps of the process under investigation are identified. Scores are then given for the Probability (P) and Consequence (C) of predefined types of failures and the chances of Detecting (D) these failures. Based on the P and C scores (on a 10-point scale), an overall Risk score (R) is calculated. The scores for each process were recorded in a matrix table. Based on predetermined criteria for R and D, it was determined whether a more detailed analysis was required for potential failures and, ultimately, where risk-reducing measures were necessary, if any. As an illustration, this paper presents the results of the application of PRA to our pre-analytical and analytical activities. The highest R scores were obtained in the stat processes, the most common failure type across the collective process steps was 'delayed processing or analysis', the failure type with the highest mean R score was 'inappropriate analysis', and the failure type most frequently rated as suboptimal was 'identification error'. The PRA designed here is a useful semi-objective tool to identify process steps with potential failures rated as risky. Its systematic design and convenient output in matrix tables make it easy to perform, practical, and transparent. © The Author(s) 2014. Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
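
    A minimal sketch of this scoring logic follows: P, C, and D are scored on 10-point scales, an overall R is computed from P and C, and predefined criteria on R and D flag steps for follow-up. Here R is taken as the product P * C, a common convention; the paper's exact formula may differ, and the step names and thresholds are hypothetical.

```python
# Sketch of FMEA-style risk scoring for laboratory process steps.
# Step names, scores, and thresholds are illustrative assumptions.
steps = [
    ("sample collection", {"P": 3, "C": 7, "D": 2}),
    ("stat analysis",     {"P": 4, "C": 9, "D": 6}),
    ("result reporting",  {"P": 2, "C": 8, "D": 3}),
]

R_LIMIT, D_LIMIT = 25, 5   # assumed criteria for detailed follow-up

for name, s in steps:
    R = s["P"] * s["C"]                      # overall risk from P and C
    flagged = R >= R_LIMIT or s["D"] >= D_LIMIT
    print(f"{name:18s} R = {R:3d}  D = {s['D']}  follow-up: {flagged}")
```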

  4. Temporally flexible feedback signal to foveal cortex for peripheral object recognition

    PubMed Central

    Fan, Xiaoxu; Wang, Lan; Shao, Hanyu; Kersten, Daniel; He, Sheng

    2016-01-01

    Recent studies have shown that information from peripherally presented images is present in the human foveal retinotopic cortex, presumably because of feedback signals. We investigated this potential feedback signal by presenting noise in the fovea at different object–noise stimulus onset asynchronies (SOAs) while subjects performed a discrimination task on peripheral objects. Results revealed a selective impairment of performance when foveal noise was presented at 250-ms SOA, but only for tasks that required comparing objects’ spatial details, suggesting a task- and stimulus-dependent foveal processing mechanism. Critically, the temporal window of foveal processing was shifted when mental rotation was required for the peripheral objects, indicating that the foveal retinotopic processing is not automatically engaged at a fixed time following peripheral stimulation; rather, it occurs at a stage when detailed information is required. Moreover, fMRI measurements using multivoxel pattern analysis showed that both image and object category-relevant information of peripheral objects was represented in the foveal cortex. Taken together, our results support the hypothesis of a temporally flexible feedback signal to the foveal retinotopic cortex when discriminating objects in the visual periphery. PMID:27671651

  5. Refractive microlensarray made of silver-halide sensitized gelatin (SHSG) etched by enzyme with SLM-based lithography

    NASA Astrophysics Data System (ADS)

    Guo, Xiaowei; Chen, Mingyong; Zhu, Jianhua; Ma, Yanqin; Du, Jinglei; Guo, Yongkang; Du, Chunlei

    2006-01-01

    A novel method for the fabrication of continuous micro-optical components is presented in this paper. It employs a computer-controlled digital micromirror device (DMD™) as a switchable projection mask and silver-halide sensitized gelatin (SHSG) as the recording material. By etching the SHSG with an enzyme solution, micro-optical components with relief modulation can be generated through special processing procedures. The principles of etching SHSG with enzyme and a theoretical analysis of deep etching are discussed in detail, and detailed quantitative experiments on the processing procedures were conducted to determine optimum technique parameters. A good linear relationship between exposure dose and relief depth was obtained experimentally within a depth range of 4 μm. Finally, a microlens array with 256.8 μm radius and 2.572 μm depth was achieved. The method is simple and cheap, and aberrations introduced in processing can be corrected at the mask design step, making it a practical way to fabricate good continuous profiles for low-volume production.

  6. Research and Implementation of Heart Sound Denoising

    NASA Astrophysics Data System (ADS)

    Liu, Feng; Wang, Yutai; Wang, Yanxiang

    The heart sound is one of the most important physiological signals, but its acquisition can be disturbed by many external factors. Because the heart sound is a weak signal, even mild external noise may lead to misjudgment of the pathological and physiological information it carries, and thus to misdiagnosis. Removing the noise mixed with the heart sound is therefore a key task. In this paper, a systematic study of heart sound denoising based on MATLAB is presented. The method first transforms the noisy heart sound signal into the wavelet domain and decomposes it at multiple levels. Soft thresholding is then applied to the detail coefficients to eliminate noise, which significantly improves the denoising result. The signal is reconstructed by stepwise coefficient reconstruction from the processed detail coefficients. Finally, 50 Hz power-line and 35 Hz electromechanical interference are eliminated using notch filters.
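
    A minimal sketch of this denoising chain, in Python rather than MATLAB, is given below: multi-level wavelet decomposition, soft thresholding of the detail coefficients, reconstruction, and notch filtering at 50 Hz and 35 Hz. The wavelet, decomposition level, threshold rule, and synthetic input signal are assumptions.

```python
# Sketch of wavelet soft-threshold denoising plus notch filtering.
# Wavelet choice, level, and threshold rule are assumed; input is synthetic.
import numpy as np
import pywt
from scipy.signal import iirnotch, filtfilt

fs = 2000.0                                    # sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
clean = np.sin(2 * np.pi * 30 * t)             # stand-in heart sound
noisy = clean + 0.3 * np.random.randn(t.size) + 0.2 * np.sin(2 * np.pi * 50 * t)

# Multi-level decomposition and soft thresholding of detail coefficients.
coeffs = pywt.wavedec(noisy, "db6", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate
thr = sigma * np.sqrt(2 * np.log(noisy.size))           # universal threshold
coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db6")[: noisy.size]

# Notch out 50 Hz power-line and 35 Hz electromechanical interference.
for f0 in (50.0, 35.0):
    b, a = iirnotch(f0, Q=30.0, fs=fs)
    denoised = filtfilt(b, a, denoised)
print(np.std(noisy - clean), np.std(denoised - clean))
```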

  7. Dense image matching of terrestrial imagery for deriving high-resolution topographic properties of vegetation locations in alpine terrain

    NASA Astrophysics Data System (ADS)

    Niederheiser, R.; Rutzinger, M.; Bremer, M.; Wichmann, V.

    2018-04-01

    The investigation of changes in spatial patterns of vegetation and identification of potential micro-refugia requires detailed topographic and terrain information. However, mapping alpine topography at very detailed scales is challenging due to limited accessibility of sites. Close-range sensing by photogrammetric dense matching approaches based on terrestrial images captured with hand-held cameras offers a light-weight and low-cost solution to retrieve high-resolution measurements even in steep terrain and at locations, which are difficult to access. We propose a novel approach for rapid capturing of terrestrial images and a highly automated processing chain for retrieving detailed dense point clouds for topographic modelling. For this study, we modelled 249 plot locations. For the analysis of vegetation distribution and location properties, topographic parameters, such as slope, aspect, and potential solar irradiation were derived by applying a multi-scale approach utilizing voxel grids and spherical neighbourhoods. The result is a micro-topography archive of 249 alpine locations that includes topographic parameters at multiple scales ready for biogeomorphological analysis. Compared with regional elevation models at larger scales and traditional 2D gridding approaches to create elevation models, we employ analyses in a fully 3D environment that yield much more detailed insights into interrelations between topographic parameters, such as potential solar irradiation, surface area, aspect and roughness.

  8. Abstractions for DNA circuit design.

    PubMed

    Lakin, Matthew R; Youssef, Simon; Cardelli, Luca; Phillips, Andrew

    2012-03-07

    DNA strand displacement techniques have been used to implement a broad range of information processing devices, from logic gates, to chemical reaction networks, to architectures for universal computation. Strand displacement techniques enable computational devices to be implemented in DNA without the need for additional components, allowing computation to be programmed solely in terms of nucleotide sequences. A major challenge in the design of strand displacement devices has been to enable rapid analysis of high-level designs while also supporting detailed simulations that include known forms of interference. Another challenge has been to design devices capable of sustaining precise reaction kinetics over long periods, without relying on complex experimental equipment to continually replenish depleted species over time. In this paper, we present a programming language for designing DNA strand displacement devices, which supports progressively increasing levels of molecular detail. The language allows device designs to be programmed using a common syntax and then analysed at varying levels of detail, with or without interference, without needing to modify the program. This allows a trade-off to be made between the level of molecular detail and the computational cost of analysis. We use the language to design a buffered architecture for DNA devices, capable of maintaining precise reaction kinetics for a potentially unbounded period. We test the effectiveness of buffered gates to support long-running computation by designing a DNA strand displacement system capable of sustained oscillations.

  9. Molecular details of secretory phospholipase A2 from flax (Linum usitatissimum L.) provide insight into its structure and function.

    PubMed

    Gupta, Payal; Dash, Prasanta K

    2017-09-11

    Secretory phospholipases A2 (sPLA2) are low-molecular-weight proteins (12-18 kDa) involved in a suite of plant cellular processes underpinning growth and development. Despite their myriad roles in plant physiological and biochemical processes, detailed analysis of sPLA2 in flax/linseed is meagre. The present work, the first in flax, comprises the cloning, expression, purification, and molecular characterisation of two distinct sPLA2s (I and II) from flax. The PLA2 activity of the cloned sPLA2s was biochemically assayed, authenticating them as bona fide phospholipases A2. Physiochemical characterisation revealed both sPLA2s to be thermostable proteins requiring divalent cations for optimum activity. While structural analysis of the two proteins revealed deviations in the amino acid sequence at the C- and N-terminal regions, hydropathic study revealed LusPLA2 I as a hydrophobic protein and LusPLA2 II as a hydrophilic protein. Structural analysis of the flax sPLA2s showed that the secondary structures of both proteins are dominated by α-helices followed by random coils. Modular superimposition of the LusPLA2 isoforms onto rice sPLA2 confirmed monomeric structural preservation among plant phospholipases A2 and provided insight into the structure of folded flax sPLA2s.

  10. A detailed comparison of analysis processes for MCC-IMS data in disease classification—Automated methods can replace manual peak annotations

    PubMed Central

    Horsch, Salome; Kopczynski, Dominik; Kuthe, Elias; Baumbach, Jörg Ingo; Rahmann, Sven

    2017-01-01

    Motivation: Disease classification from molecular measurements typically requires an analysis pipeline from raw noisy measurements to final classification results. Multi-capillary column ion mobility spectrometry (MCC-IMS) is a promising technology for the detection of volatile organic compounds in the air of exhaled breath. From raw measurements, the peak regions representing the compounds have to be identified, quantified, and clustered across different experiments. Currently, several steps of this analysis process require manual intervention of human experts. Our goal is to identify a fully automatic pipeline that yields competitive disease classification results compared to an established but subjective and tedious semi-manual process. Method: We combine a large number of modern methods for peak detection, peak clustering, and multivariate classification into analysis pipelines for raw MCC-IMS data. We evaluate all combinations on three different real datasets in an unbiased cross-validation setting. We determine which specific algorithmic combinations lead to high AUC values in disease classifications across the different medical application scenarios. Results: The best fully automated analysis process achieves even better classification results than the established manual process. The best algorithms for the three analysis steps are (i) SGLTR (Savitzky-Golay Laplace-operator filter thresholding regions) and LM (Local Maxima) for automated peak identification, (ii) EM clustering (Expectation Maximization) and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) for the clustering step, and (iii) RF (Random Forest) for multivariate classification. Thus, automated methods can replace the manual steps in the analysis process to enable an unbiased high-throughput use of the technology. PMID:28910313
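
    The last two stages of the winning combination can be sketched with standard library calls: DBSCAN for the clustering step and a cross-validated Random Forest scored by AUC. The peak lists and labels below are random stand-ins for real MCC-IMS detections, so the printed AUC only demonstrates the mechanics.

```python
# Sketch of the clustering and classification stages under assumed data.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Stand-in peak positions: (retention time, inverse mobility) around 3 centers.
peaks = np.concatenate([rng.normal(loc=c, scale=0.01, size=(100, 2))
                        for c in ((0.2, 0.3), (0.5, 0.7), (0.8, 0.4))])
clusters = DBSCAN(eps=0.05, min_samples=5).fit_predict(peaks)
n_clusters = max(clusters.max() + 1, 1)       # -1 marks noise points

# One intensity-per-cluster feature vector per measurement (assumed 40
# measurements with binary disease labels).
X = rng.uniform(size=(40, n_clusters))
y = rng.integers(0, 2, size=40)

auc = cross_val_score(RandomForestClassifier(n_estimators=200, random_state=0),
                      X, y, cv=5, scoring="roc_auc")
print("clusters:", n_clusters, " cross-validated AUC:", auc.mean().round(2))
```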

  11. Neural classifier in the estimation process of maturity of selected varieties of apples

    NASA Astrophysics Data System (ADS)

    Boniecki, P.; Piekarska-Boniecka, H.; Koszela, K.; Zaborowicz, M.; Przybył, K.; Wojcieszak, D.; Zbytek, Z.; Ludwiczak, A.; Przybylak, A.; Lewicki, A.

    2015-07-01

    This paper presents methods of neural image analysis for estimating the maturity of selected apple varieties popular in Poland. The degree of maturity was identified on the basis of information encoded in graphical form in digital photographs. The process applies the BBCH scale, which is widely used in the EU to determine the maturity of apples; the scale has been developed for many species of monocotyledonous and dicotyledonous plants and enables detailed determination of the development stage of a given plant. The purpose of this work is to identify the maturity level of selected apple varieties using image analysis methods and classification techniques based on artificial neural networks. The analysis of representative graphical features extracted by image analysis enabled the assessment of apple maturity. For practical use, the "JabVis 1.1" neural IT system was created in accordance with software engineering requirements, to support decision-making in the broadly understood apple production and processing chain.
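
    As an illustration of the classification step only, the sketch below trains a small neural network on per-image color and texture features. The features, class labels, and network size are hypothetical; the actual "JabVis 1.1" system is not reproduced here.

```python
# Illustrative maturity classification from simple image features.
# Features and labels are synthetic stand-ins, not the paper's data.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Assume each apple image was reduced to mean R, G, B plus a texture measure.
X = rng.uniform(size=(200, 4))
y = rng.integers(0, 3, size=200)      # three BBCH-derived maturity classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te).round(2))
```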

  12. Molecular analysis of post-harvest withering in grape by AFLP transcriptional profiling

    PubMed Central

    Zamboni, Anita; Minoia, Leone; Ferrarini, Alberto; Tornielli, Giovanni Battista; Zago, Elisa; Delledonne, Massimo; Pezzotti, Mario

    2008-01-01

    Post-harvest withering of grape berries is used in the production of dessert and fortified wines to alter must quality characteristics and increase the concentration of simple sugars. The molecular processes that occur during withering are poorly understood, so a detailed transcriptomic analysis of post-harvest grape berries was carried out by AFLP-transcriptional profiling analysis. This will help to elucidate the molecular mechanisms of berry withering and will provide an opportunity to select markers that can be used to follow the drying process and evaluate different drying techniques. AFLP-TP identified 699 withering-specific genes, 167 and 86 of which were unique to off-plant and on-plant withering, respectively. Although similar molecular events were revealed in both withering processes, it was apparent that off-plant withering induced a stronger dehydration stress response resulting in the high level expression of genes involved in stress protection mechanisms, such as dehydrin and osmolyte accumulation. Genes involved in hexose metabolism and transport, cell wall composition, and secondary metabolism (particularly the phenolic and terpene compound pathways) were similarly regulated in both processes. This work provides the first comprehensive analysis of the molecular events underpinning post-harvest withering and could help to define markers for different withering processes. PMID:19010774

  13. Molecular analysis of post-harvest withering in grape by AFLP transcriptional profiling.

    PubMed

    Zamboni, Anita; Minoia, Leone; Ferrarini, Alberto; Tornielli, Giovanni Battista; Zago, Elisa; Delledonne, Massimo; Pezzotti, Mario

    2008-01-01

    Post-harvest withering of grape berries is used in the production of dessert and fortified wines to alter must quality characteristics and increase the concentration of simple sugars. The molecular processes that occur during withering are poorly understood, so a detailed transcriptomic analysis of post-harvest grape berries was carried out by AFLP-transcriptional profiling analysis. This will help to elucidate the molecular mechanisms of berry withering and will provide an opportunity to select markers that can be used to follow the drying process and evaluate different drying techniques. AFLP-TP identified 699 withering-specific genes, 167 and 86 of which were unique to off-plant and on-plant withering, respectively. Although similar molecular events were revealed in both withering processes, it was apparent that off-plant withering induced a stronger dehydration stress response resulting in the high level expression of genes involved in stress protection mechanisms, such as dehydrin and osmolyte accumulation. Genes involved in hexose metabolism and transport, cell wall composition, and secondary metabolism (particularly the phenolic and terpene compound pathways) were similarly regulated in both processes. This work provides the first comprehensive analysis of the molecular events underpinning post-harvest withering and could help to define markers for different withering processes.

  14. An efficient liner cooling scheme for advanced small gas turbine combustors

    NASA Technical Reports Server (NTRS)

    Paskin, Marc D.; Mongia, Hukam C.; Acosta, Waldo A.

    1993-01-01

    A joint Army/NASA program was conducted to design, fabricate, and test an advanced, small gas turbine, reverse-flow combustor utilizing a compliant metal/ceramic (CMC) wall cooling concept. The objectives of this effort were to develop a design method (basic design data base and analysis) for the CMC cooling technique and then demonstrate its application to an advanced cycle, small, reverse-flow combustor with 3000 F burner outlet temperature. The CMC concept offers significant improvements in wall cooling effectiveness resulting in a large reduction in cooling air requirements. Therefore, more air is available for control of burner outlet temperature pattern in addition to the benefits of improved efficiency, reduced emissions, and lower smoke levels. The program was divided into four tasks. Task 1 defined component materials and localized design of the composite wall structure in conjunction with development of basic design models for the analysis of flow and heat transfer through the wall. Task 2 included implementation of the selected materials and validated design models during combustor preliminary design. Detail design of the selected combustor concept and its refinement with 3D aerothermal analysis were completed in Task 3. Task 4 covered detail drawings, process development and fabrication, and a series of burner rig tests. The purpose of this paper is to provide details of the investigation into the fundamental flow and heat transfer characteristics of the CMC wall structure as well as implementation of the fundamental analysis method for full-scale combustor design.

  15. Picture processing analysis of the optical structure of NGC 5128 /Centaurus A/

    NASA Technical Reports Server (NTRS)

    Dufour, R. J.; Harvel, C. A.; Martins, D. M.; Schiffer, F. H., III; Talent, D. L.; Wells, D. C.; Van Den Bergh, S.; Talbot, R. J., Jr.

    1979-01-01

    Results are presented for a detailed study of the peculiar elliptical galaxy NGC 5128 (Cen A), based on computer video analysis of several photographic plates of exceptional quality reduced to the standard UBV system. The picture-processing results and the measured properties of the elliptical and gaseous-disk components of NGC 5128 are examined, along with the distribution, spectral characteristics, and chemical composition of the H II regions in the disk. The data show that NGC 5128 consists of a giant E2 galaxy containing a significant amount of gas and dust situated predominantly in an equatorial disk where vigorous star formation is occurring. Reasons why NGC 5128 is so different from giant ellipticals in clusters are considered.

  16. Magnetic Nanofluid Rare Earth Element Extraction Process Report, Techno Economic Analysis, and Results for Geothermal Fluids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pete McGrail

    This GDR submission is an interim technical report and raw data files from the first year of testing on functionalized nanoparticles for rare earth element extraction from geothermal fluids. The report contains rare earth element uptake results (percent removal, mg rare earth element/gram of sorbent, distribution coefficient) for the elements neodymium, europium, yttrium, dysprosium, and cesium. A detailed techno-economic analysis is also presented in the report for a scaled-up geothermal rare earth element extraction process. All rare earth element uptake testing was done on simulated geothermal brines with one rare earth element in each brine. The rare earth element uptake testing was conducted at room temperature.

  17. Processes that Inform Multicultural Supervision: A Qualitative Meta-Analysis.

    PubMed

    Tohidian, Nilou B; Quek, Karen Mui-Teng

    2017-10-01

    As the fields of counseling and psychotherapy have become more cognizant that individuals, couples, and families bring with them a myriad of diversity factors into therapy, multicultural competency has also become a crucial component in the development of clinicians during clinical supervision and training. We employed a qualitative meta-analysis to provide a detailed and comprehensive description of similar themes identified in primary qualitative studies that have investigated supervisory practices with an emphasis on diversity. Findings revealed six meta-categories, namely: (a) Supervisor's Multicultural Stances; (b) Supervisee's Multicultural Encounters; (c) Competency-Based Content in Supervision; (d) Processes Surrounding Multicultural Supervision; (e) Culturally Attuned Interventions; and (f) Multicultural Supervisory Alliance. Implications for practice are discussed. © 2017 American Association for Marriage and Family Therapy.

  18. Study of Deformation Phenomena in TRIP/TWIP Steels by Acoustic Emission and Scanning Electron Microscopy

    NASA Astrophysics Data System (ADS)

    Linderov, M. L.; Segel, C.; Weidner, A.; Biermann, H.; Vinogradov, A. Yu.

    2018-04-01

    Modern metastable steels with TRIP/TWIP effects have a unique set of physical-mechanical properties. They combine both high-strength and high-plasticity characteristics, which is governed by processes activated during deformation, namely, twinning, the formation of stacking faults, and martensitic transformations. To study the behavior of these phenomena in CrMnNi TRIP/TWIP steels and stainless CrNiMo steel, which does not have these effects in the temperature range under study, we used the method of acoustic emission and modern methods of signal processing, including the cluster analysis of spectral-density functions. The results of this study have been compared with a detailed microstructural analysis performed with a scanning electron microscope using electron backscatter diffraction (EBSD).

  19. A thematic analysis of the strengths and weaknesses of manufacturers' submissions to the NICE Single Technology Assessment (STA) process.

    PubMed

    Carroll, Christopher; Kaltenthaler, Eva; FitzGerald, Patrick; Boland, Angela; Dickson, Rumona

    2011-10-01

    The NICE Single Technology Appraisal (STA) process in the UK has been underway for five years. Evidence Review Groups (ERGs) critically appraise submissions from manufacturers on the clinical and cost effectiveness of new technologies. This study analysed the ERGs' assessment of the strengths and weaknesses of 30 manufacturers' submissions to the STA process. Thematic analysis was performed on the textual descriptions of the strengths and weakness of manufacturer submissions, as outlined by the ERGs in their reports. Various themes emerged from the data. These themes related to the processes applied in the submissions; the content of the submission (e.g. the amount and quality of evidence); the reporting of the submissions' review and analysis processes; the reliability and validity of the submissions' findings; and how far the submission had satisfied the STA process objectives. STA submissions could be improved if attention were paid to transparency in the reporting, conduct and justification of review and modelling processes and analyses, as well as greater robustness in the choice of data and closer adherence to the scope or decision problem. Where this adherence is not possible, more detailed justification of the choice of evidence or data is required. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  20. Measuring silicon pore optics

    NASA Astrophysics Data System (ADS)

    Vacanti, Giuseppe; Barrière, Nicolas; Bavdaz, Marcos; Chatbi, Abdelhakim; Collon, Maximilien; Dekker, Daniëlle; Girou, David; Günther, Ramses; van der Hoeven, Roy; Krumrey, Michael; Landgraf, Boris; Müller, Peter; Schreiber, Swenja; Vervest, Mark; Wille, Eric

    2017-09-01

    While predictions based on metrology (local slope errors and detailed geometry) play an essential role in controlling the development of the manufacturing processes, X-ray characterization remains the ultimate indication of the actual performance of Silicon Pore Optics (SPO). For this reason, SPO stacks and mirror modules are routinely characterized at PTB's X-ray Pencil Beam Facility at BESSY II. Obtaining standard X-ray results quickly, right after the production of X-ray optics, is essential to making sure that X-ray results can inform decisions taken in the lab. We describe the data analysis pipeline in operation at cosine, and how it allows us to go from stack production to full X-ray characterization in 24 hours.

  1. JTEC (Japanese Technology Evaluation Center) Panel Report on High Temperature Superconductivity in Japan

    NASA Technical Reports Server (NTRS)

    Shelton, Duane; Gamota, George

    1989-01-01

    The Japanese regard success in R and D in high temperature superconductivity as an important national objective. The results of a detailed evaluation of the current state of Japanese high temperature superconductivity development are provided. The analysis was performed by a panel of technical experts drawn from U.S. industry and academia, and is based on reviews of the relevant literature and visits to Japanese government, academic and industrial laboratories. Detailed appraisals are presented on the following: Basic research; superconducting materials; large scale applications; processing of superconducting materials; superconducting electronics and thin films. In all cases, comparisons are made with the corresponding state-of-the-art in the United States.

  2. Requirements and principles for the implementation and construction of large-scale geographic information systems

    NASA Technical Reports Server (NTRS)

    Smith, Terence R.; Menon, Sudhakar; Star, Jeffrey L.; Estes, John E.

    1987-01-01

    This paper provides a brief survey of the history, structure and functions of 'traditional' geographic information systems (GIS), and then suggests a set of requirements that large-scale GIS should satisfy, together with a set of principles for their satisfaction. These principles, which include the systematic application of techniques from several subfields of computer science to the design and implementation of GIS and the integration of techniques from computer vision and image processing into standard GIS technology, are discussed in some detail. In particular, the paper provides a detailed discussion of questions relating to appropriate data models, data structures and computational procedures for the efficient storage, retrieval and analysis of spatially-indexed data.

  3. Cost/Effort Drivers and Decision Analysis

    NASA Technical Reports Server (NTRS)

    Seidel, Jonathan

    2010-01-01

    Engineering trade study analyses demand consideration of performance, cost, and schedule impacts across the spectrum of alternative concepts and in direct reference to product requirements. Prior to detailed design, requirements are too often ill-defined (only goals) and prone to creep, extending well beyond the Systems Requirements Review. Though the lack of engineering design and definitive requirements inhibits the ability to perform detailed cost analyses, affordability trades still comprise the foundation of these future product decisions and must evolve in concert. This presentation excerpts results of the recent NASA subsonic Engine Concept Study for an Advanced Single Aisle Transport to demonstrate an affordability evaluation of performance characteristics and the subsequent impacts on engine architecture decisions. Applying the Process Based Economic Analysis Tool (PBEAT), development cost, production cost, and operation and support costs were considered in a traditional weighted ranking of the following system-level figures of merit: mission fuel burn, take-off noise, NOx emissions, and cruise speed. Weighting factors were varied to ascertain the sensitivity of the architecture ranking to these performance figures of merit with companion cost considerations. A more detailed examination of supersonic variable cycle engine cost is also briefly presented, with observations and recommendations for further refinements.
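
    The weighted ranking itself is simple to sketch: normalized scores on the four figures of merit are combined with a weight vector, and the ranking is recomputed under varied weights to expose sensitivities. Architecture names, scores, and weights below are illustrative assumptions, not study results.

```python
# Sketch of a weighted ranking over figures of merit under varied weights.
import numpy as np

# Rows: candidate architectures; columns: fuel burn, take-off noise, NOx,
# cruise speed (already normalized so that higher is better). Illustrative.
scores = np.array([[0.9, 0.6, 0.7, 0.5],
                   [0.7, 0.8, 0.6, 0.8],
                   [0.6, 0.7, 0.9, 0.7]])
names = ["concept A", "concept B", "concept C"]

for weights in ([0.4, 0.2, 0.2, 0.2],    # fuel-burn-heavy weighting
                [0.2, 0.4, 0.2, 0.2]):   # noise-heavy weighting
    merit = scores @ np.array(weights)
    order = np.argsort(merit)[::-1]      # best first
    print(weights, "->", [names[i] for i in order])
```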

  4. Mask industry assessment trend analysis: 2010

    NASA Astrophysics Data System (ADS)

    Hughes, Greg; Yun, Henry

    2010-05-01

    Microelectronics industry leaders consistently cite the cost and cycle time of mask technology and mask supply as top critical issues. A survey was designed with input from semiconductor company mask technologists and merchant mask suppliers and support from SEMATECH to gather information about the mask industry as an objective assessment of its overall condition. This year's assessment was the eighth in the current series of annual reports. Its data were presented in detail at BACUS, and the detailed trend analysis is presented at EMLC. With continued industry support, the report can be used as a baseline to gain perspective on the technical and business status of the mask and microelectronics industries. The report will continue to serve as a valuable reference to identify the strengths and opportunities of the mask industry. Its results will be used to guide future investments on critical path issues. This year's survey is basically the same as the surveys in 2005 through 2009. Questions are grouped into six categories: General Business Profile Information, Data Processing, Yields and Yield Loss Mechanisms, Delivery Times, Returns, and Services. Within each category is a multitude of questions that creates a detailed profile of both the business and technical status of the critical mask industry.

  5. The 2002 to 2010 mask survey trend analysis

    NASA Astrophysics Data System (ADS)

    Hughes, Greg; Chan, David

    2011-03-01

    Microelectronics industry leaders consistently cite the cost and cycle time of mask technology and mask supply as top critical issues. A survey was designed with input from semiconductor company mask technologists and merchant mask suppliers and support from SEMATECH to gather information about the mask industry as an objective assessment of its overall condition. This year's assessment was the ninth in the current series of annual reports. Its data were presented in detail at BACUS, and the detailed trend analysis is presented at EMLC. With continued industry support, the report can be used as a baseline to gain perspective on the technical and business status of the mask and microelectronics industries. The report will continue to serve as a valuable reference to identify the strengths and opportunities of the mask industry. Results will be used to guide future investments in critical path issues. This year's survey is basically the same as the 2005 through 2010 surveys. Questions are grouped into six categories: General Business Profile Information, Data Processing, Yields and Yield Loss Mechanisms, Delivery Times, Returns, and Services. Within each category are multiple questions that ultimately create a detailed profile of both the business and technical status of the critical mask industry.

  6. Failure mode and effects analysis: A community practice perspective.

    PubMed

    Schuller, Bradley W; Burns, Angi; Ceilley, Elizabeth A; King, Alan; LeTourneau, Joan; Markovic, Alexander; Sterkel, Lynda; Taplin, Brigid; Wanner, Jennifer; Albert, Jeffrey M

    2017-11-01

    To report our early experiences with failure mode and effects analysis (FMEA) in a community practice setting. The FMEA facilitator received extensive training at the AAPM Summer School. Early efforts focused on department education and emphasized the need for process evaluation in the context of high profile radiation therapy accidents. A multidisciplinary team was assembled with representation from each of the major department disciplines. Stereotactic radiosurgery (SRS) was identified as the most appropriate treatment technique for the first FMEA evaluation, as it is largely self-contained and has the potential to produce high impact failure modes. Process mapping was completed using breakout sessions, and then compiled into a simple electronic format. Weekly sessions were used to complete the FMEA evaluation. Risk priority number (RPN) values > 100 or severity scores of 9 or 10 were considered high risk. The overall time commitment was also tracked. The final SRS process map contained 15 major process steps and 183 subprocess steps. Splitting the process map into individual assignments was a successful strategy for our group. The process map was designed to contain enough detail such that another radiation oncology team would be able to perform our procedures. Continuous facilitator involvement helped maintain consistent scoring during FMEA. Practice changes were made responding to the highest RPN scores, and new resulting RPN scores were below our high-risk threshold. The estimated person-hour equivalent for project completion was 258 hr. This report provides important details on the initial steps we took to complete our first FMEA, providing guidance for community practices seeking to incorporate this process into their quality assurance (QA) program. Determining the feasibility of implementing complex QA processes into different practice settings will take on increasing significance as the field of radiation oncology transitions into the new TG-100 QA paradigm. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  7. Comparative Proteomic Analysis of Light-Induced Mycelial Brown Film Formation in Lentinula edodes.

    PubMed

    Tang, Li Hua; Tan, Qi; Bao, Da Peng; Zhang, Xue Hong; Jian, Hua Hua; Li, Yan; Yang, Rui Heng; Wang, Ying

    2016-01-01

    Light-induced brown film (BF) formation by the vegetative mycelium of Lentinula edodes is important for ensuring the quantity and quality of this edible mushroom. Nevertheless, the molecular mechanism underlying this phenotype is still unclear. In this study, a comparative proteomic analysis of mycelial BF formation in L. edodes was performed. Seventy-three protein spots with at least a twofold difference in abundance on two-dimensional electrophoresis (2DE) maps were observed, and 52 of them were successfully identified by matrix-assisted laser desorption/ionization tandem time-of-flight mass spectrometry (MALDI-TOF/TOF/MS). These proteins were classified into the following functional categories: small molecule metabolic processes (39%), response to oxidative stress (5%), and organic substance catabolic processes (5%), followed by oxidation-reduction processes (3%), single-organism catabolic processes (3%), positive regulation of protein complex assembly (3%), and protein metabolic processes (3%). Interestingly, four of the proteins that were upregulated in response to light exposure were nucleoside diphosphate kinases. To our knowledge, this is the first proteomic analysis of the mechanism of BF formation in L. edodes. Our data will provide a foundation for future detailed investigations of the proteins linked to BF formation.

  8. Applications of mass spectrometry techniques to autoclave curing of materials

    NASA Technical Reports Server (NTRS)

    Smith, A. C.

    1983-01-01

    Mass spectrometer analysis of gases evolved from polymer materials during a cure cycle can provide a wealth of information useful for studying cure properties and procedures. In this paper, data are presented for two materials to support the feasibility of using mass spectrometer gas analysis techniques to enhance knowledge of autoclave curing of composite materials and provide additional information for process control evaluation. It is expected that this technique will also be useful in working out the details involved in determining the proper cure cycle for new or experimental materials.

  9. Analysis of XMM-Newton Data from Extended Sources and the Diffuse X-Ray Background

    NASA Technical Reports Server (NTRS)

    Snowden, Steven

    2011-01-01

    Reduction of X-ray data from extended objects and the diffuse background is a complicated process that requires attention to the details of the instrumental response as well as an understanding of the multiple background components. We present methods and software that we have developed to reduce data from XMM-Newton EPIC imaging observations for both the MOS and PN instruments. The software has now been included in the Science Analysis System (SAS) package available through the XMM-Newton Science Operations Center (SOC).

  10. Statistical description of tectonic motions

    NASA Technical Reports Server (NTRS)

    Agnew, Duncan Carr

    1991-01-01

    The behavior of stochastic processes whose power spectra follow power laws was studied. The details of the analysis and the conclusions that were reached are presented. This analysis was extended to compare the detection capabilities of different measurement techniques (e.g., gravimetry and GPS for the vertical, and seismometers and GPS for the horizontal), both in general and for the specific case of the deformations produced by a dislocation in a half-space (which applies to seismic or preseismic sources). The time-domain behavior of power-law noises is also investigated.
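
    A common way to synthesize such power-law processes, sketched below, is to shape the spectrum of white noise in the Fourier domain so that the power spectrum falls off as f^-alpha. The exponent and series length are assumed for illustration.

```python
# Generate a power-law ("1/f^alpha") noise series by spectral shaping.
import numpy as np

def power_law_noise(n, alpha, rng=np.random.default_rng(0)):
    white = rng.standard_normal(n)
    spec = np.fft.rfft(white)
    f = np.fft.rfftfreq(n, d=1.0)
    f[0] = f[1]                        # avoid division by zero at DC
    spec *= f ** (-alpha / 2.0)        # power spectrum then scales as f^-alpha
    return np.fft.irfft(spec, n)

x = power_law_noise(4096, alpha=2.0)   # alpha = 2: random-walk-like noise
print(x.std())
```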

  11. [The present study situation and application prospect of nail analysis for abused drugs].

    PubMed

    Chen, Hang; Xiang, Ping; Shen, Min

    2010-10-01

    In forensic toxicology, different types of biological samples have their own characteristics and scopes of application. In this article, the physiological structure of nails, methods for collecting and pre-processing nail samples, and methods for analyzing poisons and drugs in nails are reviewed in detail. Factors that influence the analysis of abused drugs in nails are also introduced. Based on these research results, prospects for further applications are outlined. Nails, an unconventional biological sample not yet in general use, show great potential and advantages in forensic toxicology.

  12. Space shuttle main engine numerical modeling code modifications and analysis

    NASA Technical Reports Server (NTRS)

    Ziebarth, John P.

    1988-01-01

    The user of computational fluid dynamics (CFD) codes must be concerned with the accuracy and efficiency of the codes if they are to be used for timely design and analysis of complicated three-dimensional fluid flow configurations. A brief discussion of how accuracy and efficiency affect the CFD solution process is given. A more detailed discussion of how efficiency can be enhanced by using a few Cray Research Inc. utilities to address vectorization is presented, and these utilities are applied to a three-dimensional Navier-Stokes CFD code (INS3D).

  13. Physics of solar activity

    NASA Technical Reports Server (NTRS)

    Sturrock, Peter A.

    1993-01-01

    The aim of the research activity was to increase our understanding of solar activity through data analysis, theoretical analysis, and computer modeling. Because the research subjects were diverse and many researchers were supported by this grant, a select few key areas of research are described in detail. Areas of research include: (1) energy storage and force-free magnetic field; (2) energy release and particle acceleration; (3) radiation by nonthermal electrons; (4) coronal loops; (5) flare classification; (6) longitude distributions of flares; (7) periodicities detected in the solar activity; (8) coronal heating and related problems; and (9) plasma processes.

  14. Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis.

    PubMed

    Bonham-Carter, Oliver; Steele, Joe; Bastola, Dhundy

    2014-11-01

    Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require an analysis by comparison for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools, which stem from alignment-free methods based on statistical analysis from word frequencies. We provide several clear examples to demonstrate applications and the interpretations over several different areas of alignment-free analysis such as base-base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic metric. Additionally, we provide detailed discussion and an example of analysis by Lempel-Ziv techniques from data compression. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
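
    The simplest word-frequency comparison the review covers can be sketched in a few lines: count all k-mers in each sequence and compute the D2 statistic, the inner product of the two count vectors. The sequences and the choice k = 3 are illustrative.

```python
# Alignment-free comparison by k-mer counts and the D2 statistic.
from collections import Counter

def kmer_counts(seq, k=3):
    # Count every overlapping word of length k in the sequence.
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2(seq_a, seq_b, k=3):
    # D2 = inner product of the two k-mer count vectors.
    a, b = kmer_counts(seq_a, k), kmer_counts(seq_b, k)
    return sum(a[w] * b[w] for w in a.keys() & b.keys())

s1 = "ATGCGATACGCTTGCATGCGA"   # illustrative toy sequences
s2 = "ATGCGATTCGCTAGCATGCGA"
print("D2 =", d2(s1, s2))
```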

  15. High Fidelity System Simulation of Multiple Components in Support of the UEET Program

    NASA Technical Reports Server (NTRS)

    Plybon, Ronald C.; VanDeWall, Allan; Sampath, Rajiv; Balasubramaniam, Mahadevan; Mallina, Ramakrishna; Irani, Rohinton

    2006-01-01

    The High Fidelity System Simulation effort has addressed various important objectives to enable additional capability within the NPSS framework. The scope emphasized the High Pressure Turbine and High Pressure Compressor components. Initial effort was directed at developing and validating an intermediate-fidelity NPSS model using PD geometry, and was extended to a high-fidelity NPSS model by overlaying detailed geometry to validate CFD against rig data. Both "feed-forward" and "feedback" approaches to analysis zooming were employed to enable system simulation capability in NPSS. These approaches have different benefits and applicability: "feedback" zooming allows information flowed up from the high-fidelity analysis to update the NPSS model results by forcing the NPSS solver to converge to the high-fidelity analysis predictions. This approach is effective in improving the accuracy of the NPSS model; however, it can only be used in circumstances where there is a clear physics-based strategy for flowing the high-fidelity analysis results up to update the NPSS system model. The "feed-forward" zooming approach is more broadly useful in enabling detailed analysis at early stages of design for a specified set of critical operating points and using these analysis results to drive design decisions early in the development process.

  16. International Space Station Alpha (ISSA) Integrated Traffic Model

    NASA Technical Reports Server (NTRS)

    Gates, Robert E.

    1994-01-01

    The paper discusses the development process of the International Space Station Alpha (ISSA) Integrated Traffic Model which is a subsystem analyses tool utilized in the ISSA design analysis cycles. Fast-track prototyping of the detailed relationships between daily crew and station consumables, propellant needs, maintenance requirements, and crew rotation via spread sheets provides adequate bench marks to assess cargo vehicle design and performance characteristics.

  17. Integrative Lifecourse and Genetic Analysis of Military Working Dogs

    DTIC Science & Technology

    2015-12-01

    done as the samples are collected in order to avoid experimental variability and batch effects. Detailed description and discussion of this task... associated loss of power to detect all associations but those of large effect sizes) and latent variables (e.g., population structure – addressed in... processes associated with tissue development and maintenance are thus grouped with external environmental effects. This in turn suggests how those

  18. Microlensing observations rapid search for exoplanets: MORSE code for GPUs

    NASA Astrophysics Data System (ADS)

    McDougall, Alistair; Albrow, Michael D.

    2016-02-01

    The rapid analysis of ongoing gravitational microlensing events has been integral to the successful detection and characterization of cool planets orbiting low-mass stars in the Galaxy. In this paper, we present an implementation of search and fit techniques on graphical processing unit (GPU) hardware. The method allows for the rapid identification of candidate planetary microlensing events and their subsequent follow-up for detailed characterization.

  19. Investigating Change in Young People's Understandings of Japan: A Study of Learning about a Distant Place

    ERIC Educational Resources Information Center

    Taylor, Liz

    2011-01-01

    This article demonstrates how a set of complementary qualitative methods can be used to construct a detailed picture not only of the nature of young people's representations of a distant place but the processes of learning by which such representations develop over the medium term. The analysis is based on an interpretive case study of a class of…

  20. Accurate 3d Scanning of Damaged Ancient Greek Inscriptions for Revealing Weathered Letters

    NASA Astrophysics Data System (ADS)

    Papadaki, A. I.; Agrafiotis, P.; Georgopoulos, A.; Prignitz, S.

    2015-02-01

    In this paper, two non-invasive, non-destructive alternatives to the traditional and invasive technique of squeezes are presented, alongside specially developed processing methods, aiming to help epigraphists reveal and analyse weathered letters in ancient Greek inscriptions carved in masonry or marble. The resulting 3D model serves as a detailed basis for epigraphists attempting to decipher the inscription. The data were collected using a structured light scanner. The creation of the final accurate three-dimensional model is a complicated procedure requiring considerable computational cost and human effort. It includes the collection of geometric data in limited space and time, the creation of the surface, noise filtering, and the merging of individual surfaces. The use of structured light scanners is time consuming and requires costly hardware and software. Therefore, an alternative methodology for collecting 3D data of the inscriptions was also implemented for comparison. Image sequences from varying distances were collected using a calibrated DSLR camera, aiming to reconstruct the 3D scene through SfM techniques in order to evaluate the efficiency and the level of precision and detail of the reconstructed inscriptions. Problems in the acquisition process, as well as difficulties in the alignment step and mesh optimization, are also discussed. A meta-processing framework is proposed and analysed. Finally, the results of processing and analysis and the different 3D models are critically inspected and evaluated by a specialist in terms of accuracy, quality and detail of the model, and the capability of revealing damaged and "hidden" letters.

  1. Global processing takes time: A meta-analysis on local-global visual processing in ASD.

    PubMed

    Van der Hallen, Ruth; Evers, Kris; Brewaeys, Katrien; Van den Noortgate, Wim; Wagemans, Johan

    2015-05-01

    What does an individual with autism spectrum disorder (ASD) perceive first: the forest or the trees? In spite of 30 years of research and influential theories like the weak central coherence (WCC) theory and the enhanced perceptual functioning (EPF) account, the interplay of local and global visual processing in ASD remains only partly understood. Research findings vary in indicating a local processing bias or a global processing deficit, and often contradict each other. We have applied a formal meta-analytic approach and combined 56 articles that tested about 1,000 ASD participants and used a wide range of stimuli and tasks to investigate local and global visual processing in ASD. Overall, results show neither enhanced local visual processing nor a deficit in global visual processing. Detailed analysis reveals a difference in the temporal pattern of the local-global balance, that is, slow global processing in individuals with ASD. Whereas task-dependent interaction effects are obtained, the gender, age, and IQ of the participant groups seem to have no direct influence on performance. Based on this overview of the literature, suggestions are made for future research. (c) 2015 APA, all rights reserved.
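
    For readers unfamiliar with the machinery, the sketch below shows a generic inverse-variance random-effects pooling (the DerSimonian-Laird estimator) in Python. It is a minimal stand-in that assumes per-study effect sizes and variances are available; it is not the specific multilevel model used in this meta-analysis.

    ```python
    import numpy as np

    def dersimonian_laird(effects, variances):
        """Random-effects pooled effect size via the DerSimonian-Laird estimator."""
        y, v = np.asarray(effects, float), np.asarray(variances, float)
        w = 1.0 / v                                   # fixed-effect (inverse-variance) weights
        theta_fe = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - theta_fe) ** 2)           # Cochran's Q heterogeneity statistic
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance estimate
        w_re = 1.0 / (v + tau2)                       # random-effects weights
        theta_re = np.sum(w_re * y) / np.sum(w_re)
        se = np.sqrt(1.0 / np.sum(w_re))
        return theta_re, se, tau2
    ```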

  2. Medium-throughput processing of whole mount in situ hybridisation experiments into gene expression domains.

    PubMed

    Crombach, Anton; Cicin-Sain, Damjan; Wotton, Karl R; Jaeger, Johannes

    2012-01-01

    Understanding the function and evolution of developmental regulatory networks requires the characterisation and quantification of spatio-temporal gene expression patterns across a range of systems and species. However, most high-throughput methods to measure the dynamics of gene expression do not preserve the detailed spatial information needed in this context. For this reason, quantification methods based on image bioinformatics have become increasingly important over the past few years. Most available approaches in this field either focus on the detailed and accurate quantification of a small set of gene expression patterns, or attempt high-throughput analysis of spatial expression through binary pattern extraction and large-scale analysis of the resulting datasets. Here we present a robust, "medium-throughput" pipeline to process in situ hybridisation patterns from embryos of different species of flies. It bridges the gap between high-resolution and high-throughput image processing methods, enabling us to quantify graded expression patterns along the antero-posterior axis of the embryo in an efficient and straightforward manner. Our method is based on a robust enzymatic (colorimetric) in situ hybridisation protocol and rapid data acquisition through wide-field microscopy. Data processing consists of image segmentation, profile extraction, and determination of expression domain boundary positions using a spline approximation. It results in sets of measured boundaries sorted by gene and developmental time point, which are analysed in terms of expression variability or spatio-temporal dynamics. Our method yields integrated time series of spatial gene expression, which can be used to reverse-engineer developmental gene regulatory networks across species. It is easily adaptable to other processes and species, enabling the in silico reconstitution of gene regulatory networks in a wide range of developmental contexts.
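
    As a minimal sketch of the boundary-extraction step, the Python fragment below fits a smoothing spline to an antero-posterior intensity profile and reads off domain boundaries as threshold crossings. The half-maximum criterion, smoothing factor, and synthetic data are illustrative assumptions, not the pipeline's exact definitions.

    ```python
    import numpy as np
    from scipy.interpolate import UnivariateSpline

    def boundary_positions(x, intensity, level=0.5):
        """Locate expression-domain boundaries along the A-P axis as positions
        where a smoothed profile crosses a fraction of its maximum."""
        spline = UnivariateSpline(x, intensity, s=len(x))   # smoothing spline approximation
        xf = np.linspace(x.min(), x.max(), 2000)
        prof = spline(xf)
        thresh = level * prof.max()
        above = prof >= thresh
        crossings = np.nonzero(above[:-1] != above[1:])[0]  # sign changes around the threshold
        return xf[crossings]                                # boundary positions (% egg length)

    # example: a noisy anterior expression domain
    x = np.linspace(0, 100, 200)                            # % of the A-P axis
    profile = np.exp(-((x - 35) / 12) ** 2) + 0.05 * np.random.randn(x.size)
    print(boundary_positions(x, profile))
    ```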

  3. An economic toolkit for identifying the cost of emergency medical services (EMS) systems: detailed methodology of the EMS Cost Analysis Project (EMSCAP).

    PubMed

    Lerner, E Brooke; Garrison, Herbert G; Nichol, Graham; Maio, Ronald F; Lookman, Hunaid A; Sheahan, William D; Franz, Timothy R; Austad, James D; Ginster, Aaron M; Spaite, Daniel W

    2012-02-01

    Calculating the cost of an emergency medical services (EMS) system using a standardized method is important for determining the value of EMS. This article describes the development of a methodology for calculating the cost of an EMS system to its community. This includes a tool for calculating the cost of EMS (the "cost workbook") and detailed directions for determining cost (the "cost guide"). The 12-step process that was developed is consistent with current theories of health economics, applicable to prehospital care, flexible enough to be used in varying sizes and types of EMS systems, and comprehensive enough to provide meaningful conclusions. It was developed by an expert panel (the EMS Cost Analysis Project [EMSCAP] investigator team) in an iterative process that included pilot testing the process in three diverse communities. The iterative process allowed ongoing modification of the toolkit during the development phase, based upon direct, practical, ongoing interaction with the EMS systems that were using the toolkit. The resulting methodology estimates EMS system costs within a user-defined community, allowing either the number of patients treated or the estimated number of lives saved by EMS to be assessed in light of the cost of those efforts. Much controversy exists about the cost of EMS and whether the resources spent for this purpose are justified. However, the existence of a validated toolkit that provides a standardized process will allow meaningful assessments and comparisons to be made and will supply objective information to inform EMS and community officials who are tasked with determining the utilization of scarce societal resources. © 2012 by the Society for Academic Emergency Medicine.

  4. External details revisited - A new taxonomy for coding 'non-episodic' content during autobiographical memory retrieval.

    PubMed

    Strikwerda-Brown, Cherie; Mothakunnel, Annu; Hodges, John R; Piguet, Olivier; Irish, Muireann

    2018-04-24

    Autobiographical memory (ABM) is typically held to comprise episodic and semantic elements, with the vast majority of studies to date focusing on profiles of episodic details in health and disease. In this context, 'non-episodic' elements are often considered to reflect semantic processing or are discounted from analyses entirely. Mounting evidence suggests that rather than reflecting one unitary entity, semantic autobiographical information may contain discrete subcomponents, which vary in their relative degree of semantic or episodic content. This study aimed to (1) review the existing literature to formally characterize the variability in analysis of 'non-episodic' content (i.e., external details) on the Autobiographical Interview and (2) use these findings to create a theoretically grounded framework for coding external details. Our review exposed discrepancies in the reporting and interpretation of external details across studies, reinforcing the need for a new, consistent approach. We validated our new external details scoring protocol (the 'NExt' taxonomy) in patients with Alzheimer's disease (n = 18) and semantic dementia (n = 13), and in 20 healthy older control participants, and compared profiles of the NExt subcategories across groups and time periods. Our results revealed increased sensitivity of the NExt taxonomy in discriminating between the ABM profiles of patient groups, compared with the traditionally used internal and external detail metrics. Further, remote and recent autobiographical memories displayed distinct compositions of the NExt detail types. This study is the first to provide a fine-grained and comprehensive taxonomy to parse external details into intuitive subcategories and to validate this protocol in neurodegenerative disorders. © 2018 The British Psychological Society.

  5. The application of quality risk management to the bacterial endotoxins test: use of hazard analysis and critical control points.

    PubMed

    Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti

    2013-01-01

    Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can also be integrated with other tools, such as the fishbone diagram and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, propose how to conduct the necessary steps, and provide data templates necessary to document the process and useful for following current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand choices that have been made. In summary, this article shows how hazard analysis and critical control points can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.

  6. Description of data on the Nimbus 7 LIMS map archive tape: Water vapor and nitrogen dioxide

    NASA Technical Reports Server (NTRS)

    Haggard, Kenneth V.; Marshall, B. T.; Kurzeja, Robert J.; Remsberg, Ellis E.; Russell, James M., III

    1988-01-01

    Described is the process by which the Limb Infrared Monitor of the Stratosphere (LIMS) experiment data were analyzed to produce estimates of synoptic maps of water vapor and nitrogen dioxide. In addition to a detailed description of the analysis procedure, several interesting features in the data are discussed; these are used to demonstrate how the analysis procedure produced the final maps and how one can estimate the uncertainties in the maps. Features of the analysis that would influence how one might use or interpret the results are also noted. These include subjects such as smoothing and the interpretation of wave components.

  7. Organellar proteomics reveals hundreds of novel nuclear proteins in the malaria parasite Plasmodium falciparum

    PubMed Central

    2012-01-01

    Background: The post-genomic era of malaria research provided unprecedented insights into the biology of Plasmodium parasites. Due to the large evolutionary distance to model eukaryotes, however, we lack a profound understanding of many processes in Plasmodium biology. One example is the cell nucleus, which controls the parasite genome in a development- and cell cycle-specific manner through mostly unknown mechanisms. To study this important organelle in detail, we conducted an integrative analysis of the P. falciparum nuclear proteome. Results: We combined high accuracy mass spectrometry and bioinformatic approaches to present for the first time an experimentally determined core nuclear proteome for P. falciparum. Besides a large number of factors implicated in known nuclear processes, one-third of all detected proteins carry no functional annotation, including many phylum- or genus-specific factors. Importantly, extensive experimental validation using 30 transgenic cell lines confirmed the high specificity of this inventory, and revealed distinct nuclear localization patterns of hitherto uncharacterized proteins. Further, our detailed analysis identified novel protein domains potentially implicated in gene transcription pathways, and sheds important new light on nuclear compartments and processes including regulatory complexes, the nucleolus, nuclear pores, and nuclear import pathways. Conclusion: Our study provides comprehensive new insight into the biology of the Plasmodium nucleus and will serve as an important platform for dissecting general and parasite-specific nuclear processes in malaria parasites. Moreover, as the first nuclear proteome characterized in any protist organism, it will provide an important resource for studying evolutionary aspects of nuclear biology. PMID:23181666

  8. Software for storage and processing coded messages for the international exchange of meteorological information

    NASA Astrophysics Data System (ADS)

    Popov, V. N.; Botygin, I. A.; Kolochev, A. S.

    2017-01-01

    The approach represents data from the international codes for the exchange of meteorological information using a metadescription formalism associated with certain categories of resources. The metadata components were developed from an analysis of surface meteorological observations, vertical atmospheric soundings, upper-air wind soundings, weather radar observations, satellite observations and other data. A common set of metadata components was formed, comprising classes, divisions and groups, for a generalized description of the meteorological data. The structure and content of the main components of the generalized metadescription are presented in detail using the example of meteorological observations from land and sea stations. The functional structure of a distributed computing system is described; it organizes the storage of large volumes of meteorological data for further processing in problems of analysis and forecasting of climatic processes.
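
    A hypothetical fragment of such a class/division/group metadescription, sketched in Python below. The hierarchy levels follow the abstract, but every name and field is invented for illustration, since the paper does not publish its schema.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class MetaGroup:
        """Lowest level of the metadescription: a group of related elements."""
        name: str
        elements: list = field(default_factory=list)

    @dataclass
    class MetaDivision:
        """Middle level: a division collecting several groups."""
        name: str
        groups: list = field(default_factory=list)

    @dataclass
    class MetaClass:
        """Top level: a class of meteorological resources."""
        name: str
        divisions: list = field(default_factory=list)

    # e.g. a surface observation from a land station (all names illustrative)
    surface = MetaClass("SurfaceObservation", [
        MetaDivision("Station", [MetaGroup("Identification",
                                           ["wmo_index", "latitude", "longitude"])]),
        MetaDivision("Observation", [MetaGroup("Basic",
                                               ["time", "pressure", "air_temperature"])]),
    ])
    ```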

  9. Analysis of Video-Based Microscopic Particle Trajectories Using Kalman Filtering

    PubMed Central

    Wu, Pei-Hsun; Agarwal, Ashutosh; Hess, Henry; Khargonekar, Pramod P.; Tseng, Yiider

    2010-01-01

    Abstract The fidelity of the trajectories obtained from video-based particle tracking determines the success of a variety of biophysical techniques, including in situ single cell particle tracking and in vitro motility assays. However, the image acquisition process is complicated by system noise, which causes positioning error in the trajectories derived from image analysis. Here, we explore the possibility of reducing the positioning error by the application of a Kalman filter, a powerful algorithm to estimate the state of a linear dynamic system from noisy measurements. We show that the optimal Kalman filter parameters can be determined in an appropriate experimental setting, and that the Kalman filter can markedly reduce the positioning error while retaining the intrinsic fluctuations of the dynamic process. We believe the Kalman filter can potentially serve as a powerful tool to infer a trajectory of ultra-high fidelity from noisy images, revealing the details of dynamic cellular processes. PMID:20550894
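
    A minimal sketch of the idea in Python, assuming a one-dimensional constant-velocity motion model with known noise levels. The paper determines the optimal filter parameters experimentally, which this fragment does not attempt; q and r below are placeholders.

    ```python
    import numpy as np

    def kalman_filter_positions(z, dt=1.0, q=1e-2, r=1.0):
        """Filter noisy 1-D particle positions with a constant-velocity Kalman filter.
        q: process-noise intensity, r: measurement-noise variance (assumed known)."""
        F = np.array([[1.0, dt], [0.0, 1.0]])    # state transition: position, velocity
        H = np.array([[1.0, 0.0]])               # we observe position only
        Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                          [dt**2 / 2, dt]])      # white-acceleration process noise
        R = np.array([[r]])
        x = np.array([z[0], 0.0])                # initial state from the first measurement
        P = np.eye(2)
        out = []
        for zk in z:
            x = F @ x                            # predict
            P = F @ P @ F.T + Q
            S = H @ P @ H.T + R                  # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
            x = x + K @ (np.atleast_1d(zk) - H @ x)
            P = (np.eye(2) - K @ H) @ P          # update
            out.append(x[0])
        return np.array(out)
    ```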

  10. Dynamic properties of ceramic materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grady, D.E.

    1995-02-01

    The present study offers new data and analysis on the transient shock strength and equation-of-state properties of ceramics. Various dynamic data on nine high strength ceramics are provided, with wave profile measurements, obtained through velocity interferometry techniques, as the principal observable. Compressive failure in the shock wave front, with emphasis on brittle versus ductile mechanisms of deformation, is examined in some detail. Extensive spall strength data are provided and related to the theoretical spall strength, and to energy-based theories of the spall process. Failure waves, as a mechanism of deformation in the transient shock process, are examined. Strength and equation-of-state analysis of shock data on silicon carbide, boron carbide, tungsten carbide, silicon dioxide and aluminum nitride is presented with particular emphasis on phase transition properties for the latter two. Wave profile measurements on selected ceramics are investigated for evidence of rate sensitive elastic precursor decay in the shock front failure process.

  11. Visualizing time-related data in biology, a review

    PubMed Central

    Secrier, Maria; Schneider, Reinhard

    2014-01-01

    Time is of the essence in biology as in so much else. For example, monitoring disease progression or the timing of developmental defects is important for the processes of drug discovery and therapy trials. Furthermore, an understanding of the basic dynamics of biological phenomena that are often strictly time regulated (e.g. circadian rhythms) is needed to make accurate inferences about the evolution of biological processes. Recent advances in technologies have enabled us to measure timing effects more accurately and in more detail. This has driven related advances in visualization and analysis tools that try to effectively exploit this data. Beyond timeline plots, notable attempts at more involved temporal interpretation have been made in recent years, but awareness of the available resources is still limited within the scientific community. Here, we review some advances in biological visualization of time-driven processes and consider how they aid data analysis and interpretation. PMID:23585583

  12. Beneficiation and leaching study of a multi-Au carrier and low grade refractory gold ore

    NASA Astrophysics Data System (ADS)

    Li, W. J.; Song, Y. S.; Chen, Y.; Cai, L. L.; Zhou, G. Y.

    2017-09-01

    A detailed mineralogy, beneficiation and leaching study of a multi-Au carrier, low grade refractory gold ore from a beneficiation plant in Henan Province, China, was carried out. Mineral liberation analysis, scanning electron microscopy, element phase analysis and other techniques based on a mineral liberation analyser were used for the mineralogical characterization of this ore. The present work describes an experimental study of the effect of traditional parameters (such as grinding fineness and reagent regimes), the middling processing method and the flowsheet construction on the total recovery and the assay of the flotation concentrate. Two-step flotation, with part of the middlings combined with the flotation tailing for the gold leaching process, resulted in a high gold grade (g.t-1) and gold recovery (%) for this refractory gold ore. This process opens the possibility of maximizing Au grade and recoveries in a multi-Au carrier, low grade refractory gold ore where low recoveries are common.

  13. OXSA: An open-source magnetic resonance spectroscopy analysis toolbox in MATLAB.

    PubMed

    Purvis, Lucian A B; Clarke, William T; Biasiolli, Luca; Valkovič, Ladislav; Robson, Matthew D; Rodgers, Christopher T

    2017-01-01

    In vivo magnetic resonance spectroscopy provides insight into metabolism in the human body. New acquisition protocols are often proposed to improve the quality or efficiency of data collection. Processing pipelines must also be developed to use these data optimally. Current fitting software is either targeted at general spectroscopy fitting, or for specific protocols. We therefore introduce the MATLAB-based OXford Spectroscopy Analysis (OXSA) toolbox to allow researchers to rapidly develop their own customised processing pipelines. The toolbox aims to simplify development by: being easy to install and use; seamlessly importing Siemens Digital Imaging and Communications in Medicine (DICOM) standard data; allowing visualisation of spectroscopy data; offering a robust fitting routine; flexibly specifying prior knowledge when fitting; and allowing batch processing of spectra. This article demonstrates how each of these criteria have been fulfilled, and gives technical details about the implementation in MATLAB. The code is freely available to download from https://github.com/oxsatoolbox/oxsa.

  14. Linear Stability of Binary Alloy Solidification for Unsteady Growth Rates

    NASA Technical Reports Server (NTRS)

    Mazuruk, K.; Volz, M. P.

    2010-01-01

    An extension of the Mullins and Sekerka (MS) linear stability analysis to the unsteady growth rate case is considered for dilute binary alloys. In particular, the stability of the planar interface during the initial solidification transient is studied in detail numerically. The rapid solidification case, when the system is traversing through the unstable region defined by the MS criterion, has also been treated. It has been observed that the onset of instability is quite accurately defined by the "quasi-stationary MS criterion", when the growth rate and other process parameters are taken as constants at a particular time of the growth process. A singular behavior of the governing equations for the perturbed quantities at the constitutional supercooling demarcation line has been observed. However, when the solidification process, during its transient, crosses this demarcation line, a planar interface is stable according to the linear analysis performed.
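
    For orientation, the classical constitutional-supercooling threshold that the quasi-stationary criterion generalizes can be written in a standard textbook form (sign conventions depend on the alloy; the full MS analysis adds capillarity and perturbation dynamics):

    ```latex
    % G: temperature gradient in the liquid at the interface, V: growth rate,
    % m: liquidus slope, c0: bulk solute concentration, k: partition coefficient,
    % D: solute diffusivity in the liquid.
    \[
      \frac{G}{V} \;\ge\; \frac{m\,c_{0}\,(k-1)}{k\,D}
      \qquad \text{(planar interface stable against constitutional supercooling)}
    \]
    ```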

  15. Prediction of multi performance characteristics of wire EDM process using grey ANFIS

    NASA Astrophysics Data System (ADS)

    Kumanan, Somasundaram; Nair, Anish

    2017-09-01

    Super alloys are used to fabricate components in ultra-supercritical power plants. These hard-to-machine materials are processed using non-traditional machining methods such as wire cut electrical discharge machining, and their machining behaviour needs attention. This paper details the multi-performance optimization of the wire EDM process using grey ANFIS. Experiments are designed to establish the performance characteristics of wire EDM such as surface roughness, material removal rate, wire wear rate and geometric tolerances. The control parameters are pulse on time, pulse off time, current, voltage, flushing pressure, wire tension, table feed and wire speed. Grey relational analysis is employed to optimise the multiple objectives. Analysis of variance of the grey grades is used to identify the critical parameters. A regression model is developed and used to generate datasets for training the proposed adaptive neuro-fuzzy inference system. The developed prediction model is tested for its prediction ability.
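
    As a rough sketch of the grey relational analysis step, the Python fragment below normalises several responses, computes grey relational coefficients with the usual distinguishing coefficient of 0.5, and averages them into a grade per experimental run. The example responses are invented, and the paper's exact normalisation and weighting choices may differ.

    ```python
    import numpy as np

    def grey_relational_grade(responses, larger_better, zeta=0.5):
        """Grey relational analysis: fold several responses into one grade per run.
        responses: (runs x criteria) matrix; larger_better: one bool per criterion."""
        X = np.asarray(responses, float)
        norm = np.empty_like(X)
        for j in range(X.shape[1]):                      # normalise each response to [0, 1]
            lo, hi = X[:, j].min(), X[:, j].max()
            norm[:, j] = (X[:, j] - lo) / (hi - lo) if larger_better[j] \
                         else (hi - X[:, j]) / (hi - lo)
        delta = 1.0 - norm                               # deviation from the ideal sequence
        xi = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
        return xi.mean(axis=1)                           # equal-weight grey relational grade

    # e.g. columns: MRR (larger better), Ra and wire wear (smaller better)
    grades = grey_relational_grade(
        [[2.1, 1.8, 0.12], [2.6, 2.2, 0.10], [1.9, 1.5, 0.15]],
        larger_better=[True, False, False])
    ```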

  16. Systems Biology-Driven Hypotheses Tested In Vivo: The Need to Advancing Molecular Imaging Tools.

    PubMed

    Verma, Garima; Palombo, Alessandro; Grigioni, Mauro; La Monaca, Morena; D'Avenio, Giuseppe

    2018-01-01

    Processing and interpretation of biological images may provide invaluable insights into complex, living systems because images capture the overall dynamics as a "whole." Therefore, "extraction" of key, quantitative morphological parameters could be, at least in principle, helpful in building a reliable systems biology approach to understanding living objects. Molecular imaging tools for systems biology models have attained widespread usage in modern experimental laboratories. Here, we provide an overview of advances in computational technology and the different instrumentation focused on molecular image processing and analysis. Quantitative data analysis through various open source software packages and algorithmic protocols provides a novel approach for modeling the experimental research program. We also highlight predictable future trends in methods for automatically analyzing biological data. Such tools will be very useful for understanding the detailed biological and mathematical relationships that underlie in silico systems biology models.

  17. Processing and Recall of Seductive Details in Scientific Text

    ERIC Educational Resources Information Center

    Lehman, Stephen; Schraw, Gregory; McCrudden, Matthew T.; Hartley, Kendall

    2007-01-01

    This study examined how seductive details affect on-line processing of a technical, scientific text. In Experiment 1, each sentence from the experimental text was rated for interest and importance. Participants rated seductive details as being more interesting but less important than main ideas. In Experiment 2, we examined the effect of seductive…

  18. Sensor image prediction techniques

    NASA Astrophysics Data System (ADS)

    Stenger, A. J.; Stone, W. R.; Berry, L.; Murray, T. J.

    1981-02-01

    The preparation of prediction imagery is a complex, costly, and time consuming process. Image prediction systems which produce a detailed replica of the image area require the extensive Defense Mapping Agency data base. The purpose of this study was to analyze the use of image predictions in order to determine whether a reduced set of more compact image features contains enough information to produce acceptable navigator performance. A job analysis of the navigator's mission tasks was performed. It showed that the cognitive and perceptual tasks he performs during navigation are identical to those performed for the targeting mission function. In addition, the results of the analysis of his performance when using a particular sensor can be extended to the analysis of these mission tasks using any sensor. An experimental approach was used to determine the relationship between navigator performance and the type and amount of information in the prediction image. A number of subjects were given image predictions containing varying levels of scene detail and different image features, and then asked to identify the predicted targets in corresponding dynamic flight sequences over scenes of cultural, terrain, and mixed (both cultural and terrain) content.

  19. Multiple electron processes of He and Ne by proton impact

    NASA Astrophysics Data System (ADS)

    Terekhin, Pavel Nikolaevich; Montenegro, Pablo; Quinto, Michele; Monti, Juan; Fojon, Omar; Rivarola, Roberto

    2016-05-01

    A detailed investigation of multiple electron processes (single and multiple ionization, single capture, transfer-ionization) of He and Ne is presented for proton impact at intermediate and high collision energies. Exclusive absolute cross sections for these processes have been obtained by calculating transition probabilities in the independent electron and independent event models as a function of impact parameter in the framework of the continuum distorted wave-eikonal initial state theory. A binomial analysis is employed to calculate exclusive probabilities. The comparison with available theoretical and experimental results shows that exclusive probabilities are needed for a reliable description of the experimental data. The developed approach can be used to obtain the input database for modeling multiple electron processes of charged particles passing through matter.
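
    The binomial analysis mentioned here has a compact standard form. In the independent-electron picture, with p(b) the single-electron transition probability at impact parameter b, the exclusive probabilities and cross sections read:

    ```latex
    % Exclusive probability of a transition for exactly k of N equivalent
    % target electrons, and the corresponding exclusive cross section:
    \[
      P_{k}^{N}(b) \;=\; \binom{N}{k}\, p(b)^{\,k}\,\bigl[1 - p(b)\bigr]^{\,N-k},
      \qquad
      \sigma_{k} \;=\; 2\pi \int_{0}^{\infty} P_{k}^{N}(b)\, b \,\mathrm{d}b .
    \]
    ```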

  20. Nonparametric Inference of Doubly Stochastic Poisson Process Data via the Kernel Method

    PubMed Central

    Zhang, Tingting; Kou, S. C.

    2010-01-01

    Doubly stochastic Poisson processes, also known as the Cox processes, frequently occur in various scientific fields. In this article, motivated primarily by analyzing Cox process data in biophysics, we propose a nonparametric kernel-based inference method. We conduct a detailed study, including an asymptotic analysis, of the proposed method, and provide guidelines for its practical use, introducing a fast and stable regression method for bandwidth selection. We apply our method to real photon arrival data from recent single-molecule biophysical experiments, investigating proteins' conformational dynamics. Our result shows that conformational fluctuation is widely present in protein systems, and that the fluctuation covers a broad range of time scales, highlighting the dynamic and complex nature of proteins' structure. PMID:21258615

  1. Nonparametric Inference of Doubly Stochastic Poisson Process Data via the Kernel Method.

    PubMed

    Zhang, Tingting; Kou, S C

    2010-01-01

    Doubly stochastic Poisson processes, also known as the Cox processes, frequently occur in various scientific fields. In this article, motivated primarily by analyzing Cox process data in biophysics, we propose a nonparametric kernel-based inference method. We conduct a detailed study, including an asymptotic analysis, of the proposed method, and provide guidelines for its practical use, introducing a fast and stable regression method for bandwidth selection. We apply our method to real photon arrival data from recent single-molecule biophysical experiments, investigating proteins' conformational dynamics. Our result shows that conformational fluctuation is widely present in protein systems, and that the fluctuation covers a broad range of time scales, highlighting the dynamic and complex nature of proteins' structure.
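
    A minimal sketch of the kernel idea in Python: smooth the event times with a Gaussian kernel to estimate the (random) intensity function. Edge corrections and the paper's regression-based bandwidth selector are omitted, and the photon-arrival data here are synthetic.

    ```python
    import numpy as np

    def kernel_intensity(event_times, grid, bandwidth):
        """Kernel estimate of a (doubly stochastic) Poisson process intensity:
        lambda_hat(t) = sum_i K_h(t - t_i) with a Gaussian kernel."""
        t = np.asarray(event_times)[None, :]          # shape (1, n_events)
        g = np.asarray(grid)[:, None]                 # shape (n_grid, 1)
        k = np.exp(-0.5 * ((g - t) / bandwidth) ** 2) / (bandwidth * np.sqrt(2 * np.pi))
        return k.sum(axis=1)                          # events per unit time along the grid

    # synthetic photon arrivals (s) -> estimated arrival intensity on a 1 ms grid
    arrivals = np.sort(np.random.exponential(0.01, 500).cumsum())
    grid = np.arange(0.0, arrivals[-1], 0.001)
    lam = kernel_intensity(arrivals, grid, bandwidth=0.02)
    ```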

  2. Metrology: Calibration and measurement processes guidelines

    NASA Technical Reports Server (NTRS)

    Castrup, Howard T.; Eicke, Woodward G.; Hayes, Jerry L.; Mark, Alexander; Martin, Robert E.; Taylor, James L.

    1994-01-01

    The guide is intended as a resource to aid engineers and systems contractors in the design, implementation, and operation of metrology, calibration, and measurement systems, and to assist NASA personnel in the uniform evaluation of such systems supplied or operated by contractors. Methodologies and techniques acceptable in fulfilling metrology quality requirements for NASA programs are outlined. The measurement process is covered from a high level through more detailed discussions of key elements within the process. Emphasis is given to the flowdown of project requirements to measurement system requirements, then through the activities that will provide measurements with defined quality. In addition, innovations and techniques for error analysis, development of statistical measurement process control, optimization of calibration recall systems, and evaluation of measurement uncertainty are presented.
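
    One of the evaluation techniques referred to, combining measurement uncertainty from independent error sources, has a standard form (the GUM root-sum-square combination for uncorrelated inputs); a sketch:

    ```latex
    % Combined standard uncertainty of y = f(x_1, ..., x_n) for uncorrelated
    % inputs, with u(x_i) the standard uncertainty of input x_i:
    \[
      u_{c}(y) \;=\; \sqrt{\sum_{i=1}^{n}
        \left( \frac{\partial f}{\partial x_{i}} \right)^{\!2} u^{2}(x_{i}) },
      \qquad
      U \;=\; k\,u_{c}(y)
    \]
    % where U is the expanded uncertainty for coverage factor k (commonly k = 2).
    ```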

  3. Framework for managing mycotoxin risks in the food industry.

    PubMed

    Baker, Robert C; Ford, Randall M; Helander, Mary E; Marecki, Janusz; Natarajan, Ramesh; Ray, Bonnie

    2014-12-01

    We propose a methodological framework for managing mycotoxin risks in the food processing industry. Mycotoxin contamination is a well-known threat to public health that has economic significance for the food processing industry; it is imperative to address mycotoxin risks holistically, at all points in the procurement, processing, and distribution pipeline, by tracking the relevant data, adopting best practices, and providing suitable adaptive controls. The proposed framework includes (i) an information and data repository, (ii) a collaborative infrastructure with analysis and simulation tools, (iii) standardized testing and acceptance sampling procedures, and (iv) processes that link the risk assessments and testing results to the sourcing, production, and product release steps. The implementation of suitable acceptance sampling protocols for mycotoxin testing is considered in some detail.
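
    As an illustration of the acceptance-sampling component, the Python sketch below computes the operating characteristic of a single sampling plan under a binomial model. The plan parameters (n = 60, c = 1) are invented for illustration and are not the framework's recommended values.

    ```python
    from scipy.stats import binom

    def oc_curve(n, c, p):
        """Operating characteristic of a single (n, c) acceptance-sampling plan:
        probability that a lot with defect rate p is accepted when at most c of
        n sampled units may fail the mycotoxin test."""
        return binom.cdf(c, n, p)

    # e.g. sample 60 units, accept the lot if at most 1 exceeds the limit
    for p in (0.01, 0.05, 0.10):
        print(f"defect rate {p:.0%}: P(accept) = {oc_curve(60, 1, p):.3f}")
    ```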

  4. Oak Ridge Environmental Information System (OREIS) functional system design document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Birchfield, T.E.; Brown, M.O.; Coleman, P.R.

    1994-03-01

    The OREIS Functional System Design document provides a detailed functional description of the Oak Ridge Environmental Information System (OREIS). It expands the system requirements defined in the OREIS Phase 1-System Definition Document (ES/ER/TM-34). Documentation of OREIS development is based on the Automated Data Processing System Development Methodology, a Martin Marietta Energy Systems, Inc., procedure written to assist in developing scientific and technical computer systems. This document focuses on the development of the functional design of the user interface, which includes the integration of commercial applications software. The data model and data dictionary are summarized briefly; however, the Data Management Plan for OREIS (ES/ER/TM-39), a companion document to the Functional System Design document, provides the complete data dictionary and detailed descriptions of the requirements for the data base structure. The OREIS system will provide the following functions, which are executed from a Menu Manager: (1) preferences, (2) view manager, (3) macro manager, (4) data analysis (assisted analysis and unassisted analysis), and (5) spatial analysis/map generation (assisted ARC/INFO and unassisted ARC/INFO). Additional functionality includes interprocess communications, which handle background operations of OREIS.

  5. Microbial genomics, transcriptomics and proteomics: new discoveries in decomposition research using complementary methods.

    PubMed

    Baldrian, Petr; López-Mondéjar, Rubén

    2014-02-01

    Molecular methods for the analysis of biomolecules have undergone rapid technological development in the last decade. The advent of next-generation sequencing methods and improvements in instrumental resolution enabled the analysis of complex transcriptome, proteome and metabolome data, as well as a detailed annotation of microbial genomes. The mechanisms of decomposition by model fungi have been described in unprecedented detail by the combination of genome sequencing, transcriptomics and proteomics. The increasing number of available genomes for fungi and bacteria shows that the genetic potential for decomposition of organic matter is widespread among taxonomically diverse microbial taxa, while expression studies document the importance of the regulation of expression in decomposition efficiency. Importantly, high-throughput methods of nucleic acid analysis used for the analysis of metagenomes and metatranscriptomes indicate the high diversity of decomposer communities in natural habitats and their taxonomic composition. Today, the metaproteomics of natural habitats is of interest. In combination with advanced analytical techniques to explore the products of decomposition and the accumulation of information on the genomes of environmentally relevant microorganisms, advanced methods in microbial ecophysiology should increase our understanding of the complex processes of organic matter transformation.

  6. Image analysis and mathematical modelling for the supervision of the dough fermentation process

    NASA Astrophysics Data System (ADS)

    Zettel, Viktoria; Paquet-Durand, Olivier; Hecker, Florian; Hitzmann, Bernd

    2016-10-01

    The fermentation (proof) process of dough is one of the quality-determining steps in the production of baked goods. Besides the fluffiness, whose foundations are laid during fermentation, the flavour of the final product is strongly influenced during this production stage. However, until now no on-line measurement system has been available to supervise this important process step. In this investigation the potential of an image analysis system that enables the determination of the volume of fermenting dough pieces is evaluated. The camera moves around the fermenting pieces and collects images of the objects from different angles (360° range). Using image analysis algorithms, the volume increase of individual dough pieces is determined. The fermentation process is supervised based on a detailed mathematical description of the volume increase, built on the Bernoulli equation, the carbon dioxide production rate of the yeast cells and the diffusion processes of carbon dioxide. Important process parameters, like the carbon dioxide production rate of the yeast cells and the dough viscosity, can be estimated after just 300 s of proofing. The mean percentage error for forecasting the further evolution of the relative volume of the dough pieces is just 2.3 %. Therefore, a forecast of the further evolution can be performed and used for fault detection.
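
    A toy sketch of the supervision idea in Python: fit a parametric relative-volume curve to observed volumes and forecast the remainder of the proof. The logistic form and all numbers below are placeholders; the paper's actual model is physical, built from the Bernoulli equation and the CO2 production and diffusion terms, and is not reproduced here.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def rel_volume(t, v_max, k, t_half):
        """Logistic stand-in for relative dough volume V(t)/V(0)."""
        return 1.0 + (v_max - 1.0) / (1.0 + np.exp(-k * (t - t_half)))

    # synthetic 'measurements' over a one-hour proof (time in seconds)
    t_obs = np.arange(0, 3600, 60)
    rng = np.random.default_rng(0)
    v_obs = rel_volume(t_obs, 2.8, 0.004, 1500) + 0.02 * rng.normal(size=t_obs.size)

    params, _ = curve_fit(rel_volume, t_obs, v_obs, p0=(3.0, 0.003, 1200))
    print("forecast V/V0 at 90 min:", rel_volume(5400.0, *params))
    ```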

  7. Wearable Networked Sensing for Human Mobility and Activity Analytics: A Systems Study.

    PubMed

    Dong, Bo; Biswas, Subir

    2012-01-01

    This paper presents implementation details, system characterization, and the performance of a wearable sensor network that was designed for human activity analysis. Specific machine learning mechanisms are implemented for recognizing a target set of activities with both out-of-body and on-body processing arrangements. Impacts of energy consumption by the on-body sensors are analyzed in terms of activity detection accuracy for out-of-body processing. Impacts of limited processing abilities in the on-body scenario are also characterized in terms of detection accuracy, by varying the background processing load in the sensor units. Through a rigorous systems study, it is shown that an efficient human activity analytics system can be designed and operated even under energy and processing constraints of tiny on-body wearable sensors.
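
    A compact sketch of the kind of processing involved, in Python: extract simple per-window statistics from tri-axial accelerometer data and train a classifier. The feature set, window length, classifier, and stand-in data are generic assumptions; the paper's specific machine learning mechanisms and energy accounting are not reproduced.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def window_features(acc, fs=20, win_s=2):
        """Per-window statistics from a tri-axial accelerometer stream
        (n_samples x 3): a typical lightweight feature set for on-body nodes."""
        w = fs * win_s
        n = len(acc) // w
        segs = acc[: n * w].reshape(n, w, 3)
        feats = [segs.mean(1), segs.std(1),
                 np.abs(np.diff(segs, axis=1)).mean(1)]   # mean, std, mean |jerk|
        return np.hstack(feats)                           # n_windows x 9

    # synthetic stand-in data: 0 = idle, 1 = walking
    rng = np.random.default_rng(0)
    idle = rng.normal(0.0, 0.05, (4000, 3))
    walk = rng.normal(0.0, 0.6, (4000, 3))
    X = np.vstack([window_features(idle), window_features(walk)])
    y = np.array([0] * 100 + [1] * 100)
    clf = RandomForestClassifier(n_estimators=50).fit(X, y)
    ```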

  8. The Morphology and Uniformity of Circumstellar OH/H2O Masers around OH/IR Stars

    NASA Astrophysics Data System (ADS)

    Felli, Derek Sean

    Even though low mass stars (< 8 solar masses) far outnumber their more massive counterparts (> 8 solar masses), the more massive stars drive the chemical evolution of galaxies from which the next generation of stars and planets can form. Understanding mass loss of asymptotic giant branch stars contributes to our understanding of the chemical evolution of the galaxy, stellar populations, and star formation history. Stars with mass below 8 solar masses end their lives as planetary nebulae, while stars with mass above 8 solar masses go supernova. In both cases, these stars enrich their environments with elements heavier than hydrogen and helium. While some general information about how stars die and form planetary nebulae is known, specific details are missing due to a lack of high-resolution observations and analysis of the intermediate stages. For example, we know that mass loss in stars creates morphologically diverse planetary nebulae, but we do not know the uniformity of these processes, and therefore lack detailed models to better predict how spherically symmetric stars form asymmetric nebulae. We have selected a specific group of late-stage stars and observed them at different scales to reveal the uniformity of mass loss through different layers close to the star. This includes observing nearby masers that trace the molecular shell structure around these stars. This study revealed detailed structure that was analyzed for uniformity to place constraints on how the mass loss processes behave in models. These results will feed into our ability to create more detailed models to better predict the chemical evolution of the next generation of stars and planets.

  9. Nuclear Fission Investigation with Twin Ionization Chamber

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeynalova, O.; Zeynalov, Sh.; Nazarenko, M.

    2011-11-29

    The purpose of the present paper is to report recent results obtained in the development of digital pulse-processing mathematics for prompt fission neutron (PFN) investigation using a twin ionization chamber (TIC) along with a fast-neutron time-of-flight detector (ND). Because of well-known ambiguities in the literature (see refs. [4, 6, 9 and 11]) concerning pulse induction on the TIC electrodes by FF ionization, we first present a detailed mathematical analysis of fission fragment (FF) signal formation on the TIC anode. The analysis was done using the Ramo-Shockley theorem, which relates the motion of a charged particle between the TIC electrodes to a so-called weighting potential. The weighting potential was calculated by direct numerical solution of the Laplace equation (neglecting space charge) for the TIC geometry and the ionization caused by the FF. Formulae for grid inefficiency (GI) correction and digital pulse-processing algorithms for PFN time-of-flight measurements and pulse-shape analysis are presented and discussed.
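
    The theorem invoked here has a compact standard statement (up to sign conventions, which are exactly what the cited ambiguities are about): for a charge q at position r(t) moving with velocity v(t) between the electrodes,

    ```latex
    % Ramo-Shockley theorem: induced current on, and induced charge transferred
    % to, a given sensing electrode.
    \[
      i(t) \;=\; q\,\mathbf{v}(t)\cdot\mathbf{E}_{w}\bigl(\mathbf{r}(t)\bigr),
      \qquad
      Q(t) \;=\; -\,q\,\bigl[\phi_{w}(\mathbf{r}(t)) - \phi_{w}(\mathbf{r}(0))\bigr],
    \]
    % where phi_w is the weighting potential (E_w = -grad phi_w), obtained by
    % setting the sensing electrode to unit potential and all others to ground.
    ```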

  10. Signal processing of aircraft flyover noise

    NASA Technical Reports Server (NTRS)

    Kelly, Jeffrey J.

    1991-01-01

    A detailed analysis of signal processing concerns for measuring aircraft flyover noise is presented. The development of a de-Dopplerization scheme for both corrected time history and spectral data is discussed, along with an analysis of motion effects on measured spectra. A computer code was written to implement the de-Dopplerization scheme. Input to the code is the aircraft position data and the pressure time histories. To facilitate ensemble averaging, a uniform level flyover is considered, but the code can accept more general flight profiles. The effects of spectral smearing and its removal are discussed. Using data acquired from an XV-15 tilt rotor flyover test, comparisons are made between the measured and corrected spectra. Frequency shifts are accurately accounted for by the method. It is shown that correcting for spherical spreading, Doppler amplitude, and frequency can give some idea of source directivity. The analysis indicated that smearing increases with frequency and is more severe on approach than on recession.
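
    The frequency correction at the heart of such a de-Dopplerization scheme takes the standard moving-source form (the paper's implementation also handles amplitude and spherical-spreading corrections, which are not shown here):

    ```latex
    % f_e: emitted frequency, f_o: observed frequency at the microphone,
    % M: source Mach number, theta: angle between the flight path and the
    % source-to-observer direction at emission time.
    \[
      f_{o} \;=\; \frac{f_{e}}{1 - M\cos\theta}
    \]
    % On approach (cos(theta) > 0) the spectrum shifts up; on recession it
    % shifts down. De-Dopplerization resamples the time history to undo this.
    ```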

  11. SOUTH ELEVATION AND DETAILS OF MAIN PROCESSING BUILDING (CPP601). INL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    SOUTH ELEVATION AND DETAILS OF MAIN PROCESSING BUILDING (CPP-601). INL DRAWING NUMBER 200-0601-00-291-103082. ALTERNATE ID NUMBER 542-12-B-76. - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  12. BUILDING DETAILS AND SECTIONS OF MAIN PROCESSING BUILDING (CPP601). INL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    BUILDING DETAILS AND SECTIONS OF MAIN PROCESSING BUILDING (CPP-601). INL DRAWING NUMBER 200-0601-00-291-103080. ALTERNATE ID NUMBER 542-11-B-74. - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  13. STRUCTURAL DETAILS AND SECTIONS OF MAIN PROCESSING BUILDING (CPP601). INL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    STRUCTURAL DETAILS AND SECTIONS OF MAIN PROCESSING BUILDING (CPP-601). INL DRAWING NUMBER 200-0601-00-291-103079. ALTERNATE ID NUMBER 542-11-B-73. - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  14. The Devil is in the Concepts: Lessons Learned from World War II Planning Staffs for Transitioning from Conceptual to Detailed Planning

    DTIC Science & Technology

    2017-05-25

    the planning process. Current US Army doctrine links conceptual planning to the Army Design Methodology and detailed planning to the Military...Decision Making Process. By associating conceptual and detailed planning with doctrinal methodologies, it is easy to regard the transition as a set period...plans into detailed directives resulting in changes to the operational environment.

  15. Process capability improvement through DMAIC for aluminum alloy wheel machining

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa; Babu, B. Surendra

    2017-07-01

    This paper first lists the generic problems of alloy wheel machining and subsequently details the process improvement of the identified critical-to-quality machining characteristic of the A356 aluminum alloy wheel machining process. The causal factors are traced using the Ishikawa diagram, and corrective actions are prioritized through process failure modes and effects analysis. Process monitoring charts are employed to improve the process capability index of the process at the industrial benchmark of the four sigma level, which corresponds to a capability value of 1.33. The procedure adopted for improving the process capability levels is the define-measure-analyze-improve-control (DMAIC) approach. By following the DMAIC approach, Cp, Cpk and Cpm improved from initial values of 0.66, -0.24 and 0.27 to final values of 4.19, 3.24 and 1.41, respectively.
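
    The indices quoted have standard definitions, sketched below in Python; the sample data and specification limits are synthetic and illustrative only.

    ```python
    import numpy as np

    def capability_indices(x, usl, lsl, target):
        """Process capability indices from sampled data: Cp (spread only),
        Cpk (spread + centering) and Cpm (spread + deviation from target)."""
        mu, sigma = np.mean(x), np.std(x, ddof=1)
        cp = (usl - lsl) / (6 * sigma)
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)
        cpm = (usl - lsl) / (6 * np.sqrt(sigma**2 + (mu - target)**2))
        return cp, cpk, cpm

    # a process is conventionally "four sigma" capable when Cp >= 1.33
    rng = np.random.default_rng(1)
    print(capability_indices(rng.normal(10.02, 0.01, 200),
                             usl=10.06, lsl=9.98, target=10.0))
    ```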

  16. An assembly process model based on object-oriented hierarchical time Petri Nets

    NASA Astrophysics Data System (ADS)

    Wang, Jiapeng; Liu, Shaoli; Liu, Jianhua; Du, Zenghui

    2017-04-01

    In order to improve the versatility, accuracy and integrity of assembly process models of complex products, an assembly process model based on object-oriented hierarchical time Petri Nets is presented. A complete assembly process information model is established, including assembly resources, assembly inspection, time, structure and flexible parts; this model describes the static and dynamic data involved in the assembly process. Through the analysis of three-dimensional assembly process information, the assembly information is divided hierarchically, from the whole, through the local, down to the details, and subnet models of the different levels of the object-oriented Petri Net are established. The communication problem between Petri subnets is solved by using a message database, which effectively reduces the complexity of system modeling. Finally, the modeling process is presented, and a five-layer Petri Net model is established based on the hoisting process of the engine compartment of a wheeled armored vehicle.
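
    A minimal place/transition sketch in Python of the kind of net such a model is built from (tokens, enabling, firing). The hierarchy, timing, and object-oriented subnet structure of the paper's model are omitted, and the assembly step named below is invented.

    ```python
    class PetriNet:
        """Bare-bones place/transition net: a marking plus firable transitions."""
        def __init__(self, marking):
            self.marking = dict(marking)            # place -> token count
            self.transitions = {}                   # name -> (inputs, outputs)

        def add_transition(self, name, inputs, outputs):
            self.transitions[name] = (inputs, outputs)

        def enabled(self, name):
            inputs, _ = self.transitions[name]
            return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

        def fire(self, name):
            inputs, outputs = self.transitions[name]
            assert self.enabled(name), f"{name} is not enabled"
            for p, n in inputs.items():
                self.marking[p] -= n                              # consume input tokens
            for p, n in outputs.items():
                self.marking[p] = self.marking.get(p, 0) + n      # produce output tokens

    # toy assembly step: part and fixture available -> subassembly produced
    net = PetriNet({"part_ready": 1, "fixture_free": 1})
    net.add_transition("mount_part", {"part_ready": 1, "fixture_free": 1},
                       {"subassembly": 1, "fixture_free": 1})
    net.fire("mount_part")
    print(net.marking)   # {'part_ready': 0, 'fixture_free': 1, 'subassembly': 1}
    ```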

  17. Performance and techno-economic assessment of several solid-liquid separation technologies for processing dilute-acid pretreated corn stover.

    PubMed

    Sievers, David A; Tao, Ling; Schell, Daniel J

    2014-09-01

    Solid-liquid separation of pretreated lignocellulosic biomass slurries is a critical unit operation employed in several different processes for production of fuels and chemicals. An effective separation process achieves good recovery of solute (sugars) and efficient dewatering of the biomass slurry. Dilute acid pretreated corn stover slurries were subjected to pressure and vacuum filtration and basket centrifugation to evaluate the technical and economic merits of these technologies. Experimental performance results were used to perform detailed process simulations and economic analysis using a 2000 tonne/day biorefinery model to determine differences between the various filtration methods and their process settings. The filtration processes were able to successfully separate pretreated slurries into liquor and solid fractions with estimated sugar recoveries of at least 95% using a cake washing process. A continuous vacuum belt filter produced the most favorable process economics. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Effects of Hygrothermal Cycling on the Chemical, Thermal, and Mechanical Properties of 862/W Epoxy Resin

    NASA Technical Reports Server (NTRS)

    Miller, Sandi G.; Roberts, Gary D.; Copa, Christine C.; Bail, Justin L.; Kohlman, Lee W.; Binienda, Wieslaw K.

    2011-01-01

    The hygrothermal aging characteristics of an epoxy resin were characterized over 1 year, which included 908 temperature and humidity cycles. The epoxy resin quickly showed evidence of aging through color change and increased brittleness. The influence of aging on the material's glass transition temperature (Tg) was evaluated by Differential Scanning Calorimetry (DSC) and Dynamic Mechanical Analysis (DMA). The Tg remained relatively constant throughout the year-long cyclic aging profile. The chemical composition was monitored by Fourier Transform Infrared Spectroscopy (FTIR), where evidence of chemical aging and advancement of cure was noted. The tensile strength of the resin was tested as it aged. This property was severely affected by the aging process in the form of reduced ductility and embrittlement. Detailed chemical evaluation suggests many aging mechanisms are taking place during exposure to hygrothermal conditions. This paper details the influence of processes such as advancement of cure, chemical degradation, and physical aging on the chemical and physical properties of the epoxy resin.

  19. Operationalizing strategic marketing.

    PubMed

    Chambers, S B

    1989-05-01

    The strategic marketing process, like any administrative practice, is far simpler to conceptualize than operationalize within an organization. It is for this reason that this chapter focused on providing practical techniques and strategies for implementing the strategic marketing process. First and foremost, the marketing effort needs to be marketed to the various publics of the organization. This chapter advocated the need to organize the marketing analysis into organizational, competitive, and market phases, and it provided examples of possible designs of the phases. The importance and techniques for exhausting secondary data sources and conducting efficient primary data collection methods were explained and illustrated. Strategies for determining marketing opportunities and threats, as well as segmenting markets, were detailed. The chapter provided techniques for developing marketing strategies, including considering the five patterns of coverage available; determining competitor's position and the marketing mix; examining the stage of the product life cycle; and employing a consumer decision model. The importance of developing explicit objectives, goals, and detailed action plans was emphasized. Finally, helpful hints for operationalizing the communication variable and evaluating marketing programs were provided.

  20. Satellite Test of Radiation Impact on Ramtron 512K FRAM

    NASA Technical Reports Server (NTRS)

    MacLeod, Todd C.; Sayyah, Rana; Sims, W. Herb; Varnavas, Kosta A.; Ho, Fat D.

    2009-01-01

    The Memory Test Experiment is a space test of a ferroelectric memory device on a low Earth orbit satellite. The test consists of writing and reading data with a ferroelectric-based memory device. Any errors are detected and stored on board the satellite. The data are sent to the ground through telemetry once a day. Analysis of the data can determine the kind of error that was found and will lead to a better understanding of the effects of space radiation on memory systems. The test will be one of the first flight demonstrations of ferroelectric memory in a near polar orbit, which allows testing in a varied radiation environment. The memory device being tested is a Ramtron Inc. 512K device. This paper details the goals and purpose of this experiment as well as the development process. The process for analyzing the data to gain the maximum understanding of the performance of the ferroelectric memory device is also detailed.
